CI Filter to create Black & White image?

You can use the CIColorMap filter. Give it a 20 (w) by 1 (h) jpg where the left half is white and the right half is black (or the other way around) and use that as your color map gradient. That hammers the colors down to pure black and white nicely. I originally tried a 2x1 image with one white pixel and one black, but it looked like it got interpolated a bit; going up to 20x1 worked fine.
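
If you would rather build the 20x1 map in code than ship an image file, something like this sketch works (the MakeBlackWhiteMap helper is mine, not part of the answer; it just produces the strip described above):

#import <CoreImage/CoreImage.h>

//hypothetical helper: a 20x1 strip, left half white, right half black,
//suitable as CIColorMap's inputGradientImage
static CIImage *MakeBlackWhiteMap(void)
{
    CIImage *white = [[CIImage imageWithColor:[CIColor colorWithRed:1 green:1 blue:1]]
                      imageByCroppingToRect:CGRectMake(0, 0, 10, 1)];
    CIImage *black = [[CIImage imageWithColor:[CIColor colorWithRed:0 green:0 blue:0]]
                      imageByCroppingToRect:CGRectMake(10, 0, 10, 1)];
    return [white imageByCompositingOverImage:black]; //extents union to the full 20x1 strip
}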

Hint: I used Core Image Funhouse (not Quartz Composer) to experiment.


Many solutions on the internet only create grayscale values. If grayscale is all you need, there is no need for a filter at all; use code such as:

//backingImage is the source CGImageRef; the caller is responsible for releasing greyImage
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceGray();
CGImageRef greyImage = CGImageCreateCopyWithColorSpace(backingImage, colorSpace);
CGColorSpaceRelease(colorSpace);
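
Note that CGImageCreateCopyWithColorSpace only succeeds when the new color space has the same number of components as the image's original one, so for an RGB source you end up redrawing instead. A minimal sketch of that route (the CreateGrayCopy name is mine):

#import <CoreGraphics/CoreGraphics.h>

//redraw the source into an 8-bit DeviceGray bitmap context and grab the result
static CGImageRef CreateGrayCopy(CGImageRef source)
{
    size_t width = CGImageGetWidth(source);
    size_t height = CGImageGetHeight(source);
    CGColorSpaceRef gray = CGColorSpaceCreateDeviceGray();
    CGContextRef ctx = CGBitmapContextCreate(NULL, width, height, 8, 0, gray,
                                             (CGBitmapInfo)kCGImageAlphaNone);
    CGColorSpaceRelease(gray);
    if (ctx == NULL) return NULL;
    CGContextDrawImage(ctx, CGRectMake(0, 0, width, height), source);
    CGImageRef result = CGBitmapContextCreateImage(ctx);
    CGContextRelease(ctx);
    return result; //caller releases
}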

A proper threshold filter follows; inputThreshold can be a float value from 0.0 to 1.0 (the attributes below default it to 0.5).

https://gist.github.com/xhruso00/a3f8a9c8ae7e33b8b23d

A detailed recipe can be found at https://developer.apple.com/library/ios/documentation/graphicsimaging/Conceptual/CoreImaging/ci_custom_filters/ci_custom_filters.html#//apple_ref/doc/uid/TP30001185-CH6-CJBEDHHH

If you need multiple metallibs depending on the target, compile each .metal file into a separate .metallib.

Metal file (fast math, iOS 12+)

#include <metal_stdlib>
using namespace metal;

//https://en.wikipedia.org/wiki/List_of_monochrome_and_RGB_palettes

//https://en.wikipedia.org/wiki/Relative_luminance
//https://en.wikipedia.org/wiki/Grayscale
constant half3 kRec709Luma  = half3(0.2126, 0.7152, 0.0722);
constant half3 kRec601Luma  = half3(0.299 , 0.587 , 0.114);
//constant float3 kRec2100Luma = float3(0.2627, 0.6780, 0.0593);

#include <CoreImage/CoreImage.h>

extern "C" { namespace coreimage {

    half lumin601(half3 p)
    {
        return dot(p.rgb, kRec601Luma);
    }

    half lumin709(half3 p)
    {
        return dot(p.rgb, kRec709Luma);
    }

    //pixels whose Rec. 601 luma is at or above the threshold become white, the rest black
    half4 thresholdFilter(sample_h image, half threshold)
    {
        half4 pix = unpremultiply(image);
        half luma = lumin601(pix.rgb);
        pix.rgb = half3(step(threshold, luma));
        return premultiply(pix);
    }
}}

Objective-C file

#import "CIFilter+BlackAndWhiteThresholdFilter.h"
#include <objc/runtime.h>
#import <CoreImage/CoreImage.h>
#import <Metal/Metal.h>

@class BlackAndWhiteThresholdFilter;

static NSString *const kCIInputThreshold = @"inputThreshold";
NSString *const kBlackAndWhiteThresholdFilterName = @"BlackAndWhiteThreshold";
static NSString *const kBlackAndWhiteThresholdFilterDisplayName = @"Black & White Threshold";

//forward declarations so the IMP casts below compile
static float floatGetter(id self, SEL _cmd);
static void floatSetter(id self, SEL _cmd, float value);

@implementation CIFilter(BlackAndWhiteThresholdFilter)

@dynamic inputImage;
@dynamic threshold;

+ (CIFilter<BlackAndWhiteThreshold>*) blackAndWhiteThresholdFilter
{
    [BlackAndWhiteThresholdFilter class]; //kick off initialize to register filter
    CIFilter<BlackAndWhiteThreshold>*filter = (CIFilter<BlackAndWhiteThreshold>*)[CIFilter filterWithName:kBlackAndWhiteThresholdFilterName];
    static dispatch_once_t oncePredicate;
    dispatch_once(&oncePredicate, ^{
        ///convenience accessors that drop the "input" prefix
        class_addMethod([self class], @selector(threshold), (IMP)floatGetter, "f@:");
        class_addMethod([self class], @selector(setThreshold:), (IMP)floatSetter, "v@:f");
    });
    return filter;
}

static float floatGetter(id self, SEL _cmd) {

    NSString *selector = NSStringFromSelector(_cmd);
    ///capitalize first letter
    NSString *firstLetter = [[selector substringWithRange:NSMakeRange(0, 1)] uppercaseString];
    NSString *key = [@"input" stringByAppendingString:[selector stringByReplacingCharactersInRange:NSMakeRange(0, 1) withString:firstLetter]];
    id value = [self valueForKey:key];
    float number = NAN;
    if (value && [value isKindOfClass:[NSNumber class]]) {
        number = [value floatValue];
    }
    return number;
}

static void floatSetter(id self, SEL _cmd, float value) {
    ///"setThreshold:" -> "inputThreshold:" -> drop the trailing colon
    NSString *selector = NSStringFromSelector(_cmd);
    NSString *key = [selector stringByReplacingCharactersInRange:NSMakeRange(0, 3) withString:@"input"];
    [self setValue:@(value) forKey:[key substringToIndex:[key length] - 1]];
}


@end

@interface BlackAndWhiteThresholdFilter()
{
    CIColorKernel *_kernel;
}
@end

@implementation BlackAndWhiteThresholdFilter

//more https://developer.apple.com/library/ios/documentation/graphicsimaging/Conceptual/CoreImaging/ci_image_units/ci_image_units.html#//apple_ref/doc/uid/TP30001185-CH7-SW8
+ (void)initialize
{
    //verify registration with [CIFilter filterNamesInCategories:@[kCICategoryVideo]]
    //registering class responsible for CIFilter execution
    [CIFilter registerFilterName:kBlackAndWhiteThresholdFilterName
                     constructor:(id <CIFilterConstructor>)self //self means class BlackAndWhiteThresholdFilter
                 classAttributes:@{
                     kCIAttributeFilterCategories: @[
                             kCICategoryVideo,
                             kCICategoryStillImage,
                             kCICategoryCompositeOperation,
                             kCICategoryInterlaced,
                             kCICategoryNonSquarePixels
                     ],
                     kCIAttributeFilterDisplayName: kBlackAndWhiteThresholdFilterDisplayName,

                 }];
}


+ (CIFilter *)filterWithName:(NSString *)aName
{
    return [[self alloc] init];
}

- (instancetype)init {
    self = [super init];
    if (self) {

        BOOL supportsMetal;

#if TARGET_OS_IOS
        supportsMetal = MTLCreateSystemDefaultDevice() != nil;
#else
        //MTLCopyAllDevices is used here because MTLCreateSystemDefaultDevice immediately forces the GPU on a MacBook to switch
        supportsMetal = [MTLCopyAllDevices() count] >= 1;
#endif

        if (@available(macOS 10.13, *)) {

            //even on 10.13 there are MacBooks without Metal, plus installations forced onto unsupported hardware
            if (supportsMetal) {
                _kernel = [self metalKernel];
            } else {
                _kernel = [self GLSLKernel];
            }

        } else {
            _kernel = [self GLSLKernel];
        }

        if (_kernel == nil) return nil;
    }

    return self;
}

- (CIColorKernel *)metalKernel
{
    //default.metallib is the library Xcode builds from the target's .metal files
    NSURL *URL = [[NSBundle mainBundle] URLForResource:@"default" withExtension:@"metallib"];
    NSData *data = [NSData dataWithContentsOfURL:URL];
    NSError *error = nil;
    CIColorKernel *kernel = [CIColorKernel kernelWithFunctionName:@"thresholdFilter" fromMetalLibraryData:data error:&error];
    if (kernel == nil) {
        NSLog(@"%@", error);
    }
    return kernel;
}

- (CIColorKernel *)GLSLKernel //OpenGL Shading Language
{
    // WWDC 2017 Session 510 - the disadvantage is that this needs to be compiled on first run (performance penalty)
    NSString *kernelString = [self colorKernelText];
    return [CIColorKernel kernelWithString:kernelString]; // to suppress warning define CI_SILENCE_GL_DEPRECATION in pre-processor macros
}

- (NSString *)colorKernelText
{
    return
    @""
    "float lumin601(vec3 p)"
    "{"
    "  return dot(p, vec3(0.299 , 0.587 , 0.114));"
    "}"
    ""
    "kernel vec4 thresholdFilter(__sample image, float inputThreshold)"
    "{"
    "  vec4 src = unpremultiply( image) );"
    "  float luma = lumin601( src.rgb );"
    "  src.rgb = vec3( step( inputThreshold, luma));"
    "  return premultiply(src);"
    "}";
}

//kept for reference purposes
- (NSString *)oldNonColorKernelText
{
    return
    @""
    "float lumin601(vec3 p)"
    "{"
    "  return dot(p, vec3(0.299 , 0.587 , 0.114));"
    "}"
    ""
    "kernel vec4 thresholdFilter(sampler image, float inputThreshold)"
    "{"
    "  vec4 src = unpremultiply( sample(image, samplerCoord(image)) );"
    "  float luma = lumin601( src.rgb );"
    "  src.rgb = vec3( step( inputThreshold, luma));"
    "  return premultiply(src);"
    "}";
}

- (NSArray *)inputKeys {
    return @[kCIInputImageKey, kCIInputThreshold];
}

- (NSArray *)outputKeys {
    return @[kCIOutputImageKey];
}

// ------------  ------------  ------------  ------------  ------------  ------------
#pragma mark - CIFilter Protocol

+ (NSDictionary *)customAttributes
{
    NSDictionary *inputThreshold = @{
                                         kCIAttributeType: kCIAttributeTypeScalar,
                                         kCIAttributeMin: @0.0f,
                                         kCIAttributeMax: @1.0f,
                                         kCIAttributeIdentity : @0.00,
                                         kCIAttributeDefault: @0.5f,
                                         };

    return @{
             kCIInputThreshold : inputThreshold,
             // This is needed because the filter is registered under a different name than the class.
             kCIAttributeFilterName : kBlackAndWhiteThresholdFilterName
             };
}

- (CIImage *)outputImage {

    CIImage *inputImage = [self inputImage];

    if (inputImage == nil) {
        return nil;
    }

    CIImage *outputImage;

    outputImage = [_kernel applyWithExtent:[inputImage extent]
                               roiCallback:^CGRect(int index, CGRect destRect) { return destRect; }
                                 arguments:@[inputImage, [self inputThreshold]]];

    return outputImage;
}

@end
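
For completeness, this is roughly how the registered filter would be used from client code, assuming the BlackAndWhiteThreshold protocol in the header declares the inputImage and threshold properties that the @dynamic lines above refer to (inputCIImage is a placeholder for your source image):

CIFilter<BlackAndWhiteThreshold> *filter = [CIFilter blackAndWhiteThresholdFilter];
filter.inputImage = inputCIImage;  //your source CIImage
filter.threshold = 0.5f;           //0.0 to 1.0, see customAttributes above
CIImage *thresholded = filter.outputImage;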

Updated with the visual difference between black & white (left) and grayscale (right):



It took some time to figure out the code for CIColorMap, so I wanted to post this. Joshua has the answer above; this is just an example implementation...

//source photo and the 20x1 white-to-black gradient map described above
CIImage *beginImage = [CIImage imageWithContentsOfURL:[NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"wedding" ofType:@"jpg"]]];
CIImage *inputGradientImage = [CIImage imageWithContentsOfURL:[NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"grad" ofType:@"png"]]];
CIContext *context = [CIContext contextWithOptions:nil];
//CIColorMap remaps the source colors through the gradient image
CIFilter *filter = [CIFilter filterWithName:@"CIColorMap" keysAndValues:kCIInputImageKey, beginImage, @"inputGradientImage", inputGradientImage, nil];
CIImage *outputImage = [filter outputImage];
CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
UIImage *newImage = [UIImage imageWithCGImage:cgimg];
self.imageView.image = newImage;
CGImageRelease(cgimg);