The Core Image Framework in iOS
- Composition of the Core Image framework
Apple has already organized image processing into categories for us. Let's look at the structure:
It is divided into three main parts:
1) Definitions: CoreImage and CoreImageDefines. As the names suggest, these represent the Core Image framework itself and its definitions.
2) Operations:
Filter (CIFilter): a CIFilter produces a CIImage. Typically it takes one or more images as input, applies some filtering operations, and produces a specified output image.
Detector (CIDetector): a CIDetector analyzes the features of an image, for example detecting the eyes, mouth, and so on of a face in a picture.
Feature (CIFeature): a CIFeature represents a feature produced by a detector.
3) Imaging:
Context (CIContext): the context class renders through Quartz 2D or OpenGL, and ties the Core Image classes together for rendering work such as filters and colors.
Color (CIColor): handles the colors associated with images and contexts, and the processing of pixel colors.
Vector (CIVector): handles geometric values such as image coordinates and vectors.
Image (CIImage): represents an image, including the output image of a filter chain.
2. Processing steps:
1) Create a CIImage object;
2) Create a CIFilter object and set its input values;
3) Create a CIContext object;
4) Render the filter's output image into a CGImage (a minimal sketch follows this list).
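Here is that four-step flow in one place: a minimal sketch, assuming an arbitrary source UIImage; CISepiaTone and its intensity value are just illustrative choices.

#import <UIKit/UIKit.h>
#import <CoreImage/CoreImage.h>

// 'photo' is whatever UIImage you start from.
static UIImage *SepiaVersionOfImage(UIImage *photo) {
    // 1) Create a CIImage object.
    CIImage *input = [[CIImage alloc] initWithImage:photo];

    // 2) Create a CIFilter object and set input values.
    CIFilter *sepia = [CIFilter filterWithName:@"CISepiaTone"];
    [sepia setValue:input forKey:kCIInputImageKey];
    [sepia setValue:[NSNumber numberWithFloat:0.8f] forKey:@"inputIntensity"]; // illustrative intensity

    // 3) Create a CIContext object.
    CIContext *context = [CIContext contextWithOptions:nil];

    // 4) Render the filter output image into a CGImage.
    CGImageRef cgImage = [context createCGImage:sepia.outputImage fromRect:sepia.outputImage.extent];
    UIImage *result = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage); // createCGImage: returns a +1 reference
    return result;
}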
3. Notes
a. Ways a CIImage can be created (see the sketch after this list):
1) From a URL or from NSData;
2) By converting another image type, such as a CGImageRef;
3) From a CVPixelBufferRef;
4) From raw pixel data.
b. Image color: use kCIImageColorSpace to override the default color space.
c. Image metadata.
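A sketch of several of these creation paths, plus the kCIImageColorSpace override; the file path and the jpegData/uiImage/pixelBuffer parameters are stand-ins for whatever you already have.

#import <UIKit/UIKit.h>
#import <CoreImage/CoreImage.h>

static void CIImageCreationExamples(NSData *jpegData, UIImage *uiImage, CVPixelBufferRef pixelBuffer) {
    // 1) From a URL or from raw data.
    CIImage *fromURL  = [CIImage imageWithContentsOfURL:[NSURL fileURLWithPath:@"/tmp/photo.jpg"]]; // placeholder path
    CIImage *fromData = [CIImage imageWithData:jpegData];

    // 2) From another image type, e.g. a CGImageRef.
    CIImage *fromCG = [CIImage imageWithCGImage:uiImage.CGImage];

    // 3) From a CVPixelBufferRef (e.g. a camera frame).
    CIImage *fromBuffer = [CIImage imageWithCVPixelBuffer:pixelBuffer];

    // b. Override the default color space with kCIImageColorSpace;
    //    passing NSNull disables color management for this image.
    CIImage *unmanaged = [CIImage imageWithCGImage:uiImage.CGImage
                                           options:@{kCIImageColorSpace : [NSNull null]}];
}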
4. Using filters.
Examples: CISepiaTone, CIColorControls, CIHueBlendMode.
Processing chain: multiple CIImage inputs -> CIHueBlendMode -> CISepiaTone.
Rendering the output:
Flow: get a context -> build the CIImage -> render it into a CGImageRef -> convert the CGImageRef to a UIImage -> release the CGImageRef -> use the UIImage. (A sketch of the whole flow follows.)
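The chain and the rendering flow put together: a minimal sketch, where the two source images and the sepia intensity are placeholders.

#import <UIKit/UIKit.h>
#import <CoreImage/CoreImage.h>

static UIImage *BlendThenSepia(UIImage *baseImage, UIImage *textureImage) {
    CIImage *base    = [[CIImage alloc] initWithImage:baseImage];
    CIImage *texture = [[CIImage alloc] initWithImage:textureImage];

    // Multiple CIImage inputs -> CIHueBlendMode.
    CIFilter *blend = [CIFilter filterWithName:@"CIHueBlendMode"];
    [blend setValue:texture forKey:kCIInputImageKey];
    [blend setValue:base forKey:@"inputBackgroundImage"];

    // -> CISepiaTone: one filter's output image feeds the next filter's input.
    CIFilter *sepia = [CIFilter filterWithName:@"CISepiaTone"];
    [sepia setValue:blend.outputImage forKey:kCIInputImageKey];
    [sepia setValue:[NSNumber numberWithFloat:0.6f] forKey:@"inputIntensity"]; // placeholder value

    // Get a context -> render to a CGImageRef -> convert to UIImage -> release the CGImageRef.
    CIContext *context = [CIContext contextWithOptions:nil];
    CIImage *output = sepia.outputImage;
    CGImageRef cgImage = [context createCGImage:output fromRect:output.extent];
    UIImage *final = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return final;
}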
5. Face detection
Auto-enhancement: CIRedEyeCorrection, CIFaceBalance (adjusts the image to give better skin tones), CIVibrance (increases saturation without distorting skin tones), CIToneCurve (adjusts the image's contrast), and highlight/shadow adjustment.
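A sketch of both ideas: detecting faces with CIDetector/CIFaceFeature, then letting autoAdjustmentFilters hand back preconfigured enhancement filters such as those listed above; the source image parameter is a placeholder. The methods after this sketch appear to be written as a UIImage category, since they read self as an image.

#import <UIKit/UIKit.h>
#import <CoreImage/CoreImage.h>

static CIImage *DetectFacesAndAutoEnhance(UIImage *source) {
    CIImage *image = [[CIImage alloc] initWithImage:source];

    // Face detection.
    CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                              context:nil
                                              options:@{CIDetectorAccuracy : CIDetectorAccuracyHigh}];
    for (CIFaceFeature *face in [detector featuresInImage:image]) {
        NSLog(@"face at %@", NSStringFromCGRect(face.bounds));
        if (face.hasLeftEyePosition) NSLog(@"left eye at %@", NSStringFromCGPoint(face.leftEyePosition));
        if (face.hasMouthPosition)   NSLog(@"mouth at %@", NSStringFromCGPoint(face.mouthPosition));
    }

    // Auto-enhancement: Core Image returns ready-to-apply filters
    // (CIRedEyeCorrection, CIFaceBalance, CIVibrance, CIToneCurve, ...).
    for (CIFilter *filter in [image autoAdjustmentFilters]) {
        [filter setValue:image forKey:kCIInputImageKey];
        image = filter.outputImage;
    }
    return image;
}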
- (UIImage *)saturateImage:(float)saturationAmount withContrast:(float)contrastAmount {
    CIContext *context = [CIContext contextWithOptions:nil];
    CIFilter *filter = [CIFilter filterWithName:@"CIColorControls"];
    CIImage *inputImage = [[CIImage alloc] initWithImage:self];
    [filter setValue:inputImage forKey:@"inputImage"];
    [filter setValue:[NSNumber numberWithFloat:saturationAmount] forKey:@"inputSaturation"];
    [filter setValue:[NSNumber numberWithFloat:contrastAmount] forKey:@"inputContrast"];
    CGImageRef cgImage = [context createCGImage:filter.outputImage fromRect:filter.outputImage.extent];
    UIImage *result = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage); // createCGImage: returns a +1 reference, so release it
    return result;
}

- (UIImage *)vignetteWithRadius:(float)inputRadius andIntensity:(float)inputIntensity {
    CIContext *context = [CIContext contextWithOptions:nil];
    CIFilter *filter = [CIFilter filterWithName:@"CIVignette"];
    CIImage *inputImage = [[CIImage alloc] initWithImage:self];
    [filter setValue:inputImage forKey:@"inputImage"];
    [filter setValue:[NSNumber numberWithFloat:inputIntensity] forKey:@"inputIntensity"];
    [filter setValue:[NSNumber numberWithFloat:inputRadius] forKey:@"inputRadius"];
    CGImageRef cgImage = [context createCGImage:filter.outputImage fromRect:filter.outputImage.extent];
    UIImage *result = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return result;
}

- (UIImage *)worn {
    CIImage *beginImage = [[CIImage alloc] initWithImage:self];
    // The literal values below were missing from the original post;
    // the numbers used here are illustrative placeholders only.
    CIFilter *filter = [CIFilter filterWithName:@"CIWhitePointAdjust"
                                  keysAndValues:kCIInputImageKey, beginImage,
                       @"inputColor", [CIColor colorWithRed:0.9 green:0.9 blue:0.8 alpha:1.0], // placeholder white point
                       nil];
    CIImage *outputImage = [filter outputImage];
    CIFilter *filterB = [CIFilter filterWithName:@"CIColorControls"
                                   keysAndValues:kCIInputImageKey, outputImage,
                        @"inputSaturation", [NSNumber numberWithFloat:0.5], // placeholder; original value lost
                        @"inputContrast", [NSNumber numberWithFloat:0.8],
                        nil];
    CIImage *outputImageB = [filterB outputImage];
    // CITemperatureAndTint takes two-component vectors (temperature, tint);
    // 6500/0 is the documented default, used here as a placeholder.
    CIFilter *filterC = [CIFilter filterWithName:@"CITemperatureAndTint"
                                   keysAndValues:kCIInputImageKey, outputImageB,
                        @"inputNeutral", [CIVector vectorWithX:6500 Y:0],
                        @"inputTargetNeutral", [CIVector vectorWithX:5000 Y:0], // placeholder target
                        nil];
    CIImage *outputImageC = [filterC outputImage];
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:outputImageC fromRect:outputImageC.extent];
    UIImage *result = [UIImage imageWithCGImage:cgImage scale:1.0 orientation:self.imageOrientation];
    CGImageRelease(cgImage);
    return result;
}

/*
 Blend modes to try:
 CISoftLightBlendMode
 CIMultiplyBlendMode
 CISaturationBlendMode
 CIScreenBlendMode
 CIMultiplyCompositing
 CIHardLightBlendMode
 */
- (UIImage *)blendMode:(NSString *)blendMode withImageNamed:(NSString *)imageName {
    CIImage *inputImage = [[CIImage alloc] initWithImage:self];
    // Try with different textures.
    CIImage *bgCIImage = [[CIImage alloc] initWithImage:[UIImage imageNamed:imageName]];
    CIContext *context = [CIContext contextWithOptions:nil];
    CIFilter *filter = [CIFilter filterWithName:blendMode];
    // inputBackgroundImage must be the same size as inputImage.
    // Here the receiver is used as the background and the named texture as the foreground input.
    [filter setValue:inputImage forKey:@"inputBackgroundImage"];
    [filter setValue:bgCIImage forKey:@"inputImage"];
    CGImageRef cgImage = [context createCGImage:[filter outputImage] fromRect:filter.outputImage.extent];
    UIImage *result = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return result;
}

- (UIImage *)curveFilter
{
    CIImage *inputImage = [[CIImage alloc] initWithImage:self];
    CIContext *context = [CIContext contextWithOptions:nil];
    CIFilter *filter = [CIFilter filterWithName:@"CIToneCurve"];
    [filter setDefaults];
    [filter setValue:inputImage forKey:kCIInputImageKey];
    [filter setValue:[CIVector vectorWithX:0.0 Y:0.0] forKey:@"inputPoint0"];   // default
    [filter setValue:[CIVector vectorWithX:0.25 Y:0.15] forKey:@"inputPoint1"];
    [filter setValue:[CIVector vectorWithX:0.5 Y:0.5] forKey:@"inputPoint2"];
    [filter setValue:[CIVector vectorWithX:0.75 Y:0.85] forKey:@"inputPoint3"];
    [filter setValue:[CIVector vectorWithX:1.0 Y:1.0] forKey:@"inputPoint4"];   // default
    CGImageRef cgImage = [context createCGImage:[filter outputImage] fromRect:filter.outputImage.extent];
    UIImage *result = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return result;
}
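If these methods do live in a UIImage category as they appear to, calls chain naturally; a sketch with a hypothetical asset name and illustrative parameter values:

UIImage *photo = [UIImage imageNamed:@"photo.jpg"]; // hypothetical asset
UIImage *styled = [[[photo saturateImage:1.3f withContrast:1.05f]
                    vignetteWithRadius:1.0f andIntensity:0.7f] curveFilter];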