http://*.com/questions/17448102/ios-masking-an-image-keeping-retina-scale-factor-in-account
I want to mask an image by passing another image as a mask. The masking itself works, but the resulting image doesn't look good: it is jagged at the borders.
I guess the problem is related to retina graphics. The scale properties of the two images differ:
- The image I want to mask has a scale of 1. It is generally larger than 1000x1000 pixels.
- The mask image (which contains only black and white) has a scale of 2. It is generally around 300x300 pixels.
The resulting image has a scale value of 1.
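To illustrate the mismatch, this is roughly how the sizes and scales compare when I log them (the image names here are just placeholders):

UIImage *photo = [UIImage imageNamed:@"photo"]; // scale 1, > 1000x1000 px
UIImage *mask  = [UIImage imageNamed:@"mask"];  // @2x asset, scale 2, ~300x300 px

// Point size * scale = pixel size, so the two images cover very different pixel areas.
NSLog(@"photo: %.0f x %.0f points at %.0fx = %zu x %zu px",
      photo.size.width, photo.size.height, photo.scale,
      CGImageGetWidth(photo.CGImage), CGImageGetHeight(photo.CGImage));
NSLog(@"mask:  %.0f x %.0f points at %.0fx = %zu x %zu px",
      mask.size.width, mask.size.height, mask.scale,
      CGImageGetWidth(mask.CGImage), CGImageGetHeight(mask.CGImage));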
The code I am using is:
+ (UIImage *)maskImage:(UIImage *)image withMask:(UIImage *)maskImage {
    CGImageRef maskRef = maskImage.CGImage;

    // Build a Core Graphics mask from the black-and-white mask image.
    CGImageRef mask = CGImageMaskCreate(CGImageGetWidth(maskRef),
                                        CGImageGetHeight(maskRef),
                                        CGImageGetBitsPerComponent(maskRef),
                                        CGImageGetBitsPerPixel(maskRef),
                                        CGImageGetBytesPerRow(maskRef),
                                        CGImageGetDataProvider(maskRef), NULL, false);

    // Apply the mask to the source image.
    CGImageRef masked = CGImageCreateWithMask([image CGImage], mask);
    CGImageRelease(mask);

    // This initializer ignores scale, so the returned image has scale 1.
    UIImage *maskedImage = [UIImage imageWithCGImage:masked];
    CGImageRelease(masked);
    return maskedImage;
}
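I call the method like this (ImageUtils is just a placeholder name for the class where I put it):

UIImage *result = [ImageUtils maskImage:photo withMask:mask];
NSLog(@"result scale: %.0f", result.scale); // logs 1, not 2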
How can I get a masked image that respects the retina scale?
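To clarify what I mean, I would expect to replace the imageWithCGImage: line inside the method with something like the following (using maskImage.scale here is just my guess):

// Inside maskImage:withMask:, instead of [UIImage imageWithCGImage:masked]
UIImage *maskedImage = [UIImage imageWithCGImage:masked
                                           scale:maskImage.scale
                                     orientation:image.imageOrientation];

But I am not sure whether setting the scale alone would fix the jagged borders.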