Customizing photo capture or video recording requires the AVFoundation framework. So far I have only used it for taking photos, so this post records the custom photo-capture workflow; I'll add the video side once I've actually used it, though it should be largely similar.
Taking the photo-capture flow as the example, the implementation consists of the following parts:
1. First, check the user's authorization status:
let authorizationStatus = AVCaptureDevice.authorizationStatusForMediaType(AVMediaTypeVideo)
switch authorizationStatus {
case .NotDetermined:
    // Not asked yet: request access, then proceed based on the user's choice
    AVCaptureDevice.requestAccessForMediaType(AVMediaTypeVideo, completionHandler: { (granted: Bool) -> Void in
        if granted {
            self.configureCamera()
        } else {
            self.showErrorAlertView()
        }
    })
case .Authorized:
    self.configureCamera()
default:
    // .Denied or .Restricted
    self.showErrorAlertView()
}
2. AVCaptureInput
AVCaptureInput abstracts the input side of the capture process; it can be the front camera, the back camera, or the microphone.
You obtain one through the initializer public init(device: AVCaptureDevice!) throws.
AVCaptureDevice is Apple's abstraction of a capture device, obtained via public class func devicesWithMediaType(mediaType: String!) -> [AnyObject]!.
The mediaType parameter is one of the string constants defined in AVMediaFormat.h; here we use AVMediaTypeVideo.
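The two steps above can be sketched as follows (a sketch against the Swift 2 era API used throughout this post; the names `backCamera` and `cameraInput` are my own):

```swift
// Find the back camera among the video-capable devices
let devices = AVCaptureDevice.devicesWithMediaType(AVMediaTypeVideo) as! [AVCaptureDevice]
let backCamera = devices.filter { $0.position == .Back }.first

var cameraInput: AVCaptureDeviceInput?
if let backCamera = backCamera {
    do {
        // The throwing initializer wraps the device as a session input
        cameraInput = try AVCaptureDeviceInput(device: backCamera)
    } catch {
        print("could not create camera input: \(error)")
    }
}
```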
3. AVCaptureOutput
AVCaptureOutput abstracts the output side of the capture process, which can be an image, video, audio, and so on. The main subclasses are AVCaptureAudioDataOutput, AVCaptureAudioPreviewOutput, AVCaptureFileOutput, AVCaptureStillImageOutput, AVCaptureVideoDataOutput, AVCaptureAudioFileOutput, and AVCaptureMovieFileOutput.
Among these, AVCaptureFileOutput represents output written to a file.
Here we use AVCaptureStillImageOutput.
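Creating the still image output can look like this (a minimal sketch; configuring outputSettings for JPEG is optional but matches the capture code later in this post):

```swift
// Configure the still image output to deliver JPEG-compressed data
let stillImageOutput = AVCaptureStillImageOutput()
stillImageOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
```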
4. AVCaptureSession
AVCaptureSession abstracts the capture process itself; it needs an input (AVCaptureInput) and an output (AVCaptureOutput):
if session.canAddInput(cameraInput) {
    session.addInput(cameraInput)
}
if session.canAddOutput(stillImageOutput) {
    session.addOutput(stillImageOutput)
}
Then call the session's startRunning method and it starts capturing image data. Usually an AVCaptureVideoPreviewLayer is used here to preview the image.
Initialize previewLayer with the session:
previewLayer = AVCaptureVideoPreviewLayer(session: self.session)
Then set the layer's frame and add the layer wherever you want the preview to appear.
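Putting the preview setup together (a sketch; `previewView` stands in for whatever view should host the preview):

```swift
// Attach the preview layer to a host view
previewLayer = AVCaptureVideoPreviewLayer(session: self.session)
previewLayer.frame = previewView.bounds
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
previewView.layer.addSublayer(previewLayer)
// Start the flow of data from inputs to outputs
session.startRunning()
```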
5. Before calling the session's startRunning, you can also configure the camera's flash, focus, white balance, and so on:
if let tempDevice = self.currentCameraDevice {
    do {
        // The device must be locked before changing its configuration
        try tempDevice.lockForConfiguration()
        if tempDevice.hasFlash {
            tempDevice.flashMode = self.flashMode
        }
        if tempDevice.isFocusModeSupported(.AutoFocus) {
            tempDevice.focusMode = .AutoFocus
        }
        if tempDevice.isWhiteBalanceModeSupported(.AutoWhiteBalance) {
            tempDevice.whiteBalanceMode = .AutoWhiteBalance
        }
        tempDevice.unlockForConfiguration()
    } catch {
        DLog("error locking device for configuration: \(error)")
    }
}
6. Getting the captured image
Call the following stillImageOutput method to get the picture:
let connection = self.stillImageOutput.connectionWithMediaType(AVMediaTypeVideo)
self.stillImageOutput.captureStillImageAsynchronouslyFromConnection(connection) { [unowned self] (imageDataSampleBuffer: CMSampleBuffer!, error) -> Void in
    if error == nil {
        // With the session's .Photo preset (or explicit output settings on the device output),
        // the data is already JPEG-compressed
        let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer)
        // The sample buffer also carries metadata
        //let metadata: NSDictionary = CMCopyDictionaryOfAttachments(nil, imageDataSampleBuffer, CMAttachmentMode(kCMAttachmentMode_ShouldPropagate))!
        if let image = UIImage(data: imageData) {
            // This is the captured photo; save it to the photo library, edit it, or do whatever you need
            // Note that the image is at the camera's capture resolution, not the size of the layer you set;
            // if it needs cropping, crop it first and then save
        }
    } else {
        DLog("error while capturing still image: \(error)")
    }
}
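For saving the image to the photo library mentioned above, one simple option is UIImageWriteToSavedPhotosAlbum (a sketch; `image` is the UIImage obtained from the capture, and passing nil for the target/selector ignores the completion callback):

```swift
// Write the captured image to the user's Saved Photos album;
// the nil target/selector means we don't observe the save result
UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
```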