iOS Development: Building a Camera App (Reposted from Apple's Official Documentation)
This brief code example illustrates how you can capture video and convert the frames you get into UIImage objects. It shows you how to:
- Create an AVCaptureSession object to coordinate the flow of data from an AV input device to an output
- Find the AVCaptureDevice object for the input type you want
- Create an AVCaptureDeviceInput object for the device
- Create an AVCaptureVideoDataOutput object to produce video frames
- Implement a delegate for the AVCaptureVideoDataOutput object to process video frames
- Implement a function to convert the CMSampleBuffer received by the delegate into a UIImage object
Note: To focus on the most relevant code, this example omits several aspects of a complete application, including memory management. To use AV Foundation, you are expected to have enough experience with Cocoa to be able to infer the missing pieces.
Create and Configure a Capture Session
You use an AVCaptureSession object to coordinate the flow of data from an AV input device to an output. Create a session, and configure it to produce medium resolution video frames.
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetMedium;
Create and Configure the Device and Device Input
Capture devices are represented by AVCaptureDevice objects; the class provides methods to retrieve an object for the input type you want. A device has one or more ports, configured using an AVCaptureInput object. Typically, you use the capture input in its default configuration.
Find a video capture device, then create a device input with the device and add it to the session.
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input) {
// Handle the error appropriately.
}
[session addInput:input];
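Depending on the session configuration, adding an input can fail. As a hedged variant not in the original sample, you can guard the call with AVCaptureSession's canAddInput: method:
if ([session canAddInput:input]) {
[session addInput:input];
}
else {
// The input cannot be added (for example, the preset is incompatible).
}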
Create and Configure the Data Output
You use an AVCaptureVideoDataOutput object to process uncompressed frames from the video being captured. You typically configure several aspects of an output. For video, for example, you can specify the pixel format using the videoSettings property, and cap the frame rate by setting the minFrameDuration property.
Create and configure an output for video data and add it to the session; cap the frame rate to 15 fps by setting the minFrameDuration property to 1/15 second:
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
[session addOutput:output];
output.videoSettings = @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
output.minFrameDuration = CMTimeMake(1, 15);
The data output object uses delegation to vend the video frames. The delegate must adopt the AVCaptureVideoDataOutputSampleBufferDelegate protocol. When you set the data output’s delegate, you must also provide a queue on which callbacks should be invoked.
dispatch_queue_t queue = dispatch_queue_create("MyQueue", NULL);
[output setSampleBufferDelegate:self queue:queue];
dispatch_release(queue);
You use the queue to modify the priority given to delivering and processing the video frames.
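As a hedged sketch not in the original sample, one way to influence that priority on iOS 8 and later is to create the delegate queue with an explicit quality-of-service class:
// Serial queue with utility QoS so frame processing yields to UI work.
dispatch_queue_attr_t attr =
dispatch_queue_attr_make_with_qos_class(DISPATCH_QUEUE_SERIAL, QOS_CLASS_UTILITY, 0);
dispatch_queue_t queue = dispatch_queue_create("MyQueue", attr);
[output setSampleBufferDelegate:self queue:queue];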
Implement the Sample Buffer Delegate Method
In the delegate class, implement the captureOutput:didOutputSampleBuffer:fromConnection: method, which is called whenever a sample buffer is written. The video data output object delivers frames as CMSampleBuffer opaque types, so you need to convert from the CMSampleBuffer to a UIImage object. The function for this operation is shown in “Converting a CMSampleBuffer to a UIImage.”
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
UIImage *image = imageFromSampleBuffer(sampleBuffer);
// Add your code here that uses the image.
}
Remember that the delegate method is invoked on the queue you specified in setSampleBufferDelegate:queue:; if you want to update the user interface, you must invoke any relevant code on the main thread.
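For example, a minimal sketch of hopping to the main thread from the delegate callback, where imageView is a hypothetical UIImageView in your view controller:
dispatch_async(dispatch_get_main_queue(), ^{
// UI updates must happen on the main thread.
imageView.image = image;
});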
Starting and Stopping Recording
After configuring the capture session, you send it a startRunning message to start the recording.
[session startRunning];
To stop recording, you send the session a stopRunning message.
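[session stopRunning];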
DEMO Code
A note on running this demo: inside - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection, converting the captured frame to a UIImage and passing the UIImage out would not display the image no matter what. After some investigation, converting it to NSData instead and passing that out worked fine, which is worth pointing out.
// Create and configure a capture session and start it running
- (void)setupCaptureSession
{
NSError *error = nil;
// Create the session
AVCaptureSession *session = [[AVCaptureSession alloc] init];
// Configure the session to produce lower resolution video frames, if your
// processing algorithm can cope. Here we specify low quality for the
// chosen device.
session.sessionPreset = AVCaptureSessionPresetLow;
// Find a suitable AVCaptureDevice
AVCaptureDevice *device = [AVCaptureDevice
defaultDeviceWithMediaType:AVMediaTypeVideo];
// Create a device input with the device and add it to the session.
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
error:&error];
if (!input) {
// Handle the error appropriately.
}
[session addInput:input];
// Create a VideoDataOutput and add it to the session
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
[session addOutput:output];
// Configure your output.
dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
[output setSampleBufferDelegate:self queue:queue];
// Not needed under ARC:
// dispatch_release(queue);
// Specify the pixel format
output.videoSettings =
[NSDictionary dictionaryWithObject:
[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
forKey:(id)kCVPixelBufferPixelFormatTypeKey];
// Add a preview layer so the camera feed is visible on screen
AVCaptureVideoPreviewLayer *previewLayer = nil;
previewLayer = [[[AVCaptureVideoPreviewLayer alloc] initWithSession:session] autorelease];
[previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
CGRect layerRect = [[[self view] layer] bounds];
[previewLayer setBounds:layerRect];
[previewLayer setPosition:CGPointMake(CGRectGetMidX(layerRect),CGRectGetMidY(layerRect))];
[[[self view] layer] addSublayer:previewLayer];
// If you wish to cap the frame rate to a known value, such as 15 fps, set
// minFrameDuration.
// output.minFrameDuration = CMTimeMake(1, 15);
// Start the session running to start the flow of data
[session startRunning];
sessionGlobal = session;
// Assign session to an ivar.
// [self setSession:session];
isCapture = FALSE;
// The frame values were lost from the original listing; placeholders assumed:
UIView *v = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 100, 100)];
v.backgroundColor = [UIColor blueColor];
v.layer.masksToBounds = YES;
v1 = [v retain];
[self.view addSubview:v];
// [v release];
start = [[NSDate date] timeIntervalSince1970];
before = start;
num = 0;
}
- (NSTimeInterval)getTimeFromStart
{
NSDate *dat = [NSDate dateWithTimeIntervalSinceNow:0];
// A multiplier after timeIntervalSince1970 was lost from the original
// listing; plain seconds are assumed here so the units match `start`.
NSTimeInterval now = [dat timeIntervalSince1970];
NSTimeInterval b = now - start;
return b;
}
- (void)showImage:(NSData *)topImageData
{
if(num > 10) // frame-count cap; the original value was lost, 10 assumed
{
[sessionGlobal stopRunning];
return;
}
num ++;
NSString *numStr = [NSString stringWithFormat:@"%d.jpg", num];
NSString *path = [NSHomeDirectory() stringByAppendingPathComponent:numStr];
NSLog(@"PATH : %@", path);
[topImageData writeToFile:path atomically:YES];
// The frame values were lost from the original listing; placeholders assumed:
UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 100, 100)];
imageView.layer.masksToBounds = YES;
imageView.backgroundColor = [UIColor redColor];
UIImage *img = [[UIImage alloc] initWithData:topImageData];
imageView.image = img;
[img release];
[self.view addSubview:imageView];
[imageView release];
[self.view setNeedsDisplay];
// [v1 setNeedsDisplay];
}
// Delegate routine that is called when a sample buffer was written
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
fromConnection:(AVCaptureConnection *)connection
{
NSDate *dat = [NSDate dateWithTimeIntervalSinceNow:0];
// As in getTimeFromStart, the lost multiplier is dropped and seconds are
// assumed so the units match `before`.
NSTimeInterval now = [dat timeIntervalSince1970];
NSLog(@" before: %f num: %f", before, now - before);
if((now - before) > 1) // capture interval; original threshold lost, 1 second assumed
{
before = [[NSDate date] timeIntervalSince1970];
// Create a UIImage from the sample buffer data
UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
if(image != nil)
{
// NSTimeInterval t = [self getTimeFromStart];
NSData* topImageData = UIImageJPEGRepresentation(image, 1.0);
[self performSelectorOnMainThread:@selector(showImage:) withObject:topImageData waitUntilDone:NO];
}
}
}
// Create a UIImage from sample buffer data
- (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer
{
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
// Lock the base address of the pixel buffer
CVPixelBufferLockBaseAddress(imageBuffer, 0);
// Get the number of bytes per row for the pixel buffer
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
// Get the pixel buffer width and height
size_t width = CVPixelBufferGetWidth(imageBuffer);
size_t height = CVPixelBufferGetHeight(imageBuffer);
// Create a device-dependent RGB color space
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
if (!colorSpace)
{
NSLog(@"CGColorSpaceCreateDeviceRGB failure");
return nil;
}
// Get the base address of the pixel buffer
void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
// Get the data size for contiguous planes of the pixel buffer.
size_t bufferSize = CVPixelBufferGetDataSize(imageBuffer);
// Create a Quartz direct-access data provider that uses data we supply
CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, baseAddress, bufferSize,
NULL);
// Create a bitmap image from data supplied by our data provider
CGImageRef cgImage =
CGImageCreate(width,
height,
8,   // bits per component
32,  // bits per pixel (32BGRA)
bytesPerRow,
colorSpace,
kCGImageAlphaNoneSkipFirst | kCGBitmapByteOrder32Little,
provider,
NULL,
true,
kCGRenderingIntentDefault);
CGDataProviderRelease(provider);
CGColorSpaceRelease(colorSpace);
// Create and return an image object representing the specified Quartz image
UIImage *image = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);
CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
return image;
}
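One caveat not called out in the original: the bitmap parameters above are only valid for the 32BGRA format configured on the output earlier. As a hedged sketch, you could verify the format right after CMSampleBufferGetImageBuffer returns, before building the image:
OSType format = CVPixelBufferGetPixelFormatType(imageBuffer);
if (format != kCVPixelFormatType_32BGRA) {
// The CGImageCreate parameters in this method assume 32BGRA.
return nil;
}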
Stopping Capture
- (void)stopVideoCapture:(id)arg
{
// Stop the camera capture
if (self->avCaptureSession) {
[self->avCaptureSession stopRunning];
self->avCaptureSession = nil;
// The original listing ran the receiver and selector together
// ("labelStatesetText:"); `labelState` is the presumed ivar name.
[labelState setText:@"Video capture stopped"];
}
}