Video Capture in Mobile Real-Time Video Communication

Date: 2022-03-31 06:15:07

1. Introduction

A complete real-time network video communication system consists of video capture, video encoding, video transmission, video decoding, and playback. On the capture side, most video encoders expect raw input video in YUV420 format. YUV420 is one of the YUV family of formats; YUV has three components: Y carries luminance (the grayscale value), while U and V carry chrominance, describing the color and saturation of the image. In planar YUV420 the components are laid out as YYYYYYYYUUVV. If the YUV data produced by the capture stage does not use this layout, it must be converted before being fed to the encoder. On Android phones, the YUV data captured from the camera is in NV21 format, laid out as YYYYYYYYVUVU; on iPhones, the camera delivers NV12, laid out as YYYYYYYYUVUV. On both Android and iPhone, therefore, the captured data has to be rearranged into the planar YUV420 layout before it can be encoded and transmitted.
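To make the layout difference concrete, below is a minimal sketch, in Java, of the NV21-to-I420 repacking an Android app would perform on each captured frame (the NV12 case on iOS is the same except that U and V are swapped in the interleaved plane). The helper name nv21ToI420 is illustrative, not part of any SDK, and the sketch assumes even frame dimensions and a tightly packed input buffer.

// Minimal sketch: repack an NV21 frame (YYYY... VUVU...) into I420 (YYYY... UU... VV...).
// Assumes width and height are even and the input buffer is tightly packed.
public static void nv21ToI420(byte[] nv21, byte[] i420, int width, int height) {
    int ySize = width * height;
    int uvCount = ySize / 4;                        // number of U (and of V) samples

    // The Y plane is identical in both layouts.
    System.arraycopy(nv21, 0, i420, 0, ySize);

    int uIndex = ySize;                             // start of the U plane in I420
    int vIndex = ySize + uvCount;                   // start of the V plane in I420
    for (int i = 0; i < uvCount; i++) {
        i420[vIndex + i] = nv21[ySize + 2 * i];     // NV21 interleaves V first...
        i420[uIndex + i] = nv21[ySize + 2 * i + 1]; // ...then U
    }
}

This is the kind of conversion that would go inside onPreviewFrame in the Android example below, before the frame is handed to the encoder.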

 

2. Android Video Capture

On Android, each frame of the video stream can be captured in real time through the onPreviewFrame callback of Camera.PreviewCallback. There are three different ways to use this callback on a Camera object:

setPreviewCallback(Camera.PreviewCallback): onPreviewFrame is called whenever a new preview frame is displayed on the screen.

setOneShotPreviewCallback(Camera.PreviewCallback): onPreviewFrame is called once, when the next preview frame becomes available.

setPreviewCallbackWithBuffer(Camera.PreviewCallback): introduced in Android 2.2. It works the same way as setPreviewCallback, but requires a byte array to be supplied as a buffer for the preview frame data. This allows the memory used while processing preview frames to be managed more carefully, avoiding repeated allocation and deallocation.

Example code

 

import java.io.IOException;

import android.app.Activity;
import android.graphics.ImageFormat;
import android.hardware.Camera;
import android.hardware.Camera.Parameters;
import android.os.Bundle;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.Window;

public class VideoActivity extends Activity implements SurfaceHolder.Callback, Camera.PreviewCallback {

    static int screenWidth = 0;
    static int screenHeight = 0;
    private SurfaceView mSurfaceview = null;
    private SurfaceHolder mSurfaceHolder = null;
    private Camera mCamera = null;
    private byte yuv_frame[];
    private Parameters mParameters;
    static int mwidth = 320;
    static int mheight = 240;

    // Setup
    protected void onCreate(Bundle savedInstanceState) {
        requestWindowFeature(Window.FEATURE_NO_TITLE);
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_video_capture);
        mSurfaceview = (SurfaceView) findViewById(R.id.surfaceview);
        mSurfaceHolder = mSurfaceview.getHolder();
        mSurfaceHolder.addCallback(this);
        screenWidth = getWindowManager().getDefaultDisplay().getWidth();
        screenHeight = getWindowManager().getDefaultDisplay().getHeight();
        mSurfaceHolder.setFixedSize(screenWidth, screenHeight);
    }

    void startcapture() {
        try {
            if (mCamera == null) {
                mCamera = Camera.open();
            }
            mCamera.stopPreview();
            mParameters = mCamera.getParameters();
            mParameters.setPreviewSize(mwidth, mheight);  // set video resolution
            mParameters.setPreviewFrameRate(15);          // set frame rate
            mCamera.setParameters(mParameters);
            int mformat = mParameters.getPreviewFormat();
            int bitsperpixel = ImageFormat.getBitsPerPixel(mformat);
            // buffer to store NV21 preview data
            yuv_frame = new byte[mwidth * mheight * bitsperpixel / 8];
            mCamera.addCallbackBuffer(yuv_frame);
            mCamera.setPreviewDisplay(mSurfaceHolder);
            mCamera.setPreviewCallbackWithBuffer(this);   // set callback for camera
            mCamera.startPreview();                       // triggers the onPreviewFrame callback
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        // add code here to process the captured data; its layout is YYYYYYYYVUVU (NV21)
        // and should be converted to YYYYYYYYUUVV (I420) before encoding
        camera.addCallbackBuffer(yuv_frame);
    }

    protected void onPause() {
        super.onPause();
    }

    public void onResume() {
        super.onResume();
    }

    protected void onDestroy() {
        if (mCamera != null) {
            mCamera.setPreviewCallback(null);
            mCamera.stopPreview();  // stop capturing video data
            mCamera.release();
            mCamera = null;
        }
        super.onDestroy();
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width_size, int height_size) {
        startcapture();  // start capturing NV21 video data
    }
}
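As a usage illustration, the body of onPreviewFrame could be filled in as sketched below, reusing the hypothetical nv21ToI420 helper from the introduction; the converted buffer would then be handed to the video encoder, which is outside the scope of this article.

    // Hypothetical sketch only: one reusable I420 buffer plus the conversion call.
    private final byte[] i420_frame = new byte[mwidth * mheight * 3 / 2];

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        // data holds one NV21 frame (YYYYYYYYVUVU); repack it into I420 (YYYYYYYYUUVV)
        nv21ToI420(data, i420_frame, mwidth, mheight);

        // ... pass i420_frame to the video encoder here ...

        // return the buffer to the camera so it can be reused for the next frame
        camera.addCallbackBuffer(data);
    }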

 

3. iOS Video Capture

On iOS, real-time video capture begins by initializing an AVCaptureSession object, which coordinates the flow of data from AV input devices to outputs. Next, an AVCaptureDeviceInput object is initialized and added to the AVCaptureSession by calling addInput. Then an AVCaptureVideoDataOutput object is initialized and added to the session by calling addOutput. Once the AVCaptureVideoDataOutput has been set up, video frames can be obtained through the delegate method captureOutput:didOutputSampleBuffer:fromConnection:, and the delegate must adopt the AVCaptureVideoDataOutputSampleBufferDelegate protocol.

Example code

#import <AVFoundation/AVFoundation.h>

// The methods below are assumed to belong to a class (e.g. a view controller)
// that adopts the AVCaptureVideoDataOutputSampleBufferDelegate protocol.

int frame_rate = 15;
int mWidth = 640;
int mHeight = 480;
AVCaptureDeviceInput *videoInput = nil;
AVCaptureVideoDataOutput *avCaptureVideoDataOutput = nil;
AVCaptureSession *mCaptureSession = nil;
AVCaptureDevice *mCaptureDevice = nil;

- (void)startCapture
{
    if (mCaptureDevice || mCaptureSession) {
        NSLog(@"Already capturing");
        return;
    }
    // prefer the front camera, fall back to the default video device
    NSArray *cameras = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in cameras) {
        if (device.position == AVCaptureDevicePositionFront) {
            mCaptureDevice = device;
        }
    }
    if (mCaptureDevice == nil) {
        mCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    }
    if (mCaptureDevice == nil) {
        NSLog(@"Failed to get valid capture device");
        return;
    }
    NSError *error = nil;
    videoInput = [[AVCaptureDeviceInput deviceInputWithDevice:mCaptureDevice error:&error] retain];
    if (!videoInput) {
        NSLog(@"Failed to get video input");
        mCaptureDevice = nil;
        return;
    }
    mCaptureSession = [[AVCaptureSession alloc] init];
    mCaptureSession.sessionPreset = AVCaptureSessionPreset640x480; // set video resolution
    [mCaptureSession addInput:videoInput];

    avCaptureVideoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    // request NV12 (bi-planar 4:2:0) pixel buffers
    NSDictionary *settings = [[NSDictionary alloc] initWithObjectsAndKeys:
        [NSNumber numberWithUnsignedInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange], (id)kCVPixelBufferPixelFormatTypeKey,
        [NSNumber numberWithInt:mWidth], (id)kCVPixelBufferWidthKey,
        [NSNumber numberWithInt:mHeight], (id)kCVPixelBufferHeightKey,
        nil];
    avCaptureVideoDataOutput.videoSettings = settings;
    [settings release];
    avCaptureVideoDataOutput.minFrameDuration = CMTimeMake(1, frame_rate); // set video frame rate
    avCaptureVideoDataOutput.alwaysDiscardsLateVideoFrames = YES;

    dispatch_queue_t queue_ = dispatch_queue_create("www.easemob.com", NULL);
    [avCaptureVideoDataOutput setSampleBufferDelegate:self queue:queue_];
    [mCaptureSession addOutput:avCaptureVideoDataOutput];
    dispatch_release(queue_);

    [mCaptureSession startRunning]; // start delivering frames to the delegate
}

- (void)stopCapture
{
    if (mCaptureSession) {
        [mCaptureSession stopRunning];
        [mCaptureSession removeInput:videoInput];
        [mCaptureSession removeOutput:avCaptureVideoDataOutput];
        [avCaptureVideoDataOutput release], avCaptureVideoDataOutput = nil;
        [videoInput release], videoInput = nil;
        [mCaptureSession release], mCaptureSession = nil;
        mCaptureDevice = nil; // not retained, just clear the reference
    }
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    /* lock the buffer */
    if (CVPixelBufferLockBaseAddress(imageBuffer, 0) == kCVReturnSuccess) {
        UInt8 *bufferPtr  = (UInt8 *)CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0); // Y plane
        UInt8 *bufferPtr1 = (UInt8 *)CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 1); // interleaved UV plane
        size_t width  = CVPixelBufferGetWidth(imageBuffer);
        size_t height = CVPixelBufferGetHeight(imageBuffer);
        size_t bytesrow0 = CVPixelBufferGetBytesPerRowOfPlane(imageBuffer, 0);
        size_t bytesrow1 = CVPixelBufferGetBytesPerRowOfPlane(imageBuffer, 1);
        UInt8 *yuv420_data = (UInt8 *)malloc(width * height * 3 / 2); // buffer for YUV with layout YYYYYYYYUUVV

        /* convert NV12 data to planar YUV420 (I420) */
        UInt8 *pY  = bufferPtr;
        UInt8 *pUV = bufferPtr1;
        UInt8 *pU  = yuv420_data + width * height;
        UInt8 *pV  = pU + width * height / 4;
        for (int i = 0; i < height; i++) {
            memcpy(yuv420_data + i * width, pY + i * bytesrow0, width);
        }
        for (int j = 0; j < height / 2; j++) {
            for (int i = 0; i < width / 2; i++) {
                *(pU++) = pUV[i << 1];
                *(pV++) = pUV[(i << 1) + 1];
            }
            pUV += bytesrow1;
        }

        // add code to push yuv420_data to the video encoder here
        free(yuv420_data);

        /* unlock the buffer */
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    }
}

 

This article is copyrighted by Easemob (环信); please credit the source when reposting. For more technical articles, visit http://blog.easemob.com/