While working on an audio/video project, I needed to send an iOS screen-sharing feed to remote viewers. On iOS and Android clients this works fine: a notification is pushed down and the picture renders normally. On the WeChat Mini Program side, however, the video would not display, especially when the shared screen was a static picture; the Mini Program could not pull the stream and just showed a black screen. So I started looking for a workaround.
During iOS screen sharing, the screen data arrives mainly through the following callback:
- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType {
    //NSLog(@"sample buffer callback fired, type: %ld", (long)sampleBufferType);
    if (!_isBeginSendData) {
        _isBeginSendData = YES;
        // On one iPhone 6 test device this callback occasionally never fired,
        // so the pipeline is only created once the first callback has arrived.
        [[SmoothBroadcast sharedInstance] start];
    }
    switch (sampleBufferType) {
        case RPSampleBufferTypeVideo:
            // Handle video sample buffer
            [[SmoothBroadcast sharedInstance] sendVideoSampleBuffer:sampleBuffer];
            break;
//        case RPSampleBufferTypeAudioApp:
//            // Handle audio sample buffer for app audio
//            break;
//        case RPSampleBufferTypeAudioMic:
//            // Handle audio sample buffer for mic audio
//            break;
        default:
            break;
    }
}
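For context, processSampleBuffer:withType: is the override of ReplayKit's RPBroadcastSampleHandler inside a Broadcast Upload Extension, and the SmoothBroadcast singleton is this project's own wrapper around the WebRTC pipeline. A minimal handler skeleton (a sketch only; everything apart from the ReplayKit API names is illustrative) looks roughly like this:
#import <ReplayKit/ReplayKit.h>

// Principal class of the Broadcast Upload Extension.
@interface SampleHandler : RPBroadcastSampleHandler
@end

@implementation SampleHandler

- (void)broadcastStartedWithSetupInfo:(NSDictionary<NSString *, NSObject *> *)setupInfo {
    // Broadcast started; heavy setup is deferred until the first sample arrives (see above).
}

- (void)broadcastFinished {
    // Broadcast ended; a good place to stop timers and tear down the pipeline.
}

// processSampleBuffer:withType: as shown above.

@end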
The screen data is then wrapped into a video frame (RTCVideoFrame) and handed to WebRTC, which lets the remote side watch the shared screen. During testing, however, I found that when the iPhone screen shows a static picture, for example a photo in the Photos app, the system callback above stops delivering data. The WeChat Mini Program viewer then only sees the initial frames and afterwards goes black, because it can no longer pull the video stream.
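For reference, the _videoSource used in the code below is an RTCVideoSource from WebRTC's Objective-C framework. The post does not show how it is created; a minimal sketch (the track id "screen0" is just an illustrative value) could be:
#import <WebRTC/WebRTC.h>

// Create a factory, a video source and a video track for the screen-share stream.
RTCPeerConnectionFactory *factory = [[RTCPeerConnectionFactory alloc] init];
RTCVideoSource *videoSource = [factory videoSource];
RTCVideoTrack *videoTrack = [factory videoTrackWithSource:videoSource trackId:@"screen0"];
// Captured frames are pushed into the source via the RTCVideoCapturerDelegate method:
// [videoSource capturer:nil didCaptureVideoFrame:frame];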
I then looked into the RTCVideoFrame class and noticed a nanosecond-precision timestamp property:
/** Timestamp in nanoseconds. */
@property(nonatomic, assign) int64_t timeStampNs;
Logging this value showed that it stays constant while the screen is static and changes while the screen is moving. My guess is that WebRTC uses this nanosecond timestamp to decide whether a frame is new; if it never changes, no data is sent to the remote side. So the fix I tried: when the screen is static, overwrite this property with the current nanosecond timestamp (wrapped in a macro), and use a timer so that once no frame has arrived for a certain time the picture is treated as static and the timestamp is refreshed. Testing on the WeChat Mini Program side confirmed that the black screen was gone. Problem solved, happy~~~
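The SmoothMediaNOW macro used below is this project's shorthand for "the current time in nanoseconds". Its definition is not shown in the post; something along the following lines would work (an assumed definition, not the original one):
#import <QuartzCore/QuartzCore.h>

// Current media time converted to nanoseconds (assumed definition).
#define SmoothMediaNOW ((int64_t)(CACurrentMediaTime() * NSEC_PER_SEC))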
Here is the code that handles this:
- (int)sendVideoSampleBuffer:(CMSampleBufferRef)sampleBuffer {
//    if (!_videoProducer) {
//        SmoothLogD("screen share: video producer not ready");
//        return -1025;
//    }
    // Ignore invalid or not-yet-ready sample buffers.
    if (CMSampleBufferGetNumSamples(sampleBuffer) != 1 || !CMSampleBufferIsValid(sampleBuffer) ||
        !CMSampleBufferDataIsReady(sampleBuffer)) {
        SmoothLogD("screen share: invalid sample buffer");
        return 0;
    }
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (pixelBuffer == NULL) {
        SmoothLogD("screen share: no pixel buffer");
        return 0;
    }
    // Update the video frame rotation from the sample buffer's orientation attachment.
    if (@available(iOS 11.0, *)) {
        CGImagePropertyOrientation orientation = (CGImagePropertyOrientation)((__bridge NSNumber *)CMGetAttachment(sampleBuffer, (__bridge CFStringRef)RPVideoSampleOrientationKey, NULL)).unsignedIntValue;
        [self changeRotationWithOrientation:orientation];
    } else {
        // Fallback on earlier versions
    }
    @autoreleasepool {
//        dispatch_semaphore_wait(self.frameLock, DISPATCH_TIME_FOREVER);
        int bufferWidth  = (int)CVPixelBufferGetWidth(pixelBuffer);
        int bufferHeight = (int)CVPixelBufferGetHeight(pixelBuffer);
        // Re-adapt the output format to the actual buffer size.
        [_videoSource adaptOutputFormatToWidth:bufferWidth height:bufferHeight fps:(int)_videoConfig.frameRate];
        RTCCVPixelBuffer *rtcPixelBuffer = [[RTCCVPixelBuffer alloc] initWithPixelBuffer:pixelBuffer];
        //int64_t timeStampNs = CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)) * 1000000000;
        int64_t timeStampNs = SmoothMediaNOW;
        RTCVideoFrame *videoFrame = [[RTCVideoFrame alloc] initWithBuffer:rtcPixelBuffer rotation:_rotation timeStampNs:timeStampNs];
        [_videoSource capturer:nil didCaptureVideoFrame:videoFrame];
        // Remember the last frame and its timestamp so the timer can re-send it.
        _lastVideoFrame = videoFrame;
        _lastTime = timeStampNs;
        //SmoothLogD("screen share: sent frame %@", videoFrame);
        rtcPixelBuffer = nil;
        videoFrame = nil;
        [self startFrameTimer];
//        dispatch_semaphore_signal(self.frameLock);
    }
    return 0;
}
#pragma mark - Start the frame timer
- (void)startFrameTimer {
    if (!_frameTimer) {
        // The getter below lazily creates the timer in a suspended state, so resume it here.
        dispatch_resume(self.frameTimer);
    }
}

#pragma mark - Stop the frame timer
- (void)stopFrameTimer {
    if (_frameTimer) {
        dispatch_cancel(_frameTimer);
        _frameTimer = nil;
    }
}

#pragma mark - Lazily created timer
- (dispatch_source_t)frameTimer {
    if (!_frameTimer) {
        _frameTimer = dispatch_source_create(DISPATCH_SOURCE_TYPE_TIMER, 0, DISPATCH_TIMER_STRICT, dispatch_get_global_queue(0, 0));
        dispatch_source_set_timer(_frameTimer, dispatch_walltime(NULL, 0), _frameInterval, 0);
        dispatch_source_set_event_handler(_frameTimer, ^{
            [self sendLastFrame];
        });
    }
    return _frameTimer;
}
#pragma mark - Re-send the last frame
/// While the shared screen is static the system callback delivers no data,
/// so the timer keeps sending the last frame with a fresh timestamp.
- (void)sendLastFrame {
//    dispatch_semaphore_wait(self.frameLock, DISPATCH_TIME_FOREVER);
    int64_t current = SmoothMediaNOW;  // nanoseconds
    int64_t elapse = current - _lastTime;
    if (elapse > 1000000000) {  // no frame from the system for over 1 second: treat the screen as static and re-send the last frame
        RTCVideoFrame *frame = _lastVideoFrame;
        if (frame) {
            //printf("current:%lld last:%lld\n", current, _lastTime);
            frame.timeStampNs = current;
            [_videoSource capturer:nil didCaptureVideoFrame:frame];
        }
    }
//    dispatch_semaphore_signal(self.frameLock);
}
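Two details the snippet above leaves out: how _frameInterval is chosen, and when the timer is stopped. A plausible interval is one tick per configured video frame, and the timer should be stopped when the broadcast ends so that stale frames are not re-sent forever. A sketch of that cleanup (the stop method and its call from broadcastFinished are assumptions, not part of the original code):
// Plausible timer interval: one tick per configured video frame (assumption).
// _frameInterval = (uint64_t)(NSEC_PER_SEC / _videoConfig.frameRate);

// Assumed teardown on SmoothBroadcast, e.g. invoked from the extension's broadcastFinished:
- (void)stop {
    [self stopFrameTimer];   // cancel the keep-alive timer
    _lastVideoFrame = nil;   // drop the cached frame so nothing stale is re-sent
    _lastTime = 0;
}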