Video Recording in Silverlight

Abstract: In the previous two Silverlight articles we covered the basics of Silverlight, working with the webcam and microphone, taking snapshots, and recording audio, and at the end I briefly explained why video recording was left out. Today we look at that remaining question: how to record video in Silverlight.

Main contents:

1. A brief introduction to the NESL project

2. Recording video with NESL

3. Notes

I. A Brief Introduction to the NESL Project

How do you record video in Silverlight? Quite a few people have searched for an answer to this question, but there still does not seem to be a good one, and the root cause is video encoding. Some suggest simply taking snapshots: capture enough frames per second and play them back in sequence to form a video. But I have seen someone record just a few dozen seconds this way and end up with a file in the hundred-megabyte range, even after optimization. That is not surprising: a single 640×480 ARGB frame is roughly 1.2 MB uncompressed, so even at 10 frames per second you accumulate on the order of 12 MB of raw image data every second. For now, then, this approach is not really practical for video recording. Is there a better way? Yes, but with a restriction: it relies on NESL.

Native Extensions for Silverlight (NESL) is developed by Microsoft's Silverlight team, and its main purpose is to extend what Silverlight out-of-browser (OOB) applications can do. As you know, although Silverlight 4 OOB applications support elevated trust, which lets them access COM components, the vast majority of Windows APIs still cannot be called; NESL exists precisely to fill that gap. The latest NESL 2.0 contains a lot of useful functionality, including the video encoding support we need today. One of its libraries, Microsoft.Silverlight.Windows.LocalEncode.dll, handles local video and audio encoding, and it is this library that we will use to solve the video recording problem described above.

II. Recording Video with NESL

The core class in Microsoft.Silverlight.Windows.LocalEncode.dll is EncodeSession, which handles encoding the audio and video and writing the output. Using EncodeSession to record video roughly takes the following two steps:

1. Prepare the input and output information

In this step you define VideoInputFormatInfo, AudioInputFormatInfo, VideoOutputFormatInfo, AudioOutputFormatInfo and OutputContainerInfo, and then call EncodeSession.Prepare().

2. Capture the video output

Once the input and output information is ready, call EncodeSession.Start() to begin encoding the output. To receive the audio and video data you also need two sink classes, derived from AudioSink and VideoSink respectively; each sink is attached to the CaptureSource, and in its OnSample(s) override it calls EncodeSession's WriteVideoSample() or WriteAudioSample() so the data is received and encoded (AudioSink was covered in an earlier article; VideoSink works the same way). A condensed sketch of these two steps is shown right below; the complete wiring is in the LocalCamera class that follows.
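Here is that condensed sketch, using the same default values that LocalCamera uses further down (640×480 ARGB32/PCM in, H.264/AAC in an MP4 container out). Treat it as illustration only; the hard-coded frame rate, size and file name are assumptions:

uint num = 0, den = 0;
FormatConstants.FrameRateToRatio(30f, ref num, ref den); // use the camera's actual frame rate here

VideoInputFormatInfo videoIn = new VideoInputFormatInfo
{
    SourceCompressionType = FormatConstants.VideoFormat_ARGB32,
    FrameWidthInPixels = 640,
    FrameHeightInPixels = 480,
    Stride = 640 * -4,              // negative stride, as in PrepareFormatInfo below
    FrameRateRatioNumerator = num,
    FrameRateRatioDenominator = den
};
AudioInputFormatInfo audioIn = new AudioInputFormatInfo
{
    SourceCompressionType = FormatConstants.AudioFormat_PCM,
    BitsPerSample = 16,
    SamplesPerSecond = 44100,
    ChannelCount = 1
};
VideoOutputFormatInfo videoOut = new VideoOutputFormatInfo
{
    TargetCompressionType = FormatConstants.VideoFormat_H264,
    FrameWidthInPixels = 640,
    FrameHeightInPixels = 480,
    FrameRateRatioNumerator = num,
    FrameRateRatioDenominator = den,
    AverageBitrate = 2000000
};
AudioOutputFormatInfo audioOut = new AudioOutputFormatInfo
{
    TargetCompressionType = FormatConstants.AudioFormat_AAC,
    AverageBitrate = 24000
};
OutputContainerInfo container = new OutputContainerInfo
{
    ContainerType = FormatConstants.TranscodeContainerType_MPEG4,
    // The output path must be known before recording starts.
    FilePath = System.IO.Path.Combine(
        Environment.GetFolderPath(Environment.SpecialFolder.MyVideos), "recording.tmp")
};

// Step 1: prepare the session with the input/output descriptions.
EncodeSession session = new EncodeSession();
session.Prepare(videoIn, audioIn, videoOut, audioOut, container);

// Step 2: start encoding; the sinks then push captured data into the session
// from their OnSample/OnSamples overrides:
session.Start(false, 200);
// session.WriteVideoSample(data, data.Length, sampleTime, duration);
// session.WriteAudioSample(data, data.Length, sampleTime, duration);
// ...and when finished:
// session.Shutdown();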

Now that we know how to use EncodeSession, let's wrap these operations in a simple class. LocalCamera.cs is the core class of this sample:

using System;
using System.Collections.ObjectModel;
using System.IO;
using System.Windows;
using System.Windows.Threading;
using System.Windows.Media;
using System.Windows.Controls;
using System.Windows.Shapes;
using Microsoft.Silverlight.Windows.LocalEncode;
 
namespace Cmj.MyWeb.MySilverlight.SilverlightMeida
{
    /// <summary>
    /// Encoding session state
    /// </summary>
    public enum EncodeSessionState
    {
        Start,
        Pause,
        Stop
    }
    /// <summary>
    /// Local camera object
    /// </summary>
    public class LocalCamera
    {
        private string _saveFullPath = "";
        private uint _videoWidth = 640;
        private uint _videoHeight = 480;
        private VideoSinkExtensions _videoSink = null;
        private AudioSinkExtensions _audioSink= null;
        private EncodeSession _encodeSession = null;
        private UserControl _page = null;
        private CaptureSource _cSource = null;
        public LocalCamera(UserControl page,VideoFormat videoFormat,AudioFormat audioFormat)
        {
            //this._saveFullPath = saveFullPath;
            this._videoWidth = (uint)videoFormat.PixelWidth;
            this._videoHeight = (uint)videoFormat.PixelHeight;
            this._page = page;
            this.SessionState = EncodeSessionState.Stop;
            //this._encodeSession = new EncodeSession();
            _cSource = new CaptureSource();
            this.VideoDevice = DefaultVideoDevice;
            this.VideoDevice.DesiredFormat = videoFormat;
            this.AudioDevice = DefaultAudioDevice;
            this.AudioDevice.DesiredFormat = audioFormat;
            _cSource.VideoCaptureDevice = this.VideoDevice;
            _cSource.AudioCaptureDevice = this.AudioDevice;
            audioInputFormatInfo = new AudioInputFormatInfo() { SourceCompressionType = FormatConstants.AudioFormat_PCM };
            videoInputFormatInfo = new VideoInputFormatInfo() { SourceCompressionType = FormatConstants.VideoFormat_ARGB32 };
            audioOutputFormatInfo = new AudioOutputFormatInfo() { TargetCompressionType = FormatConstants.AudioFormat_AAC };
            videoOutputFormatInfo = new VideoOutputFormatInfo() { TargetCompressionType = FormatConstants.VideoFormat_H264 };
            outputContainerInfo = new OutputContainerInfo() { ContainerType = FormatConstants.TranscodeContainerType_MPEG4 };
        }
 
        public LocalCamera(UserControl page,VideoCaptureDevice videoCaptureDevice,AudioCaptureDevice audioCaptureDevice, VideoFormat videoFormat, AudioFormat audioFormat)
        {
            //this._saveFullPath = saveFullPath;
            this._videoWidth = (uint)videoFormat.PixelWidth;
            this._videoHeight = (uint)videoFormat.PixelHeight;
            this._page = page;
            this.SessionState = EncodeSessionState.Stop;
            //this._encodeSession = new EncodeSession();
            _cSource = new CaptureSource();
            this.VideoDevice = videoCaptureDevice;
            this.VideoDevice.DesiredFormat = videoFormat;
            this.AudioDevice = audioCaptureDevice;
            this.AudioDevice.DesiredFormat = audioFormat;
            _cSource.VideoCaptureDevice = this.VideoDevice;
            _cSource.AudioCaptureDevice = this.AudioDevice;
            audioInputFormatInfo = new AudioInputFormatInfo() { SourceCompressionType = FormatConstants.AudioFormat_PCM };
            videoInputFormatInfo = new VideoInputFormatInfo() { SourceCompressionType = FormatConstants.VideoFormat_ARGB32 };
            audioOutputFormatInfo = new AudioOutputFormatInfo() { TargetCompressionType = FormatConstants.AudioFormat_AAC };
            videoOutputFormatInfo = new VideoOutputFormatInfo() { TargetCompressionType = FormatConstants.VideoFormat_H264 };
            outputContainerInfo = new OutputContainerInfo() { ContainerType = FormatConstants.TranscodeContainerType_MPEG4 };
        }
 
        public EncodeSessionState SessionState
        {
            get;
            set;
        }
        public EncodeSession Session
        {
            get
            {
                return _encodeSession;
            }
            set
            {
                _encodeSession = value;
            }
        }
        /// <summary>
        /// The user control that hosts this camera object
        /// </summary>
        public UserControl OwnPage
        {
            get
            {
                return _page;
            }
            set
            {
                _page = value;
            }
        }
        /// <summary>
        /// Capture source
        /// </summary>
        public CaptureSource Source
        {
            get
            {
                return _cSource;
            }
        }
        /// <summary>
        /// The audio sink object
        /// </summary>
        public AudioSinkExtensions AudioSink
        {
            get
            {
                return _audioSink;
            }
        }
 
        public static VideoCaptureDevice DefaultVideoDevice
        {
            get
            {
                return CaptureDeviceConfiguration.GetDefaultVideoCaptureDevice();
            }
        }
         
        public static ReadOnlyCollection<VideoCaptureDevice> AvailableVideoDevice
        {
            get
            {
                return CaptureDeviceConfiguration.GetAvailableVideoCaptureDevices();
            }
        }
 
        public VideoCaptureDevice VideoDevice
        {
            get;
            set;
        }
 
        public static AudioCaptureDevice DefaultAudioDevice
        {
            get
            {
                return CaptureDeviceConfiguration.GetDefaultAudioCaptureDevice();
            }
        }
        public static ReadOnlyCollection<AudioCaptureDevice> AvailableAudioDevice
        {
            get
            {
                return CaptureDeviceConfiguration.GetAvailableAudioCaptureDevices();
            }
        }
 
        public AudioCaptureDevice AudioDevice
        {
            get;
            set;
        }
 
        private Object lockObj = new object();
        internal VideoInputFormatInfo videoInputFormatInfo;
        internal AudioInputFormatInfo audioInputFormatInfo;
        internal VideoOutputFormatInfo videoOutputFormatInfo;
        internal AudioOutputFormatInfo audioOutputFormatInfo;
        internal OutputContainerInfo outputContainerInfo;
        /// <summary>
        /// Start video recording
        /// </summary>
        public void StartRecord()
        {
            lock (lockObj)
            {
                if (this.SessionState == EncodeSessionState.Stop)
                {
                    _videoSink = new VideoSinkExtensions(this);
                    _audioSink = new AudioSinkExtensions(this);
                    //_audioSink.VolumnChange += new AudioSinkExtensions.VolumnChangeHanlder(_audioSink_VolumnChange);
                    if (_encodeSession == null)
                    {
                        _encodeSession = new EncodeSession();
                    }
                    PrepareFormatInfo(_cSource.VideoCaptureDevice.DesiredFormat, _cSource.AudioCaptureDevice.DesiredFormat);
                    _encodeSession.Prepare(videoInputFormatInfo, audioInputFormatInfo, videoOutputFormatInfo, audioOutputFormatInfo, outputContainerInfo);
                    _encodeSession.Start(false, 200);
                    this.SessionState = EncodeSessionState.Start;
                }
            }
        }
        /// <summary>
        /// Volume level indicator
        /// </summary>
        /// <param name="sender"></param>
        /// <param name="e"></param>
        //void _audioSink_VolumnChange(object sender, VolumnChangeArgs e)
        //{
        //    this.OwnPage.Dispatcher.BeginInvoke(new Action(() =>
        //    {
        //        (
        //            this.OwnPage.Tag as ProgressBar).Value = e.Volumn;
        //    }));
        //}
 
        /// <summary>
        /// Pause recording
        /// </summary>
        public void PauseRecord()
        {
            lock (lockObj)
            {
                this.SessionState = EncodeSessionState.Pause;
                _encodeSession.Pause();
            }
        }
        /// <summary>
        /// Stop recording
        /// </summary>
        public void StopRecord()
        {
            lock (lockObj)
            {
                this.SessionState = EncodeSessionState.Stop;
                _encodeSession.Shutdown();
                _videoSink = null;
                _audioSink = null;
            }
        }
 
        /// <summary>
        /// Prepare the encoding format information
        /// </summary>
        /// <param name="videoFormat"></param>
        /// <param name="audioFormat"></param>
        private void PrepareFormatInfo(VideoFormat videoFormat, AudioFormat audioFormat)
        {
            uint FrameRateRatioNumerator = 0;
            uint FrameRateRationDenominator = 0;
            FormatConstants.FrameRateToRatio((float)Math.Round(videoFormat.FramesPerSecond, 2), ref FrameRateRatioNumerator, ref FrameRateRationDenominator);
 
            videoInputFormatInfo.FrameRateRatioNumerator = FrameRateRatioNumerator;
            videoInputFormatInfo.FrameRateRatioDenominator = FrameRateRationDenominator;
            videoInputFormatInfo.FrameWidthInPixels = _videoWidth;
            videoInputFormatInfo.FrameHeightInPixels = _videoHeight ;
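            // Stride = width * 4 bytes per ARGB32 pixel; the negative sign reflects the
            // row order of the incoming frames (presumably so the output is not flipped).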
            videoInputFormatInfo.Stride = (int)_videoWidth*-4;
 
            videoOutputFormatInfo.FrameRateRatioNumerator = FrameRateRatioNumerator;
            videoOutputFormatInfo.FrameRateRatioDenominator = FrameRateRationDenominator;
            videoOutputFormatInfo.FrameWidthInPixels = videoOutputFormatInfo.FrameWidthInPixels == 0 ? (uint)videoFormat.PixelWidth : videoOutputFormatInfo.FrameWidthInPixels;
            videoOutputFormatInfo.FrameHeightInPixels = videoOutputFormatInfo.FrameHeightInPixels == 0 ? (uint)videoFormat.PixelHeight : videoOutputFormatInfo.FrameHeightInPixels;
 
            audioInputFormatInfo.BitsPerSample = (uint)audioFormat.BitsPerSample;
            audioInputFormatInfo.SamplesPerSecond = (uint)audioFormat.SamplesPerSecond;
            audioInputFormatInfo.ChannelCount = (uint)audioFormat.Channels;
            if (outputContainerInfo.FilePath == null || outputContainerInfo.FilePath == string.Empty)
            {
                _saveFullPath=System.IO.Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.MyVideos), "cCameraRecordVideo.tmp");
            }
            outputContainerInfo.FilePath = _saveFullPath;
            //outputContainerInfo.FilePath = _saveFullPath;
            if (audioOutputFormatInfo.AverageBitrate == 0)
                audioOutputFormatInfo.AverageBitrate = 24000;
            if (videoOutputFormatInfo.AverageBitrate == 0)
                videoOutputFormatInfo.AverageBitrate = 2000000;
        }
 
        /// <summary>
        /// Start capture
        /// </summary>
        public void StartCapture()
        {
            if (CaptureDeviceConfiguration.AllowedDeviceAccess || CaptureDeviceConfiguration.RequestDeviceAccess())
            {
                _cSource.Start();
            }
        }
 
        /// <summary>
        /// Stop capture
        /// </summary>
        public void StopCapture()
        {
            _videoSink = null;
            _audioSink = null;
            _cSource.Stop();
        }
 
        /// <summary>
        /// Get a VideoBrush bound to the capture source
        /// </summary>
        /// <returns></returns>
        public VideoBrush GetVideoBrush()
        {
            VideoBrush vBrush = new VideoBrush();
            vBrush.SetSource(_cSource);
            return vBrush;
        }
 
        /// <summary>
        /// Get a Rectangle filled with the live video
        /// </summary>
        /// <returns></returns>
        public Rectangle GetVideoRectangle()
        {
            Rectangle rctg = new Rectangle();
            rctg.Width = this._videoWidth;
            rctg.Height = this._videoHeight;
            rctg.Fill = GetVideoBrush();
            return rctg;
        }
 
        /// <summary>
        /// Save the recorded video to a user-selected file
        /// </summary>
        public void SaveRecord()
        {
            if (_saveFullPath == string.Empty)
            {
                MessageBox.Show("尚未录制视频,无法进行保存!", "系统提示", MessageBoxButton.OK);
                return;
            }
            SaveFileDialog sfd = new SaveFileDialog
            {
                Filter = "MP4 Files (*.mp4)|*.mp4",
                DefaultExt = ".mp4",
                FilterIndex = 1
            };
 
            if ((bool)sfd.ShowDialog())
            {
                using (Stream stm=sfd.OpenFile())
                {
                    FileStream fs = new FileStream(_saveFullPath, FileMode.Open, FileAccess.Read);
                    try
                    {
                        byte[] buffer = new byte[fs.Length];
                        fs.Read(buffer, 0, (int)fs.Length);
                        stm.Write(buffer, 0, buffer.Length);
                        fs.Close();
                        File.Delete(_saveFullPath);
                    }
                    catch (IOException ioe)
                    {
                        MessageBox.Show("文件保存失败!错误信息如下:"+Environment.NewLine+ioe.Message,"系统提示",MessageBoxButton.OK);
                    }
                    stm.Close();
                }
            }
        }
    }
}

As mentioned above, we also need the two sinks:

 

using System;
using System.Windows.Media;
using System.Windows.Controls;
using Microsoft.Silverlight.Windows.LocalEncode;
 
namespace Cmj.MyWeb.MySilverlight.SilverlightMeida
{
    public class VideoSinkExtensions:VideoSink
    {
        //private UserControl _page;
        //private EncodeSession _session;
        private LocalCamera _localCamera;
        public VideoSinkExtensions(LocalCamera localCamera)
        {
            //this._page = page;
            this._localCamera = localCamera;
            //this._session = session;
            this.CaptureSource = _localCamera.Source;
        }
 
        protected override void OnCaptureStarted()
        {
             
        }
 
        protected override void OnCaptureStopped()
        {
 
        }
 
        protected override void OnFormatChange(VideoFormat videoFormat)
        {
 
        }
 
        protected override void OnSample(long sampleTimeInHundredNanoseconds, long frameDurationInHundredNanoseconds, byte[] sampleData)
        {
            if (_localCamera.SessionState == EncodeSessionState.Start)
            {
                _localCamera.OwnPage.Dispatcher.BeginInvoke(new Action<long, long, byte[]>((ts, dur, data) =>
                {
                    _localCamera.Session.WriteVideoSample(data, data.Length, ts, dur);
                }), sampleTimeInHundredNanoseconds, frameDurationInHundredNanoseconds, sampleData);
            }
        }
    }
}

  

using System;
using System.Windows.Media;
using System.Windows.Controls;
using Microsoft.Silverlight.Windows.LocalEncode;
 
 
namespace Cmj.MyWeb.MySilverlight.SilverlightMeida
{
    public class AudioSinkExtensions:AudioSink
    {
        private LocalCamera _localCamera;
        public AudioSinkExtensions(LocalCamera localCamera)
        {
            this._localCamera = localCamera;
            this.CaptureSource = _localCamera.Source;
 
        }
        protected override void OnCaptureStarted()
        {
             
        }
 
        protected override void OnCaptureStopped()
        {
 
        }
 
        protected override void OnFormatChange(AudioFormat audioFormat)
        {
 
        }
 
        protected override void OnSamples(long sampleTimeInHundredNanoseconds, long sampleDurationInHundredNanoseconds, byte[] sampleData)
        {
            if (_localCamera.SessionState == EncodeSessionState.Start)
            {
                _localCamera.OwnPage.Dispatcher.BeginInvoke(new Action<long, long, byte[]>((ts, dur, data) =>
                {
                    _localCamera.Session.WriteAudioSample(data, data.Length, ts, dur);
                }), sampleTimeInHundredNanoseconds, sampleDurationInHundredNanoseconds, sampleData);
 
                // Compute volume changes
                //for (int index = 0; index < sampleData.Length; index += 1)
                //{
                //    short sample = (short)((sampleData[index] << 8) | sampleData[index]);
                //    float sample32 = sample / 32768f;
                //    float maxValue = 0;
                //    float minValue = 0;
                //    maxValue = Math.Max(maxValue, sample32);
                //    minValue = Math.Min(minValue, sample32);
                //    float lastPeak = Math.Max(maxValue, Math.Abs(minValue));
                //    float micLevel = (100 - (lastPeak * 100)) * 10;
                //    OnVolumnChange(this, new VolumnChangeArgs() { Volumn=micLevel});
                //}
            }
        }
 
 
        /// <summary>
        /// Defines an event that reports volume changes
        /// </summary>
        /// <param name="sender"></param>
        /// <param name="e"></param>
        //public delegate void VolumnChangeHanlder(object sender, VolumnChangeArgs e);
        //public event VolumnChangeHanlder VolumnChange;
        //private void OnVolumnChange(object sender, VolumnChangeArgs e)
        //{
        //    if (VolumnChange != null)
        //    {
        //        VolumnChange(sender, e);
        //    }
        //}
    }
 
    //public class VolumnChangeArgs : EventArgs
    //{
    //    public float Volumn
    //    {
    //        get;
    //        internal set;
    //    }
    //}
}

With these three classes in place, let's build a simple UI and use LocalCamera to do the actual recording.


One thing to note is the save operation. In EncodeSession the output path actually has to be specified before recording starts (which is easy to understand: a long recording produces a very large file, and buffering it in memory until the user picks a location is not realistic), so the save method wrapped in LocalCamera is really just a read-and-delete of the temporary file. This example also uses the custom OOB window control from an earlier article; if it looks unfamiliar, please refer to that article. The calling code is sketched below:
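A minimal sketch of the calling code (the page name, the LayoutRoot panel and the button handlers are assumptions for illustration; adapt them to your own page and XAML):

using System;
using System.Windows;
using System.Windows.Controls;
using Cmj.MyWeb.MySilverlight.SilverlightMeida;

public partial class RecordPage : UserControl
{
    private LocalCamera _camera;

    public RecordPage()
    {
        InitializeComponent();
        // Default devices with their first supported format; pick whatever your hardware offers.
        _camera = new LocalCamera(this,
            LocalCamera.DefaultVideoDevice.SupportedFormats[0],
            LocalCamera.DefaultAudioDevice.SupportedFormats[0]);
    }

    private void btnPreview_Click(object sender, RoutedEventArgs e)
    {
        _camera.StartCapture();                               // asks for device access
        LayoutRoot.Children.Add(_camera.GetVideoRectangle()); // live preview
    }

    private void btnRecord_Click(object sender, RoutedEventArgs e)
    {
        _camera.StartRecord();
    }

    private void btnStop_Click(object sender, RoutedEventArgs e)
    {
        _camera.StopRecord();
    }

    private void btnSave_Click(object sender, RoutedEventArgs e)
    {
        _camera.SaveRecord();
    }
}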

OK, here are a few screenshots of the recording in action:

Recording in progress (screenshot)

Saving after stopping the recording (screenshot)

Playing back the recorded video (screenshot)

III. Notes:

1. The video sink and the audio sink each run on their own thread, separate from the UI thread; use the UI Dispatcher or a SynchronizationContext to marshal calls between threads (a SynchronizationContext-based alternative is sketched after this list).

2. The OnSample/OnSamples methods of both sinks must check the session state: the sinks start receiving samples as soon as they are created, while the EncodeSession has not been started yet, so without this check a COM exception is thrown.

3. The video width and height cannot be chosen arbitrarily; the NESL documentation calls this out explicitly, and arbitrary values will likewise cause an exception.

4. Finally, one more reminder: the recording above is based on NESL, so the application must run out of the browser (OOB) with elevated trust (a simple startup check is sketched after this list).
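For note 1, instead of the Dispatcher calls used in the sinks above, the UI thread's SynchronizationContext could be captured when the sink is created and used to post the samples. This is only an alternative sketch, not what the sample code does:

// using System.Threading;
// Captured in the sink's constructor, which runs on the UI thread.
private readonly SynchronizationContext _uiContext = SynchronizationContext.Current;

protected override void OnSample(long sampleTime, long frameDuration, byte[] sampleData)
{
    if (_localCamera.SessionState != EncodeSessionState.Start)
        return;

    // Marshal the call back to the UI thread, mirroring the Dispatcher approach above.
    _uiContext.Post(_ => _localCamera.Session.WriteVideoSample(
        sampleData, sampleData.Length, sampleTime, frameDuration), null);
}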
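For note 4, if you want to fail with a clear message instead of an obscure error, you can check at startup whether the application is actually running out-of-browser with elevated trust (a small defensive sketch, not part of the original sample):

// e.g. in Application_Startup or the page constructor
if (!Application.Current.IsRunningOutOfBrowser || !Application.Current.HasElevatedPermissions)
{
    MessageBox.Show("Please install and run this application out-of-browser with elevated trust;" +
                    " video recording is not available inside the browser.");
}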

Source code download: Video Recording in Silverlight
