The previous articles in this series covered connecting to an FMS server, building a playback application, and recording and replaying video online. Readers who have followed along should by now have a reasonable grasp of FMS and its common programming patterns. This article combines the techniques from those articles to build a real-time video chat application.
Implementing real-time video chat with FMS is actually quite simple: it comes down to working with live streams. In a one-way video chat, one end is the publisher and the other is the subscriber. In a two-way video chat, each end is both a publisher and a subscriber.
From a technical standpoint, one-way chat means one side publishes a stream while the other side plays it. Two-way chat means each side needs two streams: one for publishing and one for playback. In more formal terms, a client that creates a stream and sends it to the server is called a publisher, and a client that creates a stream to receive content is called a subscriber. When the same client both publishes and subscribes, it must create two streams: an outgoing stream and an incoming stream.
With that analysis in place, let's look at the implementation. As noted above, video chat boils down to one side publishing a real-time video stream and the other side playing it. As always, this starts with a connection to FMS:
private function onPublishClick(evt:MouseEvent):void
{
nc = new NetConnection();
// Register the status listener before calling connect() so the
// NetConnection.Connect.Success event cannot be missed.
nc.addEventListener(NetStatusEvent.NET_STATUS,onNetStatusHandler);
nc.connect("rtmp://localhost/LiveStreams");
}
Clicking the button connects (NetConnection) to the FMS server, and the video stream is then published (publish) to FMS. One point to note: the second parameter of publish() is "live", which marks the stream as real-time. A stream published as live does not generate a file on FMS, unlike "record", which records the stream into an .flv video file.
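As a quick reference, the second argument to NetStream.publish() selects the publishing mode. The stream name "demo" below is just a placeholder:

```actionscript
// Publishing modes for NetStream.publish(name, type):
ns.publish("demo", "live");   // real-time only; nothing is written on the server
ns.publish("demo", "record"); // streams live AND records to demo.flv on the server
ns.publish("demo", "append"); // streams live and appends to an existing demo.flv
```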
private function onNetStatusHandler(evt:NetStatusEvent):void
{
trace(evt.info.code);
if(evt.info.code=="NetConnection.Connect.Success")
{
ns=new NetStream(nc);
ns.addEventListener(NetStatusEvent.NET_STATUS,onNetStatusHandler);
ns.client=new CustomClient();
ns.attachCamera(cam);
ns.attachAudio(mic);
ns.publish(txtInput.text,"live");
}
}
The core of publishing is capturing the video and audio data, which is done through the static methods of Camera and Microphone respectively. Reference code:
public function PublishStream():void
{
btnPublish.label="Publish Video";
btnPublish.addEventListener(MouseEvent.CLICK,onPublishClick);
// Capture video and audio, and display the local video on the Flash stage
cam = Camera.getCamera();
mic = Microphone.getMicrophone();
video = new Video(320,240);
video.attachCamera(cam);
video.x=20;
video.y=20;
addChild(video);
}
With these steps the publishing side of the video chat is complete. The full example code is as follows:
package
{
import flash.net.*;
import flash.events.*;
import flash.display.*;
import flash.media.*;
import fl.controls.*;
public class PublishStream extends Sprite
{
private var video:Video;
private var nc:NetConnection;
private var ns:NetStream;
private var cam:Camera;
private var mic:Microphone;
public function PublishStream():void
{
btnPublish.label="Publish Video";
btnPublish.addEventListener(MouseEvent.CLICK,onPublishClick);
// Capture video and audio, and display the local video on the Flash stage
cam = Camera.getCamera();
mic = Microphone.getMicrophone();
video = new Video(320,240);
video.attachCamera(cam);
video.x=20;
video.y=20;
addChild(video);
}
private function onPublishClick(evt:MouseEvent):void
{
nc = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS,onNetStatusHandler);
nc.connect("rtmp://localhost/LiveStreams");
}
private function onNetStatusHandler(evt:NetStatusEvent):void
{
trace(evt.info.code);
if(evt.info.code=="NetConnection.Connect.Success")
{
ns=new NetStream(nc);
ns.addEventListener(NetStatusEvent.NET_STATUS,onNetStatusHandler);
ns.client=new CustomClient();
ns.attachCamera(cam);
ns.attachAudio(mic);
ns.publish(txtInput.text,"live");
}
}
}
}
The receiving side is even simpler than the publisher: open a NetConnection to the same FMS server the publisher uses, then play the real-time stream through a NetStream, and you are done. The code is straightforward, essentially assembled from fragments that appeared in earlier articles in this series:
package
{
import flash.net.*;
import flash.events.*;
import flash.display.*;
import flash.media.*;
public class LiveStream extends Sprite
{
private var video:Video;
private var nc:NetConnection;
private var ns:NetStream;
public function LiveStream():void
{
nc = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS,onNetStatusHandler);
nc.connect("rtmp://localhost/LiveStreams");
}
private function onNetStatusHandler(evt:NetStatusEvent):void
{
if(evt.info.code=="NetConnection.Connect.Success")
{
ns=new NetStream(nc);
ns.addEventListener(NetStatusEvent.NET_STATUS,onNetStatusHandler);
ns.client=new CustomClient();
video=new Video();
video.attachNetStream(ns);
ns.play("1111");// "1111" is the stream name, matching publish("1111","live") on the publishing side
addChild(video);
}
}
}
}
OK, with that, both ends of the video chat are complete. To make the chat two-way, each side simply adds one more stream, so that both ends publish video as well as receive it.
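As a rough sketch of that idea, assuming the same connection setup as above, a two-way client creates two NetStream objects on the one NetConnection. The stream names "myStream" and "peerStream" here are placeholders: each side publishes under its own name and plays the other side's name.

```actionscript
// Inside the NetConnection.Connect.Success handler of a two-way client:
var outStream:NetStream = new NetStream(nc); // outgoing (publish) stream
outStream.attachCamera(Camera.getCamera());
outStream.attachAudio(Microphone.getMicrophone());
outStream.publish("myStream", "live");

var inStream:NetStream = new NetStream(nc); // incoming (subscribe) stream
inStream.client = new CustomClient(); // handle metadata callbacks as above
var peerVideo:Video = new Video(320, 240);
peerVideo.attachNetStream(inStream);
inStream.play("peerStream");
addChild(peerVideo);
```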
Some readers may be wondering where CustomClient comes from. CustomClient exists to handle metadata, which can be used to attach extra properties to a live stream. That topic is beyond the scope of this article; interested readers can consult the official documentation.
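For completeness, a minimal CustomClient might look like the sketch below. onMetaData and onCuePoint are the callback names NetStream invokes on its client object; assigning a client prevents reference errors when the server sends metadata. For a live stream, which fields of info are populated depends on the publisher.

```actionscript
package
{
// Minimal client object for NetStream.client: receives metadata callbacks
// so that incoming onMetaData/onCuePoint calls do not throw reference errors.
public class CustomClient
{
    public function onMetaData(info:Object):void
    {
        trace("metadata received, width=" + info.width + " height=" + info.height);
    }
    public function onCuePoint(info:Object):void
    {
        trace("cue point, time=" + info.time + " name=" + info.name);
    }
}
}
```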
This article was reposted from beniao's 51CTO blog. Original link: http://blog.51cto.com/beniao/154094. For reprinting, please contact the original author.