Preface
This post documents what I learned debugging UVC on HiSilicon over the past few days. The chip and SDK used this time are the HiSilicon Hi3531DV100. The end result: two USB cameras plugged into two of the SoC's USB ports (on different root hubs), with the HDMI output showing a four-way split screen displaying the two camera feeds.
Articles I referred to:
https://blog.csdn.net/zhenglie110/article/details/89360312
https://blog.csdn.net/zhenglie110/article/details/89360423
https://blog.csdn.net/zhenglie110/article/details/89361644
http://bbs.ebaina.com/thread-37986-1-1.html
However, the articles above are not entirely correct, and everyone's requirements differ a little, so below I will explain everything again from the beginning.
Kernel changes
Configure menuconfig to add the drivers
make ARCH=arm CROSS_COMPILE=arm-hisiv500-linux- menuconfig
Device Drivers --->
[*] USB support --->
<*> Support for Host-side USB
[*] Enable USB persist by default
<*> xHCI HCD (USB 3.0) support
<*> xHCI support for Hisilicon SoCs
<*> EHCI HCD (USB 2.0) support
[*] Improved Transaction Translator scheduling
<*> Generic EHCI driver for a platform device
<*> OHCI HCD (USB 1.1) support
<*> OHCI support for PCI-bus USB controllers
<*> Generic OHCI driver for a platform device
<*> USB Mass Storage support
<*> USB Gadget Support --->
<*> Multimedia support --->
[*] Cameras/video grabbers support
[*] Media Controller API
[*] V4L2 sub-device userspace API
[*] Media USB Adapters --->
<*> USB Video Class (UVC)
[*] UVC input events device support
<*> GSPCA based webcams --->
[*] Media PCI Adapters --->
[*] V4L platform devices --->
<M> Marvell 88ALP01 (Cafe) CMOS Camera Controller support
<*> SoC camera support
<*> platform camera support
[*] Autoselect ancillary drivers (tuners, sensors, i2c, frontends)
The configuration above is all I currently have, and the drivers are built into the kernel. Some people online also enable Device Drivers -> PHY Subsystem -> Hisilicon Inno USB2 PHY support; in my testing, turning that option on causes the USB 2.0 ports to stop recognizing devices, and I have not figured out why.
Driver changes
Plug the USB camera into the board and watch the kernel messages:
usb 1-1: USB disconnect, device number 15
usb 1-1: new high-speed USB device number 16 using xhci-hcd
uvcvideo: quirks = 512
uvcvideo: Found UVC 1.00 device USB Camera (0bda:3035)
input: USB Camera as /devices/soc/11000000.xhci/usb1/1-1/1-1:1.0/input/input15
The log above already reports the VID and PID, so there is no need to go to the trouble of looking them up in Windows.
Modify the file linux-3.18.y\drivers\media\usb\uvc\uvc_driver.c
At the end of struct usb_device_id uvc_ids[], add an entry for your own USB device, modeled on the existing ones. If you do not, then when the device is plugged in, probe will match the default id_table entry, i.e. the "Generic USB Video Class" entry at the end of uvc_ids:
/* my test USB Camera */
{
.match_flags = USB_DEVICE_ID_MATCH_DEVICE| USB_DEVICE_ID_MATCH_INT_INFO,
.idVendor = 0x0bda,
.idProduct = 0x3035,
.bInterfaceClass = USB_CLASS_VIDEO,
.bInterfaceSubClass = 1,
.bInterfaceProtocol = 0,
.driver_info = UVC_QUIRK_RESTRICT_FRAME_RATE},
/* Generic USB Video Class */
{ USB_INTERFACE_INFO(USB_CLASS_VIDEO, 1, 0) },
Note the driver_info assignment: it can be used to restrict the frame rate. UVC_QUIRK_RESTRICT_FRAME_RATE has the value 512. The setting seems to be related to bandwidth; I have not dug into it, but if it is set too low no image comes out. Keep in mind that USB 2.0 tops out at 480 Mbit/s, which is barely enough for even one camera.
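For reference, in the linux-3.18 kernel the uvcvideo quirk bits should be defined in drivers/media/usb/uvc/uvcvideo.h; the one used above is the following (0x200 is 512 decimal, matching the "uvcvideo: quirks = 512" line in the probe log):
/* drivers/media/usb/uvc/uvcvideo.h */
#define UVC_QUIRK_RESTRICT_FRAME_RATE 0x00000200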
sample_uvc
Straight to the sample code. The SDK does not ship a UVC example; this one is my own implementation. The code is uploaded as an attachment and includes the following files:
- sample_comm.h
- sample_comm_sys.c
- sample_comm_vdec.c
- sample_comm_vo.c
- sample_comm_vpss.c
- sample_uvc.c
Download link: https://download.csdn.net/download/whitefish520/13216318
There is one mistake I only noticed after uploading; the line should read: tv_fmt.fmt.pix.height = uvcParam.u32inputHeight;
Only sample_uvc.c is pasted below, because the other files are essentially what the SDK already provides, almost unchanged. sample_uvc.c is the fruit of these past few days, written from scratch. Explanations of some of the settings in the program are in the next section; if the code is hard to follow, read the walkthrough further down first.
/******************************************************************************
A simple program of Hisilicon Hi35xx video input and output implementation.
Copyright (C), 2014-2015, Hisilicon Tech. Co., Ltd.
******************************************************************************
Modification: 2015-1 Created
******************************************************************************/
#ifdef __cplusplus
#if __cplusplus
extern "C"{
#endif
#endif /* End of #ifdef __cplusplus */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>
#include <pthread.h>
#include <signal.h>
#include <fcntl.h>
#include <errno.h>
#include <sys/types.h>
#include <sys/stat.h>
#include <sys/ioctl.h>
#include <sys/mman.h>
#include <sys/prctl.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>
#include "sample_comm.h"
#include "hi_comm_vb.h"
#include "mpi_vb.h"
// usb video camera number
#define UVC_NUM 2
// save the stream to a file (leave off normally so the flash does not fill up)
#define SAVE_FILE 0
// video memory
typedef struct{
void *start;
int length;
}uvc_buf_t;
struct hiUvcParam
{
HI_S32 fd[UVC_NUM];
HI_CHAR cFileName[UVC_NUM][128];
HI_CHAR cNodeName[UVC_NUM][128];
HI_U32 u32inputWidth;
HI_U32 u32inputHeight;
HI_U32 u32Width;
HI_U32 u32Height;
HI_U32 pixelformat;
HI_U32 V4L2_buffer_num;
HI_U32 buffer_num[UVC_NUM];
uvc_buf_t *uvc_buf[UVC_NUM];
HI_U32 u32BlkSize;
HI_U32 u32BlkCnt;
VB_POOL VbPool;
VB_BLK VbBlk;
HI_U32 u32phyAddr;
HI_U8 *pVirAddr;
VDEC_CHN VdChn[UVC_NUM];
VPSS_GRP VpssGrp[UVC_NUM];
VPSS_CHN VpssChn;
VO_DEV VoDev;
VO_CHN VoChn[UVC_NUM];
VO_LAYER VoLayer;
SAMPLE_VO_MODE_E enMode;
HI_BOOL pthRun;
pthread_t ptuvc;
}uvcParam;
/******************************************************************************
* function : packed YUV422 (YUYV) to semi-planar YUV422 (NV16)
* YUYVYUYV -> YYYYUVUV
******************************************************************************/
HI_VOID yuv422p_to_yuv422sp(HI_U8* yuv, HI_S32 width, HI_S32 height)
{
HI_S32 i, j, k;
HI_U8 yuv422p[width*height*2];
memcpy(yuv422p, yuv, width*height*2);
HI_U8* y = yuv;
HI_U8* uv = &yuv[width*height];
for(i=0, j=0, k=0; i<width*height*2; i+=4, j+=2, k+=2)
{
y[j] = yuv422p[i];
y[j+1] = yuv422p[i+2];
uv[k] = yuv422p[i+3];
uv[k+1] = yuv422p[i+1];
}
}
/******************************************************************************
* function : usb video camera parameter set
******************************************************************************/
HI_S32 HI_UVC_Param_Set(HI_VOID)
{
HI_S32 i, j=0;
for(i=0; i<UVC_NUM; i++)
{
while(j < 64)
{
sprintf(uvcParam.cNodeName[i], "/dev/video%d", j++);
if(0 == access(uvcParam.cNodeName[i], F_OK))
break;
}
sprintf(uvcParam.cFileName[i], "/app/uvc/uvcVideo%d.yuv", i);
uvcParam.fd[i] = -1;
uvcParam.buffer_num[i] = 0;
uvcParam.uvc_buf[i] = NULL;
uvcParam.pVirAddr = NULL;
uvcParam.VdChn[i] = i;
uvcParam.VpssGrp[i] = i;
uvcParam.VpssChn = i;
}
uvcParam.u32inputWidth = HD_WIDTH;
uvcParam.u32inputHeight = HD_HEIGHT;
uvcParam.u32Width = HD_WIDTH;
uvcParam.u32Height = HD_HEIGHT;
//uvcParam.pixelformat = V4L2_PIX_FMT_YUYV;
uvcParam.pixelformat = V4L2_PIX_FMT_MJPEG;
uvcParam.V4L2_buffer_num = 10;
uvcParam.u32BlkSize = uvcParam.u32Width * uvcParam.u32Height * 2;
uvcParam.u32BlkCnt = 15;
uvcParam.VpssChn = 0;
uvcParam.enMode = VO_MODE_4MUX;
uvcParam.pthRun = HI_TRUE;
if(j >= 64)
return HI_FAILURE;
else
return HI_SUCCESS;
}
/******************************************************************************
* function : open usb video camera device
******************************************************************************/
HI_S32 HI_UVC_Open(HI_VOID)
{
struct v4l2_input inp;
HI_S32 i, j;
for(i=0; i<UVC_NUM; i++)
{
uvcParam.fd[i] = open(uvcParam.cNodeName[i], O_RDWR, 0);
if(uvcParam.fd[i] < 0)
{
SAMPLE_PRT("camera[%d] : %s open failed ! \n", i, uvcParam.cNodeName[i]);
return HI_FAILURE;
}
for(j=0;j<16;j++)
{
inp.index = j;
if (-1 == ioctl (uvcParam.fd[i], VIDIOC_S_INPUT, &inp))
{
SAMPLE_PRT("camera[%d] : VIDIOC_S_INPUT failed %d !\n", i, j);
}
else
{
printf("camera[%d] : VIDIOC_S_INPUT success %d !\n", i, j);
break;
}
}
}
return HI_SUCCESS;
}
/******************************************************************************
* function : close usb video camera device
******************************************************************************/
HI_S32 HI_UVC_Close(HI_VOID)
{
HI_S32 i;
for(i=0; i<UVC_NUM; i++)
{
if(uvcParam.fd[i] > 0)
close(uvcParam.fd[i]);
}
return HI_SUCCESS;
}
/******************************************************************************
* function : usb video camera init
******************************************************************************/
HI_S32 HI_UVC_Init(HI_VOID)
{
HI_S32 i;
struct v4l2_capability cap; /* device capabilities, e.g. whether it can capture video */
struct v4l2_fmtdesc fmtdesc; /* format descriptor, used to enumerate the supported formats */
struct v4l2_format fmt;
HI_S32 ret = HI_FAILURE;
/* get width and height*/
fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
for(i=0; i<UVC_NUM; i++)
{
if((ret = ioctl(uvcParam.fd[i], VIDIOC_G_FMT, &fmt)) < 0)
{
SAMPLE_PRT("camera[%d] : fail to ioctl VIDIOC_G_FMT\n", i);
return HI_FAILURE;
}
printf("camera[%d] : width:%d, height:%d\n", i, fmt.fmt.pix.width,fmt.fmt.pix.height);
/* check video device driver capability */
if((ret = ioctl(uvcParam.fd[i], VIDIOC_QUERYCAP, &cap)) < 0)
{
SAMPLE_PRT("camera[%d] : fail to ioctl VIDIOC_QUERYCAP \n", i);
return HI_FAILURE;
}
/*judge whether this is a video capture device*/
if(!(cap.capabilities & V4L2_CAP_VIDEO_CAPTURE))
{
SAMPLE_PRT("camera[%d] is not a video capture device\n", i);
return HI_FAILURE;
}
/*judge whether streaming i/o is supported*/
if(!(cap.capabilities & V4L2_CAP_STREAMING))
{
SAMPLE_PRT("camera[%d] does not support streaming i/o\n", i);
return HI_FAILURE;
}
printf("camera[%d] driver name is : %s\n", i, cap.driver);
printf("camera[%d] device name is : %s\n", i, cap.card);
printf("camera[%d] bus information: %s\n", i, cap.bus_info);
/*display the format device support*/
/*show all the support format*/
memset(&fmtdesc, 0, sizeof(fmtdesc));
fmtdesc.index = 0; /* the number to check */
fmtdesc.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
while(ioctl(uvcParam.fd[i], VIDIOC_ENUM_FMT, &fmtdesc) != -1)
{
printf("camera[%d] : support device %d.%s\n", i, fmtdesc.index+1, fmtdesc.description);
fmtdesc.index++;
}
}
return HI_SUCCESS;
}
/******************************************************************************
* function : set usb video camera format
******************************************************************************/
HI_S32 HI_UVC_Set_Format(HI_VOID)
{
HI_S32 i;
struct v4l2_format tv_fmt; /* frame format */
/*set the format of the camera capture data*/
memset(&tv_fmt, 0, sizeof(tv_fmt));
tv_fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE; /*a camera must use V4L2_BUF_TYPE_VIDEO_CAPTURE*/
tv_fmt.fmt.pix.width = uvcParam.u32inputWidth;
tv_fmt.fmt.pix.height = uvcParam.u32inputHeight;
tv_fmt.fmt.pix.pixelformat = uvcParam.pixelformat;
tv_fmt.fmt.pix.field = V4L2_FIELD_NONE; // field format: progressive frames
for(i=0; i<UVC_NUM; i++)
{
if (ioctl(uvcParam.fd[i], VIDIOC_S_FMT, &tv_fmt)< 0)
{
SAMPLE_PRT("camera[%d] : VIDIOC_S_FMT set err\n", i);
return HI_FAILURE;
}
}
return HI_SUCCESS;
}
/******************************************************************************
* function : mmap
******************************************************************************/
HI_S32 HI_UVC_Mmap(HI_VOID)
{
HI_S32 i;
/*to request frame cache, contain requested counts*/
struct v4l2_requestbuffers reqbufs;
for(i=0; i<UVC_NUM; i++)
{
memset(&reqbufs, 0, sizeof(reqbufs));
reqbufs.count = uvcParam.V4L2_buffer_num; /*the number of buffer*/
reqbufs.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
reqbufs.memory = V4L2_MEMORY_MMAP;
if(-1 == ioctl(uvcParam.fd[i], VIDIOC_REQBUFS, &reqbufs))
{
SAMPLE_PRT("camera[%d] : Fail to ioctl 'VIDIOC_REQBUFS'\n", i);
return HI_FAILURE;
}
uvcParam.buffer_num[i] = reqbufs.count;
printf("camera[%d] : buffer_num = %d\n", i, uvcParam.buffer_num[i]);
uvcParam.uvc_buf[i] = calloc(reqbufs.count, sizeof(uvc_buf_t));
if(uvcParam.uvc_buf[i] == NULL)
{
SAMPLE_PRT("camera[%d] : Out of memory\n", i);
return HI_FAILURE;
}
/*map kernel cache to user process*/
for(uvcParam.buffer_num[i] = 0; uvcParam.buffer_num[i] < reqbufs.count; uvcParam.buffer_num[i]++)
{
//stand for a frame
struct v4l2_buffer buf;
memset(&buf, 0, sizeof(buf));
buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
buf.memory = V4L2_MEMORY_MMAP;
buf.index = uvcParam.buffer_num[i];
/*check the information of the kernel cache requested*/
if(-1 == ioctl(uvcParam.fd[i], VIDIOC_QUERYBUF, &buf))
{
SAMPLE_PRT("camera[%d] : Fail to ioctl : VIDIOC_QUERYBUF\n", i);
return HI_FAILURE;
}
uvcParam.uvc_buf[i][uvcParam.buffer_num[i]].length = buf.length;
uvcParam.uvc_buf[i][uvcParam.buffer_num[i]].start = (char *)mmap(NULL, buf.length, PROT_READ | PROT_WRITE,MAP_SHARED, uvcParam.fd[i], buf.m.offset);
if(MAP_FAILED == uvcParam.uvc_buf[i][uvcParam.buffer_num[i]].start)
{
SAMPLE_PRT("camera[%d] : Fail to mmap\n", i);
return HI_FAILURE;
}
}
}
return HI_SUCCESS;
}
/******************************************************************************
* function : usb video camera unmap
******************************************************************************/
HI_S32 HI_UVC_Unmap(HI_VOID)
{
HI_U32 i, j;
for(i=0; i<UVC_NUM; i++)
{
for(j = 0; j < uvcParam.buffer_num[i]; j++)
{
if(-1 == munmap(uvcParam.uvc_buf[i][j].start, uvcParam.uvc_buf[i][j].length))
{
return HI_FAILURE;
}
}
if(NULL != uvcParam.uvc_buf[i])
free(uvcParam.uvc_buf[i]);
}
return HI_SUCCESS;
}
/******************************************************************************
* function : usb video camera start capture
******************************************************************************/
HI_S32 HI_UVC_Start(HI_VOID)
{
HI_U32 i, j;
enum v4l2_buf_type type[UVC_NUM];
for(i=0; i<UVC_NUM; i++)
{
/*place the kernel cache to a queue*/
for(j = 0; j < uvcParam.buffer_num[i]; j++)
{
struct v4l2_buffer buf;
memset(&buf, 0, sizeof(buf));
buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
buf.memory = V4L2_MEMORY_MMAP;
buf.index = j;
if(-1 == ioctl(uvcParam.fd[i], VIDIOC_QBUF, &buf))
{
SAMPLE_PRT("camera[%d] : Fail to ioctl 'VIDIOC_QBUF'\n", i);
return HI_FAILURE;
}
}
type[i] = V4L2_BUF_TYPE_VIDEO_CAPTURE;
if(-1 == ioctl(uvcParam.fd[i], VIDIOC_STREAMON, &type[i]))
{
SAMPLE_PRT("camera[%d] : VIDIOC_STREAMON\n", i);
return HI_FAILURE;
}
}
return HI_SUCCESS;
}
/******************************************************************************
* function : usb video camera stop capture
******************************************************************************/
HI_S32 HI_UVC_Stop(HI_VOID)
{
HI_U32 i;
enum v4l2_buf_type type;
type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
for(i=0; i<UVC_NUM; i++)
{
if(-1 == ioctl(uvcParam.fd[i], VIDIOC_STREAMOFF, &type))
{
SAMPLE_PRT("camera[%d] : Fail to ioctl 'VIDIOC_STREAMOFF'\n", i);
return HI_FAILURE;
}
}
return HI_SUCCESS;
}
/******************************************************************************
* function : usb video camera capture mjpeg thread
******************************************************************************/
HI_VOID * HI_UVC_MJPEG_Thread(HI_VOID *pArgs)
{
#if SAVE_FILE
FILE *fp[UVC_NUM];
#endif
fd_set fds;
HI_U8 *pu8Buf[UVC_NUM];
HI_S32 i, j, start, len, s32Ret, fdmax = 0;
HI_S32 s32ReadLen = 0;
HI_U64 u64pts = 0;
HI_BOOL bFindStart[UVC_NUM], bFindEnd[UVC_NUM];
struct timeval tv;
struct v4l2_buffer buf[UVC_NUM];
VDEC_STREAM_S stStream;
struct hiUvcParam *p = &uvcParam;
prctl(PR_SET_NAME, "hi_SendStream2Vdec", 0, 0, 0);
for(i=0; i<UVC_NUM; i++)
{
pu8Buf[i] = malloc(p->u32BlkSize);
if(pu8Buf[i] == NULL)
{
printf("camera[%d] : can't alloc in send stream thread\n", i);
return (HI_VOID *)(HI_FAILURE);
}
}
#if SAVE_FILE
for(i=0; i<UVC_NUM; i++)
{
fp[i] = fopen(p->cFileName[i], "w+");
if(fp[i] == NULL)
{
printf("camera[%d] : can't open file %s\n", i, p->cFileName[i]);
return (HI_VOID *)(HI_FAILURE);
}
}
#endif
while(p->pthRun)
{
FD_ZERO(&fds);
tv.tv_sec = 1; /*Timeout*/
tv.tv_usec = 0;
for(i=0; i<UVC_NUM; i++)
{
bFindStart[i] = HI_FALSE;
bFindEnd[i] = HI_FALSE;
FD_SET(p->fd[i], &fds);
fdmax = p->fd[i] > fdmax ? p->fd[i] : fdmax;
}
s32Ret = select(fdmax + 1, &fds, NULL, NULL, &tv);
if(-1 == s32Ret)
{
if(EINTR == errno)
{
perror("select");
continue;
}
SAMPLE_PRT("Fail to select\n");
break;
}
if(0 == s32Ret)
{
SAMPLE_PRT("select Timeout\n");
continue;
}
//put cache from queue
for(i=0; i<UVC_NUM; i++)
{
memset(&buf[i], 0, sizeof(struct v4l2_buffer));
buf[i].type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
buf[i].memory = V4L2_MEMORY_MMAP;
if(-1 == ioctl(p->fd[i], VIDIOC_DQBUF, &buf[i]))
{
//SAMPLE_PRT("camera[%d] : Fail to ioctl 'VIDIOC_DQBUF' \n", i);
//no need to print here: MJPEG frames are much smaller than YUV, so this path is hit very often and would flood the log, but the display is fine
perror("VIDIOC_DQBUF");
continue;
}
if(buf[i].index >= p->buffer_num[i])
{
SAMPLE_PRT("camera[%d] : index error \n", i);
continue;
}
memcpy(pu8Buf[i], p->uvc_buf[i][buf[i].index].start, p->uvc_buf[i][buf[i].index].length);
s32ReadLen = p->uvc_buf[i][buf[i].index].length;
#if SAVE_FILE
fwrite(pu8Buf[i], s32ReadLen, 1, fp[i]);
#endif
for (j=0; j<s32ReadLen-2; j++)
{
if (pu8Buf[i][j] == 0xFF && pu8Buf[i][j+1] == 0xD8)
{
start = j;
bFindStart[i] = HI_TRUE;
j = j + 2;
break;
}
}
for (; j<s32ReadLen-4; j++)
{
if ( (pu8Buf[i][j] == 0xFF) && (pu8Buf[i][j+1]& 0xF0) == 0xD0 )
{
len = (pu8Buf[i][j+2]<<8) + pu8Buf[i][j+3];
j += 1 + len;
}
else
{
break;
}
}
for (; j<s32ReadLen-2; j++)
{
if (pu8Buf[i][j] == 0xFF && pu8Buf[i][j+1] == 0xD9)
{
bFindEnd[i] = HI_TRUE;
break;
}
}
s32ReadLen = j;
if (bFindStart[i] == HI_FALSE)
{
printf("camera[%d] : can not find start code! s32ReadLen %d\n", i, s32ReadLen);
}
else if (bFindEnd[i] == HI_FALSE)
{
printf("camera[%d] : can not find stop code! s32ReadLen %d\n", i, s32ReadLen);
s32ReadLen = j+2;
}
stStream.u64PTS = u64pts;
stStream.pu8Addr = pu8Buf[i] + start;
stStream.u32Len = s32ReadLen - start;
stStream.bEndOfFrame = HI_FALSE;
stStream.bEndOfStream = HI_FALSE;
s32Ret=HI_MPI_VDEC_SendStream(p->VdChn[i], &stStream, 100);
if (HI_SUCCESS != s32Ret)
{
SAMPLE_PRT("camera[%d] : vdec send frame fail for %#x!\n", i, s32Ret);
}
if(-1 == ioctl(p->fd[i], VIDIOC_QBUF,&buf[i]))
{
SAMPLE_PRT("camera[%d] : Fail to ioctl 'VIDIOC_QBUF'\n", i);
}
}
}
printf("uvc thread exit \n");
#if SAVE_FILE
for(i=0; i<UVC_NUM; i++)
{
fclose(fp[i]);
}
#endif
return NULL;
}
/******************************************************************************
* function : usb video camera capture yuv422 thread
******************************************************************************/
HI_VOID * HI_UVC_YUV_Thread(HI_VOID *pArgs)
{
fd_set fds;
HI_S32 s32Ret, fdmax = 0, i;
SIZE_S stSize;
HI_U32 u32LStride;
HI_U32 u32CStride;
HI_U32 u32LumaSize;
HI_U32 u32ChrmSize;
VIDEO_FRAME_INFO_S stVideoFrame;
struct timeval tv;
struct v4l2_buffer buf;
struct hiUvcParam *p = (struct hiUvcParam *)&uvcParam;
prctl(PR_SET_NAME, "hi_SendStream2Vpss", 0, 0, 0);
p->pVirAddr = HI_MPI_SYS_Mmap(p->u32phyAddr, p->u32BlkSize);
stSize.u32Width = p->u32inputWidth;
stSize.u32Height = p->u32inputHeight;
u32LStride = stSize.u32Width;
u32CStride = stSize.u32Width;
u32LumaSize = (stSize.u32Width * stSize.u32Height);
u32ChrmSize = u32LumaSize >> 2;
memset(&stVideoFrame.stVFrame, 0, sizeof(VIDEO_FRAME_S));
stVideoFrame.u32PoolId = HI_MPI_VB_Handle2PoolId(p->VbBlk);
stVideoFrame.stVFrame.u32Width = stSize.u32Width;
stVideoFrame.stVFrame.u32Height = stSize.u32Height;
stVideoFrame.stVFrame.u32Field = VIDEO_FIELD_FRAME;
stVideoFrame.stVFrame.enPixelFormat = SAMPLE_PIXEL_FORMAT;
stVideoFrame.stVFrame.u32Stride[0] = u32LStride;
stVideoFrame.stVFrame.u32Stride[1] = u32CStride;
stVideoFrame.stVFrame.u32Stride[2] = u32CStride;
stVideoFrame.stVFrame.u32PhyAddr[0] = p->u32phyAddr;
stVideoFrame.stVFrame.u32PhyAddr[1] = stVideoFrame.stVFrame.u32PhyAddr[0] + u32LumaSize;
stVideoFrame.stVFrame.u32PhyAddr[2] = stVideoFrame.stVFrame.u32PhyAddr[1] + u32ChrmSize;
stVideoFrame.stVFrame.pVirAddr[0] = p->pVirAddr;
stVideoFrame.stVFrame.pVirAddr[1] = stVideoFrame.stVFrame.pVirAddr[0] + u32LumaSize;
stVideoFrame.stVFrame.pVirAddr[2] = stVideoFrame.stVFrame.pVirAddr[1] + u32ChrmSize;
while(p->pthRun)
{
FD_ZERO(&fds);
for(i=0; i<UVC_NUM; i++)
{
FD_SET(p->fd[i], &fds);
fdmax = p->fd[i] > fdmax ? p->fd[i] : fdmax;
}
tv.tv_sec = 1; /*Timeout*/
tv.tv_usec = 0;
s32Ret = select(fdmax + 1, &fds, NULL, NULL, &tv);
if(-1 == s32Ret)
{
if(EINTR == errno)
{
perror("select");
continue;
}
SAMPLE_PRT("Fail to select \n");
break;
}
if(0 == s32Ret)
{
SAMPLE_PRT("select Timeout \n");
continue;
}
for(i=0; i<UVC_NUM; i++)
{
if(FD_ISSET(p->fd[i], &fds))
{
memset(&buf, 0, sizeof(buf));
buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
buf.memory = V4L2_MEMORY_MMAP;
//put cache from queue
if(-1 == ioctl(p->fd[i], VIDIOC_DQBUF, &buf))
{
SAMPLE_PRT("Fail to ioctl 'VIDIOC_DQBUF'\n");
continue;
}
if(buf.index >= p->buffer_num[i])
{
SAMPLE_PRT("index error \n");
continue;
}
memcpy(p->pVirAddr, p->uvc_buf[i][buf.index].start, p->uvc_buf[i][buf.index].length);
yuv422p_to_yuv422sp(p->pVirAddr, stSize.u32Width, stSize.u32Height);
s32Ret =HI_MPI_VPSS_SendFrame(p->VpssGrp[i], &stVideoFrame, 100);
if(s32Ret != HI_SUCCESS)
{
SAMPLE_PRT("vpss send frame fail for %#x!\n", s32Ret);
}
if(-1 == ioctl(p->fd[i], VIDIOC_QBUF,&buf))
{
SAMPLE_PRT("Fail to ioctl 'VIDIOC_QBUF'\n");
}
}
}
}
printf("uvc thread exit \n");
return NULL;
}
/******************************************************************************
* function : usb(1080p) -> VPSS -> VO HD1(1080p60)
******************************************************************************/
HI_S32 SAMPLE_UVC(HI_VOID)
{
HI_CHAR ch;
HI_S32 s32Ret, i;
SIZE_S stSize;
VB_CONF_S stVbConf, stModVbConf;
VDEC_CHN_ATTR_S stVdecChnAttr[UVC_NUM];
VPSS_GRP_ATTR_S stVpssGrpAttr;
VO_PUB_ATTR_S stVoPubAttr;
VO_VIDEO_LAYER_ATTR_S stVoLayerAttr;
// step0 : init uvc
s32Ret = HI_UVC_Param_Set();
if(s32Ret < 0)
goto UVC_EXIT;
s32Ret = HI_UVC_Open();
if(s32Ret < 0)
goto UVC_EXIT;
s32Ret = HI_UVC_Init();
if(s32Ret < 0)
goto UVC_CLOSE;
s32Ret = HI_UVC_Set_Format();
if(s32Ret < 0)
goto UVC_CLOSE;
s32Ret = HI_UVC_Mmap();
if(s32Ret < 0)
goto UVC_CLOSE;
s32Ret = HI_UVC_Start();
if(s32Ret < 0)
goto UVC_UNMAP;
stSize.u32Width = uvcParam.u32Width;
stSize.u32Height = uvcParam.u32Height;
// step1 : init SYS and common VB
memset(&stVbConf, 0, sizeof(VB_CONF_S));
memset(&stVbConf.astCommPool[0].acMmzName, 0, sizeof(stVbConf.astCommPool[0].acMmzName));
stVbConf.u32MaxPoolCnt = 128;
stVbConf.astCommPool[0].u32BlkSize = uvcParam.u32BlkSize;
stVbConf.astCommPool[0].u32BlkCnt = uvcParam.u32BlkCnt;
s32Ret = SAMPLE_COMM_SYS_Init(&stVbConf);
if(s32Ret != HI_SUCCESS)
{
SAMPLE_PRT("init sys fail for %#x!\n", s32Ret);
goto HI_SYS_EXIT;
}
// step2: init mod common VB for mjpeg
if(uvcParam.pixelformat == V4L2_PIX_FMT_MJPEG)
{
SAMPLE_COMM_VDEC_ModCommPoolConf(&stModVbConf, PT_MJPEG, &stSize, UVC_NUM, HI_FALSE);
s32Ret = SAMPLE_COMM_VDEC_InitModCommVb(&stModVbConf);
if(s32Ret != HI_SUCCESS)
{
SAMPLE_PRT("init mod common vb fail for %#x!\n", s32Ret);
goto HI_SYS_EXIT;
}
}
// step3 : create pool for stream for yuv
if(uvcParam.pixelformat == V4L2_PIX_FMT_YUYV)
{
uvcParam.VbPool = HI_MPI_VB_CreatePool(uvcParam.u32BlkSize, uvcParam.u32BlkCnt, NULL);
if ( VB_INVALID_POOLID == uvcParam.VbPool )
{
SAMPLE_PRT("create vb err\n");
goto HI_SYS_EXIT;
}
uvcParam.VbBlk = HI_MPI_VB_GetBlock(uvcParam.VbPool, uvcParam.u32BlkSize, NULL);
if (VB_INVALID_HANDLE == uvcParam.VbBlk )
{
SAMPLE_PRT("get vb block err\n");
goto HI_POOL;
}
uvcParam.u32phyAddr = HI_MPI_VB_Handle2PhysAddr(uvcParam.VbBlk);
if (HI_NULL == uvcParam.u32phyAddr)
{
SAMPLE_PRT("blk to physaddr err\n");
goto HI_BLOCK;
}
}
// step4: start VDEC
if(uvcParam.pixelformat == V4L2_PIX_FMT_MJPEG)
{
SAMPLE_COMM_VDEC_ChnAttr(UVC_NUM, stVdecChnAttr, PT_MJPEG, &stSize);
s32Ret = SAMPLE_COMM_VDEC_Start(UVC_NUM, stVdecChnAttr);
if(s32Ret != HI_SUCCESS)
{
SAMPLE_PRT("start VDEC fail for %#x!\n", s32Ret);
goto HI_VDEC;
}
}
// step5 : start VPSS
stVpssGrpAttr.enDieMode = VPSS_DIE_MODE_NODIE; // no de-interlacing needed
stVpssGrpAttr.bIeEn = HI_TRUE; // image enhancement
stVpssGrpAttr.bDciEn = HI_TRUE; // dynamic contrast improvement
stVpssGrpAttr.bNrEn = HI_TRUE; // noise reduction
stVpssGrpAttr.bHistEn = HI_FALSE; // must be 0
stVpssGrpAttr.bEsEn = HI_FALSE; // must be 0 (edge smoothing, reserved)
stVpssGrpAttr.enPixFmt = SAMPLE_PIXEL_FORMAT;
stVpssGrpAttr.u32MaxW = ALIGN_UP(stSize.u32Width, 16);
stVpssGrpAttr.u32MaxH = ALIGN_UP(stSize.u32Height, 16);
// N video streams map to N VPSS groups; each VPSS group can have several outputs, one VPSS channel per output
// this sample has only one HDMI output, so only one VPSS channel is used
s32Ret = SAMPLE_COMM_VPSS_Start(UVC_NUM, &stSize, 1, &stVpssGrpAttr);
if(s32Ret != HI_SUCCESS)
{
SAMPLE_PRT("start VPSS fail for %#x!\n", s32Ret);
goto HI_VPSS;
}
// step6 : start vo
uvcParam.VoDev = SAMPLE_VO_DEV_DHD1;
uvcParam.VoLayer = SAMPLE_VO_LAYER_VHD1;
stVoPubAttr.enIntfSync = VO_OUTPUT_1080P60;
stVoPubAttr.enIntfType = VO_INTF_HDMI;
stVoPubAttr.u32BgColor = 0x0000ffff;
s32Ret = SAMPLE_COMM_VO_StartDev(uvcParam.VoDev, &stVoPubAttr);
if(s32Ret != HI_SUCCESS)
{
SAMPLE_PRT("vdec bind vpss fail for %#x!\n", s32Ret);
goto HI_VO_DEV;
}
if (HI_SUCCESS != SAMPLE_COMM_VO_HdmiStart(stVoPubAttr.enIntfSync))
{
SAMPLE_PRT("Start SAMPLE_COMM_VO_HdmiStart failed!\n");
goto HI_VO_HDMI;
}
s32Ret = SAMPLE_COMM_VO_GetWH(stVoPubAttr.enIntfSync, \
&stVoLayerAttr.stDispRect.u32Width, &stVoLayerAttr.stDispRect.u32Height, &stVoLayerAttr.u32DispFrmRt);
if (s32Ret != HI_SUCCESS)
{
SAMPLE_PRT("failed with %#x!\n", s32Ret);
goto HI_VO_HDMI;
}
stVoLayerAttr.stImageSize.u32Width = stVoLayerAttr.stDispRect.u32Width;
stVoLayerAttr.stImageSize.u32Height = stVoLayerAttr.stDispRect.u32Height;
stVoLayerAttr.bClusterMode = HI_FALSE;
stVoLayerAttr.bDoubleFrame = HI_FALSE;
stVoLayerAttr.enPixFormat = SAMPLE_PIXEL_FORMAT;
s32Ret = SAMPLE_COMM_VO_StartLayer(uvcParam.VoLayer, &stVoLayerAttr);
if(s32Ret != HI_SUCCESS)
{
SAMPLE_PRT("vdec bind vpss fail for %#x!\n", s32Ret);
goto HI_VO_LAYER;
}
s32Ret = SAMPLE_COMM_VO_StartChn(uvcParam.VoLayer, uvcParam.enMode);
if(s32Ret != HI_SUCCESS)
{
SAMPLE_PRT("vdec bind vpss fail for %#x!\n", s32Ret);
goto HI_VO_CHN;
}
// step7: VDEC bind VPSS
if(uvcParam.pixelformat == V4L2_PIX_FMT_MJPEG)
{
for(i=0; i<UVC_NUM; i++)
{
s32Ret = SAMPLE_COMM_VDEC_BindVpss(i, i);
if(s32Ret != HI_SUCCESS)
{
SAMPLE_PRT("vdec bind vpss fail for %#x!\n", s32Ret);
goto HI_VDEC_UNBIND_VPSS;
}
}
}
// step8 : VPSS bind VO
for(i=0; i<UVC_NUM; i++)
{
s32Ret = SAMPLE_COMM_VO_BindVpss(uvcParam.VoLayer, i, i, VPSS_CHN0);
if(s32Ret != HI_SUCCESS)
{
SAMPLE_PRT("vpss bind vo fail for %#x!\n", s32Ret);
goto HI_VO_UNBIND_VPSS;
}
}
// step9 : thread send data
if(uvcParam.pixelformat == V4L2_PIX_FMT_MJPEG)
{
pthread_create(&uvcParam.ptuvc, 0, HI_UVC_MJPEG_Thread, NULL);
}
else if(uvcParam.pixelformat == V4L2_PIX_FMT_YUYV)
{
pthread_create(&uvcParam.ptuvc, 0, HI_UVC_YUV_Thread, NULL);
}
else
{
SAMPLE_PRT("wrong video format!\n");
}
while(1)
{
SAMPLE_PRT("input 'q' to exit sample\n");
ch = getchar();
if (10 == ch)
continue;
getchar();
if ('q' == ch)
break;
else
{
SAMPLE_PRT("input 'q' to quit sample\n");
continue;
}
}
uvcParam.pthRun = HI_FALSE;
pthread_join(uvcParam.ptuvc, NULL);
HI_VO_UNBIND_VPSS:
for(i=0; i<UVC_NUM; i++)
{
SAMPLE_COMM_VO_UnBindVpss(uvcParam.VoLayer, i, i, VPSS_CHN0);
}
HI_VDEC_UNBIND_VPSS:
if(uvcParam.pixelformat == V4L2_PIX_FMT_MJPEG)
{
for(i=0; i<UVC_NUM; i++)
{
SAMPLE_COMM_VDEC_UnBindVpss(i, i);
}
}
HI_VO_CHN:
SAMPLE_COMM_VO_StopChn(uvcParam.VoLayer, uvcParam.enMode);
HI_VO_LAYER:
SAMPLE_COMM_VO_StopLayer(uvcParam.VoLayer);
HI_VO_HDMI:
SAMPLE_COMM_VO_HdmiStop();
HI_VO_DEV:
SAMPLE_COMM_VO_StopDev(uvcParam.VoDev);
HI_VPSS:
SAMPLE_COMM_VPSS_Stop(UVC_NUM, 1);
HI_VDEC:
if(uvcParam.pixelformat == V4L2_PIX_FMT_MJPEG)
{
SAMPLE_COMM_VDEC_Stop(UVC_NUM);
}
HI_BLOCK:
if(uvcParam.pixelformat == V4L2_PIX_FMT_YUYV)
{
HI_MPI_VB_ReleaseBlock(uvcParam.VbBlk);
}
HI_POOL:
if(uvcParam.pixelformat == V4L2_PIX_FMT_YUYV)
{
HI_MPI_VB_DestroyPool(uvcParam.VbPool);
}
HI_SYS_EXIT:
SAMPLE_COMM_SYS_Exit();
HI_UVC_Stop();
UVC_UNMAP:
HI_UVC_Unmap();
UVC_CLOSE:
HI_UVC_Close();
UVC_EXIT:
return s32Ret;
}
/******************************************************************************
* function : to process abnormal case
******************************************************************************/
void SAMPLE_UVC_HandleSig(HI_S32 signo)
{
HI_U32 i;
if (SIGINT == signo || SIGTERM == signo)
{
for(i=0; i<UVC_NUM; i++)
{
SAMPLE_COMM_VO_UnBindVpss(uvcParam.VoLayer, i, i, VPSS_CHN0);
}
if(uvcParam.pixelformat == V4L2_PIX_FMT_MJPEG)
{
for(i=0; i<UVC_NUM; i++)
{
SAMPLE_COMM_VDEC_UnBindVpss(i, i);
}
}
SAMPLE_COMM_VO_StopChn(uvcParam.VoLayer, uvcParam.enMode);
SAMPLE_COMM_VO_StopLayer(uvcParam.VoLayer);
SAMPLE_COMM_VO_HdmiStop();
SAMPLE_COMM_VO_StopDev(uvcParam.VoDev);
SAMPLE_COMM_VPSS_Stop(UVC_NUM, 1);
if(uvcParam.pixelformat == V4L2_PIX_FMT_MJPEG)
{
SAMPLE_COMM_VDEC_Stop(UVC_NUM);
}
if(uvcParam.pixelformat == V4L2_PIX_FMT_YUYV)
{
HI_MPI_VB_ReleaseBlock(uvcParam.VbBlk);
}
if(uvcParam.pixelformat == V4L2_PIX_FMT_YUYV)
{
HI_MPI_VB_DestroyPool(uvcParam.VbPool);
}
SAMPLE_COMM_SYS_Exit();
HI_UVC_Stop();
HI_UVC_Unmap();
HI_UVC_Close();
SAMPLE_PRT("\033[0;31mprogram termination abnormally!\033[0;39m\n");
}
exit(-1);
}
/******************************************************************************
* function : main()
******************************************************************************/
int main(int argc, char *argv[])
{
HI_S32 s32Ret = HI_FAILURE;
signal(SIGINT, SAMPLE_UVC_HandleSig);
signal(SIGTERM, SAMPLE_UVC_HandleSig);
s32Ret = SAMPLE_UVC();
if (HI_SUCCESS == s32Ret)
SAMPLE_PRT("program exit normally!\n");
else
SAMPLE_PRT("program exit abnormally!\n");
exit(s32Ret);
}
#ifdef __cplusplus
#if __cplusplus
}
#endif
#endif /* End of #ifdef __cplusplus */
Code walkthrough
Two macros: UVC_NUM specifies the number of USB cameras. When the device nodes are opened, the program must be able to find two /dev/video* nodes, otherwise it exits abnormally.
Setting SAVE_FILE to 1 saves the video stream to a local file. This is only for testing, because the saved files are large: they can easily fill up the flash, and the VO output will stutter.
#define UVC_NUM 2
#define SAVE_FILE 0
The parameter struct
- fd holds the file descriptors returned when the /dev/video* nodes are opened
- cFileName is the save path used when the SAVE_FILE macro is enabled
- cNodeName is the device node path, /dev/video*
- u32inputWidth/u32inputHeight set the USB camera's resolution, and are also used to compute the VPSS parameters when the input format is packed YUV422
- u32Width/u32Height specify the HDMI output parameters and are used to compute and allocate the MPP memory
- pixelformat specifies the USB camera's video format; usually only YUV422 and MJPEG are supported
- V4L2_buffer_num specifies the number of V4L2 buffers to request
- buffer_num serves the same purpose, but records the number of buffers actually allocated
- uvc_buf points to the buffers allocated by V4L2
- u32BlkSize/u32BlkCnt are used to allocate the VB pool
- VbPool/VbBlk/u32phyAddr/pVirAddr describe where the data lives in MPP memory when the format is YUV422
- VdChn/VpssGrp/VpssChn/VoDev/VoChn/VoLayer: with MJPEG the VDEC is needed, so N video streams require N VDEC channels, and each stream gets its own VPSS group, giving N VPSS groups. A VPSS group has one channel per output; here there is a single HDMI output, so there is one VPSS channel, one VoDev and one VoLayer. N streams must be displayed, so there are N VoChn (see the binding sketch after this list)
- enMode specifies how many split-screen windows to output; the program derives the VoChn values from it and enables the corresponding VO channels
- when pthRun is false, the data-feeding thread stops and the program exits
- ptuvc identifies the data-feeding thread
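Putting the channel numbers together, the MJPEG data path that SAMPLE_UVC sets up for the two cameras is as follows (the YUYV path is the same except it skips VDEC and sends frames straight to VPSS with HI_MPI_VPSS_SendFrame):
camera 0: /dev/video* -> VIDIOC_DQBUF -> VDEC chn 0 -> VPSS grp 0 / chn 0 -> VO layer VHD1, chn 0
camera 1: /dev/video* -> VIDIOC_DQBUF -> VDEC chn 1 -> VPSS grp 1 / chn 0 -> VO layer VHD1, chn 1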
struct hiUvcParam
{
HI_S32 fd[UVC_NUM];
HI_CHAR cFileName[UVC_NUM][128];
HI_CHAR cNodeName[UVC_NUM][128];
HI_U32 u32inputWidth;
HI_U32 u32inputHeight;
HI_U32 u32Width;
HI_U32 u32Height;
HI_U32 pixelformat;
HI_U32 V4L2_buffer_num;
HI_U32 buffer_num[UVC_NUM];
uvc_buf_t *uvc_buf[UVC_NUM];
HI_U32 u32BlkSize;
HI_U32 u32BlkCnt;
VB_POOL VbPool;
VB_BLK VbBlk;
HI_U32 u32phyAddr;
HI_U8 *pVirAddr;
VDEC_CHN VdChn[UVC_NUM];
VPSS_GRP VpssGrp[UVC_NUM];
VPSS_CHN VpssChn;
VO_DEV VoDev;
VO_CHN VoChn[UVC_NUM];
VO_LAYER VoLayer;
SAMPLE_VO_MODE_E enMode;
HI_BOOL pthRun;
pthread_t ptuvc;
}uvcParam;
Below is the V4L2 UVC initialization. It is adapted from code I found online; there is not much to explain, it is basically all ioctl calls. Two things are worth noting:
- VIDIOC_STREAMON may fail. This usually happens when two UVC cameras hang off a hub on a single USB port and the bandwidth runs out (Resource temporarily unavailable). There are plenty of fixes suggested online; try them if you like, but they do not always work.
- When opening the device nodes, I do not recommend non-blocking mode: with MJPEG the data is often not ready yet when you go to read, so the call returns an error and the serial console gets flooded with error messages. Open them in blocking mode and use select to monitor all the fds (a minimal sketch follows the call list below).
s32Ret = HI_UVC_Param_Set(); // initialize the parameters
s32Ret = HI_UVC_Open(); // open the device nodes
s32Ret = HI_UVC_Init(); // init UVC and query the supported capabilities and formats
s32Ret = HI_UVC_Set_Format(); // set the UVC video format and resolution
s32Ret = HI_UVC_Mmap(); // map the capture buffers
s32Ret = HI_UVC_Start(); // start streaming
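Here is a minimal sketch of that blocking-open plus select pattern, boiled down from HI_UVC_MJPEG_Thread above; it reuses the sample's uvcParam and UVC_NUM, and error handling is omitted:
fd_set fds;
struct timeval tv;
int i, fdmax = -1, fd[UVC_NUM];
for (i = 0; i < UVC_NUM; i++)
{
    fd[i] = open(uvcParam.cNodeName[i], O_RDWR, 0); /* blocking open, no O_NONBLOCK */
    fdmax = fd[i] > fdmax ? fd[i] : fdmax;
}
while (1)
{
    FD_ZERO(&fds);
    for (i = 0; i < UVC_NUM; i++)
        FD_SET(fd[i], &fds);
    tv.tv_sec = 1; /* 1 s timeout so the loop can notice a stop request */
    tv.tv_usec = 0;
    if (select(fdmax + 1, &fds, NULL, NULL, &tv) <= 0)
        continue; /* timeout or EINTR: just try again */
    for (i = 0; i < UVC_NUM; i++)
    {
        if (FD_ISSET(fd[i], &fds))
        {
            /* VIDIOC_DQBUF will not block here, because select() says a frame is ready */
        }
    }
}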
The rest of the initialization is fairly routine and needs no further comment, so let me talk about the two threads.
The YUV thread involves a packed YUV422 to semi-planar YUV422 conversion: the camera outputs YUV in YUYVYUYV order, i.e. packed YUV422, while VPSS only accepts semi-planar YUV422 (YYYYUVUV) or YUV420 (YYYYUV). The data therefore has to be converted before HI_MPI_VPSS_SendFrame sends it to VPSS; otherwise the positions and colors on screen are all wrong, although you can still clearly tell there is a picture.
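To make the byte shuffling concrete, this is what yuv422p_to_yuv422sp above does with the first eight packed bytes (it simply restates the loop body as a worked example):
/* packed input : Y0 U0 Y1 V0 Y2 U1 Y3 V1 ...
 * Y plane out  : Y0 Y1 Y2 Y3 ...   (the first width*height bytes of the buffer)
 * UV plane out : V0 U0 V1 U1 ...   (the following width*height bytes, in the order the loop writes them)
 */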
The MJPEG thread involves detecting the frame header, frame trailer, and other key markers; every frame must satisfy the JPEG requirements. However, the markers a UVC camera produces do not necessarily follow the standard JPEG layout. You can read an article on the JPEG format to brush up on this background; such material says the file header is FF D8 FF E0, but what I actually captured begins with FF D8 FF DB, so you need to handle it flexibly.
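For reference, here is a simplified sketch of the marker scan that HI_UVC_MJPEG_Thread performs: it only looks for the SOI (FF D8) and EOI (FF D9) markers and ignores the intermediate markers the full thread skips over. The helper name uvc_find_jpeg_frame is mine, not something from the SDK:
/* Locate one JPEG frame in the buffer obtained via VIDIOC_DQBUF.
 * On success returns 0 and fills the offset of FF D8 and the offset just past FF D9;
 * returns -1 if either marker is missing (the frame should then be dropped). */
static int uvc_find_jpeg_frame(const HI_U8 *pu8Buf, HI_S32 s32Len, HI_S32 *ps32Start, HI_S32 *ps32End)
{
    HI_S32 j, s32Start = -1;
    for (j = 0; j + 1 < s32Len; j++)
    {
        if (pu8Buf[j] == 0xFF && pu8Buf[j + 1] == 0xD8) /* SOI */
        {
            s32Start = j;
            break;
        }
    }
    if (s32Start < 0)
        return -1;
    for (j = s32Start + 2; j + 1 < s32Len; j++)
    {
        if (pu8Buf[j] == 0xFF && pu8Buf[j + 1] == 0xD9) /* EOI */
        {
            *ps32Start = s32Start;
            *ps32End = j + 2;
            return 0;
        }
    }
    return -1;
}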
Finally, I recommend using the MJPEG format: with YUV, 1080p only manages 5 fps and is very choppy, whereas MJPEG does not have this problem.