Copyright notice: This is an original article by the CSDN blogger 提辖鲁, licensed under the CC 4.0 BY-SA license. Please include the original source link and this notice when reposting.
Original link: https://blog.csdn.net/lj402159806/article/details/109614095
NDK Example Summary Addendum
NDK example summary: a JNI example
NDK example summary: OpenCV image processing
NDK example summary: raw image processing for the Android Camera and USB cameras
NDK example summary addendum: analyzing USB image capture with V4L2
NDK example summary: playing an RTSP stream with ffmpeg
NDK example summary: dual USB camera handling based on libuvc
NDK example summary addendum: analyzing USB image capture with libuvc
NDK example summary: JNI log storage
Preface
This article supplements the JNI part of "NDK example summary: dual USB camera handling based on libuvc", mainly analyzing the flow of capturing USB camera images with libuvc.
About libuvc
libuvc is a cross-platform library for USB video devices. It supports enumeration, control, and streaming for USB Video Class (UVC) devices such as consumer webcams.
Its main features are:
a UVC device discovery and management API
video streaming (device to host) in both asynchronous/callback and synchronous/polling modes
read/write access to standard device settings
conversion between various formats: RGB, YUV, JPEG, etc.
Code walkthrough
This section analyzes the JNI code from "NDK example summary: dual USB camera handling based on libuvc": opening and closing the camera, previewing and taking photos, and changing camera parameters.
Initializing the camera
The first step is initializing the camera:
int connect(int index, int vid, int pid, int fd, int busNum, int devAddr, std::string c_usbfs) {
    uvc_error_t res = uvc_init2(&ctx[index], NULL, c_usbfs.c_str());
    if (res < 0) {
        LOGE("connect uvc_init2 error, res %d, index %d", res, index);
        return JNI_ERR;
    }
    LOGI("connect UVC initialized, index %d", index);
    res = uvc_get_device_with_fd(ctx[index], &dev[index], vid, pid, NULL, fd, busNum, devAddr);
    if (res < 0) {
        LOGE("connect uvc_get_device_with_fd error, res %d , index %d", res, index);
        // close(fd);
        return JNI_ERR;
    } else {
        LOGI("connect Device found, index %d", index);
        res = uvc_open(dev[index], &devh[index]);
        if (res < 0) {
            LOGE("connect uvc_open error, res %d, index %d", res, index);
            uvc_unref_device(dev[index]);
            dev[index] = nullptr;
            devh[index] = nullptr;
            // close(fd);
            ::fd[index] = 0;
            return JNI_ERR;
        } else {
            LOGI("connect Device opened, index %d", index);
        }
    }
    if (res == 0) {
        ::fd[index] = fd;
        is_connect[index] = true;
    }
    res = start_stream(index);
    LOGI("connect, res %d, index %d", res, index);
    return res;
}
Camera initialization revolves around three core functions: uvc_init2, uvc_get_device_with_fd, and uvc_open.
uvc_init2 is a function specific to UVCCamera; it extends libuvc's own uvc_init with a usbfs parameter to initialize the uvc context.
uvc_get_device_with_fd is also specific to UVCCamera; it obtains the camera identified by vid, pid, fd, busNum, and devAddr.
uvc_open opens the camera.
Opening the video stream
Once the camera is opened, the next step is to configure the stream parameters and open the video stream:
uvc_error_t start_stream(int index) {
    uvc_error_t res = uvc_get_stream_ctrl_format_size_fps(
            devh[index], &ctrl[index], UVC_FRAME_FORMAT_MJPEG, IMG_WIDTH, IMG_HEIGHT, 1, 30);
    LOGI("start_stream, stream_mode %d, index %d", stream_mode, index);
    if (res < 0) {
        LOGE("start_stream uvc_get_stream_ctrl_format_size_fps error, res %d, index %d", res,
             index);
        return res;
    }
    if (stream_mode) {
        res = uvc_stream_open_ctrl(devh[index], &strmhp[index], &ctrl[index]);
        LOGI("start_stream uvc_stream_open_ctrl, res %d, index %d", res, index);
        res = uvc_stream_start_bandwidth(strmhp[index], NULL, NULL, 0.49, 0);
        LOGI("start_stream uvc_stream_start_bandwidth, res %d, index %d", res, index);
        uvc_frame_t *frame;
        res = uvc_stream_get_frame(strmhp[index], &frame, 50 * 1000);
        LOGI("start_stream uvc_stream_get_frame, res %d, frame %p, index %d", res, frame, index);
    } else {
        // pass the camera index through the user pointer (via intptr_t, so the
        // int-to-pointer conversion is well-defined on 64-bit builds)
        res = uvc_start_streaming_bandwidth(
                devh[index], &ctrl[index], uvc_preview_frame_callback,
                (void *) (intptr_t) index, 0.49, 0);
        LOGI("start_stream uvc_start_streaming_bandwidth, res %d, index %d", res, index);
    }
    if (res == 0) {
        is_start_stream[index] = true;
    } else {
        LOGE("start_stream error, res %d, index %d", res, index);
    }
    return res;
}
uvc_get_stream_ctrl_format_size_fps is also specific to UVCCamera; it negotiates the frame format, width, and height of the video stream with the camera. Compared with libuvc's own uvc_get_stream_ctrl_format_size, it adds minimum and maximum frame rate parameters.
libuvc supports two ways of obtaining the video stream: an asynchronous callback mode and an active polling mode.
The polling mode first calls uvc_stream_open_ctrl and then uvc_stream_start_bandwidth; in this mode uvc_stream_start_bandwidth takes no callback function or user pointer, and uvc_stream_get_frame is used to fetch one frame at a time.
The asynchronous callback mode calls uvc_start_streaming_bandwidth, passing a callback function and a user pointer (used here to tell multiple cameras apart).
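The code stores the small camera index directly in the user pointer instead of allocating per-camera state. This idiom can be shown in isolation; the sketch below uses a stub callback registration (all names here are illustrative, not libuvc's), round-tripping the index through `intptr_t` so the conversion is well-defined:

```cpp
#include <cstdint>
#include <cassert>

// Stand-in for a driver callback type that carries an opaque user pointer,
// like the callback/user_ptr pair of uvc_start_streaming_bandwidth.
using frame_cb = void (*)(void *user_ptr);

static int g_last_index = -1;

// Callback side: recover the integer that was packed into the pointer value.
static void on_frame(void *user_ptr) {
    g_last_index = static_cast<int>(reinterpret_cast<intptr_t>(user_ptr));
}

// Stub "driver": a real stream would invoke the callback once per frame.
static void start_streaming_stub(frame_cb cb, void *user_ptr) {
    cb(user_ptr);
}

// Caller side: pack the index into the pointer argument.
int run_with_index(int index) {
    start_streaming_stub(on_frame, reinterpret_cast<void *>(static_cast<intptr_t>(index)));
    return g_last_index;
}
```

Packing the index this way avoids heap-allocating a context object whose lifetime would have to outlive the stream.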
void uvc_preview_frame_callback(uvc_frame_t *frame, void *ptr) {
    int index = (int) (intptr_t) ptr;  // recover the camera index from the user pointer
    _mutex[index].lock();
    if (data_list[index].size() >= 2) {
        // drop the oldest buffered frame so the queue never grows past two
        uvc_frame_t *front = data_list[index].front();
        uvc_free_frame(front);
        data_list[index].pop_front();
    }
    uvc_frame_t *copy = uvc_allocate_frame(frame->data_bytes);
    uvc_duplicate_frame(frame, copy);
    data_list[index].push_back(copy);
    _mutex[index].unlock();
}
In this way every camera frame arrives through this callback; the callback copies the frame and pushes the copy into a buffer queue, keeping it as a buffered frame.
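The two-deep buffering logic can be isolated from libuvc. A simplified sketch, with a plain byte vector standing in for uvc_frame_t (so uvc_allocate_frame/uvc_duplicate_frame/uvc_free_frame become ordinary copies; the class and method names are mine):

```cpp
#include <cstddef>
#include <cstdint>
#include <deque>
#include <mutex>
#include <vector>

// Simplified stand-in for uvc_frame_t: just the payload bytes.
using Frame = std::vector<uint8_t>;

class FrameBuffer {
public:
    // Mirrors uvc_preview_frame_callback: cap the queue at 2 frames,
    // dropping the oldest before pushing a copy of the newest.
    void push(const Frame &frame) {
        std::lock_guard<std::mutex> lock(mutex_);
        if (list_.size() >= 2)
            list_.pop_front();   // uvc_free_frame + pop_front in the original
        list_.push_back(frame);  // allocate + duplicate in the original
    }

    // Mirrors the async branch of get_frame: take the oldest buffered frame.
    bool pop(Frame &out) {
        std::lock_guard<std::mutex> lock(mutex_);
        if (list_.empty())
            return false;
        out = std::move(list_.front());
        list_.pop_front();
        return true;
    }

    std::size_t size() {
        std::lock_guard<std::mutex> lock(mutex_);
        return list_.size();
    }

private:
    std::mutex mutex_;
    std::deque<Frame> list_;
};
```

Capping the queue at two frames bounds both memory use and preview latency: the consumer always sees a frame at most two capture intervals old.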
Fetching video frames
Once the stream is open, video frames can be fetched, again in either the asynchronous or the synchronous mode:
int get_frame(uvc_frame_t *rgb565, int index) {
    if (stream_mode) {
        uvc_frame_t *frame;
        uvc_error_t res;
        _mutex[index].lock();
        res = uvc_stream_get_frame(strmhp[index], &frame, 50 * 1000);
        if (res || !frame) {
            _mutex[index].unlock();
            uvc_perror(res, "get_frame uvc_stream_get_frame error");
            LOGE("get_frame uvc_stream_get_frame error, res %d, frame %p, index %d", res, frame,
                 index);
            return -1;
        }
        uvc_frame_t *copy = uvc_allocate_frame(frame->data_bytes);
        uvc_duplicate_frame(frame, copy);
        _mutex[index].unlock();
        res = uvc_mjpeg2rgb565(copy, rgb565);
        if (res) {
            uvc_perror(res, "get_frame uvc_mjpeg2rgb565 error");
            LOGE("get_frame uvc_mjpeg2rgb565 error, res %d, index %d", res, index);
            uvc_free_frame(copy);
            return -1;
        }
        uvc_free_frame(copy);
    } else {
        _mutex[index].lock();
        if (data_list[index].empty()) {
            _mutex[index].unlock();
            return -1;
        }
        uvc_frame_t *front = data_list[index].front();
        data_list[index].pop_front();
        _mutex[index].unlock();
        uvc_error_t res;
        res = uvc_mjpeg2rgb565(front, rgb565);
        if (res) {
            uvc_perror(res, "get_frame uvc_mjpeg2rgb565 error");
            LOGE("get_frame uvc_mjpeg2rgb565 error, res %d, index %d", res, index);
            uvc_free_frame(front);
            return -1;
        }
        uvc_free_frame(front);
    }
    return 0;
}
The synchronous mode fetches a frame with uvc_stream_get_frame, copies it, and converts it to RGB565 (the format used for the Android preview).
The asynchronous mode simply takes the oldest frame from the buffer queue and converts it to RGB565 directly.
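uvc_mjpeg2rgb565 decodes the JPEG data and packs each pixel into 16 bits: 5 bits of red, 6 of green, 5 of blue, which Android can display directly. The per-pixel packing step is just bit shifts (the helper name below is mine):

```cpp
#include <cstdint>

// Pack one 8-bit-per-channel RGB pixel into RGB565:
// the top 5 bits of red, top 6 of green, and top 5 of blue.
inline uint16_t rgb888_to_rgb565(uint8_t r, uint8_t g, uint8_t b) {
    return static_cast<uint16_t>(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}
```

Green gets the extra bit because the eye is most sensitive to green, which is why pure white maps to 0xFFFF and pure green alone fills the middle 6 bits.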
Taking photos
Taking a photo is similar to fetching a video frame, except that the frame is not converted to RGB565; instead it is saved directly to local storage as a jpg file:
bool take_photo(int index, std::string path) {
    if (!is_start_stream[index]) {
        LOGE("take photo error, not start_stream, index %d", index);
        return false;
    }
    uvc_frame_t *copy;
    _mutex[index].lock();
    if (stream_mode) {
        uvc_frame_t *frame;
        uvc_frame_t *frame_test;
        uvc_error_t res = uvc_stream_get_frame(strmhp[index], &frame, 50 * 1000);
        uvc_error_t res_test = uvc_stream_get_frame(strmhp[index], &frame_test, 50 * 1000);
        if (res || res_test || !frame || !frame_test) {
            _mutex[index].unlock();
            uvc_perror(res, "take_photo uvc_stream_get_frame error");
            LOGE("take_photo uvc_stream_get_frame error, res %d, frame %p, frame_test %p, index %d",
                 res, frame, frame_test, index);
            return false;
        }
        copy = uvc_allocate_frame(frame->data_bytes);
        uvc_duplicate_frame(frame, copy);
    } else {
        if (data_list[index].empty()) {
            _mutex[index].unlock();
            LOGE("take_photo error, data_list empty, index %d", index);
            return false;
        }
        uvc_frame_t *frame = data_list[index].back();
        copy = uvc_allocate_frame(frame->data_bytes);
        uvc_duplicate_frame(frame, copy);
    }
    _mutex[index].unlock();
    unsigned char *buf = (unsigned char *) copy->data;
    store_MJPG_image(path.c_str(), buf, copy->data_bytes);
    uvc_free_frame(copy);
    return true;
}
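store_MJPG_image is not shown in the listing above. Because each MJPEG frame is already a complete JPEG image, saving a photo amounts to writing the frame's bytes to disk verbatim; a minimal sketch (the name and parameters follow the call site, but the body is my assumption):

```cpp
#include <cstddef>
#include <cstdio>

// Write a raw MJPEG/JPEG frame buffer to a file unchanged.
// Returns 0 on success, -1 on failure.
int store_MJPG_image(const char *path, const unsigned char *buf, size_t len) {
    std::FILE *fp = std::fopen(path, "wb");
    if (!fp)
        return -1;
    size_t written = std::fwrite(buf, 1, len, fp);
    std::fclose(fp);
    return written == len ? 0 : -1;
}
```

No re-encoding is needed, which is why taking a photo this way is cheap compared to the RGB565 preview path.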
Setting camera parameters
Setting camera parameters uses libuvc's own APIs. This code implements exposure mode, exposure time, gain, and brightness; for other parameters, refer to the libuvc documentation:
int set_param(int index, bool autoExpo, float expoTime, float gain, int brightness) {
    if (!is_start_stream[index]) {
        LOGE("set param error, not start_stream, index %d", index);
        return JNI_ERR;  // returning false here would read as 0, i.e. success
    }
    uint8_t mode;
    uvc_error_t res = uvc_set_ae_mode(devh[index], autoExpo ? 8 : 1);
    LOGI("set_param uvc_set_ae_mode, res %d, ae mode expect %d, index %d",
         res, autoExpo ? 8 : 1, index);
    res = uvc_get_ae_mode(devh[index], &mode, UVC_GET_CUR);
    LOGI("set_param uvc_get_ae_mode, res %d, ae mode current %d, index %d", res, mode, index);
    int ae_abs;
    res = uvc_set_exposure_abs(devh[index], (int) (10000 * expoTime));
    LOGI("set_param uvc_set_exposure_abs, res %d, expo time expect %d, index %d", res,
         (int) (10000 * expoTime), index);
    res = uvc_get_exposure_abs(devh[index], &ae_abs, UVC_GET_CUR);
    LOGI("set_param uvc_get_exposure_abs, res %d, expo time current %d, index %d",
         res, ae_abs, index);
    uint16_t value;
    res = uvc_set_gain(devh[index], (int) gain);
    LOGI("set_param uvc_set_gain, res %d, gain expect %d, index %d", res, (int) gain, index);
    res = uvc_get_gain(devh[index], &value, UVC_GET_CUR);
    LOGI("set_param uvc_get_gain, res %d, gain current %d, index %d", res, value, index);
    int16_t _brightness;
    res = uvc_set_brightness(devh[index], (int) round((brightness - 50) * 1.28));
    LOGI("set_param uvc_set_brightness, res %d, brightness expect %d, index %d",
         res, (int) round((brightness - 50) * 1.28), index);
    res = uvc_get_brightness(devh[index], &_brightness, UVC_GET_CUR);
    LOGI("set_param uvc_get_brightness, res %d, brightness current %d, index %d",
         res, _brightness, index);
    return res;
}
Note that there are four exposure modes: 1 MANUAL, 2 AUTO, 4 SHUTTER_PRIORITY, and 8 APERTURE_PRIORITY. In my tests only 1 and 8 could be set on my camera; this is presumably camera-dependent, so try it with your own device.
Exposure time is in units of 0.0001 seconds, so for example 10 ms is set as 100.
The brightness setting ranges from -64 to 64, mapped from a 0-to-100 scale; the formula in the code above handles the conversion.
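The two conversions above can be captured as small helpers. The scaling factors come from the article's code (0.0001 s exposure units; this particular camera's -64..64 brightness range mapped from a 0..100 UI scale); the helper names are mine:

```cpp
#include <cmath>

// Seconds → the 0.0001 s units expected by uvc_set_exposure_abs.
// Mirrors the article's (int)(10000 * expoTime), which truncates.
inline int exposure_units(float seconds) {
    return static_cast<int>(10000 * seconds);
}

// UI brightness 0..100 → this camera's raw range of -64..64.
inline int brightness_raw(int percent) {
    return static_cast<int>(std::round((percent - 50) * 1.28));
}
```

Because the exposure conversion truncates, float values that are not exactly representable (such as 0.01f) can land one unit low; rounding instead of truncating would avoid that if it matters for your device.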
Closing the camera
Closing the camera is straightforward: stop the stream first, then close the camera, and finally clear the buffered frames used by the asynchronous mode:
void release(int index) {
    LOGI("start release, index %d", index);
    if (is_start_stream[index]) {
        is_start_stream[index] = false;
        LOGI("start stop stream, stream_mode %d, index %d", stream_mode, index);
        if (stream_mode) {
            uvc_stream_stop(strmhp[index]);
            uvc_stream_close(strmhp[index]);
            strmhp[index] = nullptr;
        } else {
            uvc_stop_streaming(devh[index]);
        }
        LOGI("finish stop stream, stream_mode %d, index %d", stream_mode, index);
    }
    if (is_connect[index]) {
        is_connect[index] = false;
        LOGI("start release device, index %d", index);
        uvc_close(devh[index]);
        devh[index] = nullptr;
        LOGI("uvc_close, index %d", index);
        uvc_unref_device(dev[index]);
        dev[index] = nullptr;
        LOGI("uvc_unref_device, index %d", index);
        uvc_exit(ctx[index]);
        ctx[index] = nullptr;
        LOGI("uvc_exit, index %d", index);
        // close(fd[index]);
        fd[index] = 0;
        LOGI("finish release device, index %d", index);
    }
    LOGI("start clear data_list, index %d", index);
    _mutex[index].lock();
    while (!data_list[index].empty()) {
        LOGI("data_list size %d, index %d", (int) data_list[index].size(), index);
        uvc_frame_t *front = data_list[index].front();
        uvc_free_frame(front);
        data_list[index].pop_front();
    }
    data_list[index].clear();
    _mutex[index].unlock();
    LOGI("finish clear data_list, index %d", index);
    LOGI("finish release, index %d", index);
}
Finally, the complete code is available in the demo below.
NDK basics series
JNI and NDK Programming (1): the JNI development workflow
JNI and NDK Programming (2): the NDK development workflow
JNI and NDK Programming (3): JNI data types and type signatures
JNI and NDK Programming (4): calling Java methods from JNI
Complete demo
https://github.com/GavinAndre/AndroidJNIDemo
References
libuvc
UVCCamera
libuvc document