Embedded Image Processing from Scratch (PI+QT+OpenCV): Hands-On Practice
1 Overview
http://www.cnblogs.com/jsxyhelu/p/7907241.html
2 Environment Setup
http://www.cnblogs.com/jsxyhelu/p/7908226.html
3 Two Examples
http://www.cnblogs.com/jsxyhelu/p/8000804.html
4 Program Framework
http://www.cnblogs.com/jsxyhelu/p/7953805.html
5 Building and Using the Latest OpenCV
http://www.cnblogs.com/jsxyhelu/p/8000819.html
6 Comprehensive Experiment
http://www.cnblogs.com/jsxyhelu/p/8000829.html
7 Addenda
http://www.cnblogs.com/jsxyhelu/p/8007117.html
Finally, we need to complete a comprehensive experiment to validate all the preceding work. The experiment is defined as follows: extract features (ORB/SIFT/SURF/BRISK) from the live camera image in real time and match them against a template. This validates both the compiled OpenCV libraries (the contrib modules are used, so OpenCV has to be built by ourselves) and the basic program framework (camera capture is involved). Everything is compiled and tested in a virtual machine (the PC build of the Pi system) first, and then ported to the Pi.
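Before wiring everything into Qt, it helps to confirm that the hand-built OpenCV really exposes the contrib features. Below is a minimal stand-alone sketch (not part of the original project), assuming an OpenCV 3.x build with the xfeatures2d contrib module; if the SIFT/SURF lines fail to compile or link, the contrib step from part 5 needs to be revisited.
// contrib_check.cpp (hypothetical helper, not from the original post):
// ORB and BRISK live in the core features2d module, while SIFT and SURF
// are only available when opencv_contrib (xfeatures2d) was compiled in.
#include <opencv2/features2d.hpp>
#include <opencv2/xfeatures2d.hpp>
#include <iostream>

int main()
{
    cv::Ptr<cv::Feature2D> orb   = cv::ORB::create();
    cv::Ptr<cv::Feature2D> brisk = cv::BRISK::create();
    cv::Ptr<cv::Feature2D> sift  = cv::xfeatures2d::SIFT::create();
    cv::Ptr<cv::Feature2D> surf  = cv::xfeatures2d::SURF::create();
    std::cout << "all four detectors created; the contrib build looks good" << std::endl;
    return 0;
}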
Project file (.pro):
#
# Project created by QtCreator
#
greaterThan(QT_MAJOR_VERSION, 4): QT += widgets

TARGET   = GOQTTemplate2
TEMPLATE = app

INCLUDEPATH += /usr/local/include/opencv \
               /usr/local/include/opencv2

LIBS += /usr/local/lib/libopencv_world.so

SOURCES += main.cpp\
    mainwindow.cpp \
    clickedlabel.cpp

HEADERS += mainwindow.h \
    clickedlabel.h

FORMS += mainwindow.ui
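The .pro file also pulls in clickedlabel.cpp and clickedlabel.h, which the post does not reproduce: it is simply a QLabel subclass that emits a clicked() signal, so the preview image can react to mouse presses. A minimal sketch of what such a class typically looks like (the names are chosen to match the connect() call in the main window code below; the author's actual file may differ):
// clickedlabel.h -- representative sketch of a clickable QLabel, not the original file
#ifndef CLICKEDLABEL_H
#define CLICKEDLABEL_H

#include <QLabel>
#include <QMouseEvent>

class ClickedLabel : public QLabel
{
    Q_OBJECT
public:
    explicit ClickedLabel(QWidget *parent = 0) : QLabel(parent) {}
signals:
    void clicked(ClickedLabel *self);              // emitted on every mouse press
protected:
    void mousePressEvent(QMouseEvent *event)       // turn a press into the signal
    {
        emit clicked(this);
        QLabel::mousePressEvent(event);
    }
};

#endif // CLICKEDLABEL_H
With these files in place the project builds with qmake followed by make, both in the PC-based Pi virtual machine and on the Pi itself, provided OpenCV was installed under /usr/local as described in part 5.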
Main program file. A brief walkthrough of the flow: on startup the program opens the default camera and then continuously grabs and displays the captured frames. When the displayed image is clicked, the current frame is saved as the template and feature-point matching starts, with the match result shown on screen. A button switches between the different feature-point algorithms:
// mainwindow.cpp (excerpt): camera setup, per-frame feature matching, and Mat-to-QPixmap
// conversion. Member, slot and widget names are representative where the listing is not legible.
    capture.open(0);                                    // open the default camera
    timer = new QTimer(this);
    connect(timer, SIGNAL(timeout()), this, SLOT(readFrame()));
    connect(clickLabel, SIGNAL(clicked(ClickedLabel*)), this, SLOT(saveTemplate(ClickedLabel*)));
    timer->start(33);                                   // grab a frame roughly every 33 ms
}

// Clicking the preview saves the current frame as the matching template
void MainWindow::saveTemplate(ClickedLabel * /*which*/)
{
    tmp = matFrame.clone();
}

// Per-frame slot: grab a frame; once a template exists, detect and match features
void MainWindow::readFrame()
{
    capture >> matFrame;
    if (matFrame.empty()) return;
    if (tmp.empty())                                    // no template yet: just show the live frame
    {
        ui->label->setPixmap(Mat2QPixmap(matFrame));
        return;
    }
    matMatch = matFrame.clone();
    // side-by-side canvas: template on the left, live frame on the right
    dst = Mat(Size(tmp.cols * 2, tmp.rows), tmp.type(), Scalar::all(0));
    tmp.copyTo(dst(cv::Rect(0, 0, tmp.cols, tmp.rows)));
    Ptr<Feature2D> detector;
    string strMethod;
    switch (imethod)                                    // currently selected algorithm
    {
    case 0:  detector = ORB::create();               strMethod = "orb";   break;
    case 1:  detector = xfeatures2d::SIFT::create(); strMethod = "sift";  break;
    case 2:  detector = xfeatures2d::SURF::create(); strMethod = "surf";  break;
    default: detector = BRISK::create();              strMethod = "brisk"; break;
    }
    vector<KeyPoint> keypointsLeft, keypointsRight;
    Mat descriptorsLeft, descriptorsRight;
    detector->detectAndCompute(tmp,      Mat(), keypointsLeft,  descriptorsLeft);
    detector->detectAndCompute(matMatch, Mat(), keypointsRight, descriptorsRight);
    BFMatcher matcher(detector->defaultNorm());
    vector<DMatch> matches;
    matcher.match(descriptorsLeft, descriptorsRight, matches);
    // keep only the matches whose distance is close to the best one found
    double min_dist = 100;
    for (size_t i = 0; i < matches.size(); i++)
        if (matches[i].distance < min_dist) min_dist = matches[i].distance;
    vector<DMatch> good_matches;
    for (size_t i = 0; i < matches.size(); i++)
    {
        if (matches[i].distance <= max(2 * min_dist, 0.02))
            good_matches.push_back( matches[i]);
    }
    drawMatches( tmp, keypointsLeft, matMatch, keypointsRight, good_matches, dst );
    putText(dst, strMethod, Point(20, 20), CV_FONT_HERSHEY_DUPLEX, 1.0f, Scalar(0, 0, 255));
    ui->label->setPixmap(Mat2QPixmap(dst));
}

// Button slot: cycle ORB -> SIFT -> SURF -> BRISK
void MainWindow::on_pushButton_clicked()
{
    imethod = (imethod + 1) % 4;
}

// Convert a cv::Mat (BGR or single channel) into a QPixmap for display
QPixmap MainWindow::Mat2QPixmap(const Mat& src)
{
    QImage img;
    Mat tmpRgb;
    if (src.channels() == 3)
    {
        cvtColor( src, tmpRgb, CV_BGR2RGB );            // Qt expects RGB order
        img = QImage( (const unsigned char*)(tmpRgb.data), tmpRgb.cols, tmpRgb.rows, QImage::Format_RGB888 );
    }
    else
    {
        img = QImage( (const unsigned char*)(src.data), src.cols, src.rows, QImage::Format_Indexed8 );
    }
    QPixmap qimg = QPixmap::fromImage(img) ;
    return qimg;
}
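One detail worth noting about the conversion helper: the QImage constructed here only wraps the Mat's pixel buffer, but QPixmap::fromImage() makes its own copy, so it is safe for the temporary RGB Mat to go out of scope afterwards. The BGR-to-RGB swap is required because OpenCV stores color frames in BGR order while QImage::Format_RGB888 expects RGB.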