I've never played motion-sensing games either... but I thought it would be nice to control the mouse from a distance, and a rough idea took shape: use a light source, such as a flashlight, to drive the mouse. Facing the computer screen, when I wave it to the left the cursor moves left, and when I wave it to the right the cursor moves right...
Implementation plan
- Grab frames from the computer's webcam, track the position of the light spot, and convert it to screen coordinates (a small coordinate-mapping sketch follows this list)
- Move the mouse to that position
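The core of the first step is a simple linear mapping from camera-frame coordinates to screen coordinates. Here is a minimal sketch of that mapping; the function and parameter names are mine for illustration, and the full script below inlines the same arithmetic:

def to_screen(x, y, w, h, cam_w, cam_h, screen_w, screen_h):
    """Map the center of the light spot's bounding box (camera pixels) to screen pixels."""
    cx = x + w / 2  # bounding-box center in the camera frame
    cy = y + h / 2
    return int(cx * screen_w / cam_w), int(cy * screen_h / cam_h)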
Environment setup
Target tracking can be done with OpenCV:
pip install opencv-python
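A quick way to confirm the install works and that OpenCV can actually read your webcam (camera index 0 is an assumption here; use a different index if you have several cameras):

import cv2

print(cv2.__version__)             # confirm the package imports
cap = cv2.VideoCapture(0)          # default webcam
print("camera opened:", cap.isOpened())
cap.release()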
Mouse control can be done with pyautogui:
pip install pyautogui
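And a quick check for pyautogui: pyautogui.size() reports the actual screen resolution, which you could use instead of the hard-coded 1920x1080 further down, and the built-in fail-safe is handy when a script takes over the mouse:

import pyautogui

print(pyautogui.size())           # actual resolution, e.g. Size(width=1920, height=1080)
pyautogui.FAILSAFE = True         # slamming the cursor into a screen corner aborts the script
pyautogui.moveTo(100, 100, 0.2)   # small test move over 0.2 seconds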
Start coding
from multiprocessing import Process, Manager

import cv2
import pyautogui


def light_tracking(screen_x: int, screen_y: int, q_location):
    """Track the light spot with the webcam and push its screen coordinates into the queue."""
    cap = cv2.VideoCapture(0)
    size_x, size_y = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH)), int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
    while True:
        ret, frame_lwpCV = cap.read()
        if not ret:
            continue
        gray_lwpCV = cv2.cvtColor(frame_lwpCV, cv2.COLOR_BGR2GRAY)
        gray_lwpCV = cv2.GaussianBlur(gray_lwpCV, (21, 21), 0)
        dst = cv2.flip(gray_lwpCV, 1)  # mirror horizontally so waving left moves the cursor left
        white = cv2.threshold(dst, 253, 255, cv2.THRESH_BINARY)[1]  # binarize: pixels brighter than 253 become 255, the rest 0
        contours, hier = cv2.findContours(white.copy(), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if len(contours) > 0:
            # take the first contour found (assumes the flashlight is the only bright spot in view)
            (x, y, w, h) = cv2.boundingRect(contours[0])
            mouse_x = int((2 * x + w) / 2 * screen_x / size_x)  # map the bounding-box center from camera to screen coordinates
            mouse_y = int((2 * y + h) / 2 * screen_y / size_y)
            q_location.put((mouse_x, mouse_y))
        if cv2.waitKey(100) & 0xff == ord('q'):  # 'q' quits (needs an OpenCV window in focus; otherwise stop with Ctrl+C)
            break
    cap.release()


def move_mouse(q_location):
    """Read light-spot positions from the queue and move the mouse there."""
    print("get location")
    while True:
        location = q_location.get(True)  # block until a new position arrives
        pyautogui.moveTo(location[0], location[1], 0.03)


if __name__ == '__main__':
    screen_x = 1920  # screen resolution
    screen_y = 1080
    q = Manager().Queue(500)
    # Run the two tasks in separate processes so tracking and mouse movement don't block each other
    work1 = Process(target=move_mouse, args=(q,))
    work1.daemon = True
    work1.start()
    work2 = Process(target=light_tracking, args=(screen_x, screen_y, q))
    work2.start()
    work2.join()
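If the cursor never moves, the threshold of 253 is the usual suspect. Below is a standalone debugging sketch (not part of the script above; the threshold of 240 is just a starting point to experiment with) that previews the binary mask so you can check that the flashlight actually shows up as a white blob:

import cv2

cap = cv2.VideoCapture(0)
while True:
    ret, frame = cap.read()
    if not ret:
        continue
    gray = cv2.GaussianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (21, 21), 0)
    mask = cv2.threshold(cv2.flip(gray, 1), 240, 255, cv2.THRESH_BINARY)[1]  # try a lower threshold for dimmer lights
    cv2.imshow("mask", mask)  # the flashlight should appear as a solid white blob
    if cv2.waitKey(30) & 0xff == ord('q'):  # press q to quit
        break
cap.release()
cv2.destroyAllWindows()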
Running the program
Make sure the computer has a webcam and that there is no other bright light source in its field of view, then turn on your phone's flashlight, point it toward the camera, and wave it around to try it out.
Any recommendations for games that can be played purely by moving the mouse?