

Real-Time Gesture Recognition [Hand Tracking]: the Hands Solution in MediaPipe


Reference links:

1) GitHub repository: https://github.com/google/mediapipe
2) Documentation: https://google.github.io/mediapipe
3) Python setup guide: https://google.github.io/mediapipe/getting_started/python
4) Hands Python solution API usage: https://google.github.io/mediapipe/solutions/hands#python-solution-api

0. Environment Setup
Python setup guide: https://google.github.io/mediapipe/getting_started/python

Ubuntu 20.04
CUDA 11.2
Python 3.8
opencv-python==4.1.2.30
mediapipe==0.8.2

sudo apt install -y protobuf-compiler
sudo apt install -y cmake
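
With the apt packages above in place, the Python packages themselves are installed with pip. This is a minimal sketch using the version pins listed above, followed by a quick check that the package imports and the Hands solution can be constructed:

pip install opencv-python==4.1.2.30 mediapipe==0.8.2

# Sanity check: import the package and construct the Hands solution once.
python -c "import mediapipe as mp; mp.solutions.hands.Hands().close(); print('mediapipe OK')"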
1. Introduction
A quick note: the documentation is essentially all reachable from the first link above. In Python, you install the mediapipe package from PyPI and use the solution through its API.

Documentation: https://google.github.io/mediapipe
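
Reduced to its essentials, the calling pattern looks roughly like this sketch (the blank numpy frame is only a stand-in for a real image; the full official sample follows below):

import numpy as np
import mediapipe as mp

mp_hands = mp.solutions.hands

# A blank RGB frame stands in for a real camera image here.
rgb_frame = np.zeros((480, 640, 3), dtype=np.uint8)

# Construct the solution once, feed RGB frames to process(), then read the landmarks.
hands = mp_hands.Hands(max_num_hands=2, min_detection_confidence=0.5)
results = hands.process(rgb_frame)
if results.multi_hand_landmarks:
    for hand_landmarks in results.multi_hand_landmarks:
        wrist = hand_landmarks.landmark[mp_hands.HandLandmark.WRIST]
        print(wrist.x, wrist.y, wrist.z)  # normalized [0, 1] x/y plus relative depth z
hands.close()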


import cv2
import mediapipe as mp

mp_drawing = mp.solutions.drawing_utils
mp_hands = mp.solutions.hands

file_list = []  # paths of input images for the static-image example

# For static images:
hands = mp_hands.Hands(
    static_image_mode=True, max_num_hands=2, min_detection_confidence=0.5)
for idx, file in enumerate(file_list):
    # Read an image, flip it around y-axis for correct handedness output (see
    # above).
    image = cv2.flip(cv2.imread(file), 1)
    # Convert the BGR image to RGB before processing.
    results = hands.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))

    # Print handedness and draw hand landmarks on the image.
    print('Handedness:', results.multi_handedness)
    if not results.multi_hand_landmarks:
        continue
    image_height, image_width, _ = image.shape
    annotated_image = image.copy()
    for hand_landmarks in results.multi_hand_landmarks:
        print('hand_landmarks:', hand_landmarks)
        print(f'Index finger tip coordinates: (',
              f'{hand_landmarks.landmark[mp_hands.HandLandmark.INDEX_FINGER_TIP].x * image_width}, '
              f'{hand_landmarks.landmark[mp_hands.HandLandmark.INDEX_FINGER_TIP].y * image_height})')
        mp_drawing.draw_landmarks(
            annotated_image, hand_landmarks, mp_hands.HAND_CONNECTIONS)
    cv2.imwrite('/tmp/annotated_image' + str(idx) + '.png', cv2.flip(annotated_image, 1))
hands.close()

# For webcam input:
hands = mp_hands.Hands(min_detection_confidence=0.5, min_tracking_confidence=0.5)
cap = cv2.VideoCapture(0)
while cap.isOpened():
    success, image = cap.read()
    if not success:
        print("Ignoring empty camera frame.")
        # If loading a video, use 'break' instead of 'continue'.
        continue

    # Flip the image horizontally for a later selfie-view display, and convert
    # the BGR image to RGB.
    image = cv2.cvtColor(cv2.flip(image, 1), cv2.COLOR_BGR2RGB)
    # To improve performance, optionally mark the image as not writeable to
    # pass by reference.
    image.flags.writeable = False
    results = hands.process(image)

    # Draw the hand annotations on the image.
    image.flags.writeable = True
    image = cv2.cvtColor(image, cv2.COLOR_RGB2BGR)
    if results.multi_hand_landmarks:
        for hand_landmarks in results.multi_hand_landmarks:
            mp_drawing.draw_landmarks(image, hand_landmarks, mp_hands.HAND_CONNECTIONS)
    cv2.imshow('Hands', image)
    if cv2.waitKey(5) & 0xFF == 27:
        break
hands.close()
cap.release()
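
Building on the landmarks returned above, a common next step toward gesture recognition is a simple finger-count heuristic. The sketch below is only an illustration: the helper name count_extended_fingers and its thresholding rule (a fingertip counts as "extended" when it sits above its PIP joint for an upright, camera-facing hand, with an x-axis comparison for the thumb) are assumptions of this example, not part of the MediaPipe API.

import mediapipe as mp

mp_hands = mp.solutions.hands

# Tip / PIP landmark pairs for the index, middle, ring and pinky fingers.
FINGER_TIP_PIP = [
    (mp_hands.HandLandmark.INDEX_FINGER_TIP, mp_hands.HandLandmark.INDEX_FINGER_PIP),
    (mp_hands.HandLandmark.MIDDLE_FINGER_TIP, mp_hands.HandLandmark.MIDDLE_FINGER_PIP),
    (mp_hands.HandLandmark.RING_FINGER_TIP, mp_hands.HandLandmark.RING_FINGER_PIP),
    (mp_hands.HandLandmark.PINKY_TIP, mp_hands.HandLandmark.PINKY_PIP),
]

def count_extended_fingers(hand_landmarks, handedness_label):
    """Rough finger count for an upright hand facing the camera (heuristic only)."""
    lm = hand_landmarks.landmark
    count = 0
    # Landmark coordinates are normalized image coordinates with y growing
    # downward, so an extended fingertip has a smaller y than its PIP joint.
    for tip, pip in FINGER_TIP_PIP:
        if lm[tip].y < lm[pip].y:
            count += 1
    # Thumb: compare tip and IP joint along x; the direction depends on which
    # hand it is (the webcam loop above already flips the image for selfie view).
    thumb_tip = lm[mp_hands.HandLandmark.THUMB_TIP]
    thumb_ip = lm[mp_hands.HandLandmark.THUMB_IP]
    if handedness_label == 'Right':
        if thumb_tip.x < thumb_ip.x:
            count += 1
    else:
        if thumb_tip.x > thumb_ip.x:
            count += 1
    return count

Inside the webcam loop, each entry of results.multi_hand_landmarks can be paired with the matching label from results.multi_handedness (e.g. results.multi_handedness[i].classification[0].label) and passed to this helper; drawing the returned count on the frame with cv2.putText gives a very basic real-time gesture readout.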

Related: real-time gesture recognition and finger joint detection in Python on pure CPU, based on an improved OpenPose.


Summary

That covers real-time gesture recognition and hand tracking with the MediaPipe Hands solution; hopefully it helps you solve the problem you ran into.