
Capturing Images with a USB Camera on the Jetson TX1 (1)


Implemented in Python. Original post:

https://jkjung-avt.github.io/tx2-camera-with-python/

How to Capture and Display Camera Video with Python on Jetson TX2

Quick link: tegra-cam.py

In this post I share how to use python code (with OpenCV) to capture and display camera video on Jetson TX2, including IP CAM, USB webcam and the Jetson onboard camera. This sample code should work on Jetson TX1 as well.

Prerequisite:

  • OpenCV with GStreamer and python support needs to be built and installed on the Jetson TX2. I use opencv-3.4.0 and python3. You can refer to my earlier post for how to build and install OpenCV with python support: How to Install OpenCV (3.4.0) on Jetson TX2. (A quick way to verify GStreamer support is sketched right after this list.)
  • If you’d like to test with an IP CAM, you need to have it set up and know its RTSP URI, e.g. rtsp://admin:XXXXX@192.168.1.64:554.
  • Hook up a USB webcam (I was using a Logitech C920) if you’d like to test with it. The USB webcam would usually be instantiated as /dev/video1, since the Jetson onboard camera has occupied /dev/video0.
  • Install the gstreamer1.0-plugins-bad-xxx packages, which include the h264parse element. This is required for decoding the H.264 RTSP stream from an IP CAM.
$ sudo apt-get install gstreamer1.0-plugins-bad-faad \
      gstreamer1.0-plugins-bad-videoparsers
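
(Not part of the original post.) A minimal sketch for checking that the installed OpenCV build actually has GStreamer support; it only relies on cv2.getBuildInformation() and prints the relevant line of the build report, which should show "GStreamer: YES":

import cv2

# Print the GStreamer entry from OpenCV's build information.
# If it reads "NO", the GStreamer pipelines used below will not work.
for line in cv2.getBuildInformation().splitlines():
    if 'GStreamer' in line:
        print(line.strip())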


Reference:

  • I developed my code based on this canny edge detector sample code.
  • ACCELERATED GSTREAMER FOR TEGRA X2 USER GUIDE: descriptions of nvcamerasrc, nvvidconv and omxh264dec could be found in this document.

How to run the Tegra camera sample code:

  • Download the tegra-cam.py source code from my GitHub Gist: https://gist.github.com/jkjung-avt/86b60a7723b97da19f7bfa3cb7d2690e
  • To capture and display video using the Jetson onboard camera, try the following. By default the camera resolution is set to 1920x1080 @ 30fps.
$ python3 tegra-cam.py


  • To use a USB webcam and set the video resolution to 1280x720, try the following. The '--vid 1' option means using /dev/video1 (a quick way to list the available video device nodes is sketched after this list).
$ python3 tegra-cam.py --usb --vid 1 --width 1280 --height 720


  • To use an IP CAM, try the following command, replacing the last argument with the RTSP URI of your own IP CAM.
$ python3 tegra-cam.py --rtsp --uri rtsp://admin:XXXXXX@192.168.1.64:554
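
(Not part of the original post.) If you are unsure which index to pass to --vid, here is a minimal sketch that lists the V4L2 device nodes, assuming the usual /dev/videoN naming:

import glob

# List the V4L2 device nodes; on a Jetson the onboard camera usually
# claims /dev/video0, so a USB webcam typically shows up as /dev/video1.
for dev in sorted(glob.glob('/dev/video*')):
    print(dev)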


Discussions:

The crux of this tegra-cam.py script lies in the GStreamer pipelines I use to call cv2.VideoCapture(). In my experience, using nvvidconv to do image scaling and to convert the color format to BGRx (note that OpenCV requires BGR as the final output) produces better results in terms of frame rate.

import cv2


def open_cam_rtsp(uri, width, height, latency):
    gst_str = ("rtspsrc location={} latency={} ! rtph264depay ! h264parse ! omxh264dec ! "
               "nvvidconv ! video/x-raw, width=(int){}, height=(int){}, format=(string)BGRx ! "
               "videoconvert ! appsink").format(uri, latency, width, height)
    return cv2.VideoCapture(gst_str, cv2.CAP_GSTREAMER)


def open_cam_usb(dev, width, height):
    # We want to set width and height here, otherwise we could just do:
    #     return cv2.VideoCapture(dev)
    gst_str = ("v4l2src device=/dev/video{} ! "
               "video/x-raw, width=(int){}, height=(int){}, format=(string)RGB ! "
               "videoconvert ! appsink").format(dev, width, height)
    return cv2.VideoCapture(gst_str, cv2.CAP_GSTREAMER)
    # Translator's note: this pipeline failed to start the camera in my tests,
    # but "return cv2.VideoCapture(0)" displayed video normally; I don't know why.


def open_cam_onboard(width, height):
    # On versions of L4T previous to L4T 28.1, add 'flip-method=2' into nvvidconv.
    # Use the Jetson onboard camera.
    gst_str = ("nvcamerasrc ! "
               "video/x-raw(memory:NVMM), width=(int)2592, height=(int)1458, "
               "format=(string)I420, framerate=(fraction)30/1 ! "
               "nvvidconv ! video/x-raw, width=(int){}, height=(int){}, format=(string)BGRx ! "
               "videoconvert ! appsink").format(width, height)
    return cv2.VideoCapture(gst_str, cv2.CAP_GSTREAMER)
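
(Not part of the original post.) As a quick illustration of how the helper functions above are used, here is a minimal capture-and-display loop; the device index, resolution, window title and ESC-to-quit handling are my own choices, not taken from tegra-cam.py:

# Open the USB webcam on /dev/video1 at 1280x720 through the GStreamer pipeline.
cap = open_cam_usb(1, 1280, 720)
if not cap.isOpened():
    raise SystemExit('Failed to open camera')

while True:
    ret, frame = cap.read()            # one BGR frame from the pipeline
    if not ret:
        break
    cv2.imshow('tegra-cam demo', frame)
    if cv2.waitKey(1) & 0xFF == 27:    # press ESC to quit
        break

cap.release()
cv2.destroyAllWindows()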


Here’s a screenshot of my Jetson TX2 running tegra-cam.py with a live IP CAM video feed. (I also hooked up a Faster R-CNN model to do human head detection and draw bounding boxes on the captured images here, but the main video capture/display code was the same.)

If you like this post or have any questions, feel free to leave a comment below. Otherwise, be sure to also check out my next post, How to Capture Camera Video and Do Caffe Inferencing with Python on Jetson TX2, in which I demonstrate how to feed live camera images into a Caffe pipeline for real-time inferencing.

Reposted from: https://www.cnblogs.com/haiyang21/p/10704308.html

