Capturing Images with a USB Camera on the Jetson TX1 (1)
Implemented in Python
https://jkjung-avt.github.io/tx2-camera-with-python/
How to Capture and Display Camera Video with Python on Jetson TX2
Oct 19, 2017
Quick link: tegra-cam.py
In this post I share how to use python code (with OpenCV) to capture and display camera video on Jetson TX2, including IP CAM, USB webcam and the Jetson onboard camera. This sample code should work on Jetson TX1 as well.
Prerequisite:
- OpenCV with GStreamer and Python support needs to be built and installed on the Jetson TX2. I use opencv-3.4.0 and python3. You can refer to my earlier post for how to build and install OpenCV with Python support: How to Install OpenCV (3.4.0) on Jetson TX2. (A quick way to verify the GStreamer support is sketched right after this list.)
- If you’d like to test with an IP CAM, you need to have it set up and know its RTSP URI, e.g. rtsp://admin:XXXXX@192.168.1.64:554.
- Hook up a USB webcam (I was using a Logitech C920) if you’d like to test with it. The USB webcam is usually instantiated as /dev/video1, since the Jetson onboard camera occupies /dev/video0.
- Install gstreamer1.0-plugins-bad-xxx, which includes the h264parse element. This is required for decoding the H.264 RTSP stream from an IP CAM.
$ sudo apt-get install gstreamer1.0-plugins-bad-faad \
gstreamer1.0-plugins-bad-videoparsers
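A quick way to verify that the installed cv2 module was actually built with GStreamer support (my own sanity check, not part of the original post) is to grep OpenCV's build information:

import cv2

# Print the GStreamer-related lines of OpenCV's build information.
# Expect something like "GStreamer: YES (...)" if the build is correct.
for line in cv2.getBuildInformation().splitlines():
    if "GStreamer" in line:
        print(line.strip())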
Reference:
- I developed my code based on this canny edge detector sample code.
- ACCELERATED GSTREAMER FOR TEGRA X2 USER GUIDE: Descriptions of nvcamerasrc, nvvidconv and omxh264dec can be found in this document.
How to run the Tegra camera sample code:
- Download the tegra-cam.py source code from my GitHub Gist: https://gist.github.com/jkjung-avt/86b60a7723b97da19f7bfa3cb7d2690e
- To capture and display video using the Jetson onboard camera, try the following. By default the camera resolution is set to 1920x1080 @ 30fps.
$ python3 tegra-cam.py
- To use a USB webcam and set the video resolution to 1280x720, try the following. The ‘--vid 1’ means using /dev/video1. (A sketch of how these flags are parsed appears after this list.)
$ python3 tegra-cam.py --usb --vid 1 --width 1280 --height 720
- To use an IP CAM, try the following command, while replacing the last argument with the RTSP URI of your own IP CAM.
$ python3 tegra-cam.py --rtsp --uri rtsp://admin:XXXXXX@192.168.1.64:554
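The argument parsing itself is not reproduced in the code excerpt below. As a rough, assumed sketch (the real tegra-cam.py parser may differ in names and defaults), the flags above could be wired to the three open_cam_* helpers shown in the Discussions section like this:

import argparse

# Hypothetical wiring of the command-line flags to the capture helpers.
parser = argparse.ArgumentParser()
parser.add_argument("--rtsp", action="store_true", help="use an IP CAM via RTSP")
parser.add_argument("--uri", type=str, default=None, help="RTSP URI of the IP CAM")
parser.add_argument("--latency", type=int, default=200, help="RTSP latency in ms")
parser.add_argument("--usb", action="store_true", help="use a USB webcam")
parser.add_argument("--vid", type=int, default=1, help="USB device number (/dev/videoN)")
parser.add_argument("--width", type=int, default=1920, help="image width")
parser.add_argument("--height", type=int, default=1080, help="image height")
args = parser.parse_args()

if args.rtsp:
    cap = open_cam_rtsp(args.uri, args.width, args.height, args.latency)
elif args.usb:
    cap = open_cam_usb(args.vid, args.width, args.height)
else:
    cap = open_cam_onboard(args.width, args.height)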
Discussions:
The crux of this tegra-cam.py script lies in the GStreamer pipelines I use to call cv2.VideoCapture(). In my experience, using nvvidconv to do image scaling and to convert the color format to BGRx (note that OpenCV requires BGR as the final output) produces better results in terms of frame rate.
import cv2

def open_cam_rtsp(uri, width, height, latency):
    gst_str = ("rtspsrc location={} latency={} ! rtph264depay ! h264parse ! omxh264dec ! "
               "nvvidconv ! video/x-raw, width=(int){}, height=(int){}, format=(string)BGRx ! "
               "videoconvert ! appsink").format(uri, latency, width, height)
    return cv2.VideoCapture(gst_str, cv2.CAP_GSTREAMER)

def open_cam_usb(dev, width, height):
    # We want to set width and height here, otherwise we could just do:
    #     return cv2.VideoCapture(dev)
    gst_str = ("v4l2src device=/dev/video{} ! "
               "video/x-raw, width=(int){}, height=(int){}, format=(string)RGB ! "
               "videoconvert ! appsink").format(dev, width, height)
    return cv2.VideoCapture(gst_str, cv2.CAP_GSTREAMER)
    # Note (from the original Chinese annotation): this pipeline could not start the
    # camera during testing, but "return cv2.VideoCapture(0)" displayed video normally;
    # the cause is unknown (see the fallback sketch below).
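If the v4l2src pipeline refuses to open the USB camera (as the note above mentions), one possible workaround, my own suggestion rather than anything from the original post, is to fall back to OpenCV's default capture backend and request the resolution through the capture properties; the camera may ignore the request if it does not support that size:

# Hypothetical fallback for the USB-camera issue noted above.
def open_cam_usb_fallback(dev, width, height):
    cap = cv2.VideoCapture(dev)                # open /dev/video{dev} directly
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, width)   # request the desired resolution
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, height)
    return cap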
def open_cam_onboard(width, height):
    # Use the Jetson onboard camera.
    # On versions of L4T previous to L4T 28.1, use flip-method=2.
    gst_str = ("nvcamerasrc ! "
               "video/x-raw(memory:NVMM), width=(int)2592, height=(int)1458, "
               "format=(string)I420, framerate=(fraction)30/1 ! "
               "nvvidconv ! video/x-raw, width=(int){}, height=(int){}, format=(string)BGRx ! "
               "videoconvert ! appsink").format(width, height)
    return cv2.VideoCapture(gst_str, cv2.CAP_GSTREAMER)
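The excerpt above only shows how the cv2.VideoCapture objects are constructed; the actual capture/display loop of tegra-cam.py is not reproduced here. A minimal sketch of reading and showing frames (window name and key handling are my own choices) could look like this:

# Minimal read-and-display loop; the real tegra-cam.py loop may differ.
cap = open_cam_usb(1, 1280, 720)      # or open_cam_rtsp(...) / open_cam_onboard(...)
if not cap.isOpened():
    raise RuntimeError("failed to open camera")
while True:
    ret, frame = cap.read()           # frame is a BGR numpy array
    if not ret:
        break
    cv2.imshow("Camera Demo", frame)
    if (cv2.waitKey(1) & 0xFF) == 27: # press ESC to quit
        break
cap.release()
cv2.destroyAllWindows()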
Here’s a screenshot of my Jetson TX2 running tegra-cam.py with a live IP CAM video feed. (I also hooked up a Faster R-CNN model to do human head detection and draw bounding boxes on the captured images here, but the main video capture/display code was the same.)

If you like this post or have any questions, feel free to leave a comment below. Otherwise be sure to also check out my next post How to Capture Camera Video and Do Caffe Inferencing with Python on Jetson TX2, in which I demonstrate how to feed live camera images into a Caffe pipeline for real-time inferencing.