USB Cameras - NVIDIA Jetson

  • Published Aug 20, 2024

COMMENTS • 93

  • @beeforepremium9401
    @beeforepremium9401 2 years ago +1

    I'm now starting my thesis so I can graduate. And after watching this video, I can now see the light from this dark tunnel! Thank you so much Mr. JetsonHacks you are a life saver!

    • @JetsonHacks
      @JetsonHacks  2 years ago +1

      Thank you for the kind words. Good luck on your studies, and thanks for watching!

  • @Smytjf11
    @Smytjf11 2 years ago +1

    Mate, that beauty filter is precisely what I needed. Maybe we can leave the focus for later 😅

  • @tejobhiru1092
    @tejobhiru1092 2 years ago +2

    THANK YOU SO MUCH..!!!!
    you have no idea how much you're helping people..!
    thanks again!

    • @JetsonHacks
      @JetsonHacks  2 years ago

      Thank you for the kind words, and thanks for watching!

  • @Siri12450
    @Siri12450 2 years ago +1

    After so long, good to see you posting on the Jetson Nano

  • @tarmiziizzuddin337
    @tarmiziizzuddin337 2 years ago +1

    Thank you so much for this Sir! Much appreciated

    • @JetsonHacks
      @JetsonHacks  2 years ago

      You're very welcome, and thanks for watching!

  • @richleyden6839
    @richleyden6839 2 years ago

    Very interesting episode. I have a big collection of USB cameras but haven't used them much after CSI cameras became more accessible. Accessing and exploiting the capabilities of USB cameras was always a mystery. Your demo is very welcome.
    The cabling of USB cameras makes them a better choice for some projects. But that is just one factor of many. As an alternative, I've tried multiple ESP32-CAMs wifi-linked to a Jetson Nano to do detection. Attractive cost-wise but not super reliable.

    • @JetsonHacks
      @JetsonHacks  2 years ago +3

      Thank you for the kind words. Each camera has its own little niche. Webcams are plentiful, and most people have access to them. CSI cameras tend to be less expensive, but the cabling can be a bit of a challenge. Especially for projects where the camera needs to be away from the Jetson. GMSL cameras, which are CSI cameras with a little bit of a disguise, are better if you need long cable lengths. However, GMSL tends to add quite a bit of cost.
      USB cameras tend to have more smarts built in, the CSI cameras tend to be rather raw image sensors. Remote cameras like you mention can also be useful, but require a little more system design to be as robust as a direct connect. Thanks for watching!

  • @CarlosNarvaez-Embedded
    @CarlosNarvaez-Embedded 2 months ago +1

    Nice! Thanks

    • @JetsonHacks
      @JetsonHacks  2 months ago

      You are welcome, and thanks for watching!

  • @mitchell-conrad
    @mitchell-conrad 1 year ago +1

    How did you get more than 2 USB camera streams running at the same time? All of the NVIDIA forums say it's basically impossible. I'm trying to get three USB cameras to stream at the same time, but no luck

    • @JetsonHacks
      @JetsonHacks  1 year ago

      That's a good question. The limiting factor on how many USB cameras you can use at once is the USB bandwidth. Typically if you're using HD 1080P uncompressed video from a USB3 camera, there's only enough bandwidth to support 2 cameras or so.
      If you are using lower resolution or compressed video, you may be able to have more. In the video, there are "three" cameras. The first is a USB 2.0 camera, which is a Logitech webcam. It provides video in various sizes, most importantly a hardware compressed stream. The other two cameras are a ZED RGBD camera, and an Intel RealSense camera. If you look at the resolution of each video (in the window titles), you notice that they are of various sizes. Some are 640x480, others are much larger. It's this combination which allows more than two cameras to work at the same time. In general, you either need to use compressed video streams or smaller video stream sizes.
      Thanks for watching!
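
As an editorial aside, the back-of-the-envelope math behind that bandwidth limit can be sketched in a few lines of Python. The figures here are assumptions for illustration (YUY2 at 2 bytes per pixel, roughly 3.2 Gbit/s of usable USB 3.0 payload bandwidth), not measured values:

```python
# Rough USB bandwidth estimate for uncompressed camera streams.
# Assumptions (illustrative, not measured): YUY2 pixels are 2 bytes,
# and about 3.2 Gbit/s of a USB 3.0 bus is usable for payload.

def stream_mbps(width, height, fps, bytes_per_pixel=2):
    """Raw bit rate of one uncompressed video stream, in Mbit/s."""
    return width * height * bytes_per_pixel * fps * 8 / 1e6

USABLE_USB3_MBPS = 3200  # assumed usable payload bandwidth

hd = stream_mbps(1920, 1080, 30)   # one 1080p30 stream
vga = stream_mbps(640, 480, 30)    # one 640x480p30 stream

print(f"1080p30 raw: {hd:.0f} Mbit/s -> about {USABLE_USB3_MBPS // hd:.0f} streams fit")
print(f"640x480p30 raw: {vga:.0f} Mbit/s -> about {USABLE_USB3_MBPS // vga:.0f} streams fit")
```

In practice drivers often reserve more than this raw figure for their isochronous transfers, which is why a pair of uncompressed 1080p cameras can already saturate the bus.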

  • @qyy2889
    @qyy2889 2 years ago +1

    The video helps a lot

    • @JetsonHacks
      @JetsonHacks  2 years ago

      I am glad you found it useful. Thanks for watching!

    • @qyy2889
      @qyy2889 2 years ago

      @@JetsonHacks It would be even better if you could come up with a video about the Jetson deploying a YOLO model, accelerating inference through TensorRT, and outputting the video with DeepStream! Your videos have been of great help to my studies!

  • @henrywunsch7101
    @henrywunsch7101 10 months ago +1

    (Question) Hi there,
    Really enjoyed the video, I just have one question. For some reason, when I try to run "python3 camera_caps.py" at 6:02 into the video, I get an error saying "ModuleNotFoundError: No module named 'PyQt5'". Do you have a solution for this by chance? I've tried looking into the raw code of that file and just can't find anything.
    Thanks,
    Henry

    • @JetsonHacks
      @JetsonHacks  10 months ago +2

      Thank you for the kind words. If you are running JetPack 5.X, you can execute:
      $ sudo apt install python3-pyqt5
      There are other dependencies, see the README: github.com/jetsonhacks/camera-caps/tree/jetpack-5.x#installation
      Thanks for watching!

  • @thedroneplanner4577
    @thedroneplanner4577 2 years ago +1

    Great video. Thanks

    • @JetsonHacks
      @JetsonHacks  2 years ago

      Thank you for the kind words, and thanks for watching!

  • @alexgunagwera8310
    @alexgunagwera8310 2 years ago

    As always, another great one. Thanks, Jim. By the way, is the website buggy? Or did you turn the comments off?

    • @JetsonHacks
      @JetsonHacks  2 years ago +1

      Thank you for the kind words. Working on the website, thanks for letting me know.

  • @prefpkg21
    @prefpkg21 2 years ago +1

    Thank you so much! I’ve always wanted a deep dive into the cameras under the hood. How do you know if the hardware encode/decoding is being used? Is it automatic?

    • @JetsonHacks
      @JetsonHacks  2 years ago

      You are welcome. Typically the encode/decode process is handled by a GStreamer filter. Most of the NVIDIA filters include the hardware. Thanks for watching!
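
One way to make the hardware path explicit rather than automatic is to name the NVIDIA encoder element in the pipeline yourself. A hedged sketch follows: element names such as `nvvidconv` and `nvv4l2h264enc` are the NVIDIA GStreamer plugins shipped with recent JetPack releases, and the device path and output file are examples:

```python
# Build a GStreamer pipeline string that routes frames through the
# Jetson hardware H.264 encoder (nvv4l2h264enc). Sketch only; element
# names assume the NVIDIA plugins that ship with JetPack.

def h264_record_pipeline(device="/dev/video0", width=1280, height=720,
                         fps=30, outfile="capture.mp4"):
    return (
        f"v4l2src device={device} ! "
        f"video/x-raw,width={width},height={height},framerate={fps}/1 ! "
        "nvvidconv ! "          # convert into NVMM (device) memory
        "nvv4l2h264enc ! "      # hardware H.264 encoder
        "h264parse ! qtmux ! "
        f"filesink location={outfile}"
    )

print(h264_record_pipeline())
```

The resulting string can be run with `gst-launch-1.0` to confirm the hardware element is actually negotiated.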

  • @outofthebots3122
    @outofthebots3122 2 years ago

    Thanks for the detailed info on Jetson cameras. I assume that the low-level V4L2 interface is faster than the higher-level GStreamer?

    • @JetsonHacks
      @JetsonHacks  2 years ago +1

      There's no simple answer to that question, it depends on the project and the competence of the programmer.
      For example, let's say that you are doing an IRL live-streaming application, where you input a camera/audio feed into the Jetson and broadcast it over multiple 4G connections.
      Using GStreamer, this is relatively simple. Grab the audio/video, encode it (select multiple resolutions, using the hardware encoders on the Jetson H.264, H.265, and so on), put it in a container/transport layer (e.g. MPEG-TS) and then send it off to the network code using a library such as SRT.
      With V4L2 alone, you receive the video from the camera. Then you need to code everything else on up. You can reinvent the wheel, but it's not clear how much faster (if any) it might be at the end of the day. If execution speed is the only metric, it might make sense. However if you factor in time and money, the V4L2 approach doesn't make much sense.
      On the other hand, for a simpler application where you are grabbing a video stream and examining/manipulating it on a frame-by-frame basis, you may be able to make it faster. That way you avoid the overhead of the GStreamer framework.
      Thanks for watching!
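
The IRL-streaming route described above can be sketched as one GStreamer string. This is an illustrative assembly, not a tested setup: `nvv4l2h264enc` is the JetPack hardware encoder, `mpegtsmux` the MPEG-TS muxer, and `srtsink` the SRT output element from gst-plugins-bad; the relay URI is hypothetical:

```python
# Sketch of the GStreamer route: camera -> hardware H.264 encode ->
# MPEG-TS container -> SRT network output. Element names assume the
# NVIDIA JetPack plugins and gst-plugins-bad (for srtsink); the relay
# host/port is a made-up example.

def srt_broadcast_pipeline(device="/dev/video0", width=1280, height=720,
                           fps=30, uri="srt://relay.example.com:9000"):
    return (
        f"v4l2src device={device} ! "
        f"video/x-raw,width={width},height={height},framerate={fps}/1 ! "
        "nvvidconv ! "          # move frames into NVMM memory
        "nvv4l2h264enc ! "      # Jetson hardware H.264 encoder
        "h264parse ! mpegtsmux ! "
        f"srtsink uri={uri}"
    )

print(srt_broadcast_pipeline())
```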

    • @outofthebots3122
      @outofthebots3122 2 years ago +1

      @@JetsonHacks Thanks. I will be doing the simple route: grabbing frames and processing them using AI

    • @JetsonHacks
      @JetsonHacks  2 years ago

      @@outofthebots3122 Good luck on your project!

  • @hopekelvinchilunga4720
    @hopekelvinchilunga4720 2 years ago +1

    Hi Jim, thanks for this tutorial, but I have an error: "camera_caps.py", line 31: window_width: int = 640

    • @JetsonHacks
      @JetsonHacks  2 years ago

      I can't tell from your description what the issue might be. Typically you need to say which Jetson, which version of Python, how you are running the program and so on before I can guess.

  • @yalmadiable
    @yalmadiable 2 years ago

    Hi Jim,
    What's the difference between MIPI and USB cameras, and what determines which one you use in an application?
    Are all USB cameras (including thermal USB) easy to integrate on the Jetson, or do they need further development to run on the Jetson?

    • @JetsonHacks
      @JetsonHacks  2 years ago

      It depends on what your needs are. If the USB camera has a V4L2 uvcvideo driver, it should work on the Jetson. However, it's on a case-by-case basis.

    • @yalmadiable
      @yalmadiable 2 years ago

      @@JetsonHacks the drivers will be there, yes, & the application is for outdoor robotics like UGVs & UAVs, for example

  • @Ivan-xg4ev
    @Ivan-xg4ev 2 years ago +2

    Like!

  • @rubend.florez7710
    @rubend.florez7710 1 year ago

    Hello Mr. JetsonHacks, very good video, it helped me a lot. What is the maximum fps the Jetson Nano can run from a webcam?

    • @JetsonHacks
      @JetsonHacks  1 year ago +1

      Thank you for the kind words. The maximum fps varies between webcam models. It depends on the size of the frame you are capturing, the frame rate of the camera, the type of compression (the Jetson has hardware decoders for certain formats), and the amount of USB bandwidth available. Thanks for watching!

  • @wishicouldarduino8880
    @wishicouldarduino8880 2 years ago

    Can't get enough of these camera tutorials, Jim, great work! 👍 I was wondering if I can use a webcam that I bought that requires no special setup, a 30 fps streaming webcam with a mic built in. Would something like that work for this? I hope so, it's a really good cam. I have a robot I want to use this in. In addition to this question, lol, do I need to put a camera on the hand of my robot to help it pick up things? Thanks for any advice, cool video 😁👍🤖

    • @JetsonHacks
      @JetsonHacks  2 years ago +1

      If it's a UVC webcam (most are) it should work. Good luck on your project!

    • @wishicouldarduino8880
      @wishicouldarduino8880 2 years ago +1

      @@JetsonHacks thanks 😁👍🤖

    • @wishicouldarduino8880
      @wishicouldarduino8880 2 years ago

      @@JetsonHacks I have the model with 2 SPI cameras, would the other tutorial for that conflict with this one? I wouldn't think it would, but I'm a novice 😁👍 thanks for any help.

  • @PhilippeMoulin
    @PhilippeMoulin 2 years ago

    Thank you very much!
    I was on my way to try this with an L515 and a plush octopus, but Intel has just discontinued the L515 :-(
    Do you know any other depth camera that's as good, affordable, and won't turn into abandonware the day I get my hands on one?

    • @JetsonHacks
      @JetsonHacks  2 years ago

      The L515 is a little different than other depth cameras, as it is a lidar. However, there are other RGBD depth cameras. The Stereolabs ZED cameras have been on the Jetson since the beginning. There are other RealSense cameras that are still being produced, the D435i for example. One of the newcomers is Luxonis with its OAK-D cameras. The OAK-Ds have built-in smarts in the camera that can run OpenCV code. Thanks for watching!

  • @caseyandtim
    @caseyandtim 2 years ago

    Trying to get going with a Nano 2GB and D435i.
    Updated to JetPack 4.6.1 and OpenCV 4.6.0.
    The D435i registers two depth data streams, no RGB;
    only the second outputs an image (raw IR dots, not depth).
    Both show up in camera-caps but neither displays.
    The output doesn't fully seem to register them:
    `Unsupported format: Z16`
    camera-caps shows:
    `v4l2src device=/dev/video0 !`
    I'm looking for RGBD, or Luma+Depth (but not raw dots).
    Ever encounter any of this with your D435i? Could it be a busted camera?
    Do you know of any small, wide, low-light D435i alternatives?
    Thank you for these videos!

    • @JetsonHacks
      @JetsonHacks  2 years ago

      How did you install the RealSense SDK? Does the RealSense viewer show everything correctly?

    • @caseyandtim
      @caseyandtim 2 years ago

      @@JetsonHacks oh. SDK you say? I’ll install and test when I’m back at the studio tomorrow.

    • @JetsonHacks
      @JetsonHacks  2 years ago

      @@caseyandtim github.com/IntelRealSense/librealsense/blob/master/doc/installation_jetson.md
      I believe that they may have placed the realsense libs into the Ubuntu repositories so you can sudo apt install them.

  • @thenextproblem8001
    @thenextproblem8001 1 year ago

    Hi Jim,
    Can we control the exposure, brightness, etc. with keyboard commands while streaming? I want to use it for my baby, but sometimes I need to adjust it due to light conditions. So I grabbed an ESP connected to the Nano over serial. But how can I adjust the parameters without closing the window?

    • @JetsonHacks
      @JetsonHacks  1 year ago

      Which ESP are you using?

    • @thenextproblem8001
      @thenextproblem8001 1 year ago

      @@JetsonHacks I have both an ESP32-S2 and a WROOM. I'm using pots and buttons to command the other ESP with ESP-NOW. For example, I can map the pot between exposure_min and exposure_max and send the value over ESP-NOW to the other ESP. But how do I read and change the exposure value in the code (Python side)? Any help would be awesome

    • @JetsonHacks
      @JetsonHacks  1 year ago

      @@thenextproblem8001 Unfortunately this is more than I can help with in a YouTube comment. It sounds like you need to program the ESP to control a camera. Once you do that, you should be able to interface with the Jetson using some simple commands. Good luck on your project!

    • @thenextproblem8001
      @thenextproblem8001 1 year ago

      @@JetsonHacks hey Jim, thanks for the help! Sometimes all we need is a "wait a second" moment. It turns out to be pretty easy to do. If anybody wants to do the same, here is my path:
      One ESP (model doesn't matter) reads and sends the data via ESP-NOW.
      The other ESP (model doesn't matter) is connected to the Nano via serial communication.
      Get the data > parse to serial > read serial on the Nano > update the value. Duh!
      Thanks again ☺️

  • @BTXSISTEMAS
    @BTXSISTEMAS 2 years ago

    Nice tutorial, Jim! But I just can't run "camera_caps.py". Line 17: from PyQt5.QtCore import QSize - ModuleNotFoundError: No module named 'PyQt5'. Does anyone have any idea why this happens?

    • @JetsonHacks
      @JetsonHacks  2 years ago +1

      Thank you for the kind words. Your error message says that you do not have the PyQt5 module installed. The program uses Qt5 for its GUI.

  • @Lion_McLionhead
    @Lion_McLionhead 2 years ago

    Would be nice to know how Jetsons are still being bought, since the product line was discontinued last year. NVIDIA seems to have shifted to the metaverse & dropped all of their embedded AI products.

    • @JetsonHacks
      @JetsonHacks  2 years ago +1

      Your statements are incorrect. The Jetsons are still in production. They have not been discontinued. However there have been component shortages (as for all hardware manufacturers) that have limited production. NVIDIA is working through the problem, but as everyone is trying to buy the same (sometimes non-existent) parts, it is challenging. This is an industry wide problem. Note that this isn't for a shortage of Tegra chips, but rather the "glue" chips, memory chips, ethernet chips and so on. Even on the spot market, these chips are difficult to get. See: forums.developer.nvidia.com/t/stop-production-jetsonhw/204799/3
      I do not know what you mean by "dropped all of their embedded AI products". The Metaverse group is not part of the embedded group. NVIDIA is a large tech company; just because one group makes some noise about their product doesn't mean that other groups' products are cancelled. Isaac Sim, which uses Omniverse, helps test code to run on Jetsons in the real world. The embedded group has been working on the latest JetPack 4.6 and 5.0 developer releases. The 4.6 release is now released, and the 5.0 dev release is in the final stages. According to NVIDIA, the Jetson Orin is being released by the end of the month (one guess might be sometime around GTC 2022). Since Orin runs 5.0, we might guess that's when 5.0 will be released.
      The key to understanding the NVIDIA software stack is that each one of the specialized libraries written specifically for NVIDIA hardware runs cross platform. On the Jetson, that means CUDA, cuDNN, TensorRT and so on are basically all the same code that runs on the desktop graphics cards. The point releases may take a little longer to get to the Jetson, but it's the same code base. Therefore any code in the code libraries being developed "automagically" migrates to the Jetsons.

  • @krischlapek6939
    @krischlapek6939 8 months ago

    Hello. Would you be willing to explore RTSP cameras? Anything described on the Jetson forum has a big delay in the streamed image. Thanks

    • @JetsonHacks
      @JetsonHacks  8 months ago

      Is there a particular RTSP camera that you are using? I don't have any at the moment.

    • @krischlapek6939
      @krischlapek6939 8 months ago

      Yes, it's an Axis P3539lr. But I guess any camera with RTSP will do the job. I think they're called IP cameras in general

    • @JetsonHacks
      @JetsonHacks  8 months ago

      @@krischlapek6939 I'm not sure I know anything more about this than what they say in the forums. I would think someone there must have a better idea about how to speed it up. I'm making the assumption you have a fast enough network and a Jetson which can process the stream. Which type of stream are you using from the camera, H.264 with MPEG-4 AAC audio?

  • @betons1868
    @betons1868 1 year ago

    Hello 😊 First of all, thank you very much for your work. How do I load the library? (cat /lib/modules...) The code you wrote gives me an error. Can you help me please? ❤

    • @JetsonHacks
      @JetsonHacks  1 year ago

      You are welcome. It is difficult to understand what issue you are having from your description.

  • @ariels4629
    @ariels4629 3 months ago

    Which camera do you recommend for real-time motion tracking to use with the Jetson Nano 4GB?

    • @JetsonHacks
      @JetsonHacks  3 months ago

      That's a pretty broad question. There are many factors, such as how large of an area you are monitoring and the lighting conditions. Consider starting with a webcam to get a feel of how you want to solve the problem. Thanks for watching!

  • @sharoseali708
    @sharoseali708 2 years ago

    Hi @JetsonHacks, nice to see your excellent work, thanks. I am wondering how to get 4 USB cams live streaming on a Jetson Nano, but OpenCV is not letting me do that with more than 2 cams. After two cams, the 3rd and 4th cameras do not open. I also used GStreamer, but it creates the same issue. I saw your video; you are opening multiple cams at once with almost no lag. Can you please help or give any guide to using 4 USB cam video streams simultaneously? Thanks again. I'll be waiting for your reply.

    • @JetsonHacks
      @JetsonHacks  2 years ago +1

      Thank you for the kind words. The number of USB cameras depends on a couple of factors. The first is how much bandwidth you are using, the second is how the camera implements its driver.
      The Jetson dev kit, even though it has 4 USB 3.0 ports, only implements 1 USB hub. All USB traffic must fit into that bandwidth. When the camera is initialized, the driver reserves the amount of bandwidth it desires. Some camera drivers are greedy, and reserve bandwidth for their maximum resolution and speed. Two high-res cameras requesting raw frames may be taking up the entire USB bandwidth.
      Other camera drivers provide compressed video, which lowers the bandwidth requirements significantly.
      I don't know which cameras you are using, or what commands you are issuing to provide any insight. Good luck on your project, and thanks for watching!

    • @sharoseali708
      @sharoseali708 2 years ago

      @@JetsonHacks thanks for your reply and guidance. Well, I am using Logitech webcams on the Jetson Nano dev kit.
      My code for capturing camera streams is the normal way: cv2.VideoCapture(id, cv2.CAP_GSTREAMER) or cv2.CAP_V4L
      I also tried providing a complex string as the video id, like "v4l2src device=/dev/video1 ! video/x-raw,format=YUY2,width=640,height=480,framerate...." but even this didn't work in my case.
      I am using one powered USB hub with 4 (USB 2.0) cams attached to it, plus keyboard + mouse. If you have any further suggestions about particular cameras, at least 4 of which can work in real time, please suggest. Thanks again for your helpful replies. 🤗👏👏

    • @JetsonHacks
      @JetsonHacks  2 years ago

      @@sharoseali708 Are you using a USB 3 hub? YUY2 is raw frames; I would try MJPG as a first attempt. Good luck on your project!
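
As an editorial sketch of that MJPG suggestion, the OpenCV capture string per camera might look like this. Device paths and sizes are examples, and the `cv2` call is commented out so the snippet stands alone without cameras attached (`jpegdec` could plausibly be swapped for a hardware JPEG decoder on the Jetson):

```python
# Open several USB cameras with compressed MJPG streams so they share
# the bus. Sketch only: device paths and sizes are examples.

def mjpg_pipeline(device, width=640, height=480, fps=30):
    return (
        f"v4l2src device={device} ! "
        f"image/jpeg,width={width},height={height},framerate={fps}/1 ! "
        "jpegdec ! videoconvert ! appsink"
    )

pipelines = [mjpg_pipeline(f"/dev/video{i}") for i in range(4)]
for p in pipelines:
    print(p)
    # cap = cv2.VideoCapture(p, cv2.CAP_GSTREAMER)  # one capture per camera
```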

  • @venrossx
    @venrossx 1 year ago

    Hello! I don't know anything about Python or coding, but this video is very interesting. Could this work with a USB capture card, just for it to be a "monitoring" window? I wonder, but I couldn't figure out how to do it, since everything you do in the video is executed with Python T_T. Amazing job anyways, I would love to know if anyone could "run" the GUI part of the video for what I already mentioned earlier, thanks!

  • @nettleand
    @nettleand 1 year ago

    Would be awesome to get this running on a Raspberry Pi. Any way to do that? Thanks!

  • @jeffg4686
    @jeffg4686 2 years ago +1

    They need a VR headset to go along with it.

    • @JetsonHacks
      @JetsonHacks  2 years ago

      That would be fun. Thanks for watching!

  • @yalmadiable
    @yalmadiable 2 years ago

    Hey man, I'm moving from the TX2 to the NX, but I couldn't use the same JetPack 4.3 I used on the TX2, so I'm stuck in the process! Any advice please 🙏🏻

    • @JetsonHacks
      @JetsonHacks  2 years ago

      Not sure what you are struggling with. For the NX, download the system image and burn it to an SD card using Etcher. I wouldn't use SDK manager unless I absolutely had to on the Xavier NX.

    • @yalmadiable
      @yalmadiable 2 years ago

      @@JetsonHacks hey man, the NX only works with JetPack 4.4, and with the TX2 I'm using JetPack 4.3 & TensorRT, but the NX doesn't seem to like it and I couldn't make it work

    • @yalmadiable
      @yalmadiable 2 years ago

      @@JetsonHacks How do we convert an SSD-MobileNet model to a TensorRT .UFF file on JetPack 4.4 on the Jetson NX?

    • @yalmadiable
      @yalmadiable 2 years ago

      @@JetsonHacks to clarify, the software is to detect & track objects, but detection is not working well when used with the NX! Couldn't fix this so far

    • @JetsonHacks
      @JetsonHacks  2 years ago

      @@yalmadiable The current JetPack release for the Xavier NX is JetPack 4.6.

  • @kenankucukkocak4126
    @kenankucukkocak4126 1 year ago

    What camera is ideal for ship detection? Can you give a website address to buy it?

    • @JetsonHacks
      @JetsonHacks  1 year ago

      I am not a good resource for this question. Please ask on the official NVIDIA Jetson forums where a large group of developers and NVIDIA share their experience. Thanks for watching!

  • @user-rm8bx5lu8s
    @user-rm8bx5lu8s 5 months ago

    The lsmod command is not showing uvcvideo. What should I do?

    • @JetsonHacks
      @JetsonHacks  5 months ago

      What did you do to load the module?

    • @user-rm8bx5lu8s
      @user-rm8bx5lu8s 5 months ago

      @@JetsonHacks I followed all instructions according to your video, but it's not working.

    • @JetsonHacks
      @JetsonHacks  5 months ago

      @@user-rm8bx5lu8s I'm not sure what that means. uvcvideo is not loaded until it is used. What did you do to use it?

  • @randywelt8210
    @randywelt8210 2 years ago

    What is actually better, USB or CSI?

    • @JetsonHacks
      @JetsonHacks  2 years ago +1

      Short answer, the CSI cameras are basically Image Sensor interfaces. Attach an image sensor, read it directly. The USB cameras tend to have more smarts built in.
      If the camera needs to be a distance away from the Jetson, CSI can be a bit challenging. Typically people use GMSL in that case to run longer cable lengths. USB cables can be pretty long, so not as much of an issue.
      The pathway of the CSI cameras has much larger bandwidth, so you'll see applications with multiple CSI cameras. In practice, it's difficult to have more than two USB SuperSpeed cameras. Hope this helps.