Please note that the Jetson Nano works with version 2.1 of the RPi camera (IMX219 sensor). It DOES NOT work with the earlier 1.3 version (OV5647).
Note: Starting with JetPack 4.3/L4T 32.3.1 the Jetson runs OpenCV 4. This means that if you are using an earlier version of JetPack, you will need to select an earlier release of the CSI-Camera repository. In order to do that, before running the samples:
$ git clone https://github.com/JetsonHacksNano/CSI-Camera.git
$ cd CSI-Camera
$ git checkout v2.0
Why didn't I see the comments! Just after buying the incorrect camera, DUMB!
Is this still valid? I have a 4GB Jetson Nano on JetPack 4.6.1 and an IMX219 V1.3, and I had some problems.
@@karatugba Hard to tell what issues you encountered from your description. A v1.3 Raspberry Pi camera is not an IMX219 sensor, but rather an OV5647.
You are "Mr. Jetson" on YouTube. Best source for Jetson information. Good work!
Thank you for my new title, I will wear it proudly. Thanks for watching!
@@JetsonHacks Thanks for your contributions. I totally love how you make things approachable even for noobs!
@@thedescriptor410 You're welcome, and thanks for watching!
If someone ever reads this: the ring is for adjusting the camera's focus; you can put it on the lens and turn it.
Jim, thank you so much for being very active with these topics. I've followed you since the Jetson TX1 and I still find every tutorial you've made useful. A big hug from Italy
Thank you for the kind words, and thanks for watching!
Please note that the Jetson Nano works with the version 2.1 version of the RPi camera (IMX219 sensor). It DOES NOT work with the earlier 1.3 version (OV5467).
JetsonHacks Hello and THANKS for the AWESOME CONTENT!
I just bought a Nano and am trying to follow your example. However, mine does not find and mount any camera I've tried - including the v2.1 RasPi unit. I only have the base Nano SDK loaded. Is JetPack 4.2 required, and/or is there other code which actually detects/mounts the RasPi camera? Note: I can find video1 in the /dev folder, but the Nano does not recognize it...?
@@johnfowler4264 If your RPi v2.1 camera is properly installed, it will appear as /dev/video0
You can run:
$ v4l2-ctl --list-formats-ext
which should list the available formats. There are no additional drivers needed. Thanks for watching!
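For a quick sanity check outside of Python, a test pipeline along these lines (the same elements used in commands elsewhere on this page) should show the live camera image; treat it as a sketch and adjust width/height to your sensor:
$ gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=1920, height=1080, framerate=30/1' ! nvvidconv flip-method=0 ! nvegltransform ! nveglglessink -e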
This has been the most useful tutorial found so far.
Thank you for the kind words, and thanks for watching!
Thanks for doing this tutorial. I was about to give up on my project but this helped me get my camera working!
You're very welcome! I'm glad you got it to work. Thanks for watching!
Great videos! Just bought a nano and can’t wait to explore it. Your videos helped a lot!
Have fun! And thanks for watching!
The Pi always had a weak GPU. This is basically what I've been wanting for a long time. Looking forward to when they are out en masse & the community builds up. This is going to be a hot little seller.
A little competition would be good! Thanks for watching.
I ordered one from Sparkfun, but they're already out of stock. I'm eager to get my hands on it. ETA Prime said that it's already looking like the best SBC for GameCube/Dolphin emulation.
Subscribed. Really nice video with the new Jetson Nano + Raspberry Pi. It's easy to understand. Thanks, sir, for making this video. Keep up the good work, Mr. Jetson!
Thank you for subscribing, and thanks for watching!
That's excellent information! I have an RPi 4B and have been studying Python 3.7 for twelve months now. I look forward to the NVIDIA board some time in the near future👍
There's a learning curve to get started, but it's fun once you get into it.
wow, what a great demo. Thanks
Thank you for the kind words, and thanks for watching!
Big fan of your work! Would love to see a Lipo battery powered solution for the Nano.
Thank you for your suggestion, and thanks for watching!
Good work sir, keep it uploading.
Thank you for the encouragement, and thanks for watching!
@@JetsonHacks Yes keep'em coming!
Mr. Jetson, great vid. very clear.
Thank you, and thanks for watching!
JetsonHacks is the best. Thanks alot.
Thank you for the kind words, and thanks for watching!
Thanks for the nice info. What is the FPS when we are doing object detection using OpenCV and what is the fps when we are doing only object tracking?
What python editor are you using on the Jetson Nano? Pycharm seems to install on the nano, but then can not install libraries once it is open. Pyscripter does not want to install. Is there any reasonable python IDE for the nano? Thanks!
I don't use Python enough to give you a good opinion. Please ask this question in the official NVIDIA Jetson Nano Forum where a large group of developers share their experience. Thanks for watching!
Finally... thanks for this, u r a star. Can u please also do a step by step on a MobileNet SSD COCO model with squares and different colours per object, including labels on the squares for each object?
Thank you for the suggestion, and thanks for watching!
Great video, thank you. I appreciate your showing all relevant outputs, i.e. command line, getting the code, etc.
For computer vision applications, would the camera code be better in C++ - more efficient? I want to run one or two cameras at home for ALPR due to DV/hostile "visits" :-S.
It depends on how proficient you are in coding C++. It's "faster" than Python, but requires more programming skill. A poorly written Python program that runs is better than a C++ program that hasn't been written. Thanks for watching!
you are so awesome for doing this :)
No, you're awesome for watching!
Great pro tip!
Only the best! Thanks for watching!
Thank you very much for your videos.
You are welcome, and thanks for watching!
Hi Jim! Could you tell me if a CSI Wide Angle 200° camera with the OV5647 sensor is compatible with an NVIDIA Jetson Xavier NX or Jetson Orin Nano?
I don't know the full answer. I believe that you need to get/build the OV5647 driver and install it. You may also have to modify the camera somewhat. Please ask the question on the official NVIDIA Jetson forums, where a large group of developers and NVIDIA engineers have experience with this sensor. Thanks for watching!
Thanks a lot for your videos! One question: could this board work with that camera and OpenCV?
The face detect example shown in the video is an OpenCV example. Thanks for watching!
Hi Jim, it's nice to see this tutorial. Can my Nano be connected to a DSLR camera?
I do not know what connect to a DSLR camera means.
If I have to capture image at high resolution from pi camera, do you think running the opencv-python script on Jetson Nano GPU rather than CPU would help?
Hi!! Great video. One question: how do I set 640×480p90 video without getting an error?
These are the video values:
capture_width=1920,
capture_height=1080,
display_width=960,
display_height=540,
framerate=30,
flip_method=0,
How can I edit these without getting an error?
Thanks for all, the videos are very useful
Thank you for the kind words. I do not know what error you are getting from your description. What did you try? Is your camera capable of 640x480 video at 90 fps?
@@JetsonHacks the camera is the raspberry camera v2, the 90 fps are not important for me but I want to set the resolution to 640x480 to decrease the video lag.
@@fernandobarajas566 I don't know what error you are getting. Does the camera support 640x480? What are the results of
$ v4l2-ctl --list-formats-ext
If the camera is not on /dev/video0, then add the -d flag (for example -d /dev/video1)
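For what it's worth, a sketch that may reduce lag without requesting an unsupported sensor mode, assuming the gstreamer_pipeline() helper from the CSI-Camera repository (parameter names as in the question above): capture a native 1280x720 mode and let nvvidconv scale the display down.
pipeline = gstreamer_pipeline(
    capture_width=1280,    # a native IMX219 mode
    capture_height=720,
    display_width=640,     # scaled down for display/processing
    display_height=480,
    framerate=60,
    flip_method=0)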
Thanks for the great video!!!
You are welcome, and thanks for watching!
Thanks a lot, as I was having a lot of problems with this Pi camera on the Nano, and the NVIDIA forum is not the best help for me. You saved me, awesome!
You are welcome, I am glad you found it useful. Thanks for watching!
What will you suggest? Raspberry Pi 4 or Jetson Nano for image processing?
I'm not sure what you are asking. There are so many levels of image processing, and you are asking on a channel that is about Jetson developer kits.
Hello Jim!
I'm just curious if there's a (semi-easy) way to get the camera V1.2 module to work, as opposed to the v2 camera module? The main reason I ask is because I've already got a handful of these from my previous experiments. :)
Thanks in advance and thanks again for these videos!
You're welcome for the videos. As far as I know, there is no easy way to get the earlier cameras to work without modifying the low level camera handlers in the kernel. Thanks for watching!
Thanks, Sir!
This video is very useful. Does the Jetson Nano support Logitech webcams (which types)? I have some trouble with my webcam now.
In the video, a Logitech C920 cam is being used in the Cheese window on the Nano. Thanks for watching!
@@JetsonHacks Thank you so much!
Great video, sir. Could you please create a video featuring the Arducam 64MP autofocus camera - B0399 with a Jetson device? I'm encountering compatibility issues with the mentioned camera and Jetson device.
That sounds like an interesting project, however I don't have that camera. You can ask for help on the official NVIDIA Jetson forums, where a large group of developers and NVIDIA engineers share their experience: forums.developer.nvidia.com/c/agx-autonomous-machines/jetson-embedded-systems/jetson-nano/76 Thanks for watching!
Thank you, Mr. Jetson. 😎
You're welcome, and thanks for watching!
Legendary ❤
Thanks for watching!
Amazing, once again...
All in a days work. Thanks for commenting, and thanks for watching!
Thank you for the nice video.
Thanks to you, I could operate the camera.
But when running face_detect, it says "empty() in function 'detectMultiScale'".
Do you get an error when the OpenCV version is different?
Very good, thank you.
You are welcome. Thanks for watching!
Professional. Nice.
Thank you! Cheers!
Hi!! Great content! Do you know if It is possible to configure de camera as the webcam of the jetson? Thank you
I don't know if it is or not. Please ask this question on the official NVIDIA Jetson Nano forum, where a large group of developers and NVIDIA engineers share their experience. Thanks for watching!
If I were to do Haar cascade classifier training with 1000 positive and negative samples, how long would it take?
Thanks to the video, I can use the CSI camera. Thank you. I have a question.
How can I solve this problem? error:(-215: Assertion failed) !empty() in function 'detectMultiScale'
Hey, quick question here - I am trying to do exactly this, but I need to save the videos (for a science fair). The code for it is pretty simple, but the big problem is that I cannot save the frames fast enough compared to the frame rate. One of my problems is that I want to use mode 3 (1280 by 720 at 60 FPS), but it always chooses the 120 frames per second mode, mode 4. I set fps = 60 in the GStreamer function you made, but it still only does 120 frames per second. Thanks for the information!
I do not have anything to share on your issue. Please ask this question on the official NVIDIA Jetson Nano forum. Thanks for watching!
I've been a subscriber to your channel for over a year, but now that I'm getting a real Jetson, the Nano, I'm looking forward to see if you'll do a lot of projects with it. I'd like to see what 720p @120fps comp vision performance looks like, and also how to hook it up to a battery with a charging circuit/buck converter or some sort of voltage regulator.
Congrats on the purchase! Sound like fun projects. Thanks for watching!
Thanks for the video!! Quick question - how would you play a mp4 file on the nano using gstreamer?
You will need to write an appropriate gstreamer pipeline. Please ask this question on the official NVIDIA Jetson Nano Forum, where a large group of developers and NVIDIA engineers share their experience. Thanks for watching!
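As an illustrative sketch only (element availability depends on your JetPack version; nvv4l2decoder on 4.3+, omxh264dec before that), a pipeline along these lines may play an H.264 .mp4 file:
$ gst-launch-1.0 filesrc location=video.mp4 ! qtdemux ! h264parse ! nvv4l2decoder ! nvegltransform ! nveglglessink -e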
great works
Thanks for watching!
Lots of potential there.. How about mounting the camera on a cheap servo controlled gimbal, and have it identify and track obstacles for robot navigation. It sounds like a simple application and I believe you already have the parts. On a personal note please make sure you exercise and eat healthy..We’re all getting older and it’s easy to sit in front of a computer for 8 hours a day without much movement..Great video as usual…Peace
Thanks for the advice. I always make sure to get out and walk 3 or 4 miles a day, so it's not all sedentary. Thanks for watching!
Like u said, the new Jetson Nano B01 developer kit has two CSI camera slots. You can use the sensor_mode attribute with nvarguscamerasrc to specify the camera; valid values are 0 or 1. I wrote "nvarguscamerasrc sensor_mode=1 !" in the terminal, but it says nvarguscamerasrc: command not found. How can I solve this issue? Thx JetsonHacks
You need to use the nvarguscamerasrc in a Gstreamer command.
@@JetsonHacks "Gstreamer command" couldnt get it. Is it something like terminal? Sorry I am so new to these things. Thank you for your help
okay i solve the issue. Thx.
@@mfatihozkan3503 I am glad you got it to work.
I’m facing the same issue . How did you solve it ?
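For anyone else stuck here: nvarguscamerasrc is a GStreamer element, not a standalone program, so it has to be part of a gst-launch-1.0 pipeline. A sketch (the camera-select property appears as sensor-id, written sensor_id in some examples; adjust the caps to your camera):
$ gst-launch-1.0 nvarguscamerasrc sensor-id=1 ! 'video/x-raw(memory:NVMM), width=1920, height=1080, framerate=30/1' ! nvvidconv flip-method=0 ! nvegltransform ! nveglglessink -e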
Thank you very much for this review. I'm at the stage of choosing between the ROC-RK3399-PC AI (with NCC S1) or the Jetson Nano; which is better for 4K image/video AI/AR processing?
Depends on your needs. The Nano has fewer CPU cores (4xA57), but probably about the same CPU performance as the mixed ROC cores (2xA72 + 4xA53). The Nano has 128 GPU cores; the ROC has 4. They both appear to have 4K video decoders, but the Nano has 4K video encoders whereas the ROC has 2K. I don't know how you program the GPU on the ROC; the Nano uses industry-standard CUDA. I don't know anything about the ROC, so I can't make a suggestion. Thanks for watching!
@@JetsonHacks Thank you verey much for your answer .
Hello ,is the Raspberry Pi night vision(NoIR Camera) can also be used?
It should work. Thanks for watching!
I wonder if Jetson Nano can encode Full HD stream from USB camera (for example Raw, YUV or MJPEG format) to H.264 format fast enough for real-time viewing?
I have a project where I need to take video from USB camera and encode it to H.264 to be streamed over internet.
Do you think Jetson Nano is powerful enough to do that?
I do not know. Please ask this question on the official NVIDIA Jetson forum where a large number of developers and NVIDIA engineers share their experience. The forum is here: devtalk.nvidia.com/default/board/371/jetson-nano/
Will the Jetson Orin Nano work with the Raspberry Pi v3 camera? If possible, what else do I need to do to interface it?
And importantly, I'm using JetPack 6.0. Thanks in advance
I believe it does, but I haven't used it. Please ask your question on the official NVIDIA Jetson Orin Nano forum where NVIDIA engineers can help you through the process. Thanks for watching!
Thanks for the amazing videos. Can you please make a video/share info about connecting Jetson nano remotely to laptop using WiFi or lan??
PS: I am confused whether to buy a monitor or is there another way.
I believe you need a HDMI monitor for the initial setup. That might be something you can check out in the forums. After that, it's a regular Linux box. You can SSH into it and so on.
It is worth asking this question on the official NVIDIA Jetson forum where a large number of developers and NVIDIA engineers share their experience. The forum is here: devtalk.nvidia.com/default/board/371/jetson-nano/
Thanks for watching!
@@JetsonHacks Thank you!! Yes, I had to use the monitor for initial setup and then I installed xrdp, and now it's running fine. I could not make VNC work.
@@009deepesh Please ask on the NVIDIA Jetson Nano Forum. There's a thread that goes over the process.
Hi, I follow u and u are great!!
I have a question for u: I want to know if I can use the Jetson Nano to read distances between the camera and a car (FOR EXAMPLE), or do I need to use a lidar? And then, which libraries must I use?
I will be thankful for ur help.
Thanks for ur videos!!
Regards.
You will need to use a sensor that can return distance information. Lidars can do that, ultrasonic sensors can, as well as depth cameras and binocular cameras. Thanks for watching!
@@JetsonHacks Ty for u answer!
Okay, I'll rely on lidar because I think this technology is more reliable than ultrasonic. On the other hand, if I use lidar, what library can I use for it?
ty again!
@@MrFranckiie It depends on which type of lidar that you use, and the device in particular.
thank you for the video! do you know if its possible to stream to youtube from the jetson nano?
You are welcome. I don't know about YouTube streaming. You should ask on the NVIDIA Jetson Nano forum where other developers may have experience in that area. Thanks for watching!
Is there updated code for the new Raspberry Pi High Quality Camera?
Thanks for sharing the videos. They help a lot in setting up.
Thank you for the kind words. I do not know what you mean by "updated code". Have you written a driver for the camera?
On your GitHub page is written: "If you can open the camera in GStreamer from the command line, and have issues opening the camera in Python, check the OpenCV version."
That is the problem I am facing. I do have cv2 version: '4.5.4' But what to do next, to solve the issue? Should I install another version of cv2? (Using Jetson NANO with Ubuntu 20.04)
Since Jetson Nano does not officially support Ubuntu 20.04, I have not seen the issue you are facing. I'm not sure what the issue is from your description, but at some point you will need to match the code to use the version of OpenCV you are using. Good luck on your project, and thanks for watching!
@@JetsonHacks I restored the original img file, and now it is working (again). Tnx for your advise. Best greetings.
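For anyone debugging the same thing, a quick way to confirm whether the cv2 you are importing was built with GStreamer support (if it wasn't, the pipeline string will fail to open):
$ python3 -c "import cv2; print(cv2.getBuildInformation())" | grep -i gstreamer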
Great tutorial, Sir! How can I take a capture and save it with this camera?
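A minimal sketch of grabbing and saving a single frame, assuming the gstreamer_pipeline() helper from the repository's simple_camera.py (the import and file name here are assumptions, adapt as needed):
import cv2
from simple_camera import gstreamer_pipeline  # helper from the CSI-Camera repo (assumption)

cap = cv2.VideoCapture(gstreamer_pipeline(flip_method=0), cv2.CAP_GSTREAMER)
ret, frame = cap.read()                 # grab one frame from the CSI camera
if ret:
    cv2.imwrite("snapshot.jpg", frame)  # write it to disk
cap.release()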
Awesome one ⬆️⬆️⬆️
I appreciate you watching the video, thanks!
I love that ruler, think I might have to snag one! Do you know of a "legit" source for consumers or is eBay the only bet? :)
I have only seen it for sale to the public at the NVIDIA GTC Developers Conferences. Thanks for watching!
@@JetsonHacks Good to know, thanks! (don't think I'll ever get a chance to make it to one of those, so eBay it is! :))
Also make sure you get your commission(s), your videos were the biggest reason I ended up purchasing the Jetson Nano. :D
Keep up the great work!
@@ThinkinThoed I hope you have fun with your Jetson Nano. Good luck on the ruler hunt!
Looking forward to your tutorial over CUDA support for realsense cameras on Jetson Nano!
I asked this question earlier about the RealSense cameras (D435 and T265) on the first Nano video introduction, and Jim said "The changes to support the RealSense video formats is in part of the kernel which can not be modularized, so there's no easy way to just add a few modules here and there to get it to work." Not sure if this means it is not easy... or can not be done. I would think if the TX2 can, the Nano could too. The Raspberry Pi works for RealSense but it is limited by its USB2 ports. The Nano has USB3 ports, so here's hoping we find out.
@@sy2532 I still hope Mr Jetson will come to our rescue!
@@sy2532 I am fairly new to the jetson and realsense. I have ordered both, but haven't received them yet. I wanted to ask : If at all CUDA support cannot be made available for the LibRealsense on the Jetson Nano, then does the real-sense library work standalone? I mean, do we need to modify the kernel in order to get a basic interface with the camera working?
@@projectnorthernlights8431 Jim just updated... The Nano supports the RealSense.
@@sy2532 That is a relief!
Hello friend. I can't run the jetson-io tool; it flashes and closes. How can I install dtb/dtbo files? JetPack 4.6
How to set the Raspberry Pi camera as default so that it could be used for web based Video Conference?
Depends on the application you are using. The camera shows up as /dev/video0. Thanks for watching!
Does it work with Raspberry Pi 4 Camera IR-CUT Night Vision Camera Module (OV5647)?
I don't think it does. Thanks for watching!
That was really wonderful.
Can I use the Raspberry Pi 4 night vision camera? Is it compatible with the Jetson Nano 4GB? That is the device I have.
Thank you for the kind words. You should be able to use the Raspberry Pi NOIR camera that you mention. The only difference is that the camera does not have an infrared filter on it, where the regular cameras do. Thanks for watching!
@@JetsonHacks thankyou
OpenCV(4.5.5) /io/opencv/modules/videoio/src/cap_images.cpp:253: error: (-5:Bad argument) CAP_IMAGES: can't find starting number (in the name of file): nvarguscamerasrc ! video/x-raw(memory:NVMM), width=3264, height=2464, format=NV12, framerate=21/1 ! nvvidconv flip-method=2 ! video/x-raw, width=320, height=240, format=BGRx ! videoconvert ! video/x-raw, format=BGR ! appsink in function 'icvExtractPattern'
Traceback (most recent call last):
File "/home/aastu/4kilo.py", line 10, in
cv2.imshow('picam' , frame)
cv2.error: OpenCV(4.5.5) /io/opencv/modules/highgui/src/window.cpp:1000: error: (-215:Assertion failed) size.width>0 && size.height>0 in function 'imshow'
I am using a night-vision camera (cam001) but I kept getting this error.
Is that from the version of my camera, or from my code? Please, I am unable to understand the error.
Thank you. Is there a way to attach more then one camera and process video streams from several cameras simultaneously?
In the video, two cameras are being used simultaneously. One is connected to a USB port, the other through the Raspberry Pi connector. There are 4 USB ports. There are third party carrier boards that have different camera input configurations. Thanks for watching!
@@JetsonHacks Mr. Jetson, interesting, could you please cover some of the 3rd party carrier boards? Wonder if there is an Xavier Nano or something.
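As a rough, untested sketch of reading the CSI camera and a USB camera at the same time in OpenCV (pipeline strings follow the ones used elsewhere on this page; the device path is an assumption):
import cv2

csi = cv2.VideoCapture(
    "nvarguscamerasrc sensor-id=0 ! video/x-raw(memory:NVMM), width=1280, height=720, framerate=60/1 ! "
    "nvvidconv ! video/x-raw, format=BGRx ! videoconvert ! video/x-raw, format=BGR ! appsink",
    cv2.CAP_GSTREAMER)
usb = cv2.VideoCapture('/dev/video1', cv2.CAP_V4L)  # e.g. a Logitech C920

while True:
    ok1, f1 = csi.read()
    ok2, f2 = usb.read()
    if ok1:
        cv2.imshow("CSI", f1)
    if ok2:
        cv2.imshow("USB", f2)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
csi.release()
usb.release()
cv2.destroyAllWindows()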
I’m getting an error “no camera available ” when I write the command for utilizing gstreamer. I’ve done the connection properly tho. Can you help?
What brand and model of camera are you using? Does the camera show up on /dev/videoX where X is 0 or 1?
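A quick way to check from a terminal which video devices the system sees:
$ ls -l /dev/video*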
HELP! Nothing pops out and my camera is successfully connected. I am using the Waveshare IMX219-160IR camera
Please ask this question on the official NVIDIA Jetson Nano forum, where a large group of developers and NVIDIA engineers share their experience.
Hello sir, do you know, if my camera has HDMI output, can I use it with my Jetson through a CSI-HDMI interface?
I do not know right off hand. Please ask the question on the official NVIDIA Jetson Nano forum, where a large group of developers and NVIDIA engineers share their experience. They should be able to answer the question. Thanks for watching!
@@JetsonHacks Do you happen to know, if I got an HDMI-to-USB capture device, would this work?
@@ahmedsaleh4640 I haven't tried it. You will need the correct driver/program for the capture device for a Jetson based device. You can ask on the forum to see if anyone else is using one.
Could I use the Jetson for natural language processing? I'm a linguist but don't know a thing about programming...
You probably could, but would be better off using a desktop computer to get started. That area of study is very complex, and tends to require a lot of computer horsepower to build appropriate models. Thanks for watching!
Mr Jetson, have you successfully tested the Raspberry Pi camera on Jetson AGX Xavier? Thanks,
No. Thanks for watching!
@@JetsonHacks Thanks, Mr. Jetson. Any suggestion on this?
Please ask this question on the official NVIDIA Jetson forum where a large number of developers and NVIDIA engineers share their experience. The forum is here: devtalk.nvidia.com/default/board/326/jetson-agx-xavier/
hmm, getting an error. No cameras available.
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),width=3820, height=2464, framerate=21/1, format=NV12' ! nvvidconv flip-method=0 ! 'video/x-raw,width=960, height=616' ! nvvidconv ! nvegltransform ! nveglglessink -e
any suggestions?
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:521 No cameras available
It's hard to determine the issue without knowing what camera you are using and if the camera is properly installed.
Excuse me, I want to ask you something: does the Logitech C920 camera work with the Jetson Nano?
The C920 is shown working with the Jetson Nano in the video. It is being used to show me using the Cheese application, which is then recorded with the rest of the screencast. Thanks for watching!
JetsonHacks thank you so much, and congrats for your videos !!
Thanks !!
You are welcome, and thanks for watching!
Hello... Can you share how to snap and save the photo??
How can I use object tracking and object detection at the same time using a Jetson Nano, a Logitech webcam, and 2 servo motors?
Start here: https://www.youtube.com/watch?v=5WeTAA8Unqo
WOW yes this exactly except I need like.. 6 of them at least. :D
Do you think this will work with v2.1 of the camera?
The Raspberry Pi camera in the video is a v2.1. Thanks for watching!
Hi, I'm using a C920 camera and I am trying OpenCV real-time capture (test code), but I don't know what to write in VideoCapture(). Please help me.
Hard to tell from your description. In Python you could do something like (assuming your camera is on /dev/video1):
cap = cv2.VideoCapture("v4l2src device=/dev/video1 ! video/x-raw, width=640, height=480 ! videoconvert ! video/x-raw,format=BGR ! appsink")
or:
cap = cv2.VideoCapture('/dev/video1', cv2.CAP_V4L)
The second version will try to negotiate appropriate capabilities. If you are going to use compressed video (MJPG or H.264), you will need to add decompressors and so on.
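If the camera is delivering MJPG instead of raw frames, a sketch of the extra decode stage might look like this (standard GStreamer elements; the caps values are assumptions for a C920):
cap = cv2.VideoCapture("v4l2src device=/dev/video1 ! image/jpeg, width=1280, height=720, framerate=30/1 ! jpegdec ! videoconvert ! video/x-raw, format=BGR ! appsink", cv2.CAP_GSTREAMER)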
Is it possible to connect RPi Camera V2 to Jetson Xavier AGX in a similar way?
There is no available CSI camera connection on the Jetson AGX Xavier Developer Kit. Thanks for watching!
Hello, I used both of the commands on the GitHub and it worked, but after I installed OpenCV using pip3 install opencv-python I can't open my CSI camera with those two commands. I wonder if I used the wrong way to download OpenCV or what. Pls help, thx
I believe that when you install opencv-python that will overwrite the native OpenCV package. The native OpenCV package has support for using the RPi camera with the Jetson. Thanks for watching!
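If that is what happened, one possible way to get back to the JetPack-provided OpenCV (an assumption, not verified here) is to remove the pip wheel so the system package is picked up again:
$ pip3 uninstall opencv-python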
I need help designing a Graphical User Interface on a Jetson Nano....
Very informative video, thank you. Do you know if the Nano will do image recognition like you showcased but with 2 simultaneously attached cameras?
You are welcome. I do not know the answer to your question, please ask this on the official NVIDIA Jetson Developers forum. thanks for watching!
Hi, did you get it to work?
Is there a way to hookup a 360 camera to the Nano?
when you reduce the resolution for higher fps, does the image width reduce or the dpi?
The image appears to crop towards the middle. I don't believe the DPI changes.
If you look at the image at
3:36 - 3820x2464
and compare it against:
5:13 - 1280x720
You can see the difference. The 3820x2464 image is scaled down in the demo to fit in the window. If you were viewing it on a 4K monitor, it would fill the screen. The 1280x720 image is shown full size.
Thanks for watching!
@@JetsonHacks I see. It would be good if we could keep the width and reduce the resolution - good for wide-angle cameras and faster image processing...
@@fishworxify Since the images are coming straight into the processor, it should be straightforward to de-res the 1280x720 @ 120fps images. Also, other vendors are coming out with cameras based on the IMX219 that may better serve your needs.
Good video
Thank you! And thanks for watching!
When I run the command gst-launch-1.0 nvarguscamerasrc sensor_id=0 ! ...,
my camera window opens, but when I press Ctrl+C the camera window doesn't close. How do I close the camera window, and how do I resize it? Please help me
You have to select the Terminal window that you are running the command from to use Ctrl C. Thanks for watching!
That's cool. How about using a USB webcam?
Would it be possible to connect Jetson Nano with StereoPi with ROS and I2C?
I don't know anything about the StereoPi. Thanks for watching!
How do I do live video streaming from the Jetson Nano with the Raspberry Pi Camera Module? Please help me with any idea. I want to stream live from the Raspberry Pi Camera Module with the Jetson Nano to any network.
You can ask in the official NVIDIA Jetson Forum for help with this. Thanks for watching!
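For illustration only, one common pattern is an H.264-over-RTP/UDP pipeline; this is an untested sketch with placeholder host/port (on JetPack 4.3+ the hardware encoder element is nvv4l2h264enc):
# On the Jetson Nano (sender):
$ gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=1280, height=720, framerate=30/1' ! nvv4l2h264enc insert-sps-pps=true ! rtph264pay ! udpsink host=192.168.1.100 port=5000
# On the receiving machine:
$ gst-launch-1.0 udpsrc port=5000 ! 'application/x-rtp, encoding-name=H264, payload=96' ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink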
Hi, does jetson nano support RPi V1 camera .. its very cheap .. same pinout?
The default Jetson Nano installation does not support the RPi V1 Camera. Thanks for watching!
@@JetsonHacks is it possible to enable support for that? or its not at all possible?
@@puvi007 I am not sure what you mean by possible. If you port the driver over and modify the tegra camera code, it is possible. However, it may not be easy.
Hello, how can I make the Raspberry Pi v2 camera run in Python code for object detection?
https://www.youtube.com/playlist?list=PL5B692fm6--uQRRDTPsJDp4o0xbzkoyf8
@@JetsonHacks Thanks a lot!
One small doubt: are these two devices connected to the same network or to different networks? Can we do the same for remote access?
I do not understand your question.
@@JetsonHacks Simple: I just want remote access to an NVIDIA Jetson with Ubuntu OS installed, and that Jetson is in a different location. I want remote access from another location.
Both devices are in different locations and connected to different networks.
@@sridharethiraj2149 I don't understand your question well enough to give you an answer. Please ask this question on the official NVIDIA Nano developers forum where a community of developers and NVIDIA engineers can help you.
@@JetsonHacks Even simpler: remote access to an NVIDIA Nano from another Nano, from different locations and connected to different Wi-Fi networks.
@@sridharethiraj2149 Can you do it? Sure. But the question indicates that you do not know how to do that (it's basic Linux stuff). The official NVIDIA Jetson Nano forum is a really good place to get help with your projects. This is a YouTube video about connecting a camera; it does not lend itself to providing detailed answers/how-tos for your projects.
gimme more jetson homie
Will do. Thanks for watching!
My Jetson Nano suddenly turns off when running face_detection.py but works well with sample_camera.py, and I already use a 4A power supply. Anybody know why???
Why use the Jetson Nano for machine learning and AI when we have a PC? Please reply??
U can do it on a PC, but it wouldn't be viable... because we need to pack huge power into as small an SBC as possible, so it can be used for robotics and other stuff that doesn't have enough space for a PC motherboard.
@@gaurav9806 ok thanks😊
Oh man, lol, my like is for the pro tip
Much appreciated. Thanks for watching!
It looks like the face_detect application has a lower frame rate compared with the plain camera view. How can I increase the frame rate for the face_detection script?
You can do about 20 different things for faster frame rates. What have you tried?
JetsonHacks can you just share the tips that will help me in my project? I just know that changing the GPU or changing the camera might increase it.
@@moulich7372 This is more than can be explained in a YouTube comment. There are several ways: grabbing the camera on a different thread, using a machine learning model, accelerating with TensorRT, and so on. You can probably get help on the official NVIDIA Jetson Nano Forum. Good luck on your project!
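To illustrate just the first of those ideas (grabbing frames on a separate thread so detection never blocks capture), a rough sketch, not the code from the repository:
import threading
import cv2

class FrameGrabber:
    """Continuously reads the camera and keeps only the newest frame."""
    def __init__(self, cap):
        self.cap = cap
        self.frame = None
        self.running = True
        threading.Thread(target=self._loop, daemon=True).start()

    def _loop(self):
        while self.running:
            ok, frame = self.cap.read()
            if ok:
                self.frame = frame   # older frames are simply dropped

    def stop(self):
        self.running = False
        self.cap.release()

# usage sketch: grabber = FrameGrabber(cv2.VideoCapture(...)); run face detection on grabber.frame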
How many fps can we get with this face detection?