Combining the Camera and the Jetson Nano certainly opens a lot of possibilities for robotic navigation and obstacle avoidance. A gps receiver would be a good addition. The demo only shows a fraction of the potential that’s available to someone with an imagination and the money to invest. This could easily be turned into a serious autonomous lawn mower or other independent robotic device. I hope you’re planning on taking this in a specific direction at some point. It’s about time for a project. Good job.
Thank you for the kind words, and thanks for watching!
Thanks for showing! I always love to see your videos about the Nano.
I also make SBC review videos on UA-cam. I had to choose between the Odroid N2 and the NVIDIA Jetson, and I went for the N2. I needed it to do a lot of Blender renders, and it does that great with its powerful CPU. But now I've got a 4K USB3 camera to review, and the N2 can only handle 720p software encoding with it. I should use a board with USB3 and hardware encoding (I hope). Maybe the Odroid XU4; I'll have to get it out of the dust. Again, thank you.
Thank you for the kind words. You are welcome, and thanks for watching!
Right-clicking within the terminal window should paste what is held on the clipboard. No need to select Paste. Similarly, whatever is highlighted within a terminal window is sent to the clipboard.
Thanks for the tips! I think that you need to click the middle button in the Terminal to paste the clipboard contents, as the right button should always bring up the context menu. Of course we always use (or at least try to use) the menus in the videos so that it's easier to follow the tutorials. Thanks for watching!
Nice fire extinguisher. I think I'll put my battery pack inside one when my Jetson arrives next week.
Mr Fire Extinguisher is impressed! Thanks for watching!
should this work with the Jetson AGX Orin? The app boots up but it does not seem to detect my D435i camera.
I do not know, but the video is 5 years old. The updated installation instructions are on the Intel RealSense GitHub: github.com/IntelRealSense/librealsense/blob/master/doc/installation_jetson.md
Good luck on your project!
Thank you!!
these two are made to be used together.
You are welcome, and thanks for watching!
Great Video!
In the B&W left & right images do you see the IR pattern (maybe test in low light condition)?
Is this pattern always the same or is it changing over time?
I've changed this answer. The T265 does not have an IR emitter, it is a visible light device. Thanks for watching!
Enjoyed! If you don't mind the question. Do you have a background in AI or electronics?
I'm glad you liked the video. I have a computer science degree. Thanks for watching!
Thanks for the nice demo
You are welcome, and thanks for watching!
Thank you for your tutorial. I just want to know: if I use RealSense cameras instead of normal MIPI CSI cameras for object tracking, what advantages will I have?
Thanks for the video once again! I loved the PRO TIP about film covering!
I am planning to buy one of these sets (Jetson Nano + RealSense T265) for a prototype application, only for skeletal tracking and applying a projection matrix at the highest frame rate I can get. I wonder if this could be up to 60fps, and if I could use more than one T265 plugged into the same board...
The frame rate of the cameras is 30fps. You can use more than one camera. Thanks for watching!
Looks great. Will it work on my Pixhawk rover project?
I do not know. Thanks for watching!
Great video!
Thanks!
Thank you for the kind words, and thanks for watching!
Very nice overview! Have you got an idea whether you could run RTABMap on a turtlebot platform with this hardware?
Or have you perhaps already tried RTABMap without external odometry?
RTABMap is designed for use with depth cameras. The images from the T265 need to be rectified before you can use them for that particular package. Thanks for watching!
Hello Jim, Can you help me figure out how to use both intel realsense t265 and d435 in rviz (ros) as I'm really in trouble right now can you help me out. Thanks in advance!
I want to get the X,Y,Z coordinates of the camera (displayed in the real sense windows) into my python code as a variable. Seems like pyrealsense2 can be used to get that data. But I can't find which function do that exactly. Do you have any idea for that ?
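For the question above, pyrealsense2 is indeed the way in: enable the pose stream, then read the translation off each pose frame. The stream setup below follows Intel's published T265 Python example; `translation_xyz` is just a helper defined here, not a library function, and the camera-reading part obviously needs a T265 attached.

```python
def translation_xyz(pose_data):
    """Unpack a librealsense pose's translation into a plain (x, y, z) tuple."""
    t = pose_data.translation
    return (t.x, t.y, t.z)

def main():
    # pyrealsense2 is only needed when a camera is attached,
    # so the import lives inside the function.
    import pyrealsense2 as rs

    pipe = rs.pipeline()
    cfg = rs.config()
    cfg.enable_stream(rs.stream.pose)  # T265 6-DOF pose stream
    pipe.start(cfg)
    try:
        for _ in range(100):
            frames = pipe.wait_for_frames()
            pose = frames.get_pose_frame()
            if pose:
                x, y, z = translation_xyz(pose.get_pose_data())
                print(f"x={x:.3f} y={y:.3f} z={z:.3f}")
    finally:
        pipe.stop()

if __name__ == "__main__":
    main()
```

The pose data object also carries velocity, acceleration, and a rotation quaternion alongside the translation, so the same pattern reaches those fields too.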
What is the accuracy of T265? Can you measure the distance and length of an object? The accuracy needs to be able to reach an error of about 1 cm, and the carrier may be unstable.
Not sure what you are asking. The T265 is a tracking camera.
Thanks for the video. I was waiting for it. Can the D435 work in the same way without patching the kernel?
Here's a video comparing the differences: ua-cam.com/video/8ckhQP7Af-g/v-deo.html
Thanks for watching!
when you added the swapfile, was that a permanent add ?, or should we add it on a start up file ?
It remains across reboots, no need to add to a start up file. However once you are done with rebuilding the kernel, you may not want the swap file. You can use swapoff to turn off the swap file, and modify the /etc/fstab file. Thanks for watching!
The install for the D435 on the Nano is the same as for the T265, correct? So if I plug both cameras in they should both work.... but when I start ./realsense-viewer I can only add the D435, I cannot add the T265. Any ideas?
Hi Buddy. I'm a person with limitations due to amyotrophic lateral sclerosis. Sorry, what is the accuracy of the eye tracking of this device?
Hi Miguel. The tracking camera tracks its position in space, not a feature such as eyes. Thanks for watching!
@@JetsonHacks
This product is not an eye tracker like Tobii eye tracker 5?
@@migueldelahoz4740 That is correct. It is not an eye tracker.
Anyone know how much power this board draws in deep sleep mode? I saw on the nvidia forums that it was around 10 mW but I wanted a second opinion
could you share the STL for the 3d printed mount?
I seem to recall that it's on thingiverse. Search for T265, there are several models there. Good luck on your project!
Nice tutorial! Do you think you could test GPU rendering for blender on it? I'm looking into buying 1 or 2 to add to a small render farm and the nano might be very useful
Thank you for this new tutorial. 😀
You are welcome, and thanks for watching!
Very cool
It's a very nice device. Thanks for watching!
What Linux distro are you using in this video on your desktop PC? Thanks.
The computer in the demo is a NVIDIA Jetson Developer Kit which runs L4T, an Ubuntu derivative. Thanks for watching!
hmm... the 3D at the end looks terribly imprecise. You picked the camera up from a specific location at first and returned it to almost the same point, yet there is a differential distance that seems to be around 30 cm when we take a side view of the 3D location diagram. Can you show us that again, or tell us what that tolerance was, please? It makes a difference to me. Thank you.
I have found it to be very good. It doesn't sound like it meets your needs. Thanks for watching!
@@JetsonHacks I admire the support you give and the videos you do are very helpful to me. Thank you! I wonder if it is possible to tweak/improve the location of the realsense with functions or settings that come with it?
There are a large number of parameters to control. It depends on your application as to how you would get better accuracy. For example, if you are using a wheeled robot you can send odometry information to the camera and it will incorporate that information to get better positioning. To be clear, this is a "how to use the T265 with a Jetson Nano" how to, not a qualification of the T265 for a particular use.
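The wheel-odometry input mentioned in the reply above can be sketched with pyrealsense2. The hardware calls below follow Intel's published T265 wheel-odometry Python example (including the library's own `load_wheel_odometery_config` spelling); the calibration file path and the encoder numbers are hypothetical, and `wheel_velocity` is a generic tick-to-velocity helper defined here, not part of librealsense.

```python
import math

def wheel_velocity(ticks, ticks_per_rev, wheel_radius_m, dt_s):
    """Translational velocity (m/s) from encoder ticks counted over dt_s seconds."""
    revolutions = ticks / ticks_per_rev
    return revolutions * 2.0 * math.pi * wheel_radius_m / dt_s

def main():
    # Hardware-only part: needs a T265 plus a wheel-odometry calibration JSON.
    import pyrealsense2 as rs

    pipe = rs.pipeline()
    cfg = rs.config()
    cfg.enable_stream(rs.stream.pose)
    profile = cfg.resolve(pipe)
    odometer = profile.get_device().first_pose_sensor().as_wheel_odometer()

    # calibration.json (hypothetical path) describes the wheel sensor placement.
    with open("calibration.json") as f:
        odometer.load_wheel_odometery_config([ord(c) for c in f.read()])

    pipe.start(cfg)
    try:
        for frame_num in range(1000):
            v = rs.vector()
            # Sample numbers: 120 ticks of a 360-tick encoder on a 5 cm wheel in 0.1 s.
            v.z = wheel_velocity(120, 360, 0.05, 0.1)
            odometer.send_wheel_odometry(0, frame_num, v)  # wheel sensor id 0
    finally:
        pipe.stop()

if __name__ == "__main__":
    main()
```

Which axis of `rs.vector` your robot's forward velocity maps onto depends on how the camera is mounted, so treat the `v.z` assignment as an assumption.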
Hey! I wanted to ask: are there any alternatives for fast-paced applications, for example race drones? And even if the tracking and SLAM are on par for the job, would the Nano be able to handle it?
It's hard to tell, there's a lot of variables. The T265 is pretty new, I certainly don't have enough experience with it yet to give a meaningful answer. Thanks for watching!
@@JetsonHacks I look forward to getting into this any suggestions from where to learn about it and stuff?
@@ozy5332 It's a very large subject. For the drone stuff, Pixhawk and DIY Drones are a good place to start. For SLAM, that's a really big area. It's beyond what I can describe here, but the Robot Operating System (ROS) is as good a starting point as any.
@@JetsonHacks btw, for SLAM with Python I watched GeoHotz; he has a video where he codes it for hours. I understood a few things but still have a long way to go.. thanks for the other references though :)
Is it rather visual odometry and not SLAM, or is there really a map generated?
There is a map generated, stored on board the camera. Thanks for watching!
You said it gives pose data. What is that? Is it position/direction data?
Correct. You can see in the video that it knows where it is in 3 space, and which direction it is pointing. Thanks for watching!
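To unpack the answer above: the T265 reports position as a translation vector and orientation as a unit quaternion (w, x, y, z). A common way to turn that quaternion into heading angles is the standard ZYX Euler conversion sketched below; how roll/pitch/yaw map onto the T265's own axis convention is not covered in the video, so treat the labels as generic.

```python
import math

def quat_to_euler(w, x, y, z):
    """Convert a unit quaternion to (roll, pitch, yaw) in radians, ZYX order."""
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    # Clamp before asin to guard against floating-point drift past [-1, 1].
    pitch = math.asin(max(-1.0, min(1.0, 2.0 * (w * y - z * x))))
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return roll, pitch, yaw

# Identity quaternion: no rotation, so all three angles are zero.
print(quat_to_euler(1.0, 0.0, 0.0, 0.0))
```

A 90-degree turn about the z axis, `quat_to_euler(math.cos(math.pi/4), 0, 0, math.sin(math.pi/4))`, comes back with yaw of pi/2.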
Can a Razer Stargazer be used with the nano instead? It was a realsense webcam
I don't have any experience with the Stargazer.
The line "Build CMAKE: false" at 4:32 does not show up for me, and later I get the error message "CMAKE_CUDA_COMPILER not set, after EnableLanguage". How can I fix this?
Thanks!
I do not know which L4T version you are running. Newer version of the repository do not rebuild CMAKE. I do not know if you have CUDA installed on your Jetson. Typically you would look for the CUDA compiler and set the path appropriately.
@@JetsonHacks Thanks for your quick reply. I am actually using the depth and tracking camera together with python on a RPi4. I'll keep looking how to fix the issue.
@@pratikparajuli5991 The scripts are for a NVIDIA Jetson Nano which has a GPU. For the RPi you will need to modify the script both for that environment and architecture. Good luck on your project.
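For the CMAKE_CUDA_COMPILER error earlier in this thread, one way to set the path on a Jetson is to point CMake at nvcc explicitly. This is a command fragment for the librealsense build directory; `BUILD_WITH_CUDA` is a librealsense CMake option, but the nvcc path assumes a stock JetPack install and may differ on your system.

```shell
# Run from the librealsense build directory.
# The nvcc location below is the usual JetPack default; adjust if needed.
cmake .. -DBUILD_WITH_CUDA=true \
         -DCMAKE_CUDA_COMPILER=/usr/local/cuda/bin/nvcc
```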
Installed librealsense on my Nano last night and was able to confirm that the T265 is working with realsense-viewer. The problem I am having is that the T265 is not detected by realsense-viewer or any other realsense software after a restart where the T265 is left plugged in. If I unplug it and then plug it back in everything works again. Anyone else have this issue? Is it possibly a firmware issue on the T265?
It looks like it's losing position tracking quite a lot!.. then a glitch before re-tracking..
Currently the link shows the camera is unavailable.
does it have jittering issue?
I tried to use the RealSense T265 on a Jetson Nano. I had 4.9GB free space when I cloned the installSwapfile directory, but when I ran the ./installSwapfile.sh command I got the error "fallocate: fallocate failed: No space left on device", and my free space became 0. The clone is not that big. Do you know about this issue?
The swapfile is 4.0 GB. The library and firmware for the T265 are of significant size. If you were trying to build the kernel too, there's no way that all fits in 4.9GB. Typically when building something like this, you should have an adequate size SD card/USB drive with enough free space. Also, it is recommended that you build on a fresh install.
Seems like I should use a 32GB SD card instead of the 16GB one. Thanks for the reply.
@@PRASANNAMALITH-s1g Yes, 16GB just gets you going. If you are going to develop on it or gather data, 32-64GB is a better choice.
It worked well with 64GB microSD card. Thanks
@@PRASANNAMALITH-s1g I am glad you got it to work.
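A quick pre-flight check for the free-space problem in this thread: the install script allocates a 4 GB swap file with fallocate, so it helps to confirm the card has room before running it. A minimal sketch; the 1 GB of extra headroom is an arbitrary safety margin chosen here, not a figure from the video.

```python
import shutil

SWAPFILE_BYTES = 4 * 1024**3  # the install script allocates a 4 GB swap file

def has_room_for_swap(free_bytes, swap_bytes=SWAPFILE_BYTES,
                      headroom=1 * 1024**3):
    """True if there is space for the swap file plus some working headroom."""
    return free_bytes >= swap_bytes + headroom

if __name__ == "__main__":
    free = shutil.disk_usage("/").free
    print(f"free: {free / 1024**3:.1f} GB, "
          f"room for 4 GB swap: {has_room_for_swap(free)}")
```

With the 4.9 GB reported in the comment above, the check fails, which matches the fallocate error seen there.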
Can you show us a real application with this camera?
How about using two webcams instead($4 each)?
You should try that.
Thanks!
You are welcome, and thanks for watching!
that is awesome
Thanks for watching!
@@JetsonHacks Hey man, would you be able to help me out? I really want to make an AI to trade forex for me. I can't find any information and I'm not budging any closer to my goals. I thought you would be the best person to ask for direction. Thanks!!
@@codyvandervoort4079 I don't know anything about forex trading. Good luck on your project!
JetsonHacks thanks for the reply. I ended up finding some information on quant trading and making charts with R programming. A buddy said the neural net he was running for years was too stressful; the less complicated the better.
How does this camera compare to the zed?
The T265 is a tracking camera, and the ZED is a RGBD camera. Thanks for watching!
@@JetsonHacks Right, I probably should have asked against the D435. I am looking at a navigation / obstacle camera for a remote lawnmower project, and am between the T265/D435 and the Zed. Main board will be a jetson nano, of course :)
Do you have an opinion on the comparison? You have videos on all three cameras.
@@kalsprite No opinion. Each has its strengths and weaknesses. If you are using a Jetson Nano, you should be aware that the ZED does all of its processing on the Nano, which can be computationally expensive. That is, there's not a lot of extra computation capacity available past that. The Intel cameras do their computation on board the camera.
@@kalsprite Late reply, but oh well. In addition to the heavy computation needed by the ZED, it also works terrible indoors, like in a small room for example vs the Realsense D435, and the Realsense D435 is terrible compared to a TOF sensor like the Azure Kinect or the Realsense L515. Outdoors is another story and TOF/IR does not work as well, so the ZED is competitive outdoors at long ranges.
@@mattizzle81 Thanks! The timing isn't bad at all. I'm planning on picking up the project again this winter.
Jim, I have tried installing several times on my Nano (with no install issues), but I am still not able to access the T265 using the realsense-viewer. I have looked on GitHub and found similar problems: github.com/IntelRealSense/librealsense/issues/3361. The only difference is they were using Ubuntu 16. When I run lsusb I can see the T265 on Bus 001 Device 003: ID 03e7:2150, but the realsense-viewer does not seem to be able to pick it up. If I plug the D435 camera in, the viewer instantly picks it up. One difference is the bus: the D435 connects to bus 002 while the T265 connects to bus 001. I can plug the T265 into my Windows 10 PC and it works fine. Any ideas? The device number is also different: the T265 is device 003 while the D435 is device 005.
I also noticed that using a high-end power supply that can provide 5 amps at 5 volts really improved the Nano's performance. I was using a 2 amp max supply and it just was not enough... the Nano was slow. Using the D435 I did not get any of the frame errors you were showing. The viewer was the only thing limiting the frame rate and size.
Did you try to hit the + button and add device?
@@JetsonHacks The RealSense-Viewer does not list the T265 so there is nothing to add. The d435 just pops up instantly without having to add.
@@sy2532 How are you interfacing with the T265? Do you have other cameras plugged in at the same time, and are you using a powered USB hub?
@@JetsonHacks I am plugging directly into the Nano USB slot(s). Yes, I have had both the T265 and the D435 plugged in at the same time. I have tried swapping USB ports. I have unplugged the D435 and left the T265 plugged in, with the viewer then showing nothing to add at all. In the other USB slots I have a keyboard and mouse plugged directly into the Nano. I do have a USB hub (same as you have) but it is not plugged into the Nano; I can give it a try if you think it is needed.
Here is maybe something of interest: the Linux Foundation 3.0 root hub is on the 002 bus, and the Linux Foundation 2.0 root hub is on the 001 bus. The cameras are USB 3.0... yet the only bus the T265 seems to attach to is the 001 bus, while the D435 always seems to be on the 002 bus, which is the USB 3 hub. My question is: how do I get the T265 on the 002 bus that the USB 3 hub is on? Or is this not relevant?
@@sy2532 First, I would try the powered hub. The T265 does not need a modified kernel, you can use librealsense on its own. With that in mind, create a new SD card and build librealsense on it, but do not modify the kernel. You should see the T265 work. Also, sometimes just replugging a camera can help it be recognized.
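The bus detective work in this thread can be scripted. The 03e7:2150 vendor:product ID comes from the lsusb output quoted above; the sample descriptor strings in the usage below are made up for illustration, and `find_usb_device` is a helper defined here, not a library call.

```python
import re
import subprocess

# USB vendor:product ID the T265 reported in lsusb in the thread above.
T265_ID = "03e7:2150"

def find_usb_device(lsusb_output, usb_id):
    """Return (bus, device) for the first lsusb line matching usb_id, else None."""
    pattern = re.compile(r"Bus (\d{3}) Device (\d{3}): ID " + re.escape(usb_id))
    for line in lsusb_output.splitlines():
        match = pattern.search(line)
        if match:
            return match.group(1), match.group(2)
    return None

if __name__ == "__main__":
    out = subprocess.run(["lsusb"], capture_output=True, text=True).stdout
    print("T265:", find_usb_device(out, T265_ID))
```

On the output quoted above, `find_usb_device` returns ("001", "003"), i.e. the T265 sitting on the USB 2.0 root hub. Whether that alone explains the viewer not listing it is a guess; the powered hub and fresh-build suggestions in the reply above are the concrete next steps.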
Is it possible to make it work with Unreal Engine?
YAW!! 6:44
YAW !!!
Make python tutorial with Realsense and nvidia nano
.....please?
like developing a slam?
eyy!!! =)
Please make nano + d435 real sense
Yes please!
I tested this camera last year using a Raspberry Pi 4 and a Jetson Nano. It has an important issue: github.com/IntelRealSense/librealsense/issues/4518. Not very stable.