Incredible tutorial about reinforcement learning with Isaac sim!!!!
Hi Giang Nguyễn!
It is my pleasure if this video has helped you!
love that you are making videos on isaac sim keep it up
Hi willuu!
It is my pleasure if my videos have helped you!
ur the guy for isaac sim ... please make more tutorials like this
Sure!
Can you suggest projects or any other tutorials like this to learn RL with practical examples..?
Usually, RL framework libraries provide a lot of examples like this.
stable-baselines.readthedocs.io/en/master/guide/examples.html
di-engine-docs.readthedocs.io/en/latest/13_envs/index.html
Hello, your tutorial is amazing. When I launch the script, everything works fine for a minute, then this error occurs: "RuntimeError: Input and parameter tensors are not at the same device, found input tensor at cuda:0 and parameter tensor at cuda:1". Do you know why and how to fix it?
It seems to be related to the Python script, specifically the torch part, but I don't know where to start...
EDIT: I was able to fix the bug by changing line 28 in the sac.py script to: self.device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
Hi Alberto Vaglio!
Thank you for pointing out the bug. Since I haven’t got a machine with multiple GPUs, I could not predict that error. Yes, I think the fix that you suggested will solve this problem.
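For readers hitting the same error on a multi-GPU machine, here is a minimal sketch of the idea behind the fix (the `Linear` model is a stand-in, not the actual SAC network from the tutorial): create one `torch.device` and move both the model parameters and every input tensor to it, so they can never end up on different GPUs.

```python
import torch

# Pin everything to one explicit device instead of letting tensors
# default to whichever GPU they happen to be created on.
device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")

model = torch.nn.Linear(4, 2).to(device)   # parameters live on `device`
state = torch.randn(1, 4, device=device)   # inputs are created on the same device

action = model(state)                      # no cross-device mismatch
```

The key point is that every tensor that flows into the network is created on (or moved to) the same device object as the parameters.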
Hello @robotmania8896, could you also make a video tutorial where you set up a simple reinforcement learning project (like the cartpole problem) in Isaac Sim from SCRATCH, building everything in the editor, writing all the necessary code, and then connecting everything to make it work? That would be wonderful!! I still struggle with the basics of RL in Isaac Sim. If not, can you suggest some tutorials that do this, if they exist?
Which part of the simulation exactly would you like to know more deeply? Also, here is Nvidia's official documentation and examples: docs.omniverse.nvidia.com/isaacsim/latest/core_api_tutorials/tutorial_core_hello_world.html
Great video! I am curious, how can you instantiate multiple robots at once to learn faster? And how can you import your older TD3 project into Isaac Sim?
Hi James!
Thanks for watching my video!
To instantiate multiple robots, you just have to add the URDF multiple times and change the topics for each robot. I think it is possible to import my older TD3 project, but the code would need to be changed in several places.
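A rough sketch of the per-robot topic idea (plain Python; the topic names and the `robot_{i}` naming scheme are hypothetical, not taken from the tutorial code): each instantiated robot gets its own namespace, so commands and sensor data never collide.

```python
# Generate a unique topic namespace per robot instance.
def robot_topics(num_robots):
    return [
        {"cmd": f"/robot_{i}/cmd_vel", "scan": f"/robot_{i}/scan"}
        for i in range(num_robots)
    ]

topics = robot_topics(3)
# Each robot now publishes/subscribes on its own topics,
# e.g. topics[0]["cmd"] == "/robot_0/cmd_vel"
```

The same names would then be used when remapping topics for each URDF instance added to the stage.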
You just installed the Isaac Sim packages directly in a ROS 2 environment via pip. Are there any dependency conflicts between the ROS 2 Humble packages and the Isaac Sim packages?
Hi Kaushalkumar Patel!
Thanks for watching my video!
No, I didn’t experience any dependency conflicts.
@@robotmania8896 Thank you for feedback.
Hey mate, I follow you with interest, do you plan to share lessons on Isaac Sim from beginner to advanced level?
Hi Nikola Teslanın Çırağı!
What aspect of Isaac Sim would you like to learn?
@@robotmania8896 I have actually covered the basics of ROS 2 and I am trying to make a few applications. For Isaac Sim, you could create a series where the basics are explained and a few sample projects are built. Do you also plan to produce content on swarm robots?
I don’t have anything particular in mind, but I will consider your suggestion about swarm robots.
@@robotmania8896 thanks!
Would you allow me to re-record the video in Turkish and broadcast it to the Turkish audience?
Hi Mert Aydoğan!
Thanks for watching my video!
If you would like, you can use (or re-record in another language) this video for university or college lectures. But please do not post it to any social media or other video hosting platforms, since there would be copyright issues.
@@robotmania8896 Hello, thank you so much for getting back to me. As a clarification, I didn’t mean to use your video directly. My plan was to follow the steps you showed in your video, explain them in my own words, and create a new video in Turkish. This way, there wouldn’t be any copyright issues regarding the content (video or audio).
That said, I’d still like to ask for your permission from an ethical standpoint to cover this topic and create content in Turkish.
By the way, I think your channel is amazing! It’s great to see platforms like Isaac SIM and Nvidia’s simulation tools being utilized so effectively.
@@mertaydogan5641 Yes, you can do that. I see no problem with it.
Thank you, it is my pleasure if my videos help you!
Hello, I really like your content.
Do you have any solutions for Drone and RL ?
Please let me know. I can buy you a coffee :D
Hi Andreas Valvis!
Thanks for watching my video!
What exactly would you like to do using a drone and RL?
@robotmania8896 Do you have any email to discuss?
Autonomous navigation in my Gazebo environment. I'm using hector-quadrotor-noetic (GitHub).
I want to tell my drone to go to a given position without hitting any walls/obstacles, with four actions: right, left, down, up. I'm starting a project and I'm new, so there are a few things I need to learn ASAP.
@@robotmania8896 Go to a destination without hitting walls, with four movement actions.
Here is my e-mail: robotmania8867@yahoo.com
In that case you may use ultrasonic sensors to control your robot. It is probably the simplest way.
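As a minimal sketch of that ultrasonic idea (the threshold, direction names, and preference order are all hypothetical choices for illustration, not part of the hector-quadrotor package): check each direction's range reading and only allow actions whose path is clear.

```python
SAFE_DISTANCE = 0.5  # meters; hypothetical clearance threshold

def choose_action(distances):
    """Pick a movement direction from ultrasonic range readings.

    distances: dict mapping direction name -> range reading in meters.
    Returns a clear direction (preferring 'up'), or 'hover' if blocked.
    """
    # Keep only directions whose reading exceeds the safety threshold.
    clear = [d for d, r in distances.items() if r > SAFE_DISTANCE]
    for preferred in ("up", "right", "left", "down"):
        if preferred in clear:
            return preferred
    return "hover"  # everything blocked: hold position

print(choose_action({"up": 1.2, "right": 0.3, "left": 0.8, "down": 0.2}))
```

In an RL setup, a rule like this could also serve as an action mask, so the agent never samples an action that would cause a collision.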
Hi, robot mania! First, congrats, your project is very interesting. Do you have a Discord channel, GitHub, or any other way I can talk with you?
Hi João Victor!
Thanks for watching my video!
I don’t have a discord channel. Please write me an email: robotmania8867@yahoo.com