Congratulations, I thought it was brilliant. You will be a great electronics engineer, a specialist in robotics...
Bro, I love your project
Any tutorials? With code?
Fantastic, really really cool.
Wow, it's amazing
Awesome!!!
Hello sir, I'm also working on a similar project using a Hokuyo lidar and a Kinect 360 camera. Please, can you explain this step by step?
Same here bro! Can you help me out please? It's my final year project
Nice job! Which local planner do you use?
Hi Thomas, did you test the full coverage map algorithms or the cleaner application with your robot?
Najeh Marzouk I'm not sure what you are referring to with "cleaner application"
What device did you use for processing in that video: a microcontroller or an SBC? An RPi?
Hi! Nice project, thanks for sharing. I have a conceptual question. Say you make your robot, so you have your motors, drivers and motor controller. How does the communication between ROS's Nav Stack and the controllers work? For example, the Nav Stack says 'explore, go forward'... how does the robot interpret that message? Thanks
The output of the Nav Stack is cmd_vel, a geometry_msgs/Twist that contains the commanded linear and angular velocity of the base. For a differential robot you turn that into a velocity for each wheel, and the controllers must then do the transformation between those wheel velocities and the voltage the motors need to achieve the order.
Santiago RONDON CARDENAS thank you. For now, I am already subscribing my Arduino to the cmd_vel message. But nothing is easy xD I've got some trouble with the Arduino frequency for getting the odometry from the encoders! Patience, I guess...
Same here bro! Can you help me out please? It's my final year project
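A minimal sketch of the Twist-to-wheel-speed conversion described in the reply above, assuming a ROS 1 rospy node and a differential-drive base. The wheel dimensions and the left_wheel_cmd/right_wheel_cmd topic names are illustrative assumptions, not the author's actual setup:

```python
#!/usr/bin/env python
# Sketch: convert geometry_msgs/Twist on /cmd_vel into per-wheel
# angular velocities for a differential-drive base.
import rospy
from geometry_msgs.msg import Twist
from std_msgs.msg import Float64

WHEEL_RADIUS = 0.035     # m, placeholder value
WHEEL_SEPARATION = 0.23  # m, placeholder value

def on_cmd_vel(msg):
    v = msg.linear.x   # commanded forward speed of the base (m/s)
    w = msg.angular.z  # commanded yaw rate of the base (rad/s)
    # Differential-drive inverse kinematics: split base motion
    # into linear speeds for the left and right wheels.
    v_left = v - w * WHEEL_SEPARATION / 2.0
    v_right = v + w * WHEEL_SEPARATION / 2.0
    # Publish as wheel angular velocities (rad/s).
    left_pub.publish(v_left / WHEEL_RADIUS)
    right_pub.publish(v_right / WHEEL_RADIUS)

rospy.init_node('diff_drive_sketch')
left_pub = rospy.Publisher('left_wheel_cmd', Float64, queue_size=1)
right_pub = rospy.Publisher('right_wheel_cmd', Float64, queue_size=1)
rospy.Subscriber('cmd_vel', Twist, on_cmd_vel)
rospy.spin()
```

On the motor-controller side (e.g. the Arduino), a speed loop such as a PID comparing each commanded wheel speed against encoder feedback would then set the PWM/voltage.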
What obstacle avoidance algorithm have you used?
Hi, I am a PhD student and I'm working on the same project, but I use a TurtleBot/Kobuki. Can I do similar localization and obstacle avoidance with my robot using your code? Can you help me with this tutorial?
Hello, if I rotate the camera 90 degrees, will it have any effect? Thanks
You should warn the audience about noise. Anyway, thanks for the video.
Will this library work with my own robot hardware? I mean the motor driver and everything...?
Could you please share a screenshot of the rqt graph of the project?
Hey, can you share which controller you used and the process for running it?
How did you make the robot? Any tutorial?
Hi, how did you merge IMU and wheel encoder data? Did you use an Arduino for this? I want to do the same project too. Could you please help me with this topic?
Same here bro! Can you help me out please? It's my final year project
@ahtishamali431 hey, did you figure this out? Can you help me? This is also my last year.
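One common way to merge the two, sketched here under assumptions: the Arduino streams raw encoder tick counts (e.g. over rosserial) on hypothetical left_ticks/right_ticks topics, a small node integrates them into nav_msgs/Odometry, and robot_localization's EKF fuses that topic with the IMU. The tick resolution and wheel separation below are placeholders:

```python
#!/usr/bin/env python
# Sketch: integrate encoder tick counts into nav_msgs/Odometry.
import math
import rospy
from std_msgs.msg import Int64
from nav_msgs.msg import Odometry
from tf.transformations import quaternion_from_euler

TICKS_PER_METER = 4000.0  # placeholder encoder resolution
WHEEL_SEPARATION = 0.23   # m, placeholder value

class WheelOdom(object):
    def __init__(self):
        self.x = self.y = self.th = 0.0
        self.left = self.right = 0
        self.prev_left = self.prev_right = 0
        rospy.Subscriber('left_ticks', Int64, self.on_left)
        rospy.Subscriber('right_ticks', Int64, self.on_right)
        self.pub = rospy.Publisher('wheel_odom', Odometry, queue_size=10)

    def on_left(self, msg):
        self.left = msg.data

    def on_right(self, msg):
        self.right = msg.data

    def update(self, event):
        # Distance travelled by each wheel since the last update.
        d_l = (self.left - self.prev_left) / TICKS_PER_METER
        d_r = (self.right - self.prev_right) / TICKS_PER_METER
        self.prev_left, self.prev_right = self.left, self.right
        # Standard differential-drive dead reckoning.
        d = (d_l + d_r) / 2.0
        self.th += (d_r - d_l) / WHEEL_SEPARATION
        self.x += d * math.cos(self.th)
        self.y += d * math.sin(self.th)
        odom = Odometry()
        odom.header.stamp = rospy.Time.now()
        odom.header.frame_id = 'odom'
        odom.child_frame_id = 'base_link'
        odom.pose.pose.position.x = self.x
        odom.pose.pose.position.y = self.y
        q = quaternion_from_euler(0.0, 0.0, self.th)
        odom.pose.pose.orientation.x = q[0]
        odom.pose.pose.orientation.y = q[1]
        odom.pose.pose.orientation.z = q[2]
        odom.pose.pose.orientation.w = q[3]
        self.pub.publish(odom)

rospy.init_node('wheel_odom_sketch')
node = WheelOdom()
rospy.Timer(rospy.Duration(0.05), node.update)  # publish at 20 Hz
rospy.spin()
```

Feeding this wheel_odom topic plus a sensor_msgs/Imu topic into robot_localization's ekf_localization_node gives the fused estimate.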
How do you give the robot a 2D nav goal? I want my robot to find a special object and move toward it. Can you help me?
Hey did you figure this out? I'm working on a similar problem.
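If the question is about sending the goal from code rather than clicking "2D Nav Goal" in RViz, the usual route is the move_base action interface. A minimal sketch with placeholder coordinates (finding the object itself would need a separate perception node that publishes the object's pose):

```python
#!/usr/bin/env python
# Sketch: send a navigation goal to move_base programmatically.
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

rospy.init_node('send_goal_sketch')
client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
client.wait_for_server()

goal = MoveBaseGoal()
goal.target_pose.header.frame_id = 'map'
goal.target_pose.header.stamp = rospy.Time.now()
goal.target_pose.pose.position.x = 1.0     # placeholder target x (m)
goal.target_pose.pose.position.y = 0.5     # placeholder target y (m)
goal.target_pose.pose.orientation.w = 1.0  # no rotation at the goal

client.send_goal(goal)
client.wait_for_result()
rospy.loginfo('move_base finished with state %d', client.get_state())
```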
Hi, could I please ask some questions about ROS navigation? I have a mobile robot without encoders in the wheels, and I have a 2D URG-04LX laser scanner (I converted the LaserScan data to PointCloud data; I mean I can obtain the distance and angle of the nearest object) and an IMU. I want to do autonomous navigation using IMU and lidar data. How can I do this? Please help me. If you need, I can send you my code document.
Hi, you should look into something called visual odometry. It can be paired with IMU-based odometry in a package called robot_localization. That should give you the right data to perform navigation!
Is there any relevant code, or links on this topic?
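Not the author's files, but as a starting point: a sketch of a robot_localization EKF parameter file that fuses visual odometry with an IMU. The /vo and /imu/data topic names are assumptions; packages such as viso2_ros or rtabmap_ros can supply the visual odometry:

```yaml
# Sketch: robot_localization EKF fusing visual odometry + IMU.
frequency: 30
two_d_mode: true
odom_frame: odom
base_link_frame: base_link
world_frame: odom

odom0: /vo                           # visual odometry (nav_msgs/Odometry)
odom0_config: [false, false, false,  # x, y, z positions
               false, false, false,  # roll, pitch, yaw
               true,  true,  false,  # x, y, z velocities
               false, false, true,   # roll, pitch, yaw rates
               false, false, false]  # linear accelerations

imu0: /imu/data                      # sensor_msgs/Imu
imu0_config: [false, false, false,
              false, false, true,    # yaw orientation
              false, false, false,
              false, false, true,    # yaw rate
              true,  false, false]   # x acceleration
```

Loaded into ekf_localization_node, this publishes a fused odometry estimate that the nav stack can consume in place of wheel-encoder odometry.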