iCub has two cameras which allow it to do stereo vision. In fact, our robot-vision colleagues at IDSIA developed a cool vision approach you can find at the IDSIA robotics page linked above.
So much effort for it to do tasks we consider simple. How much more will it take for the robot to do something complex enough to be useful?
The robot is indeed a bit shaky, because (1) it doesn't have legs and is sitting on a stick, which makes it unstable, and (2) as the focus of the movie is our new task-relevant planning algorithm, we used a fairly simple controller for executing the planned motions on the robot. The shots with the robot model are very shaky because they show the evolutionary algorithm searching for solutions at about 5000 robot poses per second.
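The core idea above (an evolutionary search evaluating thousands of candidate robot poses) can be sketched roughly as follows. This is a minimal toy illustration, not the authors' actual planner: the 4-joint arm, the target pose, and the quadratic cost are all assumptions made for the example.

```python
import random

# Hypothetical target joint angles (radians) for an assumed 4-joint arm.
TARGET = [0.3, -0.5, 1.0, 0.2]

def cost(pose):
    # Toy fitness: squared distance of the pose to the target.
    # A real task-relevant planner would score each pose against
    # the task (e.g. end-effector reaching an object).
    return sum((p - t) ** 2 for p, t in zip(pose, TARGET))

def evolve(pop_size=50, generations=200, sigma=0.1, seed=0):
    rng = random.Random(seed)
    # Start from random poses, one angle per joint.
    pop = [[rng.uniform(-3.14, 3.14) for _ in range(4)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        elite = pop[: pop_size // 5]  # keep the best 20% unchanged
        # Refill the population by mutating random elites with Gaussian noise.
        pop = elite + [
            [g + rng.gauss(0, sigma) for g in rng.choice(elite)]
            for _ in range(pop_size - len(elite))
        ]
    return min(pop, key=cost)

best = evolve()
```

Each generation evaluates `pop_size` poses, so a fast cost function is what makes the quoted ~5000 poses per second plausible; the flickering robot model in the video corresponds to these candidate poses being tried out.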
Now that is excellent. I can see this type of robot learning going a long way; this robot is getting very advanced 😳. Keep up the good work.
Very nice.
It's kind of shaky. Is this because an evolutionary algorithm is behind it?
Unbelievable ;-)