Task Relevant Roadmaps for iCub humanoid

  • Published 27 Nov 2024

COMMENTS • 10

  • @videomentaryproductionschannel  3 years ago

    Now that is excellent. I can see this type of robot learning going a long way; this robot is getting very advanced 😳. Keep up the good work.

  • @brighticub  11 years ago

    iCub has two cameras which allow it to do stereo vision. In fact, our robot-vision colleagues at IDSIA developed a cool vision approach you can find at the IDSIA robotics page linked above.

  • @lepwis  10 years ago

    So much effort for it to do tasks we consider simple. How much more will it take for the robot to do something complex enough to be useful?

  • @brighticub  11 years ago

    The robot is indeed a bit shaky, because (1) it doesn't have legs and is sitting on a stick, which makes it unstable, and (2) since the focus of the movie is our new task-relevant planning algorithm, we used a fairly simple controller for executing the planned motions on the robot. The shots with the robot model are very shaky because they show the evolutionary algorithm searching for solutions at about 5000 robot poses per second (see the sketch after this thread).

  • @Dirtfire  11 years ago

    Very nice.

  • @Eay5paev  11 years ago

    It's kind of shaky. Is this because an evolutionary algorithm is behind it?

  • @peculiarcruelty  11 years ago

    Mind-blowing ;-)
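
The @brighticub reply above describes an evolutionary algorithm evaluating roughly 5000 robot poses against a task-relevant objective. The following is a minimal Python sketch of that general idea, not the authors' actual Task Relevant Roadmaps implementation: the planar arm model, the cost function, and the population parameters are all hypothetical, chosen only to illustrate a population-based search over joint angles.

```python
import numpy as np

# Hypothetical 4-joint planar arm; link lengths are made up for illustration.
LINK_LENGTHS = np.array([0.3, 0.25, 0.2, 0.1])

def forward_kinematics(joint_angles):
    """End-effector position for a vector of joint angles (planar arm)."""
    angles = np.cumsum(joint_angles)
    x = np.sum(LINK_LENGTHS * np.cos(angles))
    y = np.sum(LINK_LENGTHS * np.sin(angles))
    return np.array([x, y])

def task_cost(joint_angles, target):
    # Task-relevant cost: distance of the hand to the target, plus a small
    # penalty on joint displacement to prefer comfortable poses.
    reach_error = np.linalg.norm(forward_kinematics(joint_angles) - target)
    effort = 0.01 * np.sum(joint_angles ** 2)
    return reach_error + effort

def evolve_pose(target, pop_size=100, generations=50, sigma=0.3, seed=0):
    """Simple elitist evolutionary search over joint-angle vectors."""
    rng = np.random.default_rng(seed)
    population = rng.uniform(-np.pi, np.pi, size=(pop_size, 4))
    for _ in range(generations):
        costs = np.array([task_cost(p, target) for p in population])
        elite = population[np.argsort(costs)[: pop_size // 5]]  # keep best 20%
        # Resample around the elite poses with Gaussian mutations.
        parents = elite[rng.integers(0, len(elite), size=pop_size)]
        population = parents + rng.normal(0.0, sigma, size=parents.shape)
    costs = np.array([task_cost(p, target) for p in population])
    return population[np.argmin(costs)], costs.min()

if __name__ == "__main__":
    best_pose, best_cost = evolve_pose(target=np.array([0.5, 0.3]))
    print("best joint angles:", np.round(best_pose, 3))
    print("cost:", round(best_cost, 4))
```

With these example settings the search evaluates 100 × 50 = 5000 candidate poses, on the order of the evaluation rate mentioned in the reply; the real system would evaluate full humanoid poses against collision and task constraints rather than a toy 2-D reach cost.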