CS 285: Lecture 2, Imitation Learning. Part 3


COMMENTS • 5

  • @smallwang, 9 months ago, +7

    Maybe the volume can be higher:)

  • @MrNoipe, 7 months ago, +1

    Lectures 3-17 seem to be missing. Any chance they can be uploaded? Thanks!

  • @omarrayyann, 4 months ago

    Can the rest of the lectures be uploaded? Thanks a lot!

  • @muzesu4195, 1 month ago

    I have a question about the mixture of Gaussians.
    Does it output n separate Gaussian distributions, or add them together with weights? Although I don't think adding them with weights will work. And when we talk about multimodal, does it mean there can be different ways to reach the solution, like the example with the tree? Then how does adding degrees of freedom relate to multimodality, i.e. different ways to do something?

    • @browncow7113, 6 days ago

      If you were to add together two different Gaussian distributions, each with a different mean, then the graph/histogram of this distribution would look like "two humps". This is a probability distribution over the different actions that your agent can take. So, it is saying that there are two actions around which there is a high probability-density (where the two humps are). And those could be, for example, "turn left" (or, turn -90 degrees) and "turn right" (turn +90 degrees).
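      The "two humps" picture above can be sketched numerically. This is a minimal illustration, not code from the lecture: the component means (±90 degrees, standing in for "turn left"/"turn right"), standard deviations, and equal weights are all hypothetical choices made for the example. The key point is that the weighted sum of component densities is itself a valid probability density with one mode per well-separated component.

      ```python
      import numpy as np

      def gaussian_pdf(x, mean, std):
          # Density of a single Gaussian component
          return np.exp(-0.5 * ((x - mean) / std) ** 2) / (std * np.sqrt(2 * np.pi))

      def mixture_pdf(x, means, stds, weights):
          # Weighted sum of component densities; weights sum to 1,
          # so the result is still a normalized probability density
          return sum(w * gaussian_pdf(x, m, s)
                     for m, s, w in zip(means, stds, weights))

      # Hypothetical bimodal action distribution:
      # "turn -90 degrees" vs "turn +90 degrees", equally likely
      means = [-90.0, 90.0]
      stds = [10.0, 10.0]
      weights = [0.5, 0.5]

      actions = np.linspace(-180.0, 180.0, 721)   # candidate steering angles
      density = mixture_pdf(actions, means, stds, weights)

      # The density peaks near the two component means ("two humps"),
      # while the average action (0 degrees) has low probability --
      # exactly why a single Gaussian fails on multimodal demonstrations.
      ```

      A single-Gaussian policy fit to the same data would put its mode at the mean of the two humps (go straight), the worst of both options; the mixture keeps both modes and lets the agent sample one or the other.
      
      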