I have a question about Gaussian mixtures. Does the model output n separate Gaussian distributions, or does it add them together with weights? I don't think adding with weights will work. And when people say "multimodal", does it mean there are different ways to reach the solution, like the example with the tree? Then how does adding degrees of freedom relate to multimodality, or to there being different ways to do something?
If you add together two Gaussian distributions with different means (weighted so the result still integrates to 1), the graph/histogram of the resulting distribution looks like "two humps". This is a probability distribution over the actions your agent can take, so it says there are two actions around which the probability density is high (where the two humps are). Those could be, for example, "turn left" (turn -90 degrees) and "turn right" (turn +90 degrees).
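To make the "two humps" concrete, here is a minimal sketch of a two-component Gaussian mixture over a 1-D action (say, a turn angle in degrees). The weights, means, and standard deviations are hard-coded placeholders; in a mixture density network the model would output them:

```python
import numpy as np

# Hypothetical parameters for illustration; a network would predict these.
weights = np.array([0.5, 0.5])     # mixture weights, must sum to 1
means   = np.array([-90.0, 90.0])  # "turn left" and "turn right"
stds    = np.array([10.0, 10.0])

def mixture_pdf(x):
    """Weighted sum of component densities -- this is the 'two humps' curve."""
    x = np.asarray(x, dtype=float)
    comps = np.exp(-0.5 * ((x[:, None] - means) / stds) ** 2) \
            / (stds * np.sqrt(2 * np.pi))
    return comps @ weights

def sample(n, rng=None):
    """Sample an action: first pick a component by weight, then sample from it."""
    rng = rng or np.random.default_rng(0)
    k = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(means[k], stds[k])
```

Note the distinction this highlights: the *densities* are added with weights (that is well-defined and gives a valid distribution), but you never average the *actions* themselves. Sampling picks one component and then one action from it, so you get either a left turn or a right turn, never a useless average of the two near 0 degrees.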
Maybe the volume can be higher:)
Lectures 3-17 seem to be missing. Any chance they can be uploaded? Thanks!
Can the rest of the lectures be uploaded? Thanks a lot!