As an economist specializing in finance and a lover of mathematics, I adore this institution even though I've never set foot in Massachusetts, or even the USA. Thank the Lord for these geniuses.
5:00 I just love this guy's personality, what an amazing lecturer to have! So glad MIT uploads these breakthrough lectures
Breathtaking? 🤔🕯
What a great professor. It's a real pleasure being able to watch these lectures.
Lies again? Drink Beer + Red Wine
Nah, it's boring as f, even at 2x speed.
Really nice refresher for python. What a funny professor! I enjoy this a lot. Thanks MIT!
I love the way when he comes to a stopping point he stares down the class like a gangster. You got questions huh!?
20:05 Wait, did he just give a candy to the student just for trying to ask a question? Damn, that's the kind of positive reinforcement I would like to have.
"...like to have for future obesity."
Yep. It's candy.
@@adiflorense1477 Could we talk privately?
Dear Sir, huge thanks for making this course such an easy ride! Thanks MIT! "Share your knowledge. It’s a way to achieve immortality" - Dalai Lama
In the simWalks function, I think there is an error in the loop: numTrials has been passed to the walk function instead of numSteps, which is why the simulations don't depend on the number of steps.
Yeah I spotted the same bug....
Edit: Continued watching... it was an intended bug xD
Good prof!
I also caught it, but then you can see that at minute 22 he gets to it. The idea was to detect the odd results and find out what was failing.
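For anyone following along, here's a minimal self-contained sketch of the bug being discussed. The names are reconstructions of the lecture's code, not the actual course files: `walk` returns the final distance of one walk, and the buggy `sim_walks` passes the trial count where the step count belongs.

```python
import random

def walk(num_steps):
    """Simulate one 2-D lattice random walk of num_steps unit steps;
    return the final Euclidean distance from the origin."""
    x = y = 0.0
    for _ in range(num_steps):
        dx, dy = random.choice([(0, 1), (0, -1), (1, 0), (-1, 0)])
        x += dx
        y += dy
    return (x ** 2 + y ** 2) ** 0.5

def sim_walks_buggy(num_steps, num_trials):
    # The bug discussed above: num_trials is passed where num_steps
    # belongs, so the results never depend on num_steps at all.
    return [walk(num_trials) for _ in range(num_trials)]

def sim_walks_fixed(num_steps, num_trials):
    # Corrected: each trial walks num_steps steps.
    return [walk(num_steps) for _ in range(num_trials)]
```

With the buggy version, the mean distance stays flat as `num_steps` grows (the odd result the professor uses to motivate debugging); with the fix, it grows roughly like the square root of the step count.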
Another well presented lecture illustrated with Python examples
Thanks, MIT
This is a very misleading class in my humble opinion, because he is computing the average of distances and not the expected value of the random walk. The random walk should be a bell curve with its peak at distance zero, so the expected value of the walk is always zero for the Usual Drunk. What happens is that when the number of steps increases, the bell curve becomes wider and you have small probabilities of finding bigger distances, hence the 'mean' distance increases a little bit. However, the expected distance to the origin is still zero.
I didn't see that when I wrote my comment below. Yeah, I guess you figured it out.
What he is talking about as distances is basically the variance of the expected value you are talking about... I think.
Or rather the standard deviation.
But why is the expected distance to the origin zero?
For a point that is 1 step away from the origin, there is a 3/4 chance that the second step moves it even further from the origin. So the distance will eventually get bigger and bigger.
This is what I wrote earlier:
When talking about distances, the sign is not relevant. Say you have two trials with one step each, and let's only allow movements in x. Assume the first trial ends at -1 and the second one at +1. The mean covered distance is then one, while the average displacement from the origin is zero. It's just two ways to look at the problem: the expected displacement from the origin is 0, while the average distance covered is not.
Simulations are used in reinforcement learning
If we consider what was covered in the previous lecture about stochastic thinking, and we were trying to create a realistic model of the wandering drunk, wouldn't we need an additional factor affecting the probabilities, call it fatigue? The more the drunk walks, the more tired they get, and they start to either walk less or take smaller steps.
I think the first video he shows is not a Brownian motion simulation but rather a coarse-grained collision algorithm such as DPD or MPC.
Why not give the abstract class the "usual" method for walking and then override it in the inherited class? Method overriding and inheritance in one example.
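If it helps, here is one way the suggestion could look. This is a sketch, not the course's actual code; the class and method names are assumptions, with the base class providing the usual behaviour and the subclass overriding it:

```python
import random

class Drunk:
    """Base drunk with the 'usual' unbiased step behaviour."""
    def take_step(self):
        # One unit step in a random compass direction.
        return random.choice([(0, 1), (0, -1), (1, 0), (-1, 0)])

class MasochistDrunk(Drunk):
    """Override the base behaviour: biased to drift north."""
    def take_step(self):
        return random.choice([(0.0, 1.1), (0.0, -0.9),
                              (1.0, 0.0), (-1.0, 0.0)])
```

Note this demonstrates method overriding rather than operator overloading; in the lecture the base class leaves the step behaviour to subclasses instead of providing a default.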
Good professor, makes things simple and fun
PLEASE - next time you film a class, show the screen whenever the teacher is discussing what's on it. Countless times I could not see what he was talking about.
Late for you, but for anybody else seeing this: in the first minute he alludes to the project files, which you should ideally download, and then you can go through them in your own time :)
Maybe it is just a license thing.
@Anifco67 You are a fool. Use the lecture notes
Anifco67 if you are not watching this, you are a fool😂
@@aazz7997 if the NEW STUDENT is limited to downloading the slides, will this include the laser-pointing he's doing, or should the NEW STUDENT just randomly point their finger on a note on the slide and assume that "this here" or "that over there" is where he's laser-pointing?
Thanks Mr. Guttag and MIT
*My takeaways:*
1. Why random walks 1:05
Show the damn screen!
his jokes are so good and fall flat way too often loll
Drunkard walk
Simulate one walk of k steps
& n such walks
3 abstractions:
1. location (immutable)
2. (possible) field
3. the drunk
He is standing next to his shadow and attached
If someone could help me: in the textbook there's another example of a drunk, the EW Drunk, moving only along the horizontal axis, (-1, 0) and (1, 0). However, this drunk is also getting farther away from the origin. But why? If after n steps he has an equal chance to step either W or E, wasn't he supposed to be back at the origin according to the law of large numbers? Isn't it the same as flipping n coins and counting the numbers of heads and tails?
Have you plotted it on a graph as the professor explains near the end of the video?
I would expect an hourglass shape of end points. I would like to know what you found if you end up running this sim.
@@narnbrez I have not run the simulation because the result is presented in the textbook. The professor is computing the average of distances and not the expected value of the random walk. The random walk should be a bell curve with its peak at distance zero, so the expected value of the walk is always zero for the EW drunk. What happens is that when the number of steps increases, the bell curve becomes wider and you have small probabilities of finding bigger distances, hence the 'mean' distance increases a little bit. However, the expected distance to the origin is still zero.
Kind of misleading IMO, but it is correct.
When talking about distances, the sign is not relevant. Say you have two trials with one step each, and let's only allow movements in x. Assume the first trial ends at -1 and the second one at +1. The mean covered distance is then one, while the average displacement from the origin is zero. It's just two ways to look at the problem: the expected displacement from the origin is 0, while the average distance covered is not.
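A quick simulation makes the two notions in this thread concrete. This is a sketch assuming a fair ±1 step (the EW drunk): the signed endpoints average out near zero, while the average absolute distance from the origin keeps growing with the step count.

```python
import random

def ew_endpoints(num_steps, num_trials, seed=0):
    """Endpoints of num_trials one-dimensional (E/W) random walks."""
    rng = random.Random(seed)
    ends = []
    for _ in range(num_trials):
        # Sum of num_steps fair +/-1 steps = final position.
        x = sum(rng.choice((-1, 1)) for _ in range(num_steps))
        ends.append(x)
    return ends

ends = ew_endpoints(num_steps=1000, num_trials=2000)
# Signed endpoints cancel: mean displacement is near 0.
mean_displacement = sum(ends) / len(ends)
# Absolute distances do not cancel: this grows roughly like sqrt(n).
mean_distance = sum(abs(e) for e in ends) / len(ends)
```

This is also consistent with the law of large numbers: it says the *fraction* of E steps approaches 1/2, so displacement divided by n goes to 0, but the absolute displacement itself still grows on the order of the square root of n.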
Nicely explained
Hi... has anyone here watched 6.00.1x? I don't seem to find it. I need some help here. Thank you, much appreciated.
it is on edx
the course at MIT is all meat. thanks MIT
Hey guys, I have a question for anyone reading this:
Is there any sorting algorithm which directly predicts the place of every element in the array, thereby reducing the time complexity? I'm working on such an algorithm, so if one exists, please tell me.
Can you explain what you mean by "predicts"?
Look at Timsort once
@@sharan9993 Consider this data: [3, 1, 5, 7, 2, 9, 10, 4, 6, 5, 2, 14]
Here min = 1
Max = 14
Total numbers = 12
Now consider the first element, 3.
Here "predict" means to predict where 3's position should be in this array:
Position = ((3 - 1) / (14 - 1)) * 12 ≈ 1.85, so 3's position should be around second, which is good, because when you sort the list 3 stands at the third position. And if we have more numbers, we don't have to compare every number with all the others, because we only have to compare a number with the other numbers near its predicted location.
@@hamitdes7865 what about list= [0.1, 10.6, 10.4, 10.5, 10.3, 10.1]
@@sharan9993 Actually, I have thought about this and I'm still working on the problem, but if you know that the mean of the list is around (min + max) / 2 then this algorithm is good.
And if you have any thoughts about solving that problem, let me know; I'd be glad 😇😇
@@hamitdes7865 Think about why it would work logically before thinking computationally. Why can we apply it to the general case?
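For what it's worth, the position-prediction idea in this thread is close to what interpolation search and flashsort do. Here is a hypothetical sketch (helper names are my own) that uses the predicted index to bucket elements, then does a small cleanup sort per bucket; it only beats comparison sorts when the data is roughly uniformly distributed:

```python
def predicted_index(value, lo, hi, n):
    """Linearly interpolate where value 'should' sit among n items."""
    if hi == lo:
        return 0
    return min(n - 1, int((value - lo) / (hi - lo) * (n - 1)))

def interpolation_bucket_sort(data):
    """Bucket elements by predicted index, then sort each small bucket."""
    if len(data) < 2:
        return list(data)
    lo, hi = min(data), max(data)
    n = len(data)
    buckets = [[] for _ in range(n)]
    for v in data:
        buckets[predicted_index(v, lo, hi, n)].append(v)
    out = []
    for b in buckets:
        out.extend(sorted(b))  # buckets are small when data is uniform
    return out
```

The float list in the replies above is exactly the non-uniform case where the prediction breaks down: almost all elements land in the same bucket, so the cleanup sort dominates the running time.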
Excuse me, where can I find the material and slides of this lecture?
The course materials can be found on MIT OpenCourseWare at ocw.mit.edu/6-0002F16. Best wishes on your studies!
What was the point of this whole lecture? is it that random walk is not so random?
I didn't understand how it became .05? Can anyone explain what he divided?
When the masochistic drunk moves on the y-axis he on average gets 0.1 to the north ((1.1-0.9)/2). And since he moves in the y-axis 50% of the time, he gets .05 (0.1/2) to the north on average for every step he takes. See stepChoices in the class definition.
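The same arithmetic in code, as a quick check (the step choices match those described for the masochist drunk in the lecture; this is not the course code itself):

```python
# Step choices for the masochist drunk: biased north on y-axis moves.
step_choices = [(0.0, 1.1), (0.0, -0.9), (1.0, 0.0), (-1.0, 0.0)]

# Each of the four steps is equally likely, so the expected northward
# drift per step is the mean of the y components:
# (1.1 - 0.9 + 0 + 0) / 4 = 0.05
expected_dy = sum(dy for _, dy in step_choices) / len(step_choices)
```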
Watching a professor point at the wall was never so interesting...
Tens float towards you.
I love how lazy this guy is when it comes to math 😂 Need to calculate a probability? Bah, let's just let the code do that. Pythagoras for a triangle with sides 1×1? Nah, can't be bothered... 😂 The true programmer's approach, IMO
Just like me when I fire up matlab to add 2 numbers
the maths comes in later; these are intros
Merci
I wasn't sure if I would watch this drunk walk, however if a tartus is in play...I can make some time
what software is being used here?
Python, see the course for more info at: ocw.mit.edu/6-0002F16. Best wishes on your studies!
iranium inheritance
prestige
When you attend a class, the professor faces the board and writes something on it.
Do your eyes follow what he writes, or his back, or his butt?
The video guys are pretty dumb; they think they have to show the speaker as much as possible.
When the professor discusses some point in the results, the video guy should show the viewers the screen.
Does that make sense to all of you?
I am sure what I say does not make sense to you, since I am the only one pointing this out.
In all my lectures, in the US or elsewhere, my face appears for only one minute and the rest of the video shows what I write or the results of my equations.
Not-so-random. (21:21)
union
38:58
python is tough
Show the damn screen
Who cares what the professor looks like