5. Random Walks

  • Published 1 Dec 2024

COMMENTS • 84

  • @robrn9069
    @robrn9069 4 years ago +18

    As an economist specializing in finance and a mathematics lover, I adore this institution even though I've never set foot in Massachusetts, or even the USA. Thank the Lord for these geniuses.

  • @Robdahelpa
    @Robdahelpa 6 years ago +21

    5:00 I just love this guy's personality, what an amazing lecturer to have! So glad MIT uploads these breakthrough lectures

    • @Lulue_90
      @Lulue_90 5 years ago

      Breathtaking? 🤔🕯

  • @johnvonhorn2942
    @johnvonhorn2942 5 years ago +62

    What a great professor. It's a real pleasure being able to watch these lectures.

    • @NazriB
      @NazriB 2 years ago

      Lies again? Drink Beer + Red Wine

    • @ZelenoJabko
      @ZelenoJabko 1 year ago

      Nah, it's boring as f, even at 2x speed.

  • @harrypotter1155
    @harrypotter1155 6 years ago +14

    Really nice Python refresher. What a funny professor! I enjoyed this a lot. Thanks MIT!

  • @Tadesan
    @Tadesan 6 years ago +15

    I love the way, when he comes to a stopping point, he stares down the class like a gangster. You got questions, huh!?

  • @riibbert
    @riibbert 5 years ago +32

    20:05 Wait, did he just give a candy to the student just for trying to ask a question? Damn, that's positive reinforcement that I would like to have.

  • @gulmiraleman4487
    @gulmiraleman4487 1 year ago

    Dear Sir, huge thanks for making this course so easy peasy! Thanks MIT! "Share your knowledge. It's a way to achieve immortality" - Dalai Lama

  • @abu8123
    @abu8123 1 year ago +2

    In the simWalks function, I think there is an error in the loop: numTrials has been passed to the walk function instead of numSteps, which is why the simulations do not depend on the number of steps.

    • @AlDumbrava
      @AlDumbrava 8 months ago +1

      Yeah, I spotted the same bug....
      Edit: Continued watching... it was an intended bug xD
      Good prof!

    • @alacocacola
      @alacocacola 6 months ago

      I also spotted it, but then you can see that around minute 22 he comes back to it. The idea was to detect the odd results and find out what was failing.
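
The bug the thread describes can be reproduced with a stripped-down sketch (hypothetical function names, not the lecture's exact code, which uses Location/Field/Drunk classes):

```python
import random

def walk(numSteps):
    """Final distance from the origin after numSteps random unit steps."""
    x = y = 0
    for _ in range(numSteps):
        dx, dy = random.choice([(0, 1), (0, -1), (1, 0), (-1, 0)])
        x += dx
        y += dy
    return (x**2 + y**2) ** 0.5

def simWalksBuggy(numSteps, numTrials):
    # The bug under discussion: numTrials is passed where numSteps belongs,
    # so the simulation silently ignores numSteps.
    return [walk(numTrials) for _ in range(numTrials)]

def simWalksFixed(numSteps, numTrials):
    return [walk(numSteps) for _ in range(numTrials)]
```

With the buggy version the mean distance does not change as numSteps varies, which is the odd result the lecture uses (around 22:00) to motivate sanity-checking simulations.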

  • @smartdatalearning3312
    @smartdatalearning3312 3 years ago +1

    Another well-presented lecture, illustrated with Python examples

  • @akbarrauf2741
    @akbarrauf2741 7 years ago +9

    thanks, MIT

  • @batatambor
    @batatambor 4 years ago +8

    This is a very misleading class, in my humble opinion, because he is computing the average of distances and not the expected value of the random walk. The random walk should be a bell curve with its peak at distance zero, so the expected value of the walk is always zero for the Usual Drunk. What happens is that when the number of steps increases, the bell curve becomes wider and you have small probabilities of finding bigger distances, hence the 'mean' distance increases a little bit. However, the expected distance to the origin is still zero.

    • @stefankk2674
      @stefankk2674 4 years ago

      I didn't see that when I wrote my comment below. Yeah, you figured it out right, I guess.

    • @stefankk2674
      @stefankk2674 4 years ago +1

      What he is talking about as distances is basically the variance of the expected value you are talking about... I think.

    • @stefankk2674
      @stefankk2674 4 years ago

      Or rather the standard deviation.

    • @EOh-ew2qf
      @EOh-ew2qf 3 years ago

      But why is the expected distance to the origin zero?
      For a point that is 1 step away from the origin, there is a 3/4 chance for the second step to be even further away from the origin. So the distance will eventually get bigger and bigger.

    • @stefankk2674
      @stefankk2674 3 years ago

      This is what I wrote earlier:
      When talking about distances, the sign is not relevant. Say you have two trials with one step each, and let's only allow movements in x. Assume the first trial ends at -1 and the second one at +1. The mean of covered distance is then one, while the average distance from origin is zero. It's just two ways to look at the problem: the expected value of distance from origin is 0, while the average distance covered is not.
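
The distinction the replies are circling (mean endpoint vs. mean distance) is easy to check numerically; a minimal 1-D sketch, not from the lecture:

```python
import random

def endpoints(numSteps, numTrials):
    """Endpoint of a 1-D walk (+1 or -1 with equal probability), one per trial."""
    return [sum(random.choice((-1, 1)) for _ in range(numSteps))
            for _ in range(numTrials)]

random.seed(0)
ends = endpoints(numSteps=100, numTrials=20000)
mean_endpoint = sum(ends) / len(ends)                  # expected value: 0
mean_distance = sum(abs(e) for e in ends) / len(ends)  # grows like sqrt(numSteps)
```

The mean endpoint hovers near 0 while the mean distance comes out near sqrt(2·100/π) ≈ 8, which captures both sides of the argument at once: the expected position is zero, but the expected distance from the origin is not.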

  • @shobhamourya8396
    @shobhamourya8396 5 years ago +1

    Simulations are used in reinforcement learning

  • @phillipworts5092
    @phillipworts5092 9 months ago

    If we consider what was covered in the previous lecture about stochastic thinking, and we were trying to create a realistic model of the wandering drunk, wouldn't we need to add a factor affecting probability called fatigue, where the more the drunk walks, the more tired they get, and they start to either walk less or take smaller steps?

  • @actias_official
    @actias_official 4 years ago +1

    I think the first video he shows is not a Brownian motion simulation but rather a coarse-grained collision algorithm such as DPD or MPC.

  • @narnbrez
    @narnbrez 4 years ago +2

    Why not give the abstract class the "usual" method for walking and then override it in the inherited class? Method overriding and inheritance in one example
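
What the comment proposes, a default step method in the base class that subclasses override, could look like this (a sketch with hypothetical names, not the lecture's code):

```python
import random

class Drunk:
    """Base class carrying the 'usual' behaviour: four equally likely unit steps."""
    def takeStep(self):
        return random.choice([(0, 1), (0, -1), (1, 0), (-1, 0)])

class MasochistDrunk(Drunk):
    """Override the inherited step instead of redefining a stepChoices attribute."""
    def takeStep(self):
        return random.choice([(0.0, 1.1), (0.0, -0.9), (1.0, 0.0), (-1.0, 0.0)])
```

The field/simulation code can then call d.takeStep() without knowing which kind of drunk it holds.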

  • @kwokhoilam2451
    @kwokhoilam2451 5 years ago +2

    Good professor, makes things simple and fun

  • @rasraster
    @rasraster 6 years ago +31

    PLEASE - next time you film a class, show the screen whenever the teacher is discussing what's on it. Countless times I could not see what he was talking about.

    • @Robdahelpa
      @Robdahelpa 6 years ago +9

      Late for you, but for anybody else seeing this: in the first minute he alludes to the project files, which you should ideally download, and then you can view them in your own time :)

    • @hanwengu4408
      @hanwengu4408 4 years ago

      Maybe it is just a license thing.

    • @aazz7997
      @aazz7997 4 years ago +1

      @Anifco67 You are a fool. Use the lecture notes

    • @studywithjosh5109
      @studywithjosh5109 4 years ago

      Anifco67 if you are not watching this, you are a fool😂

    • @dodgingdurangos924
      @dodgingdurangos924 3 years ago

      @@aazz7997 If the NEW STUDENT is limited to downloading the slides, will those include the laser-pointing he's doing, or should the NEW STUDENT just randomly point their finger at a note on the slide and assume that "this here" or "that over there" is where he's laser-pointing?

  • @siniquichi
    @siniquichi 3 years ago +1

    Thanks Mr. Guttag and MIT

  • @leixun
    @leixun 4 years ago +4

    *My takeaways:*
    1. Why random walks 1:05

  • @minhsp3
    @minhsp3 2 years ago +1

    Show the damn screen!

  • @waltwhitman7545
    @waltwhitman7545 3 years ago

    His jokes are so good, and they fall flat way too often lol

  • @alute5532
    @alute5532 1 year ago

    Drunkard's walk:
    Simulate one walk of k steps,
    and n such walks.
    3 abstractions:
    1. Location (immutable)
    2. (Possible) Field
    3. The Drunk

  • @brendawilliams8062
    @brendawilliams8062 3 years ago

    He is standing next to his shadow and attached

  • @batatambor
    @batatambor 4 years ago +1

    If someone could help me: in the textbook there's another example of drunk, the EW Drunk, moving only on the horizontal axis, (-1, 0) and (1, 0). However, this drunk also gets farther away from the origin. But why? If after n steps he has an equal chance to step either W or E, wasn't he supposed to be back at the origin according to the law of large numbers? Isn't it the same as flipping n coins and counting the number of heads and tails?

    • @narnbrez
      @narnbrez 4 years ago

      Have you plotted it on a graph as the professor explains near the end of the video?
      I would expect an hourglass shape of end points. I would like to know what you found if you end up running this sim.

    • @batatambor
      @batatambor 4 years ago

      @@narnbrez I did not run the simulation because the result is presented in the textbook. The professor is computing the average of distances and not the expected value of the random walk. The random walk should be a bell curve with its peak at distance zero, so the expected value of the walk is always zero for the EW drunk. What happens is that when the number of steps increases, the bell curve becomes wider and you have small probabilities of finding bigger distances, hence the 'mean' distance increases a little bit. However, the expected distance to the origin is still zero.
      Kind of misleading IMO, but it is correct.

    • @stefankk2674
      @stefankk2674 4 years ago +2

      When talking about distances, the sign is not relevant. Say you have two trials with one step each, and let's only allow movements in x. Assume the first trial ends at -1 and the second one at +1. The mean of covered distance is then one, while the average distance from origin is zero. It's just two ways to look at the problem: the expected value of distance from origin is 0, while the average distance covered is not.

  • @adipurnomo5683
    @adipurnomo5683 1 year ago

    Nicely explained

  • @weilinglynn
    @weilinglynn 6 years ago +1

    Hi... has anyone here watched lecture 600.1X? I can't seem to find it. I need some help here. Thank you, much appreciated

    • @vitor613
      @vitor613 5 years ago +1

      It is on edX

  • @adiflorense1477
    @adiflorense1477 4 years ago

    the course at MIT is all meat. thanks MIT

  • @hamitdes7865
    @hamitdes7865 3 years ago +1

    Hey guys, I have one question for whoever reads this:
    is there any sorting algorithm which directly predicts the place of every element in the array, subsequently reducing the time complexity? I'm working on such an algorithm, so if it exists, please tell me.

    • @sharan9993
      @sharan9993 3 years ago

      Can you explain what you mean by predicts?
      Look at Timsort once

    • @hamitdes7865
      @hamitdes7865 3 years ago

      @@sharan9993 Consider this data: [3,1,5,7,2,9,10,4,6,5,2,14]
      Here min = 1,
      max = 14,
      total numbers = 12.
      Now consider the first element, 3.
      Here "predict" means to predict where 3's position should be in this array:
      position = ((3-1)/(14-1))*12 = 1.84, so 3's position should be second, which is good, because when you sort the list, 3 stands at the third position. And if we have more numbers, we don't have to compare every number with the others, because we only have to compare the number with the other number that is at our number's location.

    • @sharan9993
      @sharan9993 3 years ago

      @@hamitdes7865 what about list= [0.1, 10.6, 10.4, 10.5, 10.3, 10.1]

    • @hamitdes7865
      @hamitdes7865 3 years ago

      @@sharan9993 Actually, I have thought about this, and I'm still solving this problem, but if you know that the mean of the list is around (min+max)/2 then this algorithm is good.
      And if you have any thoughts about solving that problem, let me know; I'd be glad😇😇

    • @sharan9993
      @sharan9993 3 years ago

      @@hamitdes7865 Think about why it would work logically, instead of computationally, first. Why can we apply it to a general case?
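
The position-prediction formula in this thread is essentially the classification step used by interpolation search and flashsort; a minimal sketch of the idea (illustrative only, not a full sort):

```python
def predicted_index(value, lo, hi, n):
    """Interpolation estimate of value's slot in a sorted array of n items spanning [lo, hi]."""
    if hi == lo:
        return 0
    return int((value - lo) / (hi - lo) * (n - 1))

data = [3, 1, 5, 7, 2, 9, 10, 4, 6, 5, 2, 14]
lo, hi, n = min(data), max(data), len(data)
slot = predicted_index(3, lo, hi, n)  # int(2/13 * 11) == 1, i.e. near the front
```

On roughly uniform data most elements land close to their sorted position, but clustered data like the [0.1, 10.6, ...] example in the replies maps many values to the same slot, which is exactly the objection raised above.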

  • @ccindy951357
    @ccindy951357 7 years ago +4

    Excuse me, where can I find the material and slides of this lecture?

    • @mitocw
      @mitocw  7 years ago +15

      The course materials can be found on MIT OpenCourseWare at ocw.mit.edu/6-0002F16. Best wishes on your studies!

  • @aviral550
    @aviral550 2 years ago

    What was the point of this whole lecture? Is it that a random walk is not so random?

  • @abduogalal53
    @abduogalal53 4 years ago

    I didn't understand how it became 0.05?? Can anyone explain what he divided?

    • @lindgren.bjorn1
      @lindgren.bjorn1 3 years ago

      When the masochistic drunk moves on the y-axis, he on average moves 0.1 to the north ((1.1-0.9)/2). And since he moves on the y-axis 50% of the time, he moves 0.05 (0.1/2) to the north on average for every step he takes. See stepChoices in the class definition.
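
The arithmetic in this reply can be checked directly. A sketch assuming the step choices the reply describes for the masochistic drunk (up-steps of 1.1, down-steps of 0.9, unit east/west steps):

```python
# Step choices as described above: a biased north-south axis, unbiased east-west
stepChoices = [(0.0, 1.1), (0.0, -0.9), (1.0, 0.0), (-1.0, 0.0)]

# Each choice is equally likely, so the expected step is the plain average
expected_dx = sum(dx for dx, dy in stepChoices) / len(stepChoices)  # 0.0
expected_dy = sum(dy for dx, dy in stepChoices) / len(stepChoices)  # 0.05 north per step
```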

  • @ases4320
    @ases4320 4 years ago +1

    Watching a professor pointing at the wall was never so interesting...

  • @brendawilliams8062
    @brendawilliams8062 4 years ago

    Tens float towards you.

  • @tomaschmelevskij623
    @tomaschmelevskij623 6 years ago +38

    I love how lazy this guy is when it comes to math 😂 Need to calculate a probability? Blah, let's just let the code do that. Pythagoras for a triangle with sides 1x1? Nahh, can't be bothered... 😂 True programmer's approach IMO

    • @Momonga-s7o
      @Momonga-s7o 5 years ago

      Just like me when I fire up MATLAB to add 2 numbers

    • @mikevincent6332
      @mikevincent6332 5 years ago

      The maths comes in later; these are intros

  • @JohnbelMahautiere
    @JohnbelMahautiere 1 month ago

    Thanks

  • @leejosephcommon3246
    @leejosephcommon3246 5 years ago

    I wasn't sure if I would watch this drunk walk; however, if a TARDIS is in play... I can make some time

  • @augustinusntjamba4914
    @augustinusntjamba4914 3 years ago

    what software is being used here?

    • @mitocw
      @mitocw  3 years ago

      Python, see the course for more info at: ocw.mit.edu/6-0002F16. Best wishes on your studies!

  • @JohnbelMahautiere
    @JohnbelMahautiere 1 month ago

    iranium inheritance

  • @JohnbelMahautiere
    @JohnbelMahautiere 1 month ago

    prestige

  • @minhsp3
    @minhsp3 2 years ago

    When you attend a class, the professor faces the board and writes something on it.
    Do your eyes follow what he writes, or his back, or his butt?
    The video guys are pretty dumb; they think they have to show the speaker as much as possible.
    When the professor discusses some point on the result, the video guy should show the viewers the screen.
    Does that make sense to all of you?
    I am sure what I say does not make sense to you, since I am the only one pointing this out.
    In all my lectures, in the US or elsewhere, my face appears for only one minute, and the rest of the video shows what I write or the results of my equations.

  • @anonviewerciv
    @anonviewerciv 3 years ago

    Not-so-random. (21:21)

  • @JohnbelMahautiere
    @JohnbelMahautiere 1 month ago

    union

  • @quocvu9847
    @quocvu9847 1 year ago

    38:58

  • @syedadeelhussain2691
    @syedadeelhussain2691 7 years ago +1

    Python is tough

  • @minhsp3
    @minhsp3 2 years ago

    Show the damn screen.
    Who cares what the professor looks like?