10. Introduction to Learning, Nearest Neighbors

  • Published Jan 9, 2014
  • MIT 6.034 Artificial Intelligence, Fall 2010
    View the complete course: ocw.mit.edu/6-034F10
    Instructor: Patrick Winston
    This lecture begins with a high-level view of learning, then covers nearest neighbors using several graphical examples. We then discuss how to learn motor skills such as bouncing a tennis ball, and consider the effects of sleep deprivation.
    License: Creative Commons BY-NC-SA
    More information at ocw.mit.edu/terms
    More courses at ocw.mit.edu

COMMENTS • 106

  • @MM-uh2qk
    @MM-uh2qk 4 years ago +150

    Thank you, MIT. I just found out today that Professor Patrick passed away on July 19, 2019. I am immensely saddened by this news. I was actually looking forward to meeting you, but I guess that is no longer possible. Rest in peace, legend!

  • @AhmedSALAH-bb7un
    @AhmedSALAH-bb7un 2 years ago +10

    RIP Professor Patrick Winston; your work will definitely live on forever.

    • @dragolov
      @dragolov 2 years ago +2

      Deep respect!

  • @cristiannievesp
    @cristiannievesp 3 years ago +21

    "You have to be very careful about the confusion of correlation with cause. They see the correlation, but they don't understand the cause, so that's why they make a mistake"
    This is so simple but so meaningful!

  • @qzorn4440
    @qzorn4440 7 years ago +104

    This MIT OpenCourseWare is like eating potato chips: you cannot eat just one, or view just one lecture. Thank you.

    • @TGPadm
      @TGPadm 6 years ago +5

      Except these are healthy.

  • @user-ol2gx6of4g
    @user-ol2gx6of4g 6 years ago +12

    The last minute of the lecture is gold.

  • @BapiKAR
    @BapiKAR 5 years ago +4

    This is why I love this professor's lectures: so much passion along with simplicity and fun!

  • @MacProUser99876
    @MacProUser99876 2 months ago

    Boy, he packed a lot into this lecture, but made it so engaging!

  • @cheng-haochang3509
    @cheng-haochang3509 6 years ago +8

    The world is better with you. Thanks, Prof. Winston and MIT.

  • @bxh062000
    @bxh062000 9 years ago +10

    Professor Winston is the best. He is amazing!

  • @Andrei-fg6uv
    @Andrei-fg6uv 7 years ago +5

    ...aaaaand this is why MIT is one of the top educational institutions in the world! Thanks MIT!

  • @ranjeetchoice
    @ranjeetchoice 8 years ago +42

    Love this professor. Thanks MIT!

  • @2flight
    @2flight 6 years ago +1

    Thanks Patrick Winston for the lively presentations! Thanks MIT!!!

  • @saikumarmv1250
    @saikumarmv1250 6 years ago +1

    Really love the way the professor teaches; his confidence and body language are great. Thank you very much, sir.

  • @HangtheGreat
    @HangtheGreat 3 years ago +1

    His stories and jokes inspire learning and intuition about the subject. That's good teaching skill right there. I was lucky enough to meet teachers with this skill during my secondary school years, but I find it really rare in university-level education. Thank you, MIT and the late professor, for this lecture series.

  • @hengyue6596
    @hengyue6596 7 years ago +18

    I can't imagine how much the knowledge contained in this course is worth.

  • @pjakobsen
    @pjakobsen 5 years ago +1

    Excellent teacher, very organized. He has clearly taught this course many times.

  • @sassoleo
    @sassoleo 6 years ago +1

    These lessons just keep getting better and better

  • @EranM
    @EranM 6 years ago +5

    46:06 Another thing, not especially related to the topic: even when deprived of sleep, the brain works better in the middle of the day than at the start or end of it. The huge drops in performance happen when a subject would usually be asleep or needs sleep, while performance doesn't drop at all (and even rises relative to the usual sleeping time) during mid-day. So linear regression can tell you the obvious hypothesis (losing sleep = losing performance), while the cubic spline can teach you things you didn't even think of.
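
    A quick way to see that difference is to fit both a straight line and a cubic to the same points. This is a sketch in Python with made-up hours-awake/performance numbers (not the study's actual data), and a cubic polynomial stands in here for the lecture's spline:

        import numpy as np

        # Hypothetical performance scores over hours awake (NOT the study's data):
        # scores dip near the subject's usual sleep time and recover mid-day.
        hours = np.array([0, 6, 12, 18, 24, 30, 36, 42, 48], dtype=float)
        score = np.array([100, 95, 70, 60, 85, 80, 55, 45, 65], dtype=float)

        line = np.poly1d(np.polyfit(hours, score, 1))   # degree 1: only the overall downward trend
        cubic = np.poly1d(np.polyfit(hours, score, 3))  # degree 3: can show dips and recoveries

        print(line(24), cubic(24))  # the fits disagree most where the data is non-monotonic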

  • @mangaart3366
    @mangaart3366 3 years ago +2

    Amazing lecture. Thank you, MIT, for providing us with free knowledge!

  • @MathsatBondiBeach
    @MathsatBondiBeach 4 years ago

    Taught by Marvin Minsky, and a truly class act on so many levels.

  • @KulvinderSingh-pm7cr
    @KulvinderSingh-pm7cr 6 years ago +1

    He looks so cool!
    He's absolutely amazing in every way.

  • @martinmadsen1199
    @martinmadsen1199 7 years ago +9

    I started out thinking, "This is too slow." I'm now on day two, another 10-hour session. The pace of new information is just perfect. You are a great teacher!

    • @user-ol2gx6of4g
      @user-ol2gx6of4g 6 years ago +6

      I always put it on 1.25x speed, occasionally pause to ponder. ;)

  • @elivazquez7582
    @elivazquez7582 6 years ago

    Thank you, Patrick Winston!

  • @nb1587
    @nb1587 2 years ago +1

    I wish I were at MIT; such outstanding teaching.

  • @KulvinderSingh-pm7cr
    @KulvinderSingh-pm7cr 6 years ago +1

    Best professor, funny in a sense that is often senseless...
    Love this guy!!!!

  • @ally_jr
    @ally_jr 7 years ago

    Amazing lecturer!

  • @thechesslobster2768
    @thechesslobster2768 3 years ago

    Absolutely blessed to be able to get MIT courses for free.

  • @HaiderKhanZ
    @HaiderKhanZ 9 years ago +1

    Great lecture; he explains it with nice animations on the blackboard. Great for programmers :-)

  • @famishedrover
    @famishedrover 6 years ago

    Amazing teacher!

  • @yoyokagus9245
    @yoyokagus9245 9 years ago

    great lecture

  • @pokixd2298
    @pokixd2298 4 years ago

    As always great stuff

  • @angelmcorrea1704
    @angelmcorrea1704 3 years ago +1

    I love these lectures. Thanks, MIT and Mr. Patrick, for sharing.

  • @redthunder6183
    @redthunder6183 10 months ago

    I can't believe I just willingly watched an entire lecture.

  • @oguzhanakgun9591
    @oguzhanakgun9591 2 years ago

    What a great lecture..

  • @user-bt6bo8lb4l
    @user-bt6bo8lb4l 2 years ago

    amazing lecture

  • @suniltech7586
    @suniltech7586 9 years ago +1

    Good work, sir.

  • @michafilek6883
    @michafilek6883 7 years ago +2

    Incredible lecture, thanks MIT.

  • @sakcee
    @sakcee 2 years ago

    RIP Professor Winston!

  • @stephk42
    @stephk42 5 years ago +6

    When lecture is over, he just nods and walks away...

  • @acal790
    @acal790 9 years ago +1

    That was hella hilarious, the part about the Rangers and sleep deprivation. So the question should be: how do they get major decisions out of soldiers who are at only 25 percent ability? And naps do help immensely, if you can handle the rounds or just the nervousness.

  • @donbasti
    @donbasti 7 years ago +39

    The deeper you go into the series, the more hard-core programmers you meet in the comment section :D

    • @zingg7203
      @zingg7203 3 years ago +1

      Nicecore

    • @marceloflc
      @marceloflc 3 years ago

      Wait, so the majority here are programmers and not math people? I thought it would be the other way around. I don't know where one subject ends and the other begins anymore.

  • @WepixGames
    @WepixGames 4 years ago +1

    R.I.P Patrick Winston

  • @anishreddyellore6002
    @anishreddyellore6002 2 years ago

    Just Wow!!

  • @shumakriss
    @shumakriss 8 years ago

    Thanks for posting. Is there somewhere I can go to ask questions or discuss the lecture?

  • @amrdel2730
    @amrdel2730 6 years ago

    Very useful for students, graduate and PhD researchers, or anyone needing an introduction to the AI field. I guess the foundations you get here from Prof. Winston are a great basis for tackling or using any notions in this vast field. Thanks from Algeria.

  • @asmadjaidri1219
    @asmadjaidri1219 7 years ago

    Thanks a lot ^^

  • @thetranslator1044
    @thetranslator1044 7 months ago

    Legend.

  • @TheBirdBrothers
    @TheBirdBrothers 8 years ago +18

    Love his curmudgeonly persona, but always lively!

  • @ThePeterDislikeShow
    @ThePeterDislikeShow 9 years ago +16

    I'm surprised in the 21st century we still haven't found a way to reduce our need for sleep.

    • @KaosFireMaker
      @KaosFireMaker 9 years ago +5

      I present to you: coffee!

    • @ThePeterDislikeShow
      @ThePeterDislikeShow 9 years ago +2

      Coffee doesn't reduce the *need* for sleep. It just prevents you from getting what you need.

    • @KaosFireMaker
      @KaosFireMaker 9 years ago +15

      FortNikitaBullion It does if you BELIEVE!

    • @ThePeterDislikeShow
      @ThePeterDislikeShow 9 years ago

      Well, what I'm thinking of is something you could take that would make it feel like you had 8 hours of sleep, even though you didn't (or maybe only had 2 hours). Caffeine doesn't do that; it just makes it impossible to sleep without really improving your productivity.

    • @KaosFireMaker
      @KaosFireMaker 9 years ago +7

      FortNikitaBullion I understood what you meant.

  • @pmcate2
    @pmcate2 4 years ago

    Isn't the way to divide the graph into 4 areas ambiguous? That little triangle in the middle looks like it could be included in any of the four regions.

  • @sainiarvind3660
    @sainiarvind3660 2 years ago

    Good

  • @Jackeeba
    @Jackeeba 1 year ago

    At 23:01 the professor says "so that's just the dot product, right", but that's to say that cosine similarity = dot product, which is not precise, right? The dot product is only the numerator in this case.

  • @kevnar
    @kevnar 4 years ago

    I once used a nearest-neighbor algorithm to create a Voronoi diagram. I didn't even know there was a name for either of them; I was just playing around with pixels.
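
    That really is all a Voronoi diagram is: label every pixel with its nearest seed point. A minimal Python sketch (the seed coordinates here are made up):

        import numpy as np

        # Hypothetical seed points; labeling each pixel by its nearest seed
        # partitions the image into Voronoi cells.
        seeds = np.array([[20, 30], [80, 15], [50, 90]])
        h, w = 100, 100
        ys, xs = np.mgrid[0:h, 0:w]
        pixels = np.stack([xs.ravel(), ys.ravel()], axis=1)

        # squared Euclidean distance from every pixel to every seed
        d2 = ((pixels[:, None, :] - seeds[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1).reshape(h, w)  # Voronoi cell index per pixel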

  • @whutismyname
    @whutismyname 5 years ago

    Wish he could be my machine learning professor!!

  • @edusson
    @edusson 8 years ago

    Does anybody know who built the robot balancing the tennis ball? Thanks!

  • @sansin-dev
    @sansin-dev 4 years ago +1

    R.I.P. Professor Winston

  • @user-hf2dr7sh4y
    @user-hf2dr7sh4y 8 years ago

    I hope his x-axis grows at a much faster rate than his y-axis; otherwise the example used to get the idea across makes less sense. Still a great lecture though! Thumbs up.

  • @hdfhfdhdfhdfghdfh3306
    @hdfhfdhdfhdfghdfh3306 5 years ago

    Can anyone please help me?
    1. Regarding the robotic-hand solutions table:
    If I understand correctly, in the case of the robotic hand we start from an empty table and drop a ball from a fixed height onto the hand. When the hand feels the touch of the ball, it makes a random hit while we record its movements.
    Then, only if the hand detects after X seconds that the ball has hit the surface again does it conclude that the previous movement was successful, and it records the movements that produced the successful result in the table for future use.
    I guess there is a way to calculate where on the surface the ball fell, so that when the hand later feels the ball touch a region close to one it remembers, it will try the recorded movement closest to that point in the table.
    Now there are a few things I do not understand:
    A. The ball arrives at an angle, so touching the same point on the surface at different angles requires a different response. Our table only holds the touch point and the outcome; it knows nothing about the intensity or angle of the ball's fall. Won't the data in the table be corrupted, or never fully filled in?
    B. How do we update the table? We might drop a ball and, at first, when the table is empty, try a random hit that sends the ball flying off to the side, so we record nothing. This case could repeat itself over and over, leaving us with an empty table forever.
    It seems I did not quite understand the professor's words, hence these questions. I would be very happy if anyone could explain exactly what he meant by this method of solution.
    2. Regarding classifying by vector angle:
    If I understand correctly, we fill in the data we know in advance, and then when a new data point arrives about which we know little, we measure the angle its vector makes with the x-axis and check which group's angle matches best.
    Here is the point I do not understand. Suppose I have two groups: group 1 has points with very low y and very high x, and group 2 has points with high x and high y. When a new point arrives with low y and low x, the vector-angle method will probably assign it to group 1, although on paper the point looks more suitable for group 2.
    It seems that a simple division of the plane (as in the first case the professor presented) would give more accurate results than matching by vector angle? A numeric check of this concern follows below.
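
    Regarding point 2, here is a tiny Python check of that concern (the group centroids below are made up for illustration); it shows the angle method and the distance method can indeed disagree:

        import numpy as np

        def angle(v):
            # angle the vector makes with the x-axis, in degrees
            return np.degrees(np.arctan2(v[1], v[0]))

        group1 = np.array([10.0, 2.0])   # high x, low y (made-up centroid)
        group2 = np.array([5.0, 5.0])    # high x and high y (made-up centroid)
        new = np.array([1.0, 0.2])       # low x, low y

        # by vector angle, the new point matches group 1 almost exactly...
        print(abs(angle(new) - angle(group1)), abs(angle(new) - angle(group2)))
        # ...but by Euclidean distance it is closer to group 2
        print(np.linalg.norm(new - group1), np.linalg.norm(new - group2))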

  • @110Turab
    @110Turab 6 years ago +1

    Wonderful!

  • @xXxBladeStormxXx
    @xXxBladeStormxXx 8 years ago +4

    I don't think I'd even be able to walk after 36 hours of sleep deprivation.

  • @abjkgp
    @abjkgp 2 years ago

    What is the "comparitor" at 8:32? I couldn't find it on the web. Is it a misspelling of "comparator"?

  • @ffzcdbnc9679
    @ffzcdbnc9679 6 years ago

    Could someone post image kNN code in Java?

  • @magnfiyerlmoro3301
    @magnfiyerlmoro3301 7 years ago +1

    Did anyone understand the derivative of x at 40:00?

    • @MrFurano
      @MrFurano 7 years ago

      If you are talking about "x prime", that's not the derivative of x. It's a new random variable; more precisely, a random variable transformed from the original x.
      With the definition of "x prime", you can calculate its variance by plugging it into the formula. You will get 1.
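
      In other words, it's ordinary standardization. A small numpy sketch (the sample values are made up):

          import numpy as np

          # made-up samples with nonzero mean and non-unit spread
          x = np.random.default_rng(0).normal(5.0, 2.0, size=10_000)

          # "x prime": subtract the mean, divide by the standard deviation
          x_prime = (x - x.mean()) / x.std()

          print(x_prime.var())  # ~1.0, as described above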

  • @dragolov
    @dragolov 2 years ago

    Deep respect to Patrick Winston. And this solo was made using a kNN classifier: ua-cam.com/video/K2PQOgmlQwY/v-deo.html

  • @sauravfalia9676
    @sauravfalia9676 6 years ago +1

    Can someone help me expand the cos(theta) equation?

    • @freeeagle6074
      @freeeagle6074 6 months ago

      Take two vectors u, v in R^2 as an example. Let u = [x11, x12] and v = [x21, x22]. Then cos(theta) = (x11*x21 + x12*x22) / (sqrt(x11^2 + x12^2) * sqrt(x21^2 + x22^2)). If u = v, then x11 = x21 and x12 = x22, so cos(theta) = 1.
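
      The same computation in Python, for vectors of any dimension (a small sketch):

          import numpy as np

          def cosine_similarity(u, v):
              # the dot product is only the numerator; divide by both norms
              return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

          print(cosine_similarity(np.array([1.0, 2.0]), np.array([1.0, 2.0])))  # 1.0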

  • @samirelzein1978
    @samirelzein1978 2 years ago

    The longest and least efficient way to deliver the intuition!

  • @amitgrover
    @amitgrover 6 years ago +1

    At 41:45, the professor indicates that you cannot use AI to predict bankruptcies for credit card companies. That's like making cake without flour. Wouldn't the credit card company have relevant data to use AI to predict bankruptcies? Why is the answer "no"?

  • @user-cm7bb1cc4g
    @user-cm7bb1cc4g 2 years ago

    So... what is nearest neighbors??

  • @tdreamgmail
    @tdreamgmail 5 years ago +1

    Tough crowd. He's funny.

  • @katateo328
    @katateo328 1 year ago

    Some of the principles look very abstract and supernatural; humans have considered them a mystery and classified them as AI, but actually they are very simple. A computer could simulate them easily. The brain is small but can do a lot of things, not because of mystery but because of very simple structure.

  • @philippg6023
    @philippg6023 5 years ago +2

    With nearest-neighbors learning, I got 92% accuracy on the MNIST database (with Euclidean distance), and 97% with neural nets.
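
    A result in that ballpark is easy to check with scikit-learn; this is a sketch, and the exact accuracy depends on k, the split, and any preprocessing:

        from sklearn.datasets import fetch_openml
        from sklearn.model_selection import train_test_split
        from sklearn.neighbors import KNeighborsClassifier

        # MNIST digits as 784-pixel vectors
        X, y = fetch_openml("mnist_784", version=1, return_X_y=True, as_frame=False)
        X_train, X_test, y_train, y_test = train_test_split(
            X, y, test_size=10_000, random_state=0)

        knn = KNeighborsClassifier(n_neighbors=3, metric="euclidean")
        knn.fit(X_train, y_train)
        print(knn.score(X_test, y_test))  # test-set accuracy on raw pixels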

  • @keira1412
    @keira1412 6 years ago

    The sleep data is helpful to me. This professor is very typical of how a robotics professor would teach.

  • @dhruvjoshi8744
    @dhruvjoshi8744 4 years ago

    11:20, turn on captions... lol

  • @MattyHild
    @MattyHild 5 years ago +1

    C'mon, Pierre...

  • @surflaweb
    @surflaweb 5 years ago +1

    If you think this is about the kNN algorithm: it's not! Get out of here... this is not about the kNN algorithm.

  • @bubbleman2059
    @bubbleman2059 8 years ago

    lama

  • @luke8489
    @luke8489 3 years ago

    asdf

  • @robertmcintyre9023
    @robertmcintyre9023 2 years ago

    :(

  • @bryanjohnson7781
    @bryanjohnson7781 9 years ago

    There he GOES again: blaming poor GAIN control on his age.

  • @akshayakumars2814
    @akshayakumars2814 3 years ago

    Diet coke

  • @bensalemn271
    @bensalemn271 9 months ago

    this course is a waste of time...

    • @maar2001
      @maar2001 5 months ago

      Could you explain why you think that?