Support Vector Machine - Georgia Tech - Machine Learning

  • Published 13 Sep 2024
  • Check out the full Advanced Operating Systems course for free at: www.udacity.co...
    Georgia Tech online Master's program: www.udacity.co...

COMMENTS • 52

  • @hamed7600
    @hamed7600 7 years ago +83

    These two guys are hilarious! Eases the pain of learning :D

    • @leif1075
      @leif1075 3 years ago

      Why is learning a pain?

    • @DanielRodrigues-bx6lr
      @DanielRodrigues-bx6lr 3 years ago +3

      @@leif1075 According to this video (ua-cam.com/video/Z2N5a7XZWg8/v-deo.html) learning something new activates stress responses in the brain. I can't speak of the veracity of that claim, but the video seems to be good.

  • @fani5000
    @fani5000 6 years ago +8

    This format of learning through a dialog like this is fantastic! Thanks for posting :)

  • @Philippe.C.A-R
    @Philippe.C.A-R 7 years ago +24

    The data-revisionist and invisible-point joke is hilarious! These two guys make it easier to understand! Very good idea to have this noob-expert exchange to make it more accessible!

  • @tehuatzi
    @tehuatzi 7 years ago +7

    Michael's comment at 2:32-2:47 was super helpful, esp. for folks with weaker linear algebra backgrounds.

    • @moazzamali3587
      @moazzamali3587 5 years ago

      Can you please explain? I wasn't able to understand it.

  • @vtrandal
    @vtrandal 1 year ago

    Excellent. Thank you for making this video. I needed to know about hyperplanes in the context of support vector machines. And you nailed it.

  • @Nupur2308
    @Nupur2308 4 years ago +1

    Awesome video. This is one of the most intuitive explanations of SVM I've seen so far. And I'm coming from Andrew NG's course where he went about explaining it in a very roundabout way and I didn't grasp anything.

  • @SatishSharma-ff5ug
    @SatishSharma-ff5ug 3 years ago

    An awesome and creative way to present, by two genuine people.

  • @tymothylim6550
    @tymothylim6550 3 years ago

    Thank you very much for this video! It introduced me to the conceptual understanding behind SVMs!

  • @joshfann8495
    @joshfann8495 5 years ago +27

    I like the two-man system where one guy acts like he barely has any idea of what's going on (just like me).

  • @leiyin5544
    @leiyin5544 8 years ago +8

    Good explanation!

  • @sevilaybayatl6315
    @sevilaybayatl6315 4 years ago

    Thanks, very useful information!

  • @yuhaooo8143
    @yuhaooo8143 5 years ago +1

    Why is the top line w^T x + b = 1? I mean, shouldn't it be equal to the distance between the plane and the first positive point?

    • @anarbay24
      @anarbay24 4 years ago +1

      Because they want to classify positive points as +1, in order to distinguish the two cases.

  • @yanshi9071
    @yanshi9071 7 years ago +2

    Still can't figure out why it is 1 or -1 for points on those two lines. And if that's the case, what is the equation for points that don't lie on the lines? Is it something like w^T u + b > 1, and where does that come from?

    • @lobbielobbie1766
      @lobbielobbie1766 7 years ago +3

      Because at the beginning of this lecture he mentioned that this is a binary classification problem where the class variable has two levels, + and -, so he used +1 for the + level and -1 for the - level. In that sense the top grey line is wX + b = 1, and all the + data points above this line are in the zone wX + b >= 1. The same applies to the bottom grey line. Hope this helps, as I am also still learning.
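
The labeling convention in the reply above can be checked numerically. A minimal NumPy sketch (the values of w, b, and the points are made up for illustration, not taken from the video):

```python
import numpy as np

# Hypothetical separating hyperplane w^T x + b = 0 (made-up values).
w = np.array([1.0, 1.0])
b = -3.0

points = np.array([[3.0, 2.0],   # above the top grey line: w^T x + b >= 1
                   [1.0, 0.0]])  # below the bottom grey line: w^T x + b <= -1

scores = points @ w + b          # w^T x + b for each point
labels = np.where(scores >= 0, 1, -1)

print(scores)   # [ 2. -2.]
print(labels)   # [ 1 -1]
```

Points on the grey lines themselves would score exactly +1 or -1; everything further from the yellow line scores beyond those values.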

  • @fugangdeng4423
    @fugangdeng4423 2 years ago

    What does "quiz" mean at the end?

  • @NikolaRJK1
    @NikolaRJK1 7 years ago +2

    One question: Since the gray lines are different, shouldn't they have different b's (intercepts)? Like b1 and b2

    • @brianvaughan633
      @brianvaughan633 6 years ago +5

      The b is the intercept term for the yellow line, where w^T x + b = 0. When they move to the grey lines they set the expression equal to -1 and 1, but b itself remains constant.
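
The shared-intercept point in the reply above can be illustrated with a quick check that w^T x + b = -1, 0, 1 gives three parallel lines using one and the same b (w and b are made-up values):

```python
import numpy as np

w = np.array([2.0, 1.0])   # made-up weight vector
b = -4.0                   # one shared intercept term

# For a fixed x1, solve w[0]*x1 + w[1]*x2 + b = c for x2 on each line.
def x2_on_line(x1, c):
    return (c - b - w[0] * x1) / w[1]

x1 = 1.0
for c in (-1.0, 0.0, 1.0):          # bottom grey, yellow, top grey line
    print(c, x2_on_line(x1, c))

# All three lines have slope -w[0]/w[1]; only the right-hand side c
# changes, so they are parallel and b never needs a subscript.
```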

  • @user-pt1el8wc4d
    @user-pt1el8wc4d 7 years ago

    The plane helps you define the support vectors. The support vectors help you define the plane. But when I start with only a set of points, with no support vectors or planes, what do I do? How would I know which is which?

  • @johnmichaelkovachi3338
    @johnmichaelkovachi3338 6 years ago +4

    Why do we transpose w in the equation?

    • @lightningblade9347
      @lightningblade9347 6 years ago

      I have the same question. I can't understand how he wants to explain SVM while ignoring the most crucial part: why is w transposed in the equation?

    • @vishweshnayak2331
      @vishweshnayak2331 6 years ago +8

      w is a vector (a matrix of n rows and one column) representing the parameters, and x is another vector (n*1 again) representing a point. Since we can't multiply two matrices of dimension n*1, and y has to be a single value (a 1*1 matrix), we multiply w transpose (1*n) with x (n*1), which gives a 1*1 matrix: the sum of the products of each value of w with the corresponding value of x.

    • @BonsiownsGADU
      @BonsiownsGADU 6 years ago

      You rock
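
The shape argument in the thread above can be sketched in NumPy (an illustrative sketch; the numbers are made up):

```python
import numpy as np

# w and x are both column vectors of shape (n, 1); made-up values.
w = np.array([[2.0], [3.0]])
x = np.array([[1.0], [4.0]])

# w @ x would fail: (2,1) times (2,1) shapes don't align.
# w.T @ x is (1,2) times (2,1) -> a 1*1 matrix, i.e. the dot product.
y = w.T @ x
print(y)        # [[14.]]  (2*1 + 3*4)
```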

  • @peciarovazuli2370
    @peciarovazuli2370 6 years ago +1

    Please make a video about SVR.

  • @redsnow123456
    @redsnow123456 5 years ago

    Why is it w transpose? Why do we transpose w before multiplying with the feature vector x?

    • @dileep31
      @dileep31 4 years ago

      w is a vector, or a matrix of dimension n*1. The convention in machine learning is that vectors are column vectors. So both w and x are column vectors, and to multiply them as matrices we need to transpose one of them into a compatible shape. So we compute w.T * x, which is equivalent to the dot product of the two vectors.

  • @saptarshi9433
    @saptarshi9433 6 years ago +2

    It didn't mention the term "Support Vector Machine", apart from in the title of the graph.

    • @anarbay24
      @anarbay24 4 years ago

      Udacity's system lets you learn new concepts little by little. If you want more information, check out the other videos. These guys have managed to make one of the hardest topics very easy to understand!

  • @boonga585
    @boonga585 2 months ago

    5:39

  • @paperstars9078
    @paperstars9078 4 years ago

    How do they make the hand and the pen transparent?

    • @anarbay24
      @anarbay24 4 years ago

      I guess it is Udacity's own software. I searched for it for a long time, and a friend of mine who works at Udacity told me it is something they built in-house.

  • @wilfredomartel7781
    @wilfredomartel7781 7 years ago +2

    Excellent...
    Where can I find more videos on the same topic?

    • @iamdurgeshk
      @iamdurgeshk 7 years ago +1

      Udacity

    • @anarbay24
      @anarbay24 4 years ago

      Check out the Georgia Tech Udacity course. You will not regret it!

  • @16avnisharma
    @16avnisharma 6 years ago +2

    why transpose of W?

    • @lightningblade9347
      @lightningblade9347 6 years ago +1

      same question

    • @JaskaranSingh-fj1iw
      @JaskaranSingh-fj1iw 5 years ago

      Because w and x are two vectors/matrices of the same dimensions. You would have to transpose one of them to multiply them.

    • @anarbay24
      @anarbay24 4 years ago

      In order to have a valid multiplication, you need to transpose the w matrix.

  • @tsunghan_yu
    @tsunghan_yu 5 years ago +1

    0:54 I'm going to fix that... by putting a minus sign here LMAO

    • @anarbay24
      @anarbay24 4 years ago

      hahaha they are just awesome

  • @raise7935
    @raise7935 6 years ago

    lol

  • @aliparcheforosh4895
    @aliparcheforosh4895 2 years ago +1

    Not useful at all!