Lecture 6 - Support Vector Machines | Stanford CS229: Machine Learning Andrew Ng (Autumn 2018)

  • Published 15 Oct 2024
  • For more information about Stanford’s Artificial Intelligence professional and graduate programs, visit: stanford.io/ai
    Andrew Ng
    Adjunct Professor of Computer Science
    www.andrewng.org/
    To follow along with the course schedule and syllabus, visit:
    cs229.stanford....

COMMENTS • 40

  • @a7744hsc
    @a7744hsc 2 years ago +209

    SVM starts at 46:20

  • @coragon42
    @coragon42 2 years ago +22

    32:07
    It helps me to think of Laplace smoothing as
    Pr(observation gets label) = (count of observations with label)/(number of observations) -->
    Pr(observation gets label) = (count of observations with label + 1)/(number of observations + number of possible labels)
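
    A minimal Python sketch of that add-one rule (the function name and toy data are illustrative, not from the lecture):

    from collections import Counter

    def laplace_smoothed_probs(labels, label_space):
        # Pr(label) = (count + 1) / (n + number of possible labels)
        counts = Counter(labels)
        n, k = len(labels), len(label_space)
        return {lab: (counts[lab] + 1) / (n + k) for lab in label_space}

    # A label never observed still gets nonzero probability:
    print(laplace_smoothed_probs(["ham", "ham", "ham"], ["ham", "spam"]))
    # {'ham': 0.8, 'spam': 0.2}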

  • @samurai_coach
    @samurai_coach 2 years ago +2

    26:30 note: he explains the difference between the multinomial event model and the multivariate Bernoulli event model.

  • @kevinshao9148
    @kevinshao9148 10 months ago +2

    Thanks for the great video! One question about 8:00: if NIPS is one of your features, can you even train your model when no email contains NIPS? The MLE formula will yield a probability of 0. (Or is there no real training at all, since you get the analytic solution directly and prediction just uses the counting solution?) Thanks in advance for any advice!

    • @littleKingSolomon
      @littleKingSolomon 7 months ago

      We estimate the parameters in the analytical solution using MLE. If NIPS never occurred, Laplace smoothing resolves the zero-probability problem.
      Or perhaps you mean NIPS is not in your training dictionary, in which case a sentinel value is used to represent all words not present in the training data.
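
      A hypothetical sketch of both fixes (add-one smoothing plus an <UNK> sentinel) under the multinomial event model; the names and toy data are made up, not the lecture's code:

      from collections import Counter

      UNK = "<UNK>"

      def fit_word_probs(emails, vocab):
          # P(word k | class), add-one smoothed over vocab plus <UNK>
          counts, total = Counter(), 0
          for words in emails:
              for w in words:
                  counts[w if w in vocab else UNK] += 1
                  total += 1
          v = len(vocab) + 1  # +1 for the <UNK> sentinel
          return {k: (counts[k] + 1) / (total + v) for k in list(vocab) + [UNK]}

      vocab = {"buy", "cheap", "nips"}
      probs = fit_word_probs([["buy", "cheap", "buy"]], vocab)  # "nips" unseen
      print(probs["nips"] > 0)  # True: the likelihood product can no longer hit zero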

  • @MrSteveban
    @MrSteveban 1 year ago +4

    At 19:15, wouldn't it be more accurate to say multinoulli instead of multinomial, since the number-of-trials parameter of the multinomial distribution doesn't really apply here?

  • @jaskaransingh3200
    @jaskaransingh3200 1 year ago +1

    A doubt: when talking about the NIPS conference producing a zero probability in Naive Bayes — shouldn't the word NIPS not appear in the calculation of P(x|y=0) in the first place? The binary column vector of 10,000 elements won't contain this word, since it's not in the top 10,000 words, because it only started appearing recently.

    • @traveldiaryinc
      @traveldiaryinc 1 year ago +1

      I think he said a dictionary with 10k words where NIPS is the 6017th word; the dictionary doesn't necessarily contain the top 10k words.

    • @jayanjans
      @jayanjans 1 year ago

      NIPS is the 6017th word in the 10,000-word dictionary, but since the word doesn't appear in the mail received early on, and the MLE enters the prediction as a product, that product tends to zero. Once the word does start appearing in mail, the model's detection is still zero, because one factor of the product is already zero.
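
      A toy numeric illustration of that point (made-up probabilities, not from the lecture):

      import math

      # Unsmoothed MLE: "nips" was never seen in spam, so its factor is 0,
      # and one zero factor kills the whole class-conditional product.
      word_probs = {"buy": 0.30, "cheap": 0.20, "nips": 0.0}
      likelihood = math.prod(word_probs[w] for w in ["buy", "cheap", "nips"])
      print(likelihood)  # 0.0 no matter how strong the other evidence is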

  • @creativeuser9086
    @creativeuser9086 1 year ago +15

    too many side quests in this level

  • @deniskim3456
    @deniskim3456 2 years ago +49

    Don't buy drugs, guys.

    • @realize2424
      @realize2424 2 years ago +8

      I did drugs so I could become a machine learner!

    • @The_Quaalude
      @The_Quaalude 9 months ago +1

      A lot of software engineers take Adderall and micro doses of molly, shrooms, and acid 😂

    • @floribertjackalope2606
      @floribertjackalope2606 5 months ago

      too late

  • @abhigyanganguly1988
    @abhigyanganguly1988 2 years ago +2

    Just had a doubt: at 54:56, what does g(z) denote? Is it the sigmoid function?

    • @vigneshreddy6121
      @vigneshreddy6121 2 years ago

      Yes, it's the sigmoid function: when theta transpose x > 0, the sigmoid output will be > 0.5.
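
      A quick numeric check of that property (a sketch, not course code):

      import math

      def sigmoid(z):
          return 1.0 / (1.0 + math.exp(-z))

      # g(theta^T x) crosses 0.5 exactly where theta^T x crosses 0:
      for z in (-2.0, 0.0, 2.0):
          print(z, sigmoid(z), sigmoid(z) > 0.5)
      # -2.0 0.119... False | 0.0 0.5 False | 2.0 0.880... True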

  • @microwavecoffee
    @microwavecoffee 1 year ago +14

    They lost 😭

    • @karanbania2785
      @karanbania2785 1 year ago +2

      The exact thing I was wondering

    • @HyeonGon90
      @HyeonGon90 7 months ago +1

      so it doesn't need Laplace smoothing

  • @vemulasuman6995
    @vemulasuman6995 3 months ago

    Where can I find class notes for this lecture? Does anyone know?

    • @liaroy7346
      @liaroy7346 1 day ago

      cs229.stanford.edu/lectures-spring2022/main_notes.pdf

  • @fahyen6557
    @fahyen6557 1 year ago +2

    1/4 done!😵

  • @cam9751
    @cam9751 1 month ago +1

    Even Jabba the Hutt was interested and asked a question

  • @bakashisenseiAnimeIsLove
    @bakashisenseiAnimeIsLove 1 year ago +1

    At 35:21, shouldn't it be n_i in general instead of the 10,000 that is being added?

    • @timgoppelsroeder121
      @timgoppelsroeder121 1 year ago +3

      No, n_i is the number of words in the i-th email, but the term we add to the denominator in Laplace smoothing is the number of possible values, which in Andrew's example is the dictionary size = 10,000 (formula sketched after this thread).

    • @timgoppelsroeder121
      @timgoppelsroeder121 1 year ago

      I was wondering the same thing for a second
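
      For reference, the smoothed per-word estimate under the multinomial event model (as written in the CS229 notes; |V| is the dictionary size, 10,000 here):

      \phi_{k \mid y=1} = \frac{\sum_{i=1}^{m} \sum_{j=1}^{n_i} 1\{x_j^{(i)} = k \wedge y^{(i)} = 1\} + 1}{\sum_{i=1}^{m} 1\{y^{(i)} = 1\}\, n_i + |V|}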

  • @gnuhnhula
    @gnuhnhula 1 year ago +1

    Done!

  • @haoranlee8649
    @haoranlee8649 10 months ago +1

    laplace smoothy

  • @jaivratsingh9966
    @jaivratsingh9966 6 months ago +3

    Camera person: please don't move the camera so often next time. It should stay focused on what is written on the board. You keep tracking the professor and losing the content; we can match the voice to what's on the board, so the camera should always show what he's talking about. A bit of your and his hard work got wasted.

  • @maar2001
    @maar2001 9 months ago +1

    He needs to speak louder and more clearly... Otherwise it's a good lecture 👍🏾

    • @LoneWolfDion
      @LoneWolfDion 4 months ago

      Turn up your headphones. I listen at 2x speed and can understand him; when I went back to normal speed I understood less.

    • @survivinglife262
      @survivinglife262 13 days ago

      His volume is more than enough for me at laptop volume 35-40; maybe it's your phone/device at fault.