ML Teach by Doing Lecture 12: Perceptron Convergence Theorem

  • Published 15 Nov 2024

COMMENTS • 14

  • @DonaldCooper-o4v 3 months ago +1

    Great lecture! It's important to note that the basic perceptron algorithm doesn't automatically find the largest possible margin; it just tries to find any line that separates the groups correctly. I think of the margin like this: if you're trying to walk between two crowds, you'd prefer a wide path (large margin) rather than squeezing through a narrow gap (small margin). The wide path gives you more room for error and feels more secure.

  • @kaustavbora4789 4 months ago +1

    Sir, is it necessary to draw the circle from the origin? To enclose all the data points in the smallest possible circle, the center might not be at the origin. In that case, the radius will differ.

  • @abishekdhakal5786 6 months ago +2

    Lecture 12 completed. I tried to solve the assignment with the dataset provided in the previous lecture. I found that the perceptron makes around 128 mistakes with an enclosing circle of radius 4. However, I do have a question: isn't gamma the margin of the dataset (the minimum distance to the boundary line)? I did my calculation with this understanding, and I was wondering whether my concept or my calculation might be wrong in this regard.

    • @vizuara 6 months ago

      Very good work! Yes, gamma is the margin of the dataset.

  • @inigofdzdeangulo3803 3 months ago +1

    Gamma is the distance from the straight line to the closest point. But what is the straight line? From the drawing it looks like the final solution, since it separates the "x" and "o" points, but I don't see the sense of it. If it is the solution, that means you have already found it, and therefore you already know the number of iterations you needed. If it is not the solution, then what straight line should we use to calculate gamma? Thank you, sir.

  • @praveenchaturvedi2761 6 months ago +1

    Day 12 done.

  • @UbayedBinSufian-mx8mh 6 months ago +1

    Sir, I found gamma to be 0.353 and radius 3.605. From my calculations, perceptron can make a maximum of 104 mistakes.
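    As a quick check of this number: the convergence theorem bounds the mistakes by (R/γ)², and plugging in the values quoted above gives roughly 104. This is only a sketch; gamma and R are taken from the comment, not recomputed.

    ```python
    # Novikoff bound: the perceptron makes at most (R / gamma)**2 mistakes.
    # gamma and R are the values reported in the comment above, not recomputed.
    gamma, R = 0.353, 3.605
    bound = (R / gamma) ** 2
    print(f"mistake bound: {bound:.2f} -> at most {int(bound)} mistakes")
    # mistake bound: 104.29 -> at most 104 mistakes
    ```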

  • @rachanabk8275 6 months ago +1

    data = [[-2,3],[2,-1],[0,1],[-2,1],[0,-1],[2,-3]], label = [1,1,1,-1,-1,-1]
    Mistakes = 4, best_theta = [2,2], best_theta0 = 0, Radius = 3.7055, least_margin = 0.707, estimated max_errors = 27.46 ≈ 28
    Iteration t=1: th = [0, 0], th0 = 0, x = [-2, 3], y = 1, value = 0
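    The figures in this comment can be reproduced with a short script. The sketch below is one minimal implementation (perceptron with an offset term, radius taken as max ‖x‖ from the origin); with that radius convention R comes out as √13 ≈ 3.6056 and the bound as 26, slightly below the 27.46 quoted above, which presumably uses a different radius convention.

    ```python
    import math

    # Dataset and labels quoted in the comment above
    data = [(-2, 3), (2, -1), (0, 1), (-2, 1), (0, -1), (2, -3)]
    labels = [1, 1, 1, -1, -1, -1]

    def perceptron(data, labels, max_passes=100):
        """Cycle through the data, updating on each mistake, until a full clean pass."""
        theta, theta0, mistakes = [0.0, 0.0], 0.0, 0
        for _ in range(max_passes):
            clean_pass = True
            for (x1, x2), y in zip(data, labels):
                if y * (theta[0] * x1 + theta[1] * x2 + theta0) <= 0:  # mistake (or on the line)
                    theta[0] += y * x1
                    theta[1] += y * x2
                    theta0 += y
                    mistakes += 1
                    clean_pass = False
            if clean_pass:
                break
        return theta, theta0, mistakes

    theta, theta0, mistakes = perceptron(data, labels)

    R = max(math.hypot(x1, x2) for x1, x2 in data)         # radius of origin-centred enclosing circle
    norm = math.hypot(theta[0], theta[1])
    gamma = min(y * (theta[0] * x1 + theta[1] * x2 + theta0) / norm
                for (x1, x2), y in zip(data, labels))      # margin of the separator found
    bound = (R / gamma) ** 2                               # Novikoff mistake bound

    print(mistakes, theta, theta0)                         # 4 [2.0, 2.0] 0.0
    print(round(R, 4), round(gamma, 4), round(bound, 2))   # 3.6056 0.7071 26.0
    ```

    This reproduces Mistakes = 4, best_theta = [2,2], best_theta0 = 0 and least_margin = 0.707 exactly.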

  • @ananthkrish2634 14 days ago

    The margin of the dataset will also tell us how wrong a point is, right, w.r.t. x3?

  • @simransharma7482 7 months ago +1

    Sir, I got gamma as 0.3333 and radius 3.6. This gives me that the perceptron can make a maximum of 117 mistakes. Please reply whether this is correct or if I'm going wrong somewhere.

    • @vizuara 7 months ago +1

      Good work, Simran! I will highlight your submission in my next video. Thanks for being so proactive, I really appreciate it.

  • @KomalGoyal-2000 7 months ago +1

    @simransharma7482 Can you please mention for which hypothesis you did your computation? I mean, for which value of theta0 and which theta vector?

    • @simransharma7482 7 months ago

      Theta = (4, 4) and theta0 = 2

    • @vizuara 7 months ago

      Hello Komal! It won't be for a specific theta and theta0. We have to find the maximum number of mistakes the perceptron makes before reaching the optimal/best answer. After it makes a mistake, the perceptron updates its theta and theta0.