Logistic Regression - VISUALIZED!

  • Published 29 Nov 2024

COMMENTS •

  • @ulrikeschnaithmann3914
    @ulrikeschnaithmann3914 2 years ago +17

    This visualization is so strong, I feel like it's one of those lectures in university that were so good you'll never forget them!

    • @CodeEmporium
      @CodeEmporium  2 years ago +3

      Thank you for the amazing compliments! I definitely tried a lot with this one haha

  • @iyar220
    @iyar220 1 year ago +4

    I'm currently self-studying machine learning, and the very first thing I did after learning about decision boundaries was check if 3blue1brown had a video on it, because his animations are just incredible for understanding math and gaining intuition. The way this video is constructed, and again, the animations used here are incredible. Thank you so much for making this!

    • @CodeEmporium
      @CodeEmporium  1 year ago

      Glad you liked the visuals here. Honestly creating this video solidified my understanding of this as well :)

  • @adityagitte
    @adityagitte 9 months ago +1

    This video is exactly what I was looking for. I was getting confused between the decision boundary and the sigmoid function for a 2D input problem. Thanks a lot for the wonderful animations!

  • @alexanderdellorti6039
    @alexanderdellorti6039 1 year ago

    Thank you mate, I think visualization is so underestimated in universities; in math and science THIS is the key to teaching. Great job!

    • @CodeEmporium
      @CodeEmporium  1 year ago

      Thanks for the kind words. I felt this video didn't receive the attention I thought it should have. But glad there are others out there like you who find value here. :)

  • @matthiasmitchell4801
    @matthiasmitchell4801 3 years ago +2

    This is insanely helpful. I feel like I can actually use logistic regression libraries and have a good idea of what's happening. Generalizing this to higher dimensions was eye-opening--I never quite knew what was going on there.

  • @mysillyusername
    @mysillyusername 1 year ago

    A brilliant visualization of logistic regression, thanks for making this!

  • @RiteshSinghArya
    @RiteshSinghArya 2 years ago

    Beautifully explained

  • @bhuvandwarasila
    @bhuvandwarasila 1 month ago

    That was so fire! Nice bro!

  • @hypebeastuchiha9229
    @hypebeastuchiha9229 2 years ago

    This video deserves a million views

  • @apoorvshrivastava3544
    @apoorvshrivastava3544 5 years ago +2

    Dude, please upload regularly. You will earn more subscribers; you deserve millions of subscribers. Keep it up!

    • @CodeEmporium
      @CodeEmporium  5 years ago +3

      Thanks a ton! I had some life changing events take place in the last few months (graduated, travel, moved, new job). Now that things have settled a bit, I can be more regular :) Thanks for the support. Means a lot!

  • @olusanyatodd4083
    @olusanyatodd4083 3 years ago

    Oh wow! This is so great! I just learned the math in a class but this really explains the intuition! Thank you so much

    • @CodeEmporium
      @CodeEmporium  3 years ago +1

      Of course :) Thanks for watching

  • @bytesizebiotech
    @bytesizebiotech 4 years ago

    Love your stuff. I'm not a math major, and I've learned that you don't have to be one to understand, but it's kind of off-putting when people use a ton of symbols. They're not necessary to explain what is going on.

  • @BiranchiNarayanNayak
    @BiranchiNarayanNayak 5 years ago

    Excellent explanation of Logistic Regression

  • @taiworidwan194
    @taiworidwan194 2 years ago +1

    Thanks for the video. It is really helpful.
    Please, how can one optimize the coefficients of a logistic regression model using a genetic algorithm? (see the sketch below)
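
    A minimal sketch of one way this could be done (the population size, crossover, and mutation choices below are illustrative assumptions, not something shown in the video): evolve candidate [weights, bias] vectors and score each one by its log-likelihood on the training data.

    ```python
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def fitness(params, X, y):
        # Fitness = log-likelihood of the candidate [w..., b] (higher is better)
        w, b = params[:-1], params[-1]
        p = np.clip(sigmoid(X @ w + b), 1e-9, 1 - 1e-9)
        return np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

    def genetic_optimize(X, y, pop_size=50, generations=200, mutation_scale=0.1):
        rng = np.random.default_rng(0)
        n_params = X.shape[1] + 1                    # weights + bias
        pop = rng.normal(size=(pop_size, n_params))  # random initial population
        for _ in range(generations):
            scores = np.array([fitness(ind, X, y) for ind in pop])
            parents = pop[np.argsort(scores)[-pop_size // 2:]]  # keep the fitter half
            # Crossover: average two random parents, then mutate with Gaussian noise
            idx = rng.integers(len(parents), size=(pop_size, 2))
            children = (parents[idx[:, 0]] + parents[idx[:, 1]]) / 2
            pop = children + rng.normal(scale=mutation_scale, size=children.shape)
        best = pop[np.argmax([fitness(ind, X, y) for ind in pop])]
        return best[:-1], best[-1]                   # learned weights, bias

    # Usage: w, b = genetic_optimize(X, y) with X of shape (n, d) and y in {0, 1}
    ```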

  • @arpitsahni422
    @arpitsahni422 2 years ago

    Simply lovely, mate! This helped me connect everything

  • @jose4877
    @jose4877 3 years ago +1

    This was very cool! Thank you!

  • @Tntpker
    @Tntpker 3 years ago

    The visualization at the end is what we need more of

  • @aniketchhabra8912
    @aniketchhabra8912 3 years ago +1

    This is amazing!! Thanks a lot for sharing

  • @darasingh8937
    @darasingh8937 3 years ago

    Great video! Thank you for your time and creativity!

  • @shubhamsingh6884
    @shubhamsingh6884 4 years ago

    Great video. It really helped me get a better understanding of logistic regression. However, I have a couple of queries (see the sketch below) -
    What is the target function of logistic regression which is being learned (like in linear regression we have y = w.T*x)?
    To what curve do we fit the training data: the decision boundary or the sigmoid function (like in linear regression we fit the straight line defined above)?
    Thank you!!!!
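
    For reference, a rough numpy sketch separating the two objects the question asks about: the sigmoid of w.T*x + b is what gets fit to the 0/1 labels (by minimizing cross-entropy), while the decision boundary is just the level set where that fitted probability crosses 0.5 (names below are illustrative):

    ```python
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # The analogue of y = w.T*x in linear regression:
    # logistic regression models P(y = 1 | x) = sigmoid(w.T*x + b).
    def predict_proba(X, w, b):
        return sigmoid(X @ w + b)

    # Training fits this sigmoid curve/surface to the 0/1 labels by
    # minimizing the cross-entropy (negative log-likelihood) loss.
    def cross_entropy(X, y, w, b):
        p = np.clip(predict_proba(X, w, b), 1e-9, 1 - 1e-9)
        return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

    # The decision boundary is not what we fit; it is just where the
    # fitted probability crosses 0.5, i.e. where w.T*x + b = 0.
    def predict_label(X, w, b):
        return (X @ w + b >= 0).astype(int)
    ```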

  • @romanmelnyk1777
    @romanmelnyk1777 2 years ago

    Why doesn't the decision boundary depend on the activation function? So if I want a curved decision boundary, I don't have to change the activation function, but rather change my features? (see the sketch below)
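
    Right, the boundary is the set where w.T*x + b = 0, so its shape comes from the features, not from the sigmoid. A small sketch of that idea, assuming a scikit-learn-style pipeline (the toy data and degree are illustrative):

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import PolynomialFeatures

    # Toy data: points inside a circle are class 1, outside are class 0,
    # so no straight line can separate them in the original (x1, x2) space.
    rng = np.random.default_rng(0)
    X = rng.uniform(-2, 2, size=(500, 2))
    y = (X[:, 0] ** 2 + X[:, 1] ** 2 < 1).astype(int)

    # Same sigmoid, same model that is linear in its parameters; the curvature
    # comes purely from the squared/interaction features (x1^2, x1*x2, x2^2).
    model = make_pipeline(PolynomialFeatures(degree=2), LogisticRegression())
    model.fit(X, y)

    # In the expanded feature space the boundary is still a hyperplane, but
    # projected back onto (x1, x2) it looks like a circle.
    print(model.score(X, y))
    ```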

  • @Leibniz_28
    @Leibniz_28 5 years ago +3

    17:32 "m" dimensions or "d" dimensions?
    Great video, your content is really high quality

    • @CodeEmporium
      @CodeEmporium  5 years ago +2

      Yup. I didn't write "m" because I didn't want to confound it with the "m" in the number of iterations.

  • @PremKumarAmanchi
    @PremKumarAmanchi 1 year ago

    Can you make a video that explains the plot for 3 labels using 2 features? I am just curious how the sigmoid function looks for it.
    Amazing work!!!

    • @CodeEmporium
      @CodeEmporium  1 year ago

      Thanks so much for watching! I’ll keep your suggestion in mind and see if there is leeway to do this at some point (tho I don’t think it will be anytime soon admittedly).

  • @muhammadusman5521
    @muhammadusman5521 2 years ago

    Please explain how the values for the bias and weights are calculated (that sigma there). If n is also 1, then what are the values of y1 and x1? Please, I want to calculate and loop over them like you did (see the sketch below).
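
    Not the exact numbers from the video, but a minimal gradient-descent loop of the kind it describes; with n = 1 the sum (the sigma) has a single term, so each update just uses that one (x1, y1) pair:

    ```python
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def fit_logistic(X, y, lr=0.1, epochs=1000):
        """X: (n, d) features, y: (n,) labels in {0, 1}."""
        n, d = X.shape
        w = np.zeros(d)          # weights start at 0
        b = 0.0                  # bias starts at 0
        for _ in range(epochs):
            p = sigmoid(X @ w + b)       # predicted probabilities
            error = p - y                # gradient of cross-entropy w.r.t. the logits
            w -= lr * (X.T @ error) / n  # gradient step for the weights
            b -= lr * error.mean()       # gradient step for the bias
        return w, b

    # With n = 1 there is just one (x1, y1) pair feeding every update:
    w, b = fit_logistic(np.array([[2.0]]), np.array([1.0]))
    print(w, b)
    ```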

  • @amoughnieh
    @amoughnieh 1 year ago

    Thank you so much for this video!

    • @CodeEmporium
      @CodeEmporium  1 year ago +1

      You are very welcome. Thank you for watching

  • @mohanakumaran5815
    @mohanakumaran5815 5 years ago

    I really love this channel 😘
    Please post videos more often, not once in a blue moon 😁

    • @CodeEmporium
      @CodeEmporium  5 years ago +1

      Super glad you do! Had some life changing events come in the last few months (graduation, move, new job). But now that things have settled, I'll be more frequent. :)

  • @clearwavepro100
    @clearwavepro100 5 years ago +1

    Thank you! Super helpful on a lot of levels :)

  • @patite3103
    @patite3103 3 years ago

    You've done an amazing video! At 14:40 you show a plot in 3 dimensions. On the z-axis (the vertical one) the values should lie in the interval [0,1], which is not clear here. It's also not clear why the boundary is not a plane, since we have a scatter plot in 3 dimensions. I'm quite confused by the dimensions of the plot. (see the sketch below)
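
    If it helps untangle the dimensions, here is a small matplotlib sketch of what such a 3D view typically shows (the weights are made up, not taken from the video): the two features span the horizontal plane, the vertical axis is the predicted probability and lives on [0, 1], and the decision boundary is the line in the feature plane where that probability crosses 0.5, not a plane in 3D.

    ```python
    import numpy as np
    import matplotlib.pyplot as plt
    from mpl_toolkits.mplot3d import Axes3D  # noqa: F401 (only needed on older matplotlib)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    w1, w2, b = 1.5, -2.0, 0.5  # illustrative weights

    # Grid over the two features; the third axis is the model output in [0, 1].
    x1, x2 = np.meshgrid(np.linspace(-4, 4, 100), np.linspace(-4, 4, 100))
    p = sigmoid(w1 * x1 + w2 * x2 + b)

    fig = plt.figure()
    ax = fig.add_subplot(projection="3d")
    ax.plot_surface(x1, x2, p, alpha=0.7)
    ax.set_zlim(0, 1)  # probabilities live on [0, 1]
    ax.set_xlabel("x1"); ax.set_ylabel("x2"); ax.set_zlabel("P(y=1)")

    # The decision boundary is where w1*x1 + w2*x2 + b = 0 (i.e. P = 0.5):
    # a straight line in the (x1, x2) plane, drawn here at height 0.5.
    x1_line = np.linspace(-4, 4, 100)
    x2_line = -(w1 * x1_line + b) / w2
    ax.plot(x1_line, x2_line, 0.5 * np.ones_like(x1_line), color="red")
    plt.show()
    ```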

  • @javxa
    @javxa 4 years ago

    Oh boy, this visualization is incredible. I always knew there was a sigmoid function in 2D. Guess what... It was hidden in 3D LOL

    • @CodeEmporium
      @CodeEmporium  4 years ago +1

      The mystery has been solved by Detective Emporium.

  • @X_platform
    @X_platform 5 years ago +1

    These visualizations are hot!
    Love both of your channels :)

    • @kpratik41
      @kpratik41 4 years ago

      which is the other channel?

    • @X_platform
      @X_platform 4 years ago +1

      @@kpratik41 3blue1brown

  • @soryegetun529
    @soryegetun529 3 years ago

    awesome explanation
    thanks so much

  • @kdang7233
    @kdang7233 1 year ago

    Why do we need to use e^-x instead of any f(x) >= 0 for every x to express the probability? I forgot all the math I learned in high school, so At This Point I'm Too Afraid to Ask. (see the sketch below)
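
    One standard way to see where the e^-x comes from (the usual log-odds derivation, not specific to this video): logistic regression models the log-odds as a linear function of x, and solving that for the probability produces the sigmoid automatically.

    ```latex
    % Model the log-odds (logit) as a linear function of x:
    \log\frac{p}{1-p} = w^{\top}x + b
    % Exponentiate and solve for p:
    \frac{p}{1-p} = e^{\,w^{\top}x + b}
    \quad\Longrightarrow\quad
    p = \frac{e^{\,w^{\top}x + b}}{1 + e^{\,w^{\top}x + b}}
      = \frac{1}{1 + e^{-(w^{\top}x + b)}}
    ```

    Other nonnegative functions could also be squeezed into (0, 1), but the exponential is what falls out when the log-odds are modeled linearly, and it keeps the model smooth and monotonic, which gradient-based training relies on.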

  • @emilycoppens4603
    @emilycoppens4603 3 years ago

    This video was great, thank you

  • @urfiyogabama2589
    @urfiyogabama2589 3 years ago

    Great!! Thank you

  • @josephfdunphymba3241
    @josephfdunphymba3241 2 years ago

    Not seeing a link to the visualization program

  • @vijayendrasdm
    @vijayendrasdm 3 years ago

    Loved the explanation.
    But why do we use the sigmoid function?

    • @balajikannan7393
      @balajikannan7393 3 years ago

      A sigmoid function transforms any real number to a value between 0 and 1. In other words, the probability would lie between 0 and 1.

    • @vijayendrasdm
      @vijayendrasdm 3 years ago

      @@balajikannan7393
      There are thousands of functions that can map a real number to a number between 0 and 1, the step function being one of them.
      Then why pick the sigmoid out of the hat?

    • @jeverly
      @jeverly 2 years ago +2

      @@vijayendrasdm The sigmoid function is differentiable, which allows us to train our weights using gradient descent; the step function is not differentiable. (see the sketch below)
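
      A tiny sketch of that point: the sigmoid's derivative is positive everywhere, so gradient descent always gets a signal, while a step function is flat away from its jump.

      ```python
      import numpy as np

      def sigmoid(z):
          return 1.0 / (1.0 + np.exp(-z))

      def sigmoid_grad(z):
          # Closed form: sigma'(z) = sigma(z) * (1 - sigma(z)), strictly positive for every z
          s = sigmoid(z)
          return s * (1.0 - s)

      def step(z):
          return (z >= 0).astype(float)

      z = np.array([-4.0, -1.0, 1.0, 4.0])
      print(sigmoid_grad(z))  # always > 0: gradient descent gets a signal everywhere

      # The step function is flat away from its jump at 0, so a numerical derivative
      # there is 0 and gradient descent on its weights would never move.
      eps = 1e-4
      print((step(z + eps) - step(z - eps)) / (2 * eps))  # all zeros here
      ```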

  • @skewbinge6157
    @skewbinge6157 3 years ago

    thank you so much

  • @ermiassolomon7605
    @ermiassolomon7605 2 years ago

    What is the book at the beginning of the video?

    • @CodeEmporium
      @CodeEmporium  2 years ago

      That wasn’t a book; just me listing out some topics :)

  • @amarnathjagatap2339
    @amarnathjagatap2339 5 years ago +1

    Sir, it's amazing. Make more on ML!

  • @tawhidshahrior8804
    @tawhidshahrior8804 3 years ago

    DUDE YOU ARE A LIFE SAVER. Subbed and recommended it to my fellow MSc colleagues. Keep up the great work, brother.

    • @CodeEmporium
      @CodeEmporium  3 years ago

      Thanks a ton for the share :) And so happy this helps

  • @Simon-mv6zn
    @Simon-mv6zn 2 years ago

    9:03

  • @viddeshk8020
    @viddeshk8020 2 years ago

    Bro, please use dark theme