
Logistic Regression Part 3 | Sigmoid Function | 100 Days of ML

  • Published 15 Aug 2024
  • Explore the essential Sigmoid Function in the context of Logistic Regression. In this video, we'll break down the role and significance of the Sigmoid function, shedding light on how it transforms inputs into probabilities (a short illustrative sketch follows the timestamps below).
    Code used: github.com/cam...
    ============================
    Do you want to learn from me?
    Check my affordable mentorship program at: learnwith.camp...
    ============================
    📱 Grow with us:
    CampusX's LinkedIn: / campusx-official
    CampusX on Instagram for daily tips: / campusx.official
    My LinkedIn: / nitish-singh-03412789
    Discord: / discord
    E-mail us at support@campusx.in
    ⌚Time Stamps⌚
    00:00 - Intro
    00:17 - Problem with perceptron
    02:00 - The Solution
    08:33 - Understanding the equation
    15:55 - Sigmoid Function
    27:02 - Impact of Sigmoid Function
    38:20 - Code Demo
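
    For reference, a minimal NumPy sketch (my own, not the video's code linked above) of the sigmoid function discussed here:

        import numpy as np

        def sigmoid(z):
            # maps any real-valued score z = w.x + b to a probability in (0, 1)
            return 1 / (1 + np.exp(-z))

        print(sigmoid(np.array([-5.0, 0.0, 5.0])))   # ≈ [0.0067, 0.5, 0.9933]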

COMMENTS • 76

  • @rohitekka2674
    @rohitekka2674 2 years ago +51

    Nitish Sir, you're among the rare breed of educators who have the gift of instilling intuition in their learners.

  • @astikpatel6522
    @astikpatel6522 2 years ago +22

    A truly amazing teacher is hard to find, difficult to part with, and impossible to forget. Thank you, sir.

  • @sober_22
    @sober_22 1 year ago +11

    You removed my fear of logistic regression. India needs more teachers like you. As always, amazing.

  • @finaz_18
    @finaz_18 6 months ago +5

    You are the best teacher in the field of Data Science, sir.

  • @yashshrivastava1612
    @yashshrivastava1612 1 year ago +8

    Never knew this is how Logistic Regression works; I only knew that we use the sigmoid, but what actually goes into the sigmoid is clear now.
    You're awesome, man.

  • @AltafAnsari-tf9nl
    @AltafAnsari-tf9nl 8 months ago +3

    Even after knowing Logistic Regression for more than 2 years, today I felt as if I am learning a new algorithm. No one has taught it so well anywhere on the internet. 👌👌👌

  • @katadermaro
    @katadermaro 3 years ago +9

    Nitish, I cannot describe it in words, man. I learnt the sigmoid function from Andrew Ng's video and had so many confusions.
    Your explanation with the perceptron and why the sigmoid function can work better was fricking amazing, man. This is so intuitive. Thanks a ton!

    • @Ravi-sl5ms
      @Ravi-sl5ms 2 years ago

      I totally agree with you.

    • @abhishektehlan7814
      @abhishektehlan7814 2 years ago

      Bro, if we only need a continuous function, then why can't we just map (dot(w.x)>0 to y=1) and (dot(w.x)

    • @apratimmehta1828
      @apratimmehta1828 1 year ago

      @@abhishektehlan7814 For normalization, perhaps.

    • @sauravthakur2915
      @sauravthakur2915 11 months ago

      @@abhishektehlan7814 We didn't use the raw dot(w.x) because it can be a very large number; while computing y - y_hat, the difference would then be huge and the model would never converge. So, to squeeze the large number between 0 and 1, we use the sigmoid.
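
      A minimal NumPy sketch (mine, not from the video) of the point above: the raw score w.x can be arbitrarily large, while the sigmoid squashes it into (0, 1) so the update term y - y_hat stays bounded:

          import numpy as np

          def sigmoid(z):
              # squashes any real number into the open interval (0, 1)
              return 1 / (1 + np.exp(-z))

          w = np.array([3.0, -2.0])      # hypothetical weights
          x = np.array([50.0, -40.0])    # a point with large feature values

          z = np.dot(w, x)               # raw score: 230.0
          print(z, sigmoid(z))           # 230.0 vs ~1.0, so y - y_hat stays within [-1, 1]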

  • @shubhamagarwal2321
    @shubhamagarwal2321 2 months ago +1

    For the weight update in logistic regression, we should also note that the magnitude of X plays an important role: a correctly classified positive point near the line pushes more aggressively due to its lower sigmoid value (high 1 - sigmoid) than a correctly classified positive point far from the line; but since the coordinates (X) of the near point may themselves be smaller than those of the far point, the push might get diluted. (See the sketch below.)
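
    A small sketch (my own; weights and points are hypothetical) showing the two factors in the per-point update, lr * (y - sigmoid(w.x)) * x: the sigmoid gap and the magnitude of x:

        import numpy as np

        def sigmoid(z):
            return 1 / (1 + np.exp(-z))

        def push(w, x, y, lr=0.1):
            # per-point update: both the sigmoid gap and the size of x scale the push
            return lr * (y - sigmoid(np.dot(w, x))) * x

        w = np.array([1.0, 1.0, 0.0])       # [w1, w2, bias], hypothetical
        near = np.array([0.2, 0.1, 1.0])    # correctly classified positive point near the line
        far = np.array([4.0, 3.0, 1.0])     # correctly classified positive point far away

        print(push(w, near, 1))             # larger sigmoid gap, small coordinates
        print(push(w, far, 1))              # tiny sigmoid gap, large coordinates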

  • @sukritiguin5637
    @sukritiguin5637 6 months ago +2

    A true teacher, and motivation for students to learn from scratch.

  • @arslanahmed1311
    @arslanahmed1311 1 year ago +1

    Sir, you have clearly done a lot of research for these videos and put real effort into simplifying the concepts. I really appreciate the amount of effort you have put into making this logistic regression playlist.

  • @Ravi-sl5ms
    @Ravi-sl5ms 2 years ago +4

    Thanks, Nitish, for making this concept so crystal clear. The knowledge you are providing through this YouTube channel is incredible. I can't thank you enough.

  • @shivamsrivastava4750
    @shivamsrivastava4750 2 months ago +1

    Too good, sir; a really great explanation.

  • @educationalpoint725
    @educationalpoint725 11 months ago

    One of the most exceptional tutors I have ever seen. Kudos to your knowledge and methodology.

  • @thatsfantastic313
    @thatsfantastic313 1 year ago

    You alone are much, much better than all of those big institutes on YouTube. Please, please, please make more.

  • @competitive_coding_
    @competitive_coding_ 3 months ago

    Today I understood the logic behind the sigmoid. One of the best teachers in the world.

  • @user-nu2tw8bi7s
    @user-nu2tw8bi7s 3 months ago +3

    Crystal clear. SIMPLY AMAZING.

  • @user-bt6mh9ez3u
    @user-bt6mh9ez3u 4 months ago +1

    This machine learning playlist is the best.......... like, the best....

  • @arslanahamd7742
    @arslanahamd7742 2 years ago +2

    Amazing explanation, sir; your content is top notch. Your views should be in the millions.

  • @SaifaliKhan-zk6sh
    @SaifaliKhan-zk6sh 9 months ago +1

    Nice explanation: very in-depth, clean, and understandable. Thanks a lot, sir.

  • @mohammadvahidansari8212
    @mohammadvahidansari8212 3 years ago +4

    Good work, sir; you are the only one who gave a clear intuition of this algorithm. Thank you so much, and keep it up; you are doing such great work. 👌👌

    • @mohammadvahidansari8212
      @mohammadvahidansari8212 3 years ago

      Can you please upload other algorithms' intuition like this, for SVM, k-means, etc.? It will be very helpful....

  • @user-oq4do4kj1o
    @user-oq4do4kj1o 5 months ago

    Wow, the most awesome video I have ever seen for machine learning. Really, really helpful.

  • @kadambalrajkachru8933
    @kadambalrajkachru8933 2 years ago

    I saw many videos on YouTube about this algorithm, but no one was able to make it clear; sir, you explained the concept very well. Nice work, sir, keep it up...

  • @SumanSingh-gq5vn
    @SumanSingh-gq5vn 3 months ago +1

    Mind-blowing explanation, sir!! Thanks a lot!!

  • @princekhunt1
    @princekhunt1 6 months ago

    Boss-level explanation! Feels like we are watching an MLM (Machine Learning Movie) 👍

  • @anuragroy9045
    @anuragroy9045 5 months ago

    Long live the EDUCATOR, and long live CampusX ❤

  • @balrajprajesh6473
    @balrajprajesh6473 2 years ago

    Best teacher on this earth!

  • @ArunKumar-yb2jn
    @ArunKumar-yb2jn 2 years ago

    Give this guy a medal.

  • @AbdulHannan-dg6dl
    @AbdulHannan-dg6dl 2 years ago

    Superb! Lots of respect from Pakistan.
    Thank you so much.

  • @PS_nestvlogs
    @PS_nestvlogs 1 year ago

    You are amazing with your teaching skills.

  • @bluestone2523
    @bluestone2523 1 year ago

    Sir, your way of teaching is brilliant.

  • @ManojKrVerma-vw4dx
    @ManojKrVerma-vw4dx 2 years ago

    You are awesome, sir. You made it look very intuitive and easy. HATS OFF TO YOU.

  • @heetbhatt4511
    @heetbhatt4511 10 months ago +1

    Thank you sir

  • @tusharsalunkhe7916
    @tusharsalunkhe7916 1 year ago

    Superb explanation; thank you so much for all your efforts 🙏

  • @hrishikdebnath4756
    @hrishikdebnath4756 7 months ago +2

    Sir, you should write a book, and I will be the first to buy it. Your way of teaching is irreplaceable.

  • @harpritsinhyadav7271
    @harpritsinhyadav7271 2 years ago

    Thanks for the clear intuition about why we should use the Sigmoid function :)

    • @abhishektehlan7814
      @abhishektehlan7814 2 years ago

      Bhai, if we only need a continuous function, then why can't we just map (dot(w.x)>0 to y=1) and (dot(w.x)

  • @descendantsoftheheroes_660
    @descendantsoftheheroes_660 1 year ago +1

    No matter which video I try on YouTube, in the end this is the only one that makes sense to me.

  • @suhasshetty6607
    @suhasshetty6607 1 year ago

    This channel is gold

  • @ParthivShah
    @ParthivShah 5 months ago +1

    Thanks

  • @saqibalam8781
    @saqibalam8781 7 months ago +1

    You should upload the OneNote PDFs as well. Other than that, Nitish sir is beyond any MIT or IIT professor.

  • @laxmiagarwal3285
    @laxmiagarwal3285 1 year ago

    Very well explained, sir. Hats off to you.

  • @narendarveerla374
    @narendarveerla374 1 year ago

    Thank you for giving these wonderful lectures, sir.

  • @talibdaryabi9434
    @talibdaryabi9434 1 year ago

    That is an excellent explanation; however, the sudden move from the y_hat and sigmoid(y_hat) calculation at 29:13 to the distance and sigmoid(distance) calculation at 35:25 was somewhat surprising to me. I mean, you were calculating Wn = W0 + n(y - sigmoid(y_hat)).xi and suddenly changed to Wn = W0 + n(y - sigmoid(distance to line)).xi.
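
    For what it's worth, the two forms are closely related: w.x equals the signed perpendicular distance to the line scaled by the constant ||w||, so sigmoid(w.x) and sigmoid(distance) rank points identically. A tiny NumPy check (my own, hypothetical numbers):

        import numpy as np

        w = np.array([2.0, -1.0])    # hypothetical normal vector of the line w.x = 0
        x = np.array([1.5, 0.5])

        score = np.dot(w, x)                          # w.x = 2.5
        distance = score / np.linalg.norm(w)          # signed perpendicular distance
        print(score, np.linalg.norm(w) * distance)    # identical: w.x = ||w|| * distance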

  • @kindaeasy9797
    @kindaeasy9797 3 months ago

    Enjoyed it; that was fun!!!

  • @adityamishra6954
    @adityamishra6954 2 years ago

    Thank you, sir, for this wonderful lecture ☺️

  • @ashish_1729
    @ashish_1729 8 months ago

    God-level explanation, bro 🔥

  • @ShadabAlam-jz4vl
    @ShadabAlam-jz4vl 1 year ago

    Superb explanation👏💯

  • @gauravpundir97
    @gauravpundir97 1 year ago

    Great explanation, Nitish.

  • @rambaldotra2221
    @rambaldotra2221 3 years ago

    This knowledge is incredible ❤️

  • @mozammilkarim8636
    @mozammilkarim8636 1 year ago

    Thanks, sir; you are great.

  • @kislaykrishna8918
    @kislaykrishna8918 3 years ago

    Clearly understood ❤️👍

  • @ranirathore4176
    @ranirathore4176 2 years ago

    Thank you sir 🙏

  • @Noob31219
    @Noob31219 2 years ago

    You are great.

  • @yashjain6372
    @yashjain6372 1 year ago

    BRO, YOU ARE THE BEST

  • @dineshpandey7232
    @dineshpandey7232 1 year ago

    Awesome

  • @miss_anonymous16
    @miss_anonymous16 5 months ago

    Why do we need yi - yi_hat to be zero? At 14:41.

  • @hitinyadav3321
    @hitinyadav3321 2 years ago

    Sir, your volume is very low in the video, but the explanation is very good.

  • @abhishekdeulkar8971
    @abhishekdeulkar8971 1 year ago

    Sir, can you please explain what the logit function is?

  • @anshulsharma8152
    @anshulsharma8152 1 year ago

    38:07 FOR THE LAST TWO REMAINING POINTS:
    An incorrectly classified point on the +ve side at a greater distance gives a higher value when its y_hat is subtracted, and so a lower intensity of pulling, compared to a point that is also incorrectly classified but closer to the line, which gives a lower subtraction value and a greater pulling intensity.
    Here I think the logic fails, because initially you said an incorrect close point has low pulling intensity, and vice versa. (See the numeric check below.)
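
    A quick numeric check (my own sketch) of the pull factor y - sigmoid(w.x) for a misclassified point with true label y = 0 sitting on the positive side, at increasing distance from the line:

        import numpy as np

        def sigmoid(z):
            return 1 / (1 + np.exp(-z))

        y = 0
        for z in [0.5, 2.0, 5.0]:        # increasing (scaled) distance, all misclassified
            print(z, y - sigmoid(z))     # -0.62, -0.88, -0.99: the pull grows with distance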

  • @shubhamchoubey3948
    @shubhamchoubey3948 1 year ago

    How are the w1, w2 values obtained? 14:19

  • @prathameshmore5262
    @prathameshmore5262 2 years ago

    Sir, can you provide the graph images used while explaining?

  • @abhishektehlan7814
    @abhishektehlan7814 2 years ago

    Sir, if we only need a continuous function, then why can't we just map (dot(w.x)>0 to y=1) and (dot(w.x)

    • @purubhatnagar483
      @purubhatnagar483 6 months ago

      Hey bro, I believe the sigmoid function makes it easy to create a boundary, and it is more easily explained and optimized.

  • @kadambalrajkachru8933
    @kadambalrajkachru8933 2 years ago

    Sir, do you have paid courses on machine learning? Please send the link...

  • @avimishra9253
    @avimishra9253 1 year ago

    To Nitish sir and to anyone reading this: how do we know that these values of w0, w1, w2 will be suitable for calculating the z value???

    • @rushikesh8132
      @rushikesh8132 1 year ago +1

      As sir explained in the previous videos, we start with random values of w0, w1, w2, w3, ..., wn.
      These are our model's weights, which we try to correct as much as possible by iterating; in this case, by pushing and pulling the weights based on correct and incorrect points!
      Hope this helps!

    • @avimishra9253
      @avimishra9253 1 year ago

      @@rushikesh8132 Got it, brother, thank you for your help. So I'm left with 2 questions; if okay, please reply to them too:
      1. We have to calculate Z for every point, right?
      2. Suppose for a point we took random values of w0, w1, w2, then calculated its Z and put it into the sigmoid function, and then we check the conditions, right? Then how do we and the model know whether it is misclassified or correctly classified?

    • @rushikesh8132
      @rushikesh8132 1 year ago

      @@avimishra9253 Sure, bhai!
      1st question: yes, we do it for all the points.
      2nd: as per my understanding, we are not bothered about whether a point is misclassified or correctly classified; the weights get adjusted either way. This way, for every point, our model improves the weights according to the sigmoid value of that point. Hope that clears it! (A compact sketch of this loop follows below.)

  • @RohanOxob
    @RohanOxob 1 year ago

    27:34