Tutorial 3-Activation Functions Part-1

  • Published 10 Dec 2024

COMMENTS • 116

  • @rakshitraushan1650
    @rakshitraushan1650 4 years ago +3

    I think you are the only one who is going to make my DL awesome.

  • @sriramswar
    @sriramswar 5 years ago +8

    Very good and to-the-point explanation. A small suggestion: if the videos were prefixed with serial numbers, it would be easy to reference them, e.g. "03. Activation Functions Part-1", as this is the third tutorial in this playlist. The same suggestion applies to your earlier playlists too. Thanks for understanding!

  • @madhugarg7499
    @madhugarg7499 4 years ago +14

    It's just awesome as usual... Could you please upload Activation Functions Part 2?
    Many of your followers are requesting the same.
    So please upload it.

  • @madhugarg7499
    @madhugarg7499 4 years ago +20

    Hi Krish... Can we expect Activation Functions Part 2?
    Many subscribers have requested the same; hope you will post it!!!

  • @kingdomman1078
    @kingdomman1078 3 years ago +1

    I sincerely like your enthusiasm as you teach. Thanks!

  • @satishkundanagar3237
    @satishkundanagar3237 3 years ago +10

    The sigmoid function maps values from -inf to +inf into the range between 0 and 1; 0 and 1 are asymptotic values. Also, in the graph, the sigmoid function should pass through 0.5 on the y-axis when y = 0 (see the sketch after this thread).

    • @ne2514
      @ne2514 3 years ago +3

      I agree, the graph may be wrong in the video. But a good tutorial video nonetheless.
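
A minimal NumPy sketch (an editorial illustration, not from the video) confirms both points above: the sigmoid squashes any real input into (0, 1), and it crosses 0.5 exactly at y = 0.

```python
import numpy as np

def sigmoid(y):
    # Maps any real y into the open interval (0, 1).
    return 1.0 / (1.0 + np.exp(-y))

print(sigmoid(0.0))    # 0.5 -- the curve crosses the y-axis at 0.5, as noted above
print(sigmoid(-10.0))  # ~0.000045 -- approaches the asymptote at 0
print(sigmoid(10.0))   # ~0.999955 -- approaches the asymptote at 1
```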

  • @SuryaPrakash-xf5jv
    @SuryaPrakash-xf5jv 4 years ago

    I am currently a data science intern, and I suddenly got to know about your videos. All the concepts are explained really nicely. Though I know all the basics, I will still watch each and every video of yours. Thanks a lot for this amazing course.

    • @Hamidkhan-lr7qd
      @Hamidkhan-lr7qd 2 years ago

      I am also doing the same thing.

    • @sunny-wi7ul
      @sunny-wi7ul 2 years ago

      @@Hamidkhan-lr7qd great! This has let me achieve way more than I imagined 2 years back. All the best👍

    • @Hamidkhan-lr7qd
      @Hamidkhan-lr7qd 2 years ago

      @@sunny-wi7ul hopefully the same will be the case for me.

  • @nullf6950
    @nullf6950 5 years ago +2

    I like this whiteboard setup. As always, great video.

  • @harikrishna-harrypth
    @harikrishna-harrypth 3 years ago +1

    The G.O.A.T. (teacher) of AI tutorials = KRISH NAIK!!!!! Thanks so much for taking your valuable time to make these tutorial videos!!! GOD BLESS YOU MUCH!!! 🙏🙏👍👍👌👌

  • @rakeshacharjya8512
    @rakeshacharjya8512 5 years ago +32

    Why was Activation Functions Part 2 not uploaded?

    • @chandrimad5776
      @chandrimad5776 4 years ago +2

      I am having the same question. @Krish Naik, your videos are helping me learn deep learning from scratch. I am a beginner in this field and am diligently following your videos. Can you kindly upload Part 2 of the activation function video?

    • @rohitmania1
      @rohitmania1 4 years ago +2

      ua-cam.com/video/DDBk3ZFNtJc/v-deo.html&ab_channel=KrishNaik
      I think this is the one.

  • @shaz-z506
    @shaz-z506 5 years ago +6

    Good explanation, Krish. Please tell us more about how the vanishing gradient problem arises and how ReLU can help in such a scenario.
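
Since several comments ask about this, here is a brief sketch of the mechanism (an editorial note, not from the video): the sigmoid's derivative never exceeds 0.25, so backpropagating through many sigmoid layers multiplies many small numbers and the gradient vanishes, while ReLU's derivative is exactly 1 for positive inputs.

```python
import numpy as np

def sigmoid_grad(y):
    s = 1.0 / (1.0 + np.exp(-y))
    return s * (1.0 - s)          # peaks at 0.25, at y = 0

def relu_grad(y):
    return (y > 0).astype(float)  # exactly 1 for any positive input

# Best-case gradient surviving 10 stacked layers:
print(0.25 ** 10)  # ~9.5e-07 -- effectively vanished with sigmoid
print(1.0 ** 10)   # 1.0      -- ReLU passes the gradient through
```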

  • @DeepakKumar-mb2lw
    @DeepakKumar-mb2lw 11 months ago

    Straightforward explanation! Thanks, sir.

  • @laxya6779
    @laxya6779 4 years ago

    Best videos on YouTube, thank you so much sir 😇

  • @simanchalpatnaik2566
    @simanchalpatnaik2566 4 years ago +6

    Please upload the Activation Functions Part 2 video.

  • @priyagrandhi7918
    @priyagrandhi7918 3 years ago

    You're just awesome. Your deep learning videos are very clear and easily understandable by everyone. Thanks a ton, Krish.

  • @aghileslounis
    @aghileslounis 4 years ago +2

    Great video! PART 2 please!

  • @mdenamulhaque7589
    @mdenamulhaque7589 5 years ago

    In one word... awesome. Go ahead!

  • @w.a.imadhusanka1578
    @w.a.imadhusanka1578 1 year ago

    The things taught are well understood. Thank you sir 🥰🥰😇😇

  • @dipayanroy8357
    @dipayanroy8357 5 years ago +2

    Great video. Appreciate the effort

  • @prashanths4455
    @prashanths4455 3 years ago

    Though the sigmoid graph is wrong, Krish's explanation is super.

  • @uncommon_common_man
    @uncommon_common_man 5 years ago

    Very useful, best-quality videos.

  • @anilkshirsagar5624
    @anilkshirsagar5624 4 years ago

    Best way to teach...

  • @vaibhavgupta4413
    @vaibhavgupta4413 4 years ago +1

    Thank you sir
    Respect 🙏🙏🙏🙏🙏

  • @ramleo1461
    @ramleo1461 5 years ago +9

    Hi,
    Your videos are helpful.
    Is there an Activation Functions Part 2??

    • @rakeshacharjya8512
      @rakeshacharjya8512 5 years ago +2

      Yeah.. is there an Activation Functions Part 2?? @Krish Naik..

  • @soumyasrm
    @soumyasrm 5 years ago +1

    Nice explanation sir

  • @SalmanIbne
    @SalmanIbne 3 years ago

    All of your videos are really helpful...great explanation :)

  • @poojarai7336
    @poojarai7336 4 months ago

    A great round of applause!
    Thank you so much sir

  • @abdulqadar9580
    @abdulqadar9580 2 years ago

    Amazing Sir

  • @mansijadhav2997
    @mansijadhav2997 7 months ago

    Thank you so much for the video! Is there a Part 2?

  • @parakhsrivastava7743
    @parakhsrivastava7743 5 years ago +3

    Nice explanation...
    Does sigmoid work for multi-class classification too? But how, since it only gives values between 0 and 1?

    • @katipomusatheesh8501
      @katipomusatheesh8501 5 years ago +8

      Sigmoid is used only for binary classification. Normally softmax activation is used for multi-class classification, because all the probabilities should sum up to 1 (see the sketch after this thread).

    • @parakhsrivastava7743
      @parakhsrivastava7743 5 years ago

      @@katipomusatheesh8501 Thanks for the explanation

    • @niladribiswas1211
      @niladribiswas1211 4 years ago +1

      What is the domain of the sigmoid function? All reals, and its range is (0, 1).
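
To make the replies above concrete, here is a minimal sketch (editorial, with made-up scores) contrasting sigmoid and softmax on a 3-class score vector:

```python
import numpy as np

def sigmoid(y):
    return 1.0 / (1.0 + np.exp(-y))

def softmax(z):
    # Subtracting the max is the standard trick for numerical stability.
    e = np.exp(z - np.max(z))
    return e / e.sum()

scores = np.array([2.0, 1.0, 0.1])  # raw scores for 3 classes
print(sigmoid(scores).sum())        # ~2.14 -- independent values, not a distribution
print(softmax(scores))              # [0.659, 0.242, 0.099]
print(softmax(scores).sum())        # 1.0 -- a proper distribution over the classes
```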

  • @sudipsen04
    @sudipsen04 5 years ago +2

    Please mention the link to Part 2; I didn't find it in the playlist.

  • @santoshkumarsabat4086
    @santoshkumarsabat4086 4 years ago +1

    ❤️❣️ Thanks ❤️❣️

  • @shashiyadav4528
    @shashiyadav4528 4 years ago

    Really good

  • @akshat9722
    @akshat9722 4 years ago

    I think what you mean is that receptors pick up the signal from the hand and then pass it to the neuron with increased weight.

  • @svishaliyer2254
    @svishaliyer2254 5 years ago +3

    Hi Krish,
    I have one question.
    What if, with the sigmoid activation function, we get a value exactly equal to 0.5? In every other case the value is either greater than 0.5 or less than 0.5.

    • @manishsharma2211
      @manishsharma2211 5 years ago +1

      It will be 0

    • @yousufborno3875
      @yousufborno3875 4 years ago +1

      @@manishsharma2211 I believe it will be 1 if the sigmoid output is 0.5.

    • @shefeeknajeeb9064
      @shefeeknajeeb9064 4 years ago

      The threshold basically depends on your use case and is usually set to 0.5 for binary classification. In other cases, such as when your dataset is imbalanced, you might have to set the threshold higher or lower. So it is important to understand your data. Correct me if I'm wrong.

    • @SaptarsiGoswami
      @SaptarsiGoswami 4 years ago

      This is a common case for many problems. In that case you just break the tie at random, since the two classes are equally probable (see the sketch below for the usual coding convention).
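
For reference, the usual convention in code is simply >= (or >), which silently decides the 0.5 tie; a minimal sketch (editorial, not from the video):

```python
def classify(prob, threshold=0.5):
    # ">=" sends an exact 0.5 to class 1; "<" would send it to 0.
    # The convention is arbitrary -- what matters is picking one and,
    # for imbalanced data, tuning the threshold itself.
    return int(prob >= threshold)

print(classify(0.49))                # 0
print(classify(0.5))                 # 1 under the ">=" convention
print(classify(0.5, threshold=0.7))  # 0 -- a stricter threshold
```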

  • @meanuj1
    @meanuj1 5 years ago +1

    Thanks for the classroom-type lecture.

  • @sandipansarkar9211
    @sandipansarkar9211 4 years ago

    Hello Krish. Just finished this video. With your style of teaching, I don't believe in making notes. I hope that is fine from the interview standpoint. Please guide and do reply. Thanks.

  • @souravbiswas6892
    @souravbiswas6892 4 years ago +1

    What about softmax? Can you explain that as well?

  • @kothapallysharathkumar9743
    @kothapallysharathkumar9743 5 years ago +1

    Hi sir, please explain bias and weights.

  • @ashishchandra8391
    @ashishchandra8391 4 years ago +1

    Sir, waiting for Part 2.........

  • @kumarpiyush2169
    @kumarpiyush2169 4 years ago

    Hi Krish.. the sigmoid function delivers output in the range (0, 1), so how do we get the value 1 or 0 in a classification problem? I know that in logistic regression, if the output is 0.5 or greater, the result is 1. So sklearn takes the sigmoid output in the range (0, 1) and then converts it further (see the sketch below).
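
On the sklearn point above: predict_proba exposes the raw probabilities and predict applies the 0.5 cut for binary problems. A minimal sketch with made-up data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

X = np.array([[0.0], [1.0], [2.0], [3.0]])  # toy feature
y = np.array([0, 0, 1, 1])                  # toy labels

model = LogisticRegression().fit(X, y)
probs = model.predict_proba(X)[:, 1]  # sigmoid outputs in (0, 1)
print(probs)
print(model.predict(X))               # labels, thresholded internally at 0.5
print((probs > 0.5).astype(int))      # the same conversion done by hand
```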

  • @maithiltandel4762
    @maithiltandel4762 3 years ago

    Hey Krish! In one of my regression problems I used ReLU, but sigmoid is giving much better results. What could be the explanation for that? I have been in that confused state for months now, so I started to brush up on my basics!!

  • @chayankathuria7801
    @chayankathuria7801 4 years ago +1

    An incorrect graph of the sigmoid function is shown. At y = 0 it should be at 0.5, not 0. Please correct it.

  • @kin_1997
    @kin_1997 2 years ago

    amazing

  • @sekharpink
    @sekharpink 5 years ago +1

    Hi Krish,
    One question: if the output value of a record after applying the sigmoid activation function is < 0.5, then the output is 0, and if it's more than 0.5, the output is 1. What if the value is exactly 0.5, i.e., what is the output in this case?

    • @krishnaik06
      @krishnaik06 5 years ago +3

      If it is

    • @ganeshgaikwad3646
      @ganeshgaikwad3646 4 years ago

      @@krishnaik06 how do we decide on the 0.5 threshold in an ANN?
      We can decide the threshold in logistic regression based on the ROC curve and the kappa statistic.
      So is there any method in ANNs for selecting the threshold, or is it always 0.5? (See the sketch after this thread.)
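
On the threshold question above: the same ROC-based tuning used for logistic regression applies to any model that outputs probabilities, an ANN included. A minimal sklearn sketch with made-up labels and scores, picking the threshold that maximizes Youden's J statistic:

```python
import numpy as np
from sklearn.metrics import roc_curve

# y_true: ground-truth labels; y_prob: sigmoid outputs from any model (ANN included).
y_true = np.array([0, 0, 0, 1, 1, 1, 1])
y_prob = np.array([0.1, 0.4, 0.6, 0.35, 0.8, 0.9, 0.7])

fpr, tpr, thresholds = roc_curve(y_true, y_prob)
best = thresholds[np.argmax(tpr - fpr)]  # Youden's J = TPR - FPR
print(best)                              # 0.7 for this toy data
```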

  • @Adinasa2
    @Adinasa2 4 years ago

    Please give us an example of a use case of the ReLU function in regression.

  • @codesandroads
    @codesandroads 4 years ago +1

    Krish, many people want Part 2 of this. If it is uploaded on your channel, please pin this comment and provide the links; we are all facing the same problem while learning.

  • @IshuPatel-di8ll
    @IshuPatel-di8ll 5 months ago +1

    Sir, sound problem.

  • @deepanshupant8282
    @deepanshupant8282 4 years ago +1

    Is this the complete deep learning playlist, sir, or will you upload more?

  • @ga43ga54
    @ga43ga54 5 years ago +1

    Can we have a live Q&A session with you?

  • @mohammedkareem549
    @mohammedkareem549 2 years ago

    What do we depend on to select a suitable activation function?

  • @bibhupadhy4155
    @bibhupadhy4155 4 years ago

    For the sigmoid activation function, the value of y at x = 0 should be 0.5. What you are showing is a sigmoid activation function translated along the x-axis. You may have missed it, Krish. Kindly recheck!!

  • @ajaykumar-k4h7v
    @ajaykumar-k4h7v 1 year ago

    Can I apply sigmoid on one set of neurons in a hidden layer and ReLU on another set of neurons?

  • @raghavmanish24
    @raghavmanish24 2 months ago

    Is that 0.5 the theta (threshold) given in various numerical problems?

  • @deokumarjnu
    @deokumarjnu 4 years ago

    Hello Krish sir, in which case will the value of y be negative for the ReLU activation function? I just started learning deep learning along with ML.

  • @kanchanapallynikhilsai4347
    @kanchanapallynikhilsai4347 2 years ago

    Thanks bro

  • @louerleseigneur4532
    @louerleseigneur4532 3 years ago

    thanks sir

  • @praneethaluru2601
    @praneethaluru2601 4 years ago

    What is the difference between Sigmoid Activation and Batch Normalization?

  • @mukund198526
    @mukund198526 3 years ago

    Hi Krish... why is Activation Functions Part 2 not part of this playlist?

  • @soumyaranjansethi1790
    @soumyaranjansethi1790 4 years ago

    Hi Krish, nice video. I can't find Part 2 of the activation functions; could you please help me if possible?

  • @Thelee4music
    @Thelee4music 4 years ago +1

    Wonderful explanation of the topic, but please remove the extra wind noise from the video... it's irritating. I am so sorry I had to say this.

  • @فييي-ج5ق
    @فييي-ج5ق 1 year ago

    Hi Mr. Krish,
    where is Part 2 about the types of activation functions?

  • @blackkingrg3867
    @blackkingrg3867 4 years ago

    I was wondering why the YouTube algorithm doesn't show the second part of Activation Functions. I got the answer in the comments. Please upload the second part of the activation function video.

  • @AkashRaj-if6di
    @AkashRaj-if6di 3 years ago

    Will the bias be added separately to each Wi*Xi or to the whole sum? I am asking: (W1*X1 + B1) + (W2*X2 + B2) + (W3*X3 + B3) OR (W1*X1 + W2*X2 + W3*X3) + B.............. which one is correct???
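
For the question above: the standard formulation is the second one. Each neuron has a single bias added once to the whole weighted sum, z = (W1*X1 + W2*X2 + W3*X3) + B; per-term biases would just collapse into one constant anyway. A tiny sketch (editorial):

```python
import numpy as np

x = np.array([0.5, -1.0, 2.0])  # inputs
w = np.array([0.1, 0.4, -0.2])  # one weight per input
b = 0.3                         # ONE bias for the whole neuron

z = np.dot(w, x) + b            # (w1*x1 + w2*x2 + w3*x3) + b
print(z)                        # -0.45
```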

  • @shubhangiagrawal336
    @shubhangiagrawal336 4 years ago +1

    Part 2 video plzzzzz

  • @vinayak186f3
    @vinayak186f3 4 years ago

    Could you please upload the 2nd part of this?

  • @samraharif7510
    @samraharif7510 1 year ago

    It is hard to connect your previous video to the next video. Please guide.

  • @hiteshnettam3188
    @hiteshnettam3188 3 years ago

    If I am not wrong, the sigmoid activation function is depicted incorrectly. Please look into it; I would suggest an edit if possible.

  • @mahfuzraihan8690
    @mahfuzraihan8690 2 years ago

    A nice explanation, but I got confused by the ReLU activation function, max(0, x) for negative and positive inputs, and the related graph. Could anyone please help me understand this term?
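
On the ReLU confusion above: the function is f(x) = max(0, x), i.e., negative inputs are clipped to 0 and positive inputs pass through unchanged. A one-line sketch:

```python
import numpy as np

def relu(x):
    # max(0, x): negatives become 0, positives pass through unchanged.
    return np.maximum(0, x)

print(relu(np.array([-3.0, -0.5, 0.0, 0.5, 3.0])))  # [0. 0. 0. 0.5 3.]
```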

  • @sakshijaiswal1135
    @sakshijaiswal1135 4 years ago

    I think the formula will be y = 1/(1 + e^(-x))

  • @satya8411
    @satya8411 2 years ago

    Sir, what's the intuition behind ReLU?

  • @sohamajgaonkar3119
    @sohamajgaonkar3119 2 years ago

    Sir, please upload Part 2 of this video.

  • @PushpendraSingh-ub7if
    @PushpendraSingh-ub7if 2 years ago

    Sigmoid and Reluoid 😁 that was a human neural network at work 😂

  • @himabinduh7623
    @himabinduh7623 4 years ago

    Sir, could you please upload the 2nd part...

  • @ravineeshgoud8145
    @ravineeshgoud8145 4 years ago +1

    Why is the sigmoid function starting from the origin?

    • @SaptarsiGoswami
      @SaptarsiGoswami 4 years ago

      Let's say you take the value of y as -inf; then 1/(1 + e^(+inf)) = 1/inf, or 0. The sigmoid takes every value of y from -inf to +inf and squashes the output to between 0 and 1, so it only approaches 0 asymptotically and equals 0.5 at y = 0, rather than passing through the origin.

  • @naveenvinayak1088
    @naveenvinayak1088 4 years ago

    Krish, can we expect Activation Functions Part 2?

  • @nandalala7915
    @nandalala7915 4 years ago

    What if the value is 0.5?
    Will it be assigned to 1 or 0?

  • @hariprasad1744
    @hariprasad1744 4 years ago

    Can you please add Part 2?

  • @siddharthdedhia11
    @siddharthdedhia11 4 years ago

    Hello Sir, can you upload Part 2?

  • @Thelaunius
    @Thelaunius 4 years ago

    How can ReLU return a binary output?
    Sigmoid was already between 0 and 1, and we used a threshold.
    But ReLU just returns a positive number or zero. How can we make it return 0 or 1?

    • @SaptarsiGoswami
      @SaptarsiGoswami 4 years ago +1

      ReLU is used as the activation function in the hidden layers, and then sigmoid at the output neuron if it is a classification problem (see the sketch below).
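
The reply above describes the standard pattern; here is a minimal Keras sketch of it (assuming TensorFlow is installed; the layer sizes and input width are arbitrary):

```python
import tensorflow as tf

# ReLU in the hidden layers; sigmoid only at the output neuron,
# which squashes the final score into (0, 1) for binary classification.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),  # 8 input features (arbitrary)
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")
model.summary()
```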

  • @mizgaanmasani8456
    @mizgaanmasani8456 4 years ago

    In the sigmoid function, why is 0.5 always the value on the y-axis used for classifying data?

    • @SaptarsiGoswami
      @SaptarsiGoswami 4 years ago +1

      Well, it is assumed for simplicity; you can take the sigmoid output as the probability of class 1 given the value of y.

  • @jaisvarghese7304
    @jaisvarghese7304 2 years ago

    The curve of the sigmoid is wrong; half of it lies over the negative x-axis...

  • @ashwinshetgaonkar6329
    @ashwinshetgaonkar6329 3 years ago

    What an activation function is is explained, but why we need one is not.

  • @abhishekmanral9476
    @abhishekmanral9476 4 years ago

    Why did you not make Part 2?
    I'm in the middle of understanding this...

  • @ananthakumar7048
    @ananthakumar7048 5 years ago

    How do we choose the weight values, bro?

  • @girikgarg8
    @girikgarg8 1 year ago

    Done

  • @muhammadjaffarrazadar967
    @muhammadjaffarrazadar967 5 years ago

    Very helpful. Please make videos in Hindi also.

  • @saravananshanmugam4116
    @saravananshanmugam4116 5 years ago

    Softmax, please!

  • @kamal6762
    @kamal6762 5 years ago

    In the case of the ReLU function, which value is positive, "Y" or "X"? I mean, if the value of "X" is positive, then is the value of "Y" also positive, or what?

  • @priyankagupta5538
    @priyankagupta5538 2 years ago

    Please provide Part 2.

  • @GoogleUser-nx3wp
    @GoogleUser-nx3wp 1 year ago

    Mic quality is very bad, brother.

  • @אלהבריעקב
    @אלהבריעקב 2 years ago

    The graph of the sigmoid is incorrect... at 0 it should be 0.5.

    • @umeshr9734
      @umeshr9734 7 months ago

      It's between 0 and 1. Kindly check.

  • @akrsrivastava
    @akrsrivastava 4 years ago +1

    Activation functions are required because they introduce non-linearity.
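
The comment above is the key point; a quick numerical sketch (editorial) shows that without a non-linearity, two stacked linear layers collapse into a single equivalent linear layer:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4,))
W1, W2 = rng.normal(size=(4, 4)), rng.normal(size=(4, 4))

two_linear_layers = W2 @ (W1 @ x)  # no activation in between
one_linear_layer = (W2 @ W1) @ x   # a single equivalent layer
print(np.allclose(two_linear_layers, one_linear_layer))  # True

# With ReLU in between, no single matrix can reproduce the mapping:
nonlinear = W2 @ np.maximum(0, W1 @ x)
```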

  • @elahmedi24
    @elahmedi24 2 years ago

    Dear Krish, who is activating the activation function in your body? Simply, God is activating it. So do you worship Allah?