I think you are the only one who is going to make my DL awesome.
Very good and to-the-point explanation. A small suggestion: if the videos were prefixed with serial numbers, it would be easier to reference them, e.g. "03. Activation Functions Part 1", since this is the third tutorial in this playlist. The same suggestion applies to your earlier playlists too. Thanks for understanding!
It's just awesome as usual... Could you please upload Activation Functions Part 2? Many of your followers are requesting the same, so please upload it.
Hi Krish... Can we expect Activation Functions Part 2? Many subscribers have requested the same; hope you will post it!
I sincerely like your enthusiasm as you teach. Thanks!
The sigmoid function converts values from -inf to +inf into the range between 0 and 1; 0 and 1 are asymptotic values. Also, in the graph the sigmoid function should cross the y-axis at 0.5, i.e. sigmoid(0) = 0.5.
I agree, the graph may be wrong in the video. But it's a good tutorial video.
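To check the shape numerically, here is a minimal Python sketch (the sample inputs are made up just for illustration), showing that sigmoid(0) = 0.5 and that large-magnitude inputs approach the asymptotes 0 and 1:

```python
import numpy as np

def sigmoid(x):
    # Sigmoid squashes any real input into the open interval (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

# Sample inputs: large negative, zero, large positive.
for x in [-10.0, -1.0, 0.0, 1.0, 10.0]:
    print(f"sigmoid({x:+.1f}) = {sigmoid(x):.6f}")
# sigmoid(0.0) prints exactly 0.5, confirming the curve crosses the
# y-axis at 0.5; the extremes approach 0 and 1 but never reach them.
```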
I am currently a data science intern, and I recently came across your videos. All the concepts are explained really nicely. Though I know the basics, I will still watch each and every video of yours. Thanks a lot for this amazing course.
I am also doing the same thing.
@@Hamidkhan-lr7qd Great! This has let me achieve way more than I thought possible two years back. All the best 👍
@@sunny-wi7ul Hopefully the same will be the case for me.
I like this whiteboard setup. As always, great video.
The G.O.A.T. (teacher) of AI tutorials = KRISH NAIK!!! Thanks so much for taking your valuable time to make these tutorial videos!!! GOD BLESS YOU!!! 🙏🙏👍👍👌👌
Why was Activation Functions Part 2 not uploaded?
I have the same question. @Krish Naik, your videos are helping me learn deep learning from scratch. I am a beginner in this field and am diligently following your videos. Can you kindly upload Part 2 of the activation functions video?
ua-cam.com/video/DDBk3ZFNtJc/v-deo.html&ab_channel=KrishNaik
I think this is the one.
Good explanation, Krish. Please tell us more about how the vanishing gradient problem arises and how ReLU can help in that scenario.
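To make the vanishing-gradient point concrete, here is a small toy sketch of my own (not from the video): the sigmoid derivative is at most 0.25, so backpropagating through many sigmoid layers multiplies many small factors together, while the ReLU derivative is exactly 1 for positive inputs:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    s = sigmoid(x)
    return s * (1.0 - s)  # peaks at 0.25 when x = 0

layers = 20
# Product of per-layer derivatives in sigmoid's best case (x = 0),
# versus ReLU, whose derivative is 1 for any positive input.
print("sigmoid chain:", sigmoid_grad(0.0) ** layers)  # 0.25**20, about 9e-13
print("relu chain:   ", 1.0 ** layers)                # stays 1.0
```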
Straightforward explanation! Thanks, sir.
Best videos on UA-cam, thank you so much sir 😇
Please upload the Activation Functions Part 2 video.
You're just awesome. Your deep learning videos are very clear and easily understandable by everyone. Thanks a ton, Krish.
Great video! Part 2, please!
In one word... awesome. Go ahead!
The things taught are easy to understand. Thank you sir 🥰🥰😇😇
Great video. Appreciate the effort
Though the sigmoid graph is wrong, Krish's explanation is super.
Very useful, best-quality videos.
Best way to teach...
Thank you sir
Respect 🙏🙏🙏🙏🙏
Hi, your videos are helpful. Is there an Activation Functions Part 2?
Yeah... is there an Activation Functions Part 2? @Krish Naik
Nice explanation sir
All of your videos are really helpful...great explanation :)
Great! Applause!
Thank you so much sir
Amazing Sir
Thank you so much for the video! Is there a Part 2?
Nice explanation...
Does sigmoid work for multi-class classification too? But how, since it only gives values between 0 and 1?
Sigmoid is used only for binary classification. Normally softmax activation is used for multi-class classification, because all the probabilities should sum up to 1.
@@katipomusatheesh8501 Thanks for the explanation
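A minimal numpy sketch of softmax (the logits are hypothetical, chosen just for illustration), showing that the outputs sum to 1 across the classes:

```python
import numpy as np

def softmax(logits):
    # Subtract the max for numerical stability before exponentiating.
    z = logits - np.max(logits)
    e = np.exp(z)
    return e / e.sum()

logits = np.array([2.0, 1.0, 0.1])   # hypothetical 3-class scores
probs = softmax(logits)
print(probs)         # approx [0.659 0.242 0.099]
print(probs.sum())   # 1.0: the probabilities over all classes sum to 1
```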
What is the domain of the sigmoid function? All real numbers, and its range is (0, 1).
Please mention the link to Part 2; I didn't find it in the playlist.
❤️❣️ Thanks ❤️❣️
Really good
I think what you mean is that receptors pick up the signal from the hand and then pass it to the neuron with increased weight.
Hi Krish, I have one question: what if, with the sigmoid activation function, we get a value exactly equal to 0.5? In the other cases the value is either greater than 0.5 or less than 0.5.
It will be 0
@@manishsharma2211 I believe it will be 1 if the sigmoid output is 0.5.
The threshold basically depends on your use case and is usually set to 0.5 for binary classification. In other cases, such as when your dataset is imbalanced, you might have to set the threshold higher or lower. So it is important to understand your data. Correct me if I'm wrong.
This is a common case for many problems. In that case you just break the tie randomly, as the classes are equally probable.
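As a sketch of that idea (the probabilities and thresholds here are made up), the decision rule is just a comparison, so the cutoff can be moved for imbalanced data:

```python
import numpy as np

# Hypothetical sigmoid outputs for five records.
probs = np.array([0.10, 0.49, 0.50, 0.75, 0.95])

def to_labels(probs, threshold=0.5):
    # >= breaks the exact-0.5 tie toward class 1; > would break it toward 0.
    return (probs >= threshold).astype(int)

print(to_labels(probs))                 # default 0.5 cutoff: [0 0 1 1 1]
print(to_labels(probs, threshold=0.7))  # stricter cutoff, e.g. for
                                        # imbalanced data: [0 0 0 1 1]
```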
Thanks for the classroom-style lecture.
Hello Krish. Just finished this video. With your style of teaching, I don't believe in making notes. I hope that's fine from an interview standpoint. Please guide me, and do reply. Thanks.
What about softmax? Can you explain that as well?
Hi sir, please explain bias and weights.
Sir, waiting for Part 2...
Hi Krish... The sigmoid function delivers output in the range (0, 1), so how do we get the value 1 or 0 in a classification problem? I know that in logistic regression, if the output is 0.5 or greater, the result is 1. So sklearn takes the sigmoid output in the range (0, 1) and then converts it further.
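For what it's worth, here is a minimal scikit-learn sketch (the tiny one-feature dataset is made up) showing that `predict_proba` exposes the raw (0, 1) probabilities while `predict` applies the 0.5 cutoff internally:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Tiny made-up dataset with one feature, just for illustration.
X = np.array([[0.0], [1.0], [2.0], [3.0], [4.0], [5.0]])
y = np.array([0, 0, 0, 1, 1, 1])

clf = LogisticRegression().fit(X, y)

probs = clf.predict_proba(X)[:, 1]   # sigmoid outputs in (0, 1)
labels = clf.predict(X)              # hard 0/1 labels
print(probs)
print(labels)
print((probs >= 0.5).astype(int))    # reproduces predict()'s 0.5 cutoff
```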
Hey Krish! In one of my regression problems I used ReLU, but sigmoid is giving much better results. What could be the explanation for that? I have been confused about this for months, so I started brushing up on my basics!
An incorrect graph of the sigmoid function is shown. At x = 0, the output should be 0.5, not 0. Please correct it.
amazing
Hi Krish, one question. If the output value of a record after applying the sigmoid activation function is < 0.5, the output is 0; if it's more than 0.5, the output is 1. What if the value is exactly 0.5, i.e., what is the output in this case?
If it is
@@krishnaik06 How do we decide the 0.5 threshold in an ANN? In logistic regression we can choose the threshold based on the ROC curve and the kappa statistic. So is there any method for selecting the threshold in an ANN, or is it always 0.5?
Please give us an example use case of the ReLU function for regression.
Krish, many people want Part 2 of this. If it is uploaded on your channel, please pin this comment and provide the links; we are all facing the same problem while learning.
Sir, there is a sound problem.
Is this the complete deep learning playlist, sir, or will you upload more?
Can we have Live Q&A session with you?
What do we depend on to select a suitable activation function?
For the sigmoid activation function, the value of y at x = 0 should be 0.5. What you are showing is a sigmoid translated along the x-axis. You may have missed it, Krish. Kindly recheck!
Can I apply sigmoid to one set of neurons in a hidden layer and ReLU to another set of neurons?
Is that 0.5 the theta (threshold) given in various numerical problems?
Hello Krish sir, in which case will the value of y be negative for the ReLU activation function? I just started learning deep learning along with ML.
Thanks bro
Thanks sir
What is the difference between Sigmoid Activation and Batch Normalization?
Hi Krish... why is Activation Functions Part 2 not part of this playlist?
Hi Krish, nice video. I can't find Part 2 on activation functions; could you please help me if possible?
Wonderful explanation of the topic, but please remove the extra wind noise from the video... it's irritating. I'm so sorry I had to say this.
Hi Mr. Krish, where is Part 2 about the types of activation functions?
I was wondering why the YouTube algorithm doesn't show the second part on activation functions; I got the answer in the comments. Please upload the second part.
Is the bias added separately to each Wi*Xi, or to the whole sum? I am asking: (W1*X1 + B1) + (W2*X2 + B2) + (W3*X3 + B3), or (W1*X1 + W2*X2 + W3*X3) + B... which one is correct?
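The second form is the standard one: a neuron has a single bias added after the whole weighted sum, z = (W1*X1 + W2*X2 + W3*X3) + B. (The first form collapses to the same thing anyway, with B = B1 + B2 + B3.) A minimal numpy sketch with hypothetical weights and inputs:

```python
import numpy as np

w = np.array([0.4, -0.2, 0.7])   # hypothetical weights W1..W3
x = np.array([1.0, 2.0, 3.0])    # hypothetical inputs  X1..X3
b = 0.5                          # one bias per neuron, not one per weight

z = np.dot(w, x) + b             # (W1*X1 + W2*X2 + W3*X3) + B
print(z)                         # 0.4 - 0.4 + 2.1 + 0.5 = 2.6
```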
Part 2 video, please!
Could you please upload the second part of this?
It is hard to connect your previous video to the next video. Please guide.
If I am not wrong, the sigmoid activation function is depicted incorrectly. Please look into it; I would suggest an edit if possible.
A nice explanation, but I got confused about the ReLU activation function, max(0, x) for negative and positive values of x, and the related graph. Could anyone please help me understand this term?
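To clear that up: ReLU is simply f(x) = max(0, x), so negative inputs map to 0 and positive inputs pass through unchanged. A minimal sketch with hypothetical sample values:

```python
import numpy as np

def relu(x):
    # max(0, x): zero for negative inputs, identity for positive ones.
    return np.maximum(0, x)

xs = np.array([-3.0, -0.5, 0.0, 0.5, 3.0])
print(relu(xs))   # [0.  0.  0.  0.5 3. ]
```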
I think the formula will be y = 1 / (1 + e^(-x)).
Sir, what's the intuition behind ReLU?
Sir, please upload Part 2 of this video.
Sigmoid and Reluoid😁 that was human neural network at work😂
Sir, could you please upload the second part...
Why is the sigmoid function starting from the origin?
Let's say you take the value of y as -inf; then 1 / (1 + e^(-(-inf))) = 1 / (1 + e^(+inf)) = 1/inf = 0. The sigmoid takes the entire range of y from -inf to +inf and squashes the output to between 0 and 1.
Krish, can we expect Activation Functions Part 2?
What if the value is 0.5? Will it be assigned to 1 or 0?
Can you please add Part 2?
Hello sir, can you upload Part 2?
How can ReLU return a binary output? Sigmoid was already between 0 and 1 and we used a threshold, but ReLU just returns a positive number or zero. How can we make it return 0 or 1?
ReLU is used as the activation function in the hidden layers, and then sigmoid at the output neuron if it is a classification problem.
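A minimal Keras sketch of that layout (the layer sizes and the 4-feature input are my own assumptions, purely for illustration):

```python
import tensorflow as tf

# ReLU in the hidden layers, sigmoid only at the output neuron,
# so the final value lands in (0, 1) for binary classification.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),                  # hypothetical 4 input features
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")
model.summary()
```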
In the sigmoid function, why is 0.5 always the y-axis value used for classifying data?
Well, it is assumed for simplicity; you can treat the sigmoid output as the probability of class 1, given the value of y.
The sigmoid curve is wrong; half of it lies over the negative x-axis...
The video explains what an activation function is, but not why we need one.
Why did you not make its Part 2?
I'm in the middle of understanding it...
How do we choose the weight values, bro?
Done
Very helpful. Please make videos in Hindi also.
Softmax, please!
In the case of the ReLU function, does the condition apply to y or to x? I mean, if the value of x is positive, then the value of y is positive, or what?
Please provide Part 2.
Mic quality is very bad, brother.
The graph of the sigmoid is incorrect... at 0 it should be 0.5. It's between 0 and 1; kindly check.
Activation functions are required because they introduce non-linearity.
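A quick numpy sketch of why (the 2x2 weight matrices are random toys I made up): without a non-linear activation, two stacked linear layers collapse into one linear layer, so depth adds nothing:

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 2))   # hypothetical layer-1 weights
W2 = rng.normal(size=(2, 2))   # hypothetical layer-2 weights
x = rng.normal(size=2)

# Two linear layers with no activation...
two_layers = W2 @ (W1 @ x)
# ...equal one layer with the merged weight matrix W2 @ W1.
one_layer = (W2 @ W1) @ x
print(np.allclose(two_layers, one_layer))  # True

# Inserting ReLU between the layers breaks this collapse.
relu = lambda v: np.maximum(0, v)
print(np.allclose(W2 @ relu(W1 @ x), one_layer))  # typically False
```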
Dear Krish, who is activating the activation function in your body? Simply, God is activating it. So do you worship Allah?