I have a perfect example: my instructor gave me a 2.5-hour lecture on this and I was still confused, but your 7-minute video made it clear. It's a curse of dimensionality... great.
In very simple terms: when you are working with a large number of dimensions, discovering patterns becomes challenging. That is the curse of dimensionality.
Thanks for the effort. But what you explained is not the curse of dimensionality; it is simply increasing the number of model parameters, which leads to overfitting. The curse of dimensionality is about high-dimensional data, where the distribution of pairwise distances becomes almost independent of the data.
Hello my dear friend. I think you are confusing overfitting, underfitting, and the curse of dimensionality. Overfitting and underfitting usually happen when you don't select the right hyperparameters for the machine learning algorithm you are using. Here we are discussing attributes/features. Today I will also be uploading a video on overfitting and underfitting. Thanks, Krish
@@krishnaik06 No, I am not confused. When you have more features, your model needs more parameters, so you increase the complexity of the model; this leads to overfitting when you have little training data. The solution is either regularizing the parameters or reducing the dimensionality (of course, increasing the training data helps too). The curse of dimensionality is a different subject; please at least see the Wikipedia page: en.wikipedia.org/wiki/Curse_of_dimensionality
@@skyman7290 "The predictive power of a classifier or regressor first increases as the number of dimensions or features used is increased but then decreases,[4] which is known as Hughes phenomenon[5] or peaking phenomena" — this is from the very link you gave. Kindly check the article first before commenting on Krish's video. Peace!!
@@krishnaik06 Is it only the choice of hyperparameters that is responsible for underfitting and overfitting, or is selecting the right features also responsible?
@@krishnaik06 Surely it happens because the number of samples needed increases exponentially as the number of dimensions increases, even if the dimensions themselves increase only linearly?
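The distance-concentration claim made earlier in this thread is easy to check empirically. Below is a minimal sketch with NumPy and SciPy (uniform random data; the point count and dimensions are arbitrary choices, not anything from the video): as the number of dimensions grows, the gap between the nearest and farthest pairwise distances shrinks relative to the average distance.

```python
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)

for d in (2, 10, 100, 1000):
    X = rng.random((500, d))   # 500 random points in the d-dimensional unit cube
    dists = pdist(X)           # all pairwise Euclidean distances
    # Relative contrast: spread of distances relative to the mean distance.
    # It shrinks toward 0 as d grows, so "nearest" and "farthest" lose meaning.
    contrast = (dists.max() - dists.min()) / dists.mean()
    print(f"d={d:4d}  relative contrast = {contrast:.3f}")
```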
Could we select features using the L2 regularization coefficients, i.e. keep the features whose coefficients are not shrunk? Could that reduce the curse of dimensionality?
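A note on the regularization question above: L2 (ridge) shrinks coefficients toward zero but almost never exactly to zero, so for picking features it is more common to use L1 (lasso), which does zero coefficients out. A minimal sketch with scikit-learn on synthetic data, with an arbitrary alpha (an illustration, not the method from the video):

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso

# Synthetic data: 50 features, only 5 of which actually carry signal
X, y = make_regression(n_samples=200, n_features=50,
                       n_informative=5, noise=5.0, random_state=0)

# L1 regularization drives uninformative coefficients exactly to zero
lasso = Lasso(alpha=1.0).fit(X, y)

# Keep only the features whose coefficients were not shrunk to zero
selected = np.flatnonzero(lasso.coef_)
print(f"kept {selected.size} of {X.shape[1]} features: {selected}")
```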
One concern: when you say "exponentially", you are using it wrongly. The exponential increase required for reliable modeling is in the sample size, not in the number of features. Thank you for the good illustration though. Time: 5:55
same doubt tbh ++
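A back-of-the-envelope version of the sampling argument in the comment above: if you want each feature covered at a fixed resolution of, say, 10 bins, then filling the feature space at the same density requires on the order of 10^d samples, even though the number of features d grows only linearly (a minimal sketch, assuming this grid-coverage view of the problem):

```python
# Samples needed to keep ~10 bins per feature filled as dimensionality grows.
# Dimensions grow linearly (1, 2, 3, ...); the sample requirement grows as 10^d.
BINS_PER_FEATURE = 10

for d in range(1, 7):
    print(f"{d} feature(s): ~{BINS_PER_FEATURE ** d:>9,} samples")
```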
Great video!
Watched it a second time for better understanding and coding practice. Thanks
Please make a video on the math behind t-SNE.... Great video!! Thank you
simply explained.. thank you Krish
Thank you for this simple explanation :)
How do we know the threshold value for feature selection?
Thanks Krish
Thank you so much bro.. waiting for your next video
Great explanation Krish. No need to make notes, just understand. Thanks
Basically diminishing returns?
Does the curse of dimensionality necessarily happen only when the number of features increases exponentially?
Eagerly waiting for your next video. Please upload soon.
Thanks for explaining clearly sir..
Very well explained. Easy to understand.
Well explained! Thanks
Can you please make a video on OLPP?
Sir, thanks for the video. I am new to machine learning and this helped me.
This was a great video, explained very clearly!
Great
Kindly do a video on the Chunking and Lazy Learners methods.
What does accuracy mean here? Does it mean the model's ability to predict correctly?
Yes
Thank you bro
Thank You
TALK ENGLISH