The main use of LOOCV in penalized regression is to pick an appropriate penalty value. Say you create a grid of candidate penalty values, run LOOCV for each of them, and then pick the one that minimizes the LOOCV MSE, since that concentrates on prediction error.
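To make that concrete, here is a minimal numpy sketch of the idea (the toy data, the penalty grid, and the plain closed-form ridge solver are my own illustrative assumptions, not something from the video): for each candidate penalty it refits the model with every observation left out once and keeps the penalty with the smallest leave-one-out MSE.

```python
import numpy as np

def ridge_fit(X, y, lam):
    # Closed-form ridge solution: (X'X + lam*I)^(-1) X'y.
    # For simplicity the intercept column is penalized too; a refinement would leave it unpenalized.
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

def loocv_mse(X, y, lam):
    # Leave each observation out, refit on the rest, and average the squared prediction errors.
    n = len(y)
    sq_errors = []
    for i in range(n):
        keep = np.arange(n) != i
        beta = ridge_fit(X[keep], y[keep], lam)
        sq_errors.append((y[i] - X[i] @ beta) ** 2)
    return np.mean(sq_errors)

# Toy data: an intercept column plus three predictors (purely illustrative).
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(20), rng.normal(size=(20, 3))])
y = X @ np.array([1.0, 2.0, 0.0, -1.0]) + rng.normal(scale=0.5, size=20)

# Grid of candidate penalties: keep the one with the smallest LOOCV MSE.
lambdas = np.logspace(-3, 2, 20)
scores = [loocv_mse(X, y, lam) for lam in lambdas]
best = lambdas[int(np.argmin(scores))]
print(f"best lambda = {best:.4f}, LOOCV MSE = {min(scores):.4f}")
```

The point is exactly what the comment says: the penalty is chosen purely on out-of-sample prediction error, which is what the leave-one-out MSE estimates.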
Congrats! This is one of the best videos explaining this concept practically! Thank you! Keep up the great work!
Patrick Silva, awesome to hear that! I'll keep more coming. Had to take a hiatus, but more are on their way, including Deep Learning ones 😊
From now on, YouTube is my university
Incredible explanation!
Nice, you have a clear explanation of the concept
Thanks for the positive feedback Glenn. All the best!
Very helpful, thank you!
Very well explained!
nice video mate, thanks
Thanks for the compliment Moamen. I am uploading more to my new playlist on Unsupervised Learning. Please share with your friends.
What is the mean squared leave-one-out cross-validation error of using linear regression? (i.e. the model is y = β0 + β1x + noise)
x = (0, 2, 3)
y = (2, 2, 1)
Answer: (2² + (2/3)² + 1²)/3 = 49/27
Can you help me understand how the predicted values are calculated for linear regression?
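For anyone else wondering, here is a worked sketch of where the predicted values (and the 49/27) come from. Each leave-one-out fit is just the ordinary least-squares line through the two points that remain:
- Leave out (0, 2): the line through (2, 2) and (3, 1) is y = 4 - x, so the prediction at x = 0 is 4 and the squared error is (2 - 4)² = 2².
- Leave out (2, 2): the line through (0, 2) and (3, 1) is y = 2 - x/3, so the prediction at x = 2 is 4/3 and the squared error is (2 - 4/3)² = (2/3)².
- Leave out (3, 1): the line through (0, 2) and (2, 2) is y = 2, so the prediction at x = 3 is 2 and the squared error is (1 - 2)² = 1².
Averaging the three squared errors gives (2² + (2/3)² + 1²)/3 = (49/9)/3 = 49/27.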
Good
You sound South African. Are you from SA?
Hi lehlohonolo Papo. I'm originally from Zimbabwe but studied at Wits Uni for my degree and postgrad, and I'm working in SA 😊
@IQmates Nice. Good tutorial mate, I understand the concept. I wanted to know the difference between LOOCV and k-fold, and I got it now. I'm studying at the University of Western Cape btw, final year. So yes, ML 👍👍
@lehlohonolopapo888 Great! I'm glad you understand the concept. Please share the videos with your peers. It's motivating to know they are relevant to what you guys are studying :)
@IQmates I'll do so.
Thank you soooo much
Thank you! :)
What is the mean squared leave-one-out cross-validation error of using linear regression? (i.e. the model is y = β0 + β1x + noise)
x = (0, 2, 3)
y = (2, 2, 1)
Answer: (2² + (2/3)² + 1²)/3 = 49/27
Can anyone explain how we got this answer?
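A short numpy check (my own sketch, not from the video) that reproduces the answer above: each pass drops one point, fits the least-squares line through the other two with np.polyfit, and squares the prediction error on the dropped point. The three predictions come out as 4, 4/3 and 2, giving (2² + (2/3)² + 1²)/3 = 49/27 ≈ 1.8148.

```python
import numpy as np

x = np.array([0.0, 2.0, 3.0])
y = np.array([2.0, 2.0, 1.0])

sq_errors = []
for i in range(len(x)):
    keep = np.arange(len(x)) != i                        # leave observation i out
    slope, intercept = np.polyfit(x[keep], y[keep], 1)   # OLS line through the two remaining points
    pred = intercept + slope * x[i]                      # predict the held-out point
    sq_errors.append((y[i] - pred) ** 2)
    print(f"leave out ({x[i]:g}, {y[i]:g}): prediction = {pred:.4f}, squared error = {sq_errors[-1]:.4f}")

print(f"LOOCV MSE = {np.mean(sq_errors):.6f}  (49/27 = {49 / 27:.6f})")
```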
Hi! This is an amazing explanation. But I desperately need help on an assignment, and it is exactly along these lines. How can I ask you the question?
F