Machine Learning / Deep Learning Tutorials for Programmers playlist: ua-cam.com/play/PLZbbT5o_s2xq7LwI2y8_QtvuXZedL6tQU.html
Keras Machine Learning / Deep Learning Tutorial playlist: ua-cam.com/play/PLZbbT5o_s2xrwRnXk_yCPtnqqo4_u2YGL.html
I'm a deep learning programmer, and six years after this playlist was published, it's still absolutely insane 😍
Believe me, I have gone through hundreds of DL videos but couldn't understand them. You made DL so much easier. Each video in the playlist gets easier instead of more complex.
Really great and underappreciated channel. I plan to watch all of the rest of your videos as well; it's a great explanation of a very difficult topic.
Thanks, Karel!
Thank you very much for this video! The animations really help me understand the various methods for tackling underfitting!
The point about dropout was helpful! Thanks!
thank you very much for this clear and helpful explanation.
Thanks for this video, but concretely, what do you mean by:
- Increase the number of layers in the model?
- Increase the number of neurons in each layer?
- Change what type of layer we are using where?
(Question from a newbie who is just starting to learn ML, sorry)
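Not the video's code, but here is a rough sketch in plain Python (with made-up toy sizes) of what the first two points mean: each layer in a fully connected network is described by how many neurons it has, and every extra layer or extra neuron adds learnable weights, increasing the model's capacity. ("Changing the layer type" would mean swapping in something like a convolutional layer instead of a dense one.)

```python
# Toy illustration: count learnable parameters (weights + biases) of a
# stack of fully connected (dense) layers, given the size of each layer.
def dense_param_count(layer_sizes):
    """Each pair of adjacent layers contributes an n_in x n_out weight
    matrix plus n_out biases."""
    return sum(n_in * n_out + n_out
               for n_in, n_out in zip(layer_sizes, layer_sizes[1:]))

small  = [10, 8, 2]     # 10 inputs, one hidden layer of 8 neurons, 2 outputs
deeper = [10, 8, 8, 2]  # "increase the number of layers": one extra hidden layer
wider  = [10, 64, 2]    # "increase the number of neurons in each layer"

print(dense_param_count(small))   # 106
print(dense_param_count(deeper))  # 178
print(dense_param_count(wider))   # 834
```

So both "deeper" and "wider" mean the same thing at bottom: more parameters for the model to learn with.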
{
"question": "When training a neural network, we can reduce underfitting by",
"choices": [
"decreasing dropout rate",
"making small changes to the learning rate",
"removing data from the training set",
"increasing dropout rate"
],
"answer": "decreasing dropout rate",
"creator": "Kyla",
"creationDate": "2021-04-17T02:41:56.637Z"
}
Thanks, Kyla! Just added your question to deeplizard.com/learn/video/0h8lAm5Ki5g
Thank You Very Much for this!!! ❤️❤️
Please, if you can, try to make a detailed video on Dropout!
Thank You again! Best of Luck!
{
"question": "The problem of underfitting in a neural network can be tackled in all of the following ways except:",
"choices": [
"Increasing the amount of data through data augmentation",
"Increasing the complexity of the neural network model",
"Increasing the number of features that are used in the data",
"Decreasing the rate of dropouts in layers that have them"
],
"answer": "Increasing the amount of data through data augmentation",
"creator": "saluk",
"creationDate": "2020-08-27T15:49:02.774Z"
}
Thanks, saluk! Just added your question to deeplizard.com/learn/video/0h8lAm5Ki5g :)
I spotted a slight typo in the article for this video
improve it’s accuracy
↓
improve its accuracy
Fixed, thanks Chris! :D
What is the point of the validation set if it doesn't validate with the dropout that we designed? Is it just a feature for debugging the underfitting problem?
What is the point of dropout when you can set the number of nodes in each layer? At first, the only difference I see is that dropout does not affect the validation set. Why would we prefer dropout over reducing the number of nodes?
Both the number of nodes and dropout can be adjusted simultaneously. Dropout randomly drops a portion of the nodes on each training pass, so reducing the number of nodes doesn't have the same effect as dropout. For example, 0.5 dropout in a two-node layer may drop the first node on one pass and the second node on the next.
@@bradyhuang5606 I needed that. Thank you.
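A toy NumPy sketch (mine, not from the video) of the point in the reply above: dropout samples a fresh random mask of nodes to silence on every training pass, so a different sub-network trains each time, which is not the same as permanently using fewer nodes.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(activations, rate):
    """Zero out each node with probability `rate`, scaling the survivors
    by 1/(1-rate) so the expected activation stays the same
    (the 'inverted dropout' convention)."""
    mask = rng.random(activations.shape) >= rate
    return activations * mask / (1.0 - rate)

a = np.ones(4)               # activations of a 4-node layer
out1 = dropout(a, rate=0.5)  # one random subset of nodes is silenced...
out2 = dropout(a, rate=0.5)  # ...and likely a different subset next pass
```

At validation/test time the mask is simply not applied, which is why dropout "does not affect the validation set."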
{
"question": "Underfitting is simply unsuccessful training",
"choices": [
"True",
"False",
"Kinda",
"Not at All"
],
"answer": "True",
"creator": "Tilkikelile",
"creationDate": "2021-11-30T00:07:18.358Z"
}
So how can a company like Tesla be sure that its neural net has enough capacity to learn level 5 self-driving? In other words, are there ways to compute the learning capacity of a neural net?
Hi deeplizard,
What do you mean when you talk about reducing complexity? I know that you're talking about decreasing the number of nodes in a layer, but how does that help with overfitting or underfitting? In fact, an even more essential question: how does the number of nodes in a layer help or harm the model's fit to the data?
Thanks for your videos btw.
Srikar
Actually, a layer's nodes correspond to a weight matrix in linear algebra, so adding more nodes means a bigger matrix. With a bigger matrix, you have a more complex linear system for predicting the outcome. But sometimes the matrix is so large that it overfits the training set, so reducing its size helps decrease the overfitting, and vice versa.
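A quick NumPy illustration (toy numbers, not from the video) of the reply above: a dense layer taking n inputs into m nodes is an n-by-m weight matrix, so adding nodes directly grows the number of learnable parameters.

```python
import numpy as np

n_inputs = 10

# Weight matrices for a small and a large dense layer (values don't matter
# here; only the shapes do).
W_small = np.zeros((n_inputs, 8))   # 8-node layer  -> 10 x 8  =  80 weights
W_big   = np.zeros((n_inputs, 64))  # 64-node layer -> 10 x 64 = 640 weights

print(W_small.size, W_big.size)  # 80 640
```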
{
"question": "Which of the following is most indicative of underfitting?",
"choices": [
"A model is unable to classify data in the training set",
"A model is unable to classify data outside the training set",
"A model has input/output sizes that are too large",
"A model has input/output sizes that are too small"
],
"answer": "A model is unable to classify data in the training set",
"creator": "Chris",
"creationDate": "2019-12-11T05:02:20.900Z"
}
Thanks, Chris! Just added your question to deeplizard.com
Thank you very much!
You're welcome!
ECE 449 UofA
Actually, I did not get why underfitting happens. Can you explain it more?
If I understand correctly, it happens when the model can figure out the training data well but handles the unseen data with a low level of accuracy, and this can be detected by the error function when we run our model.
Hey mahmoud - Underfitting occurs when the model is not even able to fit the data that it was trained on. In other words, the model is not able to achieve high accuracy/low loss on the training data.
One reason underfitting can happen is that the model is too simple to fit the training data. In other words, the data needs a more complex model to fit it.
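A tiny NumPy demo (synthetic data, just my own sketch) of the "too simple" case described above: a straight line cannot fit parabolic training data no matter how long it trains, so its training error stays high. That stubbornly high training error is the signature of underfitting.

```python
import numpy as np

x = np.linspace(-1, 1, 50)
y = x ** 2  # clearly nonlinear training data

# Degree-1 model (a line): too simple for this data -> underfits.
line = np.polyval(np.polyfit(x, y, 1), x)
# Degree-2 model (a parabola): enough capacity -> fits almost exactly.
curve = np.polyval(np.polyfit(x, y, 2), x)

mse_line = np.mean((y - line) ** 2)    # large: can't even fit the training set
mse_curve = np.mean((y - curve) ** 2)  # near zero
```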
for the alg
I love your videos a lot and thanks a lot for making them. But can you please *please* change the intro and outro sound (noise) in the new videos (if you already have, please ignore this comment). They are creepy af.
Haha! Referring to the intro/outro as "creepy af" made me laugh :D Appreciate your feedback. We've changed it in later videos. What do you think of this: ua-cam.com/video/eEkpBnOd8Zk/v-deo.html
@@deeplizard lol. Thank you that's much better.
I actually like the creepy one LOL
👻
I'm afraid to watch this video at 4 am.
You saved my ass 🤭🤭