Underfitting in a Neural Network explained

  • Published 29 Nov 2024

COMMENTS • 40

  • @deeplizard
    @deeplizard  6 years ago +3

    Machine Learning / Deep Learning Tutorials for Programmers playlist: ua-cam.com/play/PLZbbT5o_s2xq7LwI2y8_QtvuXZedL6tQU.html
    Keras Machine Learning / Deep Learning Tutorial playlist: ua-cam.com/play/PLZbbT5o_s2xrwRnXk_yCPtnqqo4_u2YGL.html

  • @Mahdi-noori-ai
    @Mahdi-noori-ai 8 months ago +1

    I'm a deep learning programmer, and 6 years after this playlist was published, it's absolutely still insane 😍

  • @zahidullah1707
    @zahidullah1707 3 years ago +7

    Believe me, I have gone through hundreds of DL videos but couldn't understand them. You made DL so much easier; each video in the playlist gets easier instead of more complex.

  • @karelhof319
    @karelhof319 6 years ago +14

    Really great and underappreciated channel. I plan to watch the rest of your videos as well; this is a great explanation of a very difficult topic.

  • @tymothylim6550
    @tymothylim6550 3 years ago +2

    Thank you very much for this video! The animations really help me understand the various methods for tackling underfitting!

  • @michaelmuller136
    @michaelmuller136 6 years ago +5

    The point about dropout was helpful! Thanks!

  • @qusayhamad7243
    @qusayhamad7243 4 years ago +1

    Thank you very much for this clear and helpful explanation.

  • @guillaume8437
    @guillaume8437 2 years ago

    Thanks for this video, but concretely, what do you mean by:
    - Increase the number of layers in the model?
    - Increase the number of neurons in each layer?
    - Changing the type of layer we are using, and where?
    (Question from a newbie who is just starting to learn about ML, sorry)
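To make the first two options concrete, here is a small sketch (plain Python, hypothetical layer sizes; not code from the video). Adding layers or widening them increases the number of trainable parameters, which is what gives an underfitting model more capacity:

```python
def dense_param_count(layer_sizes):
    """Trainable parameters of a fully connected net: each layer
    contributes (inputs * outputs) weights plus outputs biases."""
    total = 0
    for n_in, n_out in zip(layer_sizes, layer_sizes[1:]):
        total += n_in * n_out + n_out
    return total

# A small model: 10 inputs -> one hidden layer of 16 -> 2 outputs
small = dense_param_count([10, 16, 2])       # (10*16+16) + (16*2+2) = 210

# "Increase the number of neurons": widen the hidden layer to 64
wider = dense_param_count([10, 64, 2])       # (10*64+64) + (64*2+2) = 834

# "Increase the number of layers": add a second hidden layer of 16
deeper = dense_param_count([10, 16, 16, 2])  # 210 + (16*16+16) = 482

print(small, wider, deeper)  # 210 834 482
```

The third option (changing the layer type, e.g. Dense vs. convolutional) changes how those parameters are shared rather than just how many there are.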

  • @kyla406
    @kyla406 3 years ago +1

    {
    "question": "When training a neural network, we can reduce underfitting by",
    "choices": [
    "decreasing dropout rate",
    "making small changes to the learning rate",
    "removing data from the training set",
    "increasing dropout rate"
    ],
    "answer": "decreasing dropout rate",
    "creator": "Kyla",
    "creationDate": "2021-04-17T02:41:56.637Z"
    }

    • @deeplizard
      @deeplizard  3 years ago

      Thanks, Kyla! Just added your question to deeplizard.com/learn/video/0h8lAm5Ki5g

  • @gourabsarker5491
    @gourabsarker5491 4 years ago +1

    Thank You Very Much for this!!! ❤️❤️
    Please, if you can, try to make a detailed video on Dropout!
    Thank you again! Best of luck!

  • @saluk7419
    @saluk7419 4 years ago +1

    {
    "question": "The problem of underfitting in a neural network can be tackled in all of the following ways except:",
    "choices": [
    "Increasing the amount of data through data augmentation",
    "Increasing the complexity of the neural network model",
    "Increasing the number of features that are used in the data",
    "Decreasing the rate of dropouts in layers that have them"
    ],
    "answer": "Increasing the amount of data through data augmentation",
    "creator": "saluk",
    "creationDate": "2020-08-27T15:49:02.774Z"
    }

    • @deeplizard
      @deeplizard  4 years ago

      Thanks, saluk! Just added your question to deeplizard.com/learn/video/0h8lAm5Ki5g :)

  • @fritz-c
    @fritz-c 4 years ago +1

    I spotted a slight typo in the article for this video:
    "improve it's accuracy"
    should be
    "improve its accuracy"

  • @aravindvenkateswaran5294
    @aravindvenkateswaran5294 3 years ago

    What is the point of the validation set if it doesn't use the dropout that we designed? Is it just a feature for debugging the underfitting problem?

  • @carlosmontesparra8548
    @carlosmontesparra8548 3 years ago +1

    What is the point of dropout when you can set the number of nodes in each layer? At first, the only difference I see is that dropout does not affect the validation set. Why would we prefer dropout over reducing the number of nodes?

    • @bradyhuang5606
      @bradyhuang5606 3 years ago +2

      Both the number of nodes and the dropout rate can be adjusted simultaneously. Dropout randomly drops a portion of the nodes in each epoch, so reducing the number of nodes doesn't have the same effect. For example, 0.5 dropout in a two-node layer may drop the first node in the first epoch and the second node in the second epoch.

    • @coldwind8327
      @coldwind8327 2 years ago

      @@bradyhuang5606 I needed that. Thank you.
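The difference discussed in this thread can be sketched in a few lines of NumPy (a toy illustration, not the video's code; real "inverted" dropout also rescales the kept activations by 1/(1 - rate), which is omitted here for clarity):

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_mask(n_nodes, rate, rng):
    """Each training step, every node is independently kept with
    probability (1 - rate); a fresh mask is drawn every step."""
    return (rng.random(n_nodes) >= rate).astype(float)

activations = np.ones(8)  # toy layer output: 8 nodes, all firing 1.0
rate = 0.5

step1 = activations * dropout_mask(8, rate, rng)
step2 = activations * dropout_mask(8, rate, rng)

# Different subsets of nodes are zeroed on different steps, whereas
# shrinking the layer would remove the SAME nodes permanently.
print(step1)
print(step2)
```

This per-step randomness is why dropout acts like training an ensemble of thinned networks, rather than simply training one smaller network.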

  • @keliletilki3086
    @keliletilki3086 3 years ago

    {
    "question": "Underfitting is simply unsuccessful training",
    "choices": [
    "True",
    "False",
    "Kinda",
    "Not at All"
    ],
    "answer": "True",
    "creator": "Tilkikelile",
    "creationDate": "2021-11-30T00:07:18.358Z"
    }

  • @roger_is_red
    @roger_is_red 4 years ago

    So how can a company like Tesla be sure that its neural net has enough capacity to learn Level 5 self-driving? In other words, are there ways to compute the learning capacity of a neural net?

  • @srikarvalluri8173
    @srikarvalluri8173 5 years ago +1

    Hi deeplizard,
    What do you mean when you talk about reducing complexity? I know that you're talking about decreasing the number of nodes in a layer, but how does that help with overfitting or underfitting? In fact, an even more essential question: how does the number of nodes in a layer help or hurt the fit to the data?
    Thanks for your videos btw.
    Srikar

    • @bradyhuang5606
      @bradyhuang5606 3 years ago

      Actually, a layer's nodes correspond to a weight matrix in linear algebra, so adding more nodes means a bigger matrix. With a bigger matrix, you have a more complex system for predicting the outcome. But sometimes the matrix is so large that it overfits the training set, so reducing its size helps decrease overfitting, and vice versa.
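The matrix-size point can be illustrated with a small NumPy sketch (made-up dimensions, not from the video): a dense layer's weights form an (inputs × nodes) matrix, so adding nodes directly grows the number of learnable weights:

```python
import numpy as np

rng = np.random.default_rng(1)

n_inputs = 4

# A layer with 3 nodes is a (4 x 3) weight matrix; with 8 nodes, (4 x 8).
W_small = rng.standard_normal((n_inputs, 3))
W_large = rng.standard_normal((n_inputs, 8))

x = rng.standard_normal(n_inputs)  # one toy input sample

# The layer's output is just a matrix-vector product (bias omitted).
out_small = x @ W_small  # 3 outputs -> 12 weights to learn
out_large = x @ W_large  # 8 outputs -> 32 weights to learn

print(out_small.shape, out_large.shape)  # (3,) (8,)
```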

  • @thespam8385
    @thespam8385 5 years ago

    {
    "question": "Which of the following is most indicative of underfitting?",
    "choices": [
    "A model is unable to classify data in the training set",
    "A model is unable to classify data outside the training set",
    "A model has input/output sizes that are too large",
    "A model has input/output sizes that are too small"
    ],
    "answer": "A model is unable to classify data in the training set",
    "creator": "Chris",
    "creationDate": "2019-12-11T05:02:20.900Z"
    }

    • @deeplizard
      @deeplizard  5 years ago

      Thanks, Chris! Just added your question to deeplizard.com

  • @justchill99902
    @justchill99902 5 years ago +1

    Vielen Dank! ("Many thanks!")

  • @demetriusdemarcusbartholom8063
    @demetriusdemarcusbartholom8063 2 years ago

    ECE 449 UofA

  • @-arabsoccer1553
    @-arabsoccer1553 6 years ago

    Actually, I did not get why underfitting happens. Can you explain it more?
    If I understand correctly, it happens when the model can figure out the training data well, and can figure out the untrained data but with a low level of accuracy, and this can be detected by the error function when we run our model.

    • @deeplizard
      @deeplizard  6 years ago

      Hey Mahmoud - Underfitting occurs when the model is not even able to fit the data it was trained on. In other words, the model is not able to achieve high accuracy/low loss on the training data.

    • @farhanrezagumay5730
      @farhanrezagumay5730 6 years ago

      One example of why underfitting can happen is that the model is too simple to fit the training data. In other words, the data needs a more complex model to fit it.
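A minimal NumPy illustration of "too simple to fit the training data" (my own sketch, not from the video): a purely linear model cannot fit XOR, so its accuracy is poor even on the very data it was trained on, which is the signature of underfitting:

```python
import numpy as np

# XOR: a classic dataset that no purely linear model can fit.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

# Fit a linear model (with a bias column) by least squares.
A = np.hstack([np.ones((4, 1)), X])
w, *_ = np.linalg.lstsq(A, y, rcond=None)

# Threshold at 0.5 (rounding first to avoid floating-point jitter
# exactly at the boundary); the best linear fit predicts 0.5 everywhere.
preds = (np.round(A @ w, 6) > 0.5).astype(float)
train_acc = (preds == y).mean()

# The model cannot reach high accuracy even on its own TRAINING data:
# that is underfitting. Adding a hidden layer (a more complex model)
# would let it fit XOR.
print(train_acc)  # 0.5 -- no better than chance
```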

  • @tobias7256
    @tobias7256 2 years ago +1

    for the alg

  • @Akshatgiri
    @Akshatgiri 6 years ago +3

    I love your videos a lot and thanks a lot for making them. But can you please *please* change the intro and outro sound (noise) in the new videos (if you already have, please ignore this comment). They are creepy af.

    • @deeplizard
      @deeplizard  6 years ago +4

      Haha! Referring to the intro/outro as "creepy af" made me laugh :D Appreciate your feedback. We've changed it in later videos. What do you think of this: ua-cam.com/video/eEkpBnOd8Zk/v-deo.html

    • @Akshatgiri
      @Akshatgiri 6 years ago +1

      @@deeplizard lol. Thank you that's much better.

    • @kefetDtcom
      @kefetDtcom 6 years ago +1

      I actually like the creepy one LOL

    • @deeplizard
      @deeplizard  6 years ago

      👻

    • @nisnabudas5497
      @nisnabudas5497 6 years ago +4

      I fear to watch this video at 4am.

  • @sachinkushwaha934
    @sachinkushwaha934 2 years ago

    You saved my ass 🤭🤭