Overfitting in a Neural Network explained

  • Published 29 Nov 2024

COMMENTS • 62

  • @deeplizard
    @deeplizard  6 years ago +2

    Machine Learning / Deep Learning Tutorials for Programmers playlist: ua-cam.com/play/PLZbbT5o_s2xq7LwI2y8_QtvuXZedL6tQU.html
    Keras Machine Learning / Deep Learning Tutorial playlist: ua-cam.com/play/PLZbbT5o_s2xrwRnXk_yCPtnqqo4_u2YGL.html

  • @justchill99902
    @justchill99902 5 years ago +15

    The explanation couldn't have been better! You nailed it! Thank you.

  • @ForcefighterX2
    @ForcefighterX2 6 months ago +1

    Best video on this topic I have found so far - thanks! 🙂

  • @tymothylim6550
    @tymothylim6550 3 years ago +2

    Thank you very much for this video! The explanation of what overfitting is, as well as the possible methods to tackle it, really helps!

  • @bassdewd
    @bassdewd 5 years ago +6

    Thank you so so much! Your videos are so short, concise and relatable.

    • @bassdewd
      @bassdewd 5 years ago +1

      Thanks to you I graduated! 🎓 @deeplizard

  • @johnsammut5961
    @johnsammut5961 3 years ago +2

    Fantastic! I finally understand the fundamentals of DL.

  • @SB-gm6zx
    @SB-gm6zx 5 years ago +1

    So comprehensive, yet short and to the point. Thanks so much for taking the time to make these videos.

  • @zakariaamiri1090
    @zakariaamiri1090 6 years ago +7

    Thank you so much, that was really clear, and very helpful!

    • @deeplizard
      @deeplizard  6 years ago

      You're welcome, Zakaria! Glad the video was helpful!

  • @amoghjain
    @amoghjain 3 years ago

    Thank you so much for sharing your knowledge and making these videos! This is the best course so far to learn the fundamentals! (I haven't tried advanced courses, but now I have high expectations.) Always 10000% to the point, and no words wasted.

    • @deeplizard
      @deeplizard  3 years ago

      You're so welcome, Amogh! After completing the fundamentals course, you'll be set to move on to the various other deep learning courses available on deeplizard.com. Browse the site home page to get an overview of the courses and see the recommended order in which to take them.

  • @annakozio9220
    @annakozio9220 2 years ago +1

    Wow, the explanation is so clear! Thank you :)

  • @Darieee
    @Darieee 5 years ago +1

    Beautiful video! Thanks!

  • @anugrahps6463
    @anugrahps6463 4 years ago +1

    Great explanation, ma'am, thank you, it really helped me out. I was not able to answer a question in the Nvidia DLI course. Thank you, ma'am, keep going. Whenever possible I'll contribute to your Patreon, it's that good...

  • @qusayhamad7243
    @qusayhamad7243 4 years ago

    Thank you very much for this clear and helpful explanation.

  • @vi__ku4748
    @vi__ku4748 5 years ago +1

    Nice explanation, thanks for your video.

  • @VivekSingh-rl1rv
    @VivekSingh-rl1rv 4 years ago

    Awesome teaching

  • @jt099
    @jt099 4 years ago +1

    Love this, very useful!

  • @saranshtayal2526
    @saranshtayal2526 4 years ago

    Best channel ever.

  • @geezer2450
    @geezer2450 6 years ago +1

    Thanks! Very helpful.

  • @roger_is_red
    @roger_is_red 4 years ago +1

    Hi, thanks! How about adding noise, does this help with overfitting?

    • @deeplizard
      @deeplizard  4 years ago +1

      Perhaps, but this would only be useful if the data that you would be using in production with your model were inherently noisy.
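
      As a minimal sketch of that idea, assuming a Keras model (the 20-feature input, layer sizes, and 0.1 noise level are all illustrative, not from the video):

      from tensorflow.keras.models import Sequential
      from tensorflow.keras.layers import Input, GaussianNoise, Dense

      # GaussianNoise adds zero-mean noise to its inputs during training
      # only; at inference time the layer passes data through unchanged.
      model = Sequential([
          Input(shape=(20,)),
          GaussianNoise(0.1),           # stddev of the injected noise
          Dense(32, activation='relu'),
          Dense(2, activation='softmax'),
      ])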

  • @abdelhamidelwahabi3578
    @abdelhamidelwahabi3578 3 years ago +1

    Just thanks a lot.

  • @sridharm6865
    @sridharm6865 5 years ago +1

    Awesome explanation.

  • @hiroshiperera7107
    @hiroshiperera7107 6 years ago +1

    Well explained. Thanks a lot :)

  • @SlideTackles
    @SlideTackles 4 years ago +1

    I was working on a text classification problem with an ANN where I encountered an overfitting problem. My model had 99 percent accuracy on the training set and just 75 percent on the test set. After performing parameter tuning (setting dropout, the number of epochs, weight transforms, etc.), training accuracy was reduced to 94-95 percent and testing accuracy rose to 78. I was able to reduce overfitting to a small extent, but what more can I do to improve it? Please help.
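
    One commonly suggested next step, sketched here as a hedged example rather than a verified fix for this data: add L2 weight regularization on top of the dropout already in use. The layer sizes, the 10,000-feature bag-of-words input, and the 0.01 penalty are all illustrative:

    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import Input, Dense, Dropout
    from tensorflow.keras.regularizers import l2

    model = Sequential([
        Input(shape=(10000,)),  # e.g. bag-of-words features (illustrative)
        # The L2 penalty shrinks weights toward zero, discouraging the
        # network from memorizing idiosyncrasies of the training text.
        Dense(64, activation='relu', kernel_regularizer=l2(0.01)),
        Dropout(0.5),
        Dense(1, activation='sigmoid'),
    ])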

  • @ajay0nz
    @ajay0nz 6 years ago +1

    Really well explained!

  • @nisnabudas5497
    @nisnabudas5497 6 years ago +1

    If the validation metrics are considerably worse than the training metrics, then that is an indication that our model is overfitting.
    How do we know this from looking at the metrics? How do you know in this video?

    • @deeplizard
      @deeplizard  6 years ago +1

      You can compare the validation accuracy to the training accuracy. If the validation accuracy is considerably lower than the training accuracy, then that indicates overfitting.
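
      For example, a minimal sketch of that comparison, assuming a model compiled with metrics=['accuracy'] and placeholder arrays x_train, y_train, x_val, y_val (the metric names follow recent Keras defaults):

      history = model.fit(x_train, y_train,
                          validation_data=(x_val, y_val),
                          epochs=20)

      # A persistent, sizable gap between the two values suggests overfitting.
      print('train accuracy:     ', history.history['accuracy'][-1])
      print('validation accuracy:', history.history['val_accuracy'][-1])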

    • @shreejanshrestha1931
      @shreejanshrestha1931 4 years ago

      @@deeplizard I have a similar question. Looking at the output of model.fit, how can we say it is actually overfitting, or is it okay to have a certain amount of up and down? If we have 85% accuracy in training and 82% in validation, do you consider that overfit or normal?
      Can you give some more suggestions on how to analyze model performance based on training and validation? What should we look at, and by how much?
      Waiting for your reply :)

    • @carlosmontesparra8548
      @carlosmontesparra8548 3 years ago

      @@deeplizard And how do you estimate whether or not it is "considerably" lower? Is there a criterion or test? Because it sounds like it would just be a personal judgment. Thanks in advance!

  • @simon5771
    @simon5771 6 years ago +4

    Early stopping also helps! :)
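
    A minimal sketch of that idea, assuming a compiled Keras model and placeholder training arrays x_train and y_train (the patience value is illustrative, not from the video):

    from tensorflow.keras.callbacks import EarlyStopping

    # Stop training once validation loss has not improved for 3 straight
    # epochs, and roll back to the best weights seen so far.
    early_stop = EarlyStopping(monitor='val_loss', patience=3,
                               restore_best_weights=True)

    model.fit(x_train, y_train, validation_split=0.2,
              epochs=100, callbacks=[early_stop])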

  • @glebmokeev6312
    @glebmokeev6312 1 year ago

    Thank you for making these videos! But is dropout still popular nowadays?

  • @tushar_g7796
    @tushar_g7796 6 years ago +1

    Thanks a lot

  • @mohammadakbarizadeh9705
    @mohammadakbarizadeh9705 6 years ago +1

    Sooooooooooooooooo great video

  • @VinayGharge
    @VinayGharge 5 years ago

    How can reducing the number of layers or the number of neurons help in reducing the overfitting of the model?

  • @legendgamer-pm4dk
    @legendgamer-pm4dk 6 years ago +1

    How do we use data augmentation for text data?

  • @naingminhtun3876
    @naingminhtun3876 6 years ago +1

    Thanks very much. Love you :)

  • @IndranilTalukdarindia
    @IndranilTalukdarindia 6 years ago +1

    Your videos are so good. Thank you very much ❤️... I want to donate.

    • @deeplizard
      @deeplizard  6 years ago +1

      You're welcome, Indranil! And thank you!
      We have a Patreon set up for anyone who'd like to contribute:
      deeplizard.com/hivemind

  • @thespam8385
    @thespam8385 5 years ago

    {
      "question": "All of the following can help reduce overfitting EXCEPT:",
      "choices": [
        "Greater availability of labels",
        "Increased data diversity",
        "Data augmentation",
        "Reduced model complexity"
      ],
      "answer": "Greater availability of labels",
      "creator": "Chris",
      "creationDate": "2019-12-11T04:46:40.163Z"
    }

    • @deeplizard
      @deeplizard  5 years ago

      Thanks, Chris! Just added your question to deeplizard.com

    • @-dialecticsforkids2978
      @-dialecticsforkids2978 4 years ago

      Well, it's not mentioned, but wouldn't greater availability of labels reduce overfitting? Even though it would most likely reduce the accuracy either way...

    • @thespam8385
      @thespam8385 4 years ago +1

      @@-dialecticsforkids2978 Looking back, I'm not totally sure what I meant when I said, "Greater availability of labels" haha. In this case, as is explored in "Supervised Learning Explained," we're examining supervised learning, so labels are available in every instance of the training data. Really, "Greater availability of labels" is just a nonsensical answer because availability cannot increase. If we think of it as an increased _number_ of labels, however, which is what I gather from your answer, then yes, it would help reduce overfitting because that is a subset of increased data diversity.

  • @world_posts
    @world_posts 3 years ago

    Nice

  • @demetriusdemarcusbartholom8063
    @demetriusdemarcusbartholom8063 2 years ago

    ECE 449 UofA

  • @ahmedhusham7728
    @ahmedhusham7728 4 years ago

    Nice video, but you need to decrease the speech speed a little bit.

    • @deeplizard
      @deeplizard  4 years ago

      I have in later videos :)
      Also, each video has a corresponding written blog on deeplizard.com that you can check out for a slower pace :D

    • @carlosmontesparra8548
      @carlosmontesparra8548 3 years ago +1

      You can press SHIFT + < to slow it down.

    • @ahmedhusham7728
      @ahmedhusham7728 3 years ago

      @@carlosmontesparra8548 Thanks but it didn't work

  • @tobias7256
    @tobias7256 2 years ago +1

    for the alg

  • @TheAmeer3881
    @TheAmeer3881 5 years ago

    Hello all,
    Would appreciate some eyes on this:
    stackoverflow.com/questions/59129348/how-to-deal-with-overfitting-with-simple-x-y-data-in-mlpregressor
    Just trying to figure out what I am doing wrong with a simple X,Y neural net; should be no problem for you guys. Thanks, everyone, and thank you deeplizard for the video.

  • @yatinarora9650
    @yatinarora9650 5 years ago

    Still not getting the essence of overfitting

  • @zx3215
    @zx3215 5 years ago

    Overfitting is when you get used to "like this video and subscribe to the channel" at the end of all YouTube videos, and then when you don't hear it, you think it's not a YouTube video :)