Epoch, Batch, Batch Size, & Iterations

  • Published 28 Oct 2024

COMMENTS • 58

  • @sekharsamanta6266
    @sekharsamanta6266 16 hours ago

    In 3 mins you covered 3 topics like a piece of cake. Thank You!

  • @MomSpaghetti
    @MomSpaghetti 6 months ago +1

    Best explanation, thank you!

  • @kanavsharma9562
    @kanavsharma9562 3 years ago +3

    I read multiple articles and watched videos to get the difference between epoch & batch; this is by far the best explanation. Subscribing now.

  • @gulbintacademy
    @gulbintacademy 1 year ago

    Ma Sha Allah
    Very effective method

  • @shahzadnomanmalik420
    @shahzadnomanmalik420 2 years ago

    Simply explained, great!

  • @Faridahjames
    @Faridahjames 2 years ago

    Thank you! I no longer feel mad

  • @stevenzayas5527
    @stevenzayas5527 2 years ago +9

    Thank you! Machine Learning topics are often so dense that material like this almost always gets glossed over without being properly explained. c:

  • @eyoo369
    @eyoo369 1 year ago +1

    Super helpful man, thanks a lot! Love the Activate Windows

  • @funrobloxvideos
    @funrobloxvideos 1 year ago +1

    Thank you, this helped me a lot. I love your content.

  • @sumaiyaafridi
    @sumaiyaafridi 1 year ago +1

    Thank you so much. Best!

  • @mohammadkashif750
    @mohammadkashif750 3 years ago +1

    Very helpful video

  • @helatbini6793
    @helatbini6793 1 year ago

    THANK YOU SO MUCH!!!!!!!!!!!!!!!!!!!

  • @carlossamperquinto2777
    @carlossamperquinto2777 1 year ago

    Great video!

  • @rahulgosavi5412
    @rahulgosavi5412 2 years ago

    Subscribed 👍👍👍👏👏👏👏

  • @leonelmessi3010
    @leonelmessi3010 1 year ago

    Damn, in just 3 mins you made me understand all 4 terms. Thanks, really good for my last-hour prep.

  • @malayawasthi
    @malayawasthi 3 years ago +4

    To the point!!!!!

  • @firstkaransingh
    @firstkaransingh 1 year ago

    Very well explained

  • @PythonArms
    @PythonArms 1 year ago +1

    Straight to the point and very clear. Thanks so much. Another subscriber!

  • @丅-k3r
    @丅-k3r 2 months ago

    Thank you!

  • @OpeLeke
    @OpeLeke 2 years ago

    Great video man!

  • @skylerx1301
    @skylerx1301 3 years ago +2

    This is soooooooo great, sir, thanks!!!

  • @mithunchandrasaha403
    @mithunchandrasaha403 3 years ago

    Nice explanation, Sir.

  • @bhavya2301
    @bhavya2301 3 years ago

    Thanks for the video. Brief and best video so far on these topics.

  • @ermiyasabate12
    @ermiyasabate12 1 year ago +4

    Seriously this is by far the best explanation! It is very simple and easy to understand! Thank you! Subscribing now!

  • @matthew456929
    @matthew456929 2 years ago

    Very good explanation. Thanks so much!

  • @katyhessni
    @katyhessni 1 year ago

    Thanks so much

  • @muhammedcansoy6131
    @muhammedcansoy6131 3 years ago +2

    thank you so much :)

  • @fombo3171
    @fombo3171 2 years ago

    thanks

  • @daniloyukihara2143
    @daniloyukihara2143 3 years ago +3

    Smooth explanation!
    I have a question. In my understanding, a model is adjusted on every iteration. When you say different weights are applied on each epoch, is it a secondary adjustment?

    • @reachDeepNeuron
      @reachDeepNeuron  3 years ago +2

      Each epoch adjusts the weights, but to complete each epoch the model has to read the entire training data in multiple iterations, since the amount of data is huge; updating the weights from just partial data is not, on its own, enough. Passing the entire dataset through a neural network once is not enough either; we need to pass the full dataset through the same neural network multiple times. So updating the weights with a single pass, i.e. one epoch, is not enough: one epoch leads to underfitting.
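
      A minimal sketch of that relationship in plain Python/NumPy (the dataset size, batch size, and model below are made up for illustration, not taken from the video). Each pass of the outer loop is one epoch; each pass of the inner loop is one iteration over one batch, and in the usual mini-batch setup the weights are nudged after every iteration:

        import numpy as np

        X = np.random.rand(2000, 10)   # 2000 hypothetical training samples, 10 features
        y = np.random.rand(2000, 1)
        w = np.zeros((10, 1))          # weights of a simple linear model
        batch_size, epochs, lr = 500, 10, 0.01

        iterations_per_epoch = int(np.ceil(len(X) / batch_size))   # 2000 / 500 = 4

        for epoch in range(epochs):                  # one epoch = one full pass over X
            for i in range(iterations_per_epoch):    # one iteration = one batch
                xb = X[i * batch_size:(i + 1) * batch_size]
                yb = y[i * batch_size:(i + 1) * batch_size]
                grad = 2 * xb.T @ (xb @ w - yb) / len(xb)   # gradient on this batch only
                w -= lr * grad                              # weights updated each iteration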

  • @JessyLi
    @JessyLi 2 years ago

    Love your simple and easy-to-understand example, thanks.

  • @sakshigaikwad8711
    @sakshigaikwad8711 1 year ago

    Thanks

  • @twinkle3j15
    @twinkle3j15 3 years ago

    super explanation

  • @ercoco1az
    @ercoco1az 3 years ago

    Thanks!

  • @samyeung122
    @samyeung122 3 years ago

    thanks for the concise explanation!

  • @mechanicalbaba2484
    @mechanicalbaba2484 3 years ago

    Nicely explained bro, thanks

  • @thebambooflute4870
    @thebambooflute4870 3 years ago

    Very good explanation and to the point, thanks

  • @isaac2163
    @isaac2163 1 year ago

    Hey there, great video. May I know the references for the explanation in your video at 2:19?

  • @Djchristian15
    @Djchristian15 3 years ago

    thanks... perfect explanation!

  • @cynthiacalixtro8923
    @cynthiacalixtro8923 3 years ago

    Thanks :)

  • @namansethi1767
    @namansethi1767 3 years ago

    Good

  • @AshuRane09
    @AshuRane09 3 years ago +1

    Does increasing epochs lead to overfitting?

    • @reachDeepNeuron
      @reachDeepNeuron  3 years ago +1

      Yup. Too few may result in an underfit model.
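
      In practice, a common way to balance the two is early stopping: train for a generous number of epochs, watch a held-out validation loss, and stop once it stops improving. A minimal, self-contained sketch (the loss values below are a toy curve standing in for a real model's validation loss):

        # toy validation-loss curve: improves for a while, then rises (overfitting)
        val_losses = [0.9, 0.7, 0.55, 0.5, 0.48, 0.49, 0.52, 0.55, 0.6]

        best_loss, patience, bad_epochs = float("inf"), 3, 0
        for epoch, val_loss in enumerate(val_losses):     # one entry per epoch
            # (a real loop would train for one epoch here, then measure val_loss)
            if val_loss < best_loss:
                best_loss, bad_epochs = val_loss, 0       # still improving
            else:
                bad_epochs += 1
                if bad_epochs >= patience:                # no improvement for 3 epochs
                    print(f"stopping early at epoch {epoch}")
                    break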

  • @Abhishek-ql3qu
    @Abhishek-ql3qu 1 year ago

    3:00 I think it should be 4 iterations, because one epoch is equal to one forward and backward pass. Please explain.
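
    Whether the count at 3:00 works out to 4 depends on the numbers shown in the video, but the general relationship is: iterations per epoch = dataset size / batch size, and one epoch is finished only after all of those iterations (one forward and backward pass per batch) have run. With made-up numbers:

      samples, batch_size = 2000, 500          # hypothetical figures, not from the video
      iterations_per_epoch = samples // batch_size
      print(iterations_per_epoch)              # -> 4 iterations to complete one epoch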

  • @harshavardhan6368
    @harshavardhan6368 1 year ago

    What is the intuition behind batch size being a hyperparameter? How will it influence the model?

    • @reachDeepNeuron
      @reachDeepNeuron  1 year ago

      Why do manufacturing units produce products in batches? I suggest you watch this one more time.

    • @harshavardhan6368
      @harshavardhan6368 1 year ago

      So it helps compute more efficiently, rather than using the entire set?

    • @reachDeepNeuron
      @reachDeepNeuron  1 year ago

      @@harshavardhan6368 You got it.

    • @harshavardhan6368
      @harshavardhan6368 1 year ago

      @@reachDeepNeuron Yeah, I understood it on the first go, but my question is whether it will help the model perform better in terms of accuracy.
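
      A small sketch of why batch size is treated as a hyperparameter (the sample count and candidate values below are illustrative): it is chosen by the practitioner rather than learned, and the choice trades the number of weight updates per epoch against how much data, memory, and gradient noise each update involves, which can affect both training speed and final accuracy:

        samples = 2000                              # hypothetical dataset size
        for batch_size in (32, 256, 2000):          # candidate hyperparameter values
            updates_per_epoch = -(-samples // batch_size)   # ceiling division
            print(f"batch_size={batch_size:5d} -> {updates_per_epoch:3d} updates per epoch")
        # smaller batches give more, noisier updates per epoch; larger batches give fewer,
        # smoother updates but need more memory per step, so it is tuned like any hyperparameter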

  • @karimfayed4517
    @karimfayed4517 3 years ago

    Love the video but I have one question:
    What is the difference between an iteration and a step?

  • @santoshkurasa8275
    @santoshkurasa8275 3 years ago

    So does an epoch mean one step of gradient descent for the neural network?

    • @reachDeepNeuron
      @reachDeepNeuron  3 years ago

      A training step is also called an iteration. An epoch indicates the number of passes over the entire training dataset that the machine learning algorithm has completed.

  • @nurkleblurker2482
    @nurkleblurker2482 2 years ago

    So an iteration is another word for batch?

    • @fidgetyrock4420
      @fidgetyrock4420 1 month ago

      I'd say the batch is the bag of contents, and the iteration is the cashier's registration of the bag, say in a shop, for example.

  • @techmusic701
    @techmusic701 3 years ago

    Thanks !