How to Select the Right Activation Function and Batch Size

  • Published 15 Jul 2024
  • Neural networks have a lot of knobs and buttons that you have to set correctly to get the best possible performance out of them. Although some hyperparameters are self-explanatory and easy to understand and choose (like the number of neurons in the input layer), many others are more complex in terms of how they affect the outcome of the model (e.g. the number of layers, the batch size, or the weight initialization).
    In this lesson, we will learn about the activation function, the batch size, and the number of epochs. We will look at the options for these hyperparameters and how they affect the performance of the network.
    Previous lesson: • Which Loss Function, O...
    Next lesson: • Weight Initialization ...
    📙 Here is a lesson notes booklet that summarizes everything you learn in this course in diagrams and visualizations. You can get it here 👉 misraturp.gumroad.com/l/fdl
    📕 NNs hyperparameters cheat sheet: www.soyouwanttobeadatascienti...
    👩‍💻 You can get access to all the code I develop in this course here: github.com/misraturp/Deep-lea...
    ❓To get the most out of the course, don't forget to answer the end of module questions:
    fishy-dessert-4fc.notion.site...
    👉 You can find the answers here:
    fishy-dessert-4fc.notion.site...
    RESOURCES:
    🏃‍♀️ Data Science Kick-starter mini-course: www.misraturp.com/courses/dat...
    🐼 Pandas cheat sheet: misraturp.gumroad.com/l/pandascs
    📥 Streamlit template (updated in 2023, now for $5): misraturp.gumroad.com/l/stemp
    📝 NNs hyperparameters cheat sheet: www.misraturp.com/nn-hyperpar...
    📙 Fundamentals of Deep Learning in 25 pages: misraturp.gumroad.com/l/fdl
    COURSES:
    👩‍💻 Hands-on Data Science: Complete your first portfolio project: www.misraturp.com/hods
    🌎 Website - misraturp.com/
    🐥 Twitter - / misraturp
  • Science & Technology
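As a rough illustration of the ideas this lesson covers (this is my own sketch, not code from the course repository), here is a small NumPy example of three common activation functions and of how the batch size determines the number of weight updates per epoch. The sample count and batch size below are made-up values for the example.

```python
import numpy as np

def relu(x):
    # ReLU: zero for negative inputs, identity for positive ones.
    # A common default for hidden layers.
    return np.maximum(0, x)

def sigmoid(x):
    # Sigmoid: squashes inputs into (0, 1).
    # Typical for binary-classification output layers.
    return 1 / (1 + np.exp(-x))

def tanh(x):
    # Tanh: squashes inputs into (-1, 1), zero-centered.
    return np.tanh(x)

x = np.array([-2.0, 0.0, 2.0])
print(relu(x))        # [0. 0. 2.]
print(sigmoid(0.0))   # 0.5
print(tanh(0.0))      # 0.0

# The batch size controls how many samples each gradient update sees,
# so it also fixes the number of updates the network makes per epoch.
n_samples, batch_size = 1000, 32   # hypothetical dataset size
updates_per_epoch = int(np.ceil(n_samples / batch_size))
print(updates_per_epoch)           # 32 updates (the last batch is partial)
```

Smaller batches mean noisier but more frequent updates; larger batches give smoother gradients at the cost of fewer updates per epoch and more memory.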

COMMENTS • 8

  • @SluggishPanda826
    @SluggishPanda826 1 year ago +1

    Dude, thank you so much. Your videos are short, crisp, and to the point, and you explain things with great graphical presentations. Thank you for your work. Keep it up!

    • @misraturp
      @misraturp  1 year ago

      Awesome to hear Prashant. You're very welcome :)

  • @mikelunibaso3143
    @mikelunibaso3143 1 year ago

    Great videos! Which activation function would be sensible to use for the output layer of an autoencoder?

  • @thewilltejeda
    @thewilltejeda 1 year ago

    You're great!

    • @misraturp
      @misraturp  1 year ago +2

      Thank you Will!

    • @thewilltejeda
      @thewilltejeda 1 year ago

      @@misraturp Will you be doing any in-depth GAN content? These videos have served as a nice refresher.

    • @misraturp
      @misraturp  1 year ago

      @@thewilltejeda That's great to hear, thanks :) As for GANs, I don't have that in the plan yet. Probably next year!