Activation Functions In Neural Networks Explained | Deep Learning Tutorial

  • Published 2 Dec 2024

COMMENTS • 34

  • @_Anna_Nass_
    @_Anna_Nass_ 8 months ago +6

    OMG, you actually made this easy to understand. I can't believe it. The animations are so helpful. Thank you immensely!

  • @reireireireireireireireirei
    @reireireireireireireireirei 3 years ago +57

    Actuation functions.

  • @draziraphale
    @draziraphale 1 year ago +6

    These videos from Assembly AI are excellent. Distilled clarity

  • @FarizDarari
    @FarizDarari 3 months ago +1

    This video activates my understanding on activation functions!

  • @terrylee6904
    @terrylee6904 1 year ago +2

    Excellent Presentation.

  • @wagsman9999
    @wagsman9999 1 year ago +1

    Thank you. I am a little smarter now!

  • @igrok878
    @igrok878 2 years ago +3

    Thank you. Good pronunciation and good content.

  • @thepresistence5935
    @thepresistence5935 3 years ago +6

    Explained clearly

  • @deeplearningexplained
    @deeplearningexplained 4 months ago

    Really sharp tutorial!

  • @alpeshdongre8196
    @alpeshdongre8196 1 year ago +4

    🎯 Key Takeaways for quick navigation:
    01:35 🧠 *Activation functions are crucial in neural networks as they introduce non-linearity, enabling the model to learn complex patterns. Without them, the network becomes a stacked linear regression model.*
    02:43 🔄 *The sigmoid function, commonly used in the last layer for binary classification, outputs probabilities between 0 and 1. It squashes very negative inputs toward 0 and very positive inputs toward 1.*
    03:25 ⚖️ *Hyperbolic tangent, ranging from -1 to +1, is often chosen for hidden layers. ReLU (Rectified Linear Unit) is simple but effective, outputting the input for positive values and 0 for negative values, though it can leave neurons stuck at zero (the dying ReLU problem).*
    04:32 🔍 *Leaky ReLU is a modification of ReLU that prevents neurons from becoming "dead" during training by allowing a small output for negative inputs. Useful in hidden layers to avoid the dying ReLU problem.*
    05:13 🌐 *Softmax function is employed in the last layer for multi-class classification, converting raw inputs into probabilities. It's commonly used to determine the class with the highest probability.*
    Made with HARPA AI
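
The takeaways above can be tried directly in code. Below is a minimal NumPy sketch of the five activation functions mentioned (sigmoid, tanh, ReLU, Leaky ReLU, softmax); it is not taken from the video, and the Leaky ReLU slope alpha=0.01 is an assumed default.

    import numpy as np

    def sigmoid(x):
        # Squashes any real input into (0, 1); common in the last layer for binary classification.
        return 1.0 / (1.0 + np.exp(-x))

    def tanh(x):
        # Squashes inputs into (-1, 1); a frequent choice for hidden layers.
        return np.tanh(x)

    def relu(x):
        # Passes positive inputs through unchanged and zeroes out negatives.
        return np.maximum(0.0, x)

    def leaky_relu(x, alpha=0.01):
        # Like ReLU, but keeps a small slope for negative inputs so neurons
        # are less likely to "die" during training (alpha=0.01 is an assumed default).
        return np.where(x > 0, x, alpha * x)

    def softmax(x):
        # Turns a vector of raw scores into probabilities that sum to 1;
        # subtracting the max keeps the exponentials numerically stable.
        e = np.exp(x - np.max(x))
        return e / e.sum()

    z = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
    print("sigmoid:   ", sigmoid(z))
    print("tanh:      ", tanh(z))
    print("relu:      ", relu(z))
    print("leaky_relu:", leaky_relu(z))
    print("softmax:   ", softmax(z))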

  • @bernardoolisan1010
    @bernardoolisan1010 2 years ago +2

    Very good video!

  • @oberstoffer
    @oberstoffer 4 months ago

    wow ! really good explanation

  • @ianlinify
    @ianlinify 6 months ago

    Excellent explanation! Very easy to understand this complex concept through your clear presentation. By the way, it looks like in some cases we don't need to include an activation function in layers. Any explanation of why activation functions are sometimes not necessary?

  • @ifeoluwarutholonijolu6944
    @ifeoluwarutholonijolu6944 19 days ago

    Can the softmax be used for a regression response?

  • @canygard
    @canygard 9 months ago +2

    Why was the ReLU neuron so depressed?
    ...It kept getting negative feedback, and couldn't find any positive input in its life.

  • @narendrapratapsinghparmar91
    @narendrapratapsinghparmar91 10 months ago

    Thanks for this informative video

  • @joguns8257
    @joguns8257 1 year ago +1

    Superb introduction. Other videos have just been vague and hazy in approach.

  • @_dion_
    @_dion_ 5 months ago

    excellent.

  • @anurajms
    @anurajms 1 year ago

    thank you

  • @Rashad99990
    @Rashad99990 1 year ago

    We could apply an AI tool to this video to replace actuation with activation :D

  • @be_present_now
    @be_present_now 1 year ago

    Good video! One thing I want to point out is that the presenter talks too fast; a slower pace would make the video great!

  • @muskduh
    @muskduh 1 year ago

    thanks

  • @beypazariofficial
    @beypazariofficial 1 year ago

    nice

  • @DahBot-nr7rf
    @DahBot-nr7rf 5 months ago +1

    V can be W..

  • @valentinleguizamon9957
    @valentinleguizamon9957 7 months ago

    ❤❤❤❤

  • @sumanbhattacharjee7550
    @sumanbhattacharjee7550 11 months ago

    real life Sheldon Cooper

  • @JAYWRITE-h3e
    @JAYWRITE-h3e 5 months ago

    😊😊😊😊🎉🎉🎉🎉

  • @brianp9054
    @brianp9054 2 years ago +1

    It was said before, but it's worth the emphasis: ... 'actuation' function 🤣🤣🤣. Repeat after me, one, two, and three: A-C-T-I-V-A-T-I-O-N. Great, now keep doing it yourself until you stop saying actuation function...

    • @Huffman_Tree
      @Huffman_Tree 1 year ago +2

      Ok I'll give it a try: Activatizeron!