Weight Initialization Techniques in Neural Networks | How to Initialize Weights in a Deep Neural Network

  • Published 17 Sep 2024
  • #WeightInitialization #DeepLearning #UnfoldDataScience
    Hello,
    My name is Aman and I am a data scientist.
    About this video:
    In this video, I talk about weight initialization techniques in deep neural networks, covering the following points:
    1. How to initialize weights in a deep neural network
    2. Weight initialization techniques in neural networks
    3. He initialization in neural networks
    4. Xavier initialization in neural networks
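The He and Xavier schemes listed above can be sketched directly from their formulas. This is a minimal NumPy illustration, not code from the video; the function names and layer sizes are made up for the example. He initialization draws from a normal with std sqrt(2/fan_in) (suited to ReLU), while Xavier/Glorot uniform draws from [-limit, limit] with limit sqrt(6/(fan_in+fan_out)) (suited to tanh/sigmoid).

```python
import numpy as np

def he_init(fan_in, fan_out, rng=None):
    # He (Kaiming) normal: std = sqrt(2 / fan_in), good for ReLU layers
    rng = rng or np.random.default_rng(0)
    return rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_in, fan_out))

def xavier_init(fan_in, fan_out, rng=None):
    # Xavier (Glorot) uniform: limit = sqrt(6 / (fan_in + fan_out))
    rng = rng or np.random.default_rng(0)
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

# A hypothetical 256 -> 128 dense layer
W_he = he_init(256, 128)
W_xa = xavier_init(256, 128)
print(W_he.std())           # close to sqrt(2/256), roughly 0.088
print(np.abs(W_xa).max())   # never exceeds sqrt(6/384) = 0.125
```

In Keras these correspond to the built-in `he_normal` and `glorot_uniform` initializer strings, so in practice you would usually pass those rather than roll your own.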
    About Unfold Data science: This channel helps people understand the basics of data science through simple examples. Anybody without prior knowledge of computer programming, statistics, machine learning, or artificial intelligence can get a high-level understanding of data science through this channel. The videos are not very technical in nature, so they can be easily grasped by viewers from different backgrounds as well.
    Join Facebook group :
    www.facebook.c...
    Follow on medium : / amanrai77
    Follow on quora: www.quora.com/...
    Follow on twitter : @unfoldds
    Get connected on LinkedIn : / aman-kumar-b4881440
    Follow on Instagram : unfolddatascience
    Watch Introduction to Data Science full playlist here : • Data Science In 15 Min...
    Watch python for data science playlist here:
    • Python Basics For Data...
    Watch statistics and mathematics playlist here :
    • Measures of Central Te...
    Watch End to End Implementation of a simple machine learning model in Python here:
    • How Does Machine Learn...
    Learn Ensemble Model, Bagging and Boosting here:
    • Introduction to Ensemb...
    Access all my codes here:
    drive.google.c...
    Have a question for me? Ask me here : docs.google.co...
    My Music: www.bensound.c...

COMMENTS • 31

  • @sandipansarkar9211
    @sandipansarkar9211 3 years ago +1

    very nice explanation

  • @vijaybhatt8060
    @vijaybhatt8060 2 years ago +1

    wonderful

  • @harshyadav1428
    @harshyadav1428 3 months ago

    Sir, I want to build a GNN model for node classification using basics like loops. Please provide me the steps.

  • @sarthakgarg184
    @sarthakgarg184 4 years ago +1

    Awesome video. Please make a detailed mathematical video on Glorot initialisation also.

  • @Kumarsashi-qy8xh
    @Kumarsashi-qy8xh 4 years ago +1

    Thank you sir

  • @abhijithkrishnan6097
    @abhijithkrishnan6097 2 months ago

    Sir, a detailed video is needed.

  • @onkarchothe6732
    @onkarchothe6732 2 years ago

    Can you please explain neural networks with some real-life problems? That would make ANNs easy for everyone to understand.

  • @mohammedfahadkhan
    @mohammedfahadkhan 3 months ago

    Sir, what about sigmoid?

  • @JMI_PhD
    @JMI_PhD 2 years ago

    Sir, does the neural fitting app also use this theory, since it can initialise weights? Please reply, sir.

  • @apoorva3635
    @apoorva3635 3 years ago

    Does each neuron in the hidden layer take a different set of weights for each input?

    • @thetruereality2
      @thetruereality2 3 years ago

      This is a good question. Before answering, I should say that I am a beginner at this, and my answer is based solely on my understanding of neural networks.
      The answer is actually no. If you assigned a different set of weights for each input, it would first of all be impractical, and second it would be similar to overfitting the model. So what does the neural network do? When it receives a new input, it uses the same assigned weights to predict the output, checks that prediction against the target value, and uses backpropagation to fine-tune the weights and biases. Note that the weights and biases are not adjusted every time; they are only adjusted when the overall loss can still be reduced (gradient descent). The idea is not to make the neural network merely accurate, but more reliable and lower-variance; in other words, more robust.
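The point above, that one neuron keeps a single weight vector reused for every input and only fine-tunes it by gradient descent, can be sketched in a few lines of NumPy. This is an illustrative toy (a single linear neuron, made-up data), not code from the video.

```python
import numpy as np

rng = np.random.default_rng(42)

# One neuron = ONE weight vector, shared across every input example
w = rng.normal(0.0, 0.1, size=3)        # 3 input features -> 3 shared weights
b = 0.0

X = rng.normal(size=(20, 3))            # 20 different input examples
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w                          # targets from a known linear rule

lr = 0.1
for _ in range(300):
    pred = X @ w + b                    # the SAME w is applied to all inputs
    err = pred - y
    w -= lr * 2 * X.T @ err / len(X)    # gradient descent on MSE fine-tunes w
    b -= lr * 2 * err.mean()            # weights are updated, never re-drawn

print(np.round(w, 2))                   # converges toward [1.0, -2.0, 0.5]
```

Each update nudges the one shared weight vector toward the rule that fits all examples; nothing is stored per input.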

  • @sandeepladi5929
    @sandeepladi5929 4 years ago

    I want to perform a discrete wavelet transform using the db1 wavelet in a Conv2D layer, by initializing the weights of the convolution layers with db1 wavelet filter banks. Can you please help me with the Keras weight initialization code?

    • @UnfoldDataScience
      @UnfoldDataScience  4 years ago

      I did not get what you are explaining. Could you please write me an email describing what you are looking for?
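For what the question above seems to ask, here is a hedged sketch, not code from the video: db1 is the Haar wavelet, so its 2-D analysis filter bank is the four outer products of the low-pass and high-pass taps, stacked in the (kernel_h, kernel_w, in_channels, out_channels) layout that a Keras Conv2D kernel uses. The commented Keras wiring at the end is one assumed way to seed the layer.

```python
import numpy as np

# db1 (Haar) 1-D analysis taps
h = np.array([1.0, 1.0]) / np.sqrt(2.0)   # low-pass
g = np.array([1.0, -1.0]) / np.sqrt(2.0)  # high-pass

# Four separable 2-D filters in order LL, LH, HL, HH
filters = np.stack([np.outer(a, b) for a in (h, g) for b in (h, g)], axis=-1)

# Keras Conv2D kernel layout: (kh, kw, in_channels, out_channels)
kernel = filters[:, :, np.newaxis, :]
print(kernel.shape)  # (2, 2, 1, 4)

# One assumed way to seed a Conv2D with this array (stride 2 = one DWT level):
# layer = tf.keras.layers.Conv2D(
#     4, 2, strides=2, use_bias=False,
#     kernel_initializer=lambda shape, dtype=None: kernel.astype("float32"))
```

With stride 2 and these four fixed filters, one such layer computes the LL/LH/HL/HH subbands of a single-level 2-D Haar DWT.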

  • @tejaspatil3978
    @tejaspatil3978 1 year ago

    Sir, why can't we use random initialization techniques?

  • @ronylpatil
    @ronylpatil 3 years ago

    Does variance here mean difference? Am I right?

  • @riyazbagban9190
    @riyazbagban9190 1 year ago

    Did you write code on weight initialization, sir?

    • @UnfoldDataScience
      @UnfoldDataScience  1 year ago

      Hi Riyaz, no, I am using pre-written modules.

    • @riyazbagban9190
      @riyazbagban9190 1 year ago

      @@UnfoldDataScience please make a video of only 3 to 4 minutes just on weight initialisation/assignment

  • @Programming9131
    @Programming9131 1 year ago

    Sir, it would have been much better if you had explained in Hindi; if I could understand English, I would have just learned it from Google instead.

  • @kaisersayed9974
    @kaisersayed9974 4 years ago +1

    Sir, I have sent you a message as (sana) on LinkedIn. Please reply, sir.