Forward Propagation | How does a neural network predict an output?
- Published 4 Mar 2022
- Forward propagation is a fundamental step in how neural networks work. Input data passes through the network's layers, and each layer transforms its input using weights and an activation function, ultimately producing a prediction. Understanding this process is key to understanding how neural networks generate their outputs.
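The description above can be sketched in a few lines of NumPy. Layer sizes, the sigmoid activation, and the random weights below are illustrative assumptions, not taken from the video:

```python
# Minimal forward-propagation sketch (shapes and activation are assumed).
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# Input: a column vector with 4 features, shape (4, 1).
x = rng.normal(size=(4, 1))

# Weights and biases for an assumed 4 -> 3 -> 1 network.
W1, b1 = rng.normal(size=(3, 4)), np.zeros((3, 1))
W2, b2 = rng.normal(size=(1, 3)), np.zeros((1, 1))

# Forward pass: each layer applies weights, adds bias, then activates.
a1 = sigmoid(W1 @ x + b1)      # hidden-layer output, shape (3, 1)
y_hat = sigmoid(W2 @ a1 + b2)  # prediction, shape (1, 1)

print(y_hat.shape)  # (1, 1)
```

Each layer is just a matrix multiply, a bias add, and an element-wise activation; stacking layers chains these operations.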
============================
Do you want to learn from me?
Check out my affordable mentorship program at: learnwith.campusx.in
============================
📱 Grow with us:
CampusX's LinkedIn: / campusx-official
CampusX on Instagram for daily tips: / campusx.official
My LinkedIn: / nitish-singh-03412789
Discord: / discord
E-mail us at support@campusx.in
👍If you find this video helpful, consider giving it a thumbs up and subscribing for more educational videos on data science!
💭Share your thoughts, experiences, or questions in the comments below. I love hearing from you!
✨ Hashtags✨
#ForwardPropagation #NeuralNetworks #MachineLearningBasics #PredictionGeneration #DataScience #NeuralNetworkMechanism
Brother, I never even imagined that someone could explain forward propagation this easily. Sir, you are a legend! Thank you so much.
Sir, you are such a good teacher, and your voice is very soothing, which makes even one of the toughest subjects known to mankind easier to understand and helps us take on challenging concepts without any hesitation. Thank you so much for all of these efforts, Sir. Please consider this comment to apply to all your videos, because I am going to watch all of them and learn as much as I can from you. Thank you again, Sir.
This is the simplest forward propagation teaching/learning approach I've ever seen. Love you and thank you so much, sir 😇😇
Best Content Creator of ML/DL on the whole internet.😊
For a simple but awesome explanation!
loved it, love from Pakistan
Did you understand??
@@witcherhunt3779 What is "smaj" even supposed to mean?
Simply one word for this explanation. Awesome!! Better than some of the paid videos.
the playlist is a masterpiece
What an explanation, Sir Ji... mind-blowing... no one on this planet has explained it in such depth and detail 🙂
Had a great time understanding forward propagation.
VERY BEAUTIFULLY EXPLAINED. THANK YOU
Amazing explanation!
Awesome. Very well explained.
You explained this in a very easy manner, thanks a lot.
Wonderful explanation of a complex topic in simplified manner. Loved it.
What an explanation; unparalleled, my friend.
Great video 🙏
Thanks for this awesome video ❤
Perfect explanation.
Love you, sir. We are lucky that you are teaching us data science; otherwise, lots of students would not have understood it with this much clarity. I don't have any other words to express my gratitude except thank you, thank you a lot, sir... 💌
Thank you so much sir for such a nice explanation.
best lecture and best teacher in deep learning
Completed: 29 May 2024
Before watching the next video, I will revise all the things and also try to understand MLP intuition by drawing another neural network and solving for the output.
That was great fun, Sir ji, and you explain very well.
Sir, what can I even say? Just amazing. Keep it up, sir; so simple and crisp.
One of the best explanations.
Brilliantly explained
You are the great sir 🙏
brilliant explanation...
God level explanation!
Superb explanation!!!!!!!!!!!!!!!!
U nailed it boss
amazing content
living legend
outstanding video
Great video, sir ji.
What an explanation, sir. I mean, wow.
Superb!
This is absolutely brilliant. I have watched tutorials by so many data science experts, both Indian and foreign, but none came this close to explaining the fundamentals of neural networks so well. Hats off
Brother, you should just quit studying.
@@shubhamagrahari9745 What's your problem?
@@kisholoymukherjee What will studying even achieve, brother? It's not like it will get you a job.
I'm not worried about getting a job; if you are, tell me and I'll guide you. I already have one, a good one at that; I'm just upskilling for something better.
Excellent 🙏
Awesome video... My request: please show the pencil movement so that we can follow the flow of the concept.
If anyone makes lecture notes for this series, please share them with all the deep learning enthusiasts!
What an Explanation
Thanks!
this is soo easyy !!
Thanks for explaining it in a simple way.
Here is another important resource: d2l.ai (written as "d2l dot ai" to get past the link filter).
very Nice bhaiya
Amazing!!!!!!!!!!!!!!!!!!!!!!!!
thank you so much legend
Thanks 🙏
Awesome!!!!!!!!!!!!!!!!!!!!!!
clearly explained
Superb
Thanks sir ji
Thank u so much ....🤗
clear content
Sir, may I ask a question? This works for columnar data, but if the input is a satellite image, how do we get the output?
First off , Thank you sir!
I had a question: what about a scenario where the hidden layer dimensions aren't reducing? What if all the hidden layers have the same dimension? How will the math change there, since the output matrix won't be shrinking in size?
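For what it's worth, a quick NumPy sketch (layer widths and weights here are assumptions, not from the video) shows that same-width hidden layers just mean square weight matrices; the forward-pass math is otherwise unchanged:

```python
# Same-width hidden layers: each hidden weight matrix is square,
# so the layer outputs keep the same shape until the output layer.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
x = rng.normal(size=(4, 1))   # 4 input features, column vector

W1 = rng.normal(size=(4, 4))  # hidden layer 1: 4 -> 4
W2 = rng.normal(size=(4, 4))  # hidden layer 2: 4 -> 4 (same width)
W3 = rng.normal(size=(1, 4))  # output layer:   4 -> 1

a1 = sigmoid(W1 @ x)   # shape (4, 1)
a2 = sigmoid(W2 @ a1)  # shape (4, 1) -- width unchanged
y  = sigmoid(W3 @ a2)  # shape (1, 1)
```

Only the output layer's weight matrix shrinks the dimension; equal-width hidden layers are perfectly valid.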
thanks!
Can you please tell me why you are using sigmoid in the hidden layers? Usually we don't use the sigmoid activation function in hidden layers for binary classification tasks; we use ReLU instead... right?
thanks 🙏
superb
Awesome
Sir, I have a question. You multiplied the first row of the weights with the input; why, sir, when there are other rows there too? I am building a very complex neural network, sir, which also has many input columns; yours had only one row but mine has many. How do I multiply now?
sir big fan ...Love from haldia...batch6
Sir! Please make one video on neural architecture search.
finished watching
Nice
Sir, please tell me: is the reinforcement learning playlist complete? I am going to start it.
Very 👍
In the MLP notation video, you denoted it as W^1_41, but here it appears as W^4_41. I am confused now; can you please tell me which one is correct?
Sir, please suggest a book for deep learning.
Thanks ❣️
But the matrix multiplication won't work like this. Can you please check it once, sir?
Can someone explain why we are transposing the matrix? Doesn't transposing it just to make the multiplication valid change the answer?
The weight matrix is stored as a simple row-by-column matrix, but in the algorithm 4 inputs feed into one perceptron, so the transpose of the matrix is used.
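A small NumPy sketch of this point (the shapes here are illustrative assumptions): whether a transpose appears depends only on how the weights are stored, and both conventions produce exactly the same numbers:

```python
# Transpose is a storage-convention artifact, not extra math.
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=(4, 1))   # 4 inputs, as a column vector

# Convention A: one COLUMN per neuron -> W has shape (4, 3).
# Each neuron's pre-activation then needs W.T @ x.
W_cols = rng.normal(size=(4, 3))
z_a = W_cols.T @ x            # shape (3, 1)

# Convention B: one ROW per neuron -> W has shape (3, 4).
# Then W @ x works directly, with no transpose.
W_rows = W_cols.T
z_b = W_rows @ x              # shape (3, 1)

print(np.allclose(z_a, z_b))  # True: same numbers either way
```

So transposing to make the multiplication valid does not make the answer wrong; it only reconciles the storage layout with the matrix-multiplication rule (inner dimensions must match).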
Why did you transpose? Could it have been done without the transpose?
11:05 --> 14:45: How can a complex neural network be solved so simply with linear algebra 😊?
Sir, why is the input a 4×1 matrix (4 rows and one column) and not a 1×4 matrix?
The way he explains is hilarious.
Can you explain this network in MS EXCEL please?
Sir, please upload all the videos.
white background 😟
Sir, is there a Telegram group?
What is the logic of transposing the W here?
So that the matrix multiplication becomes dimensionally valid.
I think the weight notation is wrong.
I am dumbstruck .
Nice