Multiple Linear Regression | Part 2 | Mathematical Formulation From Scratch
- Published 21 Jul 2024
- Dive into the mathematical foundation of Multiple Linear Regression in this second part of our series. We'll guide you through the formulation from scratch, making it easy to grasp the concepts behind this powerful regression technique. Build a solid foundation for your regression modeling skills.
============================
Do you want to learn from me?
Check my affordable mentorship program at: learnwith.campusx.in/s/store
============================
📱 Grow with us:
CampusX on LinkedIn: / campusx-official
CampusX on Instagram for daily tips: / campusx.official
My LinkedIn: / nitish-singh-03412789
Discord: / discord
E-mail us at support@campusx.in
⌚Time Stamps⌚
00:00 - Intro
00:30 - Types of Linear Regression
03:31 - Mathematical Formulation
26:45 - Detouring for some time
43:20 - Why Gradient Descent?
I don't think the mathematical explanation given in this video exists anywhere else on YouTube. I found it in the book "Mathematics for Machine Learning" by Marc Peter Deisenroth. This is simply brilliant. Although the matrix differentiation part is absent, this is still extraordinary stuff.
Yes, well said.. brilliantly explained :-)😍😍😍😇😇
Sir, I can tell you that literally no one's videos compare to your teaching. I have seen videos from Coding Ninjas and other paid lectures, but nobody has gone into this depth. I can literally see machine learning in front of my imagination. Thank you so much, Sir 🙏
Hi Sir.. you are truly a gem of a person, sharing such great knowledge for free.. it is a blessing for the new generation.. God bless you, sir.. You are our Guru from today..
Such a wonderful explanation, sir, really thanks a lot ♥️♥️♥️ you were able to explain something that hundreds of videos couldn't explain to me.
Literally, it was the greatest explanation that I have ever seen on YouTube. Hats off, Sir.
Most underrated channel on YouTube 🥲
A step-by-step series in great detail, superb.
This was just brilliant. You are an incredible teacher. Thank you.
I jumped when I understood the e^T·e concept. Thank you so much!!!
Man, top-class stuff. I had been trying to find this mathematical derivation on YouTube for many days.
Thank you for making such fantastic videos!
You are a Gem, Sir. Keep it up. Thank you!
Thank you very much for the explanation, sir. I searched the whole of YouTube to find this mathematical explanation!
How did you manage to study so much of this beforehand, bhaiya.....
This is beyond super, wow...
Eagerly waiting for next video😁😉. Thank you so much sir for this🙏❤️
Have you achieved your dream now? Please say yes.
Makes complex things so easy, i.e., CampusX.
Boss, you are great ❤️ I have been struggling for a month straight 🙃
This ML series is making me interested in maths of Machine Learning algorithms.
You are an amazing teacher. I have never seen such good explanations on YouTube. Love from Pakistan.
Superb content, easily my semester saviour at IIT Kanpur... Thanks Sir.
Beautiful. Luckily I knew it before, but awesome teaching skills.
Really amazing video sir.
love you sir, with all respect.
GOD LEVEL TEACHING SKILL 💡💡
Thank you for this sir!
^T -> transpose
y^T(XB) = (XB)^T y can be proved by taking XB = ŷ and writing both out with their matrices:
y = [y1 y2 ... yn]^T
ŷ = [ŷ1 ŷ2 ... ŷn]^T
Substituting these into the equation shows both sides equal the same scalar, Σ yi·ŷi.
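To make this concrete, here is a minimal numpy check (my own sketch, not code from the video) that y^T(XB) and (XB)^T y are the same scalar, which is why the two cross terms in e^T·e can be merged:

```python
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(size=(5, 1))   # targets, n x 1
X = rng.normal(size=(5, 3))   # features, n x m
B = rng.normal(size=(3, 1))   # coefficients, m x 1

lhs = y.T @ (X @ B)           # 1 x 1 scalar
rhs = (X @ B).T @ y           # 1 x 1 scalar
print(np.allclose(lhs, rhs))  # True: a 1x1 matrix equals its own transpose
```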
Thank You Sir.
Sir you are the best teacher
No doubt, sir, your teaching is fantastic. I am following your videos.
Sir, I have one doubt about the step where you do
X^T X B^T = y^T X
B^T = y^T X (X^T X)^-1
but shouldn't it be B^T = (X^T X)^-1 y^T X? Basically, the inverse term must be pre-multiplied, because we pre-multiply by the inverse to cancel it out on the left-hand side, and since matrix multiplication is not commutative we can't just write it on the other side.
Please clear my doubt.
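For anyone following this thread, here is a small numpy sketch (an illustration of mine, not from the video) showing that pre-multiplying the untransposed normal equation X^T X B = X^T y by (X^T X)^-1 gives B = (X^T X)^-1 X^T y, which matches a standard least-squares solver:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(20, 3))
y = rng.normal(size=(20, 1))

B_normal = np.linalg.inv(X.T @ X) @ X.T @ y      # closed-form OLS
B_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)  # reference solution
print(np.allclose(B_normal, B_lstsq))            # True
```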
Hello Sir, XGBoost is not included in the playlist. Could you please make a video on XGBoost?
34:18 Shouldn't its differentiation be equal to X^T y (the transpose of X times y) instead of y^T X (the transpose of y times X)?
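A quick numerical check of this point (my own sketch, using the usual denominator-layout convention, not code from the video): the gradient of f(B) = y^T X B with respect to B comes out as X^T y:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(6, 3))
y = rng.normal(size=(6, 1))
B = rng.normal(size=(3, 1))

def f(B_):
    return (y.T @ X @ B_).item()  # scalar-valued f(B) = y^T X B

eps = 1e-6
grad_numeric = np.zeros_like(B)
for i in range(B.shape[0]):
    dB = np.zeros_like(B)
    dB[i] = eps
    # central finite difference along the i-th coordinate
    grad_numeric[i] = (f(B + dB) - f(B - dB)) / (2 * eps)

print(np.allclose(grad_numeric, X.T @ y))  # True: the gradient is X^T y
```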
My master's professor can't explain things better than you! Thank you for making such awesome videos!
Thank you so much.
Sir... shouldn't we, after differentiation and reduction, be left with y^T = X^T B^T, which when transposed again gives y = XB, and therefore B = X^-1 y?
Wonderful sir
Boss, that was awesome.
Please make a detailed video on matrix differentiation.
Thank you sir 👍
Best explanation.
That's a very big statement you made today.
0:30 --> 2:45 Intuition behind MLR
43:20 --> 47:45 Why is gradient descent more effective compared to OLS?
Doubt, Sir ji 🙏
36:00
When you differentiate the matrix considering Y = A^T X A, your answer is 2XA^T, while the answer should be dY/dA = 2XA, not the transpose. Correct me if I am wrong, please.
Number one, sir!
36:00 Bhaiya, kindly upload a video on matrix differentiation.
Sir, actually I had a doubt: d/dA of A^T X A is 2XA, but you have written 2XA transpose. Can you explain it?
Hi Nitish sir, while deriving the error function we used differentiation to get the expression, but at the very beginning you said we don't use calculus for OLS while for gradient descent we do. Since we used differentiation in both, how is one closed-form and the other non-closed-form? I got the concept, but how do closed and non-closed form differ when we're doing differentiation in both of them?
Thanks for these videos.
He used calculus only to show how the OLS equation is derived from scratch. In OLS, the machine uses the final closed-form equation to calculate the best-fit line, but in gradient descent it uses calculus iteratively to reach the minimum point.
@spynom3070 thanks for this.
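The distinction in this thread can be seen in a few lines of numpy (an illustrative sketch of mine, not the video's code): the closed form solves for B in one step, while gradient descent iterates toward the same minimum:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(100, 2))
true_B = np.array([[2.0], [-1.0]])
y = X @ true_B + 0.01 * rng.normal(size=(100, 1))

# OLS closed form: one linear solve, no iteration.
B_ols = np.linalg.inv(X.T @ X) @ X.T @ y

# Gradient descent: repeated calculus-based steps on L(B) = ||y - XB||^2.
B_gd = np.zeros((2, 1))
lr = 0.01
for _ in range(2000):
    grad = -2 * X.T @ (y - X @ B_gd) / len(y)  # dL/dB, averaged over rows
    B_gd = B_gd - lr * grad

print(np.allclose(B_ols, B_gd, atol=1e-3))  # True: both reach the same B
```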
Sir, doesn't the loss function start with 1/2m?
Please upload a video on matrix differentiation, sir @campusx.
37:23 Sir, I have a doubt: if I multiply the LHS by (X^T X)^-1, I will be left with B^T on the LHS, so I also need to multiply the RHS by the same term, right? But if I do so, I get some other answer. Why is this, sir?
Awesome
@campusx Sir, it was Y(hat) = B0 + B1.X1 + B2.X2 + ...... + Bn.Xn, so how did the elements become different in the matrix form????
A maths guru as well as a machine learning guru.
@CampusX Can someone explain what would happen if the inverse doesn't exist for the matrix in the last step, (X^T X)^-1, i.e. if the determinant is 0?
Very nice question, but I don't know the answer either.
The reason is simple in the single-feature case. See, X is the matrix consisting of features. There are two possibilities for the non-existence of the inverse of X^T X: first, X is a null matrix and hence X^T is also a null matrix; second, X^T X is singular even though neither factor is individually null. You can skip the first possibility, because if the feature matrix is null nobody cares about the problem. Coming to the second: if X is (n x 1) and X^T is (1 x n), then X^T X is a (1 x 1) matrix. Even if some elements of X are negative, each element gets multiplied by itself (the i-th element of the row of X^T equals the i-th element of the column of X), so X^T X is a sum of squares and is strictly positive for any non-null X. Hence, for a single feature, the inverse of X^T X always exists. With several features, though, X^T X can still be singular: if the columns of X are linearly dependent (perfect multicollinearity), the determinant is 0 and a pseudo-inverse has to be used instead.
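A small counterexample sketch for the multi-feature case (mine, not from the video): with linearly dependent feature columns, X^T X does become singular, and the Moore-Penrose pseudo-inverse is the usual fallback:

```python
import numpy as np

x1 = np.arange(1.0, 6.0).reshape(-1, 1)
X = np.hstack([x1, 2 * x1])    # second feature = 2 * first: dependent columns
y = np.arange(5.0).reshape(-1, 1)

print(np.linalg.det(X.T @ X))  # ~0: X^T X is singular, inv() would fail
B = np.linalg.pinv(X.T @ X) @ X.T @ y  # pseudo-inverse still gives a solution
print(B)
```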
When will the matrix differentiation video come?
campusX > MIT
❤
Hello sir, do you have notes??
ily!
🎉🎉🎉❤
The video is awesome. I have a doubt though: at 37:35 you pre-multiply the inverse on the LHS but post-multiply it on the RHS. Isn't that wrong? Correct me if I am missing something.
Yes, correct.. the final value of beta should be B = (X^T X)^-1 (X^T Y).
@readbhagwatgeeta3810 thanks!
(A^T)^-1 = (A^-1)^T
Using this formula you can prove the last part:
[(X^T X)^-1]^T = (X^T X)^-1
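Both identities in this reply are easy to verify numerically; here is a quick numpy check (my own illustration, not from the video):

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.normal(size=(3, 3))
X = rng.normal(size=(10, 3))

# (A^T)^-1 equals (A^-1)^T
print(np.allclose(np.linalg.inv(A.T), np.linalg.inv(A).T))  # True

# (X^T X)^-1 is its own transpose, since X^T X is symmetric
XtX_inv = np.linalg.inv(X.T @ X)
print(np.allclose(XtX_inv, XtX_inv.T))                      # True
```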
(best best best best ......best)^best
Day 4
Date: 12/1/24
Sir, at 36:26 I think you used d/dA(A^T X A) = 2XA^T, but it's 2XA.... so I'm a little confused about the final result 🫤.. only this thing, else everything is great, sir... love your videos.
Actually, the differentiation is (X + X^T)A^T. If X = X^T, then it becomes 2XA^T.
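A numerical check of the identity discussed here (my own sketch; note the layout convention matters: denominator layout gives the column vector (X + X^T)A, which reduces to 2XA when X is symmetric, while numerator layout gives the transposed, row-vector form):

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(size=(3, 3))  # not symmetric in general
A = rng.normal(size=(3, 1))

def f(A_):
    return (A_.T @ X @ A_).item()  # scalar-valued A^T X A

eps = 1e-6
grad_numeric = np.zeros_like(A)
for i in range(A.shape[0]):
    dA = np.zeros_like(A)
    dA[i] = eps
    # central finite difference along the i-th coordinate
    grad_numeric[i] = (f(A + dA) - f(A - dA)) / (2 * eps)

print(np.allclose(grad_numeric, (X + X.T) @ A))  # True in denominator layout
```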
13:09 The matrix multiplication is wrong. When X is multiplied by B it should give a matrix with a single column, while the matrix described earlier has m columns. I guess Y hat (the matrix before decomposing) should be the sum of the columns collapsed into one column, so Y hat will be of order n x 1. Then it would be correct.
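A shape sanity check for this point (an illustrative sketch of mine, not from the video): with a leading column of ones for the intercept B0, X is n x (m+1), B is (m+1) x 1, and Y hat = XB comes out as a single n x 1 column:

```python
import numpy as np

n, m = 5, 3
rng = np.random.default_rng(6)
# first column of ones carries the intercept B0
X = np.hstack([np.ones((n, 1)), rng.normal(size=(n, m))])
B = np.arange(1.0, m + 2).reshape(-1, 1)  # B0..Bm stacked as a column

y_hat = X @ B
print(X.shape, B.shape, y_hat.shape)  # (5, 4) (4, 1) (5, 1)
```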