Error at 15:32
Multiple linear regression: Y = B0 + B1*x1 + B2*x2 + ... + BP*xp + E
Polynomial linear regression: Y = B0 + B1*x^1 + B2*x^2 + ... + BP*x^p + E
Right, I was confused here too! Because linear regression must always have degree 1, as opposed to what has been taught in the lecture; the equation cannot have a polynomial degree of 'p'.
No, linearity is based on the coefficients of the x terms in this case, not directly on the x terms.
Yes u r right
It's polynomial regression
If the degree of x is more than 1, it represents non-linearity.
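To make the distinction being discussed above concrete, here is a minimal sketch (assuming NumPy; the data and coefficients are made up for illustration). Both models are linear in the coefficients and are fit by the same least-squares solver; they differ only in how the feature columns are built.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100

# Multiple linear regression: p distinct features x1, x2, x3
X_multi = rng.normal(size=(n, 3))
y_multi = 2.0 + X_multi @ np.array([1.5, -0.7, 0.3]) + rng.normal(scale=0.1, size=n)
A_multi = np.hstack([np.ones((n, 1)), X_multi])          # prepend intercept column
beta_multi, *_ = np.linalg.lstsq(A_multi, y_multi, rcond=None)

# Polynomial regression: powers x, x^2, x^3 of a single feature
x = rng.normal(size=n)
X_poly = np.column_stack([x, x**2, x**3])
y_poly = 1.0 + 2.0 * x - 0.5 * x**2 + rng.normal(scale=0.1, size=n)
A_poly = np.hstack([np.ones((n, 1)), X_poly])
beta_poly, *_ = np.linalg.lstsq(A_poly, y_poly, rcond=None)

# Same solver in both cases, because both are linear in the coefficients beta.
print(beta_multi)   # roughly [2.0, 1.5, -0.7, 0.3]
print(beta_poly)    # roughly [1.0, 2.0, -0.5, 0.0]
```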
The best professor in machine learning. I like her teaching. I have followed her from 2010 onwards and collected her lectures on CDs since 2010. I like her very much.
Very well explained...........This is GOLD ❤
CORRECTION at 16:24
The equation should not have x raised to higher powers; that form is for polynomial regression.
She is a wonderful teacher; respect to you.
Having 1/2 as a multiplicative factor does not change the solution as what minimizes z also minimizes 1/2 z. 1/2 is usually added so that the derivative formula has a constant coefficient of 1.
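Concretely, for the usual squared-error objective (a sketch in standard notation; the symbols may differ slightly from the slides):

```latex
J(\theta) = \frac{1}{2}\sum_{i=1}^{n}\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)^2
\quad\Longrightarrow\quad
\frac{\partial J}{\partial \theta_j}
  = \sum_{i=1}^{n}\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)\, x_j^{(i)}
```

The 2 produced by differentiating the square cancels the 1/2, leaving a coefficient of 1 in the gradient.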
I am enjoying your courses. thanks
Machine Learning by Prof. Sudeshna Sarkar
Basics
1. Foundations of Machine Learning (ua-cam.com/video/BRMS3T11Cdw/v-deo.html)
2. Different Types of Learning (ua-cam.com/video/EWmCkVfPnJ8/v-deo.html)
3. Hypothesis Space and Inductive Bias (ua-cam.com/video/dYMCwxgl3vk/v-deo.html)
4. Evaluation and Cross-Validation (ua-cam.com/video/nYCAH8b5AQ0/v-deo.html)
5. Linear Regression (ua-cam.com/video/8PJ24SrQqy8/v-deo.html)
6. Introduction to Decision Trees (ua-cam.com/video/FuJVLsZYkuE/v-deo.html)
7. Learning Decision Trees (ua-cam.com/video/7SSAA1CE8Ng/v-deo.html)
8. Overfitting (ua-cam.com/video/y6SpA2Wuyt8/v-deo.html)
9. Python Exercise on Decision Tree and Linear Regression (ua-cam.com/video/lIBPIhB02_8/v-deo.html)
Recommendations and Similarity
10. k-Nearest Neighbours (ua-cam.com/video/PNglugooJUQ/v-deo.html)
11. Feature Selection (ua-cam.com/video/KTzXVnRlnw4/v-deo.html )
12. Feature Extraction (ua-cam.com/video/FwbXHY8KCUw/v-deo.html)
13. Collaborative Filtering (ua-cam.com/video/RVJV8VGa1ZY/v-deo.html)
14. Python Exercise on kNN and PCA (ua-cam.com/video/40B8D9OWUf0/v-deo.html)
Bayes
16. Bayesian Learning (ua-cam.com/video/E3l26bTdtxI/v-deo.html)
17. Naive Bayes (ua-cam.com/video/5WCkrDI7VCs/v-deo.html)
18. Bayesian Network (ua-cam.com/video/480a_2jRdK0/v-deo.html)
19. Python Exercise on Naive Bayes (ua-cam.com/video/XkU09vE56Sg/v-deo.html)
Logistic Regression and SVM
20. Logistic Regression (ua-cam.com/video/CE03E80wbRE/v-deo.html)
21. Introduction to Support Vector Machine (ua-cam.com/video/gidJbK1gXmA/v-deo.html)
22. The Dual Formulation (ua-cam.com/video/YOsrYl1JRrc/v-deo.html)
23. SVM Maximum Margin with Noise (ua-cam.com/video/WLhvjpoCPiY/v-deo.html)
24. Nonlinear SVM and Kernel Function (ua-cam.com/video/GcCG0PPV6cg/v-deo.html)
25. SVM Solution to the Dual Problem (ua-cam.com/video/Z0CtYBPR5sA/v-deo.html)
26. Python Exercise on SVM (ua-cam.com/video/w781X47Esj8/v-deo.html)
Neural Networks
27. Introduction to Neural Networks (ua-cam.com/video/zGQjh_JQZ7A/v-deo.html)
28. Multilayer Neural Network (ua-cam.com/video/hxpGzAb-pyc/v-deo.html)
29. Neural Network and Backpropagation Algorithm (ua-cam.com/video/T6WLIbOnkvQ/v-deo.html)
30. Deep Neural Network (ua-cam.com/video/pLPr4nJad4A/v-deo.html)
31. Python Exercise on Neural Networks (ua-cam.com/video/kTbY20xlrbA/v-deo.html)
Computational Learning Theory
32. Introduction to Computational Learning Theory (ua-cam.com/video/8hJ9V9-f2J8/v-deo.html)
33. Sample Complexity: Finite Hypothesis Space (ua-cam.com/video/nm4dYYP-SJs/v-deo.html)
34. VC Dimension (ua-cam.com/video/PVhhLKodQ7c/v-deo.html)
35. Introduction to Ensembles (ua-cam.com/video/nelJ3svz0_o/v-deo.html)
36. Bagging and Boosting (ua-cam.com/video/MRD67WgWonA/v-deo.html)
Clustering
37. Introduction to Clustering (ua-cam.com/video/CwjLMV52tzI/v-deo.html)
38. Kmeans Clustering (ua-cam.com/video/qg_M37WGKG8/v-deo.html)
39. Agglomerative Clustering (ua-cam.com/video/NCsHRMkDRE4/v-deo.html)
40. Python Exercise on Kmeans Clustering (ua-cam.com/video/qs7vES46Rq8/v-deo.html)
Tutorial I (ua-cam.com/video/uFydF-g-AJs/v-deo.html)
Tutorial II (ua-cam.com/video/M6HdKRu6Mrc/v-deo.html )
Tutorial III (ua-cam.com/video/Ui3h7xoE-AQ/v-deo.html)
Tutorial IV (ua-cam.com/video/3m7UJKxU-T8/v-deo.html)
Tutorial VI (ua-cam.com/video/b3Vm4zpGcJ4/v-deo.html)
Solution to Assignment 1 (ua-cam.com/video/qqlAeim0rKY/v-deo.html)
Great lecture
Maybe I am missing something here, but are these lessons meant to be a review or just an overview of what will be taught? Because it's hard to understand this when learning it for the first time without many examples.
You explain in detail the things that don't require explanation, and the things that do require explanation you skip, as if they don't need any explanation at all.
They copied from Andrew Ng's lectures.
How do you know ?
You can see this for clarification.
medium.com/@nicolabernini_63880/ml-what-is-the-difference-between-gradient-descent-and-stochastic-gradient-descent-be79ab450ef0
@@dipanjanbiswas4924 Is Andrew Ng the father of ML, or the people who write the papers?
@@subashchandrapakhrin3537 you can say that
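For anyone who does not want to follow the link above, here is a minimal sketch of the difference for linear regression (assuming NumPy; the data, learning rate, and iteration counts are invented for illustration): batch gradient descent computes the gradient over the whole dataset per update, while stochastic gradient descent updates after each individual example.

```python
import numpy as np

def batch_gd_step(theta, X, y, lr=0.05):
    """One batch gradient descent step: average gradient over all examples."""
    grad = X.T @ (X @ theta - y) / len(y)
    return theta - lr * grad

def sgd_step(theta, x_i, y_i, lr=0.05):
    """One stochastic gradient descent step: gradient of a single example."""
    return theta - lr * (x_i @ theta - y_i) * x_i

# Toy data: y = 1 + 2x, with an explicit bias column in X
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=200)
X = np.column_stack([np.ones_like(x), x])
y = 1.0 + 2.0 * x + rng.normal(scale=0.1, size=200)

theta_batch = np.zeros(2)
for _ in range(1000):                 # one update per pass over the whole dataset
    theta_batch = batch_gd_step(theta_batch, X, y)

theta_sgd = np.zeros(2)
for _ in range(10):                   # one update per individual example
    for i in rng.permutation(len(y)):
        theta_sgd = sgd_step(theta_sgd, X[i], y[i])

print(theta_batch, theta_sgd)         # both should end up close to [1, 2]
```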
Superb
The best professor!! I love your classes, thank you for your hard work.
Excellently covered the topic. Which textbook is the reference, ma'am?
Can anybody help me understand why we need to assume that the errors are independent of each other, have zero mean and some standard deviation, and are normally distributed? Please.
Hello Prof
The equations written on the blackboard are for polynomial regression, but the slides contain the equations for multivariate regression. Is this a mistake? If it is, please mention it in an annotation. If anyone knows the answer to my query, please respond freely.
Thanks
It was a mistake; she wrote the polynomial equation.
Ma'am: excellent concept clarification.
Why have you written the polynomial regression equation in place of multiple linear regression? This seems like a bad lecture. Not expected from IIT.
Yes I too got stuck at this point in the lecture and started doubting my own knowledge
Why is the objective function 1/2 of the sum of squared errors? If we have n data points it should be an average, so I would guess it is 1/n of the sum of squared errors.
It's for making the mathematics easier since you would have to differentiate the function later. (1/2 gets canceled with the 2 which you get from differentiation, making equations and stuff a lot cleaner.)
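Either constant would work; a sketch of the reasoning: scaling the objective by any positive constant leaves its minimizer unchanged, so 1/2, 1/n, or no factor at all give the same fitted parameters. The scaling only rescales the gradient, which gradient descent can absorb into the learning rate.

```latex
\arg\min_{\theta} \tfrac{1}{2}\sum_{i=1}^{n} e_i(\theta)^2
  \;=\; \arg\min_{\theta} \tfrac{1}{n}\sum_{i=1}^{n} e_i(\theta)^2
  \;=\; \arg\min_{\theta} \sum_{i=1}^{n} e_i(\theta)^2,
\qquad e_i(\theta) = h_\theta(x^{(i)}) - y^{(i)}
```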
I was good at linear regression, and after watching this lecture I forgot everything about regression. lol, ironic.
wtf lol
Nice Video How to use #Linear_Regression in #Machine_Learning
Bro, please help me. At 16:09 of this video she writes the equation of Y with squared terms for p independent variables, like in polynomial regression, but in the last section she says that for multiple variables the equation is a linear function, just like in multivariable regression. Are these p independent variables not multiple variables?
Sir/Ma'am, after completing this course can I get a certificate?
Please reply to me.
Gives a brief overview. Thanks for your efforts.
Error in the multiple linear regression formula; it should be y = b0 + b1x1 + b2x2 + ... + bpxp.
Please help me. At 16:09 of this video she writes the equation of Y with squared terms for p independent variables, like in polynomial regression, but in the last section she says that for multiple variables the equation is a linear function, just like in multivariable regression. Are these p independent variables not multiple variables?
Ma'am, you teach awesomely, but one thing I suggest: can you improve your blackboard and your camera so we can see clearly?
Kind of a confusing lecture - switching from the single-variable regression example to multi-variable. All the explanation is rushed. I was hoping the examples would be well explained. The 1/2 in the equation - is it for half theta? I heard this is not important. Compared to Stanford or MIT online lectures, a lot of improvement is needed.
for examples see tutorials
Polynomial regression is a form of regression analysis in which the relationship between the independent variable x and the dependent variable y is modelled as an nth-degree polynomial in x:
y = b0 + b1x + b2x^2 + ...
and you are calling that multiple independent variables, which relates to multiple linear regression, not polynomial regression.
This is indeed a mistake, I presume. It should be y = b0 + b1x1 + b2x2 + ... + bpxp instead of raising to the power.
@@ashwinprasad5180 Yes! It took me a day. I thought IIT KGP teachers must be right... then I found Andrew Ng's lectures and now it is all sorted. Thanks.
We started with the question of how to find the parameters, but never discussed that...
That is what the algorithm called gradient descent does, which she wrote at the end. It finds the parameters that reduce the loss function.
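As a concrete illustration of that, here is a minimal batch gradient descent sketch for fitting a straight line (assuming NumPy; the data, learning rate, and iteration count are made up for the example, not taken from the lecture):

```python
import numpy as np

# Toy data roughly following y = 3 + 2x
rng = np.random.default_rng(42)
x = rng.uniform(0, 10, size=50)
y = 3.0 + 2.0 * x + rng.normal(scale=0.5, size=50)

theta0, theta1 = 0.0, 0.0      # initial parameters
lr = 0.01                      # learning rate

for _ in range(5000):
    pred = theta0 + theta1 * x
    error = pred - y
    # Gradients of the averaged squared-error cost; the 1/2 or 1/n scaling
    # does not change the minimizer, only the step size.
    grad0 = error.mean()
    grad1 = (error * x).mean()
    theta0 -= lr * grad0
    theta1 -= lr * grad1

print(theta0, theta1)          # should end up close to 3 and 2
```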
Will we get a certificate after completing all the videos?
My professor lectured for fucking 3 hours and I understood nothing except that linear means straight. Here, in half an hour, I am really ready for the exam. Thanks.
Queen 👑
Amazing explanation
Please explain the concepts completely; do not leave them unfinished in between.
I think LMS is least mean square
Scope
Easy to interpret for those with a statistics background.
Ma'am, you rushed towards the end of the lecture. The theory is more important, as we have computers to do most of the calculations.
Copied the J(theta) formula from Andrew Ng's module and didn't update the variables.
Andrew Ng from Coursera?
@@harisankar6104 yes
Great lecture, ma'am. Thank you so much, and happy Teachers' Day. Please accept my respects.
28:12 wtf was that? It sounded alien-like and hilarious 😂
bro i was searching for this comment lmao !!!!
@@mitrabb4812 Dude, I am glad someone noticed that. It is insane.
yea man big LOL
To understand this video, I think people must know linear algebra. Only then can they understand this concept.
The name is *Linear* Regression my man
The equation of multi-variable linear regression is wrong.
Swaroop Singh Deval, yeah, I think she misinterpreted the subscripts as powers.
yes exactly
You are getting confused between Multivariate Linear Regression and Polynomial linear regression. The equation given here is not wrong. It is just a special case for multivariate, where every feature is a function of the 1st parameter. This is called as polynomial linear regression.
I'm very confused and lost with these lectures.
LMS stands for least mean squares, not least minimum slope.
Ma'am, can I get a certificate?
How is she explaining a non-linear equation as a linear one? The equation should be linear, but she has ended up with a non-linear one; the subscript notation has been written as a power. Minutes 16-19 of the video.
Directly skipped to the LMS algorithm without clearly explaining "how to learn the parameters". Poor explanations!
Maybe split this lecture into two. It got really rushed at the end.
Now I'm more confused; the explanation is not good. Can anyone share good videos for linear regression with gradient descent?
Coursera's Machine Learning by Stanford is good.
Ma'am, please review your lectures before publishing; poor explanation and an incorrect equation for multiple linear regression.
You are getting confused between Multivariate Linear Regression and Polynomial linear regression. The equation given here is not wrong. It is just a special case for multivariate, where every feature is a function of the 1st parameter. This is called as polynomial linear regression.
Worst explanation of gradient descent in the world
I didn't understand anything.
lol same with me
7:25
Things are not being clearly explained. It's really unclear and confusing... at least that example should have been worked through completely to understand the concept.
There are YouTube channels that provide better lectures or explanations in a simplified form than these IIT professors.
Too bad our Indian quality of education/teaching (or whatever you want to call it) needs to improve a lot!! :(
Please suggest some good YouTube channels.
@@sujitfulse8846 t series
@@sujitfulse8846 @brazzers
I'm not understanding it... madam is just re-explaining what is already in the PPT.
Sorry to say this, ma'am, but this is a wrong explanation of the gradient descent algorithm and cost function. This tutorial was good until the 25-minute mark, and after that it was total confusion.
Surprised to see that, even writing from a chit, the basic equation itself is wrong. The multiple linear regression is wrongly written. The assumptions are just copied and not explained.
NPTEL teaches us very badly...
A proper explanation should be provided; the teacher is just rushing without explaining the concepts. This is not good.
Is this called NPTEL? A waste of time. Whatever concepts need extra time, you just skip them. I won't watch NPTEL from now on. Why you are doing these NPTEL certifications, I don't know. Overall the title is not at all justified; we still don't know how to learn the straight line using linear regression. Don't watch and waste your time; find other resources.
Can't imagine that kids prepare from their 8th standard to get into IIT,
and after getting into IIT they will get this kind of lecture.
Sorry to say this, but these are poor explanations by IIT standards. LMS, batch gradient descent, and stochastic gradient descent would require more explanation.
Not a good explanation at all.