Wang Zhiyang
Joined 23 May 2014
SIAT-CIBS ROBOT Team Show
A short video introducing the SIAT-CIBS ROBOT team for its IROS competition registration.
Views: 70
Videos
Andrew Ng Logistic Regression Newton's Method I
16K views · 10 years ago
This set of videos comes from Andrew Ng's courses on Stanford OpenClassroom at openclassroom.stanford.edu/MainFolder/HomePage.php. OpenClassroom is the predecessor of the well-known MOOC platform Coursera; however, some of these videos, e.g. Newton's Method and Naive Bayes, are not included in the Coursera Machine Learning course. We selected some of them to share with you.
Andrew Ng Logistic Regression Newton's Method II
8K views · 10 years ago
Andrew Ng Logistic Regression Gradient Descent vs Newton's Method
8K views · 10 years ago
Andrew Ng Naive Bayes Generative Learning Algorithms
85K views · 10 years ago
Andrew Ng Naive Bayes Text Classification
36K views · 10 years ago
Andrew Ng Regularization Common Variations
713 views · 10 years ago
Are you sure it is legal to share this video on YouTube??
Which course is this? I haven't seen this course.
That is so clear and helpful!
<3
Don't know how to thank Andrew for this <3 <3
The title is misleading; Naive Bayes isn't included in the video.
5:14 "...take the most frequent N-words"
LOL He had to clarify that directly afterwards
how do we get P(x|y)? thanks
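As a side note on that question, here is a minimal sketch (not from the video) of how P(x|y) is commonly estimated in a multinomial Naive Bayes text classifier: per-class word-frequency counts with Laplace smoothing. The toy counts and the function name below are made up for illustration.

```python
import numpy as np

# Toy word-count matrix: rows = documents, columns = vocabulary words.
# Labels: 1 = spam, 0 = not spam. All numbers here are made up.
X = np.array([[2, 1, 0],
              [3, 0, 1],
              [0, 2, 3],
              [1, 0, 4]])
y = np.array([1, 1, 0, 0])

def estimate_p_x_given_y(X, y, label, alpha=1.0):
    """Estimate P(word | y=label) for a multinomial Naive Bayes model
    from word-frequency counts, with Laplace (add-alpha) smoothing."""
    counts = X[y == label].sum(axis=0)   # total count of each word within the class
    return (counts + alpha) / (counts.sum() + alpha * X.shape[1])

print(estimate_p_x_given_y(X, y, label=1))   # P(each word | spam)
print(estimate_p_x_given_y(X, y, label=0))   # P(each word | not spam)
```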
The audio could be louder and clearer, but the explanation is good.
very clear concise and simple explanation
good video!
Thank you friend!
Thanks for sharing these!
Finally I got what Naive Bayes algorithm does, thank you so much :)
Andrew Ng is a magician.
perfect
perfect
Does anybody know where this video is from? It looks similar to his Machine Learning course on Coursera, but it isn't there.
These videos predate Coursera, so you cannot find them there. I have provided the link below the video.
Interesting and well explained. Thanks
Who are these people who disliked this video? Andrew Ng is an amazing teacher, and anything he says is automatically absorbed by the brain and understood for good. I am a fan of his teaching.
There will always be people who want to provoke or are angry for some totally unrelated reason (I have one such colleague; truth is not important to him). Most probably it has nothing to do with the contents of this video.
dislike!!
yeah!!!
I think it is people who know that their jobs will be replaced by artificial intelligence :)
Andrew is the best teacher for ML. Thanks for posting this.
Andrew goes on to say that we fit a logistic regression line using gradient descent (at 1:35). Isn't this done using maximum likelihood?
Yes, but in general there isn't a closed form solution for the coefficients of a logistic regression model (like there is in linear regression). In practice, we find the MLE using one of several numerical methods such as iteratively reweighted least squares. It's not wrong to use gradient descent, although it may not be the most efficient.
Oh, thanks for the clarification.
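Picking up on that exchange, here is a minimal sketch (not from the course materials) of finding the logistic-regression MLE numerically by plain gradient descent on the average negative log-likelihood. The synthetic data, learning rate, and function name are made up for illustration; Newton's method or IRLS would typically converge in far fewer iterations.

```python
import numpy as np

def fit_logistic_gradient_descent(X, y, lr=0.1, n_iters=5000):
    """Find the logistic-regression MLE by gradient descent on the
    negative log-likelihood. There is no closed-form solution, so we
    iterate; Newton's method / IRLS would need far fewer steps."""
    n, d = X.shape
    Xb = np.hstack([np.ones((n, 1)), X])      # prepend an intercept column
    theta = np.zeros(d + 1)
    for _ in range(n_iters):
        p = 1.0 / (1.0 + np.exp(-Xb @ theta)) # sigmoid predictions
        grad = Xb.T @ (p - y) / n             # gradient of the average NLL
        theta -= lr * grad
    return theta

# Tiny synthetic example: one feature, labels roughly separated at x = 0.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1))
y = (X[:, 0] + 0.3 * rng.normal(size=200) > 0).astype(float)
print(fit_logistic_gradient_descent(X, y))    # [intercept, slope]
```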
Where can we get other videos?
You can get this from the Coursera ML course.
the explanation is very clear.
I wish Andrew Ng would make some more videos and MOOCs on new material. He's the best teacher!
best ML prof I ever met
Thank you Prof. Ng, this video helped me a lot.
Awesome teacher, such a great and insightful explanation!
Excellent , I looked at several videos before finding this one. This is the best. It makes the topic very clear. I look forward to viewing your other videos.
Good video. And where can I find the next video?
He talks about it in greater detail here cosmolearning.org/video-lectures/neural-networks-naive-bayes-support-vector-machines/
He is an amazing teacher. When I have trouble understanding a concept, I listen to his explanation of it.
LOVELY!!!
Waiting for another Coursera course on Machine Learning by Sir Andrew...