11:11 "That's it"
What you explained as the optimal classifier is wrong.
The Bayes optimal classifier obtains the most probable classification of the new instance by combining the predictions of all hypotheses, weighted by their posterior probabilities.
So the formula you used and explained for the optimal classifier is actually the formula and explanation of Naive Bayes (see the two definitions below).
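For anyone comparing against the textbook, the standard definitions (following Mitchell's Machine Learning, which this syllabus appears to track) are:

Bayes optimal classifier (every hypothesis votes, weighted by its posterior):
$$v_{BO} = \arg\max_{v_j \in V} \sum_{h_i \in H} P(v_j \mid h_i)\, P(h_i \mid D)$$

Naive Bayes classifier (attributes $a_1,\dots,a_n$ assumed conditionally independent given the class):
$$v_{NB} = \arg\max_{v_j \in V} P(v_j) \prod_{i} P(a_i \mid v_j)$$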
All videos in this series contain mistakes and misguidance (due to lack of knowledge), so please don't rely on this series, whether you are watching for your university exams or anything else.
Try to read some standard books or follow a knowledgeable instructor.
I am not criticising this teacher, but students should know when they are being taught something wrong.
Exactly.
UNIT - III
Bayesian learning - Introduction, Bayes theorem, Bayes theorem and concept learning, Maximum
Likelihood and least squared error hypotheses, maximum likelihood hypotheses for predicting
probabilities, minimum description length principle, Bayes optimal classifier, Gibbs algorithm, Naïve
Bayes classifier, an example: learning to classify text, Bayesian belief networks, the EM algorithm.
Computational learning theory - Introduction, probably learning an approximately correct hypothesis,
sample complexity for finite hypothesis space, sample complexity for infinite hypothesis spaces, the
mistake bound model of learning.
Instance-Based Learning - Introduction, k-nearest neighbour algorithm, locally weighted regression,
radial basis functions, case-based reasoning, remarks on lazy and eager learning.
While adding 0.031 + 0.08571, how does it come to 0.27?
The result must be 0.117.
Yeah, that's a mistake (worked out below).
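For the record, redoing the sum with the values quoted above:
$$0.031 + 0.08571 = 0.11671 \approx 0.117$$
so 0.27 cannot be right.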
Your explanation is exceptional, but please change that intro music; it feels like I am watching a cartoon. Hope you don't mind. Thank you.
Ma'am, we have a Compiler Design exam on 21st August (JNTUH) and badly need notes. Bharat Institute of Engineering and Technology.
Madam, I need a PDF for the ML subject. Could you please send the PDF?
Ma'am, please cover these topics; our exam is on 23 April:
Convergence and local maxima
Representation power of feed forward networks
Hypothesis space search and inductive bias
Hidden layer representation
Generalization
Overfitting
Stopping criterion
And an example - face recognition
Gibbs algorithm 10:13
Ma'am, please complete the syllabus as soon as possible; we have our ML exam on the 23rd. JNTUH.
Ma'am, can I write this answer for Bayes optimal and just this info on Gibbs if it comes in the exam?
THIS is NOT the Bayes optimal classifier; it is the NAIVE BAYES classifier. I spent hours trying to understand where I was going wrong. Please check the formula before posting.
Yes, this is not the Bayes Optimal Classifier.
Make videos on Genetic algorithms also :)
Ma'am, this is the Naive Bayes classifier.
Got the concept, and I have to say your English fluency is amazing 😅 Trying to build up skills like yours for my interviews!! Any tips?
What if the total is different in both cases?
It'll be the same.
Ma'am, we have our CD exam on 20/2/24. Please make playlists for chapters 2, 3, 4, 5. Please, ma'am ❤️❤️
Thank you, ma'am. Good explanation.
We need more about the Gibbs algorithm.
Ma'am, I guess this concept is the Naive Bayes classifier, because the Bayes optimal classifier is a slightly different concept. Please re-check the topics =)
Yes bro, you're right.
Yeah bro, it is the Naive Bayes classifier concept.
yes bro
ML exam on 23-8-2021, JNTUH syllabus.
Make a video on the T-square likelihood ratio criterion with an example.
Ma'am, what's your native language?
Thank you, ma'am, for helping us out 😊😊
Hello ma'am, are you from ANITS? Because you are explaining in the exact order of our syllabus and question paper questions.
Really helped a lot. 🙏
👍👍👍👍
Gibbs algorithm: I didn't get it.
🙏❤❤❤❤
The total value you counted there is wrong... 0.031 + 0.087 = 0.11 something 😓
But apart from that, great last-minute explanations, sis! Thank you!!
Yes, the value is 0.087.
Ma'am, can you teach cryptography?
Please do the needful.
Thank you.
Already done. I've completed the syllabus.
Thank you so much, ma'am.
Madam, this is not the Bayes optimal classifier. Don't misguide students; please remove this video.
You just make videos on the theory.
Explain the concepts as well.
Please explain the Gibbs algorithm properly.
Ma'am, can you please speak a little slower? It's too fast 🙂
Please explain the Gibbs algorithm in detail.
With an example
Yes, sure, I will definitely do it.
@TroubleFreevideos But you still haven't explained it.
Waiting...
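Since several comments ask for a Gibbs algorithm example: as Mitchell's textbook describes it, the Gibbs algorithm draws a single hypothesis at random according to the posterior P(h|D) and uses that one hypothesis alone to classify the new instance. A minimal Python sketch of that idea (the hypotheses, their posteriors, and the test instance below are hypothetical, made up only for illustration):

```python
import random

# Toy posterior P(h | D) over three candidate hypotheses.
# These numbers are illustrative, not from the video.
posterior = {"h1": 0.4, "h2": 0.3, "h3": 0.3}

# Each hypothesis is a rule mapping an instance x to a label (toy rules).
hypotheses = {
    "h1": lambda x: "+" if x > 0 else "-",
    "h2": lambda x: "+" if x > 1 else "-",
    "h3": lambda x: "+" if x > 2 else "-",
}

def gibbs_classify(x):
    """Gibbs algorithm: draw ONE hypothesis at random, with probability
    proportional to its posterior, and use it alone to classify x."""
    names = list(posterior)
    weights = [posterior[h] for h in names]
    chosen = random.choices(names, weights=weights, k=1)[0]
    return hypotheses[chosen](x)

print(gibbs_classify(1.5))  # the label can vary run to run, by design
```

A known result (cited in Mitchell) is that, under certain assumptions, the expected misclassification error of the Gibbs algorithm is at most twice that of the Bayes optimal classifier.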
Wrong explanation; the teacher does not know the material well. Please read from books instead.
Kk
Please do not mislead; you are getting this wrong.
I'll check it, and thanks for correcting me.
Sister, you explained it wrong. Actually, the whole video is wrong. Please remove this; others will see your video and write the wrong answer in the exam.