Pattern Recognition and Application by Prof. P.K. Biswas, Department of Electronics & Communication Engineering, IIT Kharagpur. For more details on NPTEL visit nptel.ac.in
Bayes Decision Theory 32:00
Probability of error 50:44
Watch from 32:00 for Bayesian Decision Theory.
Amazing lecture. Simply explained this difficult topic. Salute to you Sir.
Best video, best teacher. Lots of respect!
Great, Sir
Very helpful, thanks for uploading.
Very well explained!
Thank you very much professor.
Very helpful. Thank you very much.
Very helpful, Sir.
At 51:12, shouldn't the vertical axis be labeled P(x|w)? Otherwise, the points along the graphs at any given x would have to sum to 1.
Can we say that the evidence, p(x), can be considered a scaling factor for determining the class?
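A minimal sketch of that point (a hypothetical two-class problem with made-up Gaussian class-conditional densities and priors, not the lecture's numbers): the evidence p(x) is the same for every class at a given x, so it only rescales the posteriors; the argmax over classes is unchanged, and the posteriors sum to 1 at each x.

```python
import math

def gaussian_pdf(x, mean, std):
    """Class-conditional density p(x|w), assumed 1-D Gaussian for illustration."""
    return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))

# Hypothetical priors and class-conditional densities for classes w1, w2.
priors = {"w1": 0.6, "w2": 0.4}
likelihood = {
    "w1": lambda x: gaussian_pdf(x, mean=0.0, std=1.0),
    "w2": lambda x: gaussian_pdf(x, mean=2.0, std=1.0),
}

def posteriors(x):
    """Bayes rule: P(w|x) = p(x|w) P(w) / p(x), with p(x) = sum over w of p(x|w) P(w)."""
    joint = {w: likelihood[w](x) * priors[w] for w in priors}
    evidence = sum(joint.values())  # p(x): identical for every class at this x
    return {w: joint[w] / evidence for w in joint}

x = 1.0
post = posteriors(x)
joint = {w: likelihood[w](x) * priors[w] for w in priors}
# Dividing by p(x) only rescales: the winning class is the same either way.
assert max(post, key=post.get) == max(joint, key=joint.get)
# Posteriors sum to 1 at any given x (the densities p(x|w) need not).
assert abs(sum(post.values()) - 1.0) < 1e-12
```

So yes: p(x) normalizes the posteriors but never changes which class wins, which is why the decision rule can compare p(x|w)P(w) directly.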
Isn't the dimension of the hyperplane one less than the dimension of the vector space defining the pattern?
I agree with you
same doubt
The question is old but someone may still be interested:
It's not the case that the hyperplane dividing the classes has the same dimension as the feature space.
If you look, he is using a line (dimension = 1) to divide examples in a 2D space.
If you want to classify a set of gym athletes as Tall or Short based only on their height, you can plot the height values in a 1D space and use a point (a 0D boundary) that acts like a threshold and divides the classes Tall and Short.
You just have to extend this reasoning to higher-dimensional spaces.
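The 1-D example above can be sketched directly (the heights and the 170 cm cutoff are made-up numbers, not from the lecture): the decision boundary is a single point, i.e. a 0-dimensional hyperplane inside the 1-dimensional feature space.

```python
# Hypothetical gym-athlete heights in cm. The threshold is a single point:
# a 0-dimensional decision boundary in this 1-dimensional feature space.
THRESHOLD_CM = 170.0  # assumed cutoff, for illustration only

def classify(height_cm):
    """Decide Tall vs. Short by which side of the boundary point x lies on."""
    return "Tall" if height_cm >= THRESHOLD_CM else "Short"

heights = [158.0, 165.0, 172.0, 181.0]
labels = [classify(h) for h in heights]
# labels -> ['Short', 'Short', 'Tall', 'Tall']
```

The same picture scales up: a 1-D line separates a 2-D space, a 2-D plane separates a 3-D space, and in general an (n-1)-dimensional hyperplane separates an n-dimensional feature space.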
A hyperplane is embedded in the same vector space, but the difference is that if the vector space has n independent variables, the hyperplane has only n-1: its dimension is one less.
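A quick numeric check of that n vs. n-1 count (the coefficients here are arbitrary, chosen only for illustration): on a hyperplane w.x = b in R^3, two coordinates can be chosen freely and the third is then forced by the equation, so the hyperplane has 3 - 1 = 2 independent variables.

```python
# Hyperplane w.x = b in R^3, with arbitrary illustrative coefficients.
w = (1.0, 2.0, -1.0)
b = 4.0

def third_coordinate(x1, x2):
    """Choose x1, x2 freely; x3 is then determined by w1*x1 + w2*x2 + w3*x3 = b."""
    return (b - w[0] * x1 - w[1] * x2) / w[2]

x1, x2 = 0.5, 1.5
x3 = third_coordinate(x1, x2)
# The constructed point lies exactly on the hyperplane.
assert abs(w[0] * x1 + w[1] * x2 + w[2] * x3 - b) < 1e-12
```

Two free choices plus one forced coordinate is exactly the "n-1 independent variables" the comment describes.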
Nice lecture on this topic, but the sound is very poor.
Watch at 1.25x speed.
Very low sound
Bayes Decision Theory 31:00
The volume is way too high, lower it a bit.