Mod-01 Lec-05 Bayes Decision Theory

  • Published 1 Jun 2014
  • Pattern Recognition and Application by Prof. P.K. Biswas, Department of Electronics & Communication Engineering, IIT Kharagpur. For more details on NPTEL visit nptel.ac.in

COMMENTS • 23

  • @adityaanand81 • 7 years ago • +30

    Bayes Decision Theory 32:00
    Probability of error 50:44

  • @SheshadriMadhu • 7 years ago • +7

    Watch from 32:00 for Bayesian Decision Theory.

  • @sumitvaise5452 • 4 years ago • +1

    Amazing lecture. This difficult topic is explained simply. Salute to you, Sir.

  • @Sanyat100 • 5 years ago • +2

    Best video, best teacher. Lots of respect!

  • @nuwanatthanayake • 2 years ago

    Great, Sir.

  • @RushikeshTade • 9 years ago • +1

    Very helpful, thanks for uploading.

  • @sam41619 • 6 years ago

    Very well explained!

  • @UraratkK • 9 years ago • +1

    Thank you very much, professor.

  • @Code-09 • 4 years ago

    Very helpful. Thank you very much.

  • @neomuks • 4 years ago

    Very helpful, sir.

  • @ConsuelaPlaysRS • 6 years ago

    At 51:12, shouldn't the vertical axis be labeled P(x|w)? Otherwise, the values of the curves at any given x would have to sum to 1.
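
    A quick check of this point, using the standard notation of lectures like this one (classes ω_i with priors P(ω_i), class-conditional densities p(x|ω_i), and evidence p(x)): it is the posteriors, not the class-conditional densities, that must sum to 1 at any fixed x.

    $$P(\omega_i \mid x) = \frac{p(x \mid \omega_i)\,P(\omega_i)}{p(x)}, \qquad \sum_i P(\omega_i \mid x) = \frac{\sum_i p(x \mid \omega_i)\,P(\omega_i)}{p(x)} = \frac{p(x)}{p(x)} = 1.$$

    Each density p(x|ω_i) instead integrates to 1 over x, so the curves at a given x need not sum to 1.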

  • @YoungAthleticclub • 5 years ago

    Can we say that the evidence, p(x), can be treated as a scaling factor when determining the class?
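
    One way to see this, in the same notation (classes ω_i, priors P(ω_i), class-conditional densities p(x|ω_i)): the evidence p(x) is the same for every class at a given x, so it only rescales the posteriors and does not change which class attains the maximum.

    $$p(x) = \sum_i p(x \mid \omega_i)\,P(\omega_i), \qquad \arg\max_i P(\omega_i \mid x) = \arg\max_i \frac{p(x \mid \omega_i)\,P(\omega_i)}{p(x)} = \arg\max_i \, p(x \mid \omega_i)\,P(\omega_i).$$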

  • @aanandakrishnan102 • 7 years ago • +4

    Isn't the dimension of the hyperplane one less than the dimension of the vector space defining the pattern?

    • @nicolagnecco • 7 years ago

      I agree with you

    • @aishwaryagoel149 • 6 years ago

      same doubt

    • @domste • 5 years ago

      The question is old, but someone may still be interested. The hyperplane dividing the classes is drawn in the same feature space as the patterns: in the lecture he uses a line to separate examples in a 2D space.
      If you want to classify a set of gym athletes as Tall or Short based only on their height, you can plot the height values on a 1D axis and use a single point as a threshold that divides the classes Tall and Short.
      The same reasoning carries over to higher-dimensional spaces.

    • @namangupta2228 • 5 years ago

      The hyperplane lies in the same vector space, but where the vector space has n independent variables, the hyperplane is constrained to n-1 independent variables, so its dimension is n-1.
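
      For reference, a sketch using the usual definition of a separating hyperplane in an n-dimensional feature space (the weight vector w and bias b here are generic symbols, not necessarily the lecture's notation):

      $$H = \{\, x \in \mathbb{R}^n : w^\top x + b = 0 \,\}, \qquad \dim H = n - 1,$$

      so in a 2D feature space the decision boundary is a line (dimension 1), and for a single 1D feature such as height it is a point (dimension 0), which matches the original question.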

  • @dr.neerajkumar4264 • 4 years ago

    Nice lecture on this topic, but the sound is very poor.

  • @vaibhavsharda9443 • 6 years ago

    Watch at 1.25x speed.

  • @chinmayajitm • 6 years ago • +1

    Very low sound

  • @chinnakotlasreenath8337 • 3 years ago

    Bayes Decision Theory 31:00

  • @KunalSaini97 • 4 years ago

    The volume is too high, lower it a bit.