8.6 David Thompson (Part 6): Nonlinear Dimensionality Reduction: KPCA

COMMENTS • 19

  • @dorukhansergin9831
    @dorukhansergin9831 5 years ago +14

    Hi, the reference should be corrected to Tenenbaum et al., Science 290, 2000, not 2009. Took me a while to find. Great video, and thanks a lot!

  • @swavekbu4959
    @swavekbu4959 8 months ago

    Outstanding speaker and communicator.

  • @godexolrv4906
    @godexolrv4906 3 years ago

    Very well explained and presented,
    really quite helpful.

  • @swavekbu4959
    @swavekbu4959 8 months ago

    14:25 summarizes very well what KPCA is up to. It is identical to PCA, except that instead of weighting the values of the variables by the corresponding eigenvector weights, the eigenvector weights are applied to kernel evaluations between data points. Why apply them to kernels instead of the original variables? So that we can benefit from the kernel trick: we compute only in the original space but effectively capture what is "going on" in a higher-dimensional feature space, even without knowing the exact mapping into that space. The kernel trick is not exclusive to KPCA; it also appears in techniques such as support vector machines and other kernel methods. (A sketch of this contrast is below.)
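
    A minimal sketch of the contrast described above, using scikit-learn. The concentric-circles data, the RBF kernel, and gamma=10 are illustrative assumptions, not choices taken from the lecture:

    ```python
    # PCA eigendecomposes the covariance of X; KernelPCA eigendecomposes the
    # (centered) kernel matrix K[i, j] = k(x_i, x_j), so the eigenvector
    # weights apply to kernel evaluations rather than to the raw variables.
    from sklearn.datasets import make_circles
    from sklearn.decomposition import PCA, KernelPCA

    # Two concentric circles: no linear projection can separate them.
    X, y = make_circles(n_samples=400, factor=0.3, noise=0.05, random_state=0)

    X_pca = PCA(n_components=2).fit_transform(X)

    # The kernel trick: all computation uses k(x_i, x_j) in the original
    # space; the implicit high-dimensional feature map is never evaluated.
    X_kpca = KernelPCA(n_components=2, kernel="rbf", gamma=10).fit_transform(X)

    # Along the first kernel principal component the two circles separate,
    # while the linear projection leaves them nested.
    print("KPCA component-1 class means:",
          X_kpca[y == 0, 0].mean(), X_kpca[y == 1, 0].mean())
    ```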

  • @matiassanchezgavier155
    @matiassanchezgavier155 3 years ago

    Excellent, I loved it!

  • @kishkash8350
    @kishkash8350 2 years ago +1

    Great lecture! Thank you!
    I believe there's a small typo at 13:30: in the bottom row, the subscript of the first x in the third addend should be j rather than i.

  • @khawlaallouche3082
    @khawlaallouche3082 5 years ago +4

    Thank you, so helpful!

  • @user-sc4bu7fy1o
    @user-sc4bu7fy1o 1 year ago

    Thank you very much!

  • @ltbd78
    @ltbd78 5 years ago +3

    Very well presented. Thank you.

  • @gnanadeepch5065
    @gnanadeepch5065 2 years ago

    Awesome!

  • @learnwithash12345
    @learnwithash12345 6 months ago

    Can you share the playlist where the lectures are in sequence?

  • @ElChe-Ko
    @ElChe-Ko 4 years ago +4

    How do you know beforehand that your dataset has a non-linear structure if you are in a dimension higher than 3?

    • @sau002
      @sau002 3 years ago +2

      I have never understood this. All the examples I come across use toy datasets, and that does not help me.

    • @TheSimslash
      @TheSimslash 3 years ago +1

      @@sau002 Depends on what you want to do. Try a linear approach, and if it fails you might consider using non-linear ones?

    • @CursedByManga
      @CursedByManga 2 years ago

      If your linear methods result in bad accuracy, you can try non-linear methods, although you will not know for certain.

    • @scholar7558
      @scholar7558 1 year ago +1

      I don't know if an answer after two years would help, but basically: apply linear PCA and plot it. If the linear PCA worked perfectly, then your data has a linear structure, and vice versa. (A sketch of this check follows after this thread.)

    • @ElChe-Ko
      @ElChe-Ko 1 year ago

      @@scholar7558 Ok, thank you very much for the help! :) I really appreciate it.
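
    A rough sketch of the check suggested in the thread above: project with linear PCA, plot the 2-D result, and see whether the structure survives. The Swiss-roll dataset here is an illustrative stand-in for "high-dimensional data you suspect is non-linear":

    ```python
    # Fit linear PCA, look at how much variance two components capture,
    # and eyeball the 2-D projection.
    import matplotlib.pyplot as plt
    from sklearn.datasets import make_swiss_roll
    from sklearn.decomposition import PCA

    X, color = make_swiss_roll(n_samples=1000, noise=0.05, random_state=0)

    pca = PCA(n_components=2)
    X_2d = pca.fit_transform(X)
    print("explained variance ratio:", pca.explained_variance_ratio_)

    # If the colour gradient (position along the roll) folds back on itself
    # in the plot, the linear projection has failed to preserve the
    # structure -- a hint that a non-linear method (KPCA, Isomap, ...)
    # is worth trying.
    plt.scatter(X_2d[:, 0], X_2d[:, 1], c=color, s=5)
    plt.title("Linear PCA projection of the Swiss roll")
    plt.show()
    ```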

  • @jmtv1474
    @jmtv1474 5 years ago +3

    Poor pedagogy...