Singular Value Decomposition and Regularization of Least Squares Problems

  • Published 12 Jul 2024

COMMENTS • 5

  • @gwanyongjung
    @gwanyongjung 1 year ago

    It was very helpful! Thanks!

  • @arbabhaider255
    @arbabhaider255 1 year ago +1

    At 12:24 you are taking V and V_transpose out of the inverse; how can they be the same outside of it?

    • @arbabhaider255
      @arbabhaider255 1 year ago

      Got it. Since V.VT = I: V is a square matrix with rank R = N, so AT.A is invertible.

    • @wokeclub1844
      @wokeclub1844 1 year ago

      @@arbabhaider255 Can you explain how he did that? I'm confused how the V and VT inside the inverse disappeared, and especially how the V on the right just magically shifts to the leftmost position.

    • @kirbylover5418
      @kirbylover5418 1 year ago

      @@wokeclub1844 We can write it out like this (here I've used E for sigma and L for lambda to make the notation easier; I is the identity matrix):
      (V (E + L I) V^T)^-1 = (V^T)^-1 (E + L I)^-1 V^-1, by the matrix inverse rule (ABC)^-1 = C^-1 B^-1 A^-1
      (V (E + L I) V^T)^-1 = V (E + L I)^-1 V^T, since in the singular value decomposition of a symmetric matrix V is orthogonal, so V^-1 = V^T and (V^T)^-1 = V
      Next, if we multiply the right side of the above by what's on the slides:
      V (E + L I)^-1 V^T V E U^T d = V (E + L I)^-1 E U^T d, since V^T V = V^-1 V = I, so we can get rid of those terms
      Thus we've gotten to the equation on the slides :)
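
The identity in the reply above can be checked numerically. Below is a minimal sketch (not from the video): the matrix A, the vector d, and the weight lam are made-up examples, and in the reply's notation the E inside the inverse is taken as Sigma^T Sigma (= Sigma^2 for the economy SVD) while the trailing factor is Sigma, which is how the standard Tikhonov/ridge solution works out.

```python
import numpy as np

# Check: (V (S^2 + lam I) V^T)^-1  V S U^T d  ==  V (S^2 + lam I)^-1 S U^T d,
# and both equal the usual regularized least squares solution.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))   # hypothetical tall data matrix
d = rng.standard_normal(6)        # hypothetical observation vector
lam = 0.1                         # hypothetical regularization weight (lambda)

U, s, Vt = np.linalg.svd(A, full_matrices=False)  # A = U diag(s) V^T
V = Vt.T
S = np.diag(s)                    # Sigma (square, so Sigma^T Sigma = S**2)
I = np.eye(4)

# Left side: V and V^T still inside the inverse.
lhs = np.linalg.inv(V @ (S**2 + lam * I) @ Vt) @ V @ S @ U.T @ d
# Right side: V and V^T pulled out of the inverse, then V^T V = I cancels.
rhs = V @ np.linalg.inv(S**2 + lam * I) @ S @ U.T @ d
# Standard Tikhonov-regularized least squares solution for comparison.
ridge = np.linalg.inv(A.T @ A + lam * I) @ A.T @ d

print(np.allclose(lhs, rhs))      # the two forms agree
print(np.allclose(lhs, ridge))    # and match the ridge solution
```

Both checks hold because V here is 4x4 orthogonal, so V^T V = V V^T = I, and A^T A = V Sigma^2 V^T.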