Matrix Norms : Data Science Basics

  • Published 21 Jun 2020
  • What does it mean to take the norm of a matrix?
    Vector Norms Video:
    • Vector Norms
    Eigenvalues and Eigenvectors Video:
    • Eigenvalues & Eigenvec...

COMMENTS • 50

  • @robertc6343
    @robertc6343 3 years ago +7

    We’re saying “OK, matrix, you’re allowed to have...” 🤣🤣🤣🤣 so stressful for her 🤣🤣

  • @jtm8514
    @jtm8514 2 years ago +6

    You make studying fun! Thank you so much, I loved watching this. It wasn't a chore after a bit, I was in bliss from how cool the math was.

  • @angelmcorrea1704
    @angelmcorrea1704 3 years ago +11

    Thanks for all these lectures, very clear.

  • @emmanuelamankwaaadjei3051
    @emmanuelamankwaaadjei3051 2 years ago

    Precise yet detailed explanation. Great work.

  • @hba3415
    @hba3415 2 years ago

    I was struggling so hard and couldn't find anything on the internet. Thank God I found your video.

  • @sofiyavyshnya6723
    @sofiyavyshnya6723 3 years ago +6

    Amazing video! Super brief and to the point and super clear! Thanks so much for all your help!

  • @florawang7603
    @florawang7603 3 years ago +8

    Thank you this is the clearest video on matrix norms I've watched so far

    • @ritvikmath
      @ritvikmath 3 years ago

      Wow, thank you!

    • @dansantner
      @dansantner 2 years ago

      Agree. I have been watching Strang's lectures, and he skips so many conceptual steps that sometimes it's hard to follow. This filled in the gaps.

  • @jacob_dmn
    @jacob_dmn 3 years ago

    This Channel Changed My way of thinking.. THANK YOU MAN

  • @tyflehd
    @tyflehd 3 years ago +2

    Thank you for your clear and intuitive descriptions :)

  • @teodorvijiianu41
    @teodorvijiianu41 1 year ago

    I swear to god I watched the uni lecture 3 times and had no idea what they were talking about. In less than 10 minutes it now makes sense. Thank you!

  • @haimteicherteicher4227
    @haimteicherteicher4227 3 years ago +1

    very focused and clear explanation, much appreciated.

  • @kamelismail3730
    @kamelismail3730 3 years ago

    this is an absolute gem!

  • @doxo9597
    @doxo9597 2 years ago

    This was great, thank you!

  • @SeidelMatheus
    @SeidelMatheus 2 years ago

    Great lesson!

  • @manhhungnguyen4270
    @manhhungnguyen4270 2 years ago

    Thank you for the explanation. I have a question:
    does the spectral norm (2-norm) show the "size" of a matrix just like the Frobenius norm does?
    I know that the 2-norm of a matrix is its maximum singular value,
    and the Frobenius norm shows the "size" of a matrix,
    but I am confused when you use the 2-norm to compare matrices.
    When I want to compare 2 matrices, which one is better to use?
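    A quick numerical comparison may help here (a sketch in Python/NumPy; the matrix A below is a made-up example, not from the video). The 2-norm picks out the single largest stretch, while the Frobenius norm aggregates all the singular values, so it is always at least as large:

    ```python
    import numpy as np

    # Hypothetical example matrix with singular values 4 and 3.
    A = np.array([[3.0, 0.0],
                  [0.0, 4.0]])

    # Spectral norm (2-norm): the largest singular value.
    spectral = np.linalg.norm(A, 2)

    # Frobenius norm: sqrt of the sum of all squared entries,
    # equivalently sqrt of the sum of all squared singular values.
    frobenius = np.linalg.norm(A, 'fro')

    print(spectral)   # 4.0
    print(frobenius)  # 5.0  (= sqrt(3^2 + 4^2))
    ```

    Which one is "better" depends on what you care about: the 2-norm bounds the worst-case amplification of a single vector, while the Frobenius norm measures the total "energy" of the entries.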

  • @youlongding
    @youlongding 3 years ago +2

    Could you give me a detailed discussion of the property you mentioned?

  • @user-li1mb8gt2w
    @user-li1mb8gt2w 2 years ago

    Thank you! I am just wondering whether a non-square matrix has a norm as well. Why do you give the matrix [ui ui ... ui] rather than the n×1 matrix [ui]?

  • @pulkitnijhawan653
    @pulkitnijhawan653 3 years ago +1

    Awesome intuitive explanation :)

  • @user-or7ji5hv8y
    @user-or7ji5hv8y 3 years ago

    Awesome and clear

  • @juneshgautam8655
    @juneshgautam8655 6 months ago

    4:22 Could you please point me to where I could find a proof of that?

  • @EmilioGarcia_
    @EmilioGarcia_ 3 years ago

    Hi, I really enjoy your content! Quick question here: what do you mean by "bigger output"? Perhaps that with a matrix A you can span most of the y that belong to R^m using a vector x that belongs to R^n? A bit confused here, thanks for your help.

  • @srs.shashank
    @srs.shashank 1 year ago

    This would be applicable only to square matrices, right? How do we calculate 2-norms for rectangular (non-square) matrices? Thanks!
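    The induced 2-norm is in fact defined for any m×n matrix, square or not: it is the largest singular value. A small check (Python/NumPy sketch; B is a made-up 2×3 example):

    ```python
    import numpy as np

    # A non-square (2 x 3) example matrix.
    B = np.array([[1.0, 2.0, 3.0],
                  [4.0, 5.0, 6.0]])

    # The 2-norm of a rectangular matrix is its largest singular value.
    two_norm = np.linalg.norm(B, 2)
    largest_sigma = np.linalg.svd(B, compute_uv=False)[0]

    print(two_norm)       # same value as largest_sigma, roughly 9.51
    print(largest_sigma)
    ```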

  • @Posejdonkon
    @Posejdonkon 1 year ago

    Applicable and accent-free study material. Greatly appreciated!

  • @sandrasyperek1945
    @sandrasyperek1945 1 year ago

    Great video. I like it. Concrete examples would have been nice. Thank you for making the video. :)

  • @user-bb8dr2nj6j
    @user-bb8dr2nj6j 3 years ago +2

    Thank you bro! Huge Respect from Ukraine!

  • @KorayUlusan
    @KorayUlusan 1 year ago

    10/10 video!

  • @sirengineer4780
    @sirengineer4780 2 years ago

    Great! Keep it up, bro

  • @makslopl
    @makslopl 1 year ago

    Thanks!

  • @potreschmotre1118
    @potreschmotre1118 2 years ago

    thank you!

  • @Pukimaxim
    @Pukimaxim 3 years ago

    What do you mean by decay when talking about negative 9 to the power of a large number?

    • @jako276
      @jako276 3 years ago +1

      I believe he said "point 9", meaning 0.9, and by decay he means that 0.9^2 would be 0.81, 0.9^3 would be 0.729, thus "decaying" to zero as the exponent grows large.
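      The decay is easy to see numerically; a quick Python sketch (the contrasting 1.1 case is added here for comparison and is not from the video):

      ```python
      # Powers of a number below 1 decay toward 0;
      # powers of a number above 1 blow up.
      for n in (1, 2, 3, 10, 50):
          print(n, 0.9 ** n, 1.1 ** n)
      ```

      This is the intuition behind repeated application of a matrix: since ||A^n x|| ≤ ||A||^n ||x|| for induced norms, a norm below 1 shrinks every vector toward zero.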

  • @noway4715
    @noway4715 3 years ago

    Could the definition of the matrix norm come with an example?

  • @fatihsarac662
    @fatihsarac662 9 months ago

    Perfect

  • @thomasheirbaut6612
    @thomasheirbaut6612 20 days ago

    insanneeeeeee! Legend

  • @user-vg4lr1ip3u
    @user-vg4lr1ip3u 2 years ago

    u r really cooool~~~~~

  • @maciejgarbacz8785
    @maciejgarbacz8785 1 year ago +1

    I think that this video does not give a good, human-like explanation. I figured this out from other sources after some time, and I will leave my attempt at an explanation here.
    For vectors, you usually want to know how big they are, based on their length. Applying the Pythagorean theorem gives ||v||2 = sqrt(x^2 + y^2 + z^2 + ...), which is called the 2-norm (or Euclidean norm) and measures the length of a vector in nD space. As it turns out, this is not the only way of measuring the length of a vector; there are other norms (and a generalized formula for the p-norm), like the 1-norm (Manhattan norm), which gives the total distance walked along axis-aligned straight lines: ||v||1 = |x| + |y| + |z| + ... . The last important norm is the infinity-norm, which loosely speaking tells you, if you were a chess king on a plane, how many moves you would need to reach the tip of the vector: ||v||inf = max(|x|, |y|, |z|, ...). All of these norms have a visualization, which is just the graph you obtain after setting the length of the vector to 1 and transforming the given formula; for the 2-norm in 2D:
    1 = sqrt(x^2 + y^2)
    1^2 = x^2 + y^2
    y^2 = 1 - x^2
    y = sqrt(1 - x^2) or y = -sqrt(1 - x^2)
    (Fun fact: that is the equation of a circle, and you can integrate it to get its area and obtain the value of pi, which is what Newton did to calculate a precise value of pi for the first time in history - watch "The Discovery That Transformed Pi" by Veritasium.)
    As it turns out, it is also useful to measure how big the output of a matrix can get! For example, if the maximum length of an output vector were zero, you would know that the matrix sends every vector it gets to 0, so it is useless in many cases. This is literally the matrix norm. The matrix 2-norm means that the length of a vector is measured by the 2-norm method I explained above. So to get that matrix norm, just plug in every possible vector of length 1 and find the output vector of maximum length.
    I hope I have helped someone in despair. I was just really frustrated that everyone on the internet just reads out the formulas and hopes the viewer will memorize them without any understanding. If there is something wrong in my comment, don't hesitate to reply!
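    The recipe above can be checked numerically (a Python/NumPy sketch; the vector and matrix are made-up examples). The last part approximates the matrix 2-norm exactly as described: try many unit vectors and keep the biggest output length:

    ```python
    import numpy as np

    # Vector norms for v = (3, -4):
    v = np.array([3.0, -4.0])
    one_norm = np.linalg.norm(v, 1)       # 7.0 -> |3| + |-4|
    two_norm = np.linalg.norm(v, 2)       # 5.0 -> sqrt(3^2 + 4^2)
    inf_norm = np.linalg.norm(v, np.inf)  # 4.0 -> max(|3|, |-4|)

    # Matrix 2-norm as "largest stretch of a unit vector":
    A = np.array([[2.0, 0.0],
                  [0.0, 1.0]])
    thetas = np.linspace(0.0, 2.0 * np.pi, 10000)
    units = np.stack([np.cos(thetas), np.sin(thetas)])  # columns all have length 1
    stretch = np.linalg.norm(A @ units, axis=0).max()

    print(stretch)               # 2.0, the largest singular value of A
    print(np.linalg.norm(A, 2))  # 2.0, same answer from NumPy directly
    ```

    Sampling unit vectors only works in low dimensions; in general the maximum stretch is computed as the largest singular value.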

  • @tanvirkaisar7245
    @tanvirkaisar7245 1 year ago

    Could you please give me the detailed proof of ||AB|| ≤ ||A|| ||B||?
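    Assuming the question is about the submultiplicative property ||AB|| ≤ ||A|| ||B|| of induced norms, a short proof sketch:

    ```latex
    % For an induced (operator) norm, \|Mx\| \le \|M\|\,\|x\| for every x,
    % directly from the definition \|M\| = \sup_{\|x\| = 1} \|Mx\|.
    % Applying this twice, first to A (with the vector Bx), then to B:
    \|ABx\| \le \|A\|\,\|Bx\| \le \|A\|\,\|B\|\,\|x\|
    % Taking the supremum over all x with \|x\| = 1:
    \|AB\| = \sup_{\|x\| = 1} \|ABx\| \le \|A\|\,\|B\|
    ```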

  • @bartlmyy
    @bartlmyy 3 years ago

    thank you, kisses

  • @cahitskttaramal3152
    @cahitskttaramal3152 1 year ago

    couldn't understand most of it, but thanks anyway

  • @epsilonxyzt
    @epsilonxyzt 3 years ago +2

    Solving an example would be better than so much talk.

  • @broda680
    @broda680 3 years ago

    Did someone ever tell you that you look like kumar from the movie Harold and Kumar ? :D

    • @psychwolf7590
      @psychwolf7590 3 years ago +2

      I was thinking of the same actor!! They are soo similar omg

  • @Fat_Cat_Fly
    @Fat_Cat_Fly 3 years ago +1

    The editing has jumpy cuts and feels uncomfortable to watch,
    but the course is superb!