Garnet Chan "Matrix product states, DMRG, and tensor networks" (Part 1 of 2)

  • Published 13 Sep 2024

COMMENTS • 23

  • @TheNaiveComposer
    @TheNaiveComposer 8 years ago +16

    This is the best introduction to Tensor Networks I've seen yet

    • @adwaitnaravane5285
      @adwaitnaravane5285 1 year ago +2

      It's so interesting to see a comment from 6 years ago, when I didn't even know basic physics. And now I am learning about tensor networks for research.

    • @sohammukherjee7102
      @sohammukherjee7102 9 months ago

      @@adwaitnaravane5285 Hello sir. Me too

    • @a.c.e7407
      @a.c.e7407 3 months ago

      @@adwaitnaravane5285 Greetings. Could I bother you with a few questions regarding MPOs please?

    • @adwaitnaravane5285
      @adwaitnaravane5285 3 months ago

      @@a.c.e7407 Yes sir

  • @yumingbai2023
    @yumingbai2023 4 years ago +12

    This is a very good introduction to TN, but unfortunately some of the content is out of focus on camera and some of the explanations are vague.

  • @kingofthehill5950
    @kingofthehill5950 5 months ago +1

    Garnet Chan is the don of tensor networks

  • @popwittenino7411
    @popwittenino7411 4 years ago +2

    I agree. This is THE best intro to tensor networks. I learned for the first time that MPS is a representation of entanglement.

  • @pranaynayak
    @pranaynayak 2 years ago +2

    the camera keeps going out of focus every now and then

  • @stephengibert4722
    @stephengibert4722 1 year ago +1

    My advice to the audience: try waiting to ask questions until the poor man has actually said something!

  • @seyedhosseinmahdaei8782
    @seyedhosseinmahdaei8782 2 years ago

    nice

  • @michaelfulciniti2622
    @michaelfulciniti2622 7 years ago +1

    I would like to share a revelation. Time is connected from one period to another by energy. This can be tested, although I have not proven it, by asking someone in a video to raise their hand. I have been amazed very many times when they hear me in another time and actually react in kind. They do not, however, have to do as you ask, so be wary. Good day.

  • @maxwellsdaemon7
    @maxwellsdaemon7 3 years ago

    It was a good start, but then he broke his own rule when he introduced the SVD. He wasn't careful (sloppy?) with the sigma_i and how it fits into the diagrams and matrix product he defined previously. Disappointing, although it is a free video after all.

    • @cea6770
      @cea6770 3 years ago +3

      What 'rule' was broken? SVDs are used all the time when working with tensor networks.
      The presenter seemed about as careful with notation as any physicist would be, so I don't get your point there.
      Also, what is your point about a 'free video'? Tensor network methods aren't really common enough to be taught in classrooms, so they are presented at colloquia/seminars/tutorial talks at conferences, which are effectively free for participants because they are paid for by research institutes, and are often uploaded to YouTube these days.
      I feel like you're just being rude for the sake of being rude or edgy.

    • @maxwellsdaemon7
      @maxwellsdaemon7 3 years ago +1

      @@cea6770 1. I am only just learning about tensor networks, so I didn't know that SVDs are used all the time in TN, although it makes sense that they are. He first presents entanglement in the form psi^{nm} = explicit matrix product of A and B. Then in the SVD, psi^{nm} = sum over a single index i of L_i sigma_i R_i. While this can be recast as a matrix product (see the sketch below), he does not do so explicitly, which I thought broke the consistency of the level of presentation he started with; he never writes it as such. Also, in other lectures like those of G. Vidal, psi^{nm} = explicit matrix product of A and B is really an approximation, not an identity, with the approximation becoming very good when the range of the internal index between A and B is large enough, something G. Chan did not say explicitly in his talk (I could be wrong).
      2. I think this lecture is good, certainly useful, and I saved this video in my Quantum Computing playlist. But we can agree to disagree on whether he could have been more careful and thoughtful in the presentation of some concepts and notation.
      3. It is free; I don't have to pay to see the video on YouTube, by which I mean that there is no moral obligation for the lecturer or video producer to make this better, and I don't have the right to demand more. But I do have the right/privilege to leave my comment/opinion, which I feel is constructive.
      4. You can feel that way, I don't doubt that, but I am not being rude or edgy or snarky, not at all. Since you took it personally (I have no idea why, or who you are), your comment may be a reflection of how you think of such things and how you may have posted rude and edgy comments yourself. I suppose I don't think like you.
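
      A minimal NumPy sketch of the recasting in point 1, assuming a random two-site wavefunction (dimensions and variable names here are illustrative):

          import numpy as np

          # Random two-site wavefunction psi^{nm}, each index of dimension d.
          d = 4
          rng = np.random.default_rng(0)
          psi = rng.standard_normal((d, d))
          psi /= np.linalg.norm(psi)

          # SVD: psi^{nm} = sum_i L_{ni} sigma_i R_{im}.
          L, sigma, R = np.linalg.svd(psi)

          # Recast as an explicit matrix product psi = A @ B by absorbing
          # the singular values into the left factor.
          A = L @ np.diag(sigma)
          B = R
          assert np.allclose(psi, A @ B)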

    • @cea6770
      @cea6770 3 years ago +4

      @@maxwellsdaemon7 1. If the bond dimension of the internal index is allowed to be infinite, then this is exact, not an approximation; you can always do an SVD. The approximation is when you impose a maximum limit on the bond dimension (in effect, truncating the smaller singular values); see the sketch after this comment. The validity of these approximations is related to 'structures of entanglement' in the states/operators we are interested in, typically called area laws of entanglement.
      2. Fine, I agree to disagree.
      3. I mean, G. Chan in all likelihood is not going to watch this video, so I don't know if it is particularly constructive. I just don't think anyone will 'explicitly pay' for a lecture on tensor networks.
      4. I hardly ever comment on videos. This probably
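
      A minimal NumPy sketch of point 1, again assuming a random two-site wavefunction: the full SVD is exact, and truncating to a maximum bond dimension chi is the approximation (names and dimensions here are illustrative):

          import numpy as np

          d, chi = 16, 4  # dimension of each half; maximum bond dimension
          rng = np.random.default_rng(1)
          psi = rng.standard_normal((d, d))
          psi /= np.linalg.norm(psi)

          U, S, Vh = np.linalg.svd(psi)

          # Exact: keeping all d singular values reproduces psi.
          assert np.allclose(psi, U @ np.diag(S) @ Vh)

          # Approximation: keep only the chi largest singular values.
          psi_trunc = U[:, :chi] @ np.diag(S[:chi]) @ Vh[:chi, :]

          # The truncation error equals the norm of the discarded values.
          print(np.linalg.norm(psi - psi_trunc))  # == np.linalg.norm(S[chi:])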

    • @lucymcclain1236
      @lucymcclain1236 2 years ago +2

      Just watch the video carefully. The product state is for ZERO entanglement, and the SVD is used for the general case.
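
      A quick NumPy check of this point, with illustrative states: a product state has exactly one nonzero singular value (zero entanglement), while an entangled state has more than one:

          import numpy as np

          # Product state psi^{nm} = a^n b^m: one nonzero singular value.
          a = np.array([1.0, 0.0])
          b = np.array([1.0, 1.0]) / np.sqrt(2)
          print(np.linalg.svd(np.outer(a, b), compute_uv=False))  # ~[1, 0]

          # Bell state: two equal singular values, maximal entanglement.
          bell = np.identity(2) / np.sqrt(2)
          print(np.linalg.svd(bell, compute_uv=False))  # ~[0.707, 0.707]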

    • @a.c.e7407
      @a.c.e7407 3 months ago

      Apologies for being late to the party. Would you happen to know a good book on MPS that builds the understanding from scratch? Kind of like the Sheldon Axler book for linear algebra; I love that book.