Tamara G. Kolda: "Tensor Decomposition"

  • Published 29 Aug 2024

COMMENTS • 18

  • @KleineInii
    @KleineInii 4 years ago +30

    What a great talk! I was watching tensor decomposition videos on youtube nearly the whole day, but no video was as good and clear as this one. I recommend everyone to watch this! :)

  • @rwfrench66GenX
    @rwfrench66GenX 3 years ago +3

    This is a great video! Tamara is a very intelligent person! The thing is, at my previous job I spent 80% of my day verifying and cleaning the data I acquired before I could load it into my tables to do my analysis. External feeds would get truncated. People put information in the wrong field. System logic filtered out information it believed was duplicate. People on a shared drive overwrote tables or changed criteria in queries I only had read-only access to, so I couldn't get the right information. File conversions caused data corruption. There were many issues that required attention before I could load the data into my own tables to run my own queries and do my own analysis. When you have errors like the ones I mentioned, you'll be running down all kinds of anomalies before you get to the real problems, no matter how well the models in your analysis are set up.

    For example, take the mice that were studied for two years: they received treats and their neurons were tracked the whole time. Two years is a long time to study something. Was the same treat, with the same ingredients, given for the full two years? Did the mice develop any cognitive impairments during that time? Was there any change in the lab, like new paint or remodeling, that might affect the mice's sense of smell? I'm not even going to ask whether the data was collected at the same time, verified, and input into the system correctly.

    There are many steps in research (measuring, collecting, inputting, and analyzing), and any misstep can throw things off downstream in a large way. Obviously you want your models to be as precise as possible, but you need your data to be correct too. In my experience working with large datasets, most data is flawed, making most statistics flawed, but that's not the fault of the model.

  • @foximweb
    @foximweb 3 years ago +4

    Great talk! Regarding the question at the end of the talk about tensor analysis for deep learning, one can look up the paper "On the Expressive Power of Deep Learning: A Tensor Analysis".

  • @blchen1
    @blchen1 4 months ago

    Great lecture! And thanks for giving credit to Jih-Jie Chang, a female pioneer who should also be remembered.

  • @camiloatencio3662
    @camiloatencio3662 3 years ago +4

    I wish I could speak this confidently about anything.

  • @jimmyxue2927
    @jimmyxue2927 3 years ago +2

    A great talk about tensor decomposition.

  • @Raf4le
    @Raf4le 3 years ago +5

    Great talk! Is there one where Mrs Kolda addresses the Tucker decomposition?

    • @juliocardenas4485
      @juliocardenas4485 3 years ago +4

      Do you mean Dr. Kolda?

    • @ellielikesmath
      @ellielikesmath 7 months ago

      One of the questions after the talk was about Tucker decomposition...

  • @-danR
    @-danR 3 years ago +1

    14:47 Never thought I'd see the day when meeces would teach me something about tensor decomposition.

  • @user-hb7ro6ht7p
    @user-hb7ro6ht7p 1 year ago

    She is amazing.

  • @oleksiinag3150
    @oleksiinag3150 1 year ago

    Really cool!

  • @whyisitnowhuh8691
    @whyisitnowhuh8691 2 years ago

    Does each element value in each rank-one tensor have a meaning?

  • @whyisitnowhuh8691
    @whyisitnowhuh8691 2 years ago

    At 5:42, instead of inner product, it should be outer product.
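
    For anyone following along, the point is that a rank-one three-way tensor is formed by the outer product of three vectors, so each entry is x[i, j, k] = a[i] * b[j] * c[k]. A minimal NumPy sketch (the vectors below are made up purely to illustrate the entrywise formula, not taken from the talk):

    import numpy as np

    # Hypothetical factor vectors, chosen only for illustration.
    a = np.array([1.0, 2.0])
    b = np.array([3.0, 4.0, 5.0])
    c = np.array([6.0, 7.0])

    # Outer product of the three vectors: a rank-one tensor of shape (2, 3, 2).
    X = np.einsum('i,j,k->ijk', a, b, c)

    # Each entry is the product of one coordinate from each vector.
    assert np.isclose(X[1, 2, 0], a[1] * b[2] * c[0])
    print(X.shape)  # (2, 3, 2)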

  • @user-pi3nw9ru8g
    @user-pi3nw9ru8g 2 years ago

    Awesome, awesome, awesome. Many, many thanks!

  • @zhuyan2008
    @zhuyan2008 2 years ago

    I saw a lot of praise for this video, but I got few insights from it.

  • @warrengran8542
    @warrengran8542 5 years ago +2

    that slide with the 3 gears... lol