Ard Louis: Simplicity Bias in Machine Learning

  • Published 25 Oct 2024

COMMENTS • 1

  • @PaulPukite · 1 year ago +1

    Very good. Takeaways, as related ideas: the Principle of Maximum Entropy gives Zipf's-law scaling, with the density of states set by uncertainty in statistical moments such as the mean and variance; the idea of parsimony is Occam's razor; see also Gell-Mann's complexity arguments; and non-linear functions such as sinusoids carry a lot of fitting power because they are simply described yet have an infinite Taylor series expansion -- like the layers of a NN.
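
A minimal Python sketch (an illustrative example, not from the talk or the comment) of that last point: sin(x) has a one-line generating rule, yet reproducing its behaviour with a truncated Taylor expansion takes many terms, echoing the "simple description, rich function" idea.

    import math

    def taylor_sin(x, n_terms):
        """Truncated Taylor series of sin(x) about 0: sum of (-1)^k x^(2k+1) / (2k+1)!"""
        return sum((-1) ** k * x ** (2 * k + 1) / math.factorial(2 * k + 1)
                   for k in range(n_terms))

    x = 3.0  # far enough from 0 that low-order truncations are visibly poor
    for n in (1, 2, 4, 8):
        approx = taylor_sin(x, n)
        print(f"{n:2d} terms: {approx:+.6f}  (error {abs(approx - math.sin(x)):.2e})")
    # The rule generating sin(x) is tiny, but capturing the function needs
    # many expansion terms -- a low-complexity description of a rich function.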