The Learning With Errors problem

  • Published 11 Oct 2024

COMMENTS • 3

  • @ThefamousMrcroissant (1 year ago, +1)

    The example of LWE is marvelous, though. A bit heavy on the rounded Gaussian maybe (which didn't feel important enough to take up a third of the entire lecture), but overall easy to grasp.
    Thanks for publishing this to YouTube as well.

  • @aspidistrax_x2722 (7 months ago)

    Thank you for your awesome video. I only wish you had added an example of asymmetric LWE at the end. I'm a bit lost.
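    [Editor's note: since the video apparently stops short of a worked asymmetric example, here is a minimal sketch of a Regev-style LWE public-key bit encryption. The parameters (n=8, m=20, q=97, errors in {-1, 0, 1}) and all names are toy choices of my own, not taken from the video; real schemes use much larger parameters and discrete Gaussian errors.]

    ```python
    import random

    def keygen(n=8, m=20, q=97):
        """Regev-style keypair: secret s, public (A, b = A*s + e mod q)."""
        s = [random.randrange(q) for _ in range(n)]
        A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
        e = [random.choice([-1, 0, 1]) for _ in range(m)]      # small errors
        b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q
             for i in range(m)]
        return (A, b, q), s

    def encrypt(pk, bit):
        """Sum a random subset of sample rows; encode the bit near q/2."""
        A, b, q = pk
        r = [random.randrange(2) for _ in range(len(A))]       # subset selector
        u = [sum(r[i] * A[i][j] for i in range(len(A))) % q
             for j in range(len(A[0]))]
        v = (sum(r[i] * b[i] for i in range(len(A))) + bit * (q // 2)) % q
        return u, v

    def decrypt(sk, ct, q=97):
        """v - <u, s> leaves only the accumulated error plus bit*(q//2)."""
        u, v = ct
        d = (v - sum(ui * si for ui, si in zip(u, sk))) % q
        return 1 if q // 4 < d < 3 * q // 4 else 0             # round to 0 or q/2
    ```

    With these toy parameters the total accumulated error is at most m = 20 < q/4, so decryption always recovers the bit; that margin is exactly why the error distribution has to stay small relative to q.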

  • @ThefamousMrcroissant (1 year ago, +2)

    This has to be the most confusing explanation of Gaussian elimination I've ever seen. For example, in the slide at 4:28, what are the indices here? In step 1 they are (column, row), but in step 3 they're suddenly rows? This is one of the extremely few instances where Wikipedia is actually way more understandable than an introductory video.
    At around 6:13 you reduce row 1 by subtracting row 2's coefficients from it twice, but how does that result in -2 when the residue class is already at 0 at that point?
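    [Editor's note: on the second question, an intermediate value of -2 and q - 2 name the same residue class mod q, so a -2 appearing mid-elimination is not a contradiction, just an unreduced representative. Without the error terms, an LWE-style system A*s ≡ b (mod q) is solvable by ordinary Gaussian elimination with every arithmetic step reduced mod q. A minimal sketch, assuming a prime modulus; the function name and parameters are my own, not from the video:]

    ```python
    def gauss_solve_mod(A, b, q):
        """Solve A*x = b (mod q) by Gauss-Jordan elimination, q prime."""
        n = len(A)
        M = [row[:] + [bi] for row, bi in zip(A, b)]           # augmented matrix
        for col in range(n):
            # find a pivot row with a nonzero (hence invertible) entry
            piv = next(r for r in range(col, n) if M[r][col] % q != 0)
            M[col], M[piv] = M[piv], M[col]
            inv = pow(M[col][col], -1, q)                      # modular inverse (Python 3.8+)
            M[col] = [x * inv % q for x in M[col]]             # scale pivot row to 1
            for r in range(n):                                 # clear the column elsewhere
                if r != col and M[r][col] % q != 0:
                    f = M[r][col]
                    M[r] = [(x - f * y) % q for x, y in zip(M[r], M[col])]
        return [M[r][n] for r in range(n)]
    ```

    For example, with q = 97, A = [[2, 5], [1, 3]] and secret s = [4, 7], we get b = [43, 25], and `gauss_solve_mod(A, b, 97)` recovers [4, 7]. This is precisely why LWE adds the error term: without it, the secret falls to textbook linear algebra.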