Neural network learns the Mandelbrot [Part 2]

  • Published 26 Feb 2021
  • Hey all! This massive improvement in approximation came from a suggestion by reddit user u/jnbrrn: use Fourier features as the input to the network. Essentially this means that rather than giving the network an (x, y) coordinate, I am now giving it an encoding of [cos(x), sin(x), cos(2x), sin(2x), cos(3x), ...] up to order 32 of the Fourier series for both the x and y coordinates (a small illustrative sketch of this encoding appears below the description).
    This improved the approximation dramatically and significantly reduced training time. The network was trained for 150 epochs, at which point its loss plateaued regardless of the learning rate.
    The rest of the network has the same architecture as the previous attempt, though it is slightly smaller to fit on my GPU.
    Here are the video and paper that explain the theoretical basis behind this work:
    • Fourier Features Let N...
    arxiv.org/abs/2006.10739
    I believe u/jnbrrn is actually one of the co-authors of this paper, so huge thanks to them for the reference!
    The Fourier series is another robust universal function approximator, and it's fascinating to see it work so well on this problem. Hooray for calculus!
    Code: github.com/MaxRobinsonTheGrea...
    Music: / user-82623269
  • Science & technology
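
    For readers who want to see what this input encoding looks like, here is a minimal sketch of an order-32 Fourier feature mapping in Python/NumPy (the function name and exact details are my own illustration, not the repository's actual code):

        import numpy as np

        def fourier_features(x, y, order=32):
            """Encode a 2D coordinate as [cos(kx), sin(kx), cos(ky), sin(ky)] for k = 1..order."""
            k = np.arange(1, order + 1)          # frequencies 1, 2, ..., order
            return np.concatenate([
                np.cos(k * x), np.sin(k * x),    # Fourier features of the x coordinate
                np.cos(k * y), np.sin(k * y),    # Fourier features of the y coordinate
            ]).astype(np.float32)

        # One point in the plane becomes a 128-dimensional input instead of a raw (x, y) pair.
        vec = fourier_features(-0.75, 0.1)
        print(vec.shape)                          # (128,)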

COMMENTS • 19

  • @timonix2 2 years ago +2

    The Fourier transform is very useful for neural networks. I have used it to preprocess images and sound. Also, since convolution in the signal domain becomes multiplication in the frequency domain, the operations the network can perform seem more powerful. It's also what your brain does with audio input: it does not view a waveform. The ear is a kind of analog Fourier transform.
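
    For anyone who wants to see that property concretely, here is a quick NumPy check of the convolution theorem (circular convolution in the signal domain equals pointwise multiplication of the DFTs); this is an illustrative sketch, not code from the video or the comment:

        import numpy as np

        rng = np.random.default_rng(0)
        a = rng.standard_normal(64)              # two random test signals
        b = rng.standard_normal(64)
        n = len(a)

        # Circular convolution computed explicitly in the signal domain.
        direct = np.array([sum(a[j] * b[(i - j) % n] for j in range(n)) for i in range(n)])

        # The same result via pointwise multiplication in the frequency domain.
        via_fft = np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

        print(np.allclose(direct, via_fft))      # True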

  • @pedroheck3667 2 years ago

    Awesome! May I ask what tools you used for the visual representation?

  • @hanyanglee9018 2 years ago +2

    I'm curious what data structure you used in this project. The NN seems to be trained to generate the vectors one by one, and then the loss is calculated via a rasterization algorithm, which is what causes the shaking of the edges. Is that the case?

  • @Vanikicraft 1 year ago

    I feel like AI and fractals don't work well together right now, but we'll find a way to make them cooperate, and it's going to bring a huge step forward for science.
    Would be cool to create a neural network able to recognize the most similar fractal to an input image.

  • @erickmarin6147 2 years ago +3

    Because of the nature of this dataset, you should try and see if Grokking emerges

    • @moonandstars1677 2 years ago +2

      What the heck is Grokking?

    • @MiqelDotCom 2 years ago +1

      @@moonandstars1677 grok
      understand (something) intuitively or by empathy.
      empathize or communicate sympathetically; establish a rapport.

  • @EliSpizzichino 1 year ago

    Try prompting Mandelbrot, Julia, or other fractals in SD; you'll get a more organic version of known fractals

  • @user-bb3kw8gx1n 1 year ago +1

    Are you able to derive the Mandelbrot algo at some point?
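
    For reference, the escape-time iteration that generates the Mandelbrot set (z ← z² + c) looks like this; a minimal Python sketch, my own illustration rather than the project's code:

        def mandelbrot_escape_time(cx, cy, max_iter=100):
            """Return the iteration at which z escapes |z| > 2, or max_iter if it never does."""
            c = complex(cx, cy)
            z = 0j
            for i in range(max_iter):
                z = z * z + c          # the Mandelbrot iteration
                if abs(z) > 2.0:       # once |z| > 2 the orbit is guaranteed to diverge
                    return i
            return max_iter            # treated as inside the set

        print(mandelbrot_escape_time(-0.75, 0.1))   # a point near the boundary
        print(mandelbrot_escape_time(2.0, 2.0))     # escapes immediately -> 0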

  • @Graverman 2 years ago +4

    looks near perfect to me

    • @kovacsattila8993 2 years ago +1

      Near is subjective. If you compare something that has finite complexity to something that has infinite complexity, then the difference will always be infinite.

    • @Graverman 2 years ago

      @@kovacsattila8993 it is subjective, but the goal of this project was to train an AI to make a projection of the Mandelbrot set that looks good to humans, not to create an infinitely perfect Mandelbrot set

    • @kovacsattila8993 2 years ago +1

      @@Graverman yeah, but it's kind of pointless if the AI has no access to functions that would let it generate its own infinite pattern. What is the point of giving someone an impossible task? It's like asking "can you please count to infinity for me?" Where are you now? "I've reached 36 354 so far." Hmm, okay, time is up; you didn't reach infinity, but that's impossible anyway, so you did well with just 36 354. If you really want to ask an AI to count to infinity, then give it access to loops and building blocks, or non-real numbers to work with, you know what I'm saying.

    • @Graverman 2 years ago

      @@kovacsattila8993 but the goal was to create sets that look good to the human eye, not a perfect set; if you wanted a set that's 99.999999999999% accurate, AI isn't even the way to go, and calculating it with a math program is way more efficient

    • @kovacsattila8993 2 years ago +1

      @@Graverman Then why doesn't the AI use the math program in the first place?

  • @GinoGiotto 1 year ago

    A smarter approach would be to train the AI on differently zoomed sections of the Mandelbrot set and use stable diffusion to "guess" what it would look like if you zoomed in further.

  • @axitc 1 year ago

    this scares me for some reason...