Neat AI does Recurrent Connections

  • Published 28 Nov 2024

COMMENTS • 19

  • @okboing
    @okboing 3 years ago +25

    This needs more recognition; I've never seen a NN loop back on itself.

    • @neatai6702
      @neatai6702  3 years ago +13

      Thanks for that. It does make a big difference for the asteroids pilot to know what it did on the previous move.

  • @Kraus-
    @Kraus- 3 years ago +6

    It's getting pretty good at blasting asteroids.

    • @neatai6702
      @neatai6702  3 years ago +2

      It is. I ran it again yesterday for a while with an improved AI, and it just blasts away for hours.

  • @typicalhog
    @typicalhog 3 years ago +5

    I'm not sure whether you're using multiple mutable activation functions, but I've got another potentially interesting idea: two new types of neurons. An integrator/accumulator neuron would sum incoming states into a "pool", and a derivator/delta neuron (I really don't know all the right math terms here) would return the change in its state's value instead of returning the state like a normal neuron does. Another possibly emergent property could arise if the derivator/delta neuron fed its absolute output value into the integrator/accumulator neuron, meaning the pair would essentially "collect" changes in values and you'd get something that measures volatility. There could also be a decay factor to prevent the values in the accumulator from going to infinity. I might do a little sketch in Paint, because I'm really bad at explaining this (see the code sketch after this thread).

    • @neatai6702
      @neatai6702  3 years ago +5

      All great ideas. Please keep them coming; I'll try to work them into future videos.

    • @typicalhog
      @typicalhog 3 years ago +2

      @@neatai6702 I also tried to draw this to explain it better, but YT seemed to auto-remove the link. ipfs . io/ipfs/QmS2ptefbYMGcLetEiiYzWibKNfBohDtJ4fYyU9kFpxJ4z?filename=2021_06_23_0rf_Kleki.png Hope this works. Also, maybe all this is completely useless; no idea, really.

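A rough Python sketch of the neuron pair proposed above, assuming simple scalar states; the class names and the decay value are illustrative, not from the video:

    class IntegratorNeuron:
        """Accumulates incoming values into a decaying pool."""
        def __init__(self, decay=0.9):
            self.pool = 0.0
            self.decay = decay  # keeps the pool from growing to infinity

        def activate(self, x):
            self.pool = self.pool * self.decay + x
            return self.pool

    class DeltaNeuron:
        """Returns the change in its input since the previous step."""
        def __init__(self):
            self.prev = 0.0

        def activate(self, x):
            delta = x - self.prev
            self.prev = x
            return delta

    # Pairing them as described: the delta's absolute output feeds the
    # integrator, giving a rough running measure of volatility.
    delta, integ = DeltaNeuron(), IntegratorNeuron()
    for x in [0.0, 1.0, -1.0, -1.0, 3.0]:
        print(integ.activate(abs(delta.activate(x))))
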
  • @dough6081
    @dough6081 3 years ago +6

    Hi! Just want to point out, before my tiny rant, that I love your videos!
    So, this isn't really how recurrent networks (RNNs) work.
    If you want to make a layer recurrent, you need to add a new hidden layer with its own weights, whose input depends on the layer you chose to make recurrent.
    Its input: at the 0th step, the input is all zeros; at every other iteration of your recurrent layer, its input is the output of your layer from the previous iteration.
    So: activation_func(W*x + W_rec*h_prev + b)
    If you are not using an alternative to backpropagation (NEAT, as used here, is such an alternative), you need to use BPTT (backpropagation through time) to train the weights of the recurrent layer. (A minimal sketch of this follows the thread below.)

    • @neatai6702
      @neatai6702  3 years ago +8

      Thanks for the comment (and the rant!). Fully agree with you on RNNs... All I'm doing here, though, is allowing a specific mutation which can add recurrent connections and seeing what the impact is. Maybe I have the naming wrong?

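For contrast with NEAT's per-connection recurrence, here is a minimal sketch of the layer-style recurrence described in this thread: a vanilla RNN step whose hidden state starts at zeros, each step mixing the new input with the previous step's output (the sizes and weight names are illustrative):

    import numpy as np

    # One step of a vanilla RNN layer: h_t = tanh(W x_t + W_rec h_{t-1} + b)
    rng = np.random.default_rng(0)
    n_in, n_hidden = 3, 4
    W = rng.normal(size=(n_hidden, n_in))          # input weights
    W_rec = rng.normal(size=(n_hidden, n_hidden))  # recurrent weights
    b = np.zeros(n_hidden)

    def rnn_step(x, h_prev):
        return np.tanh(W @ x + W_rec @ h_prev + b)

    h = np.zeros(n_hidden)  # 0th state: all zeros
    for x in rng.normal(size=(5, n_in)):  # a 5-step input sequence
        h = rnn_step(x, h)  # each step sees the previous step's output
    print(h)
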
  • @tomoki-v6o
    @tomoki-v6o 11 months ago

    Recurrent connections keep a memory of the ordering of the data,
    so to get good performance on XOR data, the only thing you need to do is shuffle the data randomly every iteration.

  • @captainjj7184
    @captainjj7184 3 months ago

    1:32 I'm gonna blow y'all's minds. This is how life just flows as fractals and everything repeats as mimicry. Dunno who or what came before us, but now we're doing what it was doing, like a domino effect, and soon we'll build another version of... us. This NEAT algorithm already prefers a solution that a lot of us used as kids in fighting games: those annoying yet effective continuous bursts of endless low kicks! It's alive 😅

  • @typicalhog
    @typicalhog 3 years ago +4

    What language are you using? Also, what GFX lib? I'm sorry if you mentioned it already or if I asked you before; I may have forgotten.

  • @typicalhog
    @typicalhog 3 years ago +3

    XOR is probably the simplest thing a non-recurrent network can learn to solve. The simplest problems I can think of that would benefit from recurrent connections might be counting to 10 or generating Fibonacci numbers (see the sketch below). If you wanted to make a video that focuses more on recurrent connections, you could do that, or even a simple memory game: say, a 4x4 grid where we try to get the AI to find all the pairs with fewer tries than just choosing random cards.

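A quick sketch of how those memory-dependent tasks could be framed as (input, target) pairs; this framing is an assumption, not something from the video:

    # Two toy tasks where the same input requires different outputs, so a
    # memoryless (non-recurrent) network cannot solve them.

    def counting_task(n=10):
        # Constant input, targets count 1..n; only recurrent state can
        # tell the steps apart.
        return [(0.0, t) for t in range(1, n + 1)]

    def fibonacci_task(n=8):
        # Input is the current Fibonacci number, target is the next one;
        # the pairs (1, 1) and (1, 2) show that memory is required.
        fib = [0, 1]
        while len(fib) < n + 2:
            fib.append(fib[-1] + fib[-2])
        return [(fib[i + 1], fib[i + 2]) for i in range(n)]

    print(counting_task(5))   # [(0.0, 1), (0.0, 2), ..., (0.0, 5)]
    print(fibonacci_task(5))  # [(1, 1), (1, 2), (2, 3), (3, 5), (5, 8)]
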
  • @Dalroc
    @Dalroc 6 months ago

    Isn't it a bit weird to feed the recurrent connection into the input layer? I feel like it should be between different hidden layers or from the output layer to a hidden layer. The input layer shouldn't be messed with.

    • @FSckaff
      @FSckaff 6 days ago

      The bias node isn’t really an input

  • @domc2909
    @domc2909 3 years ago +1

    I'm not quite following. So the previous output of the node gets fed back to the input node and added to its current value? How many times is this done? Is it just one previous value each time, or do they build up and average?
    Also, if recurrent connections connect back to the input layer, do input nodes also require activation functions? Usually they just take a scaled input and pass it on, without any squashing function applied.

    • @neatai6702
      @neatai6702  3 years ago +1

      When you're working out the total input sum to a node, it simply takes the outputs of the connections that terminate at that node and adds them up. If a connection comes from the output of a node later in the network, that value won't have been updated yet, as the input signal won't have reached it, so it's using the 'old' value of that node's output. So the node is using what it did previously as an input for what to do now; hence the memory effect of recurrent connections. (See the sketch after this thread.)
      It doesn't matter if the connection goes back to an input-layer node. You might be scaling the input signals, but the recurrent connection's component just gets added on, and the sum appears at the node's output, as there's no activation function for input-layer nodes.

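A minimal sketch of the evaluation scheme described in that reply, assuming each node stores its output between passes; the node class, the evaluation order, and the tiny example network are illustrative, not the video's actual code:

    import math

    class Node:
        def __init__(self, is_input=False):
            self.is_input = is_input
            self.output = 0.0  # persists between evaluation passes

    def tick(nodes, connections, inputs, order):
        # connections: (src, dst, weight) triples.  A connection whose
        # source comes LATER in `order` hasn't been updated yet this
        # pass, so the sum picks up its old output: the memory effect.
        for i in order:
            total = sum(nodes[s].output * w
                        for s, d, w in connections if d == i)
            if nodes[i].is_input:
                # Input node: external signal plus any recurrent
                # component just added on; no activation function.
                nodes[i].output = inputs.get(i, 0.0) + total
            else:
                nodes[i].output = math.tanh(total)
        return nodes[-1].output

    # Tiny net: input 0 -> hidden 1 -> output 2, plus a recurrent
    # connection 2 -> 0, so node 0 always sees node 2's PREVIOUS output.
    nodes = [Node(is_input=True), Node(), Node()]
    conns = [(0, 1, 1.0), (1, 2, 1.0), (2, 0, 0.5)]
    for t in range(3):
        print(t, tick(nodes, conns, {0: 1.0}, order=[0, 1, 2]))
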
  • @DavidGillespie1987
    @DavidGillespie1987 1 year ago

    Is there a place to see the code for this?

  • @jonathanwilson8809
    @jonathanwilson8809 3 years ago

    Is that poly bridge music?