Finn Eggers
Using NEAT to play Snake
I used NEAT to evolve a network that learns to play Snake.
You can download the code including the libraries here:
github.com/Luecx/SnakeAI
Views: 7,969

Videos

Using a Genetic Algorithm to learn a smaller version of BlockuDoku
Views: 6K · 4 years ago
AI Library: github.com/Luecx/AILibrary/blob/master/src/boids_model/BoidSwarm.java The game: github.com/Luecx/SudoBlock
Neat - Java Implementation 8 - Evolving
Views: 3.3K · 5 years ago
source code: github.com/Luecx/NEAT/tree/master/vid 7 and 8/src
Neat - Java Implementation 9 - Optimization
Views: 2.6K · 5 years ago
source code: github.com/Luecx/NEAT/tree/master/vid 9/src
Neat - Java Implementation 7 - Client and Species
Views: 2.4K · 5 years ago
source code: github.com/Luecx/NEAT/tree/master/vid 7 and 8/src fixed Genome code: github.com/Luecx/NEAT/blob/master/vid 7 and 8/src/genome/Genome.java
NEAT - Java Implementation 6 - Calculating
Views: 2.6K · 5 years ago
Source code: github.com/Luecx/NEAT/tree/master/vid 6/src
Neat - Java Implementation 5 - Mutations
Views: 3.2K · 5 years ago
Full source code: mega.nz/#!WjRWAAwA!JyFKkR_AJREKlyQz8ZQWK_mV2JSJbdxWrxIPd4Ughtw Only frame classes: mega.nz/#!jzRSQIyZ!n8_OTdO5krupybdbq2ww3_7eLhNwnUWREX0lF2Y9VGU add_sorted method: mega.nz/#!u7ZG1aZT!reTMP5CCAgxXBvRj6iFafA7S59a2fcO31-EgD-rReig
Neat - Java Implementation 4 - Distance function and crossover
Views: 3.7K · 5 years ago
source code: mega.nz/#!W3RxGIKa!Ajx2r9hwYQc3fT-Pt0ntSnCI1raXCTqLjPieur5m_D8
Neat - Java Implementation 3
Views: 4.6K · 5 years ago
Source code: mega.nz/#!22RFXSaI!BPk7-xoy7e-UlKHGEJ3h6vew95XjpvOaslon0FJpc7E
Neat - Java Implementation 2
Views: 5K · 5 years ago
Source files: mega.nz/#!mjAkCKZZ!fOT1jgTjQb3NQMt9wFmRdDJNBAbbaYbYAAAAAAAAAAA
Neat - Java Implementation 1
Views: 11K · 5 years ago
I am back after a while. Hope you enjoy it. The next videos are coming in the following days. Feel free to ask any questions in the comments below. Code download: mega.nz/#!biAG1KrI!fOT1jgTjQb3NQMt9wFmRdDJNBAbbaYbYAAAAAAAAAAA
NEAT - Introduction
Views: 84K · 6 years ago
Please give me some feedback. Again, my mic quality is not amazing, but I hope you are fine with that. MarI/O: ua-cam.com/video/qv6UVOQ0F44/v-deo.html
Genetic algorithm - 4: Flappy bird
Views: 1.5K · 6 years ago
Full code: www.mediafire.com/?179s7f7w36he3 Again, my mic just messed up a little bit. I am really sorry for that.
Genetic algorithm - 2: Implementation 1
Views: 971 · 6 years ago
My mic just screwed up at the beginning. So NO, that's not me saying some weird words. Full code in the last video :)
Genetic algorithm - 3: Implementation 2
Views: 596 · 6 years ago
Genetic algorithm - 1: Introduction
Views: 2.8K · 6 years ago
Activation Functions - Softmax
Views: 36K · 6 years ago
NN: The Problem of the vanishing gradient
Views: 878 · 6 years ago
NN - Activation Functions
Views: 1.8K · 6 years ago
Neural networks library [Java] 6 - 3D (2D) MNIST
Views: 854 · 6 years ago
Neural networks library [Java] 5 - Transformation Layer
Views: 394 · 6 years ago
Neural networks library [Java] 4 - Dense Layer
Views: 607 · 6 years ago
Neural networks library [Java] 3 - Input and Output
Views: 541 · 6 years ago
Neural networks library [Java] 2 - Layer construct
Views: 932 · 6 years ago
Neural networks library [Java] 1 - Structure
Views: 1.6K · 6 years ago
Neural networks tutorial: Fully Connected 11 [Java] - Some projects
Views: 5K · 6 years ago
Neural networks tutorial: Fully Connected 10 [Java] - Saving and loading
Views: 4.2K · 6 years ago
Neural networks tutorial: Fully Connected 9 [Java] - Mnist dataset
Views: 9K · 7 years ago
Neural networks tutorial: Fully Connected 8 [Java] - Advanced learning
Views: 6K · 7 years ago
Neural networks tutorial: Fully Connected 7 [Java] - Backpropagation implementation
Views: 11K · 7 years ago

COMMENTS

  • @metlov · 3 months ago

    Why not pass the average weight of the parents to the offspring? Doesn't it improve the network diversity compared to copying the genes of only one parent?
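
    For what it's worth, the original NEAT paper allows both options for matching genes (same innovation number in both parents): copy the weight from a random parent, or "mate by averaging". A minimal sketch of the two modes (nothing here is from the video's codebase):

        // Weight inheritance for a matching gene during crossover.
        static double inheritWeight(double w1, double w2, boolean average, java.util.Random rng) {
            if (average) return (w1 + w2) / 2.0;    // "mate by averaging"
            return rng.nextBoolean() ? w1 : w2;     // copy from a random parent
        }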

  • @liam9269 · 4 months ago

    Hello Finn, I'm trying to implement Pac-Man with NEAT and I really struggle with it. If you can help me I would very much appreciate it, please let me know.

  • @aligranett6355 · 5 months ago

    Does anyone know why, in the last slide, he killed 6 with fitness 442 but did not kill 5 and 7?

  • @Dtomper · 5 months ago

    THANK YOU. This presentation was AWESOME, I understood it very well. Thank you so much!

  • @chameleonchamlee2551 · 9 months ago

    Love you, needed this so much!

  • @Dalroc · 10 months ago

    Everything is great except for the part about speciation. And it's not because of the animation! You're trying to be specific while also referring to future videos for the details you're being specific about. Don't repeat it five times. Just say that you categorize the genomes by a certain method that you'll show later; you'd skip a lot of confusion, and time!

  • @IgorSantarek · 10 months ago

    Wow, great idea with the inputs to the neural network. I was looking for info on how to do it!

    • @finneggers6612 · 10 months ago

      I have worked on this again with a C++ project and a better NEAT implementation and found a better way; maybe it is of interest for you. Instead of just doing that, create a space of like 7x7 around the snake's head. The important part is to rotate that data based on the direction of the snake: basically pretend that it is not the snake changing directions, but that it is always going up while the board rotates. This way the output is either left, forward or right and is relative to the POV of the snake. I got a perfect game with that.
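
      A minimal sketch of that rotation trick, assuming a square window of cell values and a heading encoded as 0 = up, 1 = right, 2 = down, 3 = left (none of this is taken from the actual repository):

          // Rotate the sampled window counterclockwise once per heading step,
          // so the snake always "looks up" and the outputs mean
          // left/forward/right relative to the snake.
          static double[][] rotateToHeading(double[][] window, int heading) {
              double[][] out = window;
              for (int r = 0; r < heading; r++) out = rotateCcw(out);
              return out;
          }

          // Standard 90-degree counterclockwise rotation of a square matrix.
          static double[][] rotateCcw(double[][] m) {
              int n = m.length;
              double[][] r = new double[n][n];
              for (int i = 0; i < n; i++)
                  for (int j = 0; j < n; j++)
                      r[n - 1 - j][i] = m[i][j];
              return r;
          }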

  • @NameLast-wm5je · 11 months ago

    Pronunciation, dang it... You said "gõringurschaaan" instead of "derivation". In instances such as this you might be causing someone to waste days because they heard you say something incorrect. Enunciate, please.

    • @Klabauterking · 6 months ago

      Gotta agree on this one, the pronunciation is rough! If you start making tutorials again, you might want to check out how natives pronounce words, or simply ask DeepL or something :) Also, why don't you record chunks instead of the whole thing at once? This way you can explain small pieces and make them sound proper, instead of getting confused with words and "umms" all the time.

  • @nuke_bird · 11 months ago

    I'm not sure why you're using an ArrayList<T> and a HashSet<T> at the same time. IMHO a single HashSet<T> will work just fine.
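
    A plausible reason for the pairing, sketched here as a guess rather than taken from the video: the ArrayList provides O(1) access by index, which is needed to pick a uniformly random gene during mutation, while the HashSet provides O(1) duplicate checks. A HashSet alone cannot return a random element in O(1).

        import java.util.*;

        class RandomHashSet<T> {
            private final List<T> data = new ArrayList<>();
            private final Set<T> set = new HashSet<>();

            boolean add(T t) {
                if (!set.contains(t)) {   // O(1) duplicate check
                    set.add(t);
                    data.add(t);
                    return true;
                }
                return false;
            }

            T randomElement(Random rng) { // O(1) uniform random pick
                return data.isEmpty() ? null : data.get(rng.nextInt(data.size()));
            }
        }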

  • @okonkwo.ify18 · 1 year ago

    There's no problem with sigmoid; all activation functions have their uses.

  • @samuelcrawford8055 · 1 year ago

    Sorry, I'm having a little trouble understanding how to ensure that the nodes have consistent IDs. Should the function that creates them take as arguments the IDs of the input and output nodes of the connection it is splitting? What about more complex structures with many hidden nodes that interact? Generally though, great video. Definitely earned a like and subscribe!

  • @nuriel3 · 1 year ago

    GREAT VIDEO, thank you!

  • @WAXenterprises · 1 year ago

    Great overview, thanks for this

  • @adonisssssssssssss · 1 year ago

    Can you provide the whole code for this project? I am interested in rewriting it in C++.

  • @o_2731 · 1 year ago

    Thank you very much for this introduction, it was very helpful.

  • @rafe_3d160 · 1 year ago

    Hi, I'm writing my term paper on the learning process of artificial intelligences with NEAT. At 12:13 you explain mutate_node, but in the official NEAT docs I only find mutate_add_node and mutate_delete_node. Have there been updates in that regard? Furthermore, I find no information at all in the docs about the other mutations, such as mutate_weight_shift. Have these perhaps been renamed, or were they replaced entirely? Many thanks in advance.

  • @gijsvermeulen1685 · 1 year ago

    Thanks a lot!

  • @vladislavchessmate1567 · 1 year ago

    Finn, thanks for the great video series about networks. Love your engine!

  • @minhtaihoang3020 · 1 year ago

    I found a problem: if you set the initial weights randomly from 0 to 1, the network breaks, because the weighted sum (weights * prevNeuron) is so high that output_derivative is approximately 0. Hence delta will be 0, and the weights and biases will not be updated.
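
    A sketch of the usual fix (the method name is an assumption, not from the tutorial code): initialize the weights symmetrically around zero, ideally scaled by the fan-in, so the weighted sum stays in the steep region of the sigmoid and its derivative sigma(x) * (1 - sigma(x)) does not vanish.

        import java.util.Random;

        // Uniform Xavier-style initialization in [-1/sqrt(fanIn), 1/sqrt(fanIn)].
        static double[] initWeights(int fanIn, Random rng) {
            double[] w = new double[fanIn];
            double bound = 1.0 / Math.sqrt(fanIn);
            for (int i = 0; i < fanIn; i++)
                w[i] = (rng.nextDouble() * 2 - 1) * bound;
            return w;
        }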

  • @ProBarokis · 1 year ago

    Hello. What resources did you use to learn NEAT? Have you only read the original paper, or are there other great sources to learn from?

  • @sebaaismail1951 · 1 year ago

    Hello Finn, I will start learning neural networks from this series of videos. I know little about them, but I am a good Java programmer and I have a good background in AI; I have worked before with ACO (ant colony optimization) algorithms.

  • @amir3645 · 1 year ago

    thanks man

  • @teenspirit1 · 1 year ago

    Ooooh only one representative client instead of comparing to every member of the species. Brilliant idea, thanks.

  • @filakabamba9584 · 1 year ago

    How can I do the following? Please help ASAP. 1. Make an artificial neural network with dynamic input and binary output. 2. Make a self-organizing map with dynamic input and binary output. Use only C++ or Java.

  • @ninek1902 · 1 year ago

    Hey, nice presentation! Could you please provide a link to the formula that calculates genome distances, to sort genomes into species? Thanks!

    • @finneggers6612 · 1 year ago

      This is actually a part that the original paper left pretty open. I did some further research and also asked on StackExchange, but I am unable to find it. I don't remember the exact method either, but I think I am doing something like this: the distance of a genome to a species is the distance of the genome to the representative of the species, which I consider to be the FIRST one to enter the species. For each genome g: go through each existing species s; if distance(g, s) < some threshold, add g to s and break. If no species is found, create a new species with g as the representative.
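
      The assignment loop described above, as a minimal sketch (the Genome and Species classes and the distance(...) helper are assumed, not taken from the repository):

          void speciate(List<Genome> population, List<Species> speciesList, double threshold) {
              for (Genome g : population) {
                  boolean placed = false;
                  for (Species s : speciesList) {
                      // distance to a species = distance to its representative,
                      // i.e. the first genome that founded the species
                      if (distance(g, s.representative()) < threshold) {
                          s.add(g);
                          placed = true;
                          break;
                      }
                  }
                  if (!placed)
                      speciesList.add(new Species(g)); // g becomes the representative
              }
          }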

  • @davidsimek1197 · 2 years ago

    I love how the voice and its volume always change.

  • @HarryBGamer2570 · 2 years ago

    well, that's neat

  • @omkarbhale442 · 2 years ago

    Hey, great series! Love it. One thing though: I think you got the batch training concept wrong. Batch training is done so that the weights are NOT updated as we train each input over the batch. We accumulate the delta weights and update them all at the end of the batch. The reason we do this is that our final aim is to minimize the cost (error) over the whole dataset (which is hard, hence batches). If this is confusing, I'm not sure how to explain it better, but you can check 3Blue1Brown's four-video playlist on neural networks; he explains this in the 3rd or 4th video, I think. Good luck, all.

    • @finneggers6612 · 2 years ago

      You are correct, I am sorry for this. It has been a long time and I realized this was very wrong :P
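
      The correction above, as a sketch (the flat weight vector and the backprop(...) helper returning a per-sample gradient are assumptions): gradients are accumulated across the batch, and the weights receive a single update at the end.

          double[] weights;     // network parameters (assumed representation)
          double learningRate;

          void trainBatch(double[][] inputs, double[][] targets) {
              double[] gradAccum = new double[weights.length];
              for (int s = 0; s < inputs.length; s++) {
                  double[] grad = backprop(inputs[s], targets[s]); // gradient only, no update here
                  for (int i = 0; i < grad.length; i++)
                      gradAccum[i] += grad[i];                     // accumulate over the batch
              }
              for (int i = 0; i < weights.length; i++)
                  weights[i] -= learningRate * gradAccum[i] / inputs.length; // one update per batch
          }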

  • @bernardoolisan1010 · 2 years ago

    Sorry for asking so many questions, but these are the last ones... 1. How is the selection of the species made? How do you score the results to then mutate that selection? 2. What data do the nodes (neurons) contain? Do they contain the weighted sum, like a normal neuron does? 3. How many types of NEAT algorithms are there? 4. I read in the paper that there are some formulas you didn't mention; what are those, like the fitness formula, and why are those formulas useful?

  • @bernardoolisan1010 · 2 years ago

    I have a question. How does the encoding help us? Does the NEAT encoding scheme only help us to visualize things in a genetic form, or is there another use? Can you explain it to me?

    • @finneggers6612 · 2 years ago

      The encoding is the principle of how genomes are compared, which then serves for speciation.

  • @inanitas · 2 years ago

    I noticed you have a lot of getters and setters. I would suggest looking at the project Lombok. You can generate getters and setters by using annotations. You can even generate constructors that initialize final values by using the @AllArgsConstructor annotation before the class, or generate data classes. There are more things it can do, but that's the basics :)

  • @ArMeD217 · 2 years ago

    The presentation was good, but I believe your explanation suffered from the lack of video editing. Some cuts could have made it all clearer and shorter.

  • @diegocassinera · 2 years ago

    Awesome work. After evolving, purging old links and nodes not used by any genome improves performance quite a lot.

  • @diegocassinera · 2 years ago

    I got it working; the missing files are not needed. I also implemented a thread pool so rateClient can be parallelized. As for the game itself, I changed the inputs: [0-7] are the number of free spaces in a direction (n, nw, w, sw, s, se, e, ne), plus [8], which is still the direction towards the food. A bit better results than with just 4 inputs, and a slower evolve time. The best result I got was 109, starting with a genome where all outputs are connected to all inputs and only weight and weight-shift mutations were allowed. Again, thank you for taking the time to post this.
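
    A sketch of that input encoding under the same assumptions (a square boolean occupancy grid; y grows downward): each ray counts free cells from the head until it hits a wall or the snake's body.

        // Direction offsets in the order given above: n, nw, w, sw, s, se, e, ne.
        static final int[][] DIRS = {
            {0,-1}, {-1,-1}, {-1,0}, {-1,1}, {0,1}, {1,1}, {1,0}, {1,-1}
        };

        static double[] buildInputs(boolean[][] occupied, int headX, int headY) {
            double[] in = new double[DIRS.length];
            int size = occupied.length;
            for (int d = 0; d < DIRS.length; d++) {
                int x = headX + DIRS[d][0], y = headY + DIRS[d][1];
                while (x >= 0 && y >= 0 && x < size && y < size && !occupied[x][y]) {
                    in[d]++;                 // one more free cell along this ray
                    x += DIRS[d][0];
                    y += DIRS[d][1];
                }
            }
            return in; // the food-direction input is appended separately
        }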

  • @bribes_for_nouns · 2 years ago

    Best explanation on the internet. I'm doing an ecosystem project right now and tried to look up and understand NEAT algorithms, but they were all far too technical for me; the way you explained it gave me hope that I can implement this step by step. Thank you. Right now I have a simple feedforward network with no hidden layers for each of the creatures (just input/output and the directions they move). I suppose the next step would be to make it so the network prototype has some type of static method to generate a new hidden neuron/connection. This is going to be tough. Even still, this type of genetic algorithm fused with NNs interests me way more than gradient descent/backpropagation calculus, so I think this path will be worth it for me in the long run, as I just find this topic so much more interesting. It would be great if this algorithm could be optimized even further somehow.

    • @finneggers6612 · 2 years ago

      This algorithm has been improved even further :) The algorithm is still the same, although it can be applied to larger networks. It's called HyperNEAT. HyperNEAT does basically the same with the genomes, although they represent different information. I haven't looked into HyperNEAT in depth, but you may want to do that. It scales way better with larger networks.

    • @bribes_for_nouns · 2 years ago

      @@finneggers6612 I'll definitely check that out after I implement the simpler version first! Question: I'm still in the beginning stages and I have a Neuron class and a Brain class configured. The Neuron can store a connection object, and the Brain uses static methods to generate the initial network and has an instance method to form a connection between two neurons. In your instructions you mentioned the importance of the "innovation number". In my neuron's connection object, I have both an innovation number and a path value like [1-6] showing which neurons are connected. I'm having a hard time distinguishing why I would need both a path and an innovation number. Would just having the path stored be sufficient to check whether a previous connection has been made in the global brain, or does the innovation number/id play some type of important role later on, since it just keeps incrementing over time? Also, can connections only occur one layer up in this algorithm? Meaning a neuron at layer 1 can only connect with a hidden neuron at layer 2, and not bypass it and connect with a neuron at layer 3? The inputs start out connecting directly with the outputs with no hidden layer, but if a hidden neuron is dynamically created, do they have to go through the hidden neuron to get to the output layer?

    • @finneggers6612 · 2 years ago

      @@bribes_for_nouns The innovation number plays an important role in computing the distance function between two genomes and sorting them into the same "species". Connections are basically just a computational path between two neurons; the connection itself does not give any information about which connection is "older". With "older" I mean which connection was created first. Generally, older connections are weighted differently compared to newer ones. That's why the innovation number plays an important role. Also, NEAT does not know anything about layers; neurons can be created by splitting connections. As far as I know, HyperNEAT works with layers.
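
      For reference, this is the compatibility distance from the original NEAT paper (Stanley and Miikkulainen, 2002) that those innovation numbers feed into:

          delta = c1 * E / N + c2 * D / N + c3 * Wbar

      where E is the number of excess genes, D the number of disjoint genes (both determined by comparing innovation numbers), Wbar the average weight difference of matching genes, N the gene count of the larger genome, and c1, c2, c3 tuning coefficients.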

  • @diegocassinera · 2 years ago

    It looks like in your while loop walking the connections, after the if (in1 == in2) and before the if (in1 > in2), you are missing an else. Otherwise, when in1 == in2, index_g1 will be incremented twice: once for the equality and once for not being greater, since both ifs will be executed in each iteration. I haven't finished watching the rest of the series, so the suspense is building up. Anyway, great series.
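
    The corrected loop shape the comment suggests, as a sketch (variable names follow the comment; the surrounding types are assumed):

        while (index_g1 < genes1.size() && index_g2 < genes2.size()) {
            int in1 = genes1.get(index_g1).getInnovation();
            int in2 = genes2.get(index_g2).getInnovation();
            if (in1 == in2) {          // matching gene: advance both indices
                index_g1++;
                index_g2++;
            } else if (in1 > in2) {    // "else if" prevents the double increment
                index_g2++;            // disjoint gene of genome 2
            } else {
                index_g1++;            // disjoint gene of genome 1
            }
        }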

  • @teenspirit1 · 2 years ago

    I love the work and I definitely want to watch through the series. I am especially confused about the "calculating" part, because the random mutations cause cycles in my graphs. But why do you have Jordan Peterson in your NEAT playlist? I don't have a thing against the guy, but it looks out of place.

    • @finneggers6612 · 2 years ago

      Yeah, I had that problem too. I solved it by assigning an x coordinate: input nodes had an x value of 0 and output nodes an x value of 1. I only allowed new connections from a node with a smaller x to one with a higher x. This solves the problem entirely.
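
      A minimal sketch of that rule (the Node type and the halfway placement for split nodes are assumptions): only left-to-right connections are allowed, which keeps the network a DAG by construction.

          record Node(double x) {}   // 0.0 for inputs, 1.0 for outputs

          static boolean canConnect(Node from, Node to) {
              return from.x() < to.x(); // forward connections only: no cycles
          }

          static Node nodeFromSplit(Node from, Node to) {
              return new Node((from.x() + to.x()) / 2.0); // between the endpoints
          }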

  • @hulohai · 2 years ago

    Thank you!

  • @anthonyh694 · 2 years ago

    very good explanation

  • @l.-.l533 · 2 years ago

    I used NEAT to evolve my COC

  • @l.-.l533 · 2 years ago

    I suck at chess, and checkers

  • @l.-.l533 · 2 years ago

    Guys got nothing on me

  • @l.-.l533 · 2 years ago

    Losers

  • @optimusagha3553 · 2 years ago

    Thanks, easy to follow👏🏾👏🏾

  • @yfrite · 2 years ago

    Hi, I do not know if you are still alive, but judging by your SOF, you still are =) Why did you decide not to use your NEAT implementation?

  • @shreevathsagp6789 · 2 years ago

    This is amazing; you really are good at teaching these things. I would love to see a similar series on reinforcement learning methods like actor-critic or deep Q-learning.

  • @stashmm · 2 years ago

    thanks

  • @user-hf3fu2xt2j · 2 years ago

    Why do you assign zero to the replacement index right after getting a replacement index from neat? It's always zero in that case (0:40).

  • @Applestaffman · 2 years ago

    Hi Finn, the code cannot be unzipped on my side. Is there any other way to access the code?