I Made an AI with just Redstone!

  • Published 21 Nov 2024

COMMENTS • 2.8K

  • @mattbatwings
    @mattbatwings  6 місяців тому +657

    To try everything Brilliant has to offer, free, for a full 30 days, visit brilliant.org/mattbatwings
    You’ll also get 20% off an annual premium subscription.

    • @kraralmosawi7843
      @kraralmosawi7843 6 місяців тому +7

      ok man we got to address your genius 💀😭🙏

    • @maxtres764
      @maxtres764 6 місяців тому +2

      "Why should we try brilliant if we have you?"-me

    • @YSCU261
      @YSCU261 6 місяців тому +6

      i mean, neural networks, brilliant, it's all connected

    • @DieNow
      @DieNow 6 місяців тому +2

      Did you use a CNN to reduce the MNIST images and then input the weights into a feed-forward neural network?

    • @gameingroom5829
      @gameingroom5829 6 місяців тому +1

      Thanks I love brilliant

  • @CraftyMasterman
    @CraftyMasterman 6 місяців тому +12183

    if you guys think this is insane, it took this guy like 2 weeks to make all of this, start to finish. this man is a MACHINE

  • @wiki2014
    @wiki2014 6 місяців тому +8627

    ChatGPT playing minecraft: ❌️
    Minecraft running ChatGPT: ✅️

    • @alibrahym
      @alibrahym 5 місяців тому +97

      Yeah bro, they'll make a server, recreate the internet, then someone will rebuild chatgpt with redstone, make it learn a lot, and people would be able to use it. The problem is that redstone is very slow, so they'd have to speed up time so much that it even responds in an "ok" time.

    • @Centorym
      @Centorym 5 місяців тому +41

      Someone NEEDS to make a ChatGPT in minecraft, I don't care if it uses command blocks, it would be so cool!

    • @tung-hsinliu861
      @tung-hsinliu861 5 місяців тому +91

      @@Centorym The GPT language models are so huge that, if we converted the whole model into redstone, the scale of the redstone machine would be so large it wouldn't even fit within render distance!
      For comparison, the chatGPT model is somewhere around 10 million~10 billion times larger than the number-recognition model.
      Yeah, I think command blocks are the only way to go, but even then the amount of command blocks would be monumental!
      And the labor of copying the entire model by hand...
      I think the conversion process has to be automated to be feasible

    • @Ari_Fudu
      @Ari_Fudu 5 місяців тому +20

      @@tung-hsinliu861 then we must settle for a very barebones version that has predetermined responses - although that'll be more of a magic 8ball ngl

    • @crispinotechgaming
      @crispinotechgaming 5 місяців тому +44

      @@Ari_Fudu but then it's not a neural network

  • @chaosinsurgency884
    @chaosinsurgency884 5 місяців тому +790

    Your transcript for college, internships, and future jobs in computer science is gonna be so stacked

    • @gryphonvalorant
      @gryphonvalorant 3 місяці тому +15

      nice pun

    • @tanawatjukmongkol2178
      @tanawatjukmongkol2178 3 місяці тому +22

      @@gryphonvalorant That's what we call a "Stack overflow"
      BA DUM TSSS
      I'm a manpage kinda guy lol

    • @SimoneBellomonte
      @SimoneBellomonte 2 місяці тому

      @@tanawatjukmongkol2178 Pfp (Profile Picture) and / or Banner Sauce (Source [Artist])? 🗿

    • @tanawatjukmongkol2178
      @tanawatjukmongkol2178 2 місяці тому +2

      @@SimoneBellomonte Murakami Shiina. I don't watch the anime, but it be funny having an anime profile pic carrying a C programming book. I'm a great proponent of "Advanced Programming in the Unix Environment" though (not the book she's holding). It helped me get through the times when I had to write my own shell and readline in C from scratch as a school project.

    • @Blue_Guy_The
      @Blue_Guy_The Місяць тому +4

      Imagine in an interview: “do you have any achievements?” and you respond “I made an ai” “oh, which one” “in Minecraft” “what”

  • @electricitysgaming5383
    @electricitysgaming5383 4 місяці тому +2390

    10 neurons? Bro just built me

    • @priyank5161
      @priyank5161 4 місяці тому +76

      Oh that means 10 of me = 1 of u

    • @Sphinxery101
      @Sphinxery101 4 місяці тому +23

      @@priyank5161 Oh that means 10 of me = 1 of you (100 of the original commenter)

    • @priyank5161
      @priyank5161 4 місяці тому +28

      @@Sphinxery101 woah how u only have 1/10 of neuron?

    • @Sphinxery101
      @Sphinxery101 4 місяці тому +18

      @@priyank5161 si

    • @AlexanderVonish
      @AlexanderVonish 4 місяці тому +18

      @@priyank5161rip, 9/10 of their Neuron got paywalled.

  • @kindstranger3871
    @kindstranger3871 5 місяців тому +275

    I remember having my mind blown when I saw the first working computer in minecraft... the redstone was so enormous for the time. To see neural networks in minecraft a little over 10 years later is truly staggering. I'm no one of any real note but I just want you to know that you have impressed me and I am not easily impressed.

    • @Pwassoncru
      @Pwassoncru 2 місяці тому +4

      While I agree it’s impressive, a computer is much harder to build than a basic neural network (which is still very impressive).
      The hard part about AI has always been the training. The evaluation at the end is quite trivial and, as he showed, is mainly basic multiplications and additions.

    • @j_evgenyyyy
      @j_evgenyyyy 2 місяці тому +1

      @Pwassoncru Imagine if you could train a model right in the game)

    • @gamer__dud10
      @gamer__dud10 Місяць тому

      ❤😊😊😊🎉🎉🎉😊❤🎉
      Truly truly i say to you all Jesus is the only one who can save you from eternal death. If you just put all your trust in Him, you will find eternal life. But, you may be ashamed by the World as He was. But don't worry, because the Kingdom of Heaven is at hand, and it's up to you to choose this world or That / Heaven or Hell.
      I say these things for it is written:
      "Go therefore and make disciples of all nations, baptizing them in the name of the Father and of the Son and of the Holy Spirit, *teaching them* to observe all that I have commanded you; and behold, I am with you always, even to the end of seasonal". Amen."
      -Jesus
      -Matthew 28:19-20

  • @NoahWolfe
    @NoahWolfe 6 місяців тому +517

    You solved a number of difficult problems elegantly, but your amazing ability to communicate those ideas both visually and with narrative ease really stands out. Fantastic piece of content my dude.

  • @Ierzi
    @Ierzi 6 місяців тому +3048

    This was 100% a brilliant partnership

  • @matercan5649
    @matercan5649 6 місяців тому +607

    The internet is such a cool place. Imagine having a degree and choosing to use it to build real video games and software inside minecraft and share that for a job, instead of actually building the video games and software and making a living from that. The internet is so cool.

    • @Louis13XIII
      @Louis13XIII 6 місяців тому

      Gaming companies are so scummy and exploitative that honestly that ain't really a bad deal after all

    • @VortexFlickens
      @VortexFlickens 5 місяців тому +7

      A forum for all ppl from stupid kids to Elon Musk

    • @watema3381
      @watema3381 5 місяців тому

      @@VortexFlickens Not much of a flattering comparison for stupid kids don't ya think?

    • @Esiv0_
      @Esiv0_ 5 місяців тому +23

      @@VortexFlickens you said stupid kids twice

    • @Meyer-gp7nq
      @Meyer-gp7nq 5 місяців тому

      Wow look at the stupid kids hating on Elon cause he’s successful. Someone made a joke, cope

  • @jakestrouse12
    @jakestrouse12 3 місяці тому +116

    In a couple years I wouldn’t be surprised to see a video from you about building a LLM in Minecraft

    • @ligma445
      @ligma445 2 місяці тому +2

      this is basically that tho, you would just have to make it bigger

    • @abhijitprajapati3764
      @abhijitprajapati3764 Місяць тому +1

      ​@@ligma445 Yea the core concepts are there but implementing things like transformers and lstms would be a whole different beast

  • @InfinityMind1
    @InfinityMind1 4 місяці тому +44

    I approached this topic years ago when I was building a perceptron to play tic tac toe. I had problems keeping the neurons and their weights small enough not to add too much delay. Today that's solved by saving values as signal strength in a barrel. It's such a genius thing that was impossible back in my day. My perceptron was five times bigger and had shit accuracy since I had to limit the hidden layers (every weight and bias had to be saved in a separate RS NOR latch based 8-bit register, plus every neuron needed an 8-bit multiplier and adder). Eventually I circled back to a rule-based solution since tic tac toe is simple enough to implement in a smaller form factor, but it was deterministic and not really "very AI". I'm so proud and happy to see that quality redstone engineering is still alive and well, and that now you can do these things in a very nice and compact way.

  • @mmdts
    @mmdts 6 місяців тому +434

    In 16-bit logic, you can replace division by 15 with a multiplication by -30583 (32-bit result), three shifts, and two additions. You can easily find this by compiling a function that returns its 16-bit argument divided by 15 with clang at -O2, and what's efficient to do in silicon (integers over floats, and multiplication over division) is almost always efficient in minecraft too. (A short Python sketch of the trick follows this thread.)
    As for softmax, in 2021 researchers at Nvidia created a hardware-efficient softmax replacement called "softermax" that is realistically implementable in minecraft.
    I'm not a minecraft expert, but I love seeing hardware implementations of functions, and minecraft is no exception.

    • @law1337
      @law1337 5 місяців тому +22

      Just because a function is hardware-efficient doesn't necessarily mean it can be easily or efficiently implemented in Minecraft, but it's an interesting point.

    • @LtDan-fy7lc
      @LtDan-fy7lc 5 місяців тому +12

      @@law1337 "what's efficient to do on silicon ... is almost always efficient in Minecraft too."
      Java: *raises eyebrow*

    • @DawnshieId
      @DawnshieId 4 місяці тому +2

      I liked this because I'm curious. ☺️

    • @mmdts
      @mmdts 3 місяці тому +2

      @@law1337 Minecraft logic speed relies on the distance traveled by redstone in most cases. There are instant rail-based propagation systems, but they require far more real estate, and all gates ultimately suffer redstone tick delay. In short, builds with fewer gates are more efficient because they take less real estate, and less real estate translates to faster and more compact circuits. Similarly, in silicon, builds with fewer gates are more efficient because they have less capacitance delay, allowing shorter clock cycles for the synchronous logic, and because they consume less power.
      So regardless of the underlying reasons, gate count per unit of logic achieved is an efficiency metric that's equivalent in both minecraft and hardware - and multiplying instead of dividing uses fewer gates for the same result, yielding higher efficiency in both.
      I just assumed the audience reading the comment was aware of the underlying reasons, which was a mistake on my part. I hope the explanation is correct and sufficient.

    • @mmdts
      @mmdts 3 місяці тому

      @@LtDan-fy7lc Minecraft redstone efficiency is bound to redstone ticks, rather than the underlying Java implementation of minecraft. A build consumes fewer redstone ticks if it has fewer gates, which is an efficiency metric equivalent to hardware's.
      I hope I'm making sense.
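
    A quick way to sanity-check the multiply-shift idea from this thread in Python (a minimal sketch of the unsigned variant; 0x8889 is -30583 reinterpreted as an unsigned 16-bit word, and the exact signed sequence clang emits differs slightly):

    ```python
    def div15_u16(x: int) -> int:
        # Multiply-shift stand-in for x // 15 on unsigned 16-bit values:
        # one multiply (32-bit result) and one shift instead of a divider.
        assert 0 <= x <= 0xFFFF
        return (x * 0x8889) >> 19

    # exhaustive check over the whole 16-bit range
    assert all(div15_u16(x) == x // 15 for x in range(1 << 16))
    ```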

  • @giosee_
    @giosee_ 5 місяців тому +506

    the ONLY person on youtube who managed to explain neural networks in seconds. it took me days of research to understand them well enough to make and explain them myself

    • @TheMochaMadness
      @TheMochaMadness 5 місяців тому +18

      Imma be perfectly honest I still ain't understand

    • @giosee_
      @giosee_ 5 місяців тому +10

      @@TheMochaMadness skill issue 😔

    • @TheMochaMadness
      @TheMochaMadness 5 місяців тому +4

      @@giosee_ zoinks 😔

    • @Shadowfury22
      @Shadowfury22 5 місяців тому

      @@TheMochaMadness If a particular pixel of the input is lit up, chances are you can make a list of numbers that could have that pixel included in their final drawing, as well as a list of numbers that are very unlikely to have it included in theirs. If you combine all of these lists from each pixel of the input, then the output tells you how likely it is that each of the numbers was the one actually drawn.
      Everything else (hidden layers, weights, biases, etc.) is just an algorithmic way to process and combine those "lists" of information, made in an ingenious manner that allows you to automatically pre-generate the lists (a.k.a. get the values for the weights and biases) by "training" the network beforehand (which in reality is as simple as taking every possible final drawing to begin with, looking at the pixels that are lit up in each of them, and storing that information).

    • @gpt-jcommentbot4759
      @gpt-jcommentbot4759 5 місяців тому +1

      @@TheMochaMadness it's multiplying a bunch of numbers (the input pixels) by a bunch of set values (the weights), then adding a bias (should the neuron lean towards negative or positive activation), then applying a nonlinearity (any function whose graph isn't a single straight line)
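
    For anyone who wants the thread's description as code, here is a minimal numpy sketch of that forward pass (the layer sizes, ReLU activation, and names are illustrative assumptions, not necessarily what the video's network uses):

    ```python
    import numpy as np

    def predict_digit(pixels, W1, b1, W2, b2):
        """Multiply inputs by weights, add a bias, apply a nonlinearity,
        repeat for the output layer, then pick the highest score."""
        hidden = np.maximum(0.0, W1 @ pixels + b1)   # weighted sums + biases, ReLU nonlinearity
        scores = W2 @ hidden + b2                    # one raw score per digit 0-9
        return int(np.argmax(scores))                # index of the most confident digit

    # toy usage with random (untrained) weights, just to show the shapes
    rng = np.random.default_rng(0)
    W1, b1 = rng.normal(size=(16, 784)), rng.normal(size=16)
    W2, b2 = rng.normal(size=(10, 16)), rng.normal(size=10)
    print(predict_digit(rng.random(784), W1, b1, W2, b2))
    ```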

  • @Zoetec13B
    @Zoetec13B 5 місяців тому +1082

    Bro, people out there are creating neural networks in Minecraft, and I'm struggling to open a chocolate bar while watching them

    • @Spiinosauro
      @Spiinosauro 5 місяців тому +18

      Bruh

    • @opticalreticle
      @opticalreticle 4 місяці тому +10

      I dont see how those two actions compare

    • @haybail7618
      @haybail7618 4 місяці тому +48

      if it makes you feel any better most really advanced ai robots really struggle opening a chocolate bar as well

    • @GamerBoy22334
      @GamerBoy22334 4 місяці тому +8

      Me struggling to open FUCKING CHIP BAG (:

    • @mateshpl6552
      @mateshpl6552 4 місяці тому +12

      I still have a small wound on my finger after trying to open a water bottle

  • @KingKaleb77
    @KingKaleb77 5 місяців тому +155

    2024: Neural Networks in Minecraft
    2027: Sentient AI in minecraft

    • @lunyxappocalypse7071
      @lunyxappocalypse7071 4 місяці тому +3

      Hmm, not that early. Even when it comes to hard, general purpose AI, we are not nearly there yet.

    • @SimoneBellomonte
      @SimoneBellomonte 2 місяці тому

      The problem with porting stuff to minecraft is that redstone's built-in delays make everything a whole lot less efficient. A sentient AI in minecraft the size of ChatGPT-4 or later (millions of times bigger than the one in this video, I think) wouldn't even fit in render distance. But good joke nonetheless, albeit just a tad too predictable. 🗿

    • @theminecraftedplayer
      @theminecraftedplayer 24 дні тому

      More like ChatGPT

  • @coltith7356
    @coltith7356 5 місяців тому +14

    That's super cool! I like that you explained the difficulties you had and how you overcame them; it makes everything less mystical and really helps in understanding why you do what you do

  • @lolmom5004
    @lolmom5004 6 місяців тому +1588

    my brother in christ, IT TOOK ME TWO MONTHS TO MAKE A NETWORK FROM SCRATCH THAT SOLVED THE MNIST DATASET IN PYTHON AND YOU DID IT IN REDSTONE IN 2 WEEKS. i applaud you, you redstone genius

    • @WoolyCow
      @WoolyCow 6 місяців тому +76

      lol there is a video i love of some bloke just writing it in like half an hour :> watching it is a great way to lose confidence in your abilities

    • @GustvandeWal
      @GustvandeWal 6 місяців тому +3

      ​@@WoolyCow Link?

    • @WoolyCow
      @WoolyCow 6 місяців тому +45

      @@GustvandeWal yt doesn't play nice with links, but its called "Building a neural network FROM SCRATCH (no Tensorflow/Pytorch, just numpy & math)"

    • @GustvandeWal
      @GustvandeWal 6 місяців тому +4

      @@WoolyCow Thx!
      (Most people just copy the part after /watch?v= 🙂)

    • @WoolyCow
      @WoolyCow 5 місяців тому

      @@GustvandeWal oh lol i shouldve thought of that! thanks for the tip :D

  • @viaJustin1910
    @viaJustin1910 5 місяців тому +109

    This is such a good demonstration that every hard problem is just a ton of smaller easier problems.

    • @glowerworm
      @glowerworm 5 місяців тому +6

      This is also a good demonstration that there is always someone out there smarter than you could ever be lol

    • @RealLifeQuirks
      @RealLifeQuirks 3 місяці тому

      Which is ironically exactly how neural networks work

  • @RedRedstoneCat
    @RedRedstoneCat 6 місяців тому +1038

    I’m struggling with a 2x2 and this dude’s making a neural network.

    • @MrFiveHimself
      @MrFiveHimself 6 місяців тому +42

      don't worry dude! it just takes time! You should watch his Logical Redstone Reloaded series (both new and old); they’re really helpful in understanding how computational redstone works. After that, just try to make an ALU. It's an amazing starting goal, and once you’ve made your own, you can confidently say you’re proficient. I wish you luck on your journey

    • @takyc7883
      @takyc7883 6 місяців тому +101

      its 4

    • @Asheetanshu
      @Asheetanshu 6 місяців тому +11

      @@takyc7883 he is talking about the door

    • @MrFiveHimself
      @MrFiveHimself 5 місяців тому +46

      @@takyc7883 god damnit i laughed way too hard at that

    • @nynvib276
      @nynvib276 5 місяців тому +4

      ​@@MrFiveHimself That's assuming the commenter is not on bedrock.

  • @taffetaarcher7888
    @taffetaarcher7888 3 місяці тому +16

    Bet bro is gonna make the observable universe with redstone next

  • @Henzoid
    @Henzoid 4 місяці тому +4

    That was inCREDIBLE. I'm floored. Not just by the redstone prowess but also by the ingenuity to be able to dissect these concepts and then rebuild them from scratch. Seriously impressive.

  • @flameofthephoenix8395
    @flameofthephoenix8395 6 місяців тому +428

    14:19 Exponentiation is pretty simple: convert the exponent to binary, and for each bit that is turned on, multiply in the corresponding power, where the list of corresponding powers starts at the base and is squared at each step. For example, for 5^7 you convert 7 to binary, which is 111, and then multiply 5, 25, and 625 to get 78,125, which is the correct answer. (See the Python sketch after this thread.)

    • @skaleee1207
      @skaleee1207 6 місяців тому +88

      Also known as Square-And-Multiply algorithm

    • @flameofthephoenix8395
      @flameofthephoenix8395 6 місяців тому +67

      @@skaleee1207 Nice! I didn't know its official name. Originally, I thought I was the first person to come up with it, I remember being quite proud of it, later on I learned that it already existed, but I didn't know the name until now! That name is a lot simpler than my explanation and will allow people to find more information on it too, thanks!

    • @sebastiangrau8409
      @sebastiangrau8409 5 місяців тому +26

      This is an exponential with Euler's number. Any output would be irrational and very messy. I understand why he would avoid this.

    • @antarctic214
      @antarctic214 5 місяців тому +10

      You could do it with base 2 (or 4), it's just changing the "temperature". In that case exponentiation is trivial (a bitshift). But you still have to do division.

    • @Rudxain
      @Rudxain 5 місяців тому

      That's like shift-and-add but for exp instead of mul
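
    A minimal Python sketch of the square-and-multiply idea described at the top of this thread (the function name is arbitrary):

    ```python
    def power(base: int, exponent: int) -> int:
        """Walk the exponent's bits from least to most significant, squaring a
        running power of `base` each step and multiplying it into the result
        whenever the current bit is 1."""
        result, square = 1, base
        while exponent:
            if exponent & 1:          # this bit of the exponent is on
                result *= square
            square *= square          # base^1 -> base^2 -> base^4 -> ...
            exponent >>= 1
        return result

    assert power(5, 7) == 78125       # the 5^7 example from the comment above
    ```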

  • @puppypalice
    @puppypalice 6 місяців тому +1682

    We’re getting to the point where pretty soon someone is gonna recreate the NES in minecraft, or make doom in minecraft. I'm betting that within 10 years someone will get doom, Super Mario Bros, or The Legend of Zelda running just off redstone

    • @thisflyingpotato4227
      @thisflyingpotato4227 6 місяців тому +297

      Idk about other games but doom already exists, someone ran it on his redstone computer (I believe it was called IRIS). I'll come back and edit this comment with the code of the video
      (edit) _SvLXy74Jr4 Also I have no idea if this has been done before

    • @frkieran
      @frkieran 6 місяців тому +21

      such an original comment

    • @proceduralism376
      @proceduralism376 6 місяців тому +54

      Modpunchtree already ran doom on his cpu iris you can look up the video

    • @feeries8208
      @feeries8208 6 місяців тому +34

      @@proceduralism376 yeah and its only 28~32s for each frame

    • @adryanlucas096
      @adryanlucas096 6 місяців тому +5

      A NES Emulator in minecraft would be CRAZY

  • @Knarfy
    @Knarfy 5 місяців тому +249

    I will likely never fully understand these videos, but man are they impressive 👏
    Incredible work! My brain is fried

    • @Centorym
      @Centorym 5 місяців тому +5

      ive never seen people not reply to a famous youtuber lol

    • @Flupus
      @Flupus 5 місяців тому

      Hi knarfy

    • @Flupus
      @Flupus 5 місяців тому +1

      Are you gonna be doing "Breaking a neural network with your dumb ideas"?

    • @ThatGuyNyan
      @ThatGuyNyan 5 місяців тому

      Fried brain 🤤

    • @Centorym
      @Centorym 5 місяців тому

      @@ThatGuyNyan run knarfy RUN before this guy makes a 3 course meal from you

  • @holthuizenoemoet591
    @holthuizenoemoet591 5 місяців тому +8

    A really cool detail is how you handle the floating point limitation; this is actually really close to some quantization schemes. Look at the paper "The Era of 1-bit LLMs: All Large Language Models are in 1.58 Bits" if you have the time, you might find further optimizations there

  • @bananabroshsid8234
    @bananabroshsid8234 2 місяці тому +4

    this was the first time i was confused by the redstone and not the actual mechanics of the build

  • @pegasaurisrex9707
    @pegasaurisrex9707 5 місяців тому +76

    I just did a machine learning course last semester, and your 2-minute explanation of an MLP network was way easier to understand than our textbook's chapter covering it. This entire build is insane, amazing work!

  • @TheKikou18
    @TheKikou18 6 місяців тому +201

    Actually you only need continuous values for training; for deployment you can drastically decrease the precision
    without losing accuracy, if you do it right.
    There's a paper where they reduce it all the way to one bit per neuron, which is a perfect fit for minecraft
    (and I'm pretty sure also to 4 bits, which would fit signal strength applications). A small quantization sketch follows this thread.

    • @AgamSama-u2t
      @AgamSama-u2t 6 місяців тому +34

      quantization baby

    • @MilkGlue-xg5vj
      @MilkGlue-xg5vj 6 місяців тому

      ​​@@AgamSama-u2t Imagine getting a binary quantization good at mnist lol

    • @whatisrokosbasilisk80
      @whatisrokosbasilisk80 6 місяців тому +14

      Even for training, you can use quantization-aware or non-differentiable methods and meet parity on inference during training.

    • @MilkGlue-xg5vj
      @MilkGlue-xg5vj 6 місяців тому

      @@whatisrokosbasilisk80 That's what I'm talking about

    • @MrSonny6155
      @MrSonny6155 5 місяців тому +1

      I'm guessing this is the BNN paper by Courbariaux et al. from 2016? I'm skimming through the claims and it's insane what quantization can theoretically do.
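
    A rough sketch of the post-training quantization idea discussed in this thread, mapping float weights to 4-bit integers plus one scale factor per layer (my own illustration; the bit width, symmetric scheme, and names are assumptions, and how much accuracy survives depends on the model):

    ```python
    import numpy as np

    def quantize_symmetric(w, n_bits=4):
        """Round weights to small signed integers plus a per-layer scale,
        so each integer fits in `n_bits` (roughly the 0-15 range of a
        redstone signal). Dequantize with q * scale. Assumes w isn't all zeros."""
        qmax = 2 ** (n_bits - 1) - 1            # 7 for 4 bits
        scale = np.max(np.abs(w)) / qmax        # one scale factor per weight matrix
        q = np.clip(np.round(w / scale), -qmax, qmax).astype(int)
        return q, scale

    w = np.random.default_rng(0).normal(size=(10, 16))
    q, scale = quantize_symmetric(w)
    print(np.abs(w - q * scale).max())          # worst-case rounding error ~ scale / 2
    ```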

  • @bens8419
    @bens8419 6 місяців тому +198

    It’s always a good day when a mattbatwings Video is on my recommended

    • @CubeXC
      @CubeXC 6 місяців тому +3

      Bro you could not get this recommended before the premiere

    • @user-Herobro
      @user-Herobro 6 місяців тому +2

      Same

    • @qtpaulie
      @qtpaulie 6 місяців тому +2

      @@CubeXC you can. before a premiere starts, it can be recommended

  • @kevnar
    @kevnar 4 місяці тому +3

    Create a mod that lets you put a redstone build inside a single block. This block would have an input and an output, with the guts of it being shrunk down in a smaller dimension inside the block. Clicking on it takes you inside the block where you can build (or paste in) the redstone circuitry and connect it all to the output. Then, of course, you could have these circuits nested inside of each other so you could further compartmentalize the functions. This would compact these massive builds into a few blocks. Full functional encapsulation. Imagine the possibilities.

    • @cosmic4453
      @cosmic4453 3 місяці тому

      and you just keep the redstone's processing as code rather than actually rendering the redstone inside the block. you would also have input blocks that take an input and transfer it to the output block inside the dimension

    • @rrrrrr9308
      @rrrrrr9308 2 місяці тому

      Compact Claustrophobia would be just fine! It even has a REAL computer WITH NETWORKING, programmable in Lua!

  • @SanoKei
    @SanoKei 3 місяці тому +1

    the multiplication and division portions can be simplified by bit-shifting the binary input, at least when the constant involved is a power of two (see the small example below)
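
    A tiny illustration of that point: shifts replace multiplication and division directly only when the constant is a power of two; other constants need shift-and-add combinations (or tricks like the multiply-shift one discussed further up in the comments):

    ```python
    x = 37
    assert x << 3 == x * 8                 # multiplying by 2**3 is a left shift
    assert x >> 3 == x // 8                # floor-dividing by 2**3 is a right shift
    assert x * 10 == (x << 3) + (x << 1)   # non-powers of two: shift-and-add (8x + 2x)
    ```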

  • @capsey_
    @capsey_ 6 місяців тому +161

    off topic, but i recently started the second semester of my computer science degree in college and was like "omg it's the mattbatwings thing" the whole lecture, because i already learned most of the stuff they were talking about from you 💀

  • @novantha1
    @novantha1 6 місяців тому +32

    The first thing that comes to mind is a recent cutting edge implementation of QAT (quantization aware training) called Bitnet 1.58; it operates on different principles than a standard MLP. It replaces the Matrix multiplication with binary operators (addition, subtraction, or no-ops), so it's fast in inference deployment and cheap in that you can sort of fit a single "unit" of weights into 1.58 bits (though it's easier to just do it as a 2bit implementation with one state unused). It'd probably be way faster in a Minecraft context as one of the biggest disadvantages in IRL deployment, that you need custom hardware to take full advantage of the speed improvements, isn't really a disadvantage in a bespoke system.
    Anyway, the biggest difference is in the training process; it's trained at Int8 or FP8 (if memory serves, it's been a little while), and is then downscaled to the 1.58bit representation, but the information lost in that conversion to ternary values is preserved in a weight reconstruction matrix, basically. The end goal is that the network is made aware that it will be converted to a ternary representation. Hence, "quantization *aware* training", so you might be able to preserve more of the accuracy of the floating point model than you thought.
    Strictly speaking, the full bitnet implementation is a Transformer network, but it should still apply to raw MLPs given that they started with the FFN (essentially an MLP placed inside a more complex network with self attention and a language head).

  • @kevinjerome5954
    @kevinjerome5954 6 місяців тому +68

    At this rate, in 5 years I'm going to see a video on my homepage from mattbatwings where he ports the entire Linux kernel into Minecraft

    • @kaz49
      @kaz49 5 місяців тому +5

      Well, they do say that Linux runs on just about anything

    • @noerlol
      @noerlol 3 місяці тому +1

      @@kaz49 dont give him ideas bro

  • @Jumpingjaqs
    @Jumpingjaqs 5 місяців тому +3

    10:30 I love the music and I personally listen to it in my free time; for anyone else who wants to listen to the song, it’s “rain” or a playlist named Lo-fy bets

  • @mattshull4665
    @mattshull4665 3 місяці тому +10

    Fun fact: the human body has over a hundred billion neurons (100,000,000,000); even so, making 10 in MINECRAFT is an amazing achievement

  • @nik7069
    @nik7069 6 місяців тому +14

    Brother.
    I spent a while learning how to make neural networks as a school project, and just doing this from scratch, in redstone, is absolutely astonishing. Legend, Mattbat.

  • @IGaming73
    @IGaming73 5 місяців тому +1010

    We got real AI in Minecraft before GTA 6

    • @krinodagamer6313
      @krinodagamer6313 5 місяців тому +17

      Diabolical

    • @goldfishglory
      @goldfishglory 5 місяців тому +22

      😭😭WE ONLY HAVE A COUPLE YEARS TO MAKE THESE JOKES; EVERYTHING WILL STOP BEING IMPRESSIVE SINCE ITS AFTER GTA 6

    • @_sandy_
      @_sandy_ 5 місяців тому +3

      i came here looking for this comment LMFAO

    • @NolanHOfficial
      @NolanHOfficial 5 місяців тому +16

      ​@@goldfishglorywe got gta 6 before gta 7 - some guy in 2093

    • @goldfishglory
      @goldfishglory 5 місяців тому

      @@NolanHOfficial true

  • @guyleroy8022
    @guyleroy8022 5 місяців тому +12

    Amazing project, congrats. Note: instead of multiplying the weights by 100, you can perform post-training int8 quantization to maintain most of the original accuracy.

  • @Sporkz_
    @Sporkz_ 21 день тому +1

    Can you make it smaller? That's the question we've all been waiting for.

  • @FrankHacking
    @FrankHacking 3 місяці тому +1

    Wow, how kind of you to give out the world download!!! appreciate it! ❤

  • @Iopal152
    @Iopal152 6 місяців тому +26

    Nice, i also thought at first that you were going to train the model in Minecraft, but it seems that if that's going to happen it's going to be a whole other story

  • @LazyGuy-ne3ox
    @LazyGuy-ne3ox 6 місяців тому +29

    That's incredible! Combining neural networks with Minecraft is pure genius. Keep up the amazing work!

  • @NEOMatrix-bd7uo
    @NEOMatrix-bd7uo 6 місяців тому +27

    I never thought a Minecraft video would teach me neural networks better than my teacher, thanks for the upload

  • @befikerbiresaw9788
    @befikerbiresaw9788 5 місяців тому +1

    Dude your project just made me fully understand MLPs and neural networks thank you.

  • @alexking4699
    @alexking4699 2 місяці тому

    The difference in space occupied between the MLP and the bar-chart graphing section is an amazing way to visualize the difference in logical functions and the use of analog vs digital signals.
    Amazing and insane, keep it up man!

  • @fearofthechippan
    @fearofthechippan 6 місяців тому +14

    This is honestly incredible. I wish this was around when I was studying these concepts, would have helped me understand back propagation and softmax so much quicker

  • @cosmoplaysmc
    @cosmoplaysmc 6 місяців тому +24

    This is great work! I never thought we would have machine learning with just Redstone.

    • @bintangramadan3217
      @bintangramadan3217 6 місяців тому

      There's a guy who made this in minecraft 1 year ago lol

    • @The.Sponge
      @The.Sponge 6 місяців тому +2

      @@bintangramadan3217 Yeah but Mattbatwings is aware of that so maybe there will be something new?

    • @CubeXC
      @CubeXC 6 місяців тому +1

      You could not have seen it yet, stop saying stuff just to get likes. It was before the premiere

    • @mineq4967
      @mineq4967 6 місяців тому +17

      it's not machine learning, he just pasted the weights and biases into the neural network; he didn't make it learn by itself like a machine learning algorithm would

    • @Louis13XIII
      @Louis13XIII 6 місяців тому +2

      @@mineq4967 yeah that's a bit deceptive tbh

  • @TimeWisely
    @TimeWisely 6 місяців тому +5

    Wow, that's actually crazy, good on you!

  • @sabersakin3685
    @sabersakin3685 5 місяців тому

    As a minecraft player with over a year of experience in ML, I'm literally blown away!!! This is truly amazing.

  • @radekk5380
    @radekk5380 5 місяців тому +1

    It is obviously very impressive, but perhaps many people do not realize that he is essentially doing a kind of assembly programming, which many of you would not find so cool

  • @OszkarFulop
    @OszkarFulop 5 місяців тому +23

    mattbatwings in 1 year: I Made a Technological Singularity with just Redstone!

  • @CreatorProductionsOriginal
    @CreatorProductionsOriginal 5 місяців тому +6

    7:18
    “I pressed F5”

  • @humanperson8418
    @humanperson8418 6 місяців тому +95

    Ok, now make an AI assisted shape drawing tool for your paint program.
    e.g. draw a bad square, it draws a good square with the same width and height.
    draw an ugly number, it fixes it by converting it to the closest possible number with correct dimensions.

    • @itsaducklin
      @itsaducklin 6 місяців тому +33

      that sounds like pure hell
      I love it

    • @alluseri
      @alluseri 6 місяців тому +4

      nah

    • @NoVIcE_Source
      @NoVIcE_Source 5 місяців тому +2

      @@alluseri i like how google translate assertively translates this to "Now"

  • @AeroSW
    @AeroSW 5 місяців тому +1

    The work completed in this video is worth a degree. Many people are required to get doctorates for even a chance to think about how to build APUs for artificial neural networks (ANNs). This is essentially what this creator built in Minecraft using redstone, because redstone mimics circuitry on a basic level. So he essentially built the APU's core with a test input and output workbench.

  • @notapplicable7292
    @notapplicable7292 5 місяців тому

    you could probably have made your life easier with the network design but the confidence display is extremely cool.

  • @error.418
    @error.418 5 місяців тому +8

    Can't say enough about how great it is that you showed prior work from others in the community before digging in to your version. That's what we want to see in the community ❤

  • @rubensf7780
    @rubensf7780 6 місяців тому +31

    Now please make a calculator where you can draw the numbers yourself (using a neural network and calculator) that would be awesome

    • @bugmenot799
      @bugmenot799 4 місяці тому

      That would be so cool. It would also be fun if you could draw the plus sign (Or whatever operation you were doing) as well.

  • @NieMamNicku
    @NieMamNicku 6 місяців тому +6

    respect for the sponsor's dish at the end of the episode

  • @epicfilms4life507
    @epicfilms4life507 5 місяців тому

    This is really good bro for visualising how computers work deep down in their tiny chips. Like ur essentially blowing up a cpu to its full size and literally WALKING thru the details and wiring. U can be a goated CS major bro, u have so much f**ING talent bro. How old are you dude? Did you do UNI, or are you currently doing uni? Like bro, go do a CS major or smth, you could make a shit ton of money from just research and development. U got like bottomless talent levels bro

  • @noahflood
    @noahflood 5 місяців тому

    Dude this is so amazing. To have the skill to make a machine like this, understand the math and computations behind it, minecraft knowledge, and the video production after it all? That's amazing

  • @youMatterItDoesGetBetter
    @youMatterItDoesGetBetter 5 місяців тому +7

    Congrats, you passed your PhD thesis.

  • @ALPRNX422
    @ALPRNX422 6 місяців тому +13

    at this point bro is gonna make hooman brain in redstone dang good job

    • @ziphy_6471
      @ziphy_6471 5 місяців тому +1

      Cringe

    • @ALPRNX422
      @ALPRNX422 5 місяців тому +1

      @@ziphy_6471 omg its linus no way 🔥🔥🔥

    • @ziphy_6471
      @ziphy_6471 5 місяців тому +2

      @@ALPRNX422 I have several children in my basement

    • @ALPRNX422
      @ALPRNX422 5 місяців тому +1

      @@ziphy_6471 cool

    • @ziphy_6471
      @ziphy_6471 5 місяців тому

      @@ALPRNX422 Will you be my next OwO UwU * turns up bulge *

  • @imabioligist1882
    @imabioligist1882 5 місяців тому +1

    bro what you are a genuine genius. I do not mean this non-literally, you are a genius

  • @1999Fabion
    @1999Fabion 5 місяців тому +1

    I can hear so many people I know saying, "I feel like an easier way to do that would be to just connect them directly, what is even the point of this machine?" People like that don't have the capacity to see beyond what's in front of them into what could be. God damn, this is cool.

  • @luckybeeyt
    @luckybeeyt 5 місяців тому +4

    This guy in 2030: Building robots to colonize all solar system planets with just redstone!

  • @LightslicerGP
    @LightslicerGP 6 місяців тому +18

    Amazing
    I hope you mention the first guy who did a neural network thing in minecraft, recognising numbers
    Edit: he did

    • @ThiaGamesBR
      @ThiaGamesBR 5 місяців тому +1

      Feels good to comment before watching the video...

    • @two697
      @two697 5 місяців тому +3

      Why would you comment this before watching the video. He mentioned the other guy very early on in the video

    • @doctoroppa7991
      @doctoroppa7991 5 місяців тому

      Twitter rot

  • @etienneweidenfeld6468
    @etienneweidenfeld6468 5 місяців тому +7

    Bro is bout to build a quantum computer in Minecraft… 💀

  • @huynhat1799
    @huynhat1799 5 місяців тому +1

    Although I didn't actually understand what you were doing, it's always fascinating how people like you have pushed the minecraft redstone community so far. Keep up the good work!

  • @anonymanonymus4706
    @anonymanonymus4706 5 місяців тому

    After failing to build a Python simulation for one of my projects, the last thing I expected was to find a detailed explanation of exactly the part that wasn't working for me in a random Minecraft video I watched for fun in my free time. Thank you for the (probably unintended) help with my project; the video was interesting in its own right as well.

  • @Random-Sad
    @Random-Sad 4 місяці тому +12

    0:22 Loved it

    • @Only_Some
      @Only_Some Місяць тому

      It’s 10 seconds and there’s nothing to fucking love

    • @Random-Sad
      @Random-Sad Місяць тому +1

      @@Only_Some There is

  • @Kirbogun
    @Kirbogun 5 місяців тому +12

    1 step closer to google in minecraft

    • @Fineas_Bondar
      @Fineas_Bondar 5 місяців тому

      There is a mod that uses blocks as a screen and connects to Google's URL, so technically you can watch YouTube in Minecraft

    • @t.bo.a7061
      @t.bo.a7061 4 місяці тому

      No mods. Pure bloodstone ​@@Fineas_Bondar

  • @KiwiRedstone
    @KiwiRedstone 6 місяців тому +8

    Wait what!???? Please tell me that this is just uploading the model into redstone and not all complex things like backpropagation to train the NN inside Minecraft...

    • @FriedMonkey362
      @FriedMonkey362 6 місяців тому +4

      For simple neural networks you don't really need backpropagation, you can just randomize the values until it gets better. it'll take longer to train and won't be as efficient, but it's way easier to do (see the sketch after this thread)

    • @bintangramadan3217
      @bintangramadan3217 6 місяців тому +1

      Bro there's a Chinese guy who made a neural network in minecraft 1 year ago lol

    • @Abcdef0101_
      @Abcdef0101_ 6 місяців тому

      @@bintangramadan3217 Send the vid pls

    • @boblol1465
      @boblol1465 6 місяців тому +3

      yes it is uploading the model into redstone dw

    • @KiwiRedstone
      @KiwiRedstone 5 місяців тому

      At least...
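
    A toy sketch of the "randomize the values until it gets better" idea from the reply above (random hill climbing); loss_fn, the step size, and the step count are placeholders, and this scales far worse than backpropagation:

    ```python
    import numpy as np

    def random_search(loss_fn, w, steps=1000, step_size=0.1, seed=0):
        """Nudge the weights randomly and keep the nudge only if the loss improves.
        No gradients needed, just lots of evaluations."""
        rng = np.random.default_rng(seed)
        best = loss_fn(w)
        for _ in range(steps):
            candidate = w + rng.normal(scale=step_size, size=w.shape)
            score = loss_fn(candidate)
            if score < best:
                w, best = candidate, score
        return w

    # toy usage: find weights minimizing a simple quadratic "loss"
    w = random_search(lambda v: float(np.sum((v - 3.0) ** 2)), np.zeros(5))
    ```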

  • @miltontinoco9851
    @miltontinoco9851 5 місяців тому

    Your work is incredible! I have a lot of trouble working with PyTorch to create a neural network, and the fact that you were able to do it in Minecraft is mind-blowing.

  • @Gunbudder
    @Gunbudder 5 місяців тому

    this is an amazing showcase of what a NN is and how it works. i worked with NNs for years and i still struggle sometimes lol

  • @dreamer964
    @dreamer964 6 місяців тому +42

    NO DONT TAKE OUR REDSTONE ENGINEERS JOBS

  • @InsertName404
    @InsertName404 6 місяців тому +4

    How did u get around the network being bad at actual digit recognition, due to the MNIST data set all being perfectly centered?

    • @ferguspick6845
      @ferguspick6845 5 місяців тому +1

      A simple MLP can already learn a pretty good representation for this dataset, but one easy approach would be to transform the input images (e.g. skew, rotate) and add these as additional training samples; this makes the learned representations even more robust :) (a small sketch follows this thread)

    • @InsertName404
      @InsertName404 5 місяців тому

      @@ferguspick6845 tysm
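
    One possible way to implement the augmentation suggested in this thread, sketched as a random pixel shift with zero padding (the shift range and the choice of padding are arbitrary; skews and rotations would be handled similarly):

    ```python
    import numpy as np

    def random_shift(img, max_shift=2, rng=np.random.default_rng(0)):
        """Shift a 28x28 digit by up to `max_shift` pixels in each direction,
        padding with zeros, so training also sees off-center digits."""
        dy, dx = rng.integers(-max_shift, max_shift + 1, size=2)
        out = np.zeros_like(img)
        src = img[max(0, -dy): img.shape[0] - max(0, dy),
                  max(0, -dx): img.shape[1] - max(0, dx)]
        out[max(0, dy): max(0, dy) + src.shape[0],
            max(0, dx): max(0, dx) + src.shape[1]] = src
        return out
    ```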

  • @NimArchivesYT
    @NimArchivesYT 6 місяців тому +8

    I’m a time traveler and mattbatt has recently made a human brain in Minecraft

    • @Meyer-gp7nq
      @Meyer-gp7nq 5 місяців тому +2

      He also made a Time Machine in Minecraft which is how you’re here I assume

    • @NimArchivesYT
      @NimArchivesYT 5 місяців тому

      @@Meyer-gp7nq Naturally

  • @nix207
    @nix207 4 місяці тому

    I don't know why I only now started to actively seek out redstone computer YouTubers. Maybe it's because I recently got a job that involves low-ish level stuff. But now projects like this make me want to learn more about implementing even basic computing processes in my survival world.
    Probably won't ever be useful in single player, but the learning is what I'm here for

  • @ahmad777-noob3
    @ahmad777-noob3 5 місяців тому

    The way you explained all of those deep learning terms in simple words is just marvelous!

  • @velartt
    @velartt 5 місяців тому +13

    2:07 my little pony or what?

    • @rayanshorts233
      @rayanshorts233 2 місяці тому +4

      My little pony or cable news network

  • @Burueberii
    @Burueberii Місяць тому +3

    Programmers 👇

  • @adamburningham
    @adamburningham 5 місяців тому

    Just another comment saying I'm thoroughly impressed, both in your execution and your explanation of neural networks. Thank you!

  • @IntentStore
    @IntentStore 5 місяців тому

    The reason the network redstone is smaller than the display is that the heavy lifting has already been distilled into the pretrained weights. Inference on a small network is simple arithmetic, compared to training, which is complicated and is what develops those relationships in high-precision weights and biases. I also imagine running the training in redstone would take forever, and it would be virtually impossible to represent the training data within all loadable chunks.

  • @stackootb9822
    @stackootb9822 5 місяців тому

    This taught me about implementing neural networks better than a lot of learning resources I've watched. Good work

  • @Tommy-mo7wh
    @Tommy-mo7wh Місяць тому

    it's so crazy to think about, but these minecraft videos were my motivation to study electrical engineering, which starts tomorrow. In a few years I might be able to do something like this!

  • @THEORANGER-hv7sg
    @THEORANGER-hv7sg 5 місяців тому

    Very comprehensive explanation on neural networks. Appreciate it man

  • @marcosmachado6844
    @marcosmachado6844 4 місяці тому

    It would be insane if you actually trained it in Minecraft. I was already scratching my head over how you would even transform the images into data. I don't even want to think about how you would implement backpropagation. Amazing video!

  • @yoseph32kefelegn21
    @yoseph32kefelegn21 4 місяці тому +1

    I once tried to make a cannon in minecraft and it blew my mind, and here you are making an AI 😮😢 what a genius

  • @ttking
    @ttking 5 місяців тому +1

    I think you create videos that are worth subscribing for. Good job!

  • @RedVRCC
    @RedVRCC Місяць тому

    Minecraft never ceases to amaze me. No wonder it was used in schools for learning purposes; there's so much actual practical use for it as a learning tool. And that's just the vanilla game. If you dip into modding, there's sooo much more, from programming mods yourself to learn Java, to using mods centered around realistic things to learn about them.

  • @RealTheScienceCat
    @RealTheScienceCat 4 місяці тому +1

    i got it so i could constantly draw numbers, and if it gets them wrong, i 'punish' it by setting a part on fire, pouring a bit of water on it, blowing it up, switching wires, and more ways of damaging it. and when it fully breaks i'll just reset it.

  • @SabraHummusMan
    @SabraHummusMan 5 місяців тому

    I finished a CNN machine learning model for predicting ENSO (El Niño Southern Oscillation), a weather phenomenon in the southern Pacific Ocean, for the STEM fair. It didn’t make it past regionals because I’m still in the junior league, and it is not as easy as this man explains and showcases in this video. Props to you!

  • @MaskalHayzenbrg
    @MaskalHayzenbrg 3 місяці тому

    That's the idea. By increasing the number of entries and the processes running them (eg 10). The goal of this complex task is to respond to texts and make sense of images at the same time

  • @braveecologic2030
    @braveecologic2030 5 місяців тому

    Yep, definitely cool. You just mentioned that integers are needed for minecraft and I'm thinking, so you just multiply it up... sounds obvious, but it's only because you were already talking about it. So good.

  • @Darockam
    @Darockam 5 місяців тому

    Congratulations, that's so cool! I used to do a lot of redstone back then, so I love seeing people pushing the limits further and further with it :)

  • @georgeadrianstefan1676
    @georgeadrianstefan1676 Місяць тому

    I really don't understand how you managed to feed the data to the AI. Amazing work.

  • @mathieuagostini7690
    @mathieuagostini7690 3 місяці тому

    Oh my god.
    This is the best explanation of a neural network I've ever heard.
    I've listened to a bunch of videos, and I always had some trouble with "what operations are actually done".
    Thanks for breaking it down:
    multiply, add up the results, apply the activation function, and here we go again.
    It makes sense now: we need to multiply, so when the activation function returns 0, it completely switches off the neuron.
    Damn, I love you

  • @leekezar1344
    @leekezar1344 5 місяців тому

    This is so cool. You can do something that most ML researchers can't (including me lol). You could dedicate this energy towards a PhD if you wanted, you would probably like it a lot.

  • @grayjphys
    @grayjphys 5 місяців тому

    I love how machine learning people use things from physics. like how the softmax function is the way you find probabilities of states in statistical mechanics. The sum of exponentials is the partition function, which normalizes all of the probabilities. :)
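
    The analogy maps directly to code; this is the standard numerically stable softmax, with the sum of exponentials playing the role of the partition function (temperature is an optional extra knob):

    ```python
    import numpy as np

    def softmax(scores, temperature=1.0):
        """exp of each score divided by the sum of exponentials; subtracting the
        max changes nothing mathematically but avoids overflow."""
        z = np.asarray(scores, dtype=float) / temperature
        e = np.exp(z - z.max())
        return e / e.sum()

    print(softmax([2.0, 1.0, 0.1]))   # probabilities that sum to 1
    ```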