Next: Moving electrons by hand like a real programmer
*for playing space invaders
XD
yes
@@licklack159 no, they use very very basic Python lol, I suck (at least rn, but I will be the next jdh)
@@red-52 Python is okay but definitely not as good as other languages like C or C++
@@JuanR4140 ya ik
Next: So I decided that I'm just too good for electricity, so I built a steam-powered turing machine
Inb4 he makes the Charles Babbage Analytical Engine
That's actually not a bad idea, I know I have seen a water-pump-based calculator device on youtube.
after that comes a better wheel
Bruh
He's too cool for machines now, so he just imagines it.
Next video: I went to a mine and gathered raw resources to build my own CPU
After that: custom mining tools for doing it.
Next video after that: I reverse engineered imperialism and have established the means to exploit local ethnic conflicts for the sake of extracting conflict minerals for my CPU
Oh no....
*HE LIKED YOUR COMMENT. THAT MEANS YOU GUESSED RIGHT.😅*
@@astroblurf2513 after that: using a large stockpile of up quarks, down quarks, and electrons to create ethnic groups to fight, and exploiting them for materials to make a particle accelerator with which I create silicon to make transistors that I put together to make a CPU
@@matthewe3813 next video after that: I'm sick of exploiting premade humans. I made my own army of human clones
Next: Too good for keyboards, or software, or languages: Just write code in binary with hole-punched notecards like a real programmer
blink for 1, don't blink for 0
If you aren't writing programs for a Jacquard loom, I don't wanna hear it!
well, after watching this episode... your idea sounds easy to do
@@przemcio6867 satire?
Punch cards? The first computer I worked with had 16 switches for the address bus, 8 switches for the data bus, a deposit button, a reset button and a run/stop switch. Based on an 8080. Hand assembly and switch input. It taught me to become a whiz at binary.
As an electrical engineer, I actually think you used Logisim for its intended purpose: education. The visual representation of gates, as well as real-time status updates, makes the circuit much more intuitive to understand than 2 pages of VHDL or Verilog.
Also, watching Ben Eater's series is an additional treat.
then you do it like that
not a student but I play a lot of Logisim and made my own 16x16 video player, with another circuit dedicated to animating every frame of it.
Years passed and I forgot how I was even able to do that.
I'm honestly surprised he didn't write a tool to convert from Verilog to a logisim diagram
Next video: **LOGIC GATES ARE TOO HIGH LEVEL, NEED TO MAKE IT TRANSISTOR LEVEL**
No he’s going to have to go the vacuum tube route and program the computer with punch cards
I’m fed up with depending on transistor companies, I’m manufacturing my own
@@theoarcher896 I'm fed up with depending on transistor-manufacturing-machine companies, I'm building my own machines xD
I need to make my own universe.
Video after that: *TRANSISTORS ARE TOO HIGH LEVEL, AM NOW PUTTING P AND N SILICON TOGETHER*
He's evolving, just... backwards
I think they’ve got a word for that
@@maxwellclark1615 yes, but you see, it's a reference
@@maxwellclark1615 devolving
@@maxwellclark1615 its just called evolving. evolution isnt linear.
@You thought it was a joke? Terry Davis who developed TempleOS?
This is absolutely mind blowing. As someone who is studying EE and CS the amount of work here is crazy! The range of skills is nutty! Big props to this dude
Similarly, studying computer engineering. In theory I understand all of this, but in practice... holy shit.
CS student here, did my compiler project last semester and a VM project a year before (which the compiler's asm output runs on), and dear god this guy is pushing himself way too hard.
@@eebilu idk man kinda sounds like you aren't pushing yourself hard enough so maybe work on that ??
@@arc-sd8sk idk man, it kinda shows you don't recognize different levels of intelligence. This guy is not normal, statistically speaking. He's for sure way above 135 IQ (SB), I'd bet 140+. He doesn't even have enough views/subs to make a living from youtube, so he does this part-time.
I'm a CS graduate, I've built a compiler from scratch, lexer and parser in C (among other "low level" stuff, like a 3d engine), I've spent countless hours studying (you go through a lot of maths that don't add up to practical skills, if one's not invested in research later in life), I've met much smarter people than me, and they went a step further by creating their own OS. Great for them, loved learning from smarter peers.
@@rubiskelter it was a joke-I was just busting his balls :)
but I appreciate such a detailed and thoughtful reply so thanks for that
I completely relate to the StackOverflow bit about lexical analyzer generators. I eventually managed to build a working compiler from scratch but initially, all my questions were met with "use this existing tool that does all the work for you".
After all of this he still doesn’t have his own programming language
yeah, he's done so much crazy shit that at this point a compiler for a custom programming language would be almost disappointing.
Can't wait to run hello world on jdh
@@soupgirl1864 If he did this he'd probably just make his own unique version of binary or smth
@@fr4781 he did. It's the machine code of the computer he invented and built. And he's also got an assembly language and assembler for it.
technically speaking, he did
What timing! I just finished my own CPU 2 days ago (it was an extra project for my second semester of IoT Engineering). I made a proper design in Verilog so I could get it manufactured, but it turned out that no manufacturers take any orders under $2,000,000, so I just got it to run on a $5 FPGA (Tang Nano). Having physical hardware do something is much cooler than emulation, and it's actually not much harder than Logisim. To be honest it was a lot easier than I expected. I think you will be pleasantly surprised!
sweet! I'm thinking about ordering an FPGA first to simulate this thing before I build it for real. I've heard of people getting a small number of PCBs manufactured and putting it together with ICs to get something in between a manufacturer-made CPU and a breadboard machine, if you can invest the time. and it's much cheaper than $2 million to get the circuits :)
@@jdh That's so cool!!! Yeah, FPGAs are great, and while doing the project I learned that they are used not only for prototyping; you can actually find them inside cars, excavators and other industrial machines. Also I can't wait to see how you approach making the "PCB computer". It seems very hard to find a sweet spot in terms of how high-level the circuits you use should be. It's a weird balance between full-on CPU ICs and raw transistors.
@@jdh You should check out Ben Eater's series about building a computer on a breadboard! He even sells kits if you want to make your own.
You should make a video on it
my ben eater binging is paying off and i can now understand 15% of what this dude's saying
omg, I thought I was the only one thinking that. lmao
@@atraps7882 me too haha
Soo true
Lmao same
Same.
Frankly, I'm surprised that you didn't go "Wait, logisim doesn't have a dark mode? Hold on, I gotta fix this."
I mean, this would take one minute max. Just need to search for a setting to invert the colors of the display :)
@@Agnostic080 If you're as clever as him, at the very least, hex edit the exe to invert the color on the program itself, or maybe even create a custom theme
@@YOEL_44 Although these options sound like fun, it's so unnecessary. I just have a keyboard shortcut to switch to inverted screen colors whenever I want. But I guess most of the things on this channel could be considered 'unnecessary' :D
Falstad's CircuitJS has a night mode ;-)
@@Agnostic080 I thought it was very obviously a joke reply that you replied to
In my senior year integrated circuits course in undergrad, I was given an integrated circuit architecture and was instructed to program a simulation in c-spice and do some benchmarks. Going from the block diagram to c-spice code directly was not something that I had done before, so I recreated the circuitry in Logisim. I included the Logisim macros in the report and expressed how much easier the project was for me to complete with it. It is a great program for simplification and understanding.
"If you want to make a game from scratch, create an universe first."
or it won't run!
Him : "I am too cool for universe, so I decided to make multiverse"
@@InnerEagle you shall optimise after all!
Me: hehe im making a platformer game yeah im a programmer
jdh: proceeds to create a computer from scratch
I spent a few months trying to grasp the basics of programming, decided I'm too stupid, then I got this in my recommended feed. YouTube is bullying me.
@@plebisMaximus lmao noice
LMAO i feel u dude
@@plebisMaximus You're absolutely not too stupid. It can be hard, and it could be that it's not something you really want to do (the result is VERY different from the method), but find the right guide and you'll be able to write simple programs 2 hours from now.
Another 10 hours and you could be making a website.
Another 7 days and you could be making a game in unity.
If you have a clear (and small) goal in mind, it can be surprisingly fast to learn.
@@tokeivo Thanks a lot for the pep talk, mate! I'll give it another shot.
next video: I build my own universe to recreate life
bro pls no spoilers
Video after that: inventing time travel and breaking previous video’s universe
@@jdh nice se
"If you wish to make an apple pie from scratch, you must first invent the universe"
*this comment was hearted by jdh*
**vsauce bg music intensifies**
this man is about to invent the transistor
I invented frequency and a way to store sound in a computer, but then I came to know that that's how things already work today.
I was sad
*reinvent and make it better xD
who needs three connectors in a transistor? it should have one and output the unchanged input. the program runs in your imagination
Transistors are waaaaayyyyy too fancy and fast.
Look, I'm playing with electromechanical relays ;-)
@@shailmurtaza9082 damn you invented frequency? sick
This is super fun. When I was in university, we had to make an ALU (among other things) on a breadboard directly from basic gates (AND, OR, NOT). We also had to design those gates at the CMOS logic level as well, and of course we studied the physics of semiconductors, but we didn't do any manufacturing at that level, haha. Most people reading this will already know, but it's cool that each individual component of the circuit in this video could be implemented using nothing but a bunch of NAND gates, and of course each NAND gate can be made using two PMOS and two NMOS transistors. For students interested in this type of thing, there is actually a game online called, unsurprisingly, NAND Game, where you build up towards a "computer" from simple gates.
Very cool video, I'm excited to see what comes next! :)
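To make the NAND-completeness point above concrete, here is a minimal C sketch; the names are invented for illustration, and every gate below is built purely out of calls to nand():

```c
#include <assert.h>

/* NAND is functionally complete: every other gate here is
   composed only of nand() calls. Inputs and outputs are 0 or 1. */
static int nand(int a, int b) { return !(a && b); }

static int not_(int a)        { return nand(a, a); }
static int and_(int a, int b) { return not_(nand(a, b)); }
static int or_(int a, int b)  { return nand(not_(a), not_(b)); }
static int xor_(int a, int b) {
    int n = nand(a, b);                 /* classic 4-NAND XOR */
    return nand(nand(a, n), nand(b, n));
}

int main(void) {
    /* exhaustive truth-table check over all input combinations */
    for (int a = 0; a <= 1; a++)
        for (int b = 0; b <= 1; b++) {
            assert(not_(a)    == !a);
            assert(and_(a, b) == (a & b));
            assert(or_(a, b)  == (a | b));
            assert(xor_(a, b) == (a ^ b));
        }
    return 0;
}
```

This is essentially the progression NAND Game walks you through interactively, just written as functions instead of wires.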
One of my classes in college was creating a MIPS CPU with SystemVerilog and getting it working on an FPGA with keyboard input and VGA output. I ended up writing 2048 in ~500 lines of MIPS assembly. Probably my favorite class in all my education! Great video :)
"If you wish to make an apple pie from scratch, you must first invent the universe. - Carl Sagan" - jdh
"If you wish to make an apple pie from scratch, you must first invent the universe. - Carl Sagan
- jdh"
-therealsome1
@@randomguy-gb9ge """If you wish to make an apple pie from scratch, you must first invent the universe." - Carl Sagan" -jdh" -therealsome1" -random guy
@@gigabit6226 """"if you wish to make an apple pie from scratch, you must first invent the universe." - Carl Sagan" -jdh" -therealsome1" - random guy" -Gigabit
@@dontsmi1e """"""if you wish to make an apple pie from scratch, you must first invent the universe." - Carl Sagan" -jdh" -therealsome1" - random guy" -Gigabit" - Kazat0" -DonTSmi1e
@@2wugs """""""if you wish to make an apple pie from scratch, you must first invent the universe." - Carl Sagan" -jdh" -therealsome1" - random guy" -Gigabit" - Kazat0" -DonTSmi1e" - GalaxyStudios0
The amount of knowledge and dedication required to make this is mind-blowing
not really, he uses his knowledge to make a *good* Turing machine
@@PFnove huh? Designing a circuit for any architecture and writing Pong in assembly for it is already quite a challenge.
Next: Designing my own universe to play Sonic the Hedgehog
Green hill zone music intensifies
Imagine my face: trying to construct my own CPU on a breadboard, thinking of myself as a god, then seeing your video. Man, you are astonishing. Best video I've watched on YT in my entire life. Thank you man!
That's gonna be one expensive computer.
Also massive props, not only did you do all this, you also had to sift through hours of recording and commentate for a 20m video while editing (rather masterfully might I add) and overlapping all of it.
I would not have had the patience: not to deal with strings for hours in C just to make an assembler, and much less to do all that in Logisim. Hell, even display stuff has gotten too cumbersome for me not to get bored with. And maybe I'm just not a video guy, or maybe I'm just disorganized with footage, but even less would I have felt like reliving the whole thing while having to also edit it and comment over it later.
Massive respect and gratitude from me! Looking forward to the next video and seeing you bleed all over your breadboards from stripping wire and moving around ICs for hours haha.
Next video: I recreate electricity just to play snake.
Or better yet, recreate DNA to play with real snakes.
You can't do that
this man is just reinventing modern computing systems again
"modern"
"just"
"man"
literally just for gits and shiggles too
@@matthewe3813 Oh, forgot using the two genders was sexist and transphobic.
Next: So I decided that I am just too good to use the standard molecules and laws of nature, so I decided to create my own universe!
dont encourage him
Well, it's an emulator of another universe, yet...
Amazing. Reminds me of my EE degree! We had to do such things in the early 90s. Going down memory lane. Hope your next step is building your processor using an FPGA or something like that! The most incredible thing is you probably managed to design and build everything faster than it would have taken in the 80s to do it!
15:43 While I do much more simple coding compared to this, this little montage really hit home. Sometimes it feels like everything starts blurring together, I really felt that.
This is actually something I hope to do one day as well, I want to make a machine entirely made by me, that runs games/programs made by me. Just for the learning experience, and to say I did it I guess, but this gives me some hope.
Gotta say, I was very shocked and very impressed when you whipped out the "I don't have any formal education in digital electronics" during the hardware design phase. I would NEVER have attempted something like this without the university classes I took on Verilog and digital systems design. Mad props! Can't wait to see the final hardware!
next video:
so here i have:
- 4 transistors
- some cables
- a waterfall
- a forest & an axe
and now, we're going to make pacman! :)
"in a cave...with a box of scraps"
@@wewilldiehere
I have:
>matchbox (for phosphorus dopant)
>borax (for boron dopant)
>sand
>fuselage of the plane I crashed in
_and today we'll be running crysis!_
I can't wait for the Ben Eater collaboration
Looking forward to seeing how you do with your processor. Have nearly finished my own and it's taken a year of spare time to do it.
See ya next time when: Mining copper with hands to make everything myself.
If you really want to implement your CPU irl I would recommend you use an FPGA; most FPGA boards include an editor similar to Logisim, so it shouldn't take you long to replicate your design.
But that’s boring!
one of YouTube's ten most underrated channels
list 'em pls
@@szigo3211 suckerpinch needs to go on the list, if you want channels that put too much effort into useless computing tasks.
@@thomaspeck4537 Thanks
This guy: Makes a god dang computer
Me: Googles "how to declare variable in HTML"
You cannot declare variables in HTML, dude. Because HTML is not a programming language and there are no variables. :D
@@1kvolt1978 And that's the joke.
@@1kvolt1978 someone is probably going to be really mad at what you just said.
(I meant the HTML is not a programming language)
@@hetsmiecht1029 But it really is not. What's the point of being mad about a fact?
@@1kvolt1978 Learn when to take words as sarcasm. 😉
Saw the title of the vid. Tapped on it expecting him to fool me and just switch to buying parts and assembling a computer. A couple minutes passed. Man starts creating his own instruction set. I can't believe what I'm watching. Instantly subbed.
I basically did this a good few years ago for an A-level project (yeah, it was massively out of spec for what I *should* have been able to do): designed an 8-bit CPU (a somewhat more basic architecture than yours, could only access 256 bytes of RAM, and all registers were memory mapped) in Logisim, and wrote an emulator in C#. Interestingly, you took the same approach as I did with running instructions, one big case switch, but having recently written a Z80 simulator, I kind of realised that was a poor design choice.
In the process of writing the emulator, I also wrote a simple assembler; very simple, but it still offered a useful feature set. I actually designed a few circuits that made the architecture unique: while I did store an overflow and negative bit in a flags register, I also implemented a magnitude comparator, which meant I had some opcodes for jump if less than, jump if more than and jump if equal, as well as jump if magnitude greater than (equal to and less than), which made some operations use significantly fewer instructions.
Was a fun project, especially given how much experience I didn't have at the time. Logisim wasn't brilliant, but it worked for the design. My test programs were fairly simple: counting up and down to 100, calculating the Fibonacci sequence etc...
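For anyone who hasn't written an emulator: the "one big case switch" style described here (and used in the video) looks roughly like the sketch below. This is a toy under made-up assumptions, with an invented opcode set and register layout, not the commenter's C# or jdh's actual code:

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Toy fetch-decode-execute loop in the "one big switch" style.
   The opcodes, 2-byte instruction format and register file are
   all invented for illustration. */
enum { OP_HLT = 0x0, OP_LDI = 0x1, OP_ADD = 0x2, OP_JNZ = 0x3 };

typedef struct {
    uint8_t ram[256];   /* 256 bytes of RAM, as in the commenter's design */
    uint8_t reg[4];
    uint8_t pc;
    int     halted;
} Cpu;

static void step(Cpu *c) {
    uint8_t op  = c->ram[c->pc++];
    uint8_t arg = c->ram[c->pc++];             /* fixed 2-byte instructions */
    switch (op) {
    case OP_LDI: c->reg[0] = arg;              break; /* load immediate into r0 */
    case OP_ADD: c->reg[0] += c->reg[arg & 3]; break; /* r0 += rN */
    case OP_JNZ: if (c->reg[0]) c->pc = arg;   break; /* branch if r0 != 0 */
    case OP_HLT: default: c->halted = 1;       break;
    }
}

int main(void) {
    Cpu c = {0};
    uint8_t prog[] = { OP_LDI, 5, OP_ADD, 0, OP_HLT, 0 }; /* r0 = 5; r0 += r0 */
    memcpy(c.ram, prog, sizeof prog);
    while (!c.halted) step(&c);
    printf("r0 = %d\n", c.reg[0]);             /* prints r0 = 10 */
    return 0;
}
```

The usual alternative, and one fix for the design problem the comment mentions with the Z80's large, irregular opcode space, is a 256-entry table of function pointers indexed by the opcode byte.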
Alternative title: "how to get rid of Intel Management Engine without actually get rid of Intel Management Engine"
Now Intel is buying RISC-V, which was a promising alternative
😭
@@nicolefischer1504 Well, thanks for the clarification but that's not much better 😂
Amd
@Marco Bonera Intel bought SiFive, a RISC-V manufacturer; the ISA itself is still open for anyone to use
I know those Ben Eater tutorials carried you so hard
Next: creating Super Mario Bros. on a lamp
How 'bout Metal Gear on a calculator?
@@philosophicalearthworm6819 it's not universal enough. Surprisingly, calculators actually change a lot overseas. Most of Europe uses Casio because Texas Instruments are way too expensive
@@philosophicalearthworm6819 that would be easier
@@InnerEagle too easy for jdh? lol
I am genuinely curious about calculators(despite not being a math person), like ways that they could be re-purposed.
I took a class in college where we described a computer's operations and then used "Register transfer language" to express all the operations; these were then collected and used to generate the logic for each left-hand-side of those expressions.
The computer happened to be a PDP-8.
Note that there was no anachronistic 64K ROMs for decoding!
I would be more impressed if you actually did express the design using only basic logic gates, preferably as "sum of products" or "product of sums" subexpressions wherever possible.
(This was in the late 1980's)
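For anyone unfamiliar with the terminology: in a sum-of-products decoder, each control signal is an OR (a sum) of AND terms (products) over the opcode bits and their complements, i.e. exactly two levels of gates and no lookup ROM. A hypothetical sketch in C, with made-up opcodes and control lines:

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical 4-bit opcode, bits o3..o0. Each control signal is a
   sum (OR) of products (ANDs), which maps directly onto two levels
   of gates instead of a decode ROM. */
typedef struct { int load_a, write_mem; } Controls;

static Controls decode(uint8_t op) {
    int o3 = (op >> 3) & 1, o2 = (op >> 2) & 1,
        o1 = (op >> 1) & 1, o0 = op & 1;
    Controls c;
    /* load_a asserted for (made-up) opcodes 0001 and 0101 */
    c.load_a    = (!o3 && !o2 && !o1 && o0) || (!o3 && o2 && !o1 && o0);
    /* write_mem asserted only for (made-up) opcode 1000 */
    c.write_mem = (o3 && !o2 && !o1 && !o0);
    return c;
}

int main(void) {
    Controls c = decode(0x5);   /* opcode 0101: load_a=1, write_mem=0 */
    printf("load_a=%d write_mem=%d\n", c.load_a, c.write_mem);
    return 0;
}
```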
I feel proud of already having done that, tho jeez, congrats, the amount of work must've been insane, I mean it took me days to work on my own 16-bit computer design.
Also good luck for the next video: "I build my own computer in real life on a breadboard"
I understood very little of this more than superficially but the coding montages are very cool to watch, so I had to leave a like.
I never thought I would be seeing logisim ever again but god damn was I happy when I saw it. Might be old and discontinued (at least the original) piece of shit but it just works™
I love Logisim, holds a very special place in my heart. don't know any other programs like it and don't want to.
@@jdh Same, I can't really use anything else because I don't know of another simple software that is as complete as Logisim.
Though every time I have to rotate components or choose the number of inputs of a component from a list, I die a little bit inside
There actually is a fork of Logisim, called logisim-evolution, that is still maintained
There is an open source program similar to Logisim that is being actively developed; you can find it at hneemann/Digital on GitHub. Don't know how it compares to Logisim though
@@jdh mulimedia logic (yes mulimedia)
I have huge respect for all the low-level C/C++ and assembly programmers, as I am a Python programmer and find these languages really difficult
I am a Python programmer, but I am switching to C now. I don't really code in Python much now, because of C. Believe me, C/C++ is easy, you just need to learn the syntax. Assembly is easy as well, you just need to understand how your CPU works, what memory addresses are, etc.
@@dominikmazurek753 how do you learn this lower level stuff? I have java exp but want to learn computers on the lower level as that'll help me be a better dev tbh
@@avidreader6534 Watch Ben Eater
@@neekap5987 will do, thank you. Any more advice would be greatly appreciated too lol
@@avidreader6534 Sure, I would advise you to try the Harvard CS50 course for learning the C language; you can skim over concepts that you already know about from Python, but you should focus on pointers and memory management
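Since a few replies here point at pointers and memory addresses as the thing to focus on, the whole idea fits in a few lines of C:

```c
#include <stdio.h>

int main(void) {
    int x = 42;
    int *p = &x;    /* p holds the memory address of x */

    printf("x lives at %p and holds %d\n", (void *)p, *p);
    *p = 7;         /* writing through the pointer changes x itself */
    printf("x is now %d\n", x);
    return 0;
}
```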
This is amazing. I wrote a cross compiler in a manic weekend, but the rest of this is just beyond me.
I was tired of "you missed a semicolon" errors, so the compiler just printed the error message and said "one assumed" when it could.
"So today we're going to write our ohne matrix."
Goes to show perfection gets in the way of progress. Great job with this "computing independence" project!
next next video: I mined and refined silicon into wafers to build my own SoC
Absolute insanity, absolutely amazing, massive kudos to you for doing this.
This is awesome. I did this exact same project in Logisim back in my undergrad: 8-bit CPU with a built-in video display and an assembler I wrote in Java.
Up next: loading it onto an FPGA, making your own internet, and then deep-diving into VLSI design to get down to CMOS transistors and clock generation. This is fun!!
This low-level stuff is super impressive
Crazy how some logic gates arranged in the right way can make literally anything happen
Real question: where did you learn all this stuff?
on the internets !
@@casimirwallwitz8646 the osdev wiki won't be too much help if you're making your own CPU.
His Tetris OS honestly isn't that hard to do; he basically just got a basic booting OS and then started on graphics right away after input was working.
If you were creating an OS meant to be a real OS, you would also need to worry about paging, user mode, system calls, executable formats etc
he is rich
@@leyasep5919 People using the internet for the purposes it was likely originally intended... It is strangely hypnotizing.
@@pythonista_333 And that has to do what, exactly?
Possible Next Video: I created my own TV that can only let you play tic-tac-toe
Ahhh man logisim brings me back. The logic projects I got to do in uni were my absolute favorite. Writing quick sort in MIPS is a close second. Love everything about this project
JavaScript being in jdh's book of uncool things is so unfathomably based
Using particles that some other being created seems too cheaty, you've got to create your own universe next.
spoiler if you are like me: don't worry, he renames the second E instruction to F in the doc at 5:27
I couldn't take my eyes off that second E over there
phew* thank goodness
I love your videos!
They inspired me to try making simple 2D games in C++ last weekend
“If you wish to make an apple pie from scratch, you must first invent the universe.”
― Carl Sagan
I like how you’ve come from web to os to CPU ARCHITECTURE!! love your videos btw
*”mumbo jumbo says the redstone is simple when it sounds really complicated!”*
People who understand this video: *amatuers*
Mumbos redstone skills are overrated as fuck :D Just saying, I like his vids. I always cringe when somebody says he's the best at redstone. But he's not even saying it, people are.
I grew up watching redstone computers and stuff, in terms of this it kinda sounds trivial.
@@aviko9560 redstone computers are different in my opinion; what Mumbo Jumbo does is actually practical and useful, but who's actually going to use a redstone computer? It's impressive, but practically, Mumbo Jumbo's redstone is more useful.
@@aviko9560 Mumbo actually once said that he isn’t the best at redstone
Legends who watched all of crash course computer science and binged all of ben eater's videos: noobs
@@aviko9560 yeah, he literally says that what he's doing is simple.
"I designed my own custom computer just to play PONG"
They did this in the 70s. Quite a lot, actually.
*So make pong with literal atoms.*
*make a computer with air molecules and stuff*
Or just p a r t i c l e s
YEESSS YEEESSS YESSS Man, you are a beast! So much time spent on this, it's almost incredible! Make that circuit and the screen too, for fucks sake! Amazing work! I reeeeallly would love to know where you learned or searched for all the info you needed to make such a thing like this tho, because if you are not an electrical/electronic engineer you must have worked your brain AF to make this happen. Congrats once more, and I will be anxiously expecting the next episode!!
EE here. Props for doing it with logic gates. Takes a much bigger brain to do that without tools like Verilog. Verilog is easy to learn if you know C already. And the best part is you can then drop it right into an FPGA. I hope you aren't planning to wire up raw gates using transistors. You will save many months of your life if you buy a cheap FPGA, and learn VHDL or Verilog. Curious which way you go with it.
This is really cool, this popped up in my recommended and I already love it. But you should make your own OS to run on the CPU, which can run a basic word processor that you made.
What is an OS? Why do consoles tend to only have copy protection code in ROM, but no OS? Home computers at least had BASIC . Unix started as a file system.. So on a console with only ROM, or only a CD-ROM you don’t need an OS?
Damn, never say "I use Arch" to this guy, he'll completely humiliate you
Good thing I'm putting Fedora back on my laptop as I'm typing this then.
A day with a jdh upload is a good day
Next video:
“I’ve decided that I’m too good for computers”
Just wanted to say, you're extremely underrated, and I can't wait for your future success. You're an inspiration and a perfect mix of learning and insanity, and it's addicting!
I've made a couple of redstone computers in Minecraft; most are just paraplegic calculators, and this is a whole other level
He set the bar really high with his first video but he never disappoints
So I've been thinking and came to the conclusion that the world isn't my tier, so I remade the *whole world atom by atom*
Next video: I decided to make my own universe to house my custom computer…
I love the exponential complexity evolution on this channel. The next step should be custom math & logic for Space Invaders.
Taking custom pc to a whole other level
Man I'm addicted to this channel...😳😳
Next video: Making the first 128 bit computer but just to play PAC MAN
You mean Itanium? 128-bit registers appeared in the N64 and are known as SSE2 today.
I very rarely comment, so I just wanted to say that you're my hero; this is amazing and quality work
This is the guy that answers questions on StackOverflow
This is the closest thing I can find on the entire internet to a tutorial on making a computer from scratch
LOVE THIS. Makes me wanna dive in and do the same myself! 😅also can't wait to see the next part in the series!
VHDL and Verilog aren't witchcraft! I find it quite impressive how well you were able to design this at a gate level. Because if you could do this at a gate level, writing this CPU in VHDL or Verilog would have easily taken you about an hour or 2 max.
I write a different CPU in Verilog each morning before breakfast
"an hour or 2 max" plus a week of writing testbenches and debugging lol
@@theforeskinsnatcher373 I read that branch delay slots need a lot of testing. But then it's only one bit in addition to the return address. It should only need twice as many tests and only one branch in microcode.
next video: I realised Assembly is too high level, so I coded Breakout in binary
Ben Eater: starts from the bottom and works upwards. jdh: starts from the top and works downwards
Bro wrote a custom architecture, an EMULATOR, and a fricking ASSEMBLER? holy cow this guy's unstoppable. I don't care if you say you dont have "formal education" in this stuff, you're a whole hecka lot better than me.
4:27 16 instructions? That's a lot... you can do the same with just one!
subleq
An off-the-shelf ALU has input pins. Why do you want to waste them?
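subleq ("subtract and branch if less than or equal to zero") really is a complete one-instruction computer, and the interpreter fits in a few lines of C. The word size, memory size and halt-on-negative-target convention below are arbitrary choices for illustration:

```c
#include <stdint.h>
#include <stdio.h>

/* subleq a, b, c:  mem[b] -= mem[a]; if (mem[b] <= 0) jump to c.
   Every instruction is three words; a negative jump target halts. */
static void run_subleq(int32_t *mem) {
    int32_t pc = 0;
    while (pc >= 0) {
        int32_t a = mem[pc], b = mem[pc + 1], c = mem[pc + 2];
        mem[b] -= mem[a];
        pc = (mem[b] <= 0) ? c : pc + 3;
    }
}

int main(void) {
    /* one instruction: mem[4] -= mem[3]; 2 - 5 <= 0, so jump to -1 and halt */
    int32_t mem[8] = { 3, 4, -1, 5, 2 };
    run_subleq(mem);
    printf("mem[4] = %d\n", mem[4]);   /* prints mem[4] = -3 */
    return 0;
}
```

MOV, ADD, JMP and the rest get macro-expanded into chains of subleq, which is why one instruction suffices in theory and is miserable in practice.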
The legend has returned
10:28 is just beautiful for my eyes & ears ngl
_After a couple more videos_
Jdh : So I theorized 'jdh physics' so that entire universe is just Tetris.
The point of an LL(1) language is so you can easily build a recursive descent parser by hand for it. Perhaps you mean LALR(1) languages such as Yacc and Bison can automatically generate parsers for?
As for recursive descent parsers, you just need one function per nonterminal in the grammar of your language and in those one clause for each production that can replace that non terminal.
And at the start of each function you can pick which clause to use based on the next input token. That's what LL(1) stands for. Left to right scanning of the parser across the input tokens, doing Leftmost expansions. With one token of lookahead.
Each clause is straightforward as well. For each symbol in the corresponding production, we treat them, in the order listed in the production, in the following way:
A nonterminal means call the function for that nonterminal.
A terminal means consume a token from the input stream. If the token read isn't the same type that the production expects, there's an error and the program doesn't parse.
If the token is of the required type, you consume it somehow. Minimally remove it from the token stream. If the goal is to just check programs are correct, you can just ignore the token after you remove it from the token stream. Your parser will simply do nothing except empty the input of tokens for a correct program but produce an error if you input an incorrect program.
If you want to do something more interesting like generate code you can have your tokenizer add more information to your token, and that information will be in its proper context, and not only that you can leave breadcrumbs for subsequent steps.
For example you could tag variable name token types with the actual variable name. Or some computer readable equivalent. As you go along you can build up a symbol table containing such things as the type for type checking and the memory allocated to the variable.
You could implement block scoping by having a stack of or chain of symbol tables. And if a variable name isn't found in the deepest nested block scoping you look for it in the next highest scope and symbol table.
You can also do something with it like generate code for an assignment. Which would simply be a store into the memory location of whatever the right hand side evaluates to. Which would be calculated recursively and available when the function for the last nonterminal returns.
That said an assembler doesn't likely need a parser, just a lexer or regular expression matching/FSM because there's likely no nested language constructs.
And two passes, so you can find/compute the address of labels when you see the definition of a label and then come back and replace any forward references.
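The "one function per nonterminal, one clause per production, one token of lookahead" recipe above maps directly to code. Here is a self-contained C sketch over a made-up toy grammar (single-digit numbers only), not tied to any particular assembler or compiler:

```c
#include <stdio.h>
#include <stdlib.h>

/* Recursive descent for the toy grammar:
     expr -> term (('+' | '-') term)*
     term -> DIGIT | '(' expr ')'
   One function per nonterminal; "look" is the single token of lookahead. */
static const char *src;
static char look;

static void advance(void) { do { look = *src++; } while (look == ' '); }

static void expect(char t) {             /* consume a terminal or report an error */
    if (look != t) { fprintf(stderr, "expected '%c'\n", t); exit(1); }
    advance();
}

static int expr(void);                   /* forward declaration for the recursion */

static int term(void) {
    if (look == '(') { advance(); int v = expr(); expect(')'); return v; }
    if (look < '0' || look > '9') { fprintf(stderr, "expected digit\n"); exit(1); }
    int v = look - '0';
    advance();
    return v;
}

static int expr(void) {
    int v = term();
    while (look == '+' || look == '-') { /* choose the clause by lookahead: LL(1) */
        char op = look;
        advance();
        v = (op == '+') ? v + term() : v - term();
    }
    return v;
}

int main(void) {
    src = "(1+2)-3+9";
    advance();
    printf("%d\n", expr());              /* prints 9 */
    return 0;
}
```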
Next: "I travelled to another dimension and caused the Big Bang to make my own universe"
I feel like such a terrible CSC student right now
This is more of a hardware engineering thing, often you learn about this kind of thing when working with FPGAs and such. Computer science has lots of far higher level concepts and ideas. You can come across it in pure computer science of course, but it's not too common.
@@photonicpizza1466 Yeah this is just stuff I'd love to learn. His making Minecraft in C video makes me want to try something like that as well.
Dang, Intel better watch out, there’s a new player in the game, and he’s about to bankrupt them
The fact that this video does not have over a million views already is criminal. Great video!
The CPU design animation is a thing of beauty.. very impressive. I could write something about the psychology of people who aim for total control and independence, but this text field won't allow the necessary number of characters. Looking forward to seeing the finished machine video. Hats off for your hobby-project focus. (Btw, I created a physical Pong game using Tinkerforge components and a Java program to drive/read the three OLED displays, joysticks and buzzer in 2-3 days.. yeah, I know, that's peanuts compared to your project ;-)