Fun video, lively and clear presentation! But - did I hear correctly?! - Charles Babbage did not use 'valves & tubes & all kinds of craziness': his Difference Engine was purely mechanical, with gears & levers. And let's never forget the pioneering work of Ada Lovelace!!
The term I learned was "complete base", rather than "universal". My book recommendations: 1) The Turing Omnibus, A.K. Dewdney. 2) Bebop to the Boolean Boogie, Clive Maxfield.
Oh man... this reminds me of when I took a logic class in college. The professor was brilliant but had a _super thick_ Indian accent. So in a class about zeros and ones, I had a professor who could pronounce neither 'zero' nor 'one'. On top of that, we had randomly assigned "lab" partners... and mine was a Korean dude who spoke almost literally zero English (cool dude though! We pretty much just smoked weird foreign cigarettes and communicated through gestures and math lol). That semester was an absolute nightmare. I think I got a B, but that was by far the hardest class I've ever taken. Makes much more sense the second time around, honestly. Great video btw, I'm subscribing :)
In fact, all one needs is a NAND gate to compute all logic. ALL computation, including that in your brain, can be done with a simple NAND operation (of course, in immensely complex expressions).
That all this stuff was thought up before we had whizzy silicon things just blows my mind. I wonder if Boole had half an inkling as to the true possibilities his ideas would unleash?
Yes. It's needed for building electronic machines and also in programming. In electronic design you need to know how it's made of transistors, and in some cases that can be a hard challenge.
You can read the output column as a number and use that number to uniquely identify a function by number. E.g. if the bottom row is MSB, AND is function 8, TRUE is 15, OR is 14, etc.
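Neat trick! That numbering can be sketched in a few lines of Python (just an illustration; it assumes two inputs and takes the bottom row, inputs (1,1), as the most significant bit, as the comment describes):

```python
def function_number(f):
    """Read f's truth-table output column as an integer, bottom row (1,1) = MSB."""
    n = 0
    for a, b in [(1, 1), (1, 0), (0, 1), (0, 0)]:  # bottom row first
        n = (n << 1) | f(a, b)
    return n

assert function_number(lambda a, b: a & b) == 8   # AND
assert function_number(lambda a, b: a | b) == 14  # OR
assert function_number(lambda a, b: 1) == 15      # TRUE
```

With this convention all 16 two-input functions get the numbers 0 through 15.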
fantastic episode :) ⭐️⭐️⭐️⭐️⭐️ I'm getting the impression there's a "dice info-anim renderer pro" or something out there somewhere. I've seen this exact animation style before.
Once you arrive at Alan Turing, I think also Alonzo Church deserves a shout-out for the equally powerful Lambda Calculus. But hopefully, you will do that in a future video.
He presented a dichotomic way to produce the formula for any function. But there's a simpler method: just list all the cases that produce a "1" output and OR these. Example: a 3 input function that is always 0 except for 000 and 011 inputs. The function is f=(~a * ~b * ~c) + (~a * b * c) (* is AND, + is OR, ~ is NOT)
Oh, I like that :) I remember doing something similar with AND: you just AND the inputs, like (A&~B&C), and then OR together any of the 1's in the table. Cheers for sharing mate!
@@WhatsACreel The method is to find the 1s and then OR subfunctions that are zero everywhere except for that particular input. That produces a sequence like ( *... ) + ( *... ) + ( *... )... But we can do the opposite: find the 0 outputs and then AND subfunctions that are 1 everywhere except for that particular input, and get ( +... ) * ( +... )... en.m.wikipedia.org/wiki/Canonical_normal_form
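The sum-of-products construction described above is easy to mechanize. A small Python sketch (the function name is mine, just for illustration), using the 3-input example from the comment, which is 1 only for inputs 000 and 011:

```python
from itertools import product

def sop_from_truth_table(truth):
    """OR together one AND-term (minterm) per input row whose output is 1.
    `truth` maps input tuples to 0/1."""
    minterms = [bits for bits, out in truth.items() if out == 1]
    def f(*args):
        # f is 1 iff the inputs match one of the 1-rows exactly
        return int(any(all(x == m_x for x, m_x in zip(args, m)) for m in minterms))
    return f

# f = (~a * ~b * ~c) + (~a * b * c): 1 only on 000 and 011
truth = {bits: int(bits in {(0, 0, 0), (0, 1, 1)})
         for bits in product([0, 1], repeat=3)}
f = sop_from_truth_table(truth)
assert all(f(*bits) == out for bits, out in truth.items())
```

The dual product-of-sums form works the same way, collecting the 0-rows instead.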
A sum of products is the way most combinational circuits like this are implemented, because it settles in constant time. Look up Karnaugh map if you want to see how to optimize it. ;) Of course there are programs that will do this for you. Logisim is one I know of, although the interface for inputting a truth table can be kind of painful.
It is much simpler to use Karnaugh maps to reduce the truth table to simple mathematical operations, especially as the number of bits increases. Another way is to simplify the truth table directly: select the rows where the output is 1 and add them. For example, the XOR gate:

A B O
0 0 0
0 1 1
1 0 1
1 1 0

What we notice is that A = 0 and B = 1 outputs 1, or A = 1 and B = 0 outputs 1, so we can translate the function above into O = A'B + AB', which can also be written O = (~A)&B | A&(~B).

If the output function is too big, it can be simplified using simple algebra. Example, the OR gate:

A B O
0 0 0
0 1 1
1 0 1
1 1 1

O = A'B + AB' + AB = A(B + B') + A'B = A + A'B = A + B

Useful identities: A + A' = 1, A + A'B = A + B. Notation: A' = ~A, AB = A&B, A + B = A|B.
Before you transitioned to Alan Turing, you reminded me of lambda calculus in general, which can also do everything, with the S K logic. PS: Turing machines and lambda calculus are equivalent XD quite the relationship there.
Had to pause and come see if someone made this comment. Spot on! However, while the Difference Engine was made of gears and rods, not valves and tubes (vacuum tubes came in 1904), there is still some potential validity, as he could possibly be referring to steam valves and copper tubes. Given that had the DE actually been completed and put into operation it would likely have been operated by steam, as indicated by his purported quote: "I wish to God these calculations had been executed by steam." Check out this steam-driven 4-column difference engine: ua-cam.com/video/t8aYkow-Fv8/v-deo.html Still, the engine itself was gears on rods, with carry arms - purely mechanical, as you say.
Great video!! Loved the content. Extremely curious about the prime gate you mentioned @13:53: is it just an odd detector or does it work with binary primes?
there are ways of explaining simple things that make them almost impossible to understand. but there are also ways of explaining complex things in ways that make them trivial to understand.
Excellent CGI. I'm assuming you used Blender? Makes me want to play with esoteric languages again, BF being my favorite. Though I will admit to laughing harder than I should have at Moo.
Along with NAND are the lesser known alternatives:
NOR (a nor a = not a)
AND-NOT with ONE (1 and not a -> not a)
OR-NOT with ZERO (complementary to &~ with 1)
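NOR's universality can be spot-checked the same way NAND's usually is; a quick Python sketch verifying the three classic constructions over all inputs:

```python
NOR = lambda a, b: 1 - (a | b)

NOT = lambda a: NOR(a, a)               # a nor a = not a
OR  = lambda a, b: NOT(NOR(a, b))       # invert NOR to recover OR
AND = lambda a, b: NOR(NOT(a), NOT(b))  # De Morgan: a AND b = NOT(NOT a OR NOT b)

for a in (0, 1):
    assert NOT(a) == 1 - a
    for b in (0, 1):
        assert OR(a, b) == (a | b)
        assert AND(a, b) == (a & b)
```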
Excellent video! I read a book ages ago called "But how do it know?" which covered not just this, but how to make memory cells, RAM, registers, everything. Basically a full NAND to Tetris style book. This was a really good summary of the same fundamental concepts, excellent work!
Thanks mate!! Sounds like a great read, if I ever come across that book I'll defo pick it up! Thanks for watching :)
This book is a hidden gem! I go back to it once in a while, simply amazing material.
I bought this book for my son and nieces/nephews, and I have extra copies on stand-by 😅
Book ordered. Thanks for the pointer :-)
@@RupertReynolds1962 Great! If you like the subject you'll definitely love the book!
This is the concept behind a Single Instruction Set Computer using NAND. Another great addition would be to show how NAND can be used to create a latch which is used for memory, flip flops, pipelining, etc.
Oh that is a good addition!! Cheers for watching :)
Attempting to construct a computer exclusively from identical gates whose inputs are indistinguishable from each other will yield a problem: if all gates have a propagation time of exactly N, all positive feedback loops will have a propagation time that's a multiple of 2N, making it impossible to build a synchronizer. If one can control propagation times to avoid such issues, or impose setup/hold requirements on all inputs and outputs, including things like buttons, building a computer out of NANDs wouldn't be the most practical approach, but it would hardly be impossible. The number of two-input NAND gates required to form an N-bit RAM, for example, would be O(N), not even O(N lg N), if one is willing to accept an O(lg N) access time. And some practical machines, like the Apollo Guidance Computer, were built almost entirely out of a single kind of gate (NOR gates in the case of the AGC), except for things like the memory system, which could have been built out of NAND gates but was more efficiently constructed using magnetic cores.
Can you link somewhere for me to look into this?
@@mocringe811 It's possible to build a one-bit "RAM" using four NAND gates, if the data input is available in both true and complemented forms. Given two N-bit RAMs, it's possible to produce a 2N-bit RAM by adding about eight NAND gates. As RAMs get bigger, the average cost per bit will increase, but never exceed twelve NAND gates (1.5 times 8) per bit.
I've played around a bit with these logic gates, making these registers, ALUs and such, I thought Daniel was referring to the NAND function itself creating memory, thanks anyway
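For anyone curious how NAND alone really does create memory: cross-coupled NANDs form a latch. Here's a rough Python simulation of the four-NAND gated D latch described above, with the data available in true and complemented form (the settle loop stands in for real propagation delay; this is a sketch, not a timing-accurate model):

```python
def nand(a, b):
    return 1 - (a & b)

def d_latch(d, enable, q, qbar):
    """Four cross-coupled NANDs; iterate so the feedback loop settles."""
    for _ in range(4):
        s = nand(d, enable)        # set side (active low)
        r = nand(1 - d, enable)    # reset side, fed the complemented data
        q, qbar = nand(s, qbar), nand(r, q)
    return q, qbar

q, qbar = 0, 1
q, qbar = d_latch(1, 1, q, qbar)   # enable high: store a 1
assert (q, qbar) == (1, 0)
q, qbar = d_latch(0, 0, q, qbar)   # enable low: input ignored, the 1 is remembered
assert (q, qbar) == (1, 0)
q, qbar = d_latch(0, 1, q, qbar)   # enable high again: store a 0
assert (q, qbar) == (0, 1)
```

The "memory" is nothing but the feedback: each NAND's output feeds the other's input, so the pair holds its state while enable is low.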
Oh nice, I like how you used a variety of games to motivate the concept of completeness! It gives a solid intuition because games are obviously logic-based and computable, but also an unlimited, creative, imaginative space.
There is an interesting concept called an OISC: One Instruction Set Computer. It's a processor that only does one operation while still being a general-purpose computer. An example of a single operation that could be used for this computer is SUBLEQ: subtract and branch if the result is less than or equal to zero. There's an excellent article on TechTinkering that illustrates this one.
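For the curious, a SUBLEQ machine fits in a handful of lines. A minimal Python sketch (conventions differ between write-ups; here each instruction is three cells and a negative jump target halts):

```python
def subleq(mem, pc=0, max_steps=10_000):
    """Run a SUBLEQ program: each instruction is 3 cells (a, b, c).
    mem[b] -= mem[a]; if the result is <= 0, jump to c, else fall through."""
    while 0 <= pc <= len(mem) - 3 and max_steps > 0:
        a, b, c = mem[pc], mem[pc + 1], mem[pc + 2]
        mem[b] -= mem[a]
        pc = c if mem[b] <= 0 else pc + 3
        max_steps -= 1
    return mem

# "subleq 3 3 -1": subtract cell 3 from itself (zeroing it), then halt.
mem = subleq([3, 3, -1, 5])
assert mem[3] == 0
```

Everything else (copying, adding, unconditional jumps) is built by chaining this one instruction.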
Your channel is like several years ahead of me; I've found really interesting videos here that are like 5 or 9 years old (such as self-modifying code and x86 Asm).
It's just too much information for a regular C# programmer like I am, but I always wanted low level things that I'm just starting to understand.
Anyway, thank you very much for this, because no one on the entire planet talks about it, and that is very valuable for me.
There's a great programming class online called Nand 2 Tetris where you start off with making all of the boolean operators with nand gates and then build an ALU and CPU with that and write assembly and eventually build an OS. But the key idea there is that everything is just layers of abstraction on top of a bunch of logical gates (which could technically be all nand gates)
Ages ago in school, I remember working out how to make a complete two-bit adder, using nothing but NAND gates. You can't do this with 'OR' only gates, or 'AND' only. But 'NAND' gates can be used to create NOT, AND, and OR so you're able to create just about anything.
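That exercise still holds up. Here's a rough Python sketch of one bit of such an adder, the classic nine-NAND full adder, brute-force checked against ordinary addition:

```python
from itertools import product

def nand(a, b):
    return 1 - (a & b)

def full_adder(a, b, cin):
    """Classic full adder built from nine 2-input NAND gates."""
    n1 = nand(a, b)
    n2 = nand(a, n1)
    n3 = nand(b, n1)
    n4 = nand(n2, n3)        # a XOR b (four NANDs)
    n5 = nand(n4, cin)
    n6 = nand(n4, n5)
    n7 = nand(cin, n5)
    s = nand(n6, n7)         # (a XOR b) XOR cin
    cout = nand(n1, n5)      # carry = ab OR (a XOR b)cin
    return s, cout

# check every input combination against integer addition
for a, b, cin in product([0, 1], repeat=3):
    s, cout = full_adder(a, b, cin)
    assert 2 * cout + s == a + b + cin
```

Chain two of these (with carry-in wired to carry-out) and you have the two-bit adder from the comment.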
I remember college learning these truth tables in 3 different classes - assembly programming, philosophy &logic, and statistical logic.
The third truth table in statistical logic would that be common-sense statistics or government statistics? Experience shows the two do share the same truth tables.🤣🤣=😭
@@robertjames4908 Definitely common sense. Government logic makes my head explode 🤣
Alan Turing is the reason we are all here... on the internet.. on UA-cam.. in front of our phones/computers/tables/etc... Cheers, mate!
Excellent video! I'm an experienced computer scientist, and this explanation is just perfect, especially the final conclusion that all computations can be done with one kind of logic gate, which is obvious for some of us but sometimes hard to explain. Your way of explaining is just perfect.
I kept saying “and gate” out loud over and over until you said Boolean and, it made me feel so accomplished to know it before it was said.
Isn't the NAND gate the "universal" gate though? I can't imagine how you'd create a NOT operation with just AND gates.
@@daniel.watching as far as I understand it, you just have one line coming in with constant power that goes straight to the output. Then you have a transistor coming off that line that goes to ground. When the transistor is off, the power goes past it into the output making a “1”. If it’s powered on, all the flow goes through the transistor and into ground making the output “0”.
Diagram:

pwr ---+--- output
       |
       T --- input (gate)
       |
     ground
@@jgained5065 Yes, but a transistor isn't an AND gate. You can make all the gates from just NAND gates. NOT is just the input going to both inputs. An OR is just a NAND with both inputs negated (with the NOT, so 3 NANDS). AND is obviously a NOT on the NAND output. Etc
You can make anything with NAND. I think you can do the same with NOR and XOR. But AND has no way to negate with just gates.
Also zeroing your signal to ground sounds like a really expensive way to make a NOT gate. You're basically shorting that signal. That means anything else reading that signal will also read 0. And it probably back propagates through all the transistors. I'm not an electrical engineer but it sounds inefficient.
@@daniel.watching I’m not really sure what you interpreted my comment to mean but I was talking about the beginning of the video, where Boolean and is introduced.
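The constructions in the thread above (NOT, OR and AND out of nothing but NAND) are easy to sanity-check; a small Python sketch:

```python
nand = lambda a, b: 1 - (a & b)

NOT = lambda a: nand(a, a)               # input wired to both NAND inputs
OR  = lambda a, b: nand(NOT(a), NOT(b))  # NAND with both inputs negated (3 NANDs)
AND = lambda a, b: NOT(nand(a, b))       # a NOT on the NAND output

for a in (0, 1):
    assert NOT(a) == 1 - a
    for b in (0, 1):
        assert OR(a, b) == (a | b)
        assert AND(a, b) == (a & b)
```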
I just luv the way you can make such dull subjects (just like boolean algebra can get in a discrete maths book) become so exciting to talk about, not to mention the unique sense of humor you have in discussing such things. You have the gift of teaching. Great stuff, just awesome!
I recommend the game Turing Complete to play around with this.
In the game you start with a nand gate and gradually build a computer from it, and then you program your computer.
I had a lot of fun with it.
I've never heard of that one but I have played a browser game called NAND Game
First year of curriculum: rewrite the expression only using NAND gates. The game starts when you golf it!
it can also be done using only XOR gates!
@@proloycodes a xor a = 0
So, XOR needs a constant. E.g.: not a = 1 xor a.
NAND works without constants, only non-inverted inputs.
So, NAND does it better ^^
@@programaths hmm can you show me a not gate without using constants, emulated using NAND?
@@proloycodes a nand a = not a
@@programaths great! you've converted me into NAND-ism!
All Hail the Great NAND!
lol
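The XOR question in the thread above can actually be settled by brute force: enumerate every two-input function reachable by composing a single gate over the inputs a and b, with no constants. A Python sketch (each function is represented by its four-row output column):

```python
def closure(gate, start):
    """All truth tables reachable by applying `gate` pointwise
    to pairs of already-reachable tables."""
    funcs = set(start)
    while True:
        new = {tuple(gate(x, y) for x, y in zip(f, g))
               for f in funcs for g in funcs} - funcs
        if not new:
            return funcs
        funcs |= new

a = (0, 0, 1, 1)      # output column of input "a" on rows 00, 01, 10, 11
b = (0, 1, 0, 1)      # output column of input "b"
NOT_A = (1, 1, 0, 0)

nand_reach = closure(lambda x, y: 1 - (x & y), {a, b})
xor_reach  = closure(lambda x, y: x ^ y, {a, b})

assert len(nand_reach) == 16     # NAND reaches all 16 two-input functions
assert NOT_A in nand_reach
assert NOT_A not in xor_reach    # XOR alone can never invert
assert len(xor_reach) == 4       # just 0, a, b, a XOR b
```

Without constants, XOR compositions only ever produce "linear" functions, which is exactly why the constant 1 was needed above.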
This is a very good way of explaining logic to some who might not know a lot about computing
Touching ending mate. I'd also like to thank Alan. Bloody legend.
Indeed! Thanks Alan!
There was nothing in this video that I didn't know before, but it's a nice refresher. A very nice refresher.
I don't have words to describe how much I love boolean algebra.. and this presentation makes me feel very much at home.. Thanks Creel ;)
Beautiful explanation. I'm familiar with logic gates and boolean algebra, but never knew that NAND and NOR are universal in this way. Just a fantastic job on this.
I was watching for the first 17 minutes, all the time expecting you to show that the NOR (or NAND) gate is all you need, and then just a few seconds later you actually got there. :-)
Brilliant video, thanks Creel for your effort. Some time ago I also watched and studied your entire (newer) playlist of assembly, I like the way you explain topics!
The video taught me more than the past two years of CS at university
This reminds me the digital electronic class and fpga class at university, good memories, thanks for the video
Thanks mate, amazing breakdown. “NAND is all you need”
Now of only someone could explain abstract algebra/group theory with this level of clarity :)
That's a fine idea!! I do love Socratica, if you've not seen their videos on the topic, they're definitely worth a watch!
I would love to make videos on that. Not sure when tho.
Thank you for watching, have a good one :)
@@WhatsACreel oh, I have not heard of them. I'll definitely have to check them out. I've been self learning that stuff after Haskell's heavy usage of abstract algebra concepts introduced me to it. The problem is that the main resources I've been using are Wikipedia and nLab... And my formal math education ended around pre-calc, so I often end up in a black hole on those sites.
Hey mate! Heads up, for the Phenom benchmark, I added AVX-512 options to both flops and SHR as a pull request
Oh that's cool! I'm not very good at keeping up with the GitHub. Is there something I need to do there?
I did want to get back into it. Possibly upload a much larger project, 64 bits per channel raw photo editor, not sure when I'll get round to it tho.
@@WhatsACreel You'd need to accept it as a merge! Username is the same.
In terms of implementation, it checks for the feature using cpuinfo and piggybacks off of the mechanism/style you used in the base functions for ease of readability.
Flops is very close to your own implementation, but with larger registers.
The SHR uses some tricks of moving across AVX ports 0/1 and port 5 for the xmm, ymm and zmm registers to save on instruction latency.
You can comment on the request before merging.
Nice! Takes me back to classroom days when, with paper & pencil, we used '•' for AND, '+' for OR and bars over top for NOT... Fun to swap things around like: NOT(a) • NOT(b) == "a NOR b"...
Figuring out XOR was something special...
Really good video, really like the way you present everything with the animations. Keep it up!
Boole based his algebra on the ancient Indian system of logic, Nyaya. In fact, Boole's binary logic is a special case of Nyaya's ternary, quaternary... N-ary logic
The AR presentation is absolutely sick!
Excellent video! Great presentation and production, good sense of humor and all that. And, as has been mentioned elsewhere in the comments, a truly touching ending. Alan Turing was a phenomenal genius and we all owe him a debt of gratitude.
It's weird to see operations that I have come to intuitively learn how they work, broken down in great mathematical detail like this, and describing behaviours I know to be true from hands on experience but had never seen explained this way. Interesting video
Is this made in blender? I really love the graphics, helps me understand the concept a lot more. Great video!
wait... you're that aussie guy who makes great videos. i didn't know you were back!
This isn't a criticism, more of a continuation. While the fact that computers can be constructed entirely out of NAND gates is impressive, my favorite minimal Turing complete system would have to be the SK calculus. It's based on lambda calculus, which played a crucial role in showing the usefulness of Turing machines. But SK calculus breaks it down even further into simply two combinators (aka functions):
K, the constant function constructor: K x y -> x
S, the branch constructor: S x y z -> (x z) (y z)
That's it. That's Turing complete. The fact that this is the case is one of the most beautiful things in computer science. For example, you can make recursion without self referencing:
Y f -> f (Y f), where Y = S (K (S (SKK) (SKK)) (S (S S (S K (SKK))) (S (SKK) (SKK))
Of course lambda and SK calculi are not really able to be implemented directly in hardware the same way NAND is, but computers were not originally a physical concept. In a way, NAND is the face of minimal Turing completeness on the hardware side, while SK is the face of minimal Turing completeness on the software side.
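Since Python closures are lambda-calculus-ish, S and K can be played with directly. A small sketch (Python's eager evaluation means the Y combinator above would recurse forever, so it's omitted; the NOT at the end is written as a plain lambda for readability rather than as a pure SK term):

```python
K = lambda x: lambda y: x                       # K x y -> x
S = lambda x: lambda y: lambda z: x(z)(y(z))    # S x y z -> (x z)(y z)

I = S(K)(K)                 # identity: S K K x -> K x (K x) -> x
assert I(42) == 42

TRUE, FALSE = K, K(I)       # Church booleans: TRUE x y -> x, FALSE x y -> y
assert TRUE("a")("b") == "a"
assert FALSE("a")("b") == "b"

NOT = lambda p: p(FALSE)(TRUE)   # boolean NOT via selection
assert NOT(TRUE) is FALSE and NOT(FALSE) is TRUE
```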
Interesting, makes me remember Karnaugh maps, which not only compress the logic but also let you solve any complex combination of logic gate arrays using only one type of gate, like NAND. Nice to see another of your interesting videos!
Great video explaining boolean algebra.
I began studying boolean algebra in grade 7. This, and some other mathematics topics I was looking into at that time, drove my math teacher insane. She had gotten used to me returning the math exercise book we were given at the beginning of each school year the day after I received it with all questions solved since 4th grade. But now I was above her grade, and she could no longer answer my questions. Boolean algebra got me interested in computer science, so I studied computer science and math at university.
But since you mentioned that this can compute everything, please make a follow-up video showing how it can compute the halting problem :)
i love when i found videos that i know that i will return for years, good job
This was a nice and ... umm ... logical explanation of how this all fits together. Very complete.
If Boolean logic blows your mind, wait until you see FPGAs where you can define hardware, using software, by literally writing out giant boolean expressions.
Mr. Creel, Thanks Mate, You're Amazing.
Really liked the ending.
"Can compute EVERYTHING!" ...that is computable, anyway. Turing proved that not everything is computable.
And Turing machines have never been created, because no one has found a way to make infinite RAM. In all seriousness, if you find a new uncomputable function you might win yourself a spot on Wikipedia, alongside Rice.
Most things are computable unless you're trying a way to fuck with the machine's memory instructions.
@@theguythatcoment It's true that no one can create a Turing machine, but we still speak of programming languages as being Turing complete, even if they don't have infinite RAM.
A new uncomputable function? How about a function that can prove whether a program will write the number 5? Or 6? Or 7?
But everything is able to be generated. For example, most real numbers are uncomputable, but if you generate a random string of digits then you are generating an uncomputable number.
@@lyrimetacurl0 Why is that uncomputable? For example, 3.27 = 3 + .2 + .007.
@@quintrankid8045 Try listing every number between 0 and 1; there are infinitely many, and for most of them your computer is going to run out of RAM for that one number alone.
Nice video. At school, we also learned to use Karnaugh diagrams.
This brings back a lot of memories of making circuits with just NAND gates. It saved space, since a single chip comes with 4 or more NAND gates, while doing AND, OR, and NOT would require at least 3 chips. The NOR gate is universal too :P
Superb ! Brings me way back to 1st learning digital.
I'm old enough to have worked on 'computer controlled equipment' in the 70s, elevators on Naval supply ships to be specific. At that time, the 'computer' that controlled these elevators was a big hard-wired circuit board using discrete transistors. The gates were built from individual, relatively high-powered transistors and made modular so they could easily be replaced when one failed, so to save on the number of different logic gates, all the gates were NAND gates. All logic circuits can be created using combinations of NAND gates.
Great video! Was starting to expect to see Karnaugh maps :D
Wonderful and informative video as always, thank you for taking your time to do this.
Fun video, lively and clear presentation! But - did I hear correctly?! - Charles Babbage did not use 'valves & tubes & all kinds of craziness': his Difference Engine was purely mechanical, with gears & levers.
And let's never forget the pioneering work of Ada Lovelace!!
I'm always happy to see your video pop up into my feed :))
Wow what a channel :) you got a new sub!
Cheers, mate. Rockin' viddy. I can feel my brain much more than usual rn.
And this is the reason that the very first of the 74-series of TTL chips, the 7400, is exactly that. Four NAND-gates on a chip. Genius!
The term I learned was "complete base", rather than "universal".
My book recommendations:
1) The Turing Omnibus, A.K. Dewdney.
2) Bebop to the Boolean Boogie, Clive Maxfield.
Magnificent presentation. Thank you, sir!
Great video! Makes me go back in time when I went through the "From NAND to Tetris" Coursera content.
Oh man... this reminds me of when I took a logic class in college. The professor was brilliant but had a _super thick_ Indian accent. So in a class about zero's and one's, I had a professor who could pronounce neither 'zero' or 'one'. On top of that, we had randomly assigned "lab" partners... and mine was a Korean dude who spoke almost literally zero English (cool dude though! We pretty much just smoked weird foreign cigarettes and communicated through gestures and math lol). That semester was an absolute nightmare.
I think I got a B, but that was by far the hardest class I've ever taken. Makes much more sense the second time around, honestly. Great video btw, I'm subscribing :)
God is actually just NAND.
Your animation skills are as good as your programming!
In fact, all one needs is a NAND gate to compute all logic. ALL computation, including that in your brain, can be done with a simple NAND operation (of course, in immensely complex expressions).
That all this stuff was thought about before we had whizzy silicon things just blows my mind. I wonder if Boole had half an inkling as to the true possibilities his ideas would unleash?
Great video, I love the energy and the teaching style!
Keep up the great work!
This reminds me of the esolang named "FlipJump", it's simpler than Brainfuck but still Turing-complete, amazing
Yes, it's needed in building electronic machines and also in programming.
In electronic design you need to know how it's made of transistors, and in some cases that can be a hard challenge.
You can read the output column as a number and use that number to uniquely identify a function by number. E.g. if the bottom row is MSB, AND is function 8, TRUE is 15, OR is 14, etc.
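A quick Python sketch of that numbering scheme (assuming, as the comment says, the bottom row of the truth table, inputs a=1 and b=1, is the most significant bit):

```python
def function_number(f):
    # Pack the 4-row output column of a 2-input gate into a number 0..15,
    # with the (a=1, b=1) row as the most significant bit.
    return sum(f(a, b) << (2 * a + b) for a in (0, 1) for b in (0, 1))

AND  = lambda a, b: a & b
OR   = lambda a, b: a | b
TRUE = lambda a, b: 1

print(function_number(AND))   # -> 8
print(function_number(OR))    # -> 14
print(function_number(TRUE))  # -> 15
```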
Great video! You made me love computer programming again :D
Oh, that's great, cheers for watching! :)
fantastic episode :) ⭐️⭐️⭐️⭐️⭐️
I'm getting the impression there's a "dice info-anim renderer pro" or something out there somewhere. I've seen this exact animation style before.
Ha! Yes, it's Blender 3D :)
Nice 3d graphics :)
Missed your videos, they're great!
Once you arrive at Alan Turing, I think also Alonzo Church deserves a shout-out for the equally powerful Lambda Calculus. But hopefully, you will do that in a future video.
It gets more interesting when you discover what a Universal Construction is. (Category Theory)
The multiplexer is an even more admirable universal gate !
He presented a dichotomic way to produce the formula for any function. But there's a simpler method: just list all the cases that produce a "1" output and OR these.
Example: a 3-input function that is always 0 except for 000 and 011 inputs. The function is f = (~a * ~b * ~c) + (~a * b * c)
(* is AND, + is OR, ~ is NOT)
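That sum-of-products recipe is easy to sketch in Python, using the example above:

```python
# f is 1 only for inputs 000 and 011: OR together the minterms for those rows.
def f(a, b, c):
    return (not a and not b and not c) or (not a and b and c)

# Exhaustive check against the spec: 1 at 000 and 011, 0 everywhere else.
for a in (0, 1):
    for b in (0, 1):
        for c in (0, 1):
            assert f(a, b, c) == ((a, b, c) in [(0, 0, 0), (0, 1, 1)])
```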
Oh, I like that :)
I remember doing something similar with AND, you just like (A&~B&C) and then OR together any of the 1's in the table.
Cheers for sharing mate!
@@WhatsACreel The method is to find the 1s and then OR subfunctions that are zero everywhere except for that particular input. That produces a sequence like ( *... ) + ( *... ) + ( *... )...
But we can do the opposite by finding the 0 outputs and then ANDing subfunctions that are 1 everywhere except for that particular input, and get ( +... ) * ( +... )...
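The dual (product-of-sums) form, sketched for XOR, which is 0 only at inputs 00 and 11:

```python
# For each 0-row, AND a clause that is false only at that row:
# XOR = (a + b) * (~a + ~b)
def xor_pos(a, b):
    return (a or b) and (not a or not b)

# Exhaustive check against the usual XOR definition
for a in (0, 1):
    for b in (0, 1):
        assert bool(xor_pos(a, b)) == (a != b)
```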
en.m.wikipedia.org/wiki/Canonical_normal_form
@@cmuller1441 You know, I was thinking of normal form too!! BSAT and 3SAT!! Wow, what a puzzle, I looooovve them :)
A sum of products is the way most combinational circuits like this are implemented, because it settles in constant time. Look up Karnaugh map if you want to see how to optimize it. ;)
Of course there are programs that will do this for you. Logisim is one I know of, although the interface for inputting a truth table can be kind of painful.
Model for all the cubes in video?
Minecraft redstone uses universal NOR gates.
I have this on my computer. It's called the "excellent" key. Whatever you want, just press it, and it comes out excellent. Credit: S. Horvath.
It is much simpler to use Karnaugh maps to simplify the truth table into simple mathematical operations, especially when the number of bits increases.
Another way is to simplify the truth table: select the rows where the output is 1 and add them. For example, the XOR gate:
A B O
0 0 0
0 1 1
1 0 1
1 1 0
What we notice is that A = 0 and B = 1 outputs 1, or A = 1 and B = 0 outputs 1, so we can translate the function above into:
O = A'B + AB', which can also be written O = (~A)&B | A&(~B)
If the output function is too big, it can be simplified using simple math, for example the OR gate.
A B O
0 0 0
0 1 1
1 0 1
1 1 1
O = A'B + AB' + AB = A(B + B') + A'B = A + A'B = A + B
A+A' = 1
A+A'B = A+B
A' = ~A
AB = A&B
A+B = A|B
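The algebra above is easy to sanity-check by brute force in Python:

```python
# Verify that O = A'B + AB' + AB simplifies to A + B for all inputs.
for A in (0, 1):
    for B in (0, 1):
        sop = ((1 - A) & B) | (A & (1 - B)) | (A & B)
        assert sop == (A | B)
print("A'B + AB' + AB == A + B holds for all inputs")
```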
Nice to see another video, Creel. Keep up the good work!
Creel you mad man, another excellent video.
Come to Victoria so we can grab a beer
Before you transitioned to Alan Turing, you reminded me of lambda calculus in general, which can also do everything, likewise with the S K logic.
PS: Turing machines and lambda calculus are equivalent XD quite the relationship there.
BUT...!!!!....we already know the answer to EVERYTHING is 42 :)
Great to see you back!
Creel is the Steve Irwin of Computer Science
Crikey!! :)
4:14 Babbage wasn’t using “valves and tubes”. The Difference Engine was purely mechanical. The Analytical Engine would have been too.
Had to pause and come see if someone made this comment. Spot on! However, while the Difference Engine was made of gears and rods, not valves and tubes (vacuum tubes came in 1904), there is still some potential validity, as he could possibly be referring to steam valves and copper tubes. Given that had the DE actually been completed and put into operation it would likely have been operated by steam, as indicated by his purported quote: "I wish to God these calculations had been executed by steam." Check out this steam-driven 4-column difference engine: ua-cam.com/video/t8aYkow-Fv8/v-deo.html
Still, the engine itself was gears on rods, with carry arms - purely mechanical, as you say.
I used that (A&f(x)) | (~A&f(x)) trick to do conditional subtraction in a long division circuit I was emulating recently 😅
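A hedged sketch of that mask-select idea (assuming the two AND terms select between two different values, e.g. the difference and the original operand in a restoring-division step; the function name and width are made up for illustration):

```python
def cond_sub(x, d, width=8):
    # One restoring-division step: keep x - d if it doesn't borrow, else keep x,
    # using the branchless mux pattern out = (sel & a) | (~sel & b).
    mask_all = (1 << width) - 1
    diff = (x - d) & mask_all
    sel = (-(1 if x >= d else 0)) & mask_all  # all-ones when x >= d, else zero
    return ((sel & diff) | (~sel & x)) & mask_all

print(cond_sub(9, 4))  # -> 5 (subtraction taken)
print(cond_sub(3, 4))  # -> 3 (left unchanged)
```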
Great video!! Loved the content. Extremely curious about the prime gate you mentioned @13:53, is it just an odd detector or does it work with binary primes?
It only works with numbers up to 7, or if you allow wrapping, it outputs 1 for every 8k + p, where p is prime.
@@proloycodes I knew this was absurd or wouldn't work for all numbers
Patreon is the platform. Patrons are supporters.
there are ways of explaining simple things that make them almost impossible to understand. but there are also ways of explaining complex things in ways that make them trivial to understand.
Excellent CGI. I'm assuming you used Blender? Makes me want to play with esoteric languages again, BF being my favorite. Though I will admit to laughing harder than I should have at Moo.
all you need is nand ❤
Good Day Creel!
G'day mate :)
Yes, you can build an entire computer out of NAND gates (and a few others too). Congratulations, you have just saved 22 minutes of your life.
Incredible video, thank you!
Thanks mate!
shout out to xor, crazy ass fella
Electrickery is my new favourite word!
Good video.
Just a note, you only need NOT and OR to construct all binary Boolean operations because (p AND q) equals NOT (NOT p OR NOT q).
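That identity (De Morgan's law) is easy to check exhaustively:

```python
# p AND q == NOT (NOT p OR NOT q), so {NOT, OR} is a complete basis too.
for p in (False, True):
    for q in (False, True):
        assert (p and q) == (not ((not p) or (not q)))
print("De Morgan holds for all inputs")
```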
Nice video, really wish I had seen this in my computer science class😂
The button or switch that toggles the power state of a computer (off or on) is the fundamental bit.
No tubes or valves in Babbage’s devices, since they did not exist then.
Reminds me of “From NAND to Tetris”
Along with NAND are the lesser known alternatives:
NOR (a nor a = not a)
AND-NOT with ONE (1 and not a -> not a)
OR-NOT with ZERO (complementary to &~ with 1)
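For completeness, here's a sketch of how the basic gates fall out of NAND alone (function names are just for illustration):

```python
def nand(a, b):
    return 0 if (a and b) else 1

def not_(a):    return nand(a, a)               # a NAND a = NOT a
def and_(a, b): return not_(nand(a, b))         # invert the NAND
def or_(a, b):  return nand(not_(a), not_(b))   # De Morgan

# Exhaustive check against the built-in bitwise operators
for a in (0, 1):
    for b in (0, 1):
        assert not_(a) == 1 - a
        assert and_(a, b) == (a & b)
        assert or_(a, b) == (a | b)
```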