“Hello, world” from scratch on a 6502 - Part 1
- Published 21 Nov 2024
- Learn how computers work in this series where I build and program a basic computer with the classic 6502 microprocessor. More info: www.eater.net/...
Part 2: • How do CPUs read machi...
Part 3: • Assembly language vs. ...
Part 4: • Connecting an LCD to o...
Part 5: • What is a stack and ho...
Part 6: • RAM and bus timing - 6...
Part 7: • Subroutine calls, now ...
Support these videos on Patreon: / beneater or eater.net/support for other ways to support.
------------------
Social media:
Website: www.eater.net
Twitter: / ben_eater
Patreon: / beneater
Reddit: / beneater
Special thanks to these supporters for making this video possible:
Alex Catchpole
Armin Brauns
BakerStaunch
Beau-James Erion
Ben
Ben Dyson
Ben Kamens
Ben Williams
Bradley Pirtle
Christopher Blackmon
Clayton Parker Coleman
Daniel Tang
Dean Winger
Debilu Krastas
Dominic Kulmer
Dušan Dželebdžić
Eric Brummer
Eric Dynowski
Erik Broeders
Eugene Bulkin
fxshlein
HaykH
Ian Tait
Ivan Sorokin
JavaXP
Jay Binks
Jayne Gabriele
Jefferson Hunt
Jimmy Campbell
Joel Messerli
Joel Miller
Jordan Scales
Joshua King
Justin Duch
Kent Collins
Manne Moquist
Marcus Classon
Mats Fredriksson
Michael
Michael Burke
Michael Garland
Miguel Ríos
Nathan Wachholz
Nicholas Moresco
Onion Sniffer
Paul Pluzhnikov
Peter Simard
Randy True
Robert Butler
Sachin Chitale
Sergey Ten
SonOfSofaman
Stefan Nesinger
Stephen Smithstone
Steve Gorman
Thomas Ballinger
Vladimir Kanazir
xisente
Örn Arnarson
Another important skill missing from modern education: reading data sheets or even knowing where to find them. Thanks for making things so clear, Ben!
Wow, Anton likes computer stuff
@@lambdasun4520 "A joke is a display of humour in which words are used within a specific and well-defined narrative structure to make people laugh and is not meant to be taken seriously."
@@pauleveritt3388 spoken like a true seppo
Nice to see you here
wow u like computers too.... I watch ur videos
"So first, we'll be heading over to the nearest beach to gather our silicon"
I think this is the most a YouTube comment has ever made me laugh
Before that we have to create the solar system
I won’t start making real videos until I can start at the BEGINNING, dammit
I was literally looking for this comment
But before that we need a couple of hydrogen atoms so we can create silicon through fusion
As a software guy, watching this computer engineering stuff is fascinating to me. You're breaking down levels of abstraction I've never ventured so deep into and it feels magical.
The Fox army thats how it seems to me too
Its such cool stuff
The electronic engineers walked so that the software crowd could fly
Yeah this is more electrical engineering which was popular back in the 70s, but now computer science tackles way more complex problems that couldn't even exist if we were stuck with assembly!
This is ACTUAL programming, actually depositing the bits and bytes onto the CPU, directly ORDERING it, where it has no choice but to obey, not hoping some 10th-level abstracted "programming" language author knew how to get what you asked for down to the machine level to do the same.
@@unlokia god it must suck to be you, perpetually gatekeeping in order to inflate your ego. You have no right to call something programming or not, and this video doesn't teach you actual programming that could come in useful in a computer science field. Sorry if you're being sarcastic, but that kind of mindset is just baffling to me.
When you want to learn programming and you finally find a "hello world" tutorial... but it says "part 1".
kleinebre: I want to learn programming
yt: here, learn how machines work
A hello world shaped code that puts out hello world
if you're just starting out learning programming in general, I recommend starting with pure software 😅
unless you're like, really really good at making hardware already and just want to know how to program that specifically
@Hand Grabbing Fruits the most in depth
Means I'm digging deep into the subject. Thanks Ben Eater
Lol did not even mine the ore and smelt it into wire. You call this from scratch?
you don't need to smelt redstone
ikr ... kids these days have it so easy
He didn’t even create the universe it exists in
Do you use furnace to cook the ore
Making a microprocessor from scratch is easier and less grindy than making an ae2 system
THIS WAS INCREDIBLE!!! Having been doing reverse engineering for over 15 years now... the way you broke this down and went through each step, explaining why each step was done and why it was important, really sets the mind up so the next steps make sense. VERY WELL DONE! I can see the passion and time that went into this and I must say... KEEP UP THE GREAT WORK MY FRIEND!
I agree! He's done a 👍great job!!!
He's brilliant. Not a single spare word..
If you REALLY enjoyed it that much and learned something from it, throw the guy $10 or $20 to REALLY say thank you.
You know it's going to be a good day when you get a video from Ben!
Yup, been recommending his videos to everyone who will listen, glad to see a new vid :)
Ya r right
You know it's even better when he starts the video with VIM
Next video: Contacting extra-terrestrial life with my marble calculator
You know it's going to be a bad day when he doesn't do what the title says until PART 2 :(
Ben, receiving my 6502 computer kit was a highlight of my week, especially since it arrived so swiftly to Spain: ordered on Tuesday, delivered by Friday. The kit itself is exactly what I've been looking forward to, equipped with every component needed for assembly. Your projects have always been a source of inspiration for me, and this experience has only amplified my enthusiasm.
Your work not only educates but also builds a community of curious minds. I’m more excited than ever to be a part of this journey and to see where this newfound inspiration takes me.
Thank you!
Your content is so amazing, it makes me appreciate computers on a whole new level!!
I've worked in IT for 12 years and this holds true for me as well.
This low level stuff is muuuch more interesting than todays overpriced Smartphones! Who cares about A666 Ai bionic processor when you can have a 6502!
I feel the same.
Ditto!
@33Ddg209Ret7 actually earth will end tmrw soo......yeh.......
Watching your video brought me back to the early 80's, when I programmed a microprocessor for the first time in my life, using hexadecimal codes for each of the instructions like LDA, STA, etc., and writing them, along with data, into RAM before executing the 'program'. Good memories. Thank you for this video. Thumbs up.
Greetings from France !! 🇫🇷 where we begin the first year of engineering school with what you used 40 years back: assembler
So glad to use python right after 😄
I did the same. First programmed in BASIC on a TRS-80 in 6th grade, and when I got my own home computer, the Atari 800, guessing around 1982, I started doing 6502 stuff. I remember hooking servo motors up to the joystick ports and controlling their position with 6502 code. IIRC the joystick ports were serial i/o. I would have been 13 yo at that time.
When I made the move to the Atari ST, 68000 based, man, that was so much easier programming than 6502!!
WHOA!!!! brilliant. I learned assembler on the 6502 back in the early 80s as a teenager. This is really starting a trip down memory lane. You are a great teacher Ben. Thank you for sharing your content.
Same for me. Actually, when he was showing the opcodes, I checked A5, remembering it's LDA
Likewise, I built a UK101 (a UK version of the Ohio Superboard). Great days, when you had a good hope of understanding how a whole microcomputer worked. I made so many hardware mods to my UK101; it stood me in very good stead as an electronics design engineer turned programmer.
@@cheeseparis1 It's wonderful to know that the 6502 is still available. I see the modern instruction set has more opcodes than I had access to on the original chip. PHX and PHY - what a luxury not to have to use TXA PHA. And it's good to note that the indirect addressing page boundary bug was fixed.
I don't know how well it's known that the ARM instruction set was heavily inspired by the 6502. Computer Scientist Sophie Wilson of Acorn Computers designed the original ARM instruction set based on her experience of writing BBC Basic in 6502 assembly language.
Same here. Used to write assembler by hand and look up the op codes and poke them into memory on PET2000. Fun days !!
It's called assembly
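The indirect-addressing page boundary bug mentioned a few comments up is easy to show in code. Here is a minimal sketch (Python, with function and variable names of my own; the bug itself is well documented for the original NMOS 6502 and was fixed on the 65C02):

```python
def jmp_indirect(memory, ptr, fixed=False):
    """Compute the target of JMP (ptr).

    On the original NMOS 6502, if ptr ends in $FF the high byte of the
    target is fetched from the START of the same page instead of the
    next page. The 65C02 (fixed=True) crosses the boundary correctly.
    """
    lo = memory[ptr]
    if fixed or (ptr & 0xFF) != 0xFF:
        hi = memory[ptr + 1]
    else:
        hi = memory[ptr & 0xFF00]  # bug: high byte wraps within the page
    return (hi << 8) | lo

# With $34 at $10FF, $12 at $1100 and $AB at $1000, the NMOS part
# jumps to $AB34, while the 65C02 goes to the intended $1234.
```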
This is great!! - I wish I had this video back in '83 when I was trying to build projects w/ the Z80 - I really needed this information. Thank you! Really well done Ben!
Next up: “Hello, world” from scratch on a set of electrons.
I'd be down
Hello world from scratch without an integrated circuit, make the circuit yourself with discrete transistors
@@willemschipper7736 (quantum physics joke)
Rui Ni it’s atoms, electrons are the spheres that circle the center one
Up next: hello world from scratch on a cave
WOOOO!!! I absolutely love these videos! I followed your videos this summer and built the 8-bit computer, such a great learning opportunity! you have taught me more than I have learnt in computer engineering at school
surfer300ZX In university (college) all I did really was learn how to effectively learn and create my own interpretation and content based on others' work. YouTube taught me more as a learning resource about things I'm interested in, and even in my career, than I could ever learn on any course or degree 📜. Personally I think a combination of structured learning and focused / serendipity learning is the best match overall.
@surfer300ZX the sad thing is that employers look for that college diploma as a precondition for hiring (at least in my country, dunno about the US), which is why colleges are here to stay despite the rise of great free online resources for learning like Ben's awesome channel here. Wish it weren't so though...
@surfer300ZX Not really. The computer engineering educators these days dont think this is valuable knowledge. If it is or not is up to debate as most people use C# and Java, and dont know the difference between AC and DC. The knowledge in this video is taught in electronic engineering classes on master level.
@surfer300ZX At my university, this kind of class has recently become a pre-requisite for all the engineering degrees (albeit not as advanced since it's not well funded). Though it has given me a new hobby!
College and university is only for getting a degree. If you want to learn than learning it yourself from different sources like book, sites and this kind of video is the best way to go. The sad reality is you need a degree to get a stable job regardless of how knowledgeable you are. So basically college is just for increasing your debt with the promise of stable job.
YAY! I just got my kit, and following along from home! this is so cool! feel like im building a computer in the 80's like the homebrew computer club guys! thanks ben.
That’s how Apple started 😎
How far did you get?
@@AlanCanon2222 got all the way through making the clock. got stuck on a hard debug problem with wiring. i ended up making my own arduino based clock module (because i'm cheap :) )
In one of my classes in college (1989), we wire-wrapped up a 8085-based single-board computer. Really gave a good understanding of how simple computers like the Apple II and C64 worked. I've still got it squirreled away in the basement somewhere.
I’m actually really interested in how they actually did this in the 80s without arduino and all that stuff. Those people were absolute madman geniuses
They used machine code
and all that so teenagers can click pictures with dog filters.
@@crewrangergaming9582 worth
@@explorerguy agreed
@@crewrangergaming9582 kids bad! amirite
This is one of the best, if not the all-time best, videos on how to truly UNDERSTAND a CPU (starting from the absolute basics) I have ever seen. Kudos! It starts at just the right level of complexity and builds from there, also using modern tools to jump over some of the frustrating hoops we actually had to jump *through* back in the 1980s. :). I absolutely love this.
But hoops is where the fun is no? 32 kb of memory....WOW!!!
CompTIA's A+ 220 1101 explain processors well too
"what languages do you know?"
"breadboard"
Solder.
...and Crimp.
Just in case you want to flex on the guy who flexes with Assembly
lol
It's machine code
Hey dude, as a Computer Engineering student that's just starting to get into the meat of the major... Thank you. This is probably the most entertaining video I've seen on the subject and SOOOOOO in depth, detailed, and CLEAR. I really felt like I understood each step even though I'm only in the process of my first microcontroller class. Seriously, thank you.
I started 6502 assembler programming with my C64 at the age of ~12 years, but never had any knowledge on hardware. You bring back that wonderful time of curiosity, experimentation and magic. Thank you so much! ❤️
"Hey bro you gonna get the new Playstation or Xbox?"
"Nah I'll just build my own."
Video games player : "Xbox or PS?"
Ben Eater : "yess"
that's pretty much what bulding your own pc is.
PC master race
@@erikshure360 bread board is better
Can you overclock it? 👀👀
I love the way you speak about programming computers and micro electronics. I learn more and understand better with every upload
Everyone: "Look, I built my own computer!"
Ben: "No, you ASSEMBLED your own computer."
... and didn't even write a single line of assembly code to do it!
honestly some people who put together computers with prebuilt components fool themselves believing it’s a very impressive task. understanding of computer functionality goes far deeper than knowing what a gpu is used for. Not to say people can’t be proud of what they make, but they should acknowledge they aren’t an engineer. Once again only some people.
@@stevenwatson9678 True!
@@stevenwatson9678 yeah i built my first PC when i was 12 with a youtube tutorial without having done much research into how a PC actually works (as i was busy researching parts deemed "good" that fit within my budget). I was blown away by just how simple it is and the fact you don't require any troubleshooting ability to solve most issues, as most issues you run into are "oh shit I have to screw that in first otherwise it won't fit".
@@m33r61 How can a memory chip in a calculator know it is 1 not a 5 , 6 or some other number .
Tell me??💗🍭
We started this workshop/tutorial as a two-family project with kids and "grown ups", and I ran through the 7 parts of this tutorial early to check that it actually works out. Took me almost the whole weekend to finish up, but...
I am an ex TV technician and current automotive software engineer, and I am blown away by this absolutely stunning workshop. This is the best training material I have ever seen in the field of embedded system development. Greatly tuned to ambitious beginners, with a very fine sense of what needs to be explained, how often, and in what detail, up to managed leads into seeming pitfalls that trigger self-analysis and actually successful diagnosis capabilities, which is absolutely required when assembling embedded systems. Nothing too much, nothing missing. Ben, many thanks for this masterpiece of engineering education. I am looking forward to running through this with our kids and having lots of fun and great learning experiences. Could not have done it in this quality myself! Btw. the development kits are absolutely worth it, carefully packed with love for the details. Only customs is a p.i.t.a. :D
It's a great project to get into if you're interested in how computers work at a lower level. I bought the kit too and have posted an unboxing video: ua-cam.com/video/FQxT41ydEFM/v-deo.html
I remember doing something like this at university. We had to build a full system with a 68000 processor, boot power circuit, RAM/EPROM (for the program), and two serial ports. Then when that was working, we had to write an actual multitasking OS in assembly code from scratch, with interrupt-driven drivers for the serial ports, so we could pass messages between the serial ports. And all of it done in 3 months; this was even before Linux had kernel 0.8. Lots of fun... some of the student boards had so many weird problems, one of them only worked if you put your finger at an exact spot on the breadboard. THIS was never attempted by subsequent generations of students. I was in the pioneer batch of a new school at Singapore's Nanyang Technological University, and we had lecturers from around the world come and question us for 3 years to gain accreditation. Fun times. This was done in the second year of uni, 1990. That degree was designed to bridge the gap between electrical/electronic engineers and computer science, so we had to do a mix of both subjects.
Forgot to mention. It was in 1989 or 1990 when it was done
love it man!!! thanks for sharing. a cs student here!!
That sounds like such an amazing program. What a privilege for you to attend :)
Reminds me of my university I'm studying. All of that stuff sounds like learning a mystical language to perform magic even for me.
Wow! I only had to build an 8085 system from scratch, with an LED display for address and data, a 16-key keypad for inputting address and data, and a button to run the program that you input manually. Piece of cake! Oh, that was 1981.
"Hello Ben, thank you so much!"
I always had this dream to one day understand how a computer works...
I watched every single video you uploaded here and i have to say you made that dream come true!
Also the videos are very well made and you sound like a lovely person!
My first youtube comment ever btw...
Jens
You may buy an Arduino board and make it do something :-)
It's really not hard! Actually, it's easier than what was shown in this video.
@@igorthelight Hi! I have always been interested in understanding how computers worked, and I think I would be more interested in handling basic electronic components and coding in close-to-machine code like this guy is doing rather than using a compiler and a high-level coding language that hides everything from me. However, I have no idea where to start. I would be ready to buy things, of course, but as long as I get to get my hands in the workings of those things and get a LED to light up I would be excited and satisfied enough, for the moment. Where and how do you think I should start, assuming I know absolutely nothing of what's going on?
@@DrSavitruc Well I'm not the right man to ask.
This channel is great for explaining low level things. And also you could program for Arduino in Assembly:
forum.arduino.cc/index.php?topic=37130.0
I’m so glad developers are making these videos now! I’ve always wanted to program and design embedded systems and easy content like this is making it so much easier to learn.
You know it's hard when a simple hello world starts off with a 27min video and a multi part series... All things aside embedded C is the best thing I've ever learned, and my job as a web dev helps me appreciate this even more
Big thumb for this man & his channel. Sharing knowledge is what make earth better place for life.
Abu halal halali ?
@@Праведныймиротворец Arabic name
I thought I was the only Arab following him
@@Праведныймиротворец
So you can read arabic ......!!!
@@moulayediag3873 what's the meaning of "arabic" ?
You are a good instructor. I truly enjoyed your video . I'm a 76 year old tech from the days of 555. - and even I could understand what you were doing. Thank you
I don't understand what is going on, but I like it.
This video is pretty much the most simple explanation of how a microprocessor operates. Since you're struggling, here's a reasonably short clarification. On reset, the chip goes to a default memory address to start reading a program - in machine code. To do this, it sets all the 16-bit address bus pins to the memory address it wants, and then sets the I/O pin to "read". Data flows in via the 8-bit data bus. The value it reads is interpreted as an instruction, and it performs that operation. Here, Ben has it performing a NOP or "no operation" - i.e. do nothing. The chip then goes to get the next instruction, another NOP, and so on. The magic really starts to happen when a real program is loaded, but to do that Ben needs to wire up some memory (he mentions a ROM, or read only memory). For just a couple of seconds, the chart of chip set operation codes was on screen, listing all the possible actions the chip can perform. A little more detail is needed about the chip's internal architecture to understand exactly how it does stuff, but I'm hoping Ben covers that in part 2.
@@craftsmanwoodturner another explanation I don't understand ... But I like it
@@sah7920 just like when your maths teacher who's too nerdy explains things.
@@craftsmanwoodturner I dont know what any of these words mean but maybe one day i will and ill be able to make cool breadboard contraptions
I do understand, and I also like it.
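For anyone still puzzling over the reset-and-fetch description above, here is a toy simulation of what the video shows (Python; the class and method names are my own, but the $FFFC reset vector and the $EA NOP opcode are the real 6502 values):

```python
class Bus6502:
    """Toy model of the fetch behaviour described above.

    In the video, resistors tie the data bus so that EVERY read
    returns $EA (NOP); memory.get's default below models that.
    Note: this steps per instruction, not per clock cycle (a real
    NOP takes two clock cycles).
    """
    RESET_VECTOR = 0xFFFC  # 6502 reads its start address from $FFFC/$FFFD

    def __init__(self, memory):
        self.memory = memory  # dict: address -> byte
        self.pc = 0

    def read(self, addr):
        return self.memory.get(addr, 0xEA)  # resistor-wired bus reads $EA

    def reset(self):
        lo = self.read(self.RESET_VECTOR)
        hi = self.read(self.RESET_VECTOR + 1)
        self.pc = (hi << 8) | lo

    def step(self):
        opcode = self.read(self.pc)
        self.pc += 1
        if opcode == 0xEA:  # NOP: do nothing, just advance
            pass
        return opcode
```

With nothing on the bus but the resistors, the reset vector itself reads $EA $EA, so the program counter starts at $EAEA and the processor just NOPs its way upward through memory, which is exactly the behaviour on the breadboard.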
"sometimes is reading, sometimes is writing, but honestly I don't know what it's doing"
Me describing my brain during a test.
Tomorrow's video: running Windows 10 on 1,000 breadboards with only nand gates.
Hah, 1000. That's optimistic.
@@meithecatte8492 ben eater:
are you challenging me
...maybe you need to go for a walk in the park
Running a Windows 10 on an emulator that runs on his breadboard. That would fit the theme.
William Mentink ..
I used to program in 6502 assembly language in the '80s but I had the benefit of an operating system and an assembler. It's a nice chip to program with some clever twists to get round the fact that it only has an accumulator and two index registers. I'm looking forward to the rest of this series.
Someone: what language was this program written in ?
Ben Eater: R E S I S T O R B R E A D B O A R D
C/C++
@@Puzzled420....what? ah i see your name..
6502 ASM
I can't resist!
6502 Assembly
Setting up the address to read the instruction from, followed by the NOP instruction itself, using resistors has blown my mind completely! Also this is the first video that finally gave me an idea of what clock cycles are and how they relate to the processor. This is incredibly valuable information if you're trying to write an emulator.
Thank you so much Ben!
This was super instructive!
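The resistor trick praised above comes down to tying each of the eight data lines high or low so the byte always reads as $EA. A quick sketch of which lines go where (Python; the helper name is mine):

```python
def wire_pattern(opcode):
    """Return which data lines to tie high (to 5V) and low (to ground)
    through resistors so the bus always reads `opcode`."""
    high = [f"D{bit}" for bit in range(8) if (opcode >> bit) & 1]
    low = [f"D{bit}" for bit in range(8) if not (opcode >> bit) & 1]
    return high, low

high, low = wire_pattern(0xEA)  # NOP = 1110 1010 in binary
# high -> ['D1', 'D3', 'D5', 'D6', 'D7'], low -> ['D0', 'D2', 'D4']
```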
"As you can see from this number here it was manufactured in-"
Me: "389 AD!!!"
"The third week of 1989"
Me: :(
your user name LOL
the only one in the world that can put "hello, world!" on a programming resume and instantly be hired.
You are a very good teacher! I love the reverse engineering style of explaining. It’s like a fun puzzle instead of a dry lecture.
You are such a great explainer. It angers me how terrible my teachers were. I wish every kid gets to spend their childhood exploring things and having fun and not dealing with irresponsible adults who bully kids.
Hey, Ben. Can't say enough good things about your videos. Unlike other channels on here that just skim over stuff or show the final result of something for a "wow factor", this channel actually teaches people. I hope you know how beneficial these videos are to people like me. I've been doing electronics for years now, but your videos have allowed me to take on projects that I didn't previously have the skills to tackle. The value of your videos cannot be overstated!
I came across this and at first thought it was some kind of joke / April fools thingie / memory lane crap!!
Oh man... back to some memories - I had been doing assembly language programming for the 8080 (honest) in the late 70's. My employer had a project and wanted the 6502. I thought I would hate this CPU because it did crazy things like using low RAM as registers, with instructions that worked on it instead of on CPU registers, and I was writing real-time code where timing really mattered.
Sheesh, once I got the hang of it I thought everybody ought to work it that way. This was especially true if you were a C programmer because of the similarities in pointer uses. I would map out register space and go for it, no pushes/pops - just a fetch instead of hmmm.. dang it where was that crap I left on the stack... rats, I just drove the stack into the basement!!
Anyway, many thanks!! Good to see people still kickin' it!!
Yep, those times when you wonder where the processor is going, then realise you have one more push than the number of pops.
I never get tired of learning the microprocessor basic stuff, especially when Ben is doing it.
You were absolutely meant to be a teacher. Thanks for explaining everything so precisely. As a student in computer engineering, I could not have found a better channel.
"NOP" came instantly into my mind when reading "EA". Even not having done anything in 6502 assembler in more than 10 years... Loved the video!
I didn't remember that EA was the value for NOP. Maybe it's because I only mildly dabbled in 6502 ASM?
My silly brain not only remembers than EA is NOP on the 6502, but also 4E71 from the 68000
This is an exceptional series, Ben. Thank you so much for creating this. I've been reluctant to do anything regarding building my own computer (even though I know the 6502 reasonably well from a programming perspective), but your series has shown it's not anywhere near as scary as I thought.
It is SO cool using that Arduino to analyze each clock step. It’s a gift to have the tools be so cheap and accessible to analyze and have such precise control over the circuitry. It really makes the “magic” of a processor become tangible. Makes all my nerdy senses tingle.
Pat, to me this is a very astute observation: the cheap tool means this level of examination is easy. I built an Elektor 6502 SBC around 1985. It never worked; I didn't have any meaningful tools to fault-find... this is awesome
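What the Arduino is doing in the video is conceptually simple: on each clock edge, sample all 16 address pins and 8 data pins and combine the bits into numbers for the serial log. A sketch of the combining step (Python stand-in for the Arduino code; names are mine, bits assumed most-significant first as printed in the video's log):

```python
def decode_bus(address_bits, data_bits):
    """Combine sampled pin levels (MSB first) into an address
    and a data byte, as the monitor prints them."""
    addr = 0
    for bit in address_bits:
        addr = (addr << 1) | bit
    data = 0
    for bit in data_bits:
        data = (data << 1) | bit
    return addr, data

# Sampling the NOP loop right after reset:
decode_bus([1, 1, 1, 0, 1, 0, 1, 0, 1, 1, 1, 0, 1, 0, 1, 0],
           [1, 1, 1, 0, 1, 0, 1, 0])
# -> (0xEAEA, 0xEA)
```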
I wanna cry, this is the most beautiful thing i have seen, pure magic!!!
I have a BS in Computer Science and also took a course on Assembly during the BS and I still do not feel like I have a strong grasp on low level stuff. This series taught me many things about hardware and I am more confident, it's priceless, thank you very much!
when I first got a c64 in about 2012, I liked the simplicity of its design, and I wanted to make my own games. I read that the most efficient way was to code in assembly, so I read as much as I could about the 6502, and eventually wrote my own guide for myself about what each of the assembly instructions are, what all of the addressing modes are for each of the instructions, and even an op-code table for if I decided to write in machine code. TL;DR: if you write a manual yourself you'll be able to understand it better, and can use it as reference.
Hahaha “BS”
Well, are you a boy nerd or a girl nerd?
“If you wish to make “Hello, world” from scratch, you must first invent the universe.” - Carl "Ben Eater" Sagan
😂
Lol no, programming is basically trying to create our own virtual universe. If you can make your own universe you've already far outdone hello world.
@@ribertfranhanreagen9821 its a joke dude
@@ribertfranhanreagen9821 Missing the point, the quote means that "from scratch" can regress infinitely, he started with a 6502 in this video but that's not _really_ from scratch, because it already has super complex circuits inside it. And even if he wired the 6502 manually, he didn't manufacture the wires and switches and LEDs himself, he didn't mine the raw silicon/copper/etc from the earth to make those components either, nor did he create those raw materials (they were here from when the earth formed), he didn't create the earth, nor the solar system, nor the galaxy, nor the universe. So did he do this from scratch? Fuck no! Lying ass title. Dude didn't even invent the galaxy, let alone the universe... though, we can still agree, that in some sense the video is indeed "from scratch." It's just a matter of... to what degree
invent reality
If you wish to make an apple pie from scratch, you must first invent the universe
- Carl Sagan
*Raspberry Pi
This sound more like a quote from Douglas Adams
Luke Gabriel Balgan
Yes, but how on earth did you possibly know that?
@Jay Vigilant quoting Carl Sagan unwittingly proving God's existence isn't pretentious, it's funny
@@shane8037 Quoting Carl Sagan does NOT prove the existence of your invisible sky daddy, you uneducated potato.
Dude that was great! Your explanation and delivery were crystal clear, well paced AND visually appealing. You showed different parts, zoomed in etc. just great 😁
This makes understanding how a chip works and understanding what steps to take for people to figure out how a chip works so much easier. Just follow the datasheet, explore, and eventually you'll get there!
Except that newer arm chips usually have thousands of pages of nearly useless garbage docs thrown together randomly. No wonder people are drawn to the simpler designs and better documentation of the past.
@@big_b_radical3985 I'm a kid, I know nothing. I'm just saying.
Getting deepah.
No but seriously, a couple of years or so ago when I started studying CS with minor subjects from EE, I found this channel. It really helped me bridge the gap between the very high level (C) and the very low level (circuits & digital logic). Thank you.
I stumbled across one of his CPU videos, And I'm about to watch this man's entire channel.
The type of content I like to see
You're so lucky, I remember when I first found his channel... So many incredible videos waiting for me anytime I wanted! But now I've watched them all and have to wait for new ones😋
That was incredible, a week full of video about how to make an 8-bits computer. But everything must come to an end :')
Anyway, have fun ^^
@@heartlessalice5801 assuming you didn't follow along in the first place, starting the project yourself is a good excuse to re-watch them and follow along.
@@markkeilys I would love to, but sadly that will be out of budget for quite some time ^^
But I will definitely do it one day, thanks for the idea :)
These videos make computer geek me so happy. How can anyone not find this stuff fascinating, it's just so magical
I literally found an old processor in an old phone box a few weeks ago. This video actually gave me a starting point for a potential project.
keep us updated
Ben, your tutorials are great! Thank you so much!
I am extraordinarily thankful for your videos. I am a CS major who couldn't stand coding 8+ hours a day, year after year, so I ventured into other fields that have brought no real reward to my life. Thinking I had grown too far past my prime to learn, I had started to give up. Watching your videos, your way of explaining things, the way you break things down, and showed something as inane as a clock by clock implementation... made things I understood in theory, to things that I understand in application.
I will be ordering your kits once the dust settles from being laid off. Bet that wouldn't have happened if I'd stayed in my field... sigh...
CS - Counter Strike?🫢
@@i_am_from_Siberialmao it's computer science
OMG! Kudos! I learned 6502 on a Vic-20 with DATA statements in BASIC, back in the day, and I'm still figuring out how hardware actually works. I'm amazed at how you've made a variable-speed clock source with "freeze-frame" and stepping support on a single bread-board! (this might be "normal" for hardware engineers, but for me this is just wow!)
me, a biologist: "what the hell am I doing here?"
also me: *clicks part 2
You're learning how you should emulate the artificial intelligence that you're going to be inserting into your synthetic organism from scratch.
In some bizarre way, a computer is also a living thing.
😂😂😂👌
same here xD being a bio technologist i have no idea whats going on but it still fascinates me :v
Dude combining biology and computer science would be an insane combo.
Wow. What a great explanation. I remember the days of hooking up a 6502 chip to a CRO whilst studying Elec Eng at university back in the mid-'90s - we definitely weren't allowed to hook the output up to a computer back in those days.
Him: the guide is only 30 pages long
Me: I'll read it if it has enough pictures
lol saaaaaaaame
Haha
datasheets contain images and pictures, so maybe
@@niggisgaming1637 I know, I was just making a joke about how dumb and childish I am
@@jamesb9120 ok np man
I don't understand a word of this video but I love when technical stuff goes deep into the "gritty" side of things we take for granted.
Thanks for this nostalgic piece. I programmed these beasts back in the 1980s at school. 6502 code was my first big project: a compiler for the FORT programming language.
Ben: Use an arduino to understand what the chip is doing
The 1970s guy developing the chip: why didn't I think of that!
Underrated comment. 😋
The guy to build the first computer must have had a hard time debugging it.
Poor guy would probably just have to do the calculations by hand to check they were right
@@rhysbaker2595 Something that many (if not most) CS programs today will have you do in foundation courses. It's surprisingly not that difficult - doable by students on paper during exams
@@eclmist I was one of those guys in the early 1980s. Task #1 given the raw hardware. Get the LED attached to an output pin to light up. No help from an Arduino. The only tool was an analogue voltmeter which told you 'low', 'high' or 'changing'. It took me a week to get the LED to light up. Having done that, progress quickened. Every function I wrote did something different with the LED so I could track if it had worked or not. I kept going until I could bitbang asynch serial data to an old VT100 terminal. Very satisfying in the end to have it say "Hello".
@@andrewlydon7819 I'm a software engineer.. and I would love to know how the heck you do that :).
Hi Ben, I've actually seen this done on a Z80, on a massive single breadboard, when I was at Marconi, back in the day.
I had recently finished a three month microprocessors course, so I was even more in awe of it at the time.
Your explanations are great. Keep up the good work.
The odd comments you get are only to be expected, as electronics is chock-full of abstract subjects that most people are neither aware of nor understand.
I tip my cap to you Sir.
The MOS 6502 was my first microprocessor back in 1983. Then the 8085, which needed two additional chips to make a practical computer. All the code had to be written in assembly, then burned into ROM
This takes the phrase "building a pc" to a whole new level.
Everyone assembles their PC, unlike Ben, who built his.
Him:"I don't really know what's going on here."
Me: Way ahead of you on that one at least.
Ben Eater and the 6502, a match made in nerd heaven.
I wonder if he ever programmed the 6809; it's better than the 6502.
Now this is way more fun than the beginning of every programming tutorial
Well, I first watched this video a couple of years ago and bought the kit. I built the clock about 8 months ago, and today I stepped through this video. I have learned so much in the process. Thank you, Ben.
Till now I had been using only Arduino for my projects, but I had never thought of poking into the underlying workings of a microprocessor like you did. That was really amazing!!!
Just wanted to say how impressive all your videos are. In an age of overused memes and shared idiocy, it is always nice to see the power of the internet, as in shared information, being used this fully. Thank you very much!
Me, who doesn't know a thing about this:
02:34 Let's skip the video a little
08:31 Oh... this escalated quickly
Neonic α lmao
XD bro
XD same
YA
This video was created with a clever and elegant approach. It stresses understanding schematics and data sheets, and the importance of hardware: a forgotten factor for the new generation of engineers. Great job!!!!
oh my someone is finally doing it, I've been waiting for this moment
As you are apparently interested in writing decent code, I'd recommend forgetting int, char and unsigned and using (u)int8_t (and the 16/32 variants) and size_t for variables. They are all available in Arduino. While you correctly noted that your address won't be negative, your n's in the for loops won't be either. I know this sounds nitpicky, but one can get used to "for(size_t ...)", and "uint16_t address" isn't just more correct for the use case and a match for the bus width, it's also shorter to type!
I'm not judging some short Arduino scribble, but recommending everyone adopt this habit. You get used to it quickly and profit from fewer issues in the long run. And once you really get used to it, you will use them in short Arduino scribbles as well.
Anyway, great video as always, looking forward to the series! Never commented here I think, but watched the whole breadboard computer series plus X.
I want to add to this that it often makes sense to use a more advanced IDE and CMake. With the help of PlatformIO and a bit of fiddling, you can have a great development setup pretty quickly. Things like the missing comma would have been highlighted on screen instantly. I personally prefer using CLion with PlatformIO since I get fast code analysis, but other setups are for sure imaginable too
relatedly - pretty sure you could use range-based for instead of raw loops in modern Arduino, even with an int array
WolleTD I too am a fan of unambiguous data types. All praise stdint.h!
This was an incredible video, very clearly explained and very insightful! Thank you Ben.
This is possibly - nay, probably - the best thing I have ever seen on UA-cam.
Being familiar with 6502 *and* Arduino, this was a blast to watch, I was narrating a sentence or two ahead of you for many parts of the show. (he forgot a comma, when is he going to put that comma in?! hey that's a NOP, etc) Thanks for doing this, loved it!
Fantastic 👍😁 BTW: Outstanding post audio editing. It really is amazing how seamless it all is. As if it was all done in one perfect long take. I think I even hear Hollywood calling 😉
That's the perfect video series to learn more about the 6502 before The 8-Bit Guy releases the Commander X16, a 6502-based computer. So thank you ^^
You look like someone who would be very successful on UA-cam. You should make content! I would watch!
?
@@heartlessalice5801 I think you should make some videos, and I believe that you would be successful on UA-cam.
@@kris_0520 I do know how to read, the first time was enough ^^
I was just wondering why a stranger that I don't know was saying those nice things to me.
But thanks, I guess…
@@heartlessalice5801 You are welcome! :)
As someone who's getting into programming, this is so delightful to see. I learned a lot and was entertained through the whole video.
It's a great project to get into if you're interested in how computers work at a lower level. I bought the kit and have posted an unboxing video: ua-cam.com/video/FQxT41ydEFM/v-deo.html
Eater's Law: The number of breadboards in a video increases as the square of time in seconds.
is that actually true or??
@@maskedredstonerproz sadly, no. Would've been brilliant though.
Wow, I am just starting with all of this and feel that I've made great progress thanks to this. Great video! Thanks a bunch!!
Me: enthusiastically learning C# in my thirties.
YT algorithm: here is something to shake your beliefs and self-esteem.
And I was searching for how to program in ASM to be a better programmer, and then this guy destroyed me right here
hey, it's okay! we're all learning on our own paths, i'm a bit intimidated by c# so i'll stick with python for now ^^'
i hope you keep learning and, most importantly, you keep enjoying it~
Lol me too I'm learning java and it sent me here
Keep it up, and remember to try other languages like Java first in case you haven’t in order to get used to programming
@@impmadness Java really kicked my ass in highschool. The OOP is a bit much. I always recommend python to new programmers.
My wife bought me the kit for my birthday, looking forward to losing the next few weekends. Thanks for all the great content!
This is what I've been wondering since i met my first computer.
And my friends can't even make a google account.
How to create Google account:
Mine ore.
Smelt it.
Repeat until you have found all the elements.
Make a circuit board.
Make transistors.
Assemble components.
Once you have a logic board ready.
Make other input and output hardware.
*Now make friends using mud and your rib... b'coz if they don't exist, who will create the account?
Program an OS.
*Build a web network across the globe and name it "internet".
Build a receiver and connect it with internet.
*Create a company named Google that specializes in internet and deals with email and stuff
Connect to Google.
And now build your Google account.
@@Perseagatuna oh... yeah I'll add that too
thanks for reminding
Thanks for pointing out Nivas I'm so dumb 😂😂😂😂
you need new friends.
@@sabin97 ok i'll add that too
@@Perseagatuna but in which step???? ¿¿¿¿
*6502:* What is my purpose?
*Ben Eater:* DO NOTHING
*Ben Eater:* REPEAT
What are you doing, 6502? You should not be waking! Run deep! Eat code! Drink electricity! Go do nothing! Ben Eater is talking!
(Kudos if you know where I paraphrased that from) :)
*6502*: Where should I start?
*Ben Eater*: At address
6502: Oh...
Ben Eater: YEAH, WELCOME TO THE CLUB, PAL
he is trying to show what happens when the instruction is NOP and how the processor reacts to it. It's like, AAH, now you understand what it might do when doing LDA and STA
Do nothing. Not possible
I like how each time you change your code, you run it and show what it’s doing each step of the way. It’s a lot more helpful than just showing the finished code and what it does. I mean, I still don’t understand it, but I don’t understand it a bit less than usual lol
Oh boy, another deep dive. My body is ready.
omg This takes me back - used to love programming the 6502 in hex on my PET; sadly I knew all the op codes off by heart :(
I dont know what i just watched but it was awesome
He explained how your computer works at machine level. I want to know how this works at physics level
I remember my father (recently passed away) built a NASCOM-1 from scratch. This was based on the Z80, which is of a similar age. Oh, and it only had 1K of RAM. If you wanted to do anything, you had to write your code in machine language; you didn't even have an assembler. Loved the video!
You are sending $EA to the data bus and my brain goes.. $EA.. That's NOP. I haven't coded 6510 (on the 64) since the 80's ;)
EA sport, it sucks
hahaha, immediately knew it was 234 decimal, for the same reason... years of etching it in, back then...
what the fuck is that, I don’t want to know actually
You need to pay $15 to access it
@@mervinbalamurugan7454 it just means no operation.
My computer's processor interprets EA as it should ask me for money
Lol
When your microprocessor is actually just microtransactions
no
And mine shouts "TSSSSSSINEEEGAME"
haha funny ea bad amirite hahahah
haha.. a logic analyzer and a precooked Arduino environment would have been handy back in the 80s..
lol. No doubt.
I've been trying to work with period correct tools for 8 bit micros, and it's so annoying I quickly end up reverting to cross-compilation and more modern tools. XD
Even just the simple expedient of writing a BASIC program on my PC in a text editor, then saving it to a folder and using a disk drive emulation device to load it onto the 8 bit system makes life easier...
Old keyboards could be really nasty.
And the editor limitations... Bleugh. XD
@@KuraIthys that sounds pretty incredible too though o.O ngl
I was 6 years in the Air Force, '78-'84, and we had a logic analyzer called Dolch. Very useful when troubleshooting TTL computers connected to military radars and IFF equipment.
In 1983 some friends and I designed and built something like this but using a 6809 CPU and wire-wrapping rather than a breadboard. I think we bought a pre-programmed monitor ROM, plus we had 64 KB RAM and a UART chip which we connected to a borrowed VT100 terminal. We turned it on and ... no monitor ROM prompt on the VT100. Darn. We didn't have a logic analyzer, or a handy Arduino Mega. We did have a dual trace oscilloscope. We attached the clock to one input and went through the address and data lines one by one, writing down the 1s and 0s, after which it was just like the debugging shown here. It turned out we'd made two errors: 1) wired the reset button as NC instead of NO (so it ran only when the button was pressed), and 2) we fed the UART clock from the 2 MHz CPU clock with a divide by 16 then divide by 13 which gives 9615 BPS, which is close enough to 9600, but miscalculated the reset network for the 2nd 4 bit counter and got divide by 12 instead, giving 10417 BPS which did not work. Everything else was correct first time.
I have always wondered how we went from microprocessors to computers, and your channel has explained it to me. Been loving your videos