The Complete History of the Home Microprocessor

  • Published 21 Nov 2024

COMMENTS • 1.6K

  • @TechKnowledgeVideo
    @TechKnowledgeVideo  4 years ago +144

    Hi all! Thanks for watching the video :) If you're feeling generous and would like to support my work, you can do so via Patreon (link in description) or using the 'Thanks' button underneath the video :) and if you're interested, check out the trailer for the next retro computing documentary on my channel!
    This project took 6 months to complete and was huge fun to make! If you enjoyed the video(s) then don't forget to subscribe, like, and share the video on social media! It really does make a difference when trying to grow a small channel.
    Thanks again everyone :)
    -Archie

    • @jell_pl
      @jell_pl 2 years ago +4

      If you plan to get back to this topic and make an errata, you should correct the info about the first computer. ENIAC, despite American propaganda, was not even third (there were two versions of e.g. en.wikipedia.org/wiki/Colossus_computer, and several machines from Konrad Zuse, e.g. en.wikipedia.org/wiki/Z3_(computer)).

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  2 years ago +11

      This is technically true, but it ultimately comes down to your definition of “computer”: neither Colossus nor the Z3 was Turing complete, so I ruled these out.

    • @masternobody1896
      @masternobody1896 2 years ago +4

      @@TechKnowledgeVideo can you also make a history of GPUs?

    • @FlockOfHawks
      @FlockOfHawks 2 years ago +2

      you're welcome & I like what I've seen so far 👍

    • @acmefixer1
      @acmefixer1 2 years ago +1

      It's a great video, very comprehensive.
      But the first thing I noticed was its exceedingly long runtime, which is why I almost didn't watch it. It should have been divided into at least 3 parts, each no more than 29 minutes long. Thanks!

  • @setdetnet5001
    @setdetnet5001 11 months ago +76

    I'm an ASIC designer; I worked on the Motorola MC68000 design. What your video fails to mention, and is worthy of mention, is the ever-constant fight between hardware and software. In the '60s, '70s, and '80s, software developers needed to develop code within CPU (and memory) constraints. Then we saw software drive hardware; that is to say, if you wanted to play the latest games you needed to spend megabucks on the latest PC hardware. Then there was a switch back around 2000 to chips being far superior and software not truly making full use of multicore threading. And now, we see CPU evolution limited by foundries. It's now that we will see software start to drive innovation in CPUs.

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  11 months ago +12

      That’s a very interesting point! Thanks for sharing :)

    • @therealcfiddy592
      @therealcfiddy592 10 months ago

      Okay

    • @charliefoxtrot5001
      @charliefoxtrot5001 10 months ago +2

      Software is still driving the hardware design. Just look at the development of GPGPUs over the past 20 years or the specialized processors in mobile devices. With the end of Dennard scaling and Moore's law, what we do with the limited transistors we have on a chip will become more and more important.

    • @therealcfiddy592
      @therealcfiddy592 10 months ago

      @@charliefoxtrot5001 thanks mike

    • @Raderade1-pt3om
      @Raderade1-pt3om 9 months ago

      Software still drives hardware price and performance for budget segments

  • @thehookupiowa
    @thehookupiowa 2 years ago +119

    This brought back some vivid memories. I was 7 years old in 1979 when our elementary school library got its first PCs: a pair of Apple IIs with green monochrome displays. I joined an after-school class teaching BASIC 2.0 programming, the built-in programming environment that was part of the Apple II ROM. I recall settling on a Death Star related project for my program, as any sane 7 year old would have. I asked the teacher "How do you make a circle?" and his eyes lit up. He was being asked to explain Pi to a 7 year old and he was delighted.

    • @strictnonconformist7369
      @strictnonconformist7369 2 years ago +6

      The Apple 2 series never had a BASIC version 2, there was Integer BASIC (written by Steve Wozniak, and it had no floating point support) and AppleSoft BASIC, written by Microsoft, which did have floating point support built-in.
      I’d get a big smirk on my face to have a 7 year-old asking such questions because that’s a huge recursive rabbit hole taking a kid that age far deeper than most kids several years older ever go in their lives.

    • @tarstarkusz
      @tarstarkusz 1 year ago +5

      I tried rotating a multi-segmented circle (AKA Star Castle arcade game) in Commodore basic 2.0 when I was maybe 12 in 1982. Can you say slow!

    • @RetroDawn
      @RetroDawn 1 year ago +6

      @@strictnonconformist7369 I knew that as well, but assumed that they just meant AppleSoft BASIC, since it was the 2nd BASIC version for Apple II computers; and they said 1979, which was the year that AppleSoft BASIC was released, along with the Apple II Plus, which had AppleSoft built in.
      They possibly got the "2.0" in their memory from the widely popular Commodore 64, which called its BASIC "V2" at the top of the screen when you turned it on with no cartridge inserted.

    • @strictnonconformist7369
      @strictnonconformist7369 1 year ago +1

      @@RetroDawn an interesting possible memory failure explanation, I can agree. I didn’t have enough access to Commodore 64s to have that burned into my brain.

    • @YourCapyFrenBigly_3DPipes1999
      @YourCapyFrenBigly_3DPipes1999 11 months ago

      I got to learn computers too, on the Apple IIe with a green-only display, 1985-87. Fun memories! We'd never seen a home computer before; we were endlessly fascinated. One unit even had a color display and we would always fight over who got to use it. Many ancient games of Oregon Trail were played on those machines and others like it. Later, my 4th and 5th grade class got its OWN computer, which felt extremely luxurious, and we discovered the wonders of Carmen Sandiego. 80s/90s, great times for kids, lulz.
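
The circle question earlier in this thread has a neat answer: a circle is traced by stepping an angle from 0 to 2π and plotting points with sine and cosine, which is exactly where Pi enters the picture. A minimal sketch of that idea (hypothetical modern Python, not the original Apple II BASIC):

```python
import math

def circle_points(cx, cy, r, steps=8):
    """Return `steps` points around a circle of radius r centred at (cx, cy)."""
    pts = []
    for k in range(steps):
        t = 2 * math.pi * k / steps            # angle for this step
        pts.append((round(cx + r * math.cos(t)),
                    round(cy + r * math.sin(t))))
    return pts

# Four steps give the four compass points of the circle.
print(circle_points(0, 0, 10, steps=4))  # [(10, 0), (0, 10), (-10, 0), (0, -10)]
```

On the Apple II the same loop would PLOT each point in low-res graphics mode.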

  • @denniswofford
    @denniswofford 2 years ago +255

    This is a great long form documentary on the history of CPU development. Very interesting and fun to watch, especially for a guy who is old enough to have seen it all (and work with most of it) as it played out. Thanks Archie! Well done!

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  2 years ago +15

      Thank you for your kind words Dennis, they mean a lot :)

    • @dovahkiin159
      @dovahkiin159 1 year ago +13

      Same here. When I was doing my doctorate in engineering at the U. of Waterloo, I built my own Apple II+ clone (soldered the whole thing) and used it to do all my mathematical modelling of reactor fluid dynamics and heat transfer using MBASIC (LOL) and to write my thesis. The PC had a CP/M card and a 16k expansion card (Oooooo). The mathematics were so difficult to solve numerically that I had to develop a new mathematical method, and it took weeks to get a single solution. Now with my current PC, an X670E MB and Ryzen 9 7950X CPU overclocked to 6.8 GHz, it takes a few hours.

    • @jp34604
      @jp34604 1 year ago +2

      @@dovahkiin159
      .
      How do you cool 6.8 gigs of clock speed, chilled water?

    • @dovahkiin159
      @dovahkiin159 1 year ago

      @@jp34604 At the moment I am using the Dark Rock Pro 4. This cooler barely keeps the CPU at or below its maximum continuous operating temperature of 95 C at that clock speed, which I should clarify was achieved on a single core only. I use the AMD Ryzen Master software to optimize the overclocking. I plan on switching to a water cooler. I did not get one originally because the water cooler required for this CPU would not quite fit into my case (Corsair Obsidian 800D FT). I can, however, make some minor mods to the audio/USB and optical drive bays to make one fit.

    • @garymartin9777
      @garymartin9777 1 year ago +4

      Yeah, me too. My first computer was a single-card machine with the TMS 9900, a neat chip for homebrew. I had to solder all the chips and sockets myself. It launched my career in microprocessors and hardware development.

  • @raggersragnarsson6255
    @raggersragnarsson6255 1 year ago +10

    This is a truly great exploration and documentary of the history of computing. As a child of the 70s I was already aware of a lot of new technologies that emerged around that time: home gaming with PONG, VHS machines, and dedicated handheld single-game machines. I was aware of the huge cost of the PC around 1980. I played games in arcades in my preteen years until I received a Spectrum 48K, and everything changed; to this day I'm a tech head. I'm watching this now and I have learned even more from it, thank you.

  • @arn3696
    @arn3696 2 years ago +7

    I can't believe I just watched a feature length video about microchips...but you know what - I enjoyed every second of it!

  • @stevetodd7383
    @stevetodd7383 2 years ago +109

    Itanium wasn’t a RISC design, it was what’s known as a VLIW (very large instruction word) processor. It also relied on the compiler to optimise for concurrent operations (so, for example, while an arithmetic operation was in progress the compiler was expected to work out other steps that the CPU could handle that didn’t depend on the result), but compiler technology wasn’t really up to the task.

    • @absalomdraconis
      @absalomdraconis 2 years ago +7

      Which is sorta sad (not that I shed any tears for Itanium), because an iterative application of a logic language (like Prolog) probably would have been able to very cleanly encode the instruction sequencing.

    • @andrejszasz2816
      @andrejszasz2816 1 year ago +7

      That said, the Transmeta Crusoe was also a VLIW processor aimed at the mobile market, with a compiler that translated x86 machine code on the fly. BTW, I remember the press release classifying VLIW as a RISC-like architecture.

    • @glenwaldrop8166
      @glenwaldrop8166 1 year ago +7

      @@andrejszasz2816 You are correct. I remember several articles about Itanium describing it as "Intel just doesn't want to call it RISC".

    • @RetroDawn
      @RetroDawn 1 year ago +7

      @@glenwaldrop8166 I can believe it. The computer press has a lot of folks who don't know the lower-level details of the technology. VLIW is definitely distinct from RISC, even if it builds off of RISC.

    • @MultiPetercool
      @MultiPetercool 1 year ago +1

      One of the early design goals for Itanium was for it to run HP's PA-RISC binaries, hence the very long instruction word. The reason Itanium failed was that they broke Intel compatibility. Having seen Digital's Alpha technology stumble, software vendors like Oracle, PeopleSoft, JD Edwards, SAP and countless others were not willing to invest in a new platform.
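
The compile-time scheduling this thread describes can be sketched as a toy example. This is a hypothetical, heavily simplified model (real VLIW compilers also model instruction latencies, registers, and speculation): the "compiler" packs independent operations into wide bundles, placing each operation in the earliest bundle after all of its inputs have been produced, since the hardware does no runtime reordering.

```python
def schedule_vliw(ops, width=3):
    """Greedy VLIW-style static scheduler.

    ops: list of (dest, srcs) pairs in program order.
    Returns a list of bundles (each a list of dest names); a bundle's
    operations are assumed to issue together in one wide instruction.
    """
    bundles = []
    slot_of = {}  # dest -> index of the bundle that produces it
    for dest, srcs in ops:
        # earliest bundle strictly after every producer of our inputs
        earliest = max((slot_of[s] + 1 for s in srcs if s in slot_of), default=0)
        i = earliest
        while i < len(bundles) and len(bundles[i]) >= width:
            i += 1  # bundle full, try the next slot
        if i == len(bundles):
            bundles.append([])
        bundles[i].append(dest)
        slot_of[dest] = i
    return bundles

# "c" depends on "a", so it must wait for the next bundle;
# independent ops a, b, d share one wide instruction.
ops = [("a", ["x"]), ("b", ["y"]), ("c", ["a"]), ("d", ["z"])]
print(schedule_vliw(ops))  # [['a', 'b', 'd'], ['c']]
```

When the compiler cannot find enough independent work (the situation the top comment describes), bundles go out mostly empty and the hardware's width is wasted.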

  • @michaelhawthorne8696
    @michaelhawthorne8696 1 year ago +18

    I was 16 in 1980 and lived through the development of the home micro and then the PC.
    I can relate to this video with great fondness, having had the ZX81, BBC Electron and its add-ons, BBC Micro, and Atari STe, then buying my first PC in '97 (IBM Aptiva) and then building my own.... It's been fun watching this video bringing back great memories.
    Thanks for your hard work Archie...👌

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  1 year ago +2

      Thank you so much! Glad it brought back many happy memories :)

    • @franciscorompana2985
      @franciscorompana2985 11 months ago

      I had the first Pentium in Portugal. 1994 😂
      I got a Matrox 4MB from Canada. 😂
      The chip had the famous bug from the factory, so I exchanged it in the US for a clean Pentium. 😂

  • @PaulSpades
    @PaulSpades 2 years ago +17

    This is spectacularly comprehensive and relevant. I'm blown away.
    I only have one small quibble: during the 80s and 90s the presentation focuses on low-cost solutions, while the 2000s focus on high-end x86. This leaves out MIPS- and ARM-powered tablet computers, and SBCs like the Raspberry Pi. And they are relevant, especially ARM-powered SBCs.
    A new cycle was attempted to drive down cost with the netbook and tablet craze, but the software wasn't there yet; there just wasn't enough incentive to push Android as a new universal OS for home computers, and it wasn't suited to replace Wintel. The Raspberry Pi, and the Linux distros ported to it, is the new platform.

  • @Bduh2
    @Bduh2 1 year ago +6

    Fantastic video! As I was watching it, memories came back from all the computers I've had during my lifetime: from Sinclair, the Commodore, and the first IBM with DOS, to the servers and PCs I'm still building to this day for customers.

  • @kob8634
    @kob8634 8 months ago +1

    Thank you for this. I'm 63. For half of my adult life I kept this documentary in my head, but my brain clicked off when we stopped calling them Pentium. At least now I know how to intelligently shop for a computer again. Your level of research is impressive. Very well done.

  • @electronash
    @electronash 2 years ago +32

    This is great. Well done.
    It must have taken a very long time to narrate and animate.
    This is one of the best summaries of the microprocessor boom of the 70s, 80s, 90s, and beyond.

  • @NipkowDisk
    @NipkowDisk 11 months ago +12

    I don't normally watch videos longer than about 30 minutes, but this was worth every second of it. Most of it was a great trip down Memory Lane for me; I was born in 1960. Outstanding job!

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  11 months ago +2

      Glad you enjoyed it!

    • @marktwain5232
      @marktwain5232 10 months ago

      @@TechKnowledgeVideo What a beautiful trip down memory lane for me! I started in 1979 with CP/M on the Z80 and thought I was the last person on Earth to find out about the microcomputer revolution. I then worked 41 years as a professional developer and Software Engineer in GW Basic, Clipper, and finally Java, Visual Basic, and various Javascript libraries. I retired in early 2021 working in Cloud protocols. I loved every minute of it! Every day was something new that no one had ever done before. I am so grateful that I got to work on the long march! Thank you so much for this beautiful presentation!

  • @EliteGeeks
    @EliteGeeks 11 months ago +3

    oh man, this brought back memories... well done...

  • @nevrunderstandlada
    @nevrunderstandlada 1 year ago +3

    WOW, really good documentary. It's clear, simple to understand, and complete. Thanks!

  • @pipschannel1222
    @pipschannel1222 2 years ago +20

    Great content! Love it!
    Did you know IBM wasn't the company that introduced Intel's 386 architecture with their PS/2 Systems? It was Compaq that beat Big Blue by 7 months with their very expensive high-end Deskpro 386, released in september 1986 vs the IBM PS/2 Model 80 which used the same 80386DX-16, released in April 1987. I think Compaq deserves to be mentioned in documentaries like these as it shaped computing history or at least had a vast influence on its development in the sense that the company played a key role in creating open standards which hugely benefitted/influenced the PC (clone) industry, being the quintessential PC clone manufacturer..

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  2 years ago +4

      Thanks! Interesting stuff :)

    • @awuma
      @awuma 1 year ago +1

      So glad that somebody today recognises just how significant the Compaq Deskpro 386 was; in my opinion just below the original 8088-based PC itself. It, and not the IBM PS/2, established the industry standard architecture, using the PC-AT's open 16-bit bus for peripherals.
      The greatest missed opportunity was Sun Microsystems not basing their 386i workstation on the ISA; had they made their SunOS (later Solaris) run on the ISA, they would have blown both Microsoft and IBM out of the PC business, and from 1988 we would all be using full Unix and not the MS-DOS and Windows operating systems, which did not make anywhere near full use of the 386 architecture's power until the 2000s, when Windows became NT. Linux appeared in 1993, and by 1995 had the full power of SunOS/Solaris, but on the standard x86 architecture. Sun gave up on the 386i in 1989. (I replaced my Sun 386i with a Pentium-based PC running Slackware Linux in 1995.)

  • @alpaykasal2902
    @alpaykasal2902 2 years ago +4

    My heart went all a-flutter at 28:00 when the Amiga was shown. Great video, fast pace but so thorough!

  • @pssthpok
    @pssthpok 2 years ago +19

    Nice history! My first computer in high school was a single Commodore PET for the entire school; when I went to university I saved my pennies to buy my very own Sinclair ZX81. What a beast.
    I recall the Pentium 4 years, and Intel's strange relationship with RAMBUS memory, with all the technical and legal issues that came with it.

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  2 years ago +4

      Thank you! Very interesting to hear about your computing journey :)

    • @picklerix6162
      @picklerix6162 1 year ago +4

      We used to call it Rambust memory because it was a disaster for PC companies that decided to use Intel chipsets. Not even Intel’s own server division would use Rambus.

    • @pssthpok
      @pssthpok 1 year ago +1

      @@picklerix6162 I used to call it RamButt due to all the difficulties.

  • @trevorjones3755
    @trevorjones3755 3 months ago

    I'm a mechanical engineering student who has a few friends who know WAY too much about computers. I've been slowly trying to learn but this has been tremendously useful in that goal! You explain things very well to someone who barely knows anything about computers. Well done! And thank you

  • @ZnakeTech
    @ZnakeTech 2 years ago +270

    How the hell this only has 1,434 views and this channel only has 711 subscribers at the time of writing is beyond me. Very reminiscent of RetroAhoy, and I mean that in the best possible way. Keep doing content like this, and look into optimizing for the YouTube algorithm; the most obvious thing you might be missing is a decent video description. You are not giving YouTube anything to work with there; stick a synopsis of the video in there to hit a lot more of the juicy keywords. This video should be sitting at at least 100,000 views or even way more by now, in my opinion.

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  2 years ago +24

      Thank you!

    • @scottlarson1548
      @scottlarson1548 2 years ago +19

      This felt like a 90-minute video with 60 minutes of content. It goes slowly, with lots of needless five-second pauses that I guess were supposed to make the content seem more dramatic.

    • @FlockOfHawks
      @FlockOfHawks 2 years ago +3

      Commercial interruptions every ~6 minutes. I may continue viewing beyond the 30-minute mark in a better mood, cos the content is OK.

    • @ReneKnuvers74rk
      @ReneKnuvers74rk 2 years ago +5

      Should be way shorter, in 7-9 minute chunks. And no long waits with a black screen. A catchier title would also help; 'Home Microprocessor' doesn't really describe the content, in my opinion.

    • @nickryan3417
      @nickryan3417 2 years ago +7

      Agreed. Way too long in one chunk. Also some of the early content was wrong; for example, Random Access Memory is not "storing data in the same place as code", it's being able to access any element of data without having to first read all the earlier data. Get elementary things like this wrong, combine it with a far-too-long video, and the numbers will drop.
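
The distinction this comment draws can be shown in a few lines (an illustrative sketch, not from the video): random access fetches element i directly, while tape-like sequential access must consume every earlier item first.

```python
data = [10, 20, 30, 40, 50]

# Random access: jump straight to index 3; data[0..2] are never touched.
value = data[3]

def sequential_read(stream, i):
    """Tape-like access: read items in order until the i-th one appears."""
    for position, item in enumerate(stream):
        if position == i:
            return item

assert value == 40
assert sequential_read(iter(data), 3) == 40  # same answer, but after 4 reads
```

Both return 40; the difference is that indexed access is O(1) while the sequential read is O(i).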

  • @CattleRustlerOCN
    @CattleRustlerOCN 8 months ago +2

    And we all know what's happened in the 3+ years since this video was released. x86 is still the architecture of current desktop PCs, and AMD with Ryzen and Threadripper smacks Intel around, and competes very closely with Nvidia in the GPU market, even beating them in some pure raster situations, but is behind when it comes to ray tracing. This technological journey that I have been able to watch and be a part of all these years is fascinating. I'm 54, so I have witnessed a lot, especially the juiciest parts starting in the early 80s.
    Thanks for the video.

  • @felixbaum48
    @felixbaum48 2 years ago +35

    This may be nearly a year old but it's still absolutely brilliant. Thank you for putting it together!

  • @jeffreyphipps1507
    @jeffreyphipps1507 2 years ago +1

    Exceedingly well done. Easy for people to connect with. As a college instructor, I will try to get as many students as possible to view this.

  • @johnpenner5182
    @johnpenner5182 1 year ago +7

    very good and thorough chronicle of early processor development - i like that you were able to trace the architectures to their root in von neumann, through the tubes, the IBM 360, the PDP, the 4004, and the altair (didn't catch if you mentioned this was the iconic machine which appeared to bill gates on the cover of a magazine and inspired the founding of microsoft). you did nice work through the mainframe tubes-to-transistors era, and the microprocessor developments. the Busicom calculator and the engineer calling for a simplified generalized design resulting in the 4004. thank you for this video. recommended.

  • @fretworka3596
    @fretworka3596 10 months ago +1

    Despite quite a few inaccuracies with some earlier market products, trends, and market-share implications, it was a useful overview of the history of the microprocessor. The post-1990 analysis was more accurate.
    Nice for me to remember some of the kit I worked with. I wrote a traffic light controller in Z80 assembler. I'd forgotten that!

  • @sonicboomish
    @sonicboomish 1 year ago +9

    This video was incredible. Thanks a lot for putting all the time and effort into this! really clear and well put together

  • @MrGsteele
    @MrGsteele 11 months ago +1

    Excellent documentary and walk down memory lane. This was the world of computing in which I and my peers in the computer business evolved, and I remember the steps and the back and forth chip competition. It's interesting to reflect on what has happened since this video was produced, and an addendum that brings us up to 2024 would be a logical follow-on to this excellent treatise. I was an early fan of the 6502, then the 680X0 for embedded designs, and then the rise of the Intel-based high volume home computers that transformed the landscape. The progress has been truly stunning.

  • @stachowi
    @stachowi 2 years ago +3

    This was excellent. Looking forward to more content from you.
    I'm a CS/EE, 20 years in the industry.

  • @antonnym214
    @antonnym214 1 year ago +2

    Nice documentary! Thank you. The next leap forward will be photonics. All good wishes.

  • @jparky1972
    @jparky1972 1 year ago +4

    Thank you so much for this.
    I used to be a software developer back in the '90s to early '00s and knew the hardware development up to the dual-core CPUs.
    But after that, I lost interest in the hardware side due to becoming a stay-at-home Dad.
    So thanks for this. Really filled in a lot of gaps.

  • @TedApelt
    @TedApelt 1 year ago +1

    I still remember my Apple II computer in 1979. Programs were loaded on it with an audio cassette tape recorder. Later, I got a floppy drive and thought that was truly awesome.

  • @iVTECInside
    @iVTECInside 1 year ago +9

    Good watch. One thing not mentioned was the fact that the desktop market is somewhat limited by the x86 core architecture: the same instructions from 1980 will function on an i7-12900K. ARM never had that hanging around their ankles. It will be very interesting to see how things develop from here.

    • @Mr_Meowingtons
      @Mr_Meowingtons 1 year ago

      Are you saying ARM made in 1985 has nothing to do with today's ARM?

    • @AndyGraceMedia
      @AndyGraceMedia 1 year ago

      @@Mr_Meowingtons It doesn't really, no. I coded for the ARM1 dev box that plugged into a BBC Master back in 1986, and ARM2 for the Arc 300/400 series and the A3000 BBC Micro. ARM2 was a chipset with the main CPU, video, memory and I/O controllers. ARM3 improved the CPU with 4k cache while the other chips were upgraded separately. Quite cool really.
      Those originals were wildly different from today and are not instruction compatible with even the true 32-bit ARM processors, as they used a 24/26-bit bus with the rest of the bits used for passing around the equivalent of a status register and a four-level interrupt.
      After ARM3 came ARM6 and then the ARM600 series, which were all true 32-bit addressing processors. There was also a cut-down 16-bit ARM Thumb architecture. DEC (and even Intel, which bought DEC's chip business) released the StrongARM, which powered some of the early Windows CE handheld devices like the Compaq iPAQ and the Jornada.

    • @RickHansbury
      @RickHansbury 10 months ago

      I agree about Microsoft keeping their backwards compatibility, but it is necessary. A shocking number of the country's businesses use old computer architecture with new interfaces grafted on.
      Losing backwards compatibility would (they say) cause a financial disaster.
      I think the softies should keep Windows 11 and add a new Windows system with limited and specific compatibility and complete control of API-level architecture for security. They could rewrite the whole architecture to eliminate even the possibility of exploits.
      So essentially two Windows. I already have a nice new laptop securitized and firewalled to the point of uselessness, and an older one used for games and social media only.

  • @Juancheros
    @Juancheros 1 year ago +2

    Excellent video! Glad to see you mention the 4004 and the 8008. Beyond the home microprocessor, there is the growing massive influx of microprocessors into the automotive industry, where multiple RISC devices simultaneously perform complex independent tasks, communicating with each other via evolving CAN bus standards to a central CPU which has the infamous OBD port. This application has become a free-for-all in the industry for the sake of bling, leaving owners and auto mechanics constantly second-guessing what is going on. Would be great to see you make a video on this too, thank you!

  • @samposyreeni
    @samposyreeni 1 year ago +19

    You should talk about the DEC Alpha as well in the RISC design section, because that was totally insane on the RISC front.
    Also Intel's i860 and i960: wildly different architectures, of which the i960 survived rather far, even into printers and space applications.

    • @Chordonblue
      @Chordonblue 1 year ago +4

      Since Faggin was mentioned, I think it's time for Jim Keller to get a mention also. He worked on the DEC Alpha, Athlon, and Zen, and for Apple, Tesla, and many others. Now at Tenstorrent; AI may be the answer to the next big discovery in just about everything.

    • @JadedArsenic
      @JadedArsenic 1 year ago +2

      And he totally skipped over DEC's foray into desktop computing in 1978 - the DECstation!

    • @BasVossen
      @BasVossen 1 year ago +3

      The DEC Alpha really was the first 64-bit architecture. Too bad the managers were too nice for the cut-throat IT business.

  • @mrflamewars
    @mrflamewars 2 years ago +7

    Great video. Sandy Bridge was a defining moment for Intel - it's why so many of their later CPUs are so similar.

  • @tonylewis4661
    @tonylewis4661 2 years ago +3

    And let's not forget the first 16-bit home computer (severely crippled by an 8-bit bus and closed software and hardware development by TI): the ill-fated 99/4A with the TMS9900 (but it did sell close to 3 million units, virtually all at a loss).

    • @RickHansbury
      @RickHansbury 10 months ago

      I got one for Christmas that year and I loved it. I had the BASIC cart and Assembler language. And a bunch of games.
      I got the BASIC cart by using the limited BASIC on board to show my parents what it could do, convincing them that instead of the home budget cart I could program it in BASIC. And I did. ❤

  • @samposyreeni
    @samposyreeni 1 year ago +1

    Typically I'm hard to surprise when it comes to computing and the like, but now I learnt about maybe dozens of "new" historical machines. Well done, indeed.

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  1 year ago

      Thank you so much :) and thanks for the rest of your comments, it’s always good to hear someone else’s insight!

  • @kensmith895
    @kensmith895 1 year ago +9

    Truly excellent documentary. Well done. It charts the timeline of my career, from the 8008 to the present day.
    If you do release an updated version of this video it would be good to add a mention of the DEC Alpha RISC machine, and also Colossus from Bletchley Park. There were various other microprocessors that you could make passing reference to along the way, such as the LSI-11 implementation of the PDP-11 and the microprocessor versions of the VAX architecture. Also, HP had some proprietary microprocessors that they incorporated into their own products.

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  1 year ago +1

      Thank you very much! If I ever make an updated version I’ll take your considerations on board :)

    • @vicheakeng6894
      @vicheakeng6894 1 year ago

      ADHESIVE

  • @joshjones3408
    @joshjones3408 11 months ago +1

    The background music is awesome....the video is great 👌👍👍👍👍

  • @dj_paultuk7052
    @dj_paultuk7052 11 months ago +3

    What about Colossus in 1943? One of the greatest achievements overall in computing, and far ahead of anything else in the world at the time.

    • @Stef-2U
      @Stef-2U 9 months ago

      Most people, especially in the USA, didn't hear anything about Colossus till about 2002, although some of us born in the UK with parents in the military knew of it years before the UK government started to declassify it in 1975. It was kept secret as to its purpose during WW2. I knew, though, cos me dad told me lol

  • @jasonanthony166
    @jasonanthony166 10 months ago

    For old geeks like me, this was a pleasant journey down memory lane. You have put such a lot of research into this project and spent a long time with the editing. Your efforts are very much appreciated 😊

  • @thorstenglaubitz1006
    @thorstenglaubitz1006 1 year ago +4

    Well-made documentary, Archie. I've seen many 'history of the CPU' videos and yours is by far the most informative and thorough one. I enjoyed it a lot. Thank you

  • @jaqueitch
    @jaqueitch 1 year ago

    This video is probably the absolute best, most comprehensive story of the microprocessor. Just amazing.... KUDOS!

  • @--fishiiki-
    @--fishiiki- 2 years ago +8

    Amazing series! I'd love to see more like this. Great work man, can't believe this only has this many views

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  2 years ago +2

      Glad you enjoyed it! I am working on a similar video at the moment - keep an eye on the channel's community tab for updates :)

  • @insoft_uk
    @insoft_uk 2 years ago +1

    Very well done. I grew up programming on the 6502, Z80, 68000 and the dreaded 386.
    I have early memories of Acorn's ARM desktops; finally the ARM has come of age.

  • @vanlife4256
    @vanlife4256 1 year ago +11

    Archie, this was an awesome review of the Home Computing history! Great production! Thank you for sharing!

  • @chemtech90
    @chemtech90 5 days ago

    I’m amazed at how well you built the story. Brilliant educative video about the history of computing.

  • @TailRecursion
    @TailRecursion 2 years ago +22

    The production quality here is absolutely incredible and you've not even hit 3K subs yet. You deserve 1000x that, easily. I'm also seeing parallels to Ahoy in all the best ways. Thanks to the algorithm for bringing me here, and thank you for making excellent content; you've got a new sub!

  • @energysavingday
    @energysavingday 1 year ago +2

    A hugely impressive, multi-decade summary. Well done.

  • @MrPDawes
    @MrPDawes 1 year ago +17

    I was getting worried that you were not mentioning ARM whatsoever until about 80% of the way through. Given the world dominance of this architecture, not one to miss. The early Archimedes computers from Acorn were a step change in performance over the Intel architecture at the time, but their lack of a maths co-processor became a significant disadvantage during the media war, as graphics has a high compute demand.

    • @TheUAoB
      @TheUAoB 1 year ago +5

      I felt the same way while watching. The ARM did have an FPU (FPA10) and was designed to support coprocessors, but indeed the standard Archimedes didn't ship with it. This meant software didn't really take advantage of it, since FP instructions were likely to be slow with the trap-based floating-point emulator.
      Acorn did try to break into the UNIX workstation market, which would have meant much more powerful ARM-based computers in the early 90s had it been successful. Even then, Acorn chose to make the FPU optional even on their flagship £3,995 R260 (a rebranded Archimedes A540); without an FPU you have to wonder what market Acorn was actually aiming for!

    • @AR15andGOD
      @AR15andGOD 1 year ago

      math

    • @ThatSockmonkey
      @ThatSockmonkey 1 year ago +1

      @AR15andGOD "math" is not a word.

  • @kurtisrios8820
    @kurtisrios8820 5 days ago

    Thanks for keeping this information not only alive, but digestible, for people like me who didn’t experience it first-hand. It’s incredibly fun reading about everyone’s experiences in the comments with the context about the historical events and the progress of technology from the video. Also, the animations and video editing are top notch ❤

  • @notation254
    @notation254 2 years ago +4

    Great doc, loved all the details throughout the years. You should be proud.... and damn, you deserve more views and subs for this.

  • @unityxg
    @unityxg 5 months ago

    The Encarta cameo brought back some flash back memories. What a time to be alive. I am happy to see what the world looked like before mainstream Personal Computers. I think computers have really revolutionized humanity in many different ways, for better and for worse.

  • @Magnulus76
    @Magnulus76 2 years ago +10

    I had a Phenom II. It was a great budget CPU. I used it for some fairly hefty tasks, like chess analysis. Clock speed wasn't everything back then.
    Bulldozer was a disaster and I didn't upgrade my CPU until the Ryzen generation.

    • @golangismyjam
      @golangismyjam 2 years ago +1

      You say it was a disaster, but I owned one of those CPUs for 10 years and it still plays AAA games to this day. Sure, they ran hot and used a lot of electricity, but that was all AMD CPUs back then

    • @manuelhurtado9970
      @manuelhurtado9970 2 years ago

      @golangismyjam Yeah, my brother uses my dad's old FX-8350 and it works fine for games like Fortnite, Minecraft or Roblox. Hell, it can even run Elden Ring with a GTX 970

    • @ismaelsoto9507
      @ismaelsoto9507 1 year ago

      @manuelhurtado9970 The hexa- and octa-core FX CPUs can still run modern titles well enough; they sure aged better than the i3/i5s from the same time period, which started struggling when games used more than 4 threads.

    • @manuelhurtado9970
      @manuelhurtado9970 1 year ago +1

      @ismaelsoto9507 Yeah, true, the FX series had faster clocks and more cores; the only problem is that pairs of cores share some resources, like the FP scheduler

    • @ismaelsoto9507
      @ismaelsoto9507 1 year ago

      @manuelhurtado9970 Yeah, it may have made it easier to develop/manufacture an octa-core CPU without being too expensive (an FX-8150 has a die size of 315 mm² vs Intel's Xeon E5-2650 through 2690, all octa-cores with a die size of 435 mm² on Intel's 32 nm, which was denser than Global Foundries' 32 nm process), but it crippled their IPC... AMD hoped software would catch up fast and fully utilize the 8 threads to make it a more compelling option than the competition; sadly that only happened when the hardware was already obsolete.

  • @BigDaddy_MRI
    @BigDaddy_MRI 1 year ago +1

    I'm 70 now, and I watched a lot of this microprocessor evolution (except for the very early development), and I remember the Intel 4004 being produced.
    While in the US Navy, I was able to get a sample of a 2.5MHz Z-80 by Zilog, and using 256 bytes of RAM and 2K bytes of EPROM I wrote a tic-tac-toe game. The 40-pin Z-80 is still being made and the Z-80 family of chips is still available. Brand new. I'm still writing code and building projects with that 8-bit chip. Back in the early '90s, the Z-80 ran almost 100% of all the bank cash machines, and even today it still does; albeit with a much smaller, more advanced implementation and higher clock speeds, the core compute engine is still the Z-80. My opinion only: it remains the most powerful 8-bit microprocessor ever designed. 178 instructions, 3 interrupt modes, and indirect addressing for memory and I/O; it is an amazing device. And my Heathkit H89 with a Z-80 still boots into CP/M (or MS-DOS) with no problems at all. I write code on that machine for my projects.
    Thank you for an OUTSTANDING video!! Wow, that took me down a great bit of fond memory lane. Pun intended.

  • @Aurange
    @Aurange 2 years ago +4

    1 1/2 years after the fact and this video finally got blessed by the algorithm gods.

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  2 years ago +1

      Indeed - I now get more views in 6 hours than I did in the first 6 months of the video release!

  • @v8pilot
    @v8pilot 1 year ago +1

    In 1965 I was an EE student at Birmingham University. Dr Wright was one of our lecturers and his speciality was semiconductor devices - particularly heterojunction transistors. In a lecture he mentioned the latest thing - integrated circuits. I asked him if he thought it would be possible to get a complete computer on an integrated circuit. My recollection is that he told me not to ask silly questions. He obviously thought I was taking the piss.

  • @01chippe
    @01chippe 2 years ago +8

    This was thoroughly enjoyable and a great trip down memory lane. Why didn't you include the shift to integrating graphics processors on the CPU? Great video and very detailed.

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  2 years ago +5

      Thank you! Ultimately in a video like this, you have to limit your scope somewhere, and since integrated graphics and other dedicated on-die accelerators are a fairly new concept (only really appearing in the last 10 years) they were left out.

  • @spicybreadproductions1972
    @spicybreadproductions1972 1 year ago

    It's fun watching a video talk about large computers with comparatively limited instruction sets on a phone that handles division like a breeze. We've truly come a long way.

  • @Damjes
    @Damjes 2 years ago +3

    Also, CP/M ran on the 8080; it ran on the Z80 only because of the Z80's backward compatibility.

    • @fredbear3915
      @fredbear3915 2 years ago +1

      Yes indeed. CP/M was written in 8080 assembly language, so it was only ever going to use the 8080 opcodes that the Z80 also ran for compatibility's sake. When I wrote CP/M utilities back in the 1980s, even though I wrote in Z80 assembly language, I had to make sure to only use those 8080 instructions from within the Z80 set, otherwise my software would not have run on a lot of machines!

  • @DataWaveTaGo
    @DataWaveTaGo 2 years ago +1

    As a hardware/software designer since 1972 I have to say - *Excellent!*

  • @cemacmillan
    @cemacmillan 2 years ago +11

    This is an excellent overview of how things have changed, and for me (active in development from 1991 onward) it shows just what a mess things really were, and how essentially things were held back. I wish there were more mention of how anti-competitive practices did much to create the duopoly in the desktop sphere which existed until 2020, but this addition would have made the video at least twice as long :) Here in 2022, things still appear "open." I'm sitting here with an unused Linux box with a water-cooled original AMD FX-series chip under my desk, writing this comment on a 2021 MacBook Air which, despite so little RAM, outperforms everything I've ever personally owned while making no noise. Will we see more and more ARM, or might something disruptive and interesting emerge? We'll see.

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  2 years ago +3

      Thanks! :) Indeed, only time will tell.

    • @awuma
      @awuma 1 year ago +1

      A PC is silent if you use high-quality fans (i.e. Noctua), and control them properly.

  • @ovechkin100
    @ovechkin100 1 year ago +1

    As a kid I of course had no idea how new computer tech really was. I was born in 1988. I remember playing on a computer in the mid 90s, and the games were on floppy discs. They were all so ghetto, but back then it's all I knew. Then in the late 90s my parents got a whole brand new computer and wow: MSN, computer games, surfing the web. Obviously it's a lot different today, but you could still generally do the same stuff: talking to my friends, playing games, researching cool shit. And ever since, it's only elaborated. It's bizarre that it all came around right as I grew up and was able to just fall into it. Insane times. Will be interesting to see how far it goes.

  • @sophiekempston5152
    @sophiekempston5152 4 years ago +24

    Fantastic series, can't wait for the next one!

  • @mr88cet
    @mr88cet 9 months ago +1

    Superb history summary! Thanks.
    I just recently retired from 40 years in this industry, so I remember *_a lot_* of this.
    What’s wild though, is that the parts of this history I do _not_ remember as well, are probably the parts that most do: I mostly worked in the embedded-compute arena, have been an Apple dude from the Apple ][ days, and am not much of a gamer, so the exact details of the Intel-AMD x86 wars were not something I followed in much detail.

  • @CobraTheSpacePirate
    @CobraTheSpacePirate 2 years ago +4

    Fantastic! Needs more views! What a shame!

  • @kgbmmt
    @kgbmmt 8 months ago

    Great job! I was there back in the '80s and worked in computer retail through to 2008. It was a fantastic journey and your video brought the memories flooding back. Thank you!

  • @charlesjmouse
    @charlesjmouse 2 years ago +3

    A rather belated "very good".
    Such an excellent video with so few views! I must go see what else you've done and maybe add to views.

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  2 years ago

      Glad you enjoyed it! I am in the process of creating the next long form video :)

  • @JeremyWinfreeDev
    @JeremyWinfreeDev 2 years ago +1

    I've been doing all of my dev and design work on an M1 Mac for about a year now and it's amazing

  • @jacobrzeszewski6527
    @jacobrzeszewski6527 2 years ago +8

    Awesome comprehensive video. Kinda surprised you didn't mention Intel "tick-tock", and AMD releasing the first 5GHz processor, even if AMD did it via a technicality.

    • @toby9999
      @toby9999 2 years ago

      I guess he can't cover every detail of everything in one hour.

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  2 years ago +1

      Thanks! :)

  • @ai_is_a_great_place
    @ai_is_a_great_place 1 year ago +1

    24:44 my first reaction would be to take a pic of it to save as reference, but you would have had to write it down by hand, wow!

  • @babythorgaming2166
    @babythorgaming2166 2 years ago +17

    The production quality on this is so darn high, how is this channel so small?? You've definitely earned a new subscriber, and I hope to see new content from you in the future!

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  2 years ago +5

      Thank you so much! I am currently working on a new video, and will be posting updates on the community tab as it develops :)

    • @customsongmaker
      @customsongmaker 2 years ago +2

      There are 46 ads in this one video, so maybe that's why

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  2 years ago +4

      Hi customsongmaker, I'm surprised you were served that many ads - I just rewatched it on another one of my channels and I got less than 1/4 of that, so I'm unsure why you got so many.
      I would add that the video has only been monetized for less than a week and I've been playing around with the ads to see what the best ratio is re: watchability vs revenue. I have received a few comments suggesting that the ad frequency is a little high and I will be adjusting that accordingly when I'm back at my computer in a few days.
      One final thing to say: as a small creator, ad revenue is the only source of income (no patrons or YouTube members), and looking to the future (I will be finishing my PhD next year so may not have a flexible schedule after) it will be difficult to continue making videos like this without revenue. I appreciate your comment and will take feedback on board - feel free to keep an eye on the community tab for more updates :)

    • @customsongmaker
      @customsongmaker 2 years ago +1

      @@TechKnowledgeVideo I counted the ad breaks on the progress bar. I didn't actually spend 20 minutes watching advertisements just for your video, I stopped watching very early. If there had only been 3 ads - 1 at the beginning, 1 in the middle, and 1 at the end - I would have watched 3 ads, which is 3 times as many ads as I actually watched.
      Try breaking this video into 5-minute or 10-minute videos. "The Complete History of the Home Microprocessor: Part 1". Then you can see how many people will watch 5 minutes or 10 minutes, with 1 ad. If they like it, they will continue to watch the rest. They will also subscribe, since it makes them think of you as someone with many good videos that they don't want to miss.

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  2 years ago +3

      The ad breaks on the progress bar do not correspond to ads you will actually see - YouTube will only play about 1/5 of the adverts it displays on the progress bar - which for this video works out at about one every 9 minutes.
      If you look at the channel I have done what you have said - split this video into 5 separate videos, where viewers will typically see 1 mid-roll or less per video (with average video lengths of around 20 minutes). However, this combined video has been far more popular than the individual parts. As to why this is I'm not sure, but the algorithm far prefers pushing this video out over the others.
      I would add that video retention has stayed the same in the week the video has been monetized compared to the prior week - people on the YouTube partnered subreddit have done loads of testing on a whole range of videos and, counterintuitively, it really genuinely doesn't affect watch time. However, having watched the video back myself for the first time, I do think the quality of the video is degraded with the current frequency of adverts and I really want people to have a great viewing experience. Hence I will reduce the number of ads after the weekend.
      If you do want to watch the video without ads feel free to use an ad blocker or watch the individual parts, which are a lot lighter ad-wise :)

  • @video99couk
    @video99couk 2 years ago +1

    Back in 2004 I bought a super-compact Staples branded laptop with a processor which was a fairly unusual fork and I think wasn't mentioned here. It has a 1GHz Nehemiah C3 processor by Via, designed for low consumption and cooling requirements. It was a "netbook" PC before the name had been coined, and served me well for many years.

    • @absalomdraconis
      @absalomdraconis 2 years ago

      Yeah, Via processors generally fall into a "forgot to mention they also ran" category.

  • @NesNyt
    @NesNyt 1 year ago +61

    The 6502 was not the king... if ICs had a religion, the 6502 would be god

    • @toby9999
      @toby9999 1 year ago +2

      It was the 8bit cpu I enjoyed programming the most.

    • @TheElectricW
      @TheElectricW 11 months ago +3

      Likewise... good grounding for assembly programming... I used to discuss it with a Z80 programmer; he couldn't see how it was possible to write programs with only 2 x 8-bit registers available!

    • @lazymass
      @lazymass 11 months ago

      @TheElectricW Do you happen to have a source I can look at that shows how programming is done for such chips?

    • @hanspeterbestandig2054
      @hanspeterbestandig2054 10 months ago

      @lazymass Have a peek at the YouTube channel of "ChibiAkumas". Keith will teach you all the things needed to write code for 6502-, Z80- or 68000-based computers. Among other things he published two books, "Learn Multiplatform Assembly Programming with ChibiAkumas" parts 1 & 2... Highly recommended!

    • @sharedknowledge6640
      @sharedknowledge6640 10 months ago

      Indeed. The 6502 simply crushed the competition for affordable home computers.

  • @chrisfox8848
    @chrisfox8848 1 year ago +1

    Very interesting documentary; I soaked up all 86 minutes of it, well done. Being an Atari 800 man, the 6502 processor was my world for a good while. I learned to program the beast in assembly language and I was pretty much addicted to the Atari games, which usually cost an eye-watering £39.99 each (that was a hefty price in the mid 80's). The Atari 800 clocked in at a blistering 1.79MHz and had a mind-bending 48K - yes, 48K, or 393,216 bits - of RAM and another 10K of ROM (I think!). What a machine. Well, it's a thumbs up from me, happy days...
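
The RAM figure in the comment above checks out: 48K means 48 × 1024 bytes, and each byte is 8 bits. A quick sanity check in Python:

```python
# Verify the Atari 800 comment's figure: 48K of RAM expressed in bits.
ram_bytes = 48 * 1024      # 48K -> 49152 bytes
ram_bits = ram_bytes * 8   # 8 bits per byte
print(ram_bits)            # 393216 bits, matching the number quoted above
```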

  • @endofthelinejoel
    @endofthelinejoel 2 years ago +9

    This deserves millions of views. Well done.

  • @brianbowcutt249
    @brianbowcutt249 2 years ago +1

    The great and terrible algorithm kicked me here after I purged dozens of recommendations for sub 60 second cat videos and memes, glad something out there noticed my futile attempts to save my brain. Excellent work, looking forward to seeing more.

  • @jamesbond_007
    @jamesbond_007 2 years ago +3

    This is an exceptionally well done video! Fantastic capturing of the entire history from the origins to today. Great job!!!

  • @brianschuetz2614
    @brianschuetz2614 1 year ago

    Just saw this today. Very interesting, even two years later. I was in high school in the early 80s. I did not grow up with computers around me. My mom was a high school dropout, and to this day won't touch a computer. My dad did graduate high school. He's been a truck driver most of his life. I remember having a "Pong" based computer console in the 1970s, and then I got an Atari 2600. It was the Atari that sparked my interest in computers. I was excited when I got my first computer, the Texas Instruments TI-99/4A, for Christmas. In the late 1980s I was in the military and at one assignment I had I was using a dBase III database. That's when I realized the career possibilities of computer programming. I bought a book on how to program the dBase III application, and wrote some programs to make my job easier. Once I separated from the military I went to college and studied Computer Science. There was so much I didn't know about computers. Now, I've been a computer programmer for a couple decades. This video gave me some context to my life regarding home computers.

  • @Damjes
    @Damjes 2 years ago +3

    Wrong again. Not "every machine" used vacuum-tube technology. Konrad Zuse created the Z1-Z4 computers, which were relay-based AFAIK.

    • @crabby7668
      @crabby7668 1 year ago

      CuriousMarc did an interesting visit to Japan to see a relay-based computer that is still running. IIRC they were used commercially once. Worth a look if you are interested.

  • @chafacorpTV
    @chafacorpTV 10 months ago

    This video reminds me so much of Ahoy, a channel that I absolutely adore. Everything was concise and easy to follow, alongside some cool synths.

  • @mikedrop4421
    @mikedrop4421 1 year ago +3

    This is stellar work sir. Yes it gives off Ahoy vibes but your style shows through. Please make more stuff!

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  1 year ago +1

      Thank you so much! :)

    • @daveroche6522
      @daveroche6522 1 year ago

      Agreed. VERY informative and interesting - for those of us with an actual (functioning) brain......

  • @paulbrouyere1735
    @paulbrouyere1735 1 year ago

    Hear, hear! This is one of the best summaries I've seen about microcomputers - their history and their differences - from 1947 up till 2022

  • @maniacfox111
    @maniacfox111 4 years ago +15

    Really great content. Well put together!

  • @markbanash921
    @markbanash921 2 years ago +2

    I took my undergraduate degree at the University of Pennsylvania, where I did my introductory computer science courses in the Moore School of Engineering building where Eniac had operated. At that time units of the old machine were lying around in open storage on the first floor and when I would walk to class I'd go right past them and be amazed at how large and clumsy they seemed. Two years later I started seeing ads for the Timex ZX81, the American version of the Sinclair machine, appear in magazines like Scientific American. The juxtaposition of those two computers got one thinking about how far the revolution had come, as well as how far it could go.

  • @frankowalker4662
    @frankowalker4662 2 years ago +5

    Sorry about this, but you didn't mention the world's first electronic programmable computer, Colossus, developed from 1943-45.
    Great documentary.

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  2 years ago +2

      Thanks! Colossus wasn't technically Turing complete so this is why it is not mentioned :)

    • @frankowalker4662
      @frankowalker4662 2 years ago +1

      @@TechKnowledgeVideo Fair enough. :)

  • @chriswilde7246
    @chriswilde7246 9 months ago

    I think it was the late 70's when we started to see the small digital calculators in shop windows; they were very basic and, if I remember correctly, very expensive.
    I also remember when Quake came out, played it for hours! Lol
    Great clip!

  • @joelstyer5792
    @joelstyer5792 2 years ago +3

    This is a great video, well put together and researched. I lived through most of this (from the 60s forward) and it is nice to see it all condensed together in a timeline. I was hoping to see something about Intel buying DECs chip division and gaining knowledge from the Alpha processor design (fast speeds and high power dissipation) but understand that not everything can be included. Near the end of the 8086 era, the NEC V series chips also had an impact with consumers as well to increase performance. Congratulations on some excellent work.

  • @jergervasi3331
    @jergervasi3331 10 months ago

    This video is the TRUE definition of "epic". Very well done, thank you!

  • @willibaldkothgassner4383
    @willibaldkothgassner4383 1 year ago

    And you speak perfectly - slow and easy to follow even for non-native speakers. Thanks!

  • @Conenion
    @Conenion 2 years ago +6

    To say that Intel Itanium was a RISC design is a bit of a stretch. Actually, back then it was the RISC crowd that said that Itanium's VLIW approach was doomed to failure. The main difference between VLIW (which Intel calls EPIC) and advanced superscalar RISC designs is that EPIC does not allow for out-of-order (OoO) and other dynamic execution of instructions. Instead all this burden is put on the compiler. In fact, if you ignore dependencies between instructions and thus the order of instructions produces a wrong result, Itanium will happily deliver a wrong result. Itanium does no data dependency checking at all; this has to be done by the compiler.
    Removing all dynamic execution features presents a dilemma: the compiler, which has to be very, very smart, is forced to find every bit of instruction-level parallelism (ILP) during compilation. EPIC has no way of doing any sort of reordering or re-scheduling. If the compiler isn't able to find ILP, there isn't any at all; instead a NOP is issued to the pipe, resulting in VLIW instruction bundles which load only 1 of 3 pipes with work while the other 2 just do NOPs. In that case you lose badly. This static scheduling is especially difficult with loads, since the memory hierarchy presents us with practically-impossible-to-predict memory delays.
    VLIW/EPIC works best with code like matrix multiplication, which is very static w.r.t. input data. In such cases parallelism is basically given - an easy job for a compiler to parallelize. But such code is rather rare in a typical desktop or server setting. Also, such SIMD computations can be done nicely in the vector units of non-VLIW processors, like SSE or AVX in the x86 world.
    In short, VLIW/EPIC is an architecture that is targeted too much towards specific computational classes to be a general-purpose CPU architecture. Also, writing a compiler for EPIC which is able to extract every bit of ILP was/is close to impossible. There were other problems specific to Intel's implementation, notably that executing legacy x86 code was painfully slow.
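
The NOP-padding effect described in the comment above can be made concrete with a toy scheduler (Python, purely illustrative - a hypothetical 3-slot machine, not real Itanium encoding). Instructions are packed statically into fixed-width bundles; anything that depends on a value produced in the current bundle must wait for the next one, and the empty slots become NOPs:

```python
# Toy static VLIW bundle packing (hypothetical 3-slot machine, not a real ISA).
# Each instruction is (dest, [sources]). An instruction may only issue in a
# bundle if all of its sources were produced in *earlier* bundles, so a serial
# dependency chain forces one useful op per bundle - the rest become NOPs.

def pack_bundles(instrs, width=3):
    bundles = []
    remaining = list(instrs)
    done = set()                      # values available from earlier bundles
    while remaining:
        bundle, produced = [], set()
        for ins in list(remaining):
            dest, srcs = ins
            if all(s in done for s in srcs):
                bundle.append(dest)
                produced.add(dest)
                remaining.remove(ins)
                if len(bundle) == width:
                    break
        bundle += ["NOP"] * (width - len(bundle))   # pad the unused slots
        done |= produced
        bundles.append(bundle)
    return bundles

# A serial chain r1 -> r2 -> r3: each bundle carries one op and two NOPs.
print(pack_bundles([("r1", []), ("r2", ["r1"]), ("r3", ["r2"])]))
```

With three independent instructions the same packer fills all three slots of one bundle; an out-of-order superscalar core would make that decision at run time, which is exactly the dynamic machinery EPIC removed.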

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  2 years ago +2

      Very interesting stuff, thank you for your insight!

    • @absalomdraconis
      @absalomdraconis 2 years ago

      Barring any memory & similar delays (not particularly familiar with the design, so not sure how those would be handled), it shouldn't actually be _too_ difficult to get decent (not necessarily perfect) scheduling done.
      In essence, you compile & link, but don't output a final executable, instead producing an SSA form or similar. You then take this and throw it through a scheduler stage - the scheduler defaults to outputting a NOOP for _all_ of the executable code, but _looks for_ instructions that it can grab from the SSA form as _replacements_ for NOOP instructions, marking those SSA instructions as "finished" in the process. The matching would probably be best done by a module written in Prolog or something. Wouldn't be all that fast, but with a decently-sized window should be fairly effective.

  • @Easyrhino2k
    @Easyrhino2k 2 years ago +2

    For the mere quality of this video I can offer you one subscriber - namely me. Although I don't fully comprehend all the lingo about the tech stuff, I find it truly fascinating, and therefore I watched the whole video from start to end in one go, and will recommend it to all I know with the same fascination for tech stuff and especially PCs. Good job

  • @SquallSf
    @SquallSf 2 years ago +3

    Very interesting video; I watched the whole 1:26h.
    I didn't know that the first microprocessor was actually made for the F-14. Since it is declassified now, could you point to a video with more details?

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  2 years ago +1

      Thank you! Unfortunately I cannot post links here, however I can recommend a great article by WIRED entitled “The Secret History of the First Microprocessor, the F-14, and Me” which goes into more detail :)

  • @Nehmo
    @Nehmo 1 year ago +2

    I hate to add "old-guy" perspectives, but I lived through the transition from tubes to transistors. Modern kids view that transition as a short misdirection in the march toward the integrated circuit. But it was a huge advancement that has nothing comparable in modern times.

    • @laustinspeiss
      @laustinspeiss 2 months ago

      I'm in the same boat!
      But it does give us the 'under the hood' knowledge that lets us make systems do things better and faster with the same hardware!
      I still write code that doesn't waste any more resources than necessary!

  • @bazza5699
    @bazza5699 3 years ago +6

    Wow! How has this got so few views!?

  • @HelloKittyFanMan
    @HelloKittyFanMan 1 year ago +2

    It's pretty interesting that RISC took so long to gain a major foothold, but now that it finally has, it has come up from way back behind so many CISC CPUs, going all the way back to the 80s when Acorn created its RISC designs!

  • @XFAiDERse
    @XFAiDERse 3 years ago +3

    WHAT.A.DOCUMENTARY.
    ⭐⭐⭐⭐⭐
    💚💚💚💚💚
    👏👏👏👏👏

  • @madd20
    @madd20 2 years ago +2

    Why does this video have only 2,100+ views? It should have 1000x more; this is a good compendium of knowledge about CPU history :)
    Great work, I will share this.

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  2 years ago

      Thanks a lot! As a small channel every share makes a big difference :)

  • @SMJSmoK
    @SMJSmoK 1 year ago +3

    Really awesome video!
    Regarding the future of ARM on desktop, I think that this will come when some solution to efficiently emulate x86 gets developed. Apple was able to make the jump because of the restricted and curated nature of their ecosystem. But the "Windows world" is so dependent on legacy software (which is also endorsed by Microsoft with their insistence on stable APIs and backward compatibility) in both home and business that I think that this is absolutely crucial. Apple has Rosetta for this purpose, not sure how efficient it is to be honest.

    • @orestes1984
      @orestes1984 10 months ago +1

      Apple can efficiently emulate Windows 11, and its compatibility layer to run even Windows 95-era software on ARM already exists... Microsoft is already transitioning, and Windows 11 has a hardware compatibility layer much like Apple's Rosetta which runs x86-64 code at near-native speeds. I have it installed in a VM on my Mac. The only complaint regarding ARM-based computers is where the GPU power is going to come from.
      Due to its current architecture, Apple is GPU-locked to its Silicon processors, which have no ability to access off-chip GPUs.
      Microsoft/Intel will have to come up with an alternative chip solution but that's only IF they want to keep the GPU industry alive...
      The days of the GPU, or even the front-side bus, are severely limited anyway; eventually, as Apple is proving with Apple Silicon, it can deliver at least mid-level GPU performance on the same chip as the CPU and run most games well at at least 1080p with playable frame rates (in terms of the games available).
      Intel is working on a chip similar to the Apple M1, M2, and M3, but it seems they are further away than ever from achieving it, and if they do, it won't be x86-based either...
      Which leaves AMD in a better position, especially as they acquired ATI, and then there's the dark horse Nvidia, which is also producing PC-like performance with its Shield-based devices.
      This all leaves Intel in the weakest long-term position in terms of its roadmap, with hot, bloated, and slow processors (unless you are drawing about 500 watts, which no one really wants to do anymore), unable to die-shrink without either losing performance or worsening the current heat issues, where at about 100 degrees without water cooling you can basically cook an egg on top of the CPU coolers of most Intel-based workstations/desktops.
      Then in the mobile segment, struggling to push battery life up to 4 hours (vs 16 on a Mac) while basically using desktop components in what can only be classified under the old term of "sub-desktop" computers. Notwithstanding the horrible quality of cases, keyboards, and trackpads by comparison to unibody Macs (whose design principle is now 15 years old). That's a bother.
      You get a better user experience in terms of tactile functionality and ergonomics from a 2008 unibody Mac than from any current Intel or AMD laptop on the market, which is why Macs still dominate that market space to this date; if you don't need a stupid API like .NET Framework or DirectX, every man and his dog is using a MacBook Pro, as the user experience is miles ahead.

  • @TechDeals
    @TechDeals 5 months ago

    At the 1-hour mark, you note that the i7-920 didn't often show a performance gain.
    Having been an early adopter of it, moving from the Q6600, I have to strongly disagree with that. The difference was obvious and noticeable the minute you did more than treat it as a "faster older PC".
    The multitasking performance was astounding and the overall responsiveness was a step up. I ended up buying 2 of them in 2009 to replace both of my Core 2 Quads at my office; the difference was noticeable enough.