Are Computers Still Getting Faster?

  • Published 9 Jan 2016
  • In a recent episode I explored a 10-year-old MacBook to see if it could still keep up in today's world, and surprisingly it could. So in this episode I explore why that is.
    Guest stars include:
    Rob Ivy / The Obsolete Geek
    Mike Murray / The GeekPub
    Robin Harbron / www.p1xl.com
    Clint Basinger / Lazy Game Reviews
    Also visit 8Bit Keys here:
    / @8bitkeys

COMMENTS • 6K

  • @davidmullenix3757
    @davidmullenix3757 4 years ago +636

    "Some CPUs have as many as 8 cores."
    Looks over at the AMD threadripper with 64 cores and 128 threads...
    Man, this video was only 4 years ago!

    • @DrDoom-yf2qj
      @DrDoom-yf2qj 4 years ago +35

      sure, but the average is still 4 cores

    • @ryanmoore8814
      @ryanmoore8814 4 years ago +47

      I'd argue that 6 core cpus are more popular now what with Ryzen 2nd and 3rd gen. I could be wrong though.

    • @stigrabbid589
      @stigrabbid589 4 years ago +13

      @@ryanmoore8814 And 6 core Intel core i5 and i7 CPUs (my ASUS ROG Scar II has a 6 core i7)

    • @Moonknife
      @Moonknife 4 years ago +3

      That's Moore's law for you!

    • @derrickcummings3990
      @derrickcummings3990 4 years ago

      The bus was already partly quad-pumped at 4×32 (an example would be a Core 2 Duo with FSB1333).
      If you have a buffer you can fill, or repeating commands that can be reused, the CPU can also skip parts of the code, like it does with the TLB.
      If the TLB still holds the data, nothing has to be reloaded; only the new commands have to be executed.

  • @SollowP
    @SollowP 2 years ago +102

    "As many as 8 cores"
    AMD: "Gotta pump those numbers up, those are rookie numbers."

    • @johnsledge3942
      @johnsledge3942 2 years ago +3

      the next amd epyc chip could have 128 cores, amd just keeps going!

    • @SkyenNovaA
      @SkyenNovaA 2 years ago +1

      @@johnsledge3942 I'm really starting to like amd, they're much better than they were a few years back.

    • @everythingtube172
      @everythingtube172 10 months ago

      @@SkyenNovaA yeah, as soon as Zen happened

  • @beedslolkuntus2070
    @beedslolkuntus2070 4 years ago +376

    Kind of amazing how we are "doubling" every resource of a computer. He said 8 cores is the max? Today we have 64 cores in one socket!

    • @PunakiviAddikti
      @PunakiviAddikti 4 years ago +28

      64 cores? That's gonna generate a fuckton of heat though, a gaming PC with 64 cores would become a stove!

    • @GhostyOcean
      @GhostyOcean 4 years ago +55

      @@PunakiviAddikti gaming doesn't fully utilize all the cores on your PC, unless you're using them in virtual machine and they're all gaming

    • @PunakiviAddikti
      @PunakiviAddikti 4 years ago +6

      @@GhostyOcean Idk what kind of gaming PC you have, but my PC always uses all cores to do various background tasks and assigns a few cores for the game itself.

    • @JamieVatarga
      @JamieVatarga 4 years ago +26

      @@PunakiviAddikti still, most games are optimized to run on only 4 cores.

    • @PunakiviAddikti
      @PunakiviAddikti 4 years ago +6

      @@JamieVatarga That might be the case, but consider this: if your PC only has 4 cores and the game needs 4 cores to run smoothly, ignoring the GPU and RAM, then your operating system doesn't have any resources left to do background tasks. Just because you have nothing open currently and your CPU should be idle, doesn't mean your PC isn't doing a lot of things behind the scenes. Especially when you're playing a game. Your operating system will use one or two cores to do other tasks if needed, but that takes resources away from the game. Minimum of 4 cores doesn't mean you can get away with only 4 cores.

  • @nko3210
    @nko3210 1 year ago +14

    Fun to look back after ~7 years to see this perspective. Would be a great candidate for a follow-up episode, maybe on this video's 10th anniversary, which is coming up fairly soon.

    • @morganrussman
      @morganrussman 1 year ago +4

      Maybe if we bring it up enough in his newly released videos, he'll release an update video on it. I know people have repeatedly asked David about certain items in his videos (take the kitty rocket tower, for instance) and he eventually did make a single video explaining where certain things are.

    • @KofolaDealer
      @KofolaDealer 10 months ago +2

      Core 2 duos are still usable

  • @LGR
    @LGR 8 years ago +1199

    Thanks for asking me to be a part of this! I really enjoy how a seemingly simple question has no one answer, it's fascinating stuff.

    • @obsoletegeek
      @obsoletegeek 8 years ago +24

      +Lazy Game Reviews I think we can all agree that modern computing hardware is fairly bland!

    • @8BitKeys
      @8BitKeys 8 years ago +21

      +Lazy Game Reviews Actually, thank YOU for being part of it.

    • @patiencedvrx
      @patiencedvrx 8 years ago

      +The Obsolete Geek I'd somewhat disagree, but it might just be because I bought my first graphics card upgrade in five years back in September, so that was really exciting for me on a personal level :3

    • @-taz-
      @-taz- 8 years ago +10

      +Lazy Game Reviews There might be even deeper underlying reasons for the observations you cited while wearing the MS-DOS shirt I always wanted.
      1. Physics. Processors couldn't get faster or smaller due to bumping up against quantum effects, so they started getting more cores. But the software technology like operating systems, compilers and languages, not to mention programming methodologies in programmers' minds, couldn't very easily adapt. It's just starting to catch up now, and is only partially useful at that.
      2. Oligarchy. Why are tablets and chrome books becoming more prevalent, and cool? Because in the late 90's, big intelligence took back the consumer computer industry which had largely escaped their control. More has to be moved into the cloud, and away from private ownership, so that end users can both be monitored and manipulated. Here's a quotation from Jaron Lanier who coincidentally was the MIT guy involved with the Power Glove:
      "What we call 'advertising' in Silicon Valley is something totally different: It's micromanagement of the portion of the limited options immediately available to you when you're using an information system, for the purpose of calculated behavior modification." -Jaron Lanier

    • @B-Roll_Gaming
      @B-Roll_Gaming 8 years ago +4

      Your mustache is jarring and generally quite troubling. I love you.

  • @Michirin9801
    @Michirin9801 8 years ago +249

    Oh hey! It's LGR! Now that's a welcome cameo...

    • @The8BitGuy
      @The8BitGuy  8 years ago +21

      +Ruko Michiharu Indeed. I was glad to have him in the episode.

    • @8BitKeys
      @8BitKeys 8 years ago +1

      +Computer Whiz 1999 I just watch his channel, although admittedly he has so many videos I have not seen them all.

    • @Michirin9801
      @Michirin9801 8 years ago +6

      ***** I watch his channel, I really like his reviews and his sense of humour, I especially like how in-depth he goes with some of the more obscure stuff he covers

    • @EgoShredder
      @EgoShredder 8 years ago

      +Ruko Michiharu Agreed, his channel is very good. I had to disagree with his comment about things like Facebook being low in CPU usage, etc. I find it slow and sluggish even on a fast, well-maintained computer with an ultra-fast web connection. On an older computer from ten years ago or less, it can be a pain in the backside.

  • @DarthScape
    @DarthScape 4 years ago +318

    "As many as 8"
    And we now have 64 cores in a single socket.

    • @uroborous01
      @uroborous01 4 years ago +27

      DarthScape 64, 64 bit cores in a single socket. And we still have trouble emulating certain n64 games on them.
      We're going really fast, but where are we going exactly?

    • @uroborous01
      @uroborous01 4 years ago +12

      bootleg linux well i was trying to make a cheeky emulation nerd joke, but i guess i kinda failed.
      You do make a valid point tho.

    • @CommonApathy
      @CommonApathy 4 years ago +5

      And you'll only ever use like 3 of them except for very, very specific tasks.

    • @ILikeStyx
      @ILikeStyx 4 years ago +2

      Well... 2020. 64-Core (128-Thread) AMD Threadripper has yet to be released ;)

    • @SavageDarknessGames
      @SavageDarknessGames 4 years ago +2

      The issue isn’t how many cores are on a single socketed chip; the issue is programmers taking advantage of multi-core, multi-threaded programming.
      Unfortunately, most games and commonly used applications have no use for more than one core.
      The only time you’ll find programs that use multiple cores is in things like gene splicing and physics computations.

  • @Agorante
    @Agorante 5 years ago +158

    All this makes me feel old. You start personal computing at 1975. I started a little before that. I had an Altair, but my first really useful computer was my Commodore PET. My boss and I both got interested in PCs around 1975. He went with a Radio Shack TRS-80. It had 4K of RAM. I wanted more power, so I got a PET with 8K. But before he received it, it had expanded to 16K. I was jealous.
    I don't remember how much RAM the Altair had. I think it was about 512 bytes. Of course it didn't even have a boot ROM. In order to boot it you had to set the switches on the front panel in a specific order and watch all the little lights. I only did that a couple of times. Once you booted it you could load a Microsoft BASIC tape. It took a long time to even get a prompt indicating that you had BASIC.
    So when I got the PET and turned it on and it showed the OK prompt immediately, I was thrilled. It was all so easy. Of course that meant I had to go to the public library and get a book on BASIC.
    Sounds primitive now, but I worked in the government and nobody anywhere except Carl with his TRS-80 had a computer. I was looked on as some kind of wizard. It made my career. I wrote statistics programs. The county paid for the PET. And I had a kind of statistical job. There were almost no pre-written applications in those days. I wrote a word processor which allowed me to publish a monthly agency status report. By modern standards it was a terrible word processor, but I was the only person who had a word processor. That's how I got a couple of promotions into management. So when the Reagan cuts came I wasn't cut. I did the cutting.
    I wrote a sort of VisiCalc program too. That's how I got into managing the budget. The relevant metric is not how much power your computer has, it's how much you have versus the guy in the next cubicle.

    • @drewbadwolf5182
      @drewbadwolf5182 5 years ago +12

      Means you're more knowledgeable than most of us

    • @jimhewes7507
      @jimhewes7507 4 years ago +7

      I started almost as far back, on the VIC-20 which had 4K of free RAM. (I don't count a little Fortran in college before that). I started with BASIC but very quickly realized there were some programs running faster than what was possible with BASIC and so I discovered machine language. After that it was almost all machine language for me with TinyMon. I discovered how to insert a game cartridge so that I could see the code in the cartridge and so I would disassemble the whole code on paper to see how it worked. When I finally got a C64 a couple years later, 64K was like boundless space.
      But I was never as productive as you to write your own word processor and spreadsheet. Wow!

    • @uroborous01
      @uroborous01 4 years ago +2

      Damn... those were the days...

    • @TheSodog8
      @TheSodog8 4 years ago +4

      Ah, the early days. I got a promotion to CNC programmer because I knew how to type at a C:\ prompt.

    • @khemikora
      @khemikora 4 years ago +4

      My first computer was my beloved C64 with a whopping 64 Kilobytes of RAM. After that I moved to the Amiga 1200 with 2 Megabytes of RAM! Twas quite a leap! My next machine was a 300Mhz AMD PC with I think 64MB of RAM. My current computer now has 16 Gigabytes of RAM. Isn't computing progress great!

  • @QuantumBraced
    @QuantumBraced 7 years ago +493

    The 1990s were insane. You'd buy a PC and 3 years later it would be almost completely obsolete. You couldn't install Windows 98 on a machine from 93-94. Hardware value would halve every year. It was like hyperinflation. Nowadays, a 6-7 year old PC is totally fine, if not fast. Even for gaming, give it a budget $150 graphics card and it'll be alive and kicking with new games. I had a 120GB HDD in 2003, and in 2016 that's the starting capacity of new MacBook Pros.

    • @dycedargselderbrother5353
      @dycedargselderbrother5353 7 years ago +30

      If we were to collapse upgrades into discrete computers, I had something like 5 PCs in the 1990s. 3 years is generous. While "total obsoletion" is accurate, realistically you wouldn't be able to run next year's software without some sort of upgrade, and in areas no one thinks about anymore like CD-ROM drives, sound cards, and incompatible cases and power supplies.

    • @Zack-dk3pt
      @Zack-dk3pt 7 years ago +4

      Yep, got an '09 computer; the only new thing is a CPU fan. Dual-core APU, 4 GB RAM. It can still play most games on low quality settings, some on high, but for the most part it's still a fully functioning computer. It would be perfect for a non-gamer.

    • @stereorail
      @stereorail 7 years ago +6

      In the year 2001 I installed Windows 95 on a 1992 IBM PS/2 386SX system. The most I had to fiddle with was setting the HDD access mode to 16-bit, since the SX had a 16-bit bus. It took minutes to boot but then worked OK (though slower than 3.1) and even went online. Sold it for 40 bux :-) (in 1994 it was on clearance sale at Costco for some hundreds).

    • @maxis2k
      @maxis2k 7 years ago +21

      It definitely was insane. I still remember in 1992 getting my first real computer. Within 3 years, I found that it could barely run Windows 95 and I was playing games like Sim Copter at 5 FPS. I had to struggle on with that all the way until 2002 when I finally got a new computer as a graduation present. And marveled at the 'amazing' 733 MHz processor. Which I found out was already obsolete for that time.
      My last computer, a Core 2 Quad Q6600, on the other hand lasted me 7 years and can still function just fine for anything except games and video rendering. Like the video pointed out, I think processing power has greatly surpassed the software requirements.

    • @Advection357
      @Advection357 7 years ago +15

      The first system I built in late 1995 cost over 4000$... for a Pentium 100mhz with 16mb EDO ram (800$ for the ram alone) and a Matrox MGA Millenium (the first consumer 3D accelerator ever made... another 500$ for that). Needless to say... it was obsolete in 2 years when the Pentium II came out. I don't even remember what I did with it... I think I sold it... been a while hah

  • @theinternetstolemysoulbuti2740
    @theinternetstolemysoulbuti2740 7 years ago +59

    Silicon processors are at their limit right now, since transistors have become small enough for electrons to leak across them when manufacturers try to make them smaller, effectively corrupting the binary data that we rely on. But once graphene becomes the new medium for circuitry we will see a resurgence of progress. The graphene CPUs being thought up can handle higher temperatures with less energy wasted.

    • @tobiaszstanford
      @tobiaszstanford 7 years ago +10

      Yes, that's true. But graphene transistors are in development, and if that stuff gets into a CPU, you'll have 20+ GHz as standard

    • @theinternetstolemysoulbuti2740
      @theinternetstolemysoulbuti2740 7 years ago +5

      It's mainly because graphene's ability to be so thin and electrically efficient is its biggest advantage and disadvantage. It's very easy to mess up a core during its "growth", which would render the processor useless. But IBM has recently made a graphene processor that is capable of sending basic code 10000x faster (I may be wrong, CBA to look it up) than conventional processors

    • @theinternetstolemysoulbuti2740
      @theinternetstolemysoulbuti2740 7 years ago

      In a nutshell, graphene is so difficult to produce (in high purity at high quantity) that 90s technology would have proven ineffective. It's much easier to make circuitry from silicon than from graphene, and it's good that we're discovering it now (instead of shelving it as a "failed" idea).

    • @Hellcommander245
      @Hellcommander245 7 years ago +5

      It makes you wonder how large a computer processor would be if it used vacuum tubes instead of transistors.

    • @CzornyLisek
      @CzornyLisek 7 years ago +3

      Emmm, we are already able to make one-atom transistors. The first one was created in 2004.
      Right now there are numerous one-atom/few-atom transistors, and I don't think any of them use graphene. They used silicene (a form of silicon), alumina (aluminium oxide) and silver.
      About vacuum tubes: they can also be made really, really tiny using the same processes used to make today's transistors. The smallest vacuum-tube transistor works at 460 GHz, though I'm not sure about the specific diameter.
      Keep in mind that the "speed" of a transistor is not the same as the "clock speed" of the whole CPU/GPU/etc.

  • @uthoshantm
    @uthoshantm 4 years ago +664

    Throw an SSD in a 10-year-old PC, add some RAM and you are good to go.

    • @dragunovbushcraft152
      @dragunovbushcraft152 4 years ago +42

      I have a nine year old T500 ThinkPad, with 8gb/RAM, and a 256gb SSD. It would make someone a GREAT "Daily Driver". It has Radeon graphics, and still plays some pretty decent games. :)

    • @philstuf
      @philstuf 4 years ago +19

      Have a circa 2008 Dell Precision M6300. Threw an SSD in it and it runs like a current gen PC, with minimal latency. I won't be playing Half Life 3 anytime soon, but it is a stout system compared to current commodity I3's

    • @arwahsapi
      @arwahsapi 4 years ago +19

      Using an SSD in 10-year-old laptops is cool. But DDR2 RAM is scarce and expensive nowadays, and replacement batteries are mostly obsolete

    • @dragunovbushcraft152
      @dragunovbushcraft152 4 years ago +1

      @@arwahsapi My T500 is DDR3

    • @uthoshantm
      @uthoshantm 4 years ago +12

      @@arwahsapi You have a point. I installed Linux Mint and it works well with 2GB.

  • @mbunds
    @mbunds 5 years ago +13

    Here’s a trivial (and quite possibly obvious) observation:
    I’ve noticed a trend over the years where concentration in the microprocessor industry shifts between substrate improvements to get more FLOPS, and efforts to reduce power consumption of the substrate components, individually, and by applying power saving logic to the system etched into the chip as a whole.
    Critical to, but aside from the obvious desire to improve battery life for mobile or terminal devices, these efforts occur simultaneously during the evolution of ever faster circuits by necessity, but a subtle shift in industry focus can be observed over time, as more development becomes required for the reduction of power consumption, before the density of the devices on a substrate may be increased without facing problems with heat dissipation. Once the power budget for the system has been proven to fall within specs, concentration subtly shifts back toward increasing clock speeds and packing more circuitry into the substrate.
    Thanks for an excellent presentation!

  • @RonJohn63
    @RonJohn63 4 years ago +9

    2:04 The company I work for just sent me a 4GB DIMM to add to their laptop, bringing it up to 8GB before the upgrade to Win 10. Yes, for the past two years, I've been running W7Pro with 4GB.
    OTOH, my own desktop machine is a "PC of Theseus". The mobo is about 5 years old, with AMD FX-6100, but 240GB SSD and 32GB RAM. A 10TB spindle is for data. I have zero need to upgrade the CPU or mobo any time soon, and have finally reached "32GB ought to be enough for anyone".

  • @johnh6524
    @johnh6524 5 years ago +84

    Back in the day the expression was “Whatever Intel giveth Microsoft taketh away”. I wonder if the rise of open source has had an effect on this?

    • @tiporari
      @tiporari 4 years ago +8

      Nope. Open source, Unix/Linux, all very cool. Impact on 99% of consumers? Negligible. The performance demands of a fully featured open-source-only machine are right up there with Windows. All the bells and whistles cost the same in terms of compute resources whether MSFT developed them or some open source coder. In many cases Linux performance is worse overall because of poorly optimized drivers, lack of hardware acceleration, and less maturity in open source alternatives to mainstream tech like Java, for example.

    • @herrfriberger5
      @herrfriberger5 4 years ago +3

      I sure "hate" MS, but few of their products was very inefficient in terms of clockcycles or CPU usage, at least compared to Adobe and many others (although something like that saying was true regarding disk space during the 80s and 90s).
      Windows 3.1 and up started *far* to slowly, for no other reason than bad design and optimization, but this had more to do with lots of (slow) hd-accesses than with CPU usage. Many (not all!) Gnu/Linux or open source projects are equally sloppy designs, unfortunately.

    • @deusexaethera
      @deusexaethera 4 years ago +2

      It absolutely has had an effect -- a negative effect. Open-source code is generally written like shit and is even more inefficient than Microsoft products, when adjusted for the same level of feature-richness and hardware compatibility. Just look at the horrible clusterfuck that we call "the internet", which is mostly based on open-source code. The best code is written by a single developer who has a long time to refine it -- basically what I do for a living.

    • @martijngeel9224
      @martijngeel9224 4 years ago +7

      @@deusexaethera Ah, you are a programmer who gets paid by the line, and the more lines of nonsense you produce the more you get paid. Linux is faster than Windows. But I guess you never heard of Linux. Linux is rising, certainly because Microsoft is done with Windows; I mean Windows 7. Windows 10 is crap: more clicks, more time, looking busy and doing less.
      The longer it takes for your code to execute, the less you should get paid. And you thought that it was a work of art, shame on you. Look at the demo makers on the C64; that is a work of art, well programmed by smart people. No, I am not a demo maker, but I sometimes program in C64 assembly and look at their code.
      Have a nice day.

    • @deusexaethera
      @deusexaethera 4 years ago +9

      @@martijngeel9224: Nobody pays programmers per line of code anymore, dumbass. I get paid a salary. And of course I've heard of Linux. I've used it occasionally since the late 1990s, and my opinion remains the same: Linux is the best OS to use when I want an OS that requires me to fix every single feature because none of them work correctly.

  • @charleshayesvaughn
    @charleshayesvaughn 5 years ago +88

    And now you'll be at a million in probably less than a month. Congrats dude.

  • @alihelmy
    @alihelmy 5 years ago +1

    Mate, your videos are some of the most awesome videos to watch. I absolutely love hearing about how these awesome machines from my childhood worked!

  • @fatihbilgeylmaz3966
    @fatihbilgeylmaz3966 7 years ago +16

    The fundamentals of PC usage ten years ago were 1) web browsing, 2) viewing quality photos, 3) creating documents, 4) watching movies, 5) listening to music, and all of that was perfectly doable. There is no room for improvement there. Improvement is needed only for 3D rendering, video editing, specific work needs, and a better gaming experience.

    • @Raptor3388
      @Raptor3388 7 years ago +4

      Yes, the most common sites have become so bloated they will bring older computers to their knees, even dual-core computers. YouTube, mostly because of Flash, is very demanding, and even Facebook now; I've noticed it's become noticeably harder to use on a lower-end 2009 laptop (Pentium T4500 and 4GB DDR3 in my case), which used to do everything flawlessly.

    • @OMA2k
      @OMA2k 4 years ago

      @@picketf Webpages using PHP are irrelevant to the power your computer needs to display them, because PHP runs on the server and your computer only gets a plain HTML representation generated on the server from the PHP (or any other server language). Scripts and HD video do put strain on your computer, though.

    • @OMA2k
      @OMA2k 4 years ago

      @@Raptor3388 : Flash has an undeserved bad reputation. You say YouTube is very demanding "mostly because of Flash" when in reality YouTube had already stopped using Flash quite some time before you wrote this comment 3 years ago. In fact, HTML5 animation is more CPU-demanding than Flash. Those pesky animated banners that are everywhere on websites didn't die when everyone stopped using Flash, but now they tax the CPU a lot more than when they actually were Flash animations. I'm sick of hearing my laptop fans blowing at full speed whenever I'm on a website with several animated "HTML5 canvas" banners. That didn't happen with Flash. I hate that my laptop has to consume more battery just to play some stupid unwanted ads. Those kinds of ads should be banned, and only static image banners allowed. I'm not more likely to click on them just because of some silly animation. But I digress... For some reason, people like to blame everything on Flash even when it's not really used, probably because of some second-hand opinion from some "guru".

  • @vwestlife
    @vwestlife 8 years ago +317

    The combination of the Great Recession, the failure of Windows Vista, and the netbook fad forced software to continue to support older operating systems and underpowered CPUs much longer than they normally would've. Also CPU designs like Pentium 4 and PowerPC G5 reached the thermal limits of what a normal desktop PC could handle, which forced new CPUs to focus on being more efficient rather than just increasing the clock speed.

    • @vicr123
      @vicr123 8 years ago +10

      You're here? :O

    • @KaiserTom
      @KaiserTom 8 years ago +10

      +vwestlife It's funny because Intel was actually coming up with entirely new Mobo standards like the BTX in order to better handle the amount of heat generated from things like the Pentium 4 and whatever they thought may come in the future that would generate even more heat. We went from 10 watt CPUs to 100 watt in a matter of 10 years from 1995 to 2005, people were convinced we weren't going to stop, at least not as hard as we did when things like the core series started coming out promising more performance for roughly the same power usage.

    • @bryndaldwyre3099
      @bryndaldwyre3099 8 years ago +3

      +vwestlife yes, anyone remember the "Presshot" series?

    • @saturnotaku
      @saturnotaku 8 years ago +4

      +Bryndal Dwyre I had one of those. Worst computer I ever owned.

    • @bryndaldwyre3099
      @bryndaldwyre3099 8 years ago +3

      thankfully i owned an amd 64 4200+ at the time. never had any issues with it and it ran like a dream alongside my gt6600

  • @VideoNOLA
    @VideoNOLA 5 years ago +106

    If you ran the same old MS-DOS programs we had in the late 1980s through mid-1990s on today's PC (if you could magically do so in the first place), they would run so blindingly fast as to be useless. Back then, programming was streamlined and efficient out of necessity, as CPU cycles and memory came at an extreme premium (remember running HIMEM and EMM386 just to shoe-horn a few more TSRs into RAM during bootup?).
    Sadly, today's bloatware is the opposite. Huge, sloppy, un-optimized, resource-hungry and so high level and fault tolerant that it's as if IBM were back paying developers by the line of code. I'm constantly amazed that ANYTHING manages to slow down a 4-core, 8GB, 64-bit desktop computer when asked to do JUST ONE THING (i.e. open the Chrome browser). It's so sad.

    • @Lambda_Ovine
      @Lambda_Ovine 5 years ago +18

      Let's not forget that, back then, developers fine tuned their software to run in specific architectures and platforms with considerable life spans. Today, with all the diversity and configurations of hardware out there, you simply don't have that luxury anymore.

    • @frikkiethirion8053
      @frikkiethirion8053 5 years ago +8

      Use DOSBox to run your vintage/abandonware on new machines.

    • @arthur_p_dent
      @arthur_p_dent 4 years ago +24

      Back in those days, software developers had two precious resources to look after: they would optimize a program either for minimal memory use (both RAM and hard disk space), or for maximum speed.
      Nowadays, both memory and speed are available in abundance, so a third resource comes into play: human effort. From a developer's viewpoint, it typically doesn't make sense to spend thousands of extra hours developing and coding in order to minimize RAM usage or maximize speed. Hence programming languages like Java, which are 1000 times slower than GWBASIC while needing 1000 times more RAM, but allow applications to be developed in record time (compared to machine language, anyway).

    • @SecretSauce2738
      @SecretSauce2738 4 years ago +6

      @@Lambda_Ovine That's what kernels and drivers are for. I can open Chrome on a Windows machine with a completely customized build, and any inefficiencies in the kernel or drivers are simply not enough to be a bottleneck to the speed of the system. Try running a minimal open source web browser on Linux with that same hardware, and then compare its speed to the original Chrome on Windows. If they can do the same thing, the difference in speed is certainly not a lack of optimization for a specific hardware setup, it's the bloat in the software.

    • @RetroDawn
      @RetroDawn 4 years ago +4

      @@arthur_p_dent Actually, Java is *significantly* faster than GWBASIC, or any interpreted BASIC. Java is actually on par with C++ in speed, and is even faster on many occasions. And Java definitely doesn't need 1000 times more RAM. I've been a professional software developer for over 25 years, and taught myself BASIC and PILOT back in 83 and 84, respectively, when I was 10.

  • @JavascriptJack
    @JavascriptJack 5 years ago

    I really loved this video because you have shared commentary from others, and... it made me smile to see you all working together.

  • @MEGATestberichte
    @MEGATestberichte 6 years ago +14

    This channel is just awesome. I am an aged technerd myself too. Love it.

  • @10p6
    @10p6 8 years ago +14

    I think the main point comes down to these 4 issues.
    1. Manufacturers have reached the maximum clock speed possible without expensive cooling systems. Therefore, to get more speed, more CPU cores are being added. Once again though, these are limited by the clock speed of the bus.
    2. Related to number 1, RAM clock speeds have stagnated too, making it harder to effectively push the bandwidth of multiple cores through the bus. Front-side bus speeds have seen limited growth.
    3. A majority of people have moved over to laptops. This creates two main issues: limited space for CPU cooling, and battery capacity that has not increased at the same rate as CPU power did. So manufacturers do not put the fastest CPUs in laptops, as otherwise the computer would only last a few minutes on battery.
    4. In the older days the main CPU handled the bulk of the system's graphics. These days virtually everything has been offloaded to the GPU, meaning smaller CPUs are required for general operation.
    Right now I am writing this on a 6-year-old Toshiba M780 Tablet (a real tablet PC) with 8GB RAM, a dual-core i7, RAID, a touch screen and so on. So why upgrade, when it does 99% of what I need to do, and even now the bulk of laptops are no faster except in 3D unless you go to large, expensive and bulky models? Next to this tablet/laptop, though, is my HP workstation with 36 cores, 256GB RAM and quad-SLI Quadros. On the laptop side, speed has not really increased; on the desktop/workstation side, it has drastically increased, but so has the cost.

    • @volkswagenginetta
      @volkswagenginetta 8 years ago

      +10p6 you mean 18 cores with hyperthreading. the xeon e5-2699

    • @10p6
      @10p6 8 years ago +3

      +volkswagenginetta no I mean twin 18 core xeons, 72 hyper threads

    • @volkswagenginetta
      @volkswagenginetta 8 years ago

      10p6 oh right double socket. my bad

    • @clyax113
      @clyax113 8 years ago

      +10p6 Is there a similar model with a little higher performance than the one you mentioned originally, that could last me about 8-10 years?

    • @ToriRocksAmos
      @ToriRocksAmos 8 years ago

      +10p6 just wanna add one comment to your point 1.)
      Just adding more cpu cores instead of increasing clock speed (or performance per clock) doesn't scale nearly as well as increasing clock speed.
      For example: If you increase the clock speed of your 3 Ghz Quadcore by 10%, it'll perform almost 10% better in most tasks, resulting in a close to 100% scaling.
      If you add 2 more cores @ 3Ghz to your quad-core you probably won't see much of an improvement - unless you are doing a lot of multi-tasking, or if you are using prosumer multicore-optimized software.
      (I know that I reduced the complexity of the matter in my statement; hopefully it still makes sense.)
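
      (The scaling point above can be made concrete with Amdahl's law. The sketch below is illustrative only and not from the video or the comment; the 80% parallel fraction is an assumed value.)

```python
# Rough illustration of why extra cores scale worse than extra clock speed
# (Amdahl's law). The 0.8 parallel fraction is an assumed, illustrative value.

def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Speedup over 1 core when only parallel_fraction of the work scales."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

if __name__ == "__main__":
    p = 0.8  # assume 80% of the workload can run in parallel
    print("10% higher clock  -> ~1.10x on almost any workload")
    for cores in (4, 6, 8, 64):
        print(f"{cores:>2} cores, same clock -> ~{amdahl_speedup(p, cores):.2f}x")
    # With p = 0.8, even 64 cores top out below 5x, while a clock bump
    # also speeds up the serial part -- matching the comment's observation.
```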

  • @Itsmekimmyjo
    @Itsmekimmyjo 5 years ago +5

    I think one of my favorite things about your channel, is that you always seek opinions of others instead of trying to appear like you know it all. Genuinely honest and intriguing videos ⭐️ ⭐️⭐️⭐️⭐️
    Oh.. and also your fabulous shirts!

  • @jacobleedowney
    @jacobleedowney 4 years ago

    @5:35 I like your graph & animation skills! Thank you for taking the time!!

  • @alcidiow
    @alcidiow 8 years ago +67

    Since AMD can barely compete with Intel for now, Intel barely has to try to improve their CPUs, which means we see less improvement overall in the market.
    Basically, why try to improve performance by a long shot when you don't need to?
    I hope Zen makes an impact when it comes out

    • @manictiger
      @manictiger 8 years ago +13

      +alcidiow
      Intel will be effectively out of new products once they reach 7nm chips.
      Once that's out, IBM will have their 4nm chips out and Q-bit hybrid chips and other exotic solutions will start appearing on the market.
      Intel will effectively be finished if they can't come up with a rival to all that.
      So they need to buy time.
      They are deliberately delaying the release of 10nm. They could have made a 10nm plant in 2014 or 2015, but that would be like speeding up the construction of their own gallows.

    • @Lefsler
      @Lefsler 8 years ago +2

      +alcidiow They create new techniques to improve the performance, new DMA, Branch prediction, cache hit and others.

    • @nameless-user
      @nameless-user 8 years ago

      +manictiger Wait, 4nm is a thing? No wonder computer performance is levelling out. I don't know what the smallest possible silicon process size is, but we must be getting close.

    • @manictiger
      @manictiger 8 years ago +1

      *****
      IBM announced that they made a working 7nm carbon transistor. For some reason my brain thought it was 4nm.
      The theoretical limit for transistors is about .1nm, which is based on the concept of a phosphorus atom transistor.
      I guess Moore's Law isn't quite finished, yet.

    • @nameless-user
      @nameless-user 8 years ago

      manictiger How did I not hear about this? XD

  • @LazerLord10
    @LazerLord10 8 years ago +102

    You should do a video about how efficient computers are. It seems like even though the performance "numbers" have increased quite a lot, a lot of the usage seems to be pretty heavy on the hardware, even when doing things very similar to old hardware. I mean, why does my internet browser use up 500MB of RAM? It can't need all those resources, can it? How did I browse the internet on a system with 256MB of RAM?
    (I know this is a drastic oversimplification, and I'm aware that newer programs do more, but it just seems like with all this excess hardware power, things have become a lot less efficient.)
    Also, I wrote this before the end of the video.
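
    (As a rough way to actually measure the question above, here is a small sketch that sums the resident memory of browser processes. It assumes the third-party psutil package is installed and that the browser's process name contains "chrome" or "firefox"; adjust the names for your system.)

```python
# Quick way to see how much RAM browser processes actually use (RSS).
# Requires: pip install psutil
import psutil

total = 0
for proc in psutil.process_iter(["name", "memory_info"]):
    name = (proc.info["name"] or "").lower()
    mem = proc.info["memory_info"]
    if mem and ("chrome" in name or "firefox" in name):
        total += mem.rss  # resident set size of this process

print(f"browser processes are using ~{total / 2**20:.0f} MiB of RAM")
```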

    • @raafmaat
      @raafmaat 8 years ago +17

      +LazerLord10 The higher RAM usage is only because the new internet browsers are not just an internet browser anymore like 15 years ago... they can now play smooth ultra HD video and stuff... but if you want you can revert back to a browser from 1998 (use the compatibility for Windows 98 option) and just have it use like 10MB of RAM ;) sure it won't be able to play youtube videos and stuff but yeah, that seems to be the thing you want :P

    • @fablungo
      @fablungo 8 years ago +50

      As someone who has been doing a Computer Science degree: the difference in efficiency between today and 10 or more years ago is that back then optimisation was really important. Nowadays maintainability outweighs it, and so even when using 2x more resources won't gain you 2x performance, it is deemed worthwhile for any gain. Another issue is complacency when developing. We have basically been taught that in most cases optimising for efficiency is a low priority, and when it comes to a developer's time in the commercial world, efficiency doesn't compare to the commercial gain of more features: the attitude is "PCs have the power, might as well use it", whereas before, the efficiency of the application was the difference between it running on home hardware or not, and commercially you weren't going to be able to sell the software if it wasn't efficient. Adding to all this is the power of the internet and the ability to update applications, which drives two things: more maintainable code (usually less efficient) and less time testing and optimising before release (it can be patched after release).
      That's my thoughts on this anyway.

    • @LazerLord10
      @LazerLord10 8 years ago +1

      Fabrizio Lungo
      Thanks for your response!

    • @simplelifediy1772
      @simplelifediy1772 8 years ago +4

      +raafmaat Also you can add to HD video, the sheer amount of images and streaming advertisements...
      Remember static webpages that had 50-84k images and a midi song playing in the background? And java script was used for mouse trailers... lol

    • @raafmaat
      @raafmaat 8 years ago

      SimpleLife DIY
      cached data does not add up to RAM usage.... that kinda stuff is all just saved on the HDD ;)

  • @BamaChad-W4CHD
    @BamaChad-W4CHD 4 years ago

    Now over 1 million subs! Well deserved I must say. Love your content. Thanks for doing what you do!

  • @dragunovbushcraft152
    @dragunovbushcraft152 5 years ago +2

    I've been repairing and working on computers for 43 years. I've watched them "grow up". I have had WELL over 1000 different computers (maybe twice that!). I still use a Lenovo T500 with Radeon graphics for lots of things. I also have several Skylake and Kaby Lake systems. Other than gaming, and a select few pieces of graphics software, I find my T500 to be more than up to the task for modern computing. You are partially right about the software issue, however, there is one more important factor:
    Upgradability.
    My T500 came with Vista, 2gb/RAM, a 160gb spin drive, a 1650x1080 display, and ATI Radeon 3650m graphics, in 2009. It was a very expensive machine when new. It now has Win7Pro, 8gb/RAM, a 256gb SanDisk SSD, a T9600 Core2Duo processor, and a 500gb WD "Black" HDD in the optical drive bay. I can do some gaming on it, I can run CAD software, I can edit video on it, I can run Office 2013 on it. It runs most of these as well as my Kaby Lake systems. It boots nearly as fast as my Kaby Lake systems running Win10. If I installed Win 8.1 with Classic Shell on it, it would be even faster. The display is more than adequate for most modern applications. This computer was upgraded from all used parts I got mostly for FREE. This computer will still be usable for at least another 7-10 years. Maybe longer, depending what I use it for. I use it as a demo model, as well as for daily use. People are AMAZED at my T500, as it keeps up with, and even surpasses, the speed and usability of their newer machines.
    I've sold untold numbers of R400/500 and T400/500's that I rebuild, and I give a very HEFTY warranty with them. All have the Radeon graphics in them. Other than a blown ram chip or two, and a crashed mechanical drive or two, the ONLY thing brought back on warranty is for general upkeep. One customer has had one of my T400 laptops so long, that I had to re-paste his CPU (covered under limited, lifetime warranty), general HDD cleaning (part of warranty). I have NEVER had a "failed" motherboard, or display come back. Machines built from 2009, to 2016, or so, are robust, and very upgradable.
    No "obsolete" here!

  • @klaxoncow
    @klaxoncow 4 years ago +57

    There's another obvious thing that I'm surprised was missed.
    When you showed the 2015 model and the 2005 model, they're both PCs. But when you showed the 1990 model versus the 1980 model, it was a NEC UltraLite versus a Commodore PET.
    Early computing had zero regard for interoperability and compatibility. Every single machine was its own island.
    Even in regards to its own brand, such as with the PET, VIC-20, C64 and Amiga. All from Commodore but, with regard to interoperability with each other, there was essentially none and they might as well have been manufactured by different companies. It made no difference.
    But, with the PC, IBM accidentally created a standard which others could follow. And very much a crucial point in selling your "PC compatible" was that, yes, it was compatible. Manufacturers were aiming to make their machines compatible and interoperable with other PCs and PC software.
    Microsoft is known to go to quite some lengths - under the hood - to preserve backwards compatibility. To the point that various versions of Windows are known to detect certain pieces of the most popular software and actually change settings and recreate bugs from previous versions to keep them alive and well on a new OS. It has a big list of "exceptions" for various software and then changes its behaviour to keep those bits of software alive.
    Then there's the user visible stuff like being able to select compatibility profiles on applications. Please run this app as if this were still Windows XP, please.
    And, over those decades, we've increasingly had more interoperability standards. From well-known file types - like MP3, PNG, etc. - to Internet protocols to, like, industry bodies getting together to define the Vulkan API.
    In 1980, no-one gave a crap about interoperability. It was essentially non-existent. By 1990, it was understood to be a good thing and was actively strived for. By 2000, it was essential - the web simply can't work without it. By the late 2010s, even Microsoft is conceding that it needs interoperability with Linux and is including "Windows Subsystem for Linux" and then starts work on its own Linux distribution. It's also conceded - a few days ago - to using the open-source Chromium for its Edge browser (there's little to gain from the immense effort of developing its own HTML engine, as you can't lock people in like that anymore and, anyway, Chrome is kicking their arse and they'd never make up the difference and they know it).
    And there's been a slow but steady embrace of open source, as it's understood that, actually, interoperability is king. Proprietary closed source implementations cut you off and erect "walled gardens" around you - this really isn't useful. At first it might seem fine. But as you upgrade machines, change vendors, change software, etc. then it has become increasingly clear to many businesses that, whatever closed source offers, it'll sting you badly in the arse in 10 years' time, when you can't escape their "walled garden" that keeps you paying.
    In 1990, there was zero interest in making any of your software from your 1980 Commodore PET function. And if you were attempting "backward compatibility", then your 1990 laptop would have to deal with a hundred systems. Those were the bad old days. Let them go.
    But, in 2020, there is reason to still be concerned with XP compatibility, with 32-bit machines, and generally aiming to be "backwards compatible".
    And there's a long legacy of this now. For example, let me use my psychic powers. Choose any Windows EXE on your system. I predict that the first two bytes are "MZ". Check that in a hex editor. Further along - probably about at the 4KB mark - you'll then find the two bytes "PE".
    How am I making these predictions?
    Because, you see, MS-DOS detects whether a file is an executable by the initial two "MZ" bytes - the initials of one of Microsoft's engineers, Mark Zbikowski, if you're wondering - and, for "backwards compatibility", Windows executables still start with "MZ" and have an MS-DOS "stub" program at the front. It's literally a working MS-DOS executable that is usually programmed to print "This program requires Microsoft Windows" and exit.
    Because, yes, that prompt is NOT coming from the OS. MS-DOS is running the EXE file. Because the "MZ stub" program at the beginning is a legitimate MS-DOS executable.
    And then, within the MS-DOS executable headers, is a pointer to the "New Executable". This was originally an old style Windows executable, identified by "NE" for "New Executable", but later 32-bit versions of Windows moved to the "PE" (or "Portable Executable") format. There's also a 64-bit PE these days too.
    But, anyway, technical details aside, the point is that every Windows program (and DLL) on your system is preceded with an MS-DOS executable, on the off-chance that you attempt to run it in DOS, for it to print "This program requires Microsoft Windows" and quit.
    This hopefully gives you a measure of what I mean by how much effort and expense and legacy is in the modern system to be "backwards compatible" with all that preceded it.
    In future, when you're downloading "Half-life: Alyx" to play the most modern VR on your latest and greatest Windows 10 system, the first two bytes of the executable will be "MZ". Because there's a fully working MS-DOS "stub" program bolted onto the front of them all, to maintain compatibility with MS-DOS.
    Here's another thing to consider as well. There are natural limits to things.
    On the 8-bit machines, you had monochrome (1 bit per pixel) and maybe 4 colours (2 bits per pixel). Perhaps your system was capable of 16 colours (4 bits per pixel). And then VGA brought us 256 colours (8 bits per pixel). These days, we all pretty much universally use "TrueColour" or 24 bits per pixel.
    In certain fields, like scientific imaging, you might push beyond that and HDR TVs get all floating point about it. But there's the idea of a natural limit.
    The human eye just can't see more than 24 bits per pixel. So there's no point, for pure display purposes, in storing more than that for an image. Your eye simply can't see it.
    (In fact, the range of human eyesight is actually closer to 16 bits per pixel. But our colour sensitivity is not uniform across the range - we can see more in the yellow / green range - so 16 bits is not quite good enough to be completely imperceptible to the human eye. But it's right on the edge. 24 bits per pixel - a byte for each of RGB - exceeds that, is more than good enough and is simple to program and deal with.)
    While video qualities rise - 576p to 720p to 1080p to 4K to 8K, and 30fps to 60fps to 144fps - the number of letters in the English alphabet remains 26. Once UNICODE now covers every conceivable alphabet (and much more) then text files don't naturally get bigger.
    (And I'm talking about 4K / 8K and 120fps / 144fps - well, there's diminishing returns on these things. More resolution and higher frame rate does make things slightly better, but only slightly. It's four times more data to do 8K than 4K - which itself is 4 times more data than 1080p - but is 8K really 16 times better than 1080p to look at? When you compare 320 x 200 resolution to 640 x 480 then the increase is stark and obvious. From standard definition's 576p to HD of 720p and 1080p, everyone could see the clear advantage (particularly as it coincided with screens becoming flat panels rather than bulky CRTs) so everyone upgraded. But selling 4K and 8K has not been so easy, because the leaps get less and less significant.)
    There's the "law" that things expand to fit the available space. More RAM, more disk space and programs and data just get bigger to fit. Well, that's largely true - until you start hitting these natural limits. There's no point going beyond 24 bits per pixel. So images stay at that and all the extra RAM and disk space starts meaning "more pictures", not "pictures getting bigger to fit the available RAM / disk space" as they once did.
    And once machines are good enough to, you know, display a 24 bit per pixel image or decode an MP4 stream, then all exponential hardware improvements mean is that the CPU / GPU is less taxed and capable of doing more things simultaneously.
    There are some things that will, no doubt, keep on getting bigger and bigger to match hardware improvements. But there are lots of things that have now reached the natural limits. Older hardware can reach those natural limits, so improvements are not about being capable of doing things anymore, in quite the same way, but about being able to do it easily, with CPU to spare for another 8 tasks at the same time.
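
    (If you want to check the MZ / PE claim above yourself, here is a minimal sketch. The file path is just an example; the field offsets assume a standard Windows EXE or DLL.)

```python
# Minimal sketch: verify the "MZ" DOS stub and find the "PE\0\0" signature
# in a Windows executable. Path below is only an example -- point it at any EXE/DLL.
import struct

def inspect_exe(path: str) -> None:
    """Print the DOS 'MZ' magic and the PE signature of a Windows binary."""
    with open(path, "rb") as f:
        data = f.read()
    print("first two bytes :", data[:2])                     # expect b'MZ'
    e_lfanew = struct.unpack_from("<I", data, 0x3C)[0]       # DOS header field pointing to the PE header
    print("PE header offset:", hex(e_lfanew))
    print("signature there :", data[e_lfanew:e_lfanew + 4])  # expect b'PE\x00\x00'

if __name__ == "__main__":
    inspect_exe(r"C:\Windows\notepad.exe")  # example path; adjust as needed
```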

    • @pointlesspublishing5351
      @pointlesspublishing5351 4 years ago +1

      Excellent answer. Very in-depth. Coincidentally, it fits with my gaming and job experience with computers. My second screen is an old 19" from 2005... and my main screen offers Full HD at 24", so that a book page is a "real page" on screen (enough for me as a professional author). WHY should I get more? It does not make SENSE. To see my text written in UHD? And in gaming... I have the feeling the console thing (basically all consoles have the same games available, excluding some exclusives) also stopped the gearing-up-for-games process. Since the Xbox 360, I remember being able to buy the same game for PS3, 360 AND PC... and while of course I COULD get better performance on a potent computer, I got the impression in the 2000s that I just NEED a computer which can MATCH console performance for ACCEPTABLE gaming results. Everything else seemed to be a waste of money, unless you're really into it. Crysis... ahem.

    • @stevejobs6693
      @stevejobs6693 3 years ago +1

      While the human interface device, screen or speaker certainly will hit the limit based on what our senses can perceive, the state of the art (imagine 64 bit color space or 16K resolution) will continue to grow for two reasons: 1. The increased software capabilities (things you can do with the additional data) think of how your smartphone uses multiple cameras to input spatial data for security or utility (ie. portrait mode). Or for example an infrared sensor augmentation of data through software. And 2. Machine learning / deep learning (AI) will eventually surpass humans in computational throughput. Neural networks are already using/creating non-human interpretable data and intermediates to solve problems, so we'll have to keep growing to make these systems more robust/efficient.

    • @Porygonal64
      @Porygonal64 2 years ago +1

      tldr

    • @Xyspade
      @Xyspade 1 year ago

      I know it's been 2 years but wow, you put an immense amount of effort into that article, and it did not go unnoticed. I read the whole thing. And I too am surprised that this was missed in the video because I think this is the most accurate answer. Accept my one upthumb because you deserve a lot more.

  • @obsoletegeek
    @obsoletegeek 8 years ago +25

    Computer hardware is far less interesting these days. I miss the "Megahertz Mania" years.

    • @-taz-
      @-taz- 8 years ago +1

      +Computer Whiz 1999 Once we got to GHz, progress in clock speed got very slow. Instead, we kept making transistors smaller and adding more CPU cores until now we've run out of space in 2 dimensions! If we make transistors any smaller, they break because the electrons start hitting quantum effects and jumping around all over the place. We could start adding redundant logic for error correction, but then the processors would get even larger, so it makes more sense to just stop shrinking at this point. Next, processors will start getting stacked up in 3 dimensions.

    • @-taz-
      @-taz- 8 years ago +1

      ***** We already know what they will be. Powerful computers and the electricity to drive them will be removed from the people. Computers in the cloud (aka siren servers) will be owned entirely by rich oligarchs so they can keep tabs on us. Computers will lose hard drives and be implemented as only banks of CPUs and RAM. One of the companies very close to central intelligence is HP. You can examine their "Machine" technology for Things to Come.

    • @-taz-
      @-taz- 8 years ago +1

      ***** Yeah that's what solar power is all about, too. It restricts energy a person can use to the amount of 2-D real estate and finite resources (silver for solar panels) they can afford. Like everything, it makes us *feel* more free. The big server farms will have huge solar collectors or just keep using coal and oil in reality. That drives Facebook, Google, Apple, Amazon, the NSA. We can't compete against them because we have to pay them for everything. It's "kick away the ladder" economics.

    • @-taz-
      @-taz- 8 years ago +1

      ***** That's true. With everything centralized, like with Google, it can scan all our emails and YouTube comments to learn from us. There is NO security for the individual because they own the data. They can even nudge our behavior by showing us just the right material. With distributed systems (Cisco routers, Ubuntu, Windows), there are plenty of back doors so data can be collected, but it's not by default. With Android, you can be watched in every app, and they know every button press.

    • @-taz-
      @-taz- 8 years ago +1

      ***** I know... I wish. It's going to be like the Borg though from Star Trek, which they also write by the way. The same groups that developed the modern PC also overlapped with the same elites, futurists, theosophists including Gene Roddenberry. I've been thinking a few years and can't even image how to escape.

  • @DaMaster1983
    @DaMaster1983 4 years ago +2

    I know you will find it weird.. but what makes your channel so successful is your voice.. and how pleasant you are, so that it's kinda relaxing to watch.. plus of course I'm from the 80s so it brings back good ole memories.. and we also learn to fix stuff.. thanks to your wonderful, well-edited videos..

  • @xyrzmxyzptlk1186
    @xyrzmxyzptlk1186 5 years ago +1

    This was a great episode. It cleared a lot of stuff up for me. Thanks 8 Bit Guy. 👍🏼

  • @matlilly8795
    @matlilly8795 5 years ago +3

    That collaboration was incredible. It's so nice to see the nerd community working together.

  • @faded.0913
    @faded.0913 8 years ago +37

    In 2015 we don't have 8GB of RAM, we have 128, and we have over 4,000 MHz. I found those charts to be irrelevant

    • @The8BitGuy
      @The8BitGuy  8 years ago +108

      +The Computer Tech Guy The charts were showing "average" hardware, not top-of-the-line.

    • @brandonjensen63
      @brandonjensen63 8 years ago +10

      +The 8-Bit Guy I still think that the average would be somewhere around 3,000 MHz

    • @beat461
      @beat461 8 років тому +18

      +WarlordLeo 1 more like 2.5GHz. Don't forget a lot of casual computer users, who are the majority, use i3/i5 dual cores clocked at 1.6-2.5 GHz

    • @WR3ND
      @WR3ND 8 років тому +2

      What's a "computer?" The average is pretty low, but then I still occasionally use my pre-Titanium TI-89 calculator which is... approaching being 20 years old and uses a processes that was used in some of the very first personal desktop computers ever made.
      Alternatively, my main computer is a 64bit 6 core/12 thread i7-3930K @ 4.2GHz, with 64GB quad channel DDR3 RAM @ 1600MHz, 2 Titan Black cards in SLI, and a TB SSD as the main system drive... and is a few years old now, not including the video cards and even those are getting a little long in the tooth. Still, there really is no reason for me to bother upgrading yet, because there isn't anything to upgrade to that is significantly more powerful. Maybe in a year or two the video cards might be worth upgrading.
      Maybe the average has been increasing a bit, but the high-end is stagnating probably for a few reasons. Most people don't have significantly more demanding applications and the popularity of mobile devices which are still trying to outdo each other every year, but can't really compete with current desktop options in terms of raw computational power and data storage.
      In short, computers overall haven't really improved all that much recently, but the proliferation of the low-end and power efficient mobile devices and their improvements have brought up the average.
      In terms of computing power, I primarily use my main computer to crunch for scientific and humanitarian research and it can work on as much as I want to throw at it, so it's an outlier.

    • @brandonjensen63
      @brandonjensen63 8 років тому +4

      +WR3ND +Johnnie Bassdrum +The 8-Bit Guy but you guys don't get it, Moore's law is based on the "new" processors and his video was in 2015, which means it would be about all of the Skylake and new A10 processors, which are mostly all above the 3000 line...

  • @JosephDewey
    @JosephDewey 5 років тому

    Awesome video, and great guest answers!

  • @bend1119
    @bend1119 4 роки тому

    Congrats on just passing 1 million subs. Awesome content!

  • @ChrisMusty
    @ChrisMusty 4 роки тому +89

    I am from the future, in 2019 AMD has released a 64 core processor!

    • @mikem9536
      @mikem9536 4 роки тому +5

      Yeah, Intel just slowed down.

    • @a64738
      @a64738 4 роки тому +13

      Intel has had a 64-core CPU for many years, the only problem is that it costs about $15,000...

    • @ChrisMusty
      @ChrisMusty 4 роки тому +8

      @@a64738 are you sure about that? 64 cores or 64 bits?

    • @Graeme_Lastname
      @Graeme_Lastname 4 роки тому +3

      @@ChrisMusty All my bits are made of cores, good as quantum. ;)

    • @ieast007
      @ieast007 4 роки тому +3

      I'm also from the future and AMD is out of business since they sold out and transferred their Ryzen CPU technology to the Chinese state owned company Tianjin.

  • @BubblegumCrash332
    @BubblegumCrash332 4 роки тому +3

    The 90s had the best jump. From 91 to 99 it was like a different world.

  • @collrock1000
    @collrock1000 3 роки тому

    Congrats on 200,000 subs!

  • @TK199999
    @TK199999 5 років тому +2

    When I was a lad, in the long-ago late 1990's, I was in Explorers. The one I was in was run by engineers, who taught us things like how to solder and read circuit diagrams, but one of them was much older and had been retired for many years. He started working as a computer engineer back in the late 60's, so around 1999 we students asked him about Moore's Law and whether it was stopping. He explained, from his perspective, that no it wasn't, but that what had happened in computers over the last 40 years and what Moore was predicting were not necessarily the same thing. He said that in terms of computing power, the real difference being made was in efficiency. He explained that since the late 1960's, computing, from programming to microprocessors, had been very brute force: more watts, more transistors, more microprocessors, more lines of code. Which made things faster but not necessarily better or more useful.
    Basically, adding more stuff got the same results a little faster, but at a massive cost increase. The example he gave was CPU heat; anyone who remembers those heady days when CPUs like the Pentium 4 Prescott were breathing fire and terrorizing small villages knows it was a very hot and very power-hungry chip. He argued we knew there was a better way, and we were starting to see that we couldn't brute-force our way to better performance. It's efficiency in hardware and software design where the real jumps in computing were heading. Which I think has happened over the last 20 years, with more efficient multicore chips running at less power and less heat, and more efficient software doing the same amount of work at about the same speed as a hypothetical single-core 10 GHz chip running software made the old way, which would require the power of the atom to run and put out just as much heat.
    So the reason older computers can still do a lot of what newer ones can is because of that greater efficiency on the software side (on the hardware side I believe it's called IPC today). Not to say those older chips were not capable; once we started down this road of efficiency over brute force, the change was as massive as going from non-microprocessor systems to x86 CPUs. Software can now use every last ounce of power of those older systems and doesn't need 5 GHz to do a lot of the work. Don't get me wrong, 5 GHz makes things faster and more capable, but in terms of efficiency, well, that number matters less and less. So the future I see is more cores, less heat, less power, better software, and clock speed increases will be an afterthought; they will probably still climb steadily upward, just not rocket like in days past, because they don't need to.

  • @Jere616
    @Jere616 7 років тому +24

    At 5:50, who else was more interested in reading all those titles Clint Basinger had instead of listening to what he was saying?

    • @kneekoo
      @kneekoo 7 років тому

      I just paused the video so I can read them without the background noise. :))

    • @ddnava96
      @ddnava96 7 років тому +2

      I was interested in playing Zoo Tycoon again :(

    • @jamiemarchant
      @jamiemarchant 7 років тому +1

      I subscribe to his channel and there is often something neat in the background. This time I saw the cheesy FMV game Star Trek Borg.

    • @frankstrawnation
      @frankstrawnation 7 років тому

      Caesar II for the win!

    • @sunilkumar-id5nm
      @sunilkumar-id5nm 6 років тому

      Jere616 lol I actually did that,,,

  • @KneelB4Bacon
    @KneelB4Bacon 7 років тому +22

    I think the thing that amazes me most right now is how much external storage has improved. In just the last few years, the cost of flash drives has dropped dramatically while the increase in capacity has been just as impressive..

    • @japzone
      @japzone 7 років тому +6

      I can fit 512GB of storage on my fingernail now. It just boggles my mind.

    • @wildbikerbill6530
      @wildbikerbill6530 5 років тому

      This exponential increase in capacity/decline in cost has been seen before in both hard drive capacity and memory (DRAM).

    • @m8x425
      @m8x425 5 років тому

      The makers of the Flash Memory IC's shot themselves in the foot by limiting production of the chips, which drove up the cost. Now the Chinese are getting in on the gravy which is causing a flood of memory IC's.
      The same thing is happening with RAM. Samsung is trying to limit this by making limited quantities of high speed RAM, but that probably won't help them.

  • @oceanbreeze3172
    @oceanbreeze3172 4 роки тому

    This was a lovely and insightful video!

  • @papachili7290
    @papachili7290 5 років тому

    I subbed and turned on the bell just because this guy looks genuine and I'm here to support him in every endeavor in his life

  • @rtv190
    @rtv190 8 років тому +112

    6 months later he is at almost 300K subs now

    • @pablorojo7989
      @pablorojo7989 7 років тому +2

      1 week later and he's at 302k :D

    • @danicic87
      @danicic87 7 років тому +1

      2 hours later 303.5 k subs :P

    • @metaldrums1015
      @metaldrums1015 7 років тому +1

      7 hours later, 305.3k

    • @bobalobalie
      @bobalobalie 7 років тому

      Subscribers mean little to nothing. There are plenty of people with millions of subscribers, yet they only get 200k views per video. Views are what determine how much UA-cam pays for monetization of ads.

    • @metaldrums1015
      @metaldrums1015 7 років тому +2

      Yeah, but they could have fewer views because they didn't have recurring viewers, which would be the subscribers. Also, minutes watched count towards the pay as well.

  • @JoeZasada
    @JoeZasada 7 років тому +10

    Nice bookshelf of classic Star Trek games. 25th Anniversary, Judgment Rites... those were awesome!

    • @omma911
      @omma911 7 років тому

      Noticed it too. I wish I'd taken better care of mine.

  • @biggshasty
    @biggshasty 4 роки тому +2

    I know this is an old video, but it's still kinda relevant. I get asked this question all the time. Thanks for uploading.

  • @tharivol01
    @tharivol01 3 роки тому

    Great video as always! Don't forget, when you look at CPUs, you've charted the applications used, but I'd also consider the OS used. OSes back then were so small compared to the ones today...

  • @sdphotography4733
    @sdphotography4733 7 років тому +31

    I am oft reminded of a 'scientist' (circa 1970) that claimed that the computers on the Star Ship Enterprise were impossible because it was impossible to fit enough vacuum tubes on such a ship to power the computer. To the point, we can't imagine what is down the road nor how fast computers can really get.

    • @jhoughjr1
      @jhoughjr1 5 років тому +2

      SD Photography Funny, since by then vacuum tubes had already been replaced by transistors and ICs.
      He obviously didn't know his Trek, because their computers didn't use vacuum tubes.

    • @oldtwinsna8347
      @oldtwinsna8347 5 років тому +1

      @@jhoughjr1 i thought everyone knew they used duotronic circuitry.

    • @jhoughjr1
      @jhoughjr1 5 років тому

      @@oldtwinsna8347 lol there is a term I aint heard in ages. My instructor's instructor told him BJTs were just a fad and not really useful.

  • @someoneorother3638
    @someoneorother3638 7 років тому +195

    If you're a casual computer user, there is absolutely NOTHING you're doing on your computer that you couldn't have done on a computer 10 years ago. Checking email, using social media, using a word processor... the computers of 10 years ago could do all that stuff and they could do it well.
    There's no reason that 10 year old technology SHOULDN'T be completely adequate for casual computer users.

    • @phreak1118
      @phreak1118 7 років тому +41

      Unless you casually play any game made in the last 5 years.

    • @phreak1118
      @phreak1118 7 років тому +40

      Also, I have a laptop from 10 years ago and it cannot play video at 1080p.

    • @someoneorother3638
      @someoneorother3638 7 років тому +11

      It depends on the game. FPS's and such that require high graphics, sure. But most games don't require that level of graphics and can be played perfectly fine on a 10 year old computer. My desktop is about 10 years old and I can play most games on it fine still. Good thing I'm not into FPS's.

    • @CzornyLisek
      @CzornyLisek 7 років тому +2

      Some of the newest and fastest video, music, Wi-Fi etc. standards are handled in hardware. So today even weak PCs/laptops can, on paper, do 4K, super high speed networking, and security encryption, because they have specialised parts to do the work so the main CPU cores don't have to, while an old PC/laptop will struggle to do any of it.
      Because of that specialised hardware we can use a slower CPU but still build a laptop that is overall more efficient at "normal" tasks.
      Also remember that even an expensive laptop, unless it's a super high-end gaming laptop that costs a fortune, is already like 3-5 years behind a mid-level PC when released.
      Using a browser (or something like Photoshop and other graphics/video software), well... there is no end to RAM usage; even 128GB could be too little.
      For me the max was like 12GB used + 12GB reserved by the browser (Chrome) alone, while CPU usage was almost none.

    • @MaaveMaave
      @MaaveMaave 7 років тому +18

      It's possible, although software bloat makes it difficult. There's so much CPU-intensive CSS, JS, flash, etc on modern webpages that it chugs on old hardware.

  • @Veronique487
    @Veronique487 3 роки тому

    all you guys really rock! ..in a retro-computing kind of way💻🖥🖨⌨🖱🖲💽💿🎹

  • @tylercole3514
    @tylercole3514 4 роки тому +1

    Been a while since you had 200k subs, congrats on 1m!

  • @lobaxx
    @lobaxx 7 років тому +582

    They are all wrong.
    Clock speeds have stagnated not because of changing consumer habits, but because of the heat wall. Simply put, power draw (and therefore heat) grows much faster than linearly as clock speed increases, which makes substantially faster processing cores impractical. This forced processor makers to go for increased parallelism (more cores at the same clock speed instead of one core that is faster).
    Now, in theory, increased parallelism increases processing power, but it isn't as simple in practice.
    In the olden days, a program would get faster with newer hardware because clock speeds increased - programmers didn't have to change anything in a program to see performance improvements. However, since 2005-ish the change was not in clock speeds but in the number of processing units. Since the clock speed has not changed, a program will run just as slow on new hardware as it did on older hardware. In order to get this theoretical increase in performance, programs need to do various things at the same time.
    So here is the first hurdle - programmers now need to re-write their code to gain performance, where before the performance just happened by itself. Programmers are lazy and management is cheap, so this rarely happens. Also, programming parallel code is hard. Like, really really hard, with bugs appearing everywhere that are almost impossible to reproduce.
    But assuming deep pockets and motivated staff, you still cannot expect performance increase to match the theoretical hardware output by parallelizing the code. Very, very few problems are completely concurrent (mostly, those lie in the domain of graphics, which is why graphics cards with hundreds or even thousands of processing units exist), and thus Amdahl's law kicks in.
    Let's look at an example program: it does some task A, and uses the result to do B, and both take the same amount of time on a single-core processor. No matter how many thousands of cores we have, we must calculate A before we can calculate B, and although B is a type of problem that can be parallelized, A isn't. This means that no matter how many cores we throw at the program, A will always take the same amount of time. So even if we have a fictional supercomputer with an infinite number of cores, it will only run the program twice as fast compared to an old, single-core processor. Infinite processors, but only twice as fast! So it's no wonder that new programs run just fine on old hardware.
    And that is as condensed an introduction to modern computer science and distributed programming as I could muster.
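    To put rough numbers on the A-then-B example above, here is a minimal sketch of Amdahl's law in Python; the 50/50 split between A and B and the core counts are purely illustrative assumptions, not figures from the video:

    ```python
    # Amdahl's law: speedup from N cores when only a fraction p of the work can be parallelized.
    def amdahl_speedup(p: float, n_cores: int) -> float:
        return 1.0 / ((1.0 - p) + p / n_cores)

    # The A-then-B example: half the work (B) parallelizes, half (A) does not.
    for cores in (1, 2, 8, 64, 1_000_000):
        print(cores, "cores ->", round(amdahl_speedup(0.5, cores), 3), "x speedup")
    # Even with a million cores the speedup approaches, but never reaches, 2x.
    ```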

    • @steve1978ger
      @steve1978ger 6 років тому +30

      Yup, this is only in part demand driven (if at all). As well explained in the video, the computer industry has never had a problem creating demand for faster machines. IC miniaturization has come to physical limits, in heat dissipation, in manufacturing processes, in getting quantum effects (not the good ones) while approaching the atomic level; and at the same time there are tasks that can not be parallelized to multi core CPUs very well.

    • @theelectronwrangler6416
      @theelectronwrangler6416 6 років тому +16

      I was going to add to this but then saw there was more talking about parallel computing and its own hurdles. This about completely covers the why, though there are some other technological advances that may still help improve clock speed, or at least the speed/bandwidth of inter-core communication. I'm fond of carbon fiber nanotubes, but mostly just because it's fun to say :)

    • @marlona5205
      @marlona5205 6 років тому +15

      I am with you on this one. We have a thermal limitation; until we find better affordable materials, CPU speed won't increase. Every device we currently own has a life span of less than 6 years depending on usage. Heavy-usage devices such as cellphones only have a life span of less than 2 years, and it's all due to the materials being used, where the solder joining the ICs/CPU either becomes brittle or melts, creating shorts and connection issues.

    • @ncnhomegrown
      @ncnhomegrown 5 років тому +25

      I agree with what you say, in that the heat wall, thermal transfer and package complexity are the reasons for clock speeds not increasing beyond 4GHz.
      But I also agree with the people in this video: it's because consumers drive the vendors, and everyone wants lightweight portable devices with great power efficiency. This is why die sizes have been shrinking, and this is why the only real gains in the last 5 years have been in FLOPS per watt.
      If the die sizes were bigger and power caps were not limited to 140/200W for CPU sockets, we would easily be seeing stock computers running at 5GHz or greater today, which would certainly yield higher performance.
      I have a water-cooled 4930K @ 4.6GHz; this chip is now 5 years old. It pulls about 230W through the CPU socket at full tilt, in a socket designed for about 130W. My biggest limitation is heat and feeding the processor electricity. A more modern 6-core at 4.6GHz would use about 2/3rds of the power of my 6-core and deliver 10% more performance.
      With all that being said, processors and motherboards these days are not designed for higher clocks. Given the right engineering and architecture, processors could easily see 6GHz, but the heat output and power consumption would be significant.

    • @woodwind314
      @woodwind314 5 років тому +16

      Good reply. The heat wall, though, is more a symptom than a cause. Of the early 2000s mainstream processors, the Intel Prescott probably was the worst in terms of heat production, but it could still be ramped up over 4GHz clock speed - if you put in the right cooling. Now while this was feasible in big desktop machines, it was (and is) not feasible in smaller devices. Hence a need for a drastic lowering of power consumption (== heat output) of the processors. Of course, considerations of battery usage in mobile devices also played a big role in that paradigm shift in CPU design.
      Another reason why we don't see any speed increase in CPUs (especially if you look at the individual core) is that memory is lagging orders of magnitude behind CPUs in speed, but at the same time memory demands continue to increase. So to actually get faster computers, the road to take is to make RAM faster, and to write software that is optimized to decrease memory usage and, more importantly, memory access. And that is actually easier done than parallelization.
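      As a rough illustration of the heat wall this thread describes, here is a minimal sketch of the usual CMOS dynamic-power rule of thumb, P ≈ C·V²·f, under the assumption that supply voltage has to rise roughly in step with frequency; every constant below is made up purely for illustration, not real chip data:

      ```python
      # Rule-of-thumb dynamic power for CMOS logic: P ~ C * V^2 * f.
      # Raising frequency usually also requires raising voltage, so power grows
      # roughly with the cube of frequency. All numbers here are illustrative only.
      def dynamic_power(capacitance_f: float, voltage_v: float, freq_hz: float) -> float:
          return capacitance_f * voltage_v**2 * freq_hz

      base_freq, base_volt = 3.0e9, 1.0        # a notional 3 GHz core at 1.0 V
      for scale in (1.0, 1.5, 2.0):            # 3.0, 4.5, and 6.0 GHz
          f = base_freq * scale
          v = base_volt * scale                # crude assumption: voltage scales with frequency
          print(f"{f/1e9:.1f} GHz -> {dynamic_power(1e-9, v, f):.1f} (relative power units)")
      ```
      Doubling the clock in this toy model multiplies power by roughly eight, which is the intuition behind clock speeds stalling around 4 GHz.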

  • @flinxmeister
    @flinxmeister 8 років тому +3

    I've been an "IT professional" for 20+ years now. My observation is slightly similar to your analysis, but it's based on the needs of the user. In a nutshell, computing power finally fulfilled the needs of the average computer user, probably about 7 or 8 years ago.
    Word processing, web browsing, email, etc. All the things you do while sitting on the couch or at a desk are pretty well defined and have been addressed by current software, and that use case is more than covered by typical hardware.
    Where I work, we exploit this by refurbing machines multiple times...using machines for 6-7 years in about 40% of desktops where the desktop needs are extremely basic. We usually only replace them when they're just worn out and the cost of maintenance/downtime becomes an issue.
    Compare this with the needs of gamers. Their needs are always increasing and as a result you can see the march of innovation and improvement of GPUs. Mobile devices have their own progress since those needs are evolving too. (For an extreme case, look at the short but explosive story of Bitcoin mining hardware.)
    So that's my take. There's not much more you need to do with the typical desktop/laptop from about 2008. The increase in need is reflected in other devices, and you can see the corresponding improvement and innovation in that hardware.

  • @bdttrance4194
    @bdttrance4194 5 років тому

    Now knocking on the door of 1M! Well done

  • @KokkiePiet
    @KokkiePiet 4 роки тому +1

    Over a million subscribers now, congratulations!

  • @immortalsofar5314
    @immortalsofar5314 5 років тому +7

    Coding has also changed. I remember changing JSR, RTS to JMP on the C64 to save a byte. What I remember most about the '90s was the frustration of trying to use new software that was written for the *next* generation of computers and developers not caring about efficiency because it's running under windows anyway.
    The one thing they've struggled to increase exponentially is productivity. They tried 3GLs, OOP and numerous other strategies, but despite some gains, hardware has outstripped it, so the *cost* of using all of the system's resources has increased. It's no longer a case of "can we fit this bell or whistle in there?"; it's down to whether or not it's worth it (and, frankly, does anyone want it?)

    • @CristalianaIvor
      @CristalianaIvor 4 роки тому +2

      I think that's pretty sad. If you think about the many tricks devs did to fit something or make it run faster - it feels like that's all forgotten today. Everything is about making stuff look fancy in 4K graphics, but nobody cares that it runs like shit because nobody ever cared to optimize anything -.-
      The shit some game devs serve us could be programmed better by an enthusiastic five year old. Hell, even a monkey hitting keys randomly.

    • @silentbloodyslayer98
      @silentbloodyslayer98 4 роки тому +1

      @@CristalianaIvor CPUs aren't getting faster like they used to, so optimization will have a big comeback real soon

    • @CristalianaIvor
      @CristalianaIvor 4 роки тому +1

      @@silentbloodyslayer98 hopefully, yes

    • @annagulaev
      @annagulaev 2 роки тому

      Until the 1990's, coding things as simply as possible was a sign of competence and cleverness. Since then programmers have switched focus to impressing their peers with complexity. So much of modern programming practices is counter productive, geared toward demonstrating competence through compliance with ego-focused rather than productivity-focused methodology. In the 90's, OOP switched from being a TOOL to being a RULE. Use it, always, for every problem, and be able to recite the precise definitions of its terms, or you're a moron. And from there, complexity as a virtue has only gotten worse. The expected tool set now includes what are going to turn out to be fly-by-night technologies that will hamper maintenance coders until the code is replaced by the next set of ego tools. Coding has become something that only code farms can do, because learning the tools has to be amortized over multiple clients. It's becoming a thing that small companies can't do anymore, because programmers have made it way more complex than it needs to be. Out of ego, resume building and salary protection.

    • @immortalsofar5314
      @immortalsofar5314 2 роки тому

      @@annagulaev In my experience, the new tech tended to be pushed by management to simplify (sic) what they don't understand. Complexity isn't the sign of good code but elegance is. I once wrote what I thought was an idiot-proof system, someone else made a change and broke it which puzzled me so I asked a colleague to see if he could fix the problem without having my knowledge. He went in, found the right module and could see at a glance not only what had been changed but what should have been changed. Idiot- but not moron-proof. Complex solutions need rewriting. Simple solutions to solve complex problems - they're what make you say "Wow!"

  • @mikepelletier1399
    @mikepelletier1399 5 років тому +3

    Whoa, it's way past time to do an update to this!!

  • @_GhostMiner
    @_GhostMiner 5 років тому

    You're the best channel about old tech/computers

  • @linuxretrogamer
    @linuxretrogamer 4 роки тому

    Brilliant video, nail hit well and truly on head.
    Software-wise, the core OS, web browser and office suite only need so much power to function (even iTunes and Picasa for that matter). We pretty much passed that point 15-20 years ago.
    It's bespoke apps - games, databases, video editing, etc - that continue to demand more.
    Would have been good to see a bit more on CPU IPC as a point of comparison, and a similar charting of GPU performance. Also expansion of the whole P4 (NetBurst?) debacle and Intel's switch back, away from pumping clock speeds, to the more efficient P3-derived microarchitecture used by the Core series.
    Also, how is your Ultimate 8Bit project coming along? Not seen an update in some time.

  • @humanperson8363
    @humanperson8363 6 років тому +23

    Can you do a follow up video on this topic?

  • @The_Metroid
    @The_Metroid 5 років тому +26

    I'm using a "craptop" from 2003. In 2019. Windows 10 has never been so slow.

    • @umbertohaterhd722
      @umbertohaterhd722 4 роки тому +1

      Yay Pentium M bro.

    • @tl1882
      @tl1882 4 роки тому +1

      I'm using an i7 and i can't run windows 10

    • @tl1882
      @tl1882 4 роки тому

      @Michael Francis where i live windows 10 is ~200$

    • @earlpottinger671
      @earlpottinger671 4 роки тому

      Get an SSD. First, I have Windows 7 on my desktop, and I find it way better than Windows 10. However, I have a laptop with Windows 10, and it was driving me crazy how slowly it responded to me even with the fast hardware it had. Worse, I installed Haiku on a USB flash drive, and it still ran faster than Windows 10 on the hard drive.
      I replaced the hard drive with an SSD and the difference is amazing. I think Windows 10 tries to read too many small files and has a poor cache system, so it is very slow on a hard drive; on an SSD the computer works the way I expect it to.
      Next step, Haiku on an SSD for this computer.

    • @The_Metroid
      @The_Metroid 4 роки тому +1

      @erik masterchef I'm saving up. I don't have enough at the moment.

  • @antonnym214
    @antonnym214 4 роки тому +1

    Well-done as always. Interesting, because I've had computers since 1977's TRS-80. Right now, I'm using a desktop PC built around 2010's Phenom II/x4 955 quadcore, with 8GB RAM and SSD and it does everything I need, which is mostly just web surfing, watching movies, and occasionally transcoding a video file. Good for AMD it has hung in there, and also, in 2019, I upgraded onboard graphics to an AMD R7 250. I always knew by the time I could afford the really good CPUs, they would be cheaper. :D This new Zen stuff by AMD is looking very attractive. I would like to go from this to a nice Ryzen 9, 3950x. Sixteen cores seems very much as though yesterday's future is here today. ALL GOOD WISHES FOR A GREAT 2020.

  • @EarlOfMaladyCrescent
    @EarlOfMaladyCrescent 3 роки тому

    I'm no computer expert, but I found this interesting. I'll definitely have to check out that 8 bit keys channel! Music is more my thing really. Nice "Lemmings" theme you played @ the end!

  • @danieldougan269
    @danieldougan269 6 років тому +25

    The reduction in clock speeds also has to do with an interest in maximizing battery life and minimizing power consumption.

    • @monday6740
      @monday6740 5 років тому +8

      I don't think so, clock speeds are important too for marketing purposes. Expected battery life is never met anyway, so they like to not mention that too much ... Batteries are 19th century technology, and haven't really advanced much since then.

    • @arwahsapi
      @arwahsapi 4 роки тому +1

      To stay in the market, Intel's strategy nowadays is to sell less powerful CPUs like the Celeron lineup

    • @johncooper9448
      @johncooper9448 4 роки тому +3

      ​@@monday6740 that's bullshit, batteries have changed a ton since the 1800s, lithium batteries weren't commercialized bascially until Sony did it around 1991 and there's been a ton of advancement in lithium batteries since then, both in cost, technology, battery chemistry, and energy storage. modern batteries last years longer than 1990 batteries and hold way more charge in a way smaller space. just compare early 90s laptop batteries to modern cellphone batteries.

    • @iQQator
      @iQQator 4 роки тому

      Reduction of clock speed improves CPU thermal picture.

  • @Storm_.
    @Storm_. 8 років тому +4

    One thing none of the experts mentioned is GPU power. In the last 10 years GPU's have outstripped CPU's in terms of exponential computing power and these days you have things like CUDA that offload processing that the CPU would normally do on to the graphics chip. So really what we have these days is a CPU and then a multi-media beast processor GPU supporting everything. Even on Macs you'll notice their latest kernels after snow leopard started utilizing the GPU to speed up OS tasks. So in effect - gaming has been the main driving force for development in computer hardware.

    • @SerBallister
      @SerBallister 8 років тому

      +Storm Gaming Media Yup, it cannot be overstated how much of a shift GPUs have brought in terms of processing power.

  • @mpruitt756
    @mpruitt756 4 роки тому

    Love your channel. You should do an update to this video. It would be interesting to see the difference in the last four years...thanks

  • @ChristianHegele
    @ChristianHegele 3 роки тому +2

    I think another aspect at play is how the entire industry has really settled down in the last 20 years around the x86 architecture (and the x86-64 extension which remains fully compatible with it). In past ages there were so many competing processor architectures from many competing firms, and advances in CPU power often came at the expense of backwards or cross-platform compatibility. Commercial software ecosystems just disappeared as newer hardware appeared, which was fundamentally incompatible with the code written for earlier machines.

  • @Durakken
    @Durakken 8 років тому +249

    I don't know why you would use clock speed, which isn't a good measure for comparison at all due to some of the things you mentioned. You should instead use FLOPS, which can be estimated roughly by looking at a CPU's FLOPs per cycle, which in older CPUs was 2 and now gets up to 8, though most CPUs are the standard 4, if I remember right. You take that number and multiply it by the clock speed multiplied by the number of cores.
    Any CPU before the 2000s is going to look like 2*C*1, whereas the average CPU today is going to be 4*C*2, with the average top end being 8*C*8+ and still higher-end parts having 10 to 12 cores, and that's if you're not crazy and doing a multi-socket server build where you can have I think like 3 CPUs, and those types of CPUs have around 48 cores at max...
    But the answer to your question is fairly simple really... All that improvement in raw power is there, but there is a second factor here that is being overlooked, which leads us to the right answer.
    Let's suppose I want to write a program and don't want to deal with the headache of multi-threading and core selection, and I want to make sure there are no problems whatever the configuration of the computer is... Well, that means I'm mostly going to take into account how fast the program can run on 1 core... So what is the difference in single-core performance between the average computer in 2005 and the average computer in 2015? Between none and double. Double sounds like a lot, but double used to be about a 1-year difference in most situations that have to do with PCs. So most applications you're going to run are going to have about the same level of impact in terms of processing power as an application from 2006 run on a 2005 machine. It'll be slower and take a bigger percentage of the resources, but it should run more or less the same, because applications aren't designed to take up all the resources available... especially OSs, which try to avoid taking many resources if possible, because all OSs other than Apple's want to be able to run on as many systems as possible and, you know, actually be used to run other programs.
    So your idea is close, but it is missing the threading component, which means that the resources needed to run most applications have been, more or less, static since multi-core processors became the norm. The ideas about power users and the failure of Windows, etc. that were presented are good ideas, but ultimately they don't matter when you take what I said into account, because with this fact alone you get the results we have every time.
    Considering that your test was done on a MacBook, the suggestions presented might play a bigger role due to there being much more first-party development, but the fact is that most application developers don't want to deal with multi-threading stuff unless they have to, and most probably couldn't even if they wanted to. Apple obviously wanted the App Store to be somewhat friendly to a greater number of developers, since it has been a major selling feature since around 2005, which would result in the same thing happening with all their apps and thus create the same situation, maybe even more so when you take into account the portability of accounts between devices. Making sure everything works on the poorest-quality device ensures across-the-board quality.
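    As a back-of-the-envelope sketch of the estimate described above (FLOPs per cycle, times clock speed, times core count), here is a small Python example; the per-cycle, clock, and core figures are just ballpark assumptions in the spirit of the comment, not measured values for any real CPU:

    ```python
    # Rough peak-FLOPS estimate: FLOPs per cycle * clock (Hz) * number of cores.
    def peak_flops(flops_per_cycle: int, clock_hz: float, cores: int) -> float:
        return flops_per_cycle * clock_hz * cores

    old_single_core = peak_flops(2, 1.0e9, 1)   # older style: 2 FLOPs/cycle, 1 GHz, 1 core
    average_modern  = peak_flops(4, 3.0e9, 2)   # "average" CPU: 4 FLOPs/cycle, 3 GHz, 2 cores
    high_end_modern = peak_flops(8, 3.5e9, 8)   # high end: 8 FLOPs/cycle, 3.5 GHz, 8 cores

    print(f"{old_single_core:.1e} vs {average_modern:.1e} vs {high_end_modern:.1e} peak FLOPS")
    ```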

    • @The8BitGuy
      @The8BitGuy  8 років тому +43

      +Durakken One problem I ran into was trying to compare very old CPUs from the 1980's to modern CPUs. I found it difficult to find comparison data for the old CPUs that would have data in the same format for modern ones.

    • @Durakken
      @Durakken 8 років тому +4

      The 8-Bit Guy I didn't mean to come off as an expert or something, if I did. I very much suspect you know a lot more about the subject than I do, but the answer seems to me to be what I said, based on what I know and the reasoning I gave.
      I'm by no means a hardware guy, so I don't know much about that info other than what I've randomly come across and don't really follow any of that stuff, so I don't know what format you're really looking for, but I would imagine that the very old CPUs are hard to find info about for various reasons. You might be able to get info from people who post on the Computerphile channel, but I didn't have any problem finding comparisons for MIPS, which is a good comparison.

    • @asirf.3634
      @asirf.3634 8 років тому +2

      +Durakken its weird that my 1.1ghz macbook is way faster than my old 2.5ghz mac, but this explains it.

    • @Durakken
      @Durakken 8 років тому +7

      Asir F. Well, yeah... consider the following...
      You run the OS and a game. Let's say the OS takes 2 billion instructions per second to run and the game takes 2 billion instructions per second to run.
      Your old 2.5GHz Mac is likely running 5 billion instructions per second, 4 billion of which are taken up running the OS and the game... moreover, the OS's instructions are run first, then the game's, so there is 200 to 300 milliseconds between each time the system is running the game's instructions or the OS's instructions, depending on how it uses that 1 billion unused instructions per second. Generally it just goes on to the next instruction set, but it's possible it doesn't.
      The 1.1GHz Mac is likely running at least a dual core and can handle more instructions per cycle, so 1 core is running 4.4 billion instructions per second and the other is running 4.4 billion per second. So you're like "oh, more instructions per cycle, that makes sense," but it's more than that... The programs are also running simultaneously, so your OS is always running on one core and the game on the other core, which means there is no wait between when the system handles those instructions... Even if you say the cores each only handle 2 instructions per cycle, that's 2.2 billion instructions per second per core, which is just enough to run either... and even though the total is 4.4 billion instructions, 600 million fewer than your old system, because the programs are essentially running on different cores at the same time, the performance is better.
      Obviously there are some shortcomings to this, like if your game uses 2.3 billion instructions per second... in which case 100 million of those instructions would then be processed on the core the OS is running on, and it requires slightly more instructions to govern which instructions are processed where and whether or not the game has to wait for them. In which case it may be better to have your old Mac.
      This is a really simplified explanation; mostly this is done automatically, if it's done at all... a lot of the time people and developers don't bother setting things up this way, so what you end up getting is everything stacked on a single core until it's filled up, and then the next, etc... but it depends on the developers and the person who configures the computer.
      Also, this doesn't take into consideration caches and other such things that can and often do bottleneck the system.

    • @asirf.3634
      @asirf.3634 8 років тому +2

      Durakken my goodness, are you taking a master's degree in computing? But thank you for explaining that to me, I had to read it a few times to fully understand. So like the 1.1GHz is faster because 1 core is for the OS and the other core is for the game or any other program, but if it's the 2.5GHz, both would be run on the same processor and thus it will be slower.

  • @seethransom
    @seethransom 8 років тому +5

    Average people don't need an abundance of power for everyday tasks. In fact, most of my friends' computing needs don't go past a phone or a tablet.
    Many people are going to laptops. It seems the manufacturers' goal is to get better performance while needing less power and cooling. Lower voltages make it easier to cool the machines. No more laps being burnt. This is good for the batteries too.
    I think it was the guy who founded DIGITAL who said we would all work through terminals connected to central servers. It came true.
    Then the market quickly changed. We needed to power each machine with its own software.
    However, the cloud changes everything, as mentioned. We are now on terminals again. Chromebooks are proof. Steam boxes and our own PCs stream high-end games on average hardware. It appears that Digital Corp. was right after all.
    I'm only 46, but I have been hands-on with computers for 40 of those years. My dad was one of many pioneers who got us to where we are today. I'm grateful I got to sit in the sidecar and witness all this a little bit closer, because it was his field. I was born at the perfect time to remember our simple existence, and how we got to this place. At no other time have we progressed and evolved so much, in just 50 years' time.

  • @djtoddles8750
    @djtoddles8750 5 років тому

    I loved you in breaking bad/better call saul. Seriously, great vid

  • @wb00
    @wb00 4 роки тому

    Almost a million now. And, with the quality of your content, it should be at least 10 times as much.

  • @Vriess123
    @Vriess123 8 років тому +16

    CPU's do seem to be reaching diminishing returns. A Sandy Bridge processor from 5-6 years ago is actually very close to the brand new Skylake's and honestly there isn't much reason to even upgrade if you are overclocking it.

    • @Vriess123
      @Vriess123 8 років тому

      Ivy was only a little faster than Sandy. Sandy is really the start of the cpu's that still are very close to the newest ones.

    • @Patchuchan
      @Patchuchan 8 років тому

      True a Sandy bridge is still a good CPU today.
      I can see where he's coming from; compared to the past, the rate of advancement has slowed.
      Compare an Apple II+ that came out in 1979 to an Amiga 1000 that came out in 1985.
      Both were state of the art machines for their day but the latter is vastly more capable.

    • @howlingwolven
      @howlingwolven 8 років тому +1

      The big thing today seems to be with graphics. GPUs nowadays use a LOT more power and do a LOT more than they used to. Ever more pixels means ever more cores to drive them at an acceptable framerate, which has also increased greatly.

    • @Vriess123
      @Vriess123 8 років тому

      Yeah, while cpu speed has gotten pretty stale graphics cards are still getting faster and faster. I wonder when they will start hitting a wall with that as well though

    • @bit2shift
      @bit2shift 8 років тому +2

      Power consumption seems to be going down with each process node.
      The trend going forward is to add more cores and larger SIMD registers.
      GPUs still have to go above the 2GHz barrier, and with even more stream processors.

  • @moptim
    @moptim 5 років тому +3

    Seeing the plaque and a subscriber count of 931k... figured I'd hit that button too, to see what the 1M plaque will look like!

  • @izzoboy91
    @izzoboy91 3 роки тому

    Almost 200,000 subs then; 4 years later, over a million. Good job, 8-Bit Guy. Great content, I really enjoy it.

  • @98SE
    @98SE 5 років тому

    very well explained video, thank you!

  • @kaltblut
    @kaltblut 7 років тому +4

    Yeah, it's mostly because computers of the last decade have been able to perform basic tasks for home users quite well. Even though websites get fancier and videos have higher resolution, which all needs more power, it's still quite tame, and CPUs and graphics cards have been able to handle that for quite some time. Based on what it is, there is only so much processing power it needs. Same with office applications; they are more interactive and all, but in the end writing a letter is not so demanding.
    Back in the day it even made a huge noticeable difference, depending on what CPU you had, when you typed "dir" in DOS. Nowadays, not so much.

  • @PearComputingDevices
    @PearComputingDevices 5 років тому +8

    For years back in the late 90's and after, I was a developer for a company called Be, Inc. and their OS, BeOS. One of the problems with BeOS at the time, and it's still largely true about this OS, was the fact that there were no major developers. Today this isn't as big of a deal as it seemed back then, and to make matters worse Be, Inc. had a tough time attracting developers. At some point in early 1998, during a major meeting with management and shareholders, I gave my impression of the future, and part of that was the idea that we wouldn't need software like Office *gasp*. I proposed that set-top boxes could be the future and BeOS could be the underlying software that ran on them. A bit like WebTV but much less limited, since WebTV didn't offer much outside of a browser. This prompted Be, Inc. to research the idea of a very compact version of BeOS to fit in ROMs. It never went far, I admit. But I think we were indeed too far ahead. Ironically, a decade later.....

    • @jonwest776
      @jonwest776 Рік тому

      I had that OS. But no software.

    • @PearComputingDevices
      @PearComputingDevices Рік тому

      @jon west Well there was tons of software. Just few commercial packages. Something I didn't mind. I had the Chromebook vision long before Google though. Because if you could do it all online, what does it matter? Nobody buys a Chromebook today expecting additional software, let alone commercial. But the great news all along: No bloat. BeOS had very little bloat. Unlike Windows then and now.

    • @jonwest776
      @jonwest776 Рік тому +1

      @@PearComputingDevices The no bloat is why I bought it. I loved the whole concept of it. Unfortunately I used commercial software, and nothing I used migrated to it. I guess it shows what a monopoly does to any competition.

  • @ellooku
    @ellooku 5 років тому

    As I am watching for the first time, its over 800,000 subscribers. Congrats.

  • @EkanVitki
    @EkanVitki 5 років тому

    One of the important factors is that we're bottoming out on the physical possibilities due to the scale of the components vs the molecules they're made of... when it gets so thin (now 7nm, moving towards 5nm in 2020) higher frequencies bleed interference to neighbouring parts of the circuit, so you need to keep the clock speed down to limit this, and that's not to mention the extra heat caused by so much going on in a tiny area with so little material now to carry it away. At the moment the cheapest/easiest way around the problem is to increase parallelisation (as we are) by multiplying the numbers of cores. The next big jump will most likely come after we have mastered being able to produce CPUs out of other materials with characteristics that allow us to miniaturize even more (maybe graphene, maybe something else)

  • @nplowman1
    @nplowman1 5 років тому +6

    I think the perception that things have slowed down is also partly due to the fact that the off-the-shelf configurations we see on consumer laptops has been pretty stagnant for a while. For example, I bought an entry level Dell laptop for $500 in 2009 that had 4gb of ram and a 2ghz dual core processor. If you check a Best buy ad today, that's basically the same configuration you'll see on the cheaper entry level laptops. They've probably gotten cheaper, thinner and improved battery life since 2009, but otherwise not a lot has changed at the entry level.

    • @Chriserino
      @Chriserino 5 років тому

      I think you just described all current chrome books.

    • @EsotericBibleSecrets
      @EsotericBibleSecrets 5 років тому +1

      Actually new laptops completely suck, because they usually don't have any actual disk space. They are trying to promote this "cloud" crap, which means you can't access your stuff without an internet connection, and for all you know, the government can. They are trying to phase out desktops, but you shouldn't buy a desktop from a store anyway. Design it from the ground up instead, otherwise it'll be more expensive because it'll come with Windows 10 instead of 7 and a bunch of other stuff you don't really want. I would never buy that. How dare they sell us laptops with less than 100 gigs of disk space!

    • @theravenmonarch9441
      @theravenmonarch9441 5 років тому

      @@EsotericBibleSecrets there are gaming laptops that ONLY come with a 256 or 512GB SSD... GAMING and half a TB and LESS. So it's not because of cloud, it's stupidity.

    • @dragunovbushcraft152
      @dragunovbushcraft152 5 років тому

      @@EsotericBibleSecrets How dare you BUY them. If you went to a good, Lenovo T440p, with nVidia graphics, you have a VERY powerful machine, that is a halfway decent gamer, and will cost you between $160-$400, refurbished. Stop buying their crap, and they'll stop making it.

  • @TKnightcrawler
    @TKnightcrawler 8 років тому +17

    LOL at the Lemmings music. ;-)

    • @TheFoodnipple
      @TheFoodnipple 7 років тому +2

      I knew that sounded familiar!

    • @firstcoastdude
      @firstcoastdude 7 років тому +2

      Loved that game

    • @thesnare100
      @thesnare100 7 років тому +3

      I remember it was one of the "tricky" (titled) levels that I couldn't finish on the SNES version, how far did you get?

    • @TKnightcrawler
      @TKnightcrawler 7 років тому +2

      thesnare100 I was on the PC, so I had a mouse to use. I got about half-way through Mayhem, I think.

  • @icecreamjunkie6790
    @icecreamjunkie6790 Рік тому

    It's an interesting question, and I agree with each of these theories and love how they all connect with each other!

  • @Congslop
    @Congslop 3 роки тому

    Going through to watch every episode ever

  • @fuzzyfoyz
    @fuzzyfoyz 4 роки тому +3

    Would love to see an update of this, given the shift to cloud at present and possible future shift to blockchain.
    I would say that the shift from desktop apps to SaaS is where hardware speed requirements are becoming a thing of the past. At least as far as hardware for joe public is concerned anyway.

  • @Forlorn79
    @Forlorn79 7 років тому +5

    Older computers used to have a CPU that handled everything in a single thread. Now PCs have multiple threads and GPUs to handle graphics, so the main difference is time. A PC with better components will be able to do more in less time, while a computer with older components will be able to do fewer tasks and will take longer. Gamers often upgrade the GPU first, because that is the new limitation for high-end graphics, while other users might require more CPU power for handling many threads and data crunching. For instance, an older PC can watch UA-cam, but maybe not at the highest resolution, and maybe not while doing other things.

  • @KornflakezRandomStuff
    @KornflakezRandomStuff 5 років тому

    Always love the Intro!

  • @samuelschwager
    @samuelschwager 5 років тому

    Three years later and you're almost at a million subscribers, wow! I can imagine a future where only gamers & professionals have a PC, everyone else is happy with a smartphone/tablet/notebook. I think some people are already there.

  • @Stigsnake5
    @Stigsnake5 8 років тому +14

    07:05 I wouldn't say Windows 8 was a failure in performance; the GUI was terrible for PCs and non-touchscreen laptops, and maybe some other design choices were subjective, but otherwise it was a performance increase over previous versions.

    • @TheRadmin1724
      @TheRadmin1724 8 років тому +4

      And that's why I use Windows 7.

    • @TheRadmin1724
      @TheRadmin1724 8 років тому

      ***** I'm upgrading to Windows 10 anyways.

    • @neeneko
      @neeneko 8 років тому

      +Blaze I think the idea was not that Windows 8 was a failure in performance, but that its marketing failure resulted in a larger-than-expected number of machines remaining on Windows XP and 7

    • @videotape2959
      @videotape2959 8 років тому

      +SilverGenerations Studios Be careful with that. In my opinion Windows 10 is even worse than 8.

    • @TheRadmin1724
      @TheRadmin1724 8 років тому

      VideoTape Not really.
      Privacy issues are a big problem in Windows 10, but the other features are very good additions; well, except that there are only two options in Windows Update.

  • @Geardos1
    @Geardos1 8 років тому +19

    If you look at certain industries the difference is even more profound.
    Compare what you needed to edit video in 1995, 2005, and 2015.
    1995: specialized expensive workstations and hardware for SD video
    2005: Reasonably powerful computer for SD video
    2015: nearly any new computer can edit HD video

    • @TheRadmin1724
      @TheRadmin1724 8 років тому

      Good.
      Sony Vegas is still going after 2015 on my Sony VAIO.

    • @Mandragara
      @Mandragara 8 років тому +5

      +Geardos And now people demand 4K video for their 1080p displays

    • @Geardos1
      @Geardos1 8 років тому

      1080 is fine for viewing if you have a display smaller than 60 inches

    • @mathieunouquet1928
      @mathieunouquet1928 8 років тому

      +Geardos only because the video is handled in hardware, and not in software. Note that the NLE part is one thing, but encoding is a whole different one.

    • @Geardos1
      @Geardos1 8 років тому

      Encoding got way faster too.. Encoding SD video used to be painful.. Now I can encode HD video relatively quickly.

  • @ashtonmark24
    @ashtonmark24 5 років тому

    I've been building computers for years and recently repasted my £2k GS65... But it was repasting an old Amilo Pro that got me... No screen on reboot. Checked all connectors, powered off, took everything out and rebuilt step by step. No screen. Tried the power and RAM trick and it's now working again. Thanks v much!

  • @lostcarpark
    @lostcarpark 5 років тому

    Good video and interesting discussion. As you point out, the one thing that has stopped increasing at exponential rates is clock speed, but I don't think that has slowed down Moore's law too much. What has changed is the range of hardware levels that operating systems are expected to perform on. The advent of low cost computers such as Chromebooks and even the Raspberry Pi mean that OSs have to be designed to perform reasonably well on hardware well below their "recommended" specs. They do this by shutting down some of the functionality like "glass" display layers that look pretty, but make heavy use of the graphics processor, but don't ultimately make that much difference to the user. You'll also find that a high end computer will let you open dozens of browser tabs before you'll notice any impact on performance. Browser tabs don't consume a lot of CPU power, but they do take a ton of RAM, so on a low-end machine, you'll really see performance go through the floor when you have 3 or 4 tabs open.

  • @PixelOutlaw
    @PixelOutlaw 8 років тому +15

    What is really killing computers these days is the weight of the web browser. They are more a virtualized JavaScript and plugin circus than HTML browser. It is shameful how resource hungry software has become. I think a lot of this comes from layer upon layer of abstraction in software libraries. Each library needing 2 or 3 more libraries to function. The browser is like the .pdf of the program world. It is expected to do everything and support everything to a fault. You hit the nail on the head it is software getting sloppier and sloppier. That said, I quite like writing things in Lisp and Python rather than C++ and ASM.

    • @coopergates9680
      @coopergates9680 7 років тому +1

      Lol what do you think of the OOP craze in the last few years? I write in Java but I don't
      miss OOP much when hopping back to C. I have found Java does run very fast for a non-native language.

    • @fgvcosmic6752
      @fgvcosmic6752 6 років тому

      What's Lisp and ASM?

  • @jovetj
    @jovetj 4 роки тому +22

    This video is outdated. Time for a 5 year update! Seems to me that Moore's Law has, indeed, ended.

    • @TheFallingFlamingo
      @TheFallingFlamingo 4 роки тому +3

      People have been saying that since the early 2000s, computer engineers continue to ignore them and make progress. The only people who think Moore's Law is dead are those who profit off a slower stream of updates. Unfortunately for them, there's always going to be potential profit in making things smaller in tech, if they don't make the move eventually someone else will. *Cough* Intel and Nvidia *Cough*
      Realistically there's a lot of untapped potential in our chip design; the trouble is a lack of competition driving innovation. Intel is working on stackable architecture, which in practice would eliminate the limited space we run into working strictly in a single plane, but who knows what decade they'll deem worthy to release it in. Someone could discover another material that better retains strong currents on a nanometer scale, allowing for even smaller transistors. We could even find a better alternative to transistors, or find advancements for an alternative method that already exists, like the spin transistor.

    • @1pcfred
      @1pcfred 4 роки тому +4

      It only looks like Moore's Law has ended when you only consider the consumer market. Moore's Law was never about the consumer market though. The Law is about the highest end devices. There it is still going strong.

    • @thetrashmann8140
      @thetrashmann8140 3 роки тому

      AMD might have kick-started Moore's law back up again, but only time will tell. AMD managed to get a 7 nm process node working and Intel is still on 14 and 10 nm process nodes (14 being for low end and 10 for high end), and because of that AMD is able to fit a lot more cores on the die than Intel, with less power consumption and heat as well. In the graphics card space AMD is catching up to Nvidia, but Nvidia is still in the lead, even if only barely in some cases.

    • @Koffiato
      @Koffiato 3 роки тому

      Except computing power kept multiplying

  • @Phryj
    @Phryj 3 роки тому +1

    This would be a good topic to revisit. One thing to consider: it's getting much more difficult to further shrink down components. At the same time, not only are we seeing more and more cores, but better use of the die space in each core, allowing for faster and more efficient operation, more advanced extended instruction circuitry, and multiple execution pathways in each core. Modern CPUs can get a lot more done per clock cycle, so although we're not seeing the raw speed increases we once were, CPUs are still getting better.

    • @morganrussman
      @morganrussman Рік тому +1

      I definitely do agree that this would be a good topic to revisit, considering that almost 7 years later we seem to be at the point where computer specifications are hitting that upward part of the curve again.

  • @quenguin2178
    @quenguin2178 4 роки тому +1

    TBH, we started getting affordable solid state drives around 2010-ish too, and they speed up application loading and boot times. Another thing we covered in college was that around 2004-2006 multicore CPUs started coming in, so you had multiple CPU cores to spread the load of the application (in theory). We also started to see an increase in IPC (instructions per clock), so CPUs can do more work per clock cycle than they could before

  • @leeverink32
    @leeverink32 7 років тому +12

    our CNC machines at work still use windows xp even in 2016!

    • @snbeast9545
      @snbeast9545 6 років тому +4

      Dedicated industrial/business machines (ATMs, CNC machines, cash registers, etc.) don't need OS upgrades, unless they're connected to the Internet and the OS they're running is no longer receiving security updates. In that case, upgrade ASAP!

    • @jonross377
      @jonross377 5 років тому

      The reason is that they would have to buy a new version of their software, whichever it is that they use. I am a machinist also, and we just upgraded our software and computers; it was in the hundreds of thousands for the software upgrades... Most companies are reluctant to do something like that.

    • @ragnarb8331
      @ragnarb8331 5 років тому

      Bro, our big machine lathe’s are using windows 2000 😂

    • @common_c3nts
      @common_c3nts 5 років тому

      Our CNC machines still run dos and use floppy disks. LOL

    • @magnusevald
      @magnusevald 5 років тому

      I work at a lab where we still use a Macintosh from 1984 to do a breath analysis because the software does not exist on anything else :D

  • @deusexaethera
    @deusexaethera 4 роки тому +4

    Yeah, measuring processing power via clock speed ignores a ton of ancillary information, because even though clock speeds have leveled off, single-core performance continues to increase thanks to ever-more complex circuits in each processor core that handle common computing tasks orders of magnitude faster than software possibly could.

  • @robertojunior8711
    @robertojunior8711 5 років тому

    The best technology UA-cam channel, you deserve the popularity

  • @ahsanmoazzam3805
    @ahsanmoazzam3805 4 роки тому

    Starting video sounds awesome