Your NEW PC will be Irrelevant…

  • Published 27 Jun 2024
  • Current x86 processors have been ruling the computing landscape, but their progress looks like it's slowing down. It's been a LONNNNG time coming, but ARM architecture for our PCs looks like it's finally coming around.
    ==JOIN THE DISCORD!==
    / discord
    We saw recently with Apple's M-series silicon that ARM offers many improvements over the legacy x86 processors that we have been using for so long. However, Apple has had an easier time making this transition than Windows, because macOS can be vertically integrated, where Windows has a lot more struggles. But Windows has been improving significantly, and NOW we are seeing new ARM chips like the Snapdragon X Elite SoC that will be able to take advantage of it this year. Is ARM the future?
    GN: • Intel's 300W Core i9-1...
    HUB: • Intel CPUs Are Crashin...
    Optimum: • Apple's Fastest Mac vs...
    Dave2D: • The Biggest Moment For...
    MaxTech: • Snapdragon X Elite vs ...
    www.apple.com/newsroom/2023/1...
    nvidianews.nvidia.com/news/nv...
    nvidianews.nvidia.com/news/nv...
    resources.nvidia.com/en-us-gr...
    en.wikipedia.org/wiki/X86
    / how_does_x86_licensing...
    www.quora.com/Why-is-it-that-...
    en.wikipedia.org/wiki/RISC-V
    semiconductor.samsung.com/new...
    www.arm.com/partners/catalog/...
    en.wikipedia.org/wiki/Google_...
    www.theverge.com/2024/4/9/241...
    www.theverge.com/2023/10/23/2...
    www.intel.com/content/www/us/...
    blogs.nvidia.com/blog/mediate...
    www.qualcomm.com/products/mob...
    / how_is_the_app_support...
    learn.microsoft.com/en-us/win...
    / arm_hardware_ram_upgrade
    forums.macrumors.com/threads/...
    0:00- Are things actually getting faster?
    2:53- Progress slowing down
    3:40- Most exciting silicon of recent years
    6:30- ARM the future?
    7:34- x86 and why it may become irrelevant
    12:03- ARM is OPEN
    13:42- Efficiency
    16:37- Competition with ARM
    17:42- The Snapdragon X Elite chip is looking awesome for Windows
    18:23- ARM's negatives
    21:40- ARM has crazy potential
  • Science & Technology

COMMENTS • 1.3K

  • @vextakes
    @vextakes 1 month ago +312

    My bad I misspoke, meant to say RISC-V is open-source not ARM and it seems to be easier to acquire a license compared to x86 because ARM doesn’t make their own chips (yet).

    • @ninjabastard
      @ninjabastard 1 month ago +2

      ARM makes their own chips. It's just that they also license their CPU cores at low fees (30 cents per chip for Apple) for others to integrate into their own chips or SoCs (Apple, Qualcomm, Nvidia, etc.). You can find ARM Cortex-A series chips in many embedded devices and phones. ARM has traditionally not been big in the desktop or server market, where x86 legacy support has been the dominant factor for business. There are a few companies moving to RISC for desktop or server, such as AWS with their Graviton servers using RISC processors. But it's really only Apple who has a decent desktop/laptop consumer offering at the moment.
      RISC-V is an interesting project. But it's not really that mature, and whatever processors do exist are like 10-15 years behind current processors.

    • @sleepingvalley8340
      @sleepingvalley8340 1 month ago +8

      As someone who does buy SoC systems for gaming, the closest thing I can compare it to is the BD790i: you still get the x16 PCIe slot, RAM slots, M.2 slots, SATA ports and other misc connectors, but it all comes in one neat little package. Most people will just buy it as a Mini PC prebuilt, though, because it is easier.

    • @helpmedaddyjesus7099
      @helpmedaddyjesus7099 1 month ago

      I was about to comment on that lol, thanks for the correction.

    • @NoX-512
      @NoX-512 1 month ago +23

      RISC-V is not open source, but an open spec. It means you are free to make RISC-V cores without asking for permission or paying license fees. The cores you design can be either open or closed source. That's up to you.
      RISC-V will dominate in the future because it's an open specification, not because the design is brilliant (which it is).
      Currently, most RISC-V cores are embedded. You probably already own several products with RISC-V cores. For example, Western Digital uses RISC-V for their storage controllers.

    • @aleksazunjic9672
      @aleksazunjic9672 1 month ago +11

      @@NoX-512 RISC has been "dominating the future" for the past 25 years 😁 On a serious note, I remember bombastic headlines about RISC taking over back in the late 90s. It did not happen, mostly because RISC CPUs were never fast enough to justify abandoning the enormous x86 software library.

  • @temperedglass1130
    @temperedglass1130 1 month ago +715

    Are you threatening me about my imaginary PC?

    • @user-zw1oy5pm3s
      @user-zw1oy5pm3s 1 month ago +12

      haha

    • @Redditard
      @Redditard 1 month ago +8

      sucks to suck
      edit: it's me

    • @lamquythestupid
      @lamquythestupid 1 month ago

      Let's just build a gas stove PC

    • @Choom2077
      @Choom2077 1 month ago +1

      I seriously pity all of you broke boys
      ...for being in the same exact situation as me.😐

    • @BraxtonHutchins
      @BraxtonHutchins 4 days ago

      I loathe the conversation around GPUs, because no one seems to understand that the RTX 30 series changed the target audience for graphics cards. Gamers are no longer the focus of developing more powerful graphics cards; they are designed for heavy-duty workloads, and video games just fall into that category. I built a PC with a 2060S right before the 30 series dropped, and on one hand I kick myself. On the other hand, my 2060S with the rest of my rig has been able to play just about anything I put on it, and I've had the build for about 5 going on 6 years now. I upgraded my monitor, and my next step will be to either upgrade my GPU or just wait a few more years and build a new rig.

  • @Pouria_1664
    @Pouria_1664 1 month ago +1148

    what do you mean, my pc is already irrelevant

    • @KattarAthiest
      @KattarAthiest 1 month ago +8

      can you tell which game this is? 14:37

    • @Pouria_1664
      @Pouria_1664 1 month ago +3

      @@KattarAthiest i have no idea what the game is, looks like an indie game though

    • @ricky4673
      @ricky4673 1 month ago +11

      What do you mean, I am irrelevant 😅

    • @kira991
      @kira991 1 month ago +6

      @@KattarAthiest Jusant

    • @haniawaja9311
      @haniawaja9311 1 month ago +2

      @@KattarAthiest I think it's Jusant

  • @larrythehedgehog
    @larrythehedgehog 1 month ago +158

    Please be aware: Apple is already hitting the same wall that all the CISC chip makers are at. The wall's name is the laws of physics. The M series has also jumped in wattage gen over gen for its improvements.

    • @totallyrealcat4800
      @totallyrealcat4800 1 month ago +18

      Yeah, while using a different chip design improves performance, we're approaching the limits of silicon and will have to start using new materials to get even more performance

    • @andyH_England
      @andyH_England 1 month ago +18

      That is true, but they are starting from a much lower wattage-per-performance metric, which means they have an inherent advantage. The M4 chip should add some headroom, as it is considerably more efficient than the M3, which used N3B, a node that favored performance over efficiency. The M4 on N3E will be the opposite.

    • @wadimek116
      @wadimek116 1 month ago +3

      @@andyH_England They have an advantage until they start adding instructions to their CPUs. Even now x86 is just easier to work with and it's much more compatible with everything. I doubt most programs are even ported to ARM.

    • @TamasKiss-yk4st
      @TamasKiss-yk4st 1 month ago

      The Apple M series has used the same design since the M1, so the M3 was just a boosted M1 with a better manufacturing process, and yes, that basically increased the power consumption. But check out the M4, made with N3E instead of N3B (N3B is only an enhanced version of the enhanced 5nm; all the 4nm nodes and even the first 3nm were just yearly 5nm enhancements, while the current N3E is the new technology). It already shows a huge performance jump with less heat generation, and so with less power consumption.

    • @crisalcantara7671
      @crisalcantara7671 1 month ago +2

      so the new iPhone will summon lightning to play games lol

  • @internetbestfriend
    @internetbestfriend 1 month ago +573

    RISC-V is royalty free... but ARM, unfortunately, is proprietary. Both are RISC.

    • @dex4sure361
      @dex4sure361 1 month ago +81

      Yup he didn’t realize ARM and RISC-V are 2 different things, even if both are RISC.

    • @ricky4673
      @ricky4673 1 month ago +1

      He knows, it is irrelevant to his points.

    • @ThePgR777
      @ThePgR777 1 month ago +42

      @@ricky4673 lol? It isn't royalty free and it isn't open

    • @dex4sure361
      @dex4sure361 1 month ago +27

      @@ricky4673 he clearly didn't know, based on what he said in the video

    • @RavenZahadoom
      @RavenZahadoom 1 month ago +37

      Vex has made these types of mistakes a lot recently... a bit more time researching instead of trying to get a video out every X days would be better for him for sure.

  • @karathkasun
    @karathkasun 1 month ago +218

    Guess what? ARM is nearly as old as x86 and carries the same baggage.
    All of these "hot takes" on RISC vs CISC ignore the fact that everything is RISC under the hood with CISC front ends now, including both ARM and x86. ARM is just designed to run in a tighter power envelope and has been for decades. AMD chose the middle ground with Zen, and Intel bet the farm on huge FAST cores that eat power. It has nothing to do with the underlying architecture at this point.

    • @HappyBeezerStudios
      @HappyBeezerStudios 1 month ago +28

      And the whole CISC vs RISC argument is truly something of the 90s. Nowadays both share so much design.
      The Intel P6 design translates x86 CISC instructions into RISC-like micro-ops. The AMD K5 is based on a highly parallel RISC design with an x86 decoding frontend.
      So every CPU from the two big x86 manufacturers since the mid-90s is internally much closer to a RISC design than a CISC design, but offers a legacy x86 CISC interface on the outside.
      And modern RISC designs are as complex as their CISC contemporaries.
      Interestingly, Intel's most efficient era was when AMD wasn't breathing down their neck.
      I'm talking P6/Pentium M-based Core 2 (which easily outpaced K8) to about Kaby Lake (which is just Skylake with higher clock scaling, and was quickly succeeded by another Skylake design with less efficiency to combat Zen).
      Before that was NetBurst, the hyper-inefficient, ultra-long-pipeline design that came to be because IA-64 turned out to be bad.
      And afterwards, when AMD offered real competition again, Intel started forcing their CPUs to run way past their optimal efficiency, basically factory overclocking them.
      Even IBM had the POWER6, a RISC design scaling up to 4.7-5.0 GHz with the same low efficiency seen from NetBurst, Itanium, Bulldozer/Steamroller and Alder Lake.
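
      To make that micro-op idea concrete, here is a minimal toy sketch in Python (the mnemonics and micro-ops are made up for illustration, not any real decoder) of how a CISC front end breaks one "complex" memory-to-register instruction into RISC-like steps:

        # Toy model of a CISC front end decoding into RISC-like micro-ops.
        # Mnemonics and micro-op names are invented for illustration only.
        DECODE_TABLE = {
            # One "complex" instruction -> several simple micro-ops.
            "ADD [mem], reg": ["LOAD tmp, [mem]", "ADD tmp, tmp, reg", "STORE [mem], tmp"],
            # Simple register-only instructions map roughly 1:1.
            "ADD reg, reg":   ["ADD reg, reg, reg"],
        }

        def decode(instruction: str) -> list[str]:
            """Return the micro-op sequence the back end would actually execute."""
            return DECODE_TABLE[instruction]

        for ins in ("ADD [mem], reg", "ADD reg, reg"):
            print(f"{ins!r} -> {decode(ins)}")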

    • @karathkasun
      @karathkasun 1 month ago +5

      @@HappyBeezerStudios Absolute facts.

    • @mhavock
      @mhavock 1 month ago +2

      If the hardware is ready, then it's the perfect time for an ARM/Linux OS with translation code for x86 etc. to take the stage.
      Let's build PCs like that!

    • @mintymus
      @mintymus 1 month ago

      It's cool to hate Intel and shill for AMD/ARM. In reality no companies care about any of us; they just care about our $$ and how to separate us from it. Another thing Vex missed is that AMD is more efficient, and they're x86 based. He totally missed the point about node size.

    • @jclosed2516
      @jclosed2516 1 month ago +4

      Yep - the first (relatively affordable) home computer with a RISC processor was the Acorn A305 and A310. Those were sold around 1987. I owned one, and learned machine code programming for the first time on that chip. We have come a long way since then.

  • @PlayingItWrong
    @PlayingItWrong 1 month ago +137

    The AMD X3D processors are genuinely more efficient, partly by virtue of their shortcoming in removing heat (the stacked cache keeps clocks and voltage down), and even outside of gaming they definitely have use cases.

    • @lycanthoss
      @lycanthoss 1 month ago +4

      Intel chips are more efficient at low power usage simply because of the chiplet nature of Ryzen chips. High power usage at max load does not indicate how much power a CPU uses at small workloads.

    • @PlayingItWrong
      @PlayingItWrong 1 month ago +1

      @lycanthoss absolutely, I only meant in comparison with other AMD CPUs.

    • @YountFilm
      @YountFilm 1 month ago +10

      @@lycanthoss Who's getting these chips for small workloads...?

    • @lycanthoss
      @lycanthoss 1 month ago +7

      @YountFilm older games, or things like browsing the web, using Excel/Word, etc. "Small workload" might be the wrong term here; "lightly threaded" is probably better for what I mean.

    • @HappyBeezerStudios
      @HappyBeezerStudios 1 month ago

      @@lycanthoss For that kind of workload a well-optimized Intel design might actually compete well with Ryzen.
      For a very narrowly threaded fixed task (like mp3 encoding, which has a set amount of work and is single threaded), the Intel P-core might simply be done faster and go back to a low power state earlier.
      (That is why I overclocked my Core 2 Duo back in the day. It was about 20% faster but drew only about 5% more power, meaning overclocking made it more efficient.)
      And for lightly multithreaded workloads with low per-thread requirements, a block of Intel E-cores might offer sufficient performance at lower power draw. If four "slow" cores are still enough to render a website or run old games at the targeted framerate, they might do better.
      (On my slightly older i5 I've set up different power plans with lower maximum clocks, because they are sufficient for older games. Same with my GPU: I have a profile set to a 50% power limit with reduced clocks, which is still enough to run old games.)
      Sadly nobody tends to test that kind of use.
      Just like nobody tests APUs using their strong iGPU against "pure" CPUs with dedicated cards, or APUs with added dedicated cards against a CPU/dGPU setup. They tend to be run only against CPUs with their weak iGPUs, in which case the APU always wins.
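
      That race-to-idle overclocking claim checks out arithmetically for a fixed task; a quick sketch using the comment's own numbers (20% faster, 5% more power; everything else is normalized):

        # Energy per fixed task = power x time. Numbers from the comment above.
        baseline_power = 1.00   # normalized package power
        baseline_time  = 1.00   # normalized time to finish the task

        oc_power = baseline_power * 1.05   # "about 5% more power"
        oc_time  = baseline_time / 1.20    # "about 20% faster"

        baseline_energy = baseline_power * baseline_time
        oc_energy = oc_power * oc_time

        print(f"energy ratio (OC / stock): {oc_energy / baseline_energy:.3f}")
        # -> 0.875: ~12.5% less energy per task, because the chip
        # races back to idle sooner than the extra power costs it.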

  • @Carnage_Lot
    @Carnage_Lot 1 month ago +353

    Your cat is drinking your water at 7:40 lol

    • @ricky4673
      @ricky4673 1 month ago +43

      Never saw a cat, you have catovision. You could be a superhero that finds missing cats. 😮

    • @garystinten9339
      @garystinten9339 1 month ago +28

      Brought to you by NVIDIAs RTX series graphics cards.

    • @asdfjklo124
      @asdfjklo124 1 month ago

      @@ricky4673 It's all about focus, check out the gorilla study (edit: that's also what pickpockets take advantage of)

    • @danielhayes3607
      @danielhayes3607 1 month ago +4

      That's not water 🥺

    • @bodasactra
      @bodasactra 1 month ago +14

      No, the cat found out Vex was drinking his water.

  • @ilovelimpfries
    @ilovelimpfries 1 month ago +198

    When this kid said 1978 was sooo long ago and eligible for a midlife crisis, I felt attacked.

    • @TropicChristmas
      @TropicChristmas 1 month ago +16

      Bold of him to assume I'm mid-life. I was born in 84 and I still figure I'm 2/3 life

    • @Thomas_Angelo
      @Thomas_Angelo 1 month ago +7

      You are all ancient. People call me an old timer even though I was born in 2005. Just because the PS2 and DVDs existed in my time doesn't mean I am old lol.

    • @JohnnyEMatos
      @JohnnyEMatos 1 month ago +20

      ​@@Thomas_Angelo you need to be like 50 before I consider you old

    • @Thomas_Angelo
      @Thomas_Angelo 1 month ago +3

      @@JohnnyEMatos That's not what the new kids are saying

    • @TropicChristmas
      @TropicChristmas 1 month ago +4

      @@Thomas_Angelo
      settle down, greybush

  • @vstxp
    @vstxp 1 month ago +144

    Dude, the reason why we are still on x86 is backward compatibility. Apple's M-series has a ton of issues that their x86 compatibility layer barely fixes. Microsoft is also trying to make Windows on ARM work with programs made/compiled for x86, but it barely works. We have a LONG way to go before we can shed x86 completely, don't let the headlines fool you.

    • @henson2k
      @henson2k 1 month ago +3

      Who is stopping developers from recompiling for ARM?

    • @Dragon_Slayer_Ornstein
      @Dragon_Slayer_Ornstein 1 month ago +17

      @@henson2k Legacy stuff won't be recompiled, so games by and large won't be; you will have to rely on emulation.

    • @aleksazunjic9672
      @aleksazunjic9672 1 month ago +16

      @@henson2k No one in their right mind would rewrite the whole PC software library to run on RISC processors. I said rewrite, not recompile, because lots of stuff was tailored specifically to run on x86 or x64 CPUs. Furthermore, more x86-compatible software is written every day. The only way RISC could win is if someone makes a RISC CPU that runs x86 through an emulator at a decent speed and a decent price, which is unlikely.

    • @vstxp
      @vstxp 1 month ago +7

      @@henson2k Sadly, it is never that easy. Because the instructions are entirely different, an entirely new toolkit is required. There are bound to be bugs, to put it mildly. There is also the issue that a lot of software relies on older pieces of software in general.
      Furthermore, I think Vex should have mentioned that most 3D libraries and games, both on the PC and on the main consoles, use certain complex CPU instructions for 3D rendering and for supporting GPU operations. Most of the big games we play use drivers and libraries that simply cannot work, either at all or with the same performance, on anything other than x86-64. Sadly, advanced 3D rendering in general is actually better on an x86 CPU than any other, no matter what we might say now. And do not let the synthetic benchmarks fool you: the applications run on the different platforms are not even the same, so most of the numbers are meaningless. Several simple day-to-day apps can run well on RISC because compiling them for the new environment is relatively easy, but other advanced apps REQUIRE specific complex instructions. CISC computers aren't going away anytime soon, and there are several scenarios they are better for, and yes, that sadly includes 3D and gaming, and it will for some more years for sure. Do not think your PC is getting obsolete anytime soon.
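
      One concrete way to picture why a port is more than a recompile: apps often dispatch at runtime between an ISA-specific fast path and a portable fallback. A minimal hedged sketch (the blur functions are hypothetical stand-ins; only platform.machine() is a real API):

        import platform

        def blur_x86_fast(pixels):
            # Stand-in for a hand-tuned x86-64 path; a real app would call
            # a native extension built with SSE/AVX intrinsics here.
            return [p // 2 for p in pixels]

        def blur_portable(pixels):
            # Plain fallback any ISA can run, typically slower in real code.
            return [p // 2 for p in pixels]

        def blur(pixels):
            # platform.machine() reports the host ISA: 'x86_64'/'AMD64'
            # on x86-64 machines, 'arm64'/'aarch64' on ARM machines.
            if platform.machine().lower() in ("x86_64", "amd64"):
                return blur_x86_fast(pixels)
            return blur_portable(pixels)

        print(blur([10, 20, 30]))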

    • @ivok9846
      @ivok9846 1 month ago +6

      can i build a risc pc for $300?
      that draws less than 100w from the wall at 100% cpu?
      right now?
      i just did that with x86-64 a few months ago...

  • @guydude4124
    @guydude4124 1 month ago +19

    ARM fans have been saying this for years and nothing has come of it. ARM has more issues than this.

  • @mleise8292
    @mleise8292 1 month ago +202

    Bruh, ARM itself is 41 years old. 😂

    • @Riyozsu
      @Riyozsu 1 month ago +9

      Yet it took decades to make itself known.

    • @nadtz
      @nadtz 1 month ago +16

      @@Riyozsu ARM has been used in all kinds of devices for years, I think you mean make itself known in consumer desktop hardware. MS tried (very badly) with Windows/Surface RT a while back so I grudgingly have to give Apple credit for making it very clear that ARM is viable for desktop systems.

    • @Skylancer727
      @Skylancer727 1 month ago +3

      @@nadtz Even if it is viable, it has always been overhyped. The value of ARM over x86 has always been a talking point based on theory rather than proof. The same fat-trimming that lets ARM run better could also come from rebuilding Windows from scratch rather than building on existing versions. There are still remnants of DOS in Windows today. That is far more significant than the hardware.

    • @nadtz
      @nadtz 1 month ago +2

      @@Skylancer727 There are OSes aside from Windows, and there is a reason hyperscalers and the like are looking to move (or are moving) to ARM where CPU compute isn't the priority. Whether Qualcomm's first offerings are going to be as good as they claim is still up in the air, but either way it's just a matter of time before decent/good ARM offerings come to consumer PCs/laptops.

    • @Gen0cidePTB
      @Gen0cidePTB 1 month ago

      @@nadtz Hyperscalers were on ARM back when Opteron was teaching Intel what multicore was all about. 😂
      The big selling point of this transition to ARM for Windows is that it's the first clean break they will have had since Windows 95. They will be able to get rid of lots of legacy code and streamline the OS, and because that code will still have to be there on x86, people will think it's the new ARM CPUs. This clean break will also get them away from Intel, which was stagnant on 14nm and then 10nm for a decade.

  • @jredubr
    @jredubr 1 month ago +144

    Dude, Nvidia uses ARM because it can't use x86.

    • @OneAngrehCat
      @OneAngrehCat 1 month ago +69

      x86-64 is also known as AMD64 because AMD designed the spec.
      Intel pays a license to AMD every year.
      Nvidia would rather launch the nukes all over the earth than pay AMD a penny.
      So they've been trying with ARM instead, lol

    • @terliccc
      @terliccc 1 month ago

      what do you mean can't?

    • @frankseyen9156
      @frankseyen9156 1 month ago +22

      @@terliccc Because they don't have an x86 license. If they had bought VIA, they would have one.

    • @Darth-TBAG
      @Darth-TBAG 1 month ago +10

      @@OneAngrehCat The irony is that the CEOs of both companies are relatives lol 😂

    • @torque4394
      @torque4394 1 month ago +14

      @@OneAngrehCat It's not that they don't want to pay, it's that AMD would not license it out to them, and why would they? The only reason Intel licenses AMD64 is that the two have IP that they cross-license, both critical to making a modern CPU.

  • @ThisGuyDrives
    @ThisGuyDrives 1 month ago +62

    A PC is never irrelevant. They can become outdated, but never irrelevant. All depends on what games you want to play.

    • @salvadordollyparton666
      @salvadordollyparton666 1 month ago +4

      i hear they can even do other things besides gaming... i know, crazy. and a LOT of those other things are even LESS demanding. absolutely insane. like, i could be using a 3rd gen i5 now and not even pushing it above 10%... hypothetically. because in this entirely hypothetical situation my 4th gen boards didn't somehow all go on strike at once while not even being installed. and because of all these idiots paying ridiculous money for 2% gains, prices stay high, and i refuse to pay retail for a 12th gen cpu to finally use my 690. yeah, stupid rant on a kinda dumb comment.

    • @saricubra2867
      @saricubra2867 1 month ago +2

      I played Counter-Strike 2 at 80fps (GPU bottleneck) with a CRT monitor on an i7-4700MQ laptop that is 11 years old; gaming is not CPU intensive.

    • @vyruss9348
      @vyruss9348 1 month ago +3

      @@saricubra2867 It depends on the games.
      Cities: Skylines 2, the Total War games, Star Citizen, and ARMA 3 are all very heavily CPU-bound.

    • @saricubra2867
      @saricubra2867 1 month ago

      @@vyruss9348 Laughs in Dragon's Dogma 2.

    • @talison461
      @talison461 1 month ago +1

      Sure, have fun playing your ps1 graphics games lol 😂😂

  • @UltraVegito-1995
    @UltraVegito-1995 1 month ago +264

    ah yes, Qualcomm, the NVIDIA of Android...
    Hope MediaTek breaks the ARM PC monopoly

    • @Bolt451
      @Bolt451 1 month ago +58

      to be fair, Qualcomm has done pretty well and hasn't massively screwed over consumers, at least as far as I know

    • @azravalencia4577
      @azravalencia4577 1 month ago +37

      Qualcomm is more like Intel instead of Nvidia though.

    • @navilzawadkhan1213
      @navilzawadkhan1213 1 month ago +32

      Qualcomm is pretty fair imo, it's like Nvidia in its early stages

    • @viktorbaresic4180
      @viktorbaresic4180 1 month ago +2

      The exclusivity deal for WoA is expiring this year; MediaTek and Samsung will enter the laptop SoC market next year

    • @aviatedviewssound4798
      @aviatedviewssound4798 1 month ago +4

      Qualcomm is more like AMD since they're older partners.

  • @reD_Bo0n
    @reD_Bo0n 1 month ago +81

    CISC vs RISC doesn't matter that much.
    The name CISC was only coined after the introduction of the RISC concept.
    The summary would be: a CISC instruction does several things at once vs a RISC instruction does only one thing.
    Also, RISC-V is not the umbrella of all RISC processors; it's just one instruction set architecture which follows RISC principles, like ARM. And RISC-V is royalty free, unlike ARM.
    If you, as a CPU producer, want to use ARM technology you have to either buy a base ARM design and then modify it to your liking, or go the Apple way and buy a license for the ARM instruction set.

    • @rusty1253
      @rusty1253 1 month ago +1

      so... x86 will still be relevant or nah?

    • @YannBOYERDev
      @YannBOYERDev 1 month ago +7

      ​@@rusty1253 x86 won't disappear for at least 15 more years.

    • @AAjax
      @AAjax 1 month ago +10

      Agreed. Muddying the water even more is the fact that modern x86 CPUs first decode complex instructions into RISC micro-ops.
      Legacy instruction support on x86 is indeed a source of die bloat. But that doesn't have much to do with RISC vs CISC.

    • @johndavis29209
      @johndavis29209 1 month ago

      It does matter though? x86 is CISC, ARM/RISC-V are RISC based. There are differences in the fundamental designs of both.

    • @xianlee4628
      @xianlee4628 1 month ago

      @@YannBOYERDev Probably much more

  • @bogganalseryd2324
    @bogganalseryd2324 1 month ago +104

    clearly you haven't heard that they faked the benchmarks. they aren't even half as fast as claimed

    • @Riyozsu
      @Riyozsu 1 month ago +30

      That's the best thing about ARM: only showing benchmarks and not real-world performance. If the chip cannot beat the 7800X3D in gaming performance or the 14900K in productivity, or draw less power than a 7600 while delivering i7-level performance, it's a waste of sand. Or, if it wants to create headlines, it should be cheaper than an i3, so 60 bucks or something.

    • @bogganalseryd2324
      @bogganalseryd2324 1 month ago +7

      @@Riyozsu 100%

    • @Gen0cidePTB
      @Gen0cidePTB 1 month ago +22

      @@Riyozsu Remember when the M1 came out and people were saying it beat all i7s of the era? Yeah, they switched their tune to "it beats some i7 mobile CPUs on a fraction of the power" soon afterwards. The Snapdragon X1 won't even be able to be fairly benchmarked in Windows due to the codebase changes for Windows on ARM; you'll have to do it in Linux when it comes out.
      P.S. Geekbench was developed by an ex-Apple reviewer. The type that published for Apple-only sites. Can you smell the bias?

    • @hjf3022
      @hjf3022 1 month ago +1

      It's a claim, and an unsubstantiated one. I'll wait for tests once it's available to reviewers.

    • @bogganalseryd2324
      @bogganalseryd2324 1 month ago

      @@hjf3022 Yeah that is true, there are rumors going both ways so the benchmarks will settle it.

  • @user-ot3zm2rz2x
    @user-ot3zm2rz2x 1 month ago +27

    I remember hearing about how RISC was the future and would make x86 obsolete in the very near future back when I was 12.
    I'm 40 now.

    • @anonymousx6651
      @anonymousx6651 1 month ago +5

      To be fair, ARM has completely taken the smartphone market and may be more common than x86

  • @mythicalnewt7242
    @mythicalnewt7242 1 month ago +77

    You got one thing wrong: while RISC is royalty free, ARM is proprietary. Please correct it.

    • @karehaqt
      @karehaqt 1 month ago +4

      RISC is an ISA which ARM uses and charges licence fees for, whereas RISC-V is the ISA that is royalty free.

    • @wuza8405
      @wuza8405 1 month ago +17

      @@karehaqt RISC is a philosophy for how to create an ISA; RISC-V and ARM are ISAs that implement this philosophy.

    • @chefren77
      @chefren77 1 month ago +3

      @@wuza8405 Yes! The whole video is full of this basic misunderstanding. RISC and CISC are philosophies about how to design CPUs; they are not specific instruction sets.
      RISC-V is not the same as ARM; it's an architecture designed at UC Berkeley and released in 2010 as open source/royalty free. Berkeley's RISC designs originate in the early 1980s (RISC-I is from 1981) and strongly influenced ARM at the time.
      ARM was designed by the UK company Acorn Computers during 1983-1985, and designed to be low power in order to make it cheaper to manufacture (by being able to use a plastic rather than ceramic housing). Acorn's computers didn't manage to stay relevant, but in 1990 they spun off their architecture team into a separate company, Advanced RISC Machines (=ARM), which still designs and licenses the ARM architecture today.

    • @1DwtEaUn
      @1DwtEaUn 1 month ago +2

      @@chefren77 I still like saying Acorn RISC Machine

  • @bengolko2270
    @bengolko2270 1 month ago +18

    Minor distinction to be made: ARM is not open source. RISC-V is the open source architecture, while ARM is an architecture licensed from the ARM company. They are both RISC architectures, but ARM is a discrete architecture licensed by Apple for their M-series of chips.

  • @Vialli100.
    @Vialli100. 1 month ago +34

    I have a Ryzen 9 5900X, absolutely great CPU.
    Got a -30 curve all-core; it hardly ever gets over 60-62°C under full load. It's going to last a few years.
    I wouldn't buy anything made by Apple!

    • @ervinase9661
      @ervinase9661 1 month ago +1

      I have it too. It's so calming to have a powerful cpu. You don't need to check any game requirements, you just know it will run anything. Including creator programs.

    • @onatics
      @onatics 1 month ago

      @@ervinase9661 a 5900X won't run everything lol

  • @kevinwood3325
    @kevinwood3325 1 month ago +9

    So happy with AM4. In 2017 I spent $1,000ish on an R5 1600/B350/16GB/256GB/RX 580/700W rig. In 2022, I spent another $800 and now I'm running R7 5700X/B350/32GB/1TB/6800XT/700W. 7 years from the original build, and it's still considered a fairly good gaming PC.

  • @GimeilVR
    @GimeilVR 1 month ago +23

    these cpus/gpus are gonna be in VR headsets... just think about that.

    • @micmanegames695
      @micmanegames695 1 month ago

      Standalone PCVR.. that will be the day.

    • @anonymousx6651
      @anonymousx6651 1 month ago +1

      They have been from the beginning. Look at the chip in the Quest 1. GPUs already use a different architecture from CPUs, and the advantages of using an ARM system for VR mostly don't translate to graphics.

  • @XeqtrM1
    @XeqtrM1 1 month ago +74

    To be fair, it doesn't matter how good the M2 is when almost no games work

    • @caydilemma3309
      @caydilemma3309 1 month ago +12

      I mean, it matters for every use besides playing games lmao, but I know what you mean

    • @gbitencourt
      @gbitencourt 1 month ago +3

      @@caydilemma3309 what other uses? We are already fighting trash ports from console; we don't need to port from PC to ARM again. It will be trash

    • @nopenope1
      @nopenope1 1 month ago +8

      don't forget the (shared) RAM. Tim Apple (Accountant) does run the company ;) How much electronic waste is out there because of the 8GB models... at the least, lost potential/reduced usefulness in the long run.

    • @NatrajChaturvedi
      @NatrajChaturvedi 1 month ago +1

      It has always been the case with Mac. It's not an ARM problem, it's a Mac problem. Lower adoption and Apple's b.s. have driven devs away from wanting to develop for and port to Mac.
      However, both of those things are changing, and there could be an explosion of ports if Apple just plays its cards right.
      (Just like with X3D, Apple could have a more gaming-focused SKU or machine that gives more value to buyers, but Apple is Apple, so we don't know)

    • @bradleylauterbach7920
      @bradleylauterbach7920 1 month ago +2

      @@gbitencourt professional apps. Physics simulations, video editing, CAD, etc. It’s not all about gaming and it never has been. Gaming follows professional use.

  • @saricubra2867
    @saricubra2867 1 month ago +4

    It's a tradeoff. On x86, memory is very cheap; on ARM, it's very expensive, because you need significantly more bandwidth to feed a lot more instructions per second.
    Also, x86 can actually be significantly faster for some tasks vs Apple. I remember that Hardware Unboxed tested some video-related workloads that get a huge performance increase on x86 (i7-12700H) thanks to AVX2 vs the M1 Pro.

  • @xxyourhunterxx4044
    @xxyourhunterxx4044 1 month ago +5

    "My house was built in the 1940s, there's all this legacy stuff, there's all these BONES."

  • @tmsphere
    @tmsphere 1 month ago +11

    My old PC with a 1070 and a 6700K was so good and lasted 7 years, I decided to put it aside and build it a brand new case instead of upgrading.

    • @user-uj4gr9ql4m
      @user-uj4gr9ql4m 1 month ago

      >1070, 6700k
      >intel 6xxx
      >upgrading
      how do you think you can upgrade it without basically replacing the entire pc?

    • @uncommonsense8693
      @uncommonsense8693 1 month ago +1

      @@user-uj4gr9ql4m He said he isn't upgrading.
      He just built a new case for it.
      Reading comprehension and humor are not your strong points my autistic friend.

    • @user-uj4gr9ql4m
      @user-uj4gr9ql4m 1 month ago

      @@uncommonsense8693
      >humor
      i was trying to be funny there?

    • @uncommonsense8693
      @uncommonsense8693 1 month ago

      @@user-uj4gr9ql4m ... reading comprehension DEFINITELY not your strong point, friend.

    • @user-uj4gr9ql4m
      @user-uj4gr9ql4m 1 month ago

      @@uncommonsense8693
      what's wrong?

  • @chrisbird4913
    @chrisbird4913 1 month ago +22

    We are approaching the theoretical limit of computing power; things will slow to a crawl. That's why the focus is on efficiency

    • @VertexPlaysMC
      @VertexPlaysMC 1 month ago +2

      there isn't really a limit to computing power, just to how large you can make your supercomputer array. Efficiency is the thing that has a limit.

    • @Nabee_H
      @Nabee_H 1 month ago

      I think it's also just the rising costs of electricity, materials and finite resources, plus the push for renewable energy, that has put efficient energy use into the spotlight even more than it already would have been.

    • @HappyBeezerStudios
      @HappyBeezerStudios 1 month ago

      There was a big focus on efficiency from about 2007-2017, when peak consumption stagnated and idle consumption decreased. Ironically that was a time when Intel had no competition trying to beat them at the upper end, so they could focus on other things.
      The issue since then is that competition again put the focus on being "the fastest" at any given task, and if you want to get that 5% lead, you'll sacrifice efficiency and throw 30% more power consumption at it.
      And that efficient era also started with moving from low-thread, high-clock designs to wider multicore designs with improved efficiency.

    • @mintymus
      @mintymus 1 month ago

      100% wrong. Have you ever heard of node sizes?

    • @kennyoffhenny
      @kennyoffhenny 1 month ago

      @@mintymus have you heard of quantum tunneling?

  • @braindead2813
    @braindead2813 1 month ago +8

    I was reading some numbers from a national article about the quality of PhD candidates and other professionals graduating in their fields, and they found that the overall capabilities and knowledge of newer graduates are substantially worse than those of their counterparts even 10 years ago. I think the new generation of professionals going to work at these companies is just lazy and stupid 😂 at least that is what the data shows.

    • @drxcxrys
      @drxcxrys 14 days ago

      They are. As long as I can remember I've been a sucker for all things tech, and I've worked for/with both software and hardware companies (and as a hobby too) throughout my still short time on earth. I can tell you that more than 70% of the staff in these companies are incompetent, lazy, irresponsible and disrespectful towards customers and technology. The backbone of 95% of these companies is the people with passion and love for the "craft", but that remaining 30% is not enough.

  • @uwu_peter
    @uwu_peter 1 month ago +11

    20:30 there is the ARM Ampere Altra server CPU, which has a socketed ARM CPU, socketed RAM and PCIe, so upgradability is possible

    • @thedesk954
      @thedesk954 1 month ago

      It has been caught running with an A770 GPU

  • @bumbeelegends7018
    @bumbeelegends7018 1 month ago +11

    The way PCs consume power right now is ridiculously high

    • @davidszep3488
      @davidszep3488 1 month ago +2

      Undervolt it. Facepalm. I have a 7950X and it only consumes 120W. CB score is over 38000... You don't have to use the default, inefficient preset.

    • @rasky2684
      @rasky2684 1 month ago +2

      Nah man, mine probably uses less power now than it did 10 years ago 😂 it depends what you put in it; that's on you.

    • @mintymus
      @mintymus 1 month ago

      @@rasky2684 True. In his defense, he's just trying to be trendy.

    • @HifeMan
      @HifeMan 1 month ago

      @@rasky2684 what's not power efficient about a 14900KF OC'd to 7GHz and a 4090FE on water OC'd to 3.5GHz?!?! lol
      But jokes aside, you aren't lying; the amount of raw CPU and GPU power you can get per watt nowadays is insane, especially if you pick parts with efficiency in mind and do some tweaking, you can get crazy efficiency.

    • @rasky2684
      @rasky2684 1 month ago +1

      @@HifeMan I know! To be honest I'm still on AM4 and my whole system runs pretty much everything maxed out at 1440p and probably doesn't even touch 400W.
      5700X3D with a 7800 XT, cheap and cheerful 😆

  • @Bennet2391
    @Bennet2391 1 month ago +3

    This has all been said since the 386 and nothing has happened, because the anti-x86 crowd doesn't understand that content is king, not performance or power saving. Porting everything to ARM is such an astronomical undertaking that you can't even estimate how long it would take. Also, software with lost source code would need a complete rewrite.
    Emulation only takes you so far, and if you need high performance, you are out of luck - unless you have a RISC processor with HW-level x86 compatibility, which is exactly what AMD and Intel are building today. No modern PC CPU is truly CISC. The CPU translates one CISC instruction into several RISC instructions before it executes them.

  • @KushiKush
    @KushiKush 1 month ago +3

    The power draw figures for the CPU are at stock, with motherboard defaults. It has been known for a while that motherboards overclock CPUs that are already near their overclocking limits by default. I have a 14700K and it draws 125-160W in games at max settings, and it only reaches its 253W TDP when it needs to boost, and that's only for around 53 seconds.
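
    As a rough illustration of how little that short boost window moves the average, here is a back-of-the-envelope sketch using the comment's own numbers (253W boost for 53s, 160W sustained gaming draw); the 10-minute session length is an assumption:

      # Average package power over a session with a short boost window.
      # Power figures are from the comment above; the session length and
      # the sustained draw chosen are assumptions for illustration.
      boost_power_w = 253      # short-term boost limit
      boost_secs = 53          # boost window from the comment
      sustained_power_w = 160  # upper end of the reported gaming draw
      session_secs = 10 * 60   # assumed 10-minute gaming session

      energy_j = (boost_power_w * boost_secs
                  + sustained_power_w * (session_secs - boost_secs))
      print(f"average draw: {energy_j / session_secs:.0f} W")
      # -> ~168 W: the 253 W figure dominates headlines, not the average.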

  • @the_oc_brewpub_sound_guy3071
    @the_oc_brewpub_sound_guy3071 1 month ago +1

    We've been saying "your PC will be irrelevant" since the 90s, when I helped run a computer repair shop, sometimes working on like 20 desktops at once.
    Once MMX processing came out I felt this way; same with dual-channel RAM.

  • @user-gb2ly2kv2x
    @user-gb2ly2kv2x 1 month ago +21

    So you're saying that to fix x86, we need to get rid of old instructions? You know, those are the instructions we use the most, because they have the most support and can do most of the things we need (example: mov, add, and, xor, etc.).

    • @killkiss_son
      @killkiss_son 1 month ago +1

      Also x86 is so shit that if you remove one of the instruction sets, even if it's not used anymore, it's going to break everything. x86 was originally made to do word processing; now we game on it.

    • @ABaumstumpf
      @ABaumstumpf 1 month ago +10

      @@killkiss_son "Also x86 is so shit that if you remove one of the instruction sets, even if it's not used anymore, it's going to break everything."
      Bruh - tell us you have absolutely no clue about computers without telling us you have no clue.
      That is the same for EVERY architecture. If you take something away that was specified to exist and programs are using it, then it breaks. That's it.

    • @AwankO
      @AwankO 1 month ago +2

      got to move on from that at some point, it's holding efficiency back

    • @nolan412
      @nolan412 1 month ago +1

      Loads and stores are what stalls all the cores.

    • @svechardannex4200
      @svechardannex4200 1 month ago +2

      @@AwankO CPUs are already RISC architecturally; there are translation layers. Nothing is holding efficiency back but physics and people's demand for performance. Modern AMD and Intel CPUs are already running up against the limits of how much performance you can get without just driving more power through the system, and Apple has already run into these issues too, because it's fundamentally a physics problem now, not an architecture problem.
      At this point all we can do is try to minimize logic gate size and the space between gates; redesigning them any other way doesn't really do anything for us anymore.

  • @fellipecanal
    @fellipecanal 1 month ago +6

    It probably won't happen. The core of how the CPU works is completely different.
    All software written for x86 would need to be remade for ARM.
    There is a reason all of our CPUs still support 32-bit instructions. A 64-bit-only CPU today would probably break more than half of all software running in the world.

    • @Gramini
      @Gramini 1 month ago +1

      For the vast majority of applications it'd be enough to simply recompile them for an ARM target.

  • @dontowelie1302
    @dontowelie1302 1 month ago +3

    That power consumption comparison you did is at the wall.
    So you basically included the power draw of the case fans etc.

  • @headlibrarian1996
    @headlibrarian1996 1 month ago +2

    Itanium was intended to replace x86, but it got no traction in the marketplace because vendors didn’t bother to recompile their Windows applications for native Itanium and Microsoft shipped an incomplete edition of Windows for it. Basically it was DOA. For the same reasons ARM will never get traction on the desktop, as Windows basically reinforces the x86 monopoly.

    • @timothygibney159
      @timothygibney159 26 days ago

      Windows on ARM is only taking off in Azure, running cloud apps or SQL Server for ARM to save $$$ on power. Outside Windows Server in a few LOB apps or Azure, it's ignored

  • @holyknighthodrick5223
    @holyknighthodrick5223 1 month ago +2

    Power efficiency is so bad because of competition in the CPU market. The voltage is being ramped up for very small gains in performance, at massive increases in power draw. Unlike the 11900K, modern CPUs essentially come overclocked out of the box; the same applies to graphics cards. There is also the fact that, why leave extra performance on the table when the product could come with it out of the box and you get to charge more for it? The mobile market, on the other hand, takes power efficiency much more seriously, because it's a huge concern for all consumers: being in a country with cheap power such as the US doesn't get around the fact that your battery can only store so much charge.
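
    The voltage point follows from the standard dynamic-power relation for CMOS logic, P ≈ C·V²·f: power scales with the square of voltage but only linearly with clock. A rough sketch with illustrative (not measured) voltage/frequency numbers:

      # Dynamic CMOS power scales roughly as P ~ C * V^2 * f.
      # The voltage and clock values below are illustrative, not measured.
      def rel_power(v, f, v0=1.20, f0=5.0):
          """Power relative to a (v0 volts, f0 GHz) baseline, capacitance fixed."""
          return (v / v0) ** 2 * (f / f0)

      # Squeezing out the last ~10% of clock often needs a sizeable voltage bump.
      print(f"{rel_power(1.35, 5.5):.2f}x power for 1.10x clock")
      # -> ~1.39x: roughly 39% more power for ~10% more performance,
      # which is the "small gains, massive power" pattern described above.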

  • @mrtuk4282
    @mrtuk4282 1 month ago +4

    I disagree with your assessment that Nvidia knows what it's doing by using ARM. ARM is basically a standard chip design which you pay for and then tweak/modify in any way you want. Nvidia chose ARM rather than Intel or AMD because it's cheaper, not because it's better, and probably because they would have total control rather than being beholden to a competitor like AMD/Intel.

  • @beasttitan8747
    @beasttitan8747 1 month ago +27

    Spending millions for shadows and volumetric lighting is wild.
    If my 5700 XT can run Red Dead Redemption at 60 fps 1080p ultra, I'm good; lmao, I don't even own a 2K monitor.

    • @Grandmaster-Kush
      @Grandmaster-Kush 1 month ago +2

      People on the internet are pushing HARD for 1440p / 4K and above-60-fps refresh rates; meanwhile my 6700 XT with a cheap FreeSync monitor WILL last me another 5 years at 1080p 60 fps, if only because smaller transistors can't carry the current and silicon is hitting the physical limit of the material.
      And 12GB of VRAM is the same as the PS5, so that's future PC ports taken care of, all in a 300-euro GPU bought second hand. I don't care about tech in its infancy (framegen, RTX); I already bought into that once before with Nvidia PhysX, which is now either incorporated into several engines or considered deprecated technology in others.

    • @CeeDeeLmao
      @CeeDeeLmao 1 month ago +3

      Ray tracing is good now; it isn't like it was on the 20-series GPUs

    • @christophermullins7163
      @christophermullins7163 1 month ago +1

      @@CeeDeeLmao I disagree. It has benefits, and some games are starting to be worth running with RT, but overall it is still Nvidia manipulating devs into implementing it to sell Nvidia GPUs. Raster can look soooo good. Making RT look good is easier, so it is a crutch, just like upscaling and frame gen.
      Also, before your head explodes... RT looks better in some games, sure. It also brings the 4090 down to a fraction of the fps, so it's not viable unless you're literally bored of throwing money away and want to buy some new fancy tech.

    • @vitalsignscritical
      @vitalsignscritical 1 month ago

      red dead redemption isn't on PC.

    • @stangamer1151
      @stangamer1151 1 month ago +1

      Even if you do not own a high-res screen, it does not mean you can not take advantage of higher resolution rendering.
      Just use VSR (or DSR/DLDSR in the case of Nvidia). VSR makes your GPU render 4x more pixels and then downscales the result to your monitor's res (for example, 4K -> 1080p). If your GPU is not powerful enough to run games at native 4K, just use upscaling.
      I always use DSR or DLDSR, since it greatly improves image quality. 4K + DLSS Performance looks much better than native 1080p, even with DLAA applied. And even 1440p + DLSS Quality still looks significantly better than 1080p + DLAA. So just render at the resolution your GPU can handle at the targeted framerate. Higher res rendering provides better AA, better texture filtering and more micro detail. The resulting image looks so much better. Even a native 4K render still looks a bit better than any lower res on a 1080p screen.
      RDR2 looks very soft and blurry at 1080p. Even at 1440p it is much sharper and cleaner, let alone 4K.
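
      The pixel math behind that combo helps explain why it's cheap: 4K is exactly 4x the pixels of 1080p, and DLSS Performance (as commonly documented) renders internally at half resolution per axis, so "4K + DLSS Performance" rasterizes roughly a native-1080p pixel count. A quick check:

        # Pixel counts behind the DSR/VSR + upscaling trick described above.
        w4k, h4k = 3840, 2160
        w1080, h1080 = 1920, 1080

        print(f"4K / 1080p pixel ratio: {(w4k * h4k) / (w1080 * h1080):.0f}x")  # 4x

        # DLSS Performance upscales from half resolution per axis.
        internal = (w4k // 2) * (h4k // 2)
        print(f"4K DLSS Performance internal pixels: {internal:,}")   # 2,073,600
        print(f"equal to native 1080p: {internal == w1080 * h1080}")  # True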

  • @Skylancer727
    @Skylancer727 1 month ago +1

    Let's be serious: other companies don't use ARM because it's better, they use it because it's cheap to license. The bloat benefits have mostly been a theoretical claim with very little proof. It's hard to prove it works better when there are no apples-to-apples comparisons.
    People have made the same claims of ARM being superior in efficiency, but there isn't any proof beyond a couple of teams working on them having good designs. It could just be that the competition in x86 has limited it.

  • @KillerPSS
    @KillerPSS 1 month ago +2

    If we must replace the whole motherboard just to upgrade a single component (like the RAM, CPU or even GPU), it will create so much waste that we will sink in it.

  • @GonthorianDX
    @GonthorianDX 1 month ago +4

    ARM isn't new lol. The GBA uses ARM, and it is way older than the GBA too.
    ARM also isn't royalty free either

  • @bh0llar702
    @bh0llar702 1 month ago +3

    Nah. It may seem like things are slowing down, but they're not. Every other generation uses insane power; then it gets refined and efficient. Rinse and repeat

  • @darnice1125
    @darnice1125 1 month ago +2

    If nothing runs on it, and no games run on it, there's no future for Snapdragon.

  • @Vancer876
    @Vancer876 1 month ago +1

    Man, this is an awesome video; this is why I follow u

  • @vstxp
    @vstxp 1 month ago +4

    There are server motherboards that take ARM processors along with common RAM kits and PCIe devices. RISC processors in general can be as replaceable as our x86 parts are now. It's just that there is no market for it YET.

    • @HappyBeezerStudios
      @HappyBeezerStudios 1 month ago

      Yup, it's all about the market.
      And on the other side, x86 CISC chips are soldered down in modern laptops.
      So it's all about supply and demand.

    • @headlibrarian1996
      @headlibrarian1996 1 month ago

      There never will be a market on the desktop. No native apps, and I'm not sure where you'd get Windows for ARM. Apple made the transition from Intel because they could force it. The x86 monopoly reinforces the Windows monopoly, so MS has no incentive to force adoption of ARM.

  • @justindressler5992
    @justindressler5992 1 month ago +3

    MIPS and PowerPC were RISC architectures as well. It's not really the instruction set; it's almost always the node size. With each node-size decrease you can literally build the same CPU and improve performance and power efficiency. The only reason to re-design at each node size is to copy-paste more cores and add more cache with the extra space. ARM has been focused on power efficiency from the beginning; they change the instruction set every 5 or so years to take advantage of better process nodes. ARM has a lot of headroom to grow, considering the performance they get from top-end phone chips on a 5W budget. RISC has been around for a very long time.

    • @justindressler5992
      @justindressler5992 1 month ago

      Also, I think Intel is already talking about dumping x86 in future designs. They actually might have to at this point.

  • @bummers
    @bummers 1 month ago +1

    Just to correct you: CISC being called Complex ISC does not mean there is something complex in there that can be simplified. It is one of the two design philosophies, of which the other, RISC, is what ARM is based on. CISC simply means that the opcodes are optimised in a different way from RISC.
    CISC has more instructions that can do more things in one op cycle, while RISC is optimised to have instructions that are much simpler, requiring functions to be composed of a series of instructions instead of, say, 1 or 2 for CISC.
    So if CISC instructions are simplified, they get broken down into small opcodes, which fundamentally changes it into a RISC design.
    There was a time and place for CISC, where commonly used operations could be optimised into a single call, simplifying assembly code and, in turn, higher-level language compilers like C and the like. At that level, cutting out a few fetch-and-execute cycles could mean quite a bit of savings for a CISC arch.
    And for the most part, Intel and AMD were able to get away with simply running their chips at a higher clock speed to get more performance. Now we are hitting the physical limit, and the ARM arch is showing its advantage.
    Also, while ARM uses a RISC architecture, RISC CPU != ARM, because besides ARM there are quite a few other CPUs that use a RISC arch, like Alpha, MIPS, Power Architecture, SPARC, etc.
    I probably got some (or a lot) of the details wrong, since I'm mostly recalling stuff I learned 30+ years back. Just google CISC vs RISC to get your facts straight.

  • @Ernismeister
    @Ernismeister 1 month ago +2

    Clickbait title. The benchmarks aren't fair because of the different lithography nodes (Intel 10nm+++ vs TSMC 3nm), not to mention the X Elite benchmarks have been proven to be fake.

  • @ninjabastard
    @ninjabastard 1 month ago +62

    Maybe it's not clear to me, but it seems you're mixing up ARM and RISC-V. ARM is a company that licenses its proprietary RISC-based processors along with its own instruction set. RISC-V is an open source RISC-based instruction set that can be used in processors, of which there are a few. Since they're both RISC instruction sets for RISC processors there will be some overlap, but the instruction sets, which tell the processor what the transistors mean and how to do calculations, are not the same. RISC-V is far behind ARM in support and capability at the moment.

    • @quantumdot7393
      @quantumdot7393 1 month ago +22

      Dude, the whole video seems like someone researching something for the first time and just repeating what others said with no understanding. That is why I didn't bother trying to correct him on everything. This is a video made to get views and nothing else.

    • @karehaqt
      @karehaqt 1 month ago +12

      @@quantumdot7393 There's no researching at all; he's reading the RISC-V wiki page when talking about the RISC ISA. I get that he's trying to branch out from just doing GPUs, as they're pretty boring now, but at least do the research so you actually understand what you're talking about.

    • @uncommonsense8693
      @uncommonsense8693 1 month ago +2

      @@quantumdot7393 Yeah... literally everything in the video was factually incorrect.

    • @imkvn4681
      @imkvn4681 1 month ago

      His main point was that the optimization and chip fabrication are subpar. I don't see it getting any better because of the increased complexity for a company. Currently, the new thing is chiplets and added cache on the CPU. He should have avoided the RISC-V and ARM topic.

    • @quantumdot7393
      @quantumdot7393 1 month ago +1

      @@imkvn4681 I know a ton of people keep mentioning how x86 is bloated, but Jim Keller has said multiple times that there are no fundamental advantages to ARM or RISC-V and it is all about design, unless you are talking about things like boot times. I trust what he says over internet armchair experts.

  • @thenewimahich
    @thenewimahich 1 month ago +3

    Without going into detail, I think when they made the 4090 they already had the 5090 or even the 6090 planned and made, but won't release it yet.

  • @rabahaicha7724
    @rabahaicha7724 1 month ago +3

    I searched for it and I found that AMD owns the x64 architecture

  • @teddy0139
    @teddy0139 18 days ago

    big thumbs up for the resource links in the description

  • @ThePCExpertAmateur
    @ThePCExpertAmateur 1 month ago +1

    Well done! RISC has been around for a long time, so I'm hoping it finally replaces CISC.

  • @christophtoifl6848
    @christophtoifl6848 1 month ago +4

    My PSU is semi-passive, so it makes zero noise while surfing, reading, etc.
    My 6800 XT has a 0rpm mode and is also completely quiet for surfing, office work etc.
    And my 3900X really, really isn't quiet at all...
    I would love a fully semi-passive system: completely quiet when watching YouTube videos, but with a lot of headroom for gaming, Python programs and other stuff...

    • @revi5343
      @revi5343 1 month ago

      watercooling maybe? or beefier air cooler and undervolting? the 3900x is a really warm cpu.

    • @Spock369
      @Spock369 1 month ago

      Noctua has a fanless cooler...

    • @christophtoifl6848
      @christophtoifl6848 1 month ago

      @@Spock369 yeah, but it is too weak for the 3900X. I could use it, but my CPU would throttle a lot. And it is not optimized for semi-passive mode; even with an optional fan at full power it couldn't handle a 3900X at max power draw...
      Right now you can either have a system that operates completely noiselessly under light load but is severely limited in max computational power (even with fans), or a system that is really powerful but will never be silent.
      Well, there is a case that can passively cool a powerful CPU and GPU at full power, but it is waaay too expensive for me...

  • @BruceRichwineJr
    @BruceRichwineJr 1 month ago +5

    The problem is that Apple is already running into the same problems you speak of in this video. Definitely more power efficient, but they're pasting processors together for more performance. The M4 won't be a major upgrade over the M3. But the PC space definitely needs to do the same thing.

    • @BlogingLP
      @BlogingLP 1 month ago +1

      Hopefully not, because I hate when CPUs aren't exchangeable.

    • @1DwtEaUn
      @1DwtEaUn 1 month ago

      @@BlogingLP There are socketed ARM options out there, like the Ampere ARM chips, and some use a COM-HPC daughter board for the CPU / RAM slots, which in theory allows further upgrades than most PC motherboard designs, in that swapping the daughter board can change the RAM spec and CPU socket.

    • @BlogingLP
      @BlogingLP 1 month ago

      @@1DwtEaUn I'm not in the ARM game, but if I understood correctly it would still be possible to exchange the CPU and RAM when I want to upgrade, for example?

    • @1DwtEaUn
      @1DwtEaUn 1 month ago

      @@BlogingLP Not universally, but with Ampere, correct: you could upgrade from a 32-core to a 128-core CPU, for example, and it uses DIMMs for RAM. For a socketed CPU in a non-COM-HPC board you could use an ASRock Rack ALTRAD8UD-1L2T or ALTRAD8UD2-1L2Q with an Ampere CPU.

    • @BlogingLP
      @BlogingLP 1 month ago

      @@1DwtEaUn Why not universally, like we can do now with x86? I think it's kinda whack if I couldn't do that.

  • @eye776
    @eye776 1 month ago +2

    94 W at idle is a bit sus. Even a dual-CPU workstation from 2018 didn't draw that much power at idle.

  • @H786...
    @H786... 1 month ago +1

    Why is it "GPUs aren't getting better" and not "CPUs aren't even close to bottlenecking GPUs" or, more importantly, optimization?

  • @hoffyc.h393
    @hoffyc.h393 1 month ago +3

    I run a Ryzen 7 5700G at 4.5 GHz and 1.32 V; it draws around 30-50 W during gaming :D

  • @hunn20004
    @hunn20004 1 month ago +5

    With CPUs, I've yet to see a reason why I'd have to sell my 5800X3D...
    It's at the point where I pointlessly upped my RAM from 16 to 64 GB to avoid any more bottlenecks in that department. Modded games are going to run great, at least.
    My only weak point would still be my RX 5700 XT, but once I get past 4K 60 fps in my favourite game, I'd probably stick with the GPU for as long as the silicon lasts.

    • @HappyBeezerStudios
      @HappyBeezerStudios 1 month ago

      Remember, unused RAM is wasted RAM.
      But neither modern Linux nor modern Windows lets it sit idle. Stuff gets precached (which is why Photoshop or Chrome starts so much quicker the second time).
      I'm still on 16 GB and I have games that sometimes crash because I run out. Even with an additional 64 GB swapfile it can't handle it.

    • @Dolvey
      @Dolvey 1 month ago

      If you end up upgrading, the 7900 XT has been fantastic so far. A bit pricey as of now at $700, but prices are dropping. Even a 7800 XT would nearly double your performance.

  • @1967KID
    @1967KID 1 month ago +2

    When I was in California, a friend told me ARM was going to take over. That was in 2012.

  • @purerandomness4208
    @purerandomness4208 1 month ago +1

    I'm not scared my CPU will be irrelevant in a few years. I want my CPU to be irrelevant as soon as possible, as that shows a great degree of progress that is worth investing in.

  • @tonep3168
    @tonep3168 1 month ago +8

    You got this very wrong. The benchmarks have been proven to be fake.

  • @lysergicaciddiethylamide6127
    @lysergicaciddiethylamide6127 1 month ago +5

    I just built my $2,400 pc and it’s already irrelevant 😐

    • @YannBOYERDev
      @YannBOYERDev 1 month ago +7

      No it's not; this guy made a clickbait video, and he doesn't even know what he's saying...

    • @lysergicaciddiethylamide6127
      @lysergicaciddiethylamide6127 1 month ago

      @@YannBOYERDev I was being facetious lol

    • @user-rt9qd8pe7f
      @user-rt9qd8pe7f 1 month ago

      @@YannBOYERDev Yeah, this guy is clueless. Hailing M1 chips, what a clown.

    • @definingslawek4731
      @definingslawek4731 1 month ago

      @@user-rt9qd8pe7f I just got an M3 Max laptop and it benchmarks significantly higher in CPU single-core and multi-core than the Asus Zenbook Pro I was testing before (i9-13900H, at or close to the top chip in Windows laptops).
      So I don't see how it's clownish to point out the incredible performance of Apple silicon.

  • @henkohallows
    @henkohallows 1 month ago

    I've been wanting manufacturers to switch to ARM for years! But desktop operating systems haven't quite gotten there yet. Then again, look at what happened with the M1, or what happened with Linux compatibility when Valve put resources into Proton. It would probably go faster if a bigger commitment to switching happened; power bills would probably go down, and hopefully prices too.

  • @nilsolsen8727
    @nilsolsen8727 1 month ago

    OK, since we are plateauing on graphics, can we work on higher and more stable framerates now?

  • @alexandrustefanmiron7723
    @alexandrustefanmiron7723 1 month ago +4

    M3 has hit the thermal wall.

  • @xan1242
    @xan1242 1 month ago +11

    About the x86 bloat: there's a great video on ThePrimeagen's channel which explains why that's not the correct way to approach the issue of x86.
    x86 CPUs haven't been true on-die x86 CPUs for years now. They all execute their own microcode, within which the architecture specification is implemented.
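
    A toy illustration of that microcode point: the front end cracks one complex x86 instruction into simpler internal micro-ops. The instruction names and splits below are invented purely for illustration; real micro-op tables are proprietary.

    ```python
    # Toy model of a CISC front end cracking instructions into micro-ops.
    # The mappings below are invented; real microcode is not public.
    MICROCODE = {
        "ADD [mem], reg": ["load tmp, [mem]", "add tmp, reg", "store [mem], tmp"],
        "ADD reg, reg": ["add reg, reg"],  # simple ops map 1:1
    }

    def decode(instruction: str) -> list[str]:
        """Return the micro-op sequence the core would actually execute."""
        return MICROCODE[instruction]

    print(decode("ADD [mem], reg"))  # one x86 instruction -> three micro-ops
    ```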

  • @shittyboxBuilds
    @shittyboxBuilds 1 month ago +2

    I just picked up a laptop with an RTX 4060 and a 7940HS, and hopefully that can last me 3-5 years… The CPU is insane for a laptop, and the laptop 4060 is equivalent to the desktop version, which is a really efficient GPU. So I hope it will last!

  • @itsdeonlol
    @itsdeonlol 1 month ago +1

    10 years is crazy ngl...

  • @Misfit138x
    @Misfit138x 1 month ago +3

    Dude, I love your channel, but ARM is not open source! You are confusing it with RISC-V.

  • @dy7296
    @dy7296 1 month ago +8

    20:12
    You're kinda wrong on this part. Socketed ARM CPUs with separate RAM slots and separate GPUs, like usual, already exist, just in the server space.
    ua-cam.com/video/ydGdHjIncbk/v-deo.htmlsi=jZHm0qkoCd7SS_ON
    You don't have to watch this entire thing. Just the first 10 seconds, and you'll notice that it has normal DIMMs and a graphics card.

    • @TamasKiss-yk4st
      @TamasKiss-yk4st 1 month ago

      But compare the 40-50 GB/s DDR5 transfer speed (even if that's per channel, you still need at least 8 RAM channels to match the 400 GB/s of the M3 Max, which is just a laptop chip, not a server part), and the PCIe 4.0 x16 slot's 32 GB/s is also a huge limit (you'd need a PCIe 8.0 x16 slot for your GPU to reach 400 GB/s, and again, that is only Apple's laptop-class speed).
      In other words, you need to decide which one you prefer: the HDD or the SSD. Even if you can keep adding RAM and GPUs past a comfortable point, the extra maximum capacity means nothing without the transfer speed, just like the 10+ TB HDD: not a lot of us have one, but guess how many of us have a 1+ TB SSD? So when you pay with your transfer speed, you should think twice before assuming the much slower but replaceable parts are the future. The future is what not only crunches 20 GB of data in a blink, but can also send the result without a one-second delay.
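
      A quick back-of-the-envelope check of those figures in Python; the per-channel, M3 Max and PCIe numbers are the ones quoted above, treated as rough assumptions rather than measurements:

      ```python
      # Bandwidth arithmetic using the rough figures quoted in the comment.
      import math

      ddr5_per_channel_gbs = 50   # assumed DDR5 throughput per channel (GB/s)
      m3_max_gbs = 400            # quoted M3 Max unified-memory bandwidth (GB/s)
      pcie4_x16_gbs = 32          # PCIe 4.0 x16 throughput, one direction (GB/s)

      channels = math.ceil(m3_max_gbs / ddr5_per_channel_gbs)
      # PCIe roughly doubles per generation, so count the doublings needed.
      gens_past_4 = math.ceil(math.log2(m3_max_gbs / pcie4_x16_gbs))

      print(f"DDR5 channels needed: {channels}")             # 8
      print(f"PCIe generation needed: {4 + gens_past_4}.0")  # 8.0
      ```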

  • @TheMyKillClan
    @TheMyKillClan 1 month ago +1

    I have an 11700KF and never really max out its performance, lol. These top-end chips are way more than most gamers need.

  • @philipreininger2549
    @philipreininger2549 1 month ago +1

    The only problem with the X Elite ARM chip is that it seems great now, but by the time it launches, next-gen x86 will have advanced, so while it's insanely efficient, it won't be as problem-free or as fast.

  • @Gin2341
    @Gin2341 1 month ago +29

    Heard that before with Apple's M1 chips, and they're still a laughing stock at gaming, and there's no way an ARM CPU can run x86/x64 games and applications.

    • @arewealone9969
      @arewealone9969 1 month ago +11

      Apple's SoCs are quite impressive, actually.

    • @nuddin99
      @nuddin99 1 month ago +7

      That's only because they run macOS. Their performance per watt is very good, and they're usually much faster with Mac-specific applications.

    • @uwu_peter
      @uwu_peter 1 month ago +14

      ARM CPUs are able to run x86_64 applications. There are translation layers on macOS, Windows and Linux for that.
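
      A heavily simplified sketch of what such a translation layer does: decode guest (x86-64) instructions, emit equivalent host (ARM64) ones, and cache translated blocks so repeats are cheap. The opcode mapping below is an invented stand-in, nothing like the real internals of Rosetta 2 or Windows' emulation:

      ```python
      # Toy binary translator: x86-64 mnemonics -> ARM64 mnemonics.
      # The instruction table is invented purely for illustration.
      X86_TO_ARM = {
          "mov rax, rbx": "mov x0, x1",
          "add rax, 1": "add x0, x0, #1",
          "ret": "ret",
      }

      translation_cache: dict[tuple[str, ...], list[str]] = {}

      def translate_block(x86_block: list[str]) -> list[str]:
          """Translate a block once, then serve repeats from the cache."""
          key = tuple(x86_block)
          if key not in translation_cache:
              translation_cache[key] = [X86_TO_ARM[insn] for insn in x86_block]
          return translation_cache[key]

      print(translate_block(["mov rax, rbx", "add rax, 1", "ret"]))
      ```

      The caching is where real translation layers win back performance: hot code is translated once and then reruns as native instructions.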

    • @Gin2341
      @Gin2341 1 month ago +11

      @@uwu_peter A translation layer, which costs performance, introduces higher latency and stuttering, and doesn't even do AVX, which most modern games already use.

    • @terliccc
      @terliccc 1 month ago

      @@nuddin99 Exactly; on Windows the M1 would, at most, be as good.

  • @Yasir_emran
    @Yasir_emran 1 month ago +3

    What do you mean? My old PC is already irrelevant 😅

  • @uncrunch398
    @uncrunch398 1 month ago

    Each gen, it'd be nice to see benchmarks compared with a power cap set to what the lowest-powered chip typically draws under load.

  • @imkvn4681
    @imkvn4681 1 month ago +2

    ARM is just an instruction set. It depends on TSMC and on capital to shift the market, and on how well the product sells. The future is Intel, Apple and Google, which just won contracts and government subsidies to build chip fabrication facilities. TSMC, Samsung, AMD, Huawei and Baidu will check the US companies.

  • @user78405
    @user78405 1 month ago +4

    Pretty soon Intel will move from a 64-bit to a 128-bit memory address space, since the data side is already at 256-bit AVX, up to 1024-bit AMX for AI processing.

    • @anttikangasvieri1361
      @anttikangasvieri1361 1 month ago +10

      Why? No computer is anywhere near exhausting 64-bit addresses. There is nothing to gain from making addresses bigger.
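
      The numbers back that up; a quick sketch of how far 64-bit addressing actually reaches:

      ```python
      # Size of a 64-bit address space vs. a large workstation's RAM.
      addr_bits = 64
      total_bytes = 2 ** addr_bits
      print(f"2^{addr_bits} bytes = {total_bytes / 2**60:.0f} EiB")  # 16 EiB

      # Even 1 TiB of RAM (2^40 bytes) uses a vanishing fraction of it.
      print(f"Fraction used by 1 TiB: {2**40 / total_bytes:.1e}")    # 6.0e-08
      ```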

    • @YannBOYERDev
      @YannBOYERDev 1 month ago +6

      @@anttikangasvieri1361 True... but you know people talk shit even when they don't understand what they're talking about, lmao. 128-bit CPUs are useless for the average consumer, and increasing the bit width isn't that easy.

  • @KaoruSugimura
    @KaoruSugimura 1 month ago +4

    The future isn't ARM. It's quantum computing. ARM is just an alternative to x86 with a different use case. In terms of processing, ARM is like a specialized tool while x86 is like a full set of tools. Nothing is as efficient at a particular task as ARM, but x86 allows you to do much more.

    • @4ytherium
      @4ytherium 1 month ago

      Yeah, but will we ever have consumer quantum computing?

    • @andyH_England
      @andyH_England 1 month ago

      Yes, but most people do not need "to do much more". That is why WOA will take over the ultrabook market for starters. Apple has proven that ARM can do 95% of what x86 does, better, and people buying Macs see that. It will be the same now that Windows ARM chips are finally catching up.

    • @Gramini
      @Gramini 1 month ago +1

      Aren't quantum computers super useless for common tasks and *highly* specialized?

  • @andrewsolis2988
    @andrewsolis2988 1 month ago +1

    Snapdragon is no joke; it is already taking it to Apple... and now coming for PCs! It is a beautiful thing.

    • @borky1987
      @borky1987 10 days ago

      I have been using a Snapdragon laptop since about 4 years ago, and it's awesome. I'm guessing the newer models can only be better 👍

  • @pinatasrule
    @pinatasrule 1 month ago +1

    The day ARM replaces x86 is the day I go back to console. ARM would completely fuck up compatibility with everything, like using a Mac.

  • @darkhorse29-yx8qh
    @darkhorse29-yx8qh 1 month ago +5

    Nope, x86 instruction sets are better.

    • @anttikangasvieri1361
      @anttikangasvieri1361 1 month ago +1

      Any reason why?

    • @Riyozsu
      @Riyozsu 1 month ago

      @@anttikangasvieri1361 They have been the standard for developers for at least half a century. For ARM to make a revolution, it would take decades. So no one is jumping to ARM for their main PCs just yet.

    • @anttikangasvieri1361
      @anttikangasvieri1361 1 month ago +1

      @@Riyozsu Few developers deal with CPU instructions directly. Apple made the switch over to ARM with few problems. But yes, history has huge inertia, and x86 will be available for the foreseeable future.

    • @Riyozsu
      @Riyozsu 1 month ago +2

      @@anttikangasvieri1361 We will probably reach the limits of silicon as the semiconductor for further performance upgrades, and Moore's law will be officially dead by then. Most probably they will have to swap silicon for a better semiconductor. Switching to another element or compound will be more trouble than switching to another architecture.

    • @mintymus
      @mintymus 1 month ago

      @@Riyozsu No... they're just reducing the node size.

  • @TerenceKearns
    @TerenceKearns 1 month ago

    Regarding your questions at the end, I think the answer for ARM PCs is compute modules. So basically cluster computing on a single motherboard.

  • @jonaswox
    @jonaswox 1 month ago +1

    RISC-V is seeing great progress as well. It is an ACTUAL upgrade over x86, unlike ARM, which is just another proprietary tech. We need open source.

  • @patrickweaver1105
    @patrickweaver1105 1 month ago +1

    So how many decades have they been claiming that? If it happens, fine, but until it does, no one will believe it.

  • @fanshaw
    @fanshaw 1 month ago +1

    Should we mention that ARM goes back to 1985? Or that the functions being executed are pretty much the same across all chips, because they all need to do the same things; the high-complexity operations are just broken down into lots of lower-complexity ones, and it's an extremely efficient process. Today we have more cores and we do more speculative execution, which means we are trading, yes, power for speed: we run lots of operations in case we manage to hit the jackpot on one of them. Hence ARM performance (with fewer transistors) tops out much lower than x86 but gets better performance-per-watt figures. Apple throws in more hardware accelerators because they are very efficient. The downside is that you can't change the algorithm, because it's baked into the hardware; you can't change the RAM; you can't plug in a 40, 50 or 100G NIC; you can't even plug in a graphics card. You're stuck with Thunderbolt 4, which seems like a nice 40G link you could use for storage or networking, but then you find a big chunk of it is dedicated to graphics and you can't change that.

  • @SolarLantern424
    @SolarLantern424 1 month ago

    I thought this was a really great video. I loved seeing someone compare not just CPU speeds but the power they consume too. It does matter; it matters a lot, even when the machine is plugged into the wall. Lately people have been working out how much all the extra electricity costs them, and how saving that electricity could pay for a new game or a RAM upgrade for their computer. It's a really positive step in the right direction.

  • @DummyFace123
    @DummyFace123 1 month ago

    Another thing that caught my attention from the video: ARM architecture isn't synonymous with system-on-a-chip. SoCs, however, are a key component in a product's ability to sip power; very low power draw is best achieved by having unified memory soldered close to the components that use it.
    ARM is just an architecture and isn't tied to SoCs, but combining the ARM architecture with an SoC design is what achieves these extreme power efficiencies.

  • @J4KE499
    @J4KE499 1 month ago +2

    I'm waiting two generations for my PC.

  • @jeffreydurham2566
    @jeffreydurham2566 1 month ago

    On the subject of what may happen if everything is integrated on the motherboard: it might be a good idea to be able to order the board with the specs you want. Maybe not totally customized like people can do with a PC now, but definitely better than "here you go, take it or leave it".

  • @kramnull8962
    @kramnull8962 1 month ago +1

    The 7800X3D does about half the work of a 13700K no matter how you try to slice it: 18K worth of rendering in R23 to the 13700K's 31K, or 1636 for the 13700K in R24 vs ~900 for the 7800X3D.
    "Those E cores don't do shit......" -Lisa Su

  • @dmaxcustom
    @dmaxcustom 1 month ago +1

    The success or failure of something at mass scale is determined by price.
    If it's cheaper than current tech? Sure. I don't think people would be too angry about it.

  • @calcariachimera
    @calcariachimera 1 month ago +1

    With a TDP of 65 W, the Core i7-13700F consumes typical power levels for a modern PC. Intel's processor supports DDR4 and DDR5 memory with a dual-channel interface.

  • @DavidAlsh
    @DavidAlsh 1 month ago

    Engineer here. Apple Silicon is amazing; its vertical integration helps, but its power efficiency is largely derived from Apple's decades of experience making mobile phones. The hardware is amazing, but Apple's anti-competitive practices nerf it (e.g. refusing to support Vulkan and Proton for gaming was a choice they made; Valve ditched macOS after that). Basically, Apple sells Lambos with square wheels. If Apple decided to support Linux on Apple Silicon, the MBP would be the best laptop ever developed, period.

  • @Traumatree
    @Traumatree 1 month ago

    For those curious, the CISC complexity comes from machine instructions being of different lengths, whereas on RISC every instruction is the same number of bytes. This implies that you can have many more decoders working in parallel on RISC (as ARM does) than on CISC, as it is easier to "predict" what to do with a given instruction when you need to optimize.
    Edit: And I'm waiting for AMD to launch ARM64-compatible processors... that would be really huge!
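
    A toy decoder loop makes the fixed- vs variable-length point concrete: with fixed 4-byte instructions every boundary is known up front, so decoders can work in parallel; with variable lengths each boundary depends on decoding the previous instruction. The length-encoding rule below is made up for illustration:

    ```python
    # Fixed-width (RISC-style): boundaries are known in advance, so all
    # start offsets can be handed to parallel decoders at once.
    def risc_boundaries(code_len: int, width: int = 4) -> list[int]:
        return list(range(0, code_len, width))

    # Variable-length (CISC-style): the next boundary is only known after
    # decoding the current instruction, which serializes the front end.
    def cisc_boundaries(code: list[int]) -> list[int]:
        offsets, pc = [], 0
        while pc < len(code):
            offsets.append(pc)
            pc += code[pc]  # toy rule: first byte encodes instruction length
        return offsets

    print(risc_boundaries(16))                  # [0, 4, 8, 12]
    print(cisc_boundaries([2, 0, 3, 0, 0, 1]))  # [0, 2, 5]
    ```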

  • @alejandroruiz8617
    @alejandroruiz8617 1 month ago

    You forgot to mention how many SBCs run on ARM, like the Raspberry Pi. Not so powerful, but it makes for a great home server, using only 25-ish watts even when running multiple services at the same time thanks to containers.

  • @27Zangle
    @27Zangle 1 month ago

    My 6800XT is working well. It's a little loud at times and it does not like being in windowed mode when gaming (it gets noisy), but it is doing the job. I am thinking about upgrading next year and putting this one in a spare PC for the kiddos, but I've also thought about keeping it for a few more years.