Nice of Intel and AMD to both swap to 3-digit names at the same time, just to confuse their whole consumer base again
Well, Intel changed, so obviously AMD had to follow.
Wait what? What did AMD do?
@@platinumsun4632 Followed Intel in changing the naming scheme, and went a bit overboard
@@platinumsun4632 literally watch the video smh
Yet those Intel fools somehow seem to believe the Ultra moniker stands a chance against the AI buzzword.
Why not rename the whole company to AIMD while we are at it?
dont give them any ideas
Quality comment
nVidia, AiMD begets... APPle, which is oddly correct given their penchant for apps and limited editions. ASSus also may be a solid change.
@@EthelbertCoyote Perfect, yes.
More like AAIMD "Advanced AI Micro Device" and "NvidAI"
The only thing we know for certain is that the next generation of cards from AMD and Nvidia will be too expensive for what they deliver.
I'm loosing interest in the new gpus and cpus. Finding it hard to even care. A lot of it is due to price and my lack of interest in any new game releases.
@@Killswitch1411 way to keep it loose
@@red5standingby419 Better keeping it loose than being uptight?
Yeah and the power draw will be absurd.
I very rarely run my 4090 at max power as it is so a 5090 is a big no no for me.
@@Killswitch1411 I can buy a prebuilt right now with a 4080 and a 14900K for cheaper than what it would cost me to build it new. The prices are fine unless you're one of the ones who expects to get a high-end machine for no more than $1000
VESA creating a solution to the problem they created... So they can double dip on monitor manufacturers... So monitor manufacturers can raise prices for consumers... Brilliant.
Everybody wins!
Who needs money anyway??
@@Stev-mi2fd xDDD
My 7-year-old Intel i7-7700K CPU has already done away with hyper-threading all on its own (via the blue screen of death). Ahead of its time.
My 6700k is still a beast.
@@yugdesiral Am surprised by it too, have had it 7 years now, wow
@@Oliver-sn4be 8 years now, just handed it down to my 5yo. The 980ti is still delightfully demolishing most games at 1080p.
@@yugdesiral hopefully you keep your 5 year old on single player only but i doubt it.
@@Joker-no1fz mostly single player yeah, if online i just turn off chat
oh great, a new Asus handheld for them to screw you over with when it comes to repairs and all that.
GN ftw!
Why did people even try to argue that the Asus handheld was a worthy competitor to the Steam Deck? It's pretty obvious who the winner was by track record.
I'm so sick of "AI" being crammed into everything.
Buckle in. We’ve barely started scratching the surface of AI being added to everything.
@@MrTaylork1 we are at the peak of "AI" that doesn't do anything being put into everything. Either they are going to figure out something useful for it to do, or people are going to start to get wise.
Get used to it lol
I'm barely recovered from NFT's and Metaverse taking over...
Too bad this is just the beginning.
I love watching these vids of new shiny tech, while my PC's specs in 2024 are:
i7-3770
GTX 1060
16 GB DDR3 RAM
Play my cards right, I might finally afford to get a 1080 Ti this year!
I have a 1080 and it is fantastic in 2024
Why do you watch Paul then????
Only hobby freaks need it. Keep the old system, and stop watching the ads crap here!
@@lucasrem Just because I can't afford them doesn't mean it's not fun to look at the shiny things.
Have the prices gone up again? I picked up a 1080 Ti for around $200 on eBay a year ago
The compression plate of CAMM being replaceable should be adopted by the motherboard CPU socket.
Probably more complicated for a CPU, but it could be cool to have a way to upgrade the CPU without replacing the motherboard (like Framework does).
Then while we're at it, add both things for dedicated GPUs so you can upgrade both the GPU itself and the memory, lol.
Sadly, I'd think it will be too costly to ever be practical for CPUs/GPUs.
Big difference between a small piece with ~200 pins and a CPU socket with close to an order of magnitude more, as we approach sockets with ~2000 pins. I don't think it would be practical.
if all your Intel Core processors are "Ultra" what does Ultra even mean here?
There are also non-Ultra ones, but they have rarely been talked about (maybe they still haven't been released in actual products).
Various people have often said "new Ultra lineup" when talking about these new Intel tile-based CPUs compared to the old monolithic CPUs; possibly that has made Paul just call them all Ultra.
Edit: Possibly the non-Ultra parts aren't tile based (at least not the current ones), which makes the new naming even more confusing when the point of the new naming would be a completely new design like tiles.
They do have non-Ultra CPUs, but those are more for OEMs and the lower end, and not really loudly announced
So in summary: whoever proposed AND accepted the naming change should be fired
@@aerosw1ft they can't - that person is DEI hired...
Ultra confusing
3:16 An even further and more reasonable speculation:
Nvidia is aiming for a global launch with the "Blackwell" architecture,
which is only possible if they release the 5080 first: the card that is supposed to comply with U.S. export regulations
New Intel management does not want their reign to be confused with previous management's products.
If the 5080 releases first, it will be as a test to see how many customers are willing to pay 4090-level prices for an 080 SKU.
The answer, as far as I am concerned, will be: bring a 7800 XT to $500 and NVIDIA can shove that useless AI back where the Sun don't shine.
If NVIDIA sticks with a $1,000 price tag for the 5080, people will buy it. 4090-level performance for $1,000 and with less power consumption? People are gonna be grabbing that like candy.
@@laszlozsurka8991 If nvidia sticks to a $1000 price tag, then they will sell exactly the same number of units as they did in the previous generation.
@@laszlozsurka8991 it won't reach 4090-level performance. It will probably be 85-90 percent of a 4090 at most, and maybe just 15 percent faster than the current 4080S, all for the low price of $1,299.00, and people will bend over to take it from Jensen and love it
@ 3:47 No one under the age of 40 is going to get that "video clip" 🤣
The ROG Ally X is ASUS's way of admitting that the ROG Ally was a lemon of a device to start with without having to do a recall to replace all those busted devices. (Just ask Steve at Gamers Nexus.)
I'm surprised ASUS didn't simply classify white shells as out-of-warranty damage and just automatically charge anyone sending in a white Ally for the "fix".
Lmao 😂 I have had mine since launch and have had no issues at all lol. The Legion Go has had its fair share of issues too lol 😂 The X is the same thing, just more RAM and better cooling.
@@BainesMkII But technically they did that with the Gamers Nexus Ally. There was a very small ding that didn't affect anything, but they tried charging GN $200 for a screen replacement because the screen was permanently attached to the case.
I've owned an Ally since launch, and I have never had any problems with it. It's a well-known fact that a small percentage of electronic devices will fail in the field within the first 90 days of usage, due to "early failure" of components that tested fine right before the product shipped (the failure rate then drops precipitously, and slowly rises as the product ages). You might also have process problems in assembly, but I would assume that someone who has been in the (laptop) business as long as Asus already has processes in place to fix these issues when they are detected. AFAIK, nobody who has actual data on failure rates has posted any statistics. Calling it a lemon based on 4 or 5 people posting about their device failures on the internet means ALMOST NOTHING if hundreds of thousands of units were shipped.
@@glenndoiron9317 It's not the failure rate that is the issue. ASUS has established a history of straight out scamming people through their repair process. This isn't just "normal" warranty/repair shenanigans either (though ASUS is guilty of that as well). Horror stories are everywhere, but a good breakdown of complaints along with their own absolutely absurd story can be found in the Gamers Nexus video "ASUS Scammed Us".
Great intro Joe and Paul!
That Computex logo reveal was quite something. Good job, Joe.
According to MLID at least, the 5080 launches first because the 5090 will be restricted from being sold in China, so Nvidia will want to launch their "sanctions buster" GPU first.
Which means gamers in the US won’t see 5080 on shelves until 2025 if at all.😅
@@tringuyen7519 Yes. Paper launch only in the US.
Call me out if we're wrong, but I don't think the 5080 will be shipped in large quantities anywhere not named China for the first couple of months.
@@frommatorav1 In Europe they are gonna rot 😂 too expensive; if the USA sells them for 999, here it's 1300 euro 😢 or more
@Oliver-sn4be Even when the 5080 is available for me to get, I won't be buying it anyway. I was just relaying the stock situation as it's been reported. I'm interested in the higher-end GPU cards for technology's sake but don't need one.
Get ready for that crappy connection to melt yet again on the 5090s, since Nvidia will not accept the truth that it's an extremely bad connector for powerful cards.
Or at least move it to a location where the cable can be properly mounted. I mean, why would ANYONE put the cable in a place where it doesn't hit the case? That's obviously an inferior design.
Also screw ASUS for not replacing those melted connectors.
DO NOT BUY ASUS!!!!!!!!
Tom's Hardware news posted today, 5/12/2024:
Asus quotes ridiculous $2,750 fee to replace chipped GPU power connector; docs back up claims of egregious repair pricing for $2,000 RTX 4090 GPU
Much like avoiding OLED panels for risk of burn-in/image retention. Is the risk reduced? Yes. But it's still a risk. I'm not gonna buy a card with that 12VHPWR connector for that reason. My 3080 Trio looks fairly clean even with its 3x8-pin connectors due to my cable management. Even if I was lazy with my cable management, who is really ever gonna see it but me?😂
$1399 for 5080 FE that you won’t be able to get at that price and $1599 for other models that still will sell through… mark my words
I don't even consider new GPUs anymore, I'm more interested in the used market. I'll wait for a new GPU to be released so I can buy the previous gen at what it should've cost; my current RTX 3090 cost me 500 euros
Supply and demand; they will be marked up.
You're thinking way too small. 1899 minimum for the 5080 and then it goes up from there.
@@Rubysh88 yeah, GPUs are basically the new car. Most people don't consider new vehicles either.
@@Rubysh88 I do just the opposite. Buy the new card at launch, sell for 3/4 the price I paid before the next cards launch. Timing is key of course, that and keeping your card and its packaging in mint condition.
don't call me shirley
roger, roger
ok, donn
Ok... Shirley 😏
No, I’ve been nervous lots of times.
It's a big building with patients, but that's not important right now
If HT is gone, does that mean they can now move a whole thread easily if something stalls? Do the pipelines have junctions to throw the thread to other pipes if stalled? Or are they saying modern software doesn't cause the stalls that were common that necessitated the HT solution?
It's just so that they can sell you the exact same CPU but with HT enabled, for the next "generation".
Why the hell is it Core Ultra 9 285K, Core Ultra 7 265K & Core Ultra 5 245K? It doesn't make any sense. Shouldn't they be 295K, 275K & 255K?
C'mon, you're being logical. This is Intel. It's not like AMD and Microsoft are known for good naming schemes, either. I thought the same exact thing, btw.
I wasn't going to buy Intel in my next upgrade, but Intel's new naming scheme definitely guarantees I will go Zen 5, if their insistence on adopting the bulldozer no-hyper-threading and lots of little cores philosophy wasn't a "we're having trouble shrinking our process node" kind of red flag already.
Re the ad,
One of those pins on the 4-pin fan connector is a speed signal that goes back to the motherboard so it can see if the fan is dead or too slow.
If fans can be "daisy-chained" from just one connector on the board, which of the daisy-chained fans is being read? What about the others?
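For reference, on typical splitter/daisy-chain cables only one fan's tach line is actually passed through; the rest get only power and PWM, so they go unmonitored. A rough sketch of the arithmetic the board does with that one tach signal, assuming the standard two pulses per revolution:

```python
# Sketch: how a motherboard turns the tach signal into an RPM reading.
# Assumes the de facto standard of 2 tach pulses per fan revolution.

PULSES_PER_REV = 2

def tach_to_rpm(pulses_per_second: float) -> float:
    """Convert the counted tach pulse rate into fan RPM."""
    return pulses_per_second * 60 / PULSES_PER_REV

# Example: 60 pulses/second -> 1800 RPM
print(tach_to_rpm(60))  # 1800.0
```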
Thanks for reminding me that "A Current Affair" existed.
Ah, nostalgia... Maury Povich "was the father" of that show, so to speak. It was definitely a better news show than what's around today.
A bit lost on the need for it this year. I mean, what games are there to sell it with? We had the likes of Witcher 3 and Cyberpunk helping sell cards, but what is there this year? Stalker 2 maybe?
Normally I go with each new release, but even I am struggling to see the need for 4090 owners to jump again so soon, even more so if there are no games that really need the 5090.
A 4070 Super just arrived, but if a 5070/80 is coming in 3-6 months, then it's going back. Currently on an RX 470.
*The big.LITTLE era:*
*heat is a concern*
*stability issues*
*IMC issues*
*DDR5 tech issues:*
*heat is a concern*
*stability issues*
*The long and short of it is: You have to tinker like a MAD MAN in the BIOS and find the PERFECT mixture and tuning. Or... You have to make a sacrifice or two, to get a stable system.*
*i.e.: dial back your RAM frequency and tighten the timings, ramp up the voltage and pray you hit the lottery and get your advertised frequency running stable, upgrade AiO cooling or a custom loop to something massive like a copper rad, 420mm setup (3x140mm fans), stronger pump, higher static pressure fans, etc.*
I'm generally looking forward to getting a new GPU after skipping the 40 series, but seeing 250-600W ... what the hell Nvidia, starting at 250W?
This rumor came up last gen. Probably engineering samples being pushed. With AMD not challenging the top end, they shouldn't push into higher watts. Who knows though.
Because they are not designing an FE cooler for their low-end cards (like the 4060).
Because this isn't the card wattage. This is the cooling capacity of the FE model. Nvidia overengineers their FE cards. The 4090's shroud is rated for 600W.
ikr! it should END at 250w
5080 350W. 5090 600W.
That subtle Midsommar background was great.
god i wish more companies would just adopt an annual revision of a product and literally just call it the "productname 20xx" so you can just easily reference how old it is and whether or not you ~might~ want to consider a new revision upgrade.
How expensive do we think the 5000 series cards are gonna be? 3080 RRP seems so long ago 😢
Scalpers tend to dictate prices at launch. Then again, Nvidia were the scalpers for the 4000 series, so I'd bet dollars to donuts that the 5080 will be at least 1300 USD.
Get that RTX 3090 for cheap ?
Depends if the scalpers will get them or if the demand will be high.
THANK YOU!!! now I will stop getting hate by saying these people are just guessing based off what nvidia has done the past 10 years 🤣
AIMD RAIZEN for AILL!
AI See what you did there!
faice palm
With a bag of raisins with Raiden on the front.
The 80 series makes the most sense to release first, as they can sell the most cut-down die that matches the 4090 for the biggest margins: what would have been a 70-series die in the past, but at the current 90-series price tag
Perfect! I'll finally be able to buy an RTX 4080 at a reasonable price when the 5080 and 5090 drop.
Not really, they're gonna be overpriced for 8 to 9 months
What would be a decent price?
@@Deathwish699 I can wait 8-9 months. I have a 3080 now.
@@fawkkyutuu8851 That all depends upon your own purchasing habits. For me personally, more than $600-$700 USD for a graphics card is too much.
You're delusional. New 4080s will still be roughly the same price, and second-hand 4080s from people wanting to upgrade to the new 5000 series will still be priced high. You're better off just getting a new 5000 series, as it will have a longer lifespan and better performance.
No one should take what Moore's law is dead says seriously. He gets it wrong so often that it has become a meme.
Are you confusing "what is currently in process" with "what will be reality in the future"? I think most people saying MLID is wrong aren't actually listening to the words he is saying.
No way bro. Are you telling me *RUMORS* are not accurate. THE HORROR.
@@TheHighborn It isn't even that. I am betting the OP thinks MLID is dumb for saying "Arc is canceled" when that was never what he fucking said. The true meme is how bad people are at listening comprehension.
lol you got attacked by these MLID defenders, brother. it's hilarious when even a well-known reliable leaker on Twitter and some others knew what a clown MLID is and meme'd on him too
He is ambiguous
...and here comes the 5080 at $2000 with a massive 8 GB of VRAM LOL
Listening to Paul while I try to salvage my brother-in-law's PC after he spilled coffee into it was a great way to start my Sunday. Thank you Paul!
They said that the 40-series TITAN-class GPU was 600 watts and had 48 GB of VRAM, or those were the rumors I remember. I don't have high hopes for 50-series GPUs pulling 600 watts and not seeing melting connectors like what happened on the 4090s... hopefully Nvidia has addressed that issue this time around...
Oh boy a new PCIE standard, now they can catch fire on both ends!
Thanks for changing the background music Joe! 🙏
It is mentioned that NV releases the RTX 5080 first due to the China restriction, as NV wanted a global launch where their first Blackwell card can be sold in China as well.
For comparison, the rumored RTX 5080 has the same compute power as the 4090D, which basically circumvents the GPU sanction against China. A non-full GB202 GPU (94 SM instead of 96), clocked at 3 GHz, is still within the limit of the restriction.
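As a rough sanity check on that, a sketch using the rumored figures above with the standard 128 FP32 cores per SM at 2 FLOPs per core per clock (note the actual export rule is written against a different "total processing performance" metric, so this is only ballpark):

$$94 \times 128 \times 2 \times 3.0\,\text{GHz} \approx 72.2\ \text{FP32 TFLOPS}$$

which lands in the same neighborhood as a stock 4090 (128 SM at ~2.52 GHz, roughly 82.6 TFLOPS), consistent with the 4090D comparison.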
"Like I did in college"...yep 😟...totally just in college 😨...definitely didn't use a box fan as side panel in my 30s😰...nope!
P.S. Sorry if any Gen Zers felt attacked with all my ellipses usage 👍
I find LPCAMM interesting because it allows previously non-socketable memory to be socketable. LPDDR is different from SO-DIMM memory and is the kind that previously could only be soldered directly to the board.
CAMM and CAMM2 would be the ones they're trying to replace SO-DIMMs with.
Intel launches new "K"-series CPUs - everyone skips this part...
Your news is put together with so much love, and the pictures are top notch! Thanks to you both.
Because that in itself isn't news? What did you think they were going to do, launch new CPUs without K models? THAT would be a bigger announcement.
I am going to stick with my i7-10700K and 3080 until they either break or they can no longer play the games I play. I initially estimated I would be running this setup until the 60 series, but I am now thinking this setup could last even longer. The more they cost, the longer I am going to go between purchases. My next computer is going to be a monster.
I don't think anyone cares about video cards anymore when they can't afford food and housing.
Speak for yourself
The 5080. 16gb of Vram. £1200+. And a power connector that melts. I'm so excited.
If they actually include the 12VHPWR connector again, this time I'm going to commit war crimes 😡😡 (real)
Another glorious tech news video to accompany my Sunday morning visit to the thunderbox. It's becoming a Sunday tradition for me, followed by roast beef and Yorkshire puddings for dinner.
I really hope AMD will make non-"AI" CPUs so I can avoid this crap!
lol its just software, not a brain
it's not like you're getting less CPU just 'cause they made it AI-optimized. But I mean, if you want less, go buy last gen
Fun fact: all processors are AI processors. The AI part means nothing, it's just shorthand for "fast at specific calculations", like "GPU", which wasn't a thing for many years and was just part of the CPU.
So you already have an AI CPU, and GPU, and phone, and... you get it. It just won't be as fast as possible for the workload.
i'd rather have more frequency and/or cores instead of the AI part. is it just instructions built into the CPU, or is it another set of cores or such dedicated to just the AI function of the OS/software?
@@Tea_1745 Both, currently. But as the software develops there will likely be more specific hardware for it.
There are parts being included now for the "AI CPU" stuff, and we don't use them yet, but I could see games taking advantage of them in maybe 10 years for their "AI", like NPCs and stuff. But that's purely a guess right now.
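To make that concrete, the "AI" work in question is overwhelmingly multiply-accumulate math like the sketch below, which any CPU can already run; an NPU is just dedicated silicon that does it faster per watt (sizes here are made up for illustration):

```python
import numpy as np

# The core op behind most "AI" workloads: a matrix-vector multiply,
# i.e., a big pile of multiply-accumulates. An NPU accelerates exactly
# this kind of math; a plain CPU runs the same thing, just slower.
weights = np.random.rand(256, 128).astype(np.float32)   # made-up layer
activations = np.random.rand(128).astype(np.float32)    # made-up input

output = weights @ activations  # 256 x 128 = 32,768 multiply-adds
print(output.shape)  # (256,)
```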
i can't wait for the ROG Ally Zero, ROG Ally ZX, ROG Ally Legends, ROG Ally Battle Network, & ROG Ally StarForce
.... we don't talk about the ROG Ally X-Over
And the adult version with improved haptics... The ROG Ally XXX
Really does not matter, because the new cards will cost silly amounts of money that many of us do not have. A 5070 Ti for $1000? I do not think it will cost much less than that. I am looking at a 7900 XT or 3080 for about $500. Currently have a vanilla 3060.
Will the 5090 come with a fire extinguisher accessory option?
The reasons for moving away from SMT are pretty easy to understand given the P-core/E-core architecture dynamic and Intel needing to cut maximum power draw at any cost.
It would make sense if the 80 tier launched before the 90 tier. Ever since GTX was a thing, the 80 tier has always been the first one to launch; even if a 90 tier was announced at the same time, the 80 tier would be available first. A good example is the 600 series: the GTX 690 was released after the 680 despite being announced at the same event.
Hope everyone that complained about DisplayPort 1.4 will get a new monitor. There are only 2 to choose from that I'm aware of
wowowowo the naming scheme change will be a very good one... going from 5 digits to 3 may be a big loss tho ahahahha thanks Steve.
and Paul as always, thanks
Inb4 the rtx5080 is $1500 MSRP and the rtx5090 is $2000 MSRP
= Nvidia gets no money at least from me. That's a pretty large increase, so not likely to be the price.
$1200-$1500 for the 5080 based on GB203. A full-blown GB202 (if it's two GB203s put together) could be $3000. Sounds absurd, but with nothing from AMD or Intel to compete at those levels for a while, why not for Nvidia? An even bigger curveball would be using GB203 for a vanilla 5090 and GB202 for a 5090 Ti (AI-focused card). With AI being Nvidia's focus and no competition at the high end, who knows what Nvidia is thinking for launching Blackwell. DLSS 4 will be put in the spotlight as mandatory to get performance out of the 5080 and below.
I really like the intro 😉👍! You can make the episode worth watching even if the news is nothing to write home about.
Intel could have named their new Lineup AI 9 instead of Ultra 9 or whatever it is now. Missed Opportunity.
You old people can't keep up ? Never needed that RTX 4090 ?
Missed Opportunity !
I bet AMD rAIzen CPUs could squeeze out a lot more performance for Wine on Linux if given enough time.
“Aggressive design”…
What does that even mean?
It punches you in the face as you pull it out of the box lol
Good job Asus, ruining your reputation 'right before' launching a new Ally, with all the warranty _'scams'_ and faulty hardware in most, if not all, first-gen Allys!
OMG, they're going to announce an upcoming announcement!
Intel still at minus two P cores on their flagship CPU, meh...
7:55 The C in LPCAMM2 stands for compression, not consumption.
I wonder if Intel is going to rush through Battlemage to get to Celestial's architecture improvements. I would imagine Celestial will have improvements based on what programmers and developers have observed on Alchemist. So instead of spending too much time on the intermediate Battlemage, effort may be focused more on Celestial, with mobile being the initial test rollout.
Architecture improvements don't really work like that. CUDA cores are an insanely mature all-purpose parallel compute technology that is extremely well optimised from both a software and hardware perspective. Tensor cores are much more specialised linear algebra machines. It's been finely balanced in the consumer market, but H100s are basically just tensor cores.
Intel's Xe cores are literally just half 256-bit ALUs and half 1024-bit ALUs. It will get better in the next gen, but the real breakthroughs are 2 to 3 generations away.
A less chaotic but roughly equivalent example is Infinity Fabric. At first it was more of a handbrake on performance, but by the fourth gen, the 7000 series, it's what is really making the difference. Xe cores will likely follow a similar trajectory.
MFGs will need to start replacing cases for high-end PCs with dorm refrigerators.
that intro alone warranted a sub... absolutely brilliant! :D
The 5080 and 5090 are gonna be priced through the roof. Wouldn't expect a 5080 for any less than $2000 USD retail, but there's gonna be little to no stock till mid-2025, so get ready for scalper pricing of $3k or more. 5090s are gonna go straight out the back door to AI farms.
Completely delusional
nah, that 32 GB (if they even give it that) isn't enough to make it interesting to AI "farms", maybe single-card hobbyists.
still better to load up on 3090s cuz NVLink (NVLink was removed on the 40 series and later) or to just get 48 GB cards like the RTX 6000.
the VRAM matters much more than the compute power in the single-server world; that's why they're scared of giving the gamer cards more VRAM
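For anyone wondering why, the back-of-envelope VRAM math looks like this (model sizes are illustrative, not tied to any specific product):

```python
# Rough VRAM needed just to hold a model's weights in FP16 (2 bytes per
# parameter), ignoring KV cache and activations, which only add to it.
def weights_gb(params_billions: float, bytes_per_param: int = 2) -> float:
    return params_billions * bytes_per_param  # 1e9 params * bytes / 1e9

for params in (7, 13, 70):  # illustrative model sizes, in billions
    print(f"{params}B params @ FP16: ~{weights_gb(params):.0f} GB")
# 7B (~14 GB) fits on a 32 GB card; 70B (~140 GB) needs several cards
# no matter what, which is why the big VRAM parts command the premium.
```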
Bro gets a like and a sub for the Current Affair throwback.
So, you're saying to wait on buying a GPU?
Sadly, the replacement for SODIMM seems to be embedded laptop RAM with no expansion slot.
Tech marketing departments really are overpaid.
Oh, I did that fan trick (sticking a desk fan in the case) back in the old days; the only problem was, if you changed the fan speed, the computer might reboot due to the EMI.
Yep, the ROG Ally X will take up to a 2280, have more than 16 GB of RAM, & have "way more than just a 40% better battery." Whatever that means, exactly... 🤷♂ & they also moved the SD card reader away from the heatsink, hoping to stop the SD read issues.
Does anybody know what the advantage might be for removing hyperthreading from the processor?
So we get to choose from no hyperthreading and AI bullshit? Great...
Sooo I shouldn’t pick up a 40 series right now?
We still don't make full use of the 5.0 standard, why even bother with 6.0? A bunch of x1 5.0 slots would be so much more useful than x4/x8/x16 slots.
I think the 50 series will be just a little faster with the correct number of cuda cores that were missing from the 40 series cards. Long live my EVGA 3090!!!
I would have been interested in the number of shaders on an RTX 5080. As I have a 3080 as a water-cooled eGPU with 8,000 cores, it is great for AI models.
I still think we will eventually see CAMM sockets directly on CPU packages at some point. Especially now that glass CPU substrates are arriving.
The best thing you can do after installing a new CPU is undervolt.
Though I get that not everyone wants to tinker themselves, technically you can't kill a CPU by lowering the power, and if you go too low the system will start to hang, so you know immediately when you're not giving it enough power.
It might take about 10 minutes to get some undervolting done, and it's much easier than OC'ing while giving you much more temp headroom.
i feel like besides FSR, ray tracing and a few other features, these new cards are just getting larger and drawing more power to achieve more performance without a lot of optimization.
I think the idea of them doing the dual-chip design and waiting till next year is the most plausible, because it allows them to sell a ton of 5080s to the gamer community, and then they don't have to meet the high demand of launching a 5080 and 5090 at the same time, with everybody holding out for the 5090 so they don't sell as many cards in general. So we'll see. However, I could be wrong on that part, because most people who already have a 40-series card are not gonna see more than a 20% gain, and therefore selling the 5090 at the same time could very well lead to more sales overall, as the people with 40-series cards will certainly jump at a 5090.
and how many connectors are going to be melting with those?
The GTX 5080 will be the first card whose price will match its name.
And you would buy it anyway.
Thanks for all the leather! LOL!
GTX?
Well, I won’t be at all surprised if the MSRP is half the number. Funny thought though!
I disagree with the comments on AMD's new naming scheme. I like it better than Intel's "Ultra."
I'm not sure which AI tasks are normally handled on a CPU. I know image generation is typically done on the CPU if you're using an AMD GPU, due to the absence of CUDA. But DLSS and all that are ruled out, as I believe they're handled on the GPU. Not that it doesn't make sense to add a dedicated AI processor to the CPU; video encoding has had dedicated architecture on GPUs for a while. But I don't think it makes sense to put them in every model, since most people will probably have no use for it. Idk, maybe I'm missing something, or maybe they know something we don't.
RTX 5080, costs $5080.
And the RTX-5090 GFX cards will cost how much????
I can't wait to spend €2k+ on the new GPU! It's time to upgrade my 2080 ti as I'm currently only playing Pokerogue which requires massive GPU power.
When he was talking about Panther Lake, I thought I was hearing Cancer Lake.
I would say in some regards it makes sense for Intel to do a new naming scheme, given the newly named parts are a tile-based design where they were monolithic before.
Thereby no need to drop the i and have Ultra for some products while not for others.
My best guess for AMD's AI naming is that they have 45+ TOPS (maybe some other measurement) NPUs, which seems to be the requirement for Windows Copilot to run on-device and is what the new Snapdragon has. But otherwise I would think it is still the same design by AMD, not a new design like Intel's tiles.
Then I do wonder if any of the current lineup with NPUs will even be useful, when they don't seem to be powerful enough for Copilot; thereby a waste for people who bought them specifically for such things (like the Ryzen 8000 series being basically just slightly more powerful NPUs compared to the 7000).
if only Intel would revive the Extreme series CPU lineup....
new GPUs drawing up to 600W... but we aren't gonna address the power connector that keeps melting them down.........
Saving up for a new build this year, waiting to see how the Core Ultra stacks up against the 1400 series
Done that... Both side panels off with 2 box fans in push-pull on my FX-8375 with 2 R9 290X Sapphire OC 8 GB cards in CrossFire... thing STILL shut down even when set outside in 15°F temps. 8( very sad.