So Microsoft going back to its roots with blue and green, very interesting, being one of those owners back in the day with the crystal OG Xbox. I still think you have the right idea though, as you said in one of your most recent videos: they should just go with an Xbox OS on Windows and run it as a service, which it essentially is now with Game Pass, since it's on everything from TVs, consoles and PCs to the cloud via Nvidia Shield etc. As for Intel for the GPU, they're not ready yet, as we have seen with Arc and Battlemage; they have gotten better with drivers but are still nowhere near AMD quality, though I wonder if this is all being driven by the guy now at Intel who used to work for AMD. I would like to see blue and green come back, and would like to see a console run everything native once again, none of this DLSS and FSR upscaling tech, which started in the TV business a few years back and is just pushing a narrative for AI in general. I'll give you an example: take Starfield on the Series S. With the most recent update this past week, using FSR 1.0 from 900p to 1080p, they adjusted the lighting, but in a way that now shows how terrible the upscaling is, especially with your scanner on; it was hidden by all the gamma before, now it's plain to see. I see more and more triple-A games really struggling on Series S; it's the console that sells the most due to the price, but it really is struggling to run the most recent games. I'm not sure FSR 3 can work on RDNA 2, but they should get it on XSS and XSX ASAP. Still, it's 2024 and we should have everything running native, not 720p upscaled to whatever; looking at your FFXVI footage, that is 2005 HD-Ready graphics, or PS3/360. I've been gaming since 1988 and have seen it all in the last 35 years, and even Phil Spencer said during the FTC case over the ABK deal that if they don't hit the Game Pass targets in 2027, they will pull out of the business altogether.
Remember Nokia? They did it before, they will do it again and I see a lot of resemblances with the layoffs, it was what they did at Nokia as I was on the ground when it did occur and then they exited the business.
Seems like a shitty idea, more hardware devs have to optimize for. At least consoles lately have been pretty much cheap computers, which I assume makes it easier.
Intel: Everyone, come use our fabs. We have plenty of capacity. Also Intel Inside: "psst, fellas, once they move their chips to our fabs, let's grab up all of TSMC's primo capacity for our chips."
I doubt Microsoft would go with Intel, because companies like Intel and Nvidia are very difficult to work with. AMD's been a charm to work with, and they offer the best CPUs and competitive GPUs to get the job done.
@@nntt9764 I'm sure Microsoft can get along just fine with Intel and Nvidia, they work with them constantly on software and hardware, specifically the Surface laptops. Microsoft had some issues with Nvidia during the original Xbox, but that was back when Nvidia was a much younger company and nowhere near the size that it is today.
For over two decades Intel neglected consoles. The OG Xbox used Intel more by circumstance, as the Xbox team basically built a PC to convince Microsoft to do a console. It's utterly pathetic that Intel neglected the console space, let IBM take the money the following generation, and then let AMD dominate for two console generations. In an alternate timeline, Intel would have started their graphics development earlier and had stronger APUs available for consoles.
Yes, MLID forgot that the 1st Xbox used an Intel Pentium III CPU and an Nvidia GeForce 3 GPU, while the 2nd, the Xbox 360, was a 3-core IBM PowerPC CPU + ATI Xenos GPU. MS changed not only the CPU vendor but the whole instruction set (ISA). So I would not rule out ARM for the next Xbox (leaks show MS is thinking about ARM), especially now that MS has its brand-new server CPU, the 128-core Cobalt 100 (using ARM's Neoverse N2 cores). Win 10 and Win 11 already exist in ARM versions, so it would be nearly zero effort for MS.
@@PurushNahiMahaPurush Ironically, Intel used to own the best mobile ARM chips in the world for phones and PDAs... it was called Intel XScale (formerly DEC StrongARM). But Intel sold XScale to Marvell (where it died) and tried to replace ARM cores with x86 Atom cores (the same strategy Intel used to destroy DEC Alpha in servers). Oh boy, Intel tried hard, billions went into destroying ARM, but this time Intel hit physical laws: all modern x86 CPUs have been RISC inside since 1995, decoding CISC x86 instructions into RISC micro-ops, and that additional hardware translation costs transistors and higher power consumption compared to a pure RISC core. That's why old CISC x86 has no chance in smartphones and passively cooled laptops like the MacBook Air M1; x86 can live only in markets where higher power consumption is not an issue. The second major problem is that x86 costs more transistors, so the die area is bigger and costs more money to make (and is more difficult to develop, because you develop a RISC core plus CISC translation workarounds on top of it). It's just a matter of time before x86 is completely replaced by pure RISC ISAs like ARM or RISC-V.
I didn’t hear anything wrong with it when first mentioned but around 10 minute mark I started to hear a few hits, not crackling. I had to stop watching at 11 mins to get ready for work so maybe it got worse but I’ve watched your videos for years and don’t remember any tech issues
@@MooresLawIsDead It sounds like traditional artifacts from having too short an audio buffer. Try bumping your sample buffer size up one step. I could very well be wrong, but the video recording/encoding software may handle the input audio stream less stably because of the higher compression factor when you are live-streaming. It's a simple setting change with no real consequence unless you actively monitor yourself in a DAW or through software; heck, you could max your buffer sample count without any real consequence. Also try matching the input sample rate's base multiple with the recording software, so the resampler DSP doesn't have to work as hard. So if you prefer to use your max input rate and the recording software is set to a 44.1 kHz sample rate, set your interface to 44.1/88.2/176.4 kHz, and for a 48 kHz sample rate, 48/96/192 kHz... just in case the resampler is buggy for high-compression encoding with the streaming software when it's compensating for sampling frequencies with different base rates (phase/time incoherence).
One thing to verify the crackles aren't on your side: depending on the interface you're using, make sure the recorded rate matches your sample rate (i.e. 44.1 kHz, 48 kHz, 96 kHz). Also make sure the buffer size isn't too low; I wouldn't be surprised if the buffer is set just barely too low on yours, but I could also be wrong. For my Focusrite 2i4, anything above 192 is fine, 160 has the occasional crackle like in the video, and the lower I set it under 160, the more often crackles and other artifacts occur. Apologies for any annoyance from unrequested tips, but I wanted to suggest it anyway in case it helps.
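To put rough numbers on the buffer tradeoff the two comments above describe, here is a quick sketch (illustrative values, not anyone's actual settings): smaller buffers mean lower latency but less headroom before an underrun crackles, and sample rates that share a base multiple resample by clean integer ratios while 44.1 kHz to 48 kHz does not.

```python
# Rough illustration of audio buffer latency and sample-rate families.
def buffer_latency_ms(buffer_samples: int, sample_rate_hz: int) -> float:
    """One-way latency contributed by an audio buffer, in milliseconds."""
    return buffer_samples / sample_rate_hz * 1000.0

# A 160-sample buffer at 48 kHz is only ~3.3 ms of headroom; any hiccup
# in the recording/streaming software longer than that underruns (crackle).
for buf in (128, 160, 192, 256):
    print(f"{buf:4d} samples @ 48 kHz -> {buffer_latency_ms(buf, 48000):.2f} ms")

# Rates in the same family (44.1/88.2/176.4 or 48/96/192) relate by an
# integer ratio, so the resampler's job is trivial; cross-family is not.
def is_integer_ratio(src_hz: int, dst_hz: int) -> bool:
    return src_hz % dst_hz == 0 or dst_hz % src_hz == 0

print(is_integer_ratio(96000, 48000))   # True
print(is_integer_ratio(44100, 48000))   # False
```

The takeaway matches the advice above: raising the buffer one step only adds a few milliseconds of monitoring latency but removes most underrun crackle.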
The crackles are definitely on Tom’s end. First video I’ve seen today where they’re noticeable, and it wasn’t just the livestream. It’s very apparent in the recorded version too.
Sounds like you have system latency issues. An average computer ought to be able to handle 128-sample buffers. Likely a bad driver; sometimes it's from motherboard-integrated peripherals.
As someone who records music in a DAW, I once forgot that I had left my buffer low to get low latency. It caused crackling and even got the audio stuck in a strange digitized sound that required a restart to fix a couple of times, until I remembered the buffer size needed to be increased a little again.
Tom, why doesn't Loose Ends come in podcast format on all the usual platforms? I can usually only listen to multi-hour content during work (I'm a driver), so having it in podcast form would be greatly appreciated (if possible). Thanks for all the great content.
It used to be like that in the not-too-distant past, but if Xbox does another Series S, game devs will probably drop them altogether, as programming for 3 separate consoles, one of which has no market share, makes no sense.
Would non-affiliated multi platform devs having a hard time even be considered a bad thing as far as Microsoft is concerned? They’re pushing their own subscription service and their own in-house dev teams pretty hard. If that just made it harder for other devs to compete, that’s a win for Microsoft I would think as long as it doesn’t push too many consumers into Sony’s camp.
@@benjaminlynch9958 It's already pushing way too many to Sony, to the point that Microsoft was thinking of killing off the Xbox and going full PC and cloud.
@@rluker5344 It's not down to the brand. It sounds like this when you have a webcam and a capture or other real-time device on a shared PCI bus or USB interface; it happens on AMD or Intel, whatever. Going for brand bashing is silly, unless it's nGreedia with their current politics, which are anything but healthy for consumers. If people still defend that, they're beyond hope lol
I predicted years ago that Intel would try and court Sony or Microsoft at some point, as long as they had a functioning discrete graphics architecture. Intel is known for working very closely with OEMs, so a console manufacturer makes sense. Now whether or not either of them actually go for it? It seems kinda unlikely, at least for 10th gen. 11th gen though? Who knows.
MS definitively ending Xbox's back-compat program is telling. Then there's the whole "adorably digital" Brooklyn refresh. So, ending bc and signaling an all digital future does make sense for a move to Intel and/or Nvidia. I still think AMD is the most likely future.
Thx for the answer on mine! I like to think that I'm well informed on how the general process of bringing chips to market works by now, but I figured I'd ask anyway to see if I had any blind spots.
@saricubra2867 That's not the point of my comment. What I'm saying is that if Tom is wrong on his Zen 5 IPC leaks, then I'm unfollowing this channel because of how smug and arrogant he's being about it. In fact, it's getting harder and harder to ignore his ego lately. It's really irritating because he's constantly saying things like "...which THIS channel already leaked" and "...which this channel already put out," and he's just so matter-of-fact about everything he says. Then on top of that he throws shade on other people trying to leak things. RGT basically gave this channel a nod and a mention a couple of weeks back, saying that HE (Paul from Red Gaming Tech) was hearing an IPC uplift of more than 30%. Tom immediately did a video that didn't acknowledge Paul mentioning him and crediting him for a leaked slide, and instead low-key bashed him, adamantly saying that it's not gonna be 30%. I respect all leakers and take everything they say with a grain of salt, but Tom indirectly talking shit about other leakers is just gross and distasteful.
Looks like he didn't talk about the top news of the day: the release of the RTX 3050 6GB with a 70W TDP. The price is much too high at the moment, but there are cards with no power connector (no burning-connector issue ;) and even a blast from the past, a fully passive KalmX model. :D - Note: the original 3050 8GB, which has two variants at 115/130W, is about 20% faster.
Intel and Nvidia thought they were too damn good to supply Sony and Microsoft with their hardware. Now that AMD rules gaming, they want their share of console gaming back... Not going to happen.
Wait, you think Nvidia, with their stock at its highest, raking in money like never before thanks to AI, cares about the console gaming space when they're already profiting so much on the Switch? Are you delusional?
I have a 2070 Super and this gen has been so dismal I'm probably just gonna wait for RDNA 4 and Blackwell. If those aren't great maybe I'll look into used 4070 TIs by then
Sitting on a 2060 Super currently, same here. I can still play anything I want reasonably well (on only a 1080p 60Hz monitor), so I'll just wait. First I have to upgrade my monitor anyway.
What exactly is Microsoft's upside in switching to ARC? The hardware consumed WAY more power than Zen, it's less powerful, the drivers are way behind, the devs won't like having to deal with a completely new architecture. How much of a discount can Intel give to make all that palatable?
The beauty of consoles is that you can program directly to the metal, so software optimization is not an issue at all, just like with Apple. If devs optimize for it, it'll work much better than Arc on desktop, which will indirectly make Arc on desktop much better too.
Manufacturing capacity is a big deal. It took, what, 2-3 years after launch before the Xbox and PlayStation weren’t being scalped on eBay and were widely available on store shelves? That doesn’t just hurt hardware sales, but also software sales, particularly Xbox Live subscriptions.
@@benjaminlynch9958 What's more likely: that MS will take a big loss in power efficiency (CPU side) and power (GPU side) and dev familiarity, or that they will negotiate a bit harder with TSMC to get more fab space? Or maybe they can take a third path and have AMD adapt their designs for Intel foundries. (I bet AMD would leap at the chance to optimize their stuff for Intel process nodes at someone else's expense.)
Microsoft going ARM with Nvidia graphics for low power could be something interesting. I don't see an Intel CPU or GPU being a compelling option unless Intel takes that loss.
Hopefully they indeed go that route with the PS handheld. I've always had that idea for the next handheld, even for PS5: make it play last gen natively at normal res and possibly play select current-gen games at lower res. It's the most logical way to make one. That, or have some sort of integration with their smartphone division. Maybe even both, as mobile has its own sustainable ecosystem.
The only way I could imagine Intel making it sufficiently enticing to Microsoft to switch CPU/GPU vendors would be to sell the parts to MS at a loss. MS would still face significant investment in overcoming the compatibility issues this would introduce. Not as much as going to a non-x86 architecture but still something to consider when negotiating such a deal.
I'm pretty sure AMD's problem with R7000M was the comparative inefficiency of 7000 Desktop. Battery life matters! Thermals matter! Thin and lights are the meta right now.
Yeah, RDNA3 can make up for its inefficiency relative to Ada Lovelace on desktop with price/performance, but on laptop that's just not enough (and Intel's dGPUs are even further down the curve in that regard).
@@olnnn RDNA3 is a worse architecture than Ada Lovelace. Basically no one will buy an AMD Radeon laptop because of lackluster 3D productivity, thanks to terrible ray/path-tracing performance.
@@electrocola9765 You see, I would buy that, but it still has the same problem as buying an Xbox: it's gonna have Xbox games only, not Steam, while the Steam Deck has both.
Any updates on Intel's "Frame Generation" feature? They were last planning to overcome the delay problem without using any other technology by using a different technique (sacrificing image quality).
@@SAKTHITech Thank you for informing me. I didn't know that. Since I read it on a news site, I don't have full knowledge of the subject. If it can be turned into a feature within 1 year, as you say, it would be perfect. I think it would be a very fast response for Intel to offer such a feature within 1 year. I especially hope there will be a driver-based version like AMD does.
Honestly, an Intel-based Xbox with built-in XeSS accelerating hardware is probably the most interesting thing Xbox could do right now. It would help differentiate the system from PlayStation, and give Intel both the foundry business it wants and Arc the kick in the ass it needs. That being said, it's abundantly clear the senior leadership at Microsoft has given up and checked out of the console game, so despite being a compelling idea, it likely won't happen.
Intel can want to make the new Xbox, but if I were Intel I would make sure it kept a disc drive, because digital-only won't sell as well as they think.
Blackwell memory fix-up: I'm thinking about the previous leaks on 3GB memory chips. If those are ready for the Blackwell launch, it would make sense for Nvidia to just keep the same memory bus at every tier and use 3GB memory chips throughout the entire lineup, so every tier gets a 50% increase in memory capacity. The key here is that if they swap to 3GB chips, it would make sense to push them to as high-volume production as possible to drive down costs, and then stock only different bins of the same 3GB chips for that generation. Some people might also mistake that non-power-of-two increase in memory capacity for a fake leak, rounding it down to the next power-of-two chips and assuming a bus-width increase instead.
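The capacity math behind this speculation is simple to sketch (the bus widths below are illustrative tiers, not leaked specs): GDDR devices are 32 bits wide, so chip count is bus width divided by 32, and swapping 2GB devices for 3GB devices on an unchanged bus is a flat +50% at every tier.

```python
# Sketch of GDDR capacity as a function of bus width and chip density.
def vram_gb(bus_width_bits: int, chip_density_gb: int) -> int:
    chips = bus_width_bits // 32   # each GDDR device drives a 32-bit channel
    return chips * chip_density_gb

# Same bus, denser chips: every tier moves by exactly +50%.
for bus in (128, 192, 256, 384):
    print(f"{bus}-bit bus: {vram_gb(bus, 2)}GB -> {vram_gb(bus, 3)}GB")
```

This also shows why the resulting numbers (12GB, 18GB, 24GB, 36GB) look "non-power-of-two" without any bus change.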
Question: would a mobile equivalent of the 7600XT 16GB make sense for a laptop/prebuilt mini PC like the Minisforum HX99? It's slowly becoming clear that future games will require much more bandwidth than 8GB of GDDR can give, and 8GB of VRAM will surely hit a limit as higher and higher-res textures are used. So would a 16GB VRAM buffer make sense, even if the 7600(XT) doesn't have the power, or the hardware bus, to address the entire buffer at once?
32:24 This is an aside from your correct/valid point, but "high end" all depends on how you classify high end. If it means "some of the largest die/bus you can manufacture," then Vega did get a high-end card, but only with the Vega 64 (495mm²) being in a higher manufacturing class than the GTX 1080 Ti (471mm², whereas Pascal went up to a full 600mm²). The Radeon VII (331mm²), on the other hand, was in a smaller manufacturing class than the 3060 Ti/3070/3070 Ti (392mm²), and the 5700XT was even smaller at 251mm², just barely out of the traditional 180-250mm² XX50 class like the 4060 Ti (188mm²).
I hope Xbox can use Intel; it would help the Arc division survive. In fact, it would be a perfect match: Intel's hybrid CPU design needs optimization, and so does Arc. Actually, Arc's XMX-core and RT-core performance is impressive.
The thing I'm wondering most - right now Raytracing is a novelty feature, and we know empirically that it takes a 4080 or 4090 class product to run a game in FULL RAYTRACING at 1080p acceptably. And even then, it's a crapshoot - some games are still sub 60fps on 4080 @ 1080p raytraced. But raytracing IS the next step in 3D rendering. Whether people like it or not, it's happening. Especially if the "PS6" and "Xbox Herpderp" have robust enough raytracing performance, it could push devs to start designing their games around raytracing pipelines instead of raster sooner than we expect. How long do you think it'll be before we're forced to stop thinking in terms of "this card is better in raster, but this other one is better in raytracing" and actually HAVE to pay attention to the raytracing performance? That'll be a very interesting jump. And with regard to the RDNA4 launch - while I expect the highest end RDNA4 to be slower than the highest end RDNA3 in raster, I wonder if it'll be faster in raytracing? And whether that will be enough of a jump to make the purchase more significant specifically for future raytracing-required titles? Or if that jump will still not matter until RDNA5 and GeForce 60...?
Whenever we can conceivably see pure path tracing as viable (stable _native_ 1080p60) at $200-$300. That's it. Until that happens, it's not worth obsessing over.
This is just the opinion of a random guy on the Internet, but I think we're already at the point where "this card is better in raster, but this other one is better in ray tracing" is part of the dialogue. I remember when the 7900XTX and the 4080 were being reviewed, with the deciding factor between the two being the 4080's RT performance. You do have a point that eventually we might hit the stage where raster stagnates or even decreases while RT goes through the roof, and MLID has said RDNA4 is when AMD focuses on RT performance. Hypothetically we might get a card that matches a 7800XT in raster but lands between the 4080 and 4090 in RT. But you're right, it's a good point.
I think you're right on the money that future GPUs will win more on the merits of their RT performance over raster. Personally, I think the next generation of consoles will be a hard sell unless they can do full RT, like something to the effect of Cyberpunk 2077's Path Tracing mode. I just don't think there's any more raster techniques left to discover that would provide a generational leap in graphics that could wow the average consumer.
What I mean is I really think that if the next consoles push RT performance and RT graphics we might start seeing games come to PC sooner than expected that either REQUIRE RT, or look significantly worse without it. It's coming, the question is how soon?
Intel sounds desperate and confused. They're still losing revenue badly, so their genius plan is to enter... a razor-thin-margin market like consoles... especially considering their CPU deficiencies, and the fact that Intel needs a 256-bit GPU to compete with a 128-bit Radeon.
This leaked offer is exactly the kind of potential market-spoiling move that concerned me with Intel, Arc, and the AXG saga. Stronger competition against Nvidia's excesses is what's required, not just weaker, splintered, uneconomic efforts that rely on deficit financing.
Sony relies on AMD for backwards compatibility. Microsoft's backwards compatibility is software-based, making them less reliant on AMD. I can absolutely see them jumping ship.
Jensen, Su, and Gelsinger probably discuss everything behind closed doors. They could all increase performance dramatically, but choose not to, because of long-term income potential... the withholding of more RAM is the clear indicator of this.
Intel getting into consoles could bring balance to the gaming industry. If Microsoft goes Intel, then the consoles will have all GPU teams on board: Nvidia on Nintendo consoles, AMD on Sony's PlayStation, and Intel on Microsoft's Xbox. Game devs would be forced to optimize for Intel's architecture, which would help Intel's desktop graphics cards get well enough optimized to properly compete with AMD and Nvidia. Plus, Intel's XeSS is superior to FSR on Intel's own GPUs, so they would also have better upscaling than the AMD-powered PlayStation, and they're ahead of AMD in RT-to-raster performance ratio, so that means better RT performance for Xbox too, which is much needed considering the main reason so few games properly implement ray tracing is RDNA's shitty RT performance in the consoles. All of this could finally make the console market actually interesting again: Sony with inferior performance but great exclusives, and Xbox with Game Pass and the best console performance.
@@tringuyen7519 MS kinda has their own system for back compat through some form of emulation, so I don't think switching to Intel would pose a problem.
I will never again buy handheld from Sony. I had original PSP, I had Vita and Sony clearly showed they can fuck up support for their handhelds no matter what.
5:00 - This would be great if Microsoft created a console with Intel or Intel/Nvidia. I don't think a scenario where everything is made the same way is good. 25:00 - 8800XT as fast as the 7900XTX? I don't think so; IMHO it will be as fast as the 7900XT. Even AMD said something to the effect that the 7900XTX will still be on top. So... the 8800XT will be as fast as the 7900XT, i.e. as fast as the 4070 Ti Super, i.e. as fast as the ~$600 5070, or maybe even slower... so yeah, I would say don't buy a $1k XTX now. 43:00 - Zen 2 -> Zen 3 was +19% IPC and Zen 3 -> Zen 4 was +14% IPC, so Zen 4 -> Zen 5 at +15-20% IPC will be nothing spectacular, especially if the core count stays the same. Time will tell...
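For what it's worth, those generational IPC figures compound multiplicatively, so a quick back-of-the-envelope calculation shows the cumulative uplift over Zen 2 (the +16% for Zen 4 -> Zen 5 below is an assumed midpoint of the rumored 15-20% range, not a confirmed number):

```python
# Compound the per-generation IPC gains quoted in the comment above.
gains = [
    ("Zen 2 -> Zen 3", 0.19),   # measured generational uplift
    ("Zen 3 -> Zen 4", 0.14),   # measured generational uplift
    ("Zen 4 -> Zen 5", 0.16),   # assumed midpoint of the rumored 15-20%
]

cumulative = 1.0
for step, g in gains:
    cumulative *= 1.0 + g       # IPC gains multiply, they don't add
    print(f"{step}: +{g:.0%}, cumulative vs Zen 2: +{cumulative - 1.0:.0%}")
```

Even "unspectacular" mid-teens gains stack up to well over +50% across three generations, which is worth keeping in mind when judging a single jump.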
1) Intel & Nvidia working together to sell an APU for an Xbox at cost? Are you crazy? 2) No one knows what an 8800XT will achieve yet. 3) Zen 5 vs Intel's Raptor Lake Refresh+++? Or Zen 5 vs Arrow Lake? Zen 5 will win against both easily.
@@tringuyen7519 1. Sony will make another $500 meh console, and Xbox will create a $999 one that does 4K 60fps max settings with ray tracing. People would buy it right away. 2. Sure, I just said what I heard. 3. Yes, sure, Zen 5 will be faster than Intel, but that doesn't change the fact that it won't be the mind-blowing CPU so many people are already hyped about.
"Flailing" chip maker that made $55B in revenue during a failure of a year. Competition is good for all of us; AMD having to compete lowers prices for these devices.
Wow, so many jumping to the defense of "poor underdog" Intel... Yeah, competition is good, but AMD is the competition, not Intel. We'd need another decade of AMD dominating for those positions to change meaningfully. Intel IS flailing! Watch almost any video from the past 2 years on this channel... or better yet, take 5 minutes to look at their financial report for 2023! Intel spends the equivalent of AMD's revenue on R&D, only to get completely run over by AMD's entire product stack! Intel can and must take these hits, and they are far from going under; once they start liquidating assets, you can start to worry. For the record, I don't hate Intel and I'm not an AMD fanboy. However, I've been seriously burned by Intel before, and then there's the small matter of almost the entire 2010s, when they had the entire market cornered and priced accordingly, while their bloated R&D budget delivered what could charitably be described as modest gains. IMHO Ryzen is the biggest and best disruptor of the home PC market since discrete GPUs! And now similar things are happening in the server market. Sooooo yeah, competition is good and this is it! Now it's time for Intel to get competitive! Right?
@@lucasfranke5161True. Tom has often brought up that Nvidia desperately wants Nintendo to use modern hardware in their consoles for development reasons, but Nintendo's too cheap to go for anything recent. 😂
With the launch of DG1, I said what they needed to do was offer a compelling APU for a mini console, micro PC/laptop, or preferably a handheld console (or a major desktop console brand like Xbox/PS, but fat chance of that). Sure, at the time most of these handheld things had something like a 4700U and cost $1500 with GT 1030-1050 level graphics, which is why adoption was so slow: the CPU was way too powerful and the GPU way underpowered. But imagine if Intel had landed someone big at the time, like the GPD Win Max, with a more balanced APU: 2-4 cores but a massive GPU portion, say 128 Xe cores, at a compelling price for all of these mini PCs and handhelds, along with laptops. I think they would have gotten far more market penetration, and then far more devs would have considered optimizing for them. I was actually surprised by the specs of the A770; I didn't expect a 70-class die or memory bus (or I guess 80-class, now that Nvidia rebranded 400mm²/256-bit as the 4080, instead of the 970, 2070, 3070, etc.). What I wasn't surprised about was that a 70-class card as a first real public entry didn't do well. Part of what surprised me about a 70-class card is that if you're going to go big, you have to go BIG, I'm talking 80 Ti/90-class, 550-600mm², 384-bit bus. If you're not going to target the BigDikEnergy large-card market, you should spend that allocation on the segment that moves volume, the 100-300mm², 96-192-bit market; if you make the card a lot of people buy because it's cheap with reasonable performance, devs are going to target it. Instead, the A380, while acceptably priced, only has a 96-bit bus, and the card that should have been called the A380, or A580, was only ever used in the Arc Pro A60: ~270mm², 2048 shaders, 192-bit bus. That should have been not only the top card but the only die of the generation, priced around $169-$199 (it costs less to manufacture than the actual A580 at $179).
Give the A580 the full 2048 shaders and 12GB of RAM, give the A380 1536 shaders and 6GB of RAM but maintain the 192-bit bus, give the A350 1024 shaders and 8GB of RAM on a cut-down 128-bit bus, and give the A330 768 shaders on a 192-bit bus. Oh, and make sure the A350 and below don't need 6/8-pin supplemental power.
Intel would be ideal for the potential thin-client Xbox. I think Intel still has the best media engine, which is ideal when all you're doing is streaming video from some server somewhere.
I seriously doubt Xbox will use non AMD hardware for their next console. For starters, it'd essentially kill their whole Back Compat library and push. We'd be back to waiting for updates for batches of games. They'll most likely opt to go for less custom arch and more standard RDNA 5 or 6 + Zen 6
Honestly, why is AMD letting Microsoft toy with them this much? AMD is currently in a position where they can talk back to MS and make some counter demands. Why don't they?
If you really think about it, all 8000-series CPU cores are Zen 4c cores, since they have 2MB of L3 cache per core compared to 4MB per core on the 7000 series, and L3 seems to be crucial for gaming loads. So why don't they use their 3D cache technology to save horizontal die space for more CUs on their APUs? Why not use 3D cache for the CUs as well, since DDR5 is also a huge bottleneck (at least compared to GDDR6)? I know it's not as simple as copy & paste, but why isn't it in development?
If I boot up my Xbox right now, I don't see any AMD logo, so no, we wouldn't see Intel logos on the Xbox. And I really doubt Microsoft is reluctant to launch another console. They are committed to gaming, and they need a box for those who don't want to buy a PC or handheld or whatever.
Microsoft is just the first of many to jump on board with Intel. Idk why people are shitting on Intel, like AMD wasn't being shit on for a while before it changed things up. Xbox's next console will be a handheld streaming device for Game Pass.
I think people need to go to Wikipedia and check the GeForce 30 series: all GPUs not using the top die usually have two variants with different dies, just to use up the bad yields. The 90% yield that TSMC and everyone else quote for their nodes is for fabbing tiny 37mm² test chips, not for 400-600+ mm² monster dies, which have a much, much lower yield; only by down-binning can a real customer achieve anything near 90% in reality. Examples: GA102 is used as low as the 3070 Ti with 40% of the cores disabled // GA104 is used in the 3060 with 40% of the cores disabled.
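A simple Poisson defect model (one of several standard yield models, used here purely as an illustration with assumed numbers) shows why a ~90% yield on a tiny test chip does not carry over to big dies at the same defect density:

```python
import math

# Poisson yield model: probability a die of a given area has zero defects.
def poisson_yield(area_mm2: float, defects_per_cm2: float) -> float:
    return math.exp(-defects_per_cm2 * area_mm2 / 100.0)

# Back out the defect density implied by 90% yield on a 37 mm^2 test chip...
d0 = -math.log(0.90) * 100.0 / 37.0   # roughly 0.28 defects/cm^2

# ...then apply that same density to progressively larger dies. Down-binning
# (selling partial dies as lower SKUs) recovers the chips this counts as dead.
for area in (37, 200, 400, 600):
    print(f"{area:3d} mm^2 -> {poisson_yield(area, d0):.0%} perfect-die yield")
```

Under these assumptions a 600mm² die yields well under a quarter of candidates as fully working chips, which is exactly why big dies ship in several cut-down variants.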
The 7900XTX should have been $800 from the beginning. For RDNA 4 to be competitive, there has to be a card with 7900XTX-level performance for around $600: a ~300-watt monolithic GPU with 20GB of 21Gbps memory on a 320-bit bus. Basically 2x performance per dollar compared to RDNA 2 SKUs at MSRP, but with much faster ray tracing, at least beating the 4070 Ti Super.
@Tom: Gerald Undone had the same crackling issue. The only solution was to restart the stream in OBS (not in YouTube itself). Seems many people have this exact issue right now.
Have you looked into the mod that enables DLSS3 frame generation for all RTX cards? It converts the pipeline to FSR3, so it's a little fuzzy, but it's a great FU to Nvidia for their gatekeeping BS to try and force people onto the 40 series. Imagine how pissed you'd be if you bought a 3090, or especially a 3080 Ti or 3090 Ti, just to have Nvidia give you the middle finger months later.
Intel is better at ray tracing on GPUs than AMD, maybe even better than Nvidia. It's good for competition. I wish the PS6 would use Nvidia for DLSS and path tracing. AMD is so behind on GPU tech; only the size of the transistors saves them.
High TDP ruins the laptop experience so much that you basically disable boosting and undervolt. APUs should be non-boosting ~2.5GHz all-efficiency cores... or one at ~3.5GHz and the rest at ~2.5GHz.
If they go for Intel with Xbox, it will probably be with Arc Druid, not Celestial as you referred to; Druid would be the one focused on even more. It also makes complete sense for Xbox to shift over to Intel, seeing how they are unable to beat PlayStation while working with AMD. And finally, given how the Intel handhelds are beating the AMD handhelds at the moment, they have shown that they can do it, even with the wildly inferior Alchemist.
Microsoft is now the most valuable company ahead of apple. If they choose someone else other than AMD, they can certainly afford it. Even Hamilton will race for Ferrari in 2025. Nothing is impossible
My guess is MS probably went to Nvidia first for the Xbox portable because of the Switch's success and Nvidia's expertise there. Plus DLSS is much better than FSR etc... That's probably why there was talk of ARM CPUs in the documents.
I don't think it will be an Intel and Nvidia combo for the next Xbox, but an Intel CPU and GPU, with a GPU that comes really close to what AMD will offer. Something that proves to the world that Intel can do good graphics.
AMD Strix Halo won’t compete with M2 Ultra/M3 Ultra 🙏 Remember, Ultra is the huge dual die configuration (>130bn transistors, 32 x 16bit memory channels, 76 core/9728 ALU GPU, 16 P-cores with 4MB L2 per core, 8 E-cores, AMX SIMD block, 96MB of SLC, ~7 display controllers, 2 NPUs, multiple video encode/decode engines, SSD controller, several Thunderbolt controllers etc). It’s as exotic as silicon gets 😍 Strix Halo will be competitive with M3 Max in some cases, but it won’t be an outright win by any means. The gaming performance of Apple GPUs is hard to place given the smaller number of Mac native games, but the compute side of things is much easier to measure. When it comes to compute, Apple has a huge advantage over AMD and Intel because Metal as a whole is more mature and well supported. For example, Apple GPUs are already well supported for ML workloads, and even old base M1 Macs can run them well (outside of Nvidia, Apple GPUs are the only ones that can be considered “just works” for e.g. local LLMs). Another example is Blender - Apple’s new ray tracing hardware allows M3 Max (~50W) to match the rendering performance of a 7900 XT! (again, aside from Nvidia, Apple is the only good option for Blender). Honestly, people don’t like to hear this, but I think Apple is doing a better job in graphics than AMD or Intel. They’ve invested in a great GPU architecture and (most importantly) a great software stack. Does it matter to gamers? Maybe not today, but who knows what the future will bring. (Not hating on AMD here btw - just giving perspective 🙏)
I advise Microsoft to stick with AMD for future projects. A couple reasons being backwards compatibility and AMD tech is still in motion and getting better!!
What AMD needs to do is give more value in software so they can justify charging higher prices, and support more software. It's a virtuous cycle, what Nvidia did. I told you repeatedly YEARS ago that Nvidia was giving more value by supporting machine learning even on consumer cards, whereas even today AMD refuses to officially support ROCm drivers on consumer cards except for the higher-end 7900s. And AMD fanboys constantly give excuses about how AMD can't afford to hire developers or needs time, when they have had more than a decade and could have easily done it if they had started years earlier. AMD management simply refused to make money and got bailed out by Intel misapplying Jack Welch's GE management to stack-rank their company into technical oblivion.
so the next gen XBOX going to be needing a 800w power brick and water cooling for the 400w CPU? or is it going to be a bunch of atom cores and a UHD650?
Will Granite Ridge really be named Ryzen 9000? It makes no sense; they changed the laptop naming scheme so it doesn't overlap with the desktop one. Sure, Ryzen 8000G is Zen 4, but before Zen 3 that's how AMD named APUs.
In their new mobile naming scheme the first digit is now the year of release, not linked to the gen anymore. If they want to make that consistent on desktop too, then the 8000Gs are 8000 because they came out this year. Granite Ridge released this year would then also have to be 8000. Though they could still decide not to apply that same logic to the desktop line up and instead keep using odd numbers for Ryzen desktop, even numbers for APUs.
1:02:07 Is it the memory bandwidth that they care about most in AI, or is it memory capacity? Because with 256-bit DDR5/LPDDR5, in theory you could get 1TB of ECC UDIMM on Strix Halo. And while Apple's solution could support ~512GB according to the sparse specs, Apple only offers 192GB, presumably to maintain that very high bandwidth, which is what I argue is the wrong thing to do with the Z1X. C'mon Asus and Lenovo, I don't care if you have DDR5-25kCL3 if the GPU ends up using the SSD as RAM because you only have 16GB to share between the CPU and GPU and all of the background tasks require 6-8GB.
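A quick back-of-envelope sketch of the bandwidth side of that capacity-vs-bandwidth tradeoff. The transfer rates below are illustrative assumptions for a 256-bit laptop part versus an Apple-style very wide bus, not confirmed specs for Strix Halo or any Mac:

```python
# Bandwidth (GB/s) = bytes per transfer * transfers per second
# bus_width_bits / 8 gives bytes moved per transfer; MT/s is millions of transfers/s.
def bandwidth_gbps(bus_width_bits: int, transfer_mts: int) -> float:
    return bus_width_bits / 8 * transfer_mts / 1000  # GB/s

print(f"256-bit @ 8000 MT/s (LPDDR5X-class):  {bandwidth_gbps(256, 8000):.0f} GB/s")
print(f"1024-bit @ 6400 MT/s (Ultra-class):   {bandwidth_gbps(1024, 6400):.0f} GB/s")
```

The wide-bus part wins on bandwidth by roughly 3x at these assumed rates, which is the likely reason Apple caps capacity rather than hang slower, denser memory off the same controllers.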
Because AMD doesn't use Big-Little on x86. AMD has nothing against my 12700K and 13600K, way too powerful and Intel losing money selling those for cheap.
Remember Nokia? They did it before, and they will do it again. I see a lot of resemblance in the layoffs; it's what they did at Nokia. I was on the ground when it occurred, and then they exited the business.
Xbox and PS being on different architectures would make game performance comparisons super interesting
And the Xbox would run three times hotter
@@Wahinies the next gen xbox will be cloud hybrid. kinda pointless to compare.
@@Wahinies I got the popcorn ready
PS would win in every single category, not very interesting at all.
Seems like a shitty idea, more hardware devs have to optimize for. At least consoles lately have been pretty much cheap computers, which I assume makes it easier
Intel: Everyone, come use our fabs. We have plenty of capacity.
also Intel Inside: "psst, fellas, once they move their chips to our fabs, let's grab up all of TSMC's primo capacity for our chips."
Yeah....I mean if you're an investor you can't ignore what they're low-key saying...
Intel also have a tradition, you really going to take your unique design ideas to a corp with massive conflicts of interest?
@@RobBCactive Qualcomm & Nvidia will wait until Intel actually uses its fabs for its own CPUs. 3/4 of the tiles on MTL & ARL come from TSMC!
@@tringuyen7519 Both are firms Intel is competing against, so like Apple and Samsung, it's about more than just process
@@RobBCactive This. If you use Intel Fabs they will steal your IP and use it themselves.
next Xbox will consume 400watts if intel gets the job lmao.
Lol
Yeah... if it is CPU only and has no GPU ;)
I doubt Microsoft would go with Intel, because companies like Intel and Nvidia are very difficult to work with. AMD's been a charm to work with, and they offer the best CPUs and competitive GPUs to get the job done.
And successive 5% improvements over each generation
@@nntt9764 I'm sure Microsoft can get along just fine with Intel and Nvidia, they work with them constantly on software and hardware, specifically the Surface laptops.
Microsoft had some issues with Nvidia during the original Xbox, but that was back when Nvidia was a much younger company and nowhere near the size that it is today.
For over 2 decades Intel neglected consoles. The OG Xbox used Intel more by circumstance, as the Xbox team basically built a PC to convince Microsoft to do a console. It's utterly pathetic that Intel neglected the console space, let IBM take the money the following generation, and then let AMD dominate for 2 console generations.
In an alternative timeline, Intel would have started their graphics development earlier and had stronger APUs available for consoles.
Yes, MLID forgot that the 1st Xbox used an Intel Pentium III CPU and an Nvidia GeForce 3 GPU.
The 2nd, the Xbox 360, was a 3-core IBM PowerPC CPU + ATI Xenos GPU.
MS changed not only the CPU vendor but also the whole instruction set (ISA). So I would not rule out ARM for the next Xbox (leaks show MS is thinking about ARM), especially now that MS has its brand-new 128-core server CPU, the Cobalt 100 (using ARM's Neoverse N2 cores). Windows 10 and 11 already exist in ARM versions, so it would be zero effort for MS.
Intel seemed ready to abandon home computing in general as well, prioritizing servers and laptops, until AMD revitalized desktop CPUs with Ryzen
Intel has also neglected the mobile space. They got in too late, and their Intel Atom offering was half-assed and sucked so bad
@@PurushNahiMahaPurush Ironically, Intel used to own the best mobile ARM chips in the world for phones and PDAs: Intel XScale (the former DEC StrongARM). But Intel sold XScale to Marvell (where it died) and tried to replace ARM cores with x86 Atom cores (the same strategy Intel used to destroy DEC Alpha in servers).
Oh boy, Intel tried hard; billions went into destroying ARM. But this time Intel hit physical laws: all modern x86 CPUs have been RISC inside since 1995, decoding CISC x86 instructions into RISC micro-ops, and this additional HW translation costs transistors and extra power consumption compared to a pure RISC core. That's why old CISC x86 has no chance in smartphones and passively cooled laptops like the MacBook Air M1.
x86 can live only in markets where higher power consumption is not an issue. The second major problem is that x86 costs more transistors, so it's a bigger die area and costs more money to make (and is more difficult to develop, because you develop a RISC core plus the CISC translation workarounds on top of it). It's just a matter of time until x86 is completely replaced by pure RISC ISAs like ARM or RISC-V.
@@richard.20000 As long as Microsoft keeps bungling Windows on ARM, x86 will stay for quite a while.
If I may, every Loose Ends episode has audio crackling while Broken Silicon is always flawless.
Loose Ends is a Livestream, and weird YT issues happen sometimes as a result.
On my phone I can't hear the crackling even at max volume
On earbuds you can but its not too bad
I didn't hear anything wrong with it when first mentioned, but around the 10 minute mark I started to hear a few hits, not crackling. I had to stop watching at 11 mins to get ready for work, so maybe it got worse, but I've watched your videos for years and don't remember any tech issues
@@MooresLawIsDead It sounds like traditional artifacts from having too short of an audio buffer.
Try bumping your sample buffer size up one step.
I could very well be wrong, but the video recording/encoding software may handle the input audio stream less stably when live-streaming because of the higher compression factor.
It's a simple setting change with no real consequence unless you actively monitor yourself in a DAW or through software. Heck, you could max out your buffer sample count without any real consequence.
Also try matching the input sample rate's base multiple with the recording software, so the resampler DSP doesn't have to work as hard.
So if you prefer to use your max input rate and the recording software is set to 44.1kHz, then set your interface to 44.1/88.2/176.4kHz,
and for 48kHz, to 48/96/192kHz... just in case the resampler is buggy for high-compression encoding with the streaming software when compensating for sampling frequencies with different base rates (phase/time incoherence).
One way to verify the crackles aren't on your side: depending on the interface you're using, make sure the recorded rate matches your sample rate (i.e. 44.1kHz, 48kHz, 96kHz). Also, make sure the buffer size isn't too low; I wouldn't be surprised if the buffer is set barely too low for yours, but I could also be wrong. For my Focusrite 2i4, anything above 192 is fine, 160 has the occasional crackle like in the video, and the lower I set it under 160, the more often crackles and other artifacts occur. Apologies if unrequested tips are annoying, but I wanted to suggest them anyway to possibly help.
The crackles are definitely on Tom’s end. First video I’ve seen today where they’re noticeable, and it wasn’t just the livestream. It’s very apparent in the recorded version too.
@@benjaminlynch9958 yup. can hear em on my big ass hifi and my fancy cans lol
Sounds like you have system latency issues. An average computer ought to be able to handle 128-sample buffers. Likely a bad driver; sometimes it's from mobo-integrated peripherals.
As someone who records music in a DAW: I had forgotten that I'd left my buffer low to get low latency, and it caused crackling and even caused the audio to get stuck in a strange digitized sound that required restarting to fix a couple of times, until I remembered the buffer size needed to be increased a little again.
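The latency/stability tradeoff these replies describe is just buffer size divided by sample rate: a bigger buffer adds monitoring latency but gives the driver more headroom before an underrun (heard as crackles). A rough sketch with illustrative values:

```python
# Latency contributed by one audio buffer, in milliseconds.
# A larger buffer means more latency but fewer underruns (crackles/clicks).
def buffer_latency_ms(buffer_samples: int, sample_rate_hz: int) -> float:
    return 1000.0 * buffer_samples / sample_rate_hz

for samples in (64, 128, 192, 256, 512):
    print(f"{samples:>4} samples @ 48 kHz -> {buffer_latency_ms(samples, 48_000):.1f} ms")
```

Even a "huge" 512-sample buffer at 48kHz is only about 10.7 ms, which is why maxing the buffer is usually harmless unless you're live-monitoring yourself through software.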
Sounds like MS is debating whether or not to even release another Xbox.
Tom, why doesn't Loose Ends come in podcast format on all the usual platforms? I can usually only listen to multi-hour content during work (driver), so having it in podcast form would be greatly appreciated (if possible). Thanks for all the great content!
you can download just the audio track with yt-dlp
Because it's livestreamed, and most podcast platforms don't allow livestreams
@@wile123456 He could just record it himself and upload it as a podcast
@@wile123456 oh, i didnt know that. thanks
Intel Nvidia vs AMD AMD consoles would be crazy.
Yes, an Intel/Nvidia console would be 2x more expensive. You wanna pay $1200 for a console????? AMD will do the same performance for $600.
Would be basically Gamecube (IBM/ATI) vs Xbox (Intel/NVIDIA) all over again.
@@skywalker1991 hahahaha sure, AMD fanboy
Getting a contract for new Xboxes would save ARC, but that would make multiplat focused Devs have a harder time. I kinda doubt it would happen.
As long as it does not go with nvidia, I am fine. Nvidia is toxic AF.
It used to be like that in the not-too-distant past, but if Xbox has another Series S, then game devs will probably drop them altogether, as programming for 3 separate consoles, 1 with no market share, makes no sense
Would non-affiliated multi platform devs having a hard time even be considered a bad thing as far as Microsoft is concerned? They’re pushing their own subscription service and their own in-house dev teams pretty hard. If that just made it harder for other devs to compete, that’s a win for Microsoft I would think as long as it doesn’t push too many consumers into Sony’s camp.
Honestly, yes. One could argue AMD's console contracts have kept Radeon running.
@@benjaminlynch9958 It's already pushing way too many to Sony, to the point that Microsoft was thinking of killing off the Xbox and going full PC and cloud
AMD can never get XPS and Surface Laptops. Cuz Dell and Microsoft don't care about what is good for their customers.
The 8700G APUs are really impressive; there's no way they ditch AMD.
You def have crackling that sounds like it's coming from a shared PCI/USB bus constraint.
I was going to say it must be AMD hardware, but yours is better. And real.
@@rluker5344 It's not down to the brand. It sounds like this when you have, say, a webcam and another real-time capture device on a shared PCI bus or USB interface; it happens on AMD or Intel, whatever. Going for brand bashing is silly, unless it's nGreedia with their current politics, which is everything but healthy for consumers. If people still defend that, they're beyond hope lol
@@_Melos It is tempting, but the shared data lane explanation is too legitimate to not give it credit.
I could consider buying a Surface if it had AMD inside
MLID that is the best thumbnail yet xD
I predicted years ago that Intel would try and court Sony or Microsoft at some point, as long as they had a functioning discrete graphics architecture. Intel is known for working very closely with OEMs, so a console manufacturer makes sense.
Now whether or not either of them actually go for it? It seems kinda unlikely, at least for 10th gen. 11th gen though? Who knows.
MS definitively ending Xbox's back-compat program is telling. Then there's the whole "adorably digital" Brooklyn refresh.
So, ending bc and signaling an all digital future does make sense for a move to Intel and/or Nvidia. I still think AMD is the most likely future.
Thx for the answer on mine! I like to think that I'm well informed on how the general process of bringing chips to the market works by now, but I figured I'd ask any way to see if I had any blind spots.
That's a lot of assumptions based only on the fact that Microsoft hasn't been in contact with AMD until just recently. Fun to imagine though!
Zen 5 IPC will determine whether or not I continue to follow this channel.
Alder Lake has higher IPC than Zen 4.
@saricubra2867 That's not the point of my comment. What I'm saying is, if Tom is wrong on his Zen 5 IPC leaks, then I'm unfollowing this channel because of how smug and arrogant he is being about it. In fact, it's getting harder and harder to ignore his ego lately. It's really irritating because he's constantly saying stuff like "...which THIS channel already leaked" and "...which this channel already put out", and he's just so matter-of-fact about everything he says. Then on top of that he throws shade on other people trying to leak things. RGT basically gave this channel a nod and a mention a couple weeks back, saying that HE (Paul from Red Gaming Tech) was hearing an IPC uplift of more than 30%. Tom immediately did a video not acknowledging that Paul had mentioned and credited him for a leaked slide, and instead low-key bashed him, adamantly saying that it's not gonna be 30%. I respect all leakers and take everything they say with a grain of salt, but Tom indirectly talking shit about other leakers is just gross and distasteful.
Looks like he didn't talk about the top news of the day, the release of the RTX 3050 6GB with a 70W TDP. The price is much too high atm, but there are cards with no power connector (no burning-connector issue ;) and even a blast from the past, a fully passive Palit KalmX model. :D - Note: the original 3050 8GB, which has 2 variants (115/130W), is about 20% faster.
Intel and Nvidia thought they were too damn good to provide Sony and Microsoft with their hardware. Now that AMD rules gaming, they want their share of console gaming back...
Not going to happen
Wait, you think Nvidia, with their stock at its highest and raking in money like never before thanks to AI, cares about the console gaming space when they are profiting so much on the Switch? Are you delusional?
@@nekogami87 Nvidia cares for clout and marketing. But the greed is strong.
@@nekogami87 When GTA6 comes out in 2025, every gamer will want a PS5 Pro! PCs won't see GTA6 until 2027.
Thanks for the leaks!
Thank you for the support. :)
I have a 2070 Super, and this gen has been so dismal I'm probably just gonna wait for RDNA 4 and Blackwell. If those aren't great, maybe I'll look into used 4070 Tis by then.
I got a used 2070S for 150 bucks on eBay. Currently using it for a 4K 60Hz display; with DLSS I can play Baldur's Gate 3 at almost max settings
Sitting on a 2060S currently, same here. I can still play anything I want reasonably (also only a 1080p 60Hz monitor), so I'll just wait. First I have to upgrade my monitor anyway.
What exactly is Microsoft's upside in switching to Arc? The hardware consumed WAY more power than Zen, it's less powerful, the drivers are way behind, and devs won't like having to deal with a completely new architecture. How much of a discount can Intel give to make all that palatable?
The beauty of consoles is that you can directly program to the metal so software optimization is not an issue at all just like with Apple.
If devs optimize for it, it'll work much better than Arc does on desktop, which will indirectly make desktop Arc much better
Manufacturing capacity is a big deal. It took, what, 2-3 years after launch before the Xbox and PlayStation weren’t being scalped on eBay and were widely available on store shelves? That doesn’t just hurt hardware sales, but also software sales, particularly Xbox Live subscriptions.
@@benjaminlynch9958 What's more likely: that MS will take a big loss in power efficiency (CPU side), power (GPU side), and dev familiarity, or that they will negotiate a bit harder with TSMC to get more fab space?
Or maybe they can take a 3rd path, and have AMD adapt their designs for Intel foundries. (I bet AMD will leap at the chance to optimize their stuff on Intel process nodes on someone else's expense).
For a company to buy Activision for WAAY too much fucking money and then drag its feet on the next gen console is the most pathetic bluff.
Microsoft going ARM with Nvidia graphics for low power could be something interesting. I don't see Intel offering an interesting or beneficial CPU or GPU unless they take a loss on it.
It will never happen, but it would be awesome if MS made a deal with Intel and Nvidia and put a really good Intel CPU with a nice Nvidia discrete GPU.
Seems Nvidia used cheaper components to drop the price on the Super instead of dropping the 4080's price.
Hopefully they indeed go with that route with the PS handheld. I've always had that idea with the next handheld, even for PS5: make it play last gen natively at normal res and possibly play select current gen games at lower res. It's the most logical way to make one. That, or have some sort of integration with their smartphone division. Maybe even both, as mobile has its own sustainable ecosystem.
The only way I could imagine Intel making it sufficiently enticing to Microsoft to switch CPU/GPU vendors would be to sell the parts to MS at a loss. MS would still face significant investment in overcoming the compatibility issues this would introduce. Not as much as going to a non-x86 architecture but still something to consider when negotiating such a deal.
Keep us posted about Xbox's news. I am very interested in the new console. Thanks!
I'm pretty sure AMD's problem with R7000M was the comparative inefficiency of 7000 Desktop. Battery life matters! Thermals matter! Thin and lights are the meta right now.
Yeah, RDNA3 can offset its inefficiency versus Ada Lovelace on desktop with price/performance, but on laptop that's just not enough (and Intel's dGPUs are even lower down the curve in that regard)
@@olnnn RDNA3 is a worse architecture than Ada Lovelace. Basically no one will buy an AMD Radeon laptop because of lackluster 3D productivity, thanks to terrible ray/path tracing performance.
Intel/Nvidia makes a lot of sense for MS next box as it's going to be a hybrid console.
I'm not convinced there's going to be another Xbox
@@electrocola9765 You see, I would buy that but that still has the same problem as buying an Xbox. It is gonna have Xbox games only but not steam while the Steam Deck has both.
It will probably be a cloud machine.
I also think xbox will make that jump first.
Any updates on Intel's "Frame Generation" feature? They were last planning to overcome the delay problem without using any other technology, by using a different technique (sacrificing image quality).
Oh, it was just a scientific paper. It might become a feature in a year or so.
@@SAKTHITech Thank you for informing me; I didn't know that. Since I read it on a news site, I don't have full knowledge of the subject. If it can be turned into a feature within a year, as you say, that would be perfect. I think offering such a feature within a year would be a very fast response from Intel. I especially hope there will be a driver-based version like AMD does.
Honestly, an Intel-based Xbox with built-in XeSS accelerating hardware is probably the most interesting thing Xbox could do right now. It would help differentiate the system from PlayStation, and give Intel both the foundry business it wants and Arc the kick in the ass it needs. That being said, it's abundantly clear the senior leadership at Microsoft has given up and checked out of the console game, so despite being a compelling idea, it likely won't happen.
Intel may want to make the new Xbox, but if I were Intel, I would make sure it kept a disc drive, because digital-only won't sell as well as they think.
If you have a PCVR headset and like Battlefield type games, get War Dust. It's a perfect Battlefield conversion to VR.
Blackwell memory fix-up: I'm thinking about previous leaks of 3GB memory chips. If those are ready for the Blackwell launch, it would make sense for Nvidia to keep the same memory bus at every tier and use 3GB memory chips throughout the entire lineup, so at every tier there's a 50% increase in memory capacity. The key here is that if they swap to 3GB chips, it would make sense for them to push those to as high volume production as possible to drive down costs, and then keep inventory of only different binnings of the same 3GB chips for that generation.
Some people might also mistake that non-power-of-two increase in memory capacity for a fake, round it down to the next power of two, and assume a bus-width increase.
Question: would a mobile equivalent of the 7600XT 16GB make sense for a laptop/prebuilt mini PC like the Minisforum HX99 thing?
It's slowly becoming clear that future games will require much more bandwidth than 8GB of GDDR can give, and 8GB of VRAM will surely hit a limit as higher and higher res textures are used.
So would a 16GB VRAM buffer make sense even if the 7600(XT) doesn't have the power or hardware bus to address the entire buffer at once?
That is a very good point.
It does not need to use all of the 16GB of VRAM, just more than 8GB.
32:24 This is an aside from your correct/valid point, but "high end" depends on how you classify it. If it means "some of the largest die/bus you can manufacture", then Vega did get a high-end card: the Vega 64 (495mm²) was in a higher manufacturing class than the GTX 1080 Ti (471mm², where Pascal went up to a full 600mm² die). The Radeon VII (331mm²), on the other hand, was in a smaller manufacturing class than the 3060 Ti/3070/3070 Ti (392mm²), and the 5700 XT was even smaller at 251mm², just barely out of the traditional 180-250mm² xx50 class like the 4060 Ti (188mm²).
I hope Xbox can use Intel. It will help the Arc division to survive.
In fact, it would be a perfect match: Intel's hybrid CPU design needs optimization, and so does Arc.
Actually, Arc's XMX core and RT core performance are impressive.
The thing I'm wondering most: right now ray tracing is a novelty feature, and we know empirically that it takes a 4080- or 4090-class product to run a game with FULL RAY TRACING at 1080p acceptably. And even then it's a crapshoot; some games are still sub-60fps on a 4080 @ 1080p ray traced. But ray tracing IS the next step in 3D rendering. Whether people like it or not, it's happening.
Especially if the "PS6" and "Xbox Herpderp" have robust enough raytracing performance, it could push devs to start designing their games around raytracing pipelines instead of raster sooner than we expect.
How long do you think it'll be before we're forced to stop thinking in terms of "this card is better in raster, but this other one is better in raytracing" and actually HAVE to pay attention to the raytracing performance? That'll be a very interesting jump.
And with regard to the RDNA4 launch - while I expect the highest end RDNA4 to be slower than the highest end RDNA3 in raster, I wonder if it'll be faster in raytracing? And whether that will be enough of a jump to make the purchase more significant specifically for future raytracing-required titles? Or if that jump will still not matter until RDNA5 and GeForce 60...?
Whenever we can conceivably see pure path tracing as viable (stable _native_ 1080p60) at $200-$300. That's it. Until that happens, it's not worth obsessing over.
This is just the opinions of a random guy on the Internet.
I think we're already at the point of "this card is better in raster, but this other one is better in ray tracing" being part of the dialogue. I remember when the 7900XTX and the 4080 were being reviewed, with the deciding factor between the two being the 4080's RT performance.
You do have a point that eventually raster might stagnate or even decrease while RT goes through the roof, given what MLID has said about RDNA4 being when AMD focuses on RT performance. Hypothetically, we might get a card that matches a 7800XT in raster but gets RT performance between the 4080 and 4090.
But you're right, this is a good point.
I think you're right on the money that future GPUs will win more on the merits of their RT performance over raster. Personally, I think the next generation of consoles will be a hard sell unless they can do full RT, like something to the effect of Cyberpunk 2077's Path Tracing mode. I just don't think there's any more raster techniques left to discover that would provide a generational leap in graphics that could wow the average consumer.
What I mean is I really think that if the next consoles push RT performance and RT graphics we might start seeing games come to PC sooner than expected that either REQUIRE RT, or look significantly worse without it.
It's coming, the question is how soon?
@@sonicboy678 Consoles are where the majority of AAA titles make their money, so once consoles can do it, it'll happen
Not only would it be awesome for gaming but it would be wild for investors in either chip maker. Grabbed my popcorn before starting this one🎉
Saw RDNA 4, Zen 5, Playstation 6, and thought this was a recent video. Goes to show how far ahead MLID is...
Once they reach similar ray tracing performance, I think both Intel and AMD will take more market share, especially if they keep prices lower.
Nope. Won't happen. Nvidia's mindshare is too strong.
Intel sounds desperate and confused. They are still losing revenue badly, so their genius plan is to enter... a razor-thin-margin market like consoles? Especially considering their CPU deficiencies, and the fact that Intel needs a 256-bit GPU to compete with a 128-bit Radeon.
Do you think Strix laptops will be out for back to school or Christmas?
A PlayStation handheld launching near 2028 would be too late just to run PS4 games. I'm imagining this releases in 2026 at $299.
This offer leak is exactly the potential market-spoiling kind of move that concerned me with Intel, Arc, and the AXG saga.
Stronger competition against Nvidia's excesses is required, not just weaker, splintered, uneconomic efforts that rely on deficit financing.
Sony relies on AMD for backwards compatibility. Microsoft's backwards compatibility is software-based, making them less reliant on AMD. I can absolutely see them jumping ship.
Jensen, Su, and Gelsinger probably discuss everything behind closed doors. They could all increase performance dramatically, but choose not to because of long-term income potential... the withholding of more RAM is the clear indicator of this.
Intel getting into consoles could bring balance to the gaming industry. If Microsoft goes Intel, then the consoles will have all three GPU teams on board: Nvidia on Nintendo consoles, AMD on Sony's PlayStation, and Intel on Microsoft's Xbox. Game devs would be forced to optimize for Intel's architecture, which would help Intel's desktop graphics cards get optimized well enough to properly compete with AMD and Nvidia. Plus, Intel XeSS is superior to FSR on Intel's own GPUs, so they would also have better upscaling than an AMD-powered PlayStation, and they are ahead of AMD in RT-to-raster performance ratio, so Xbox would get better RT performance as well. That's much needed, considering the main reason so few games properly implement ray tracing is RDNA's shitty RT performance in the consoles. All of this could finally make the console market actually interesting again: Sony with inferior performance but great exclusives, while Xbox has Game Pass and the best console performance.
Yeah, Intel has better RT and upscaling; that will definitely be more beneficial for a new console.
@@dante19890 And the new Xbox won't support any legacy games? Yeah. Sure. And Nvidia will release CUDA as open source.
@@tringuyen7519 MS kind of has their own system for backwards compatibility through some form of emulation.
So I don't think switching to Intel will pose a problem.
I will never again buy a handheld from Sony. I had the original PSP, I had the Vita, and Sony clearly showed they can fuck up support for their handhelds no matter what.
Well, at least this time it's likely literally a portable PS5 or PS6... so it would need to be supported to some degree for as long as the main console is.
5:00 - This would be great if Microsoft would create a console with Intel or Intel/Nvidia. I don't think it's a good scenario when everything is made the same way.
25:00 - 8800 XT as fast as the 7900 XTX? I don't think so. IMHO it will be as fast as the 7900 XT. Even AMD said something to that effect, that the 7900 XTX will still be on top.
So... the 8800 XT will be as fast as the 7900 XT, so as fast as the 4070 Ti Super, so as fast as the ~$600 5070, or maybe even slower... so yeah, I would say don't buy a $1k XTX now.
43:00 - Zen 2 -> Zen 3 was +19% IPC, Zen 3 -> Zen 4 was +14% IPC, so Zen 4 -> Zen 5 at +15-20% IPC would be nothing spectacular, especially if the core count stays the same. Time will tell...
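Quick sanity check on how those per-generation numbers compound (a rough sketch; the percentages are the ones quoted above, the Zen 4 -> Zen 5 value is just the midpoint of the 15-20% guess, and real uplift varies by workload):

```python
# Compound the quoted per-generation IPC uplifts to see the
# cumulative gain from Zen 2 through a hypothetical Zen 5.
gains = {
    "Zen2->Zen3": 0.19,   # +19% IPC (quoted above)
    "Zen3->Zen4": 0.14,   # +14% IPC (quoted above)
    "Zen4->Zen5": 0.175,  # midpoint of the +15-20% rumor
}

cumulative = 1.0
for step, g in gains.items():
    cumulative *= 1.0 + g
    print(f"{step}: +{g:.1%} -> cumulative x{cumulative:.3f}")

# Zen 2 -> Zen 5 works out to roughly a 1.59x IPC uplift overall.
```

So even "nothing spectacular" per generation compounds to nearly 60% more IPC over three generations.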
1) Intel & Nvidia working together to sell an APU for an Xbox @ cost? Are you crazy?
2) No one knows what an 8800 XT can achieve yet.
3) Zen 5 vs Intel’s Raptor Lake Refresh+++? Or Zen 5 vs Arrow Lake? Zen 5 will win against both easily.
@@tringuyen7519
1. Sony will make another $500 meh console, and Xbox will create a $999 one that does 4K 60 fps max settings with ray tracing. People would buy it right away.
2. Sure, I just said what I heard.
3. Yes, sure, Zen 5 will be faster than Intel, but it doesn't change the fact that this won't be the mind-blowing CPU so many people are already hyped about.
5:27 So, a flailing chip maker is trying to land a contract to make a failing console? Yup, that checks out 😂😂😂
AMD's entry into the console market was before Zen (in a time when their CPUs really were not that great)...
A "flailing" chip maker that made $55B in revenue during a failure of a year.
Competition is good for all of us.
AMD having to compete lowers prices for these devices.
Bro, what 💀 OK, I like AMD too, but sit down.
Wow, so many jumping to the defense of "poor underdog" Intel...
Yeah, competition is good, but AMD is the competition, not Intel. We'd need another decade of AMD dominating for those positions to change meaningfully.
Intel IS flailing! Watch almost any video from the past 2 years on this channel... or better yet, take 5 minutes to look at their financial report for 2023! Intel spends the equivalent of AMD's entire revenue on R&D! Only to be completely run over by AMD's entire product stack!
Intel can and must take these hits, and they are far from going under; once they start to liquidate assets, you can start to worry.
For the record, I don't hate Intel and I'm not an AMD fanboy. However, I've been seriously burned by Intel before, and then there is the small matter of almost the entire 2010s, when they had the entire market cornered, priced accordingly, and their bloated R&D budget delivered what could charitably be described as modest gains.
IMHO Ryzen is the biggest and best disruptor to the home PC market since discrete GPUs! And now similar things are happening in the server market.
Sooooo yeah, competition is good, and this is it! Now it's time for Intel to get competitive! Right?
@@ValdeSanus For another Xbox? Yeah, no. If anything, Intel would make a new Xbox more expensive.
It would be interesting to see Nvidia get involved
Nintendo uses NVIDIA, but of course their consoles aren't really trying to compete head on in performance
@@lucasfranke5161True. Tom has often brought up that Nvidia desperately wants Nintendo to use modern hardware in their consoles for development reasons, but Nintendo's too cheap to go for anything recent. 😂
I seriously don't know how the Series S was ever allowed to have 10GB of RAM, for parity reasons.
With the launch of DG1, I said what they needed to do was offer a compelling APU for a mini console, micro PC/laptop, or preferably a handheld console (or a major desktop console brand like Xbox/PS, but fat chance of that). Sure, at the time most of these handheld things had something like a 4700U and cost $1500 with GT 1030 to 1050 levels of graphics, which is why adoption was so slow: the CPU was way too powerful and the GPU was way underpowered.
But if Intel had landed someone big at the time, like the GPD Win Max, with a more balanced APU with 2-4 cores but a massive GPU portion, say 128 Xe cores, offered at a compelling price for all of these mini PCs and handhelds, along with laptops, I think they would have gotten far more market penetration, and then far more devs would consider optimizing for them.
I was actually surprised by the specs of the A770; I didn't expect a 70-class die or memory bus (or I guess 80-class, now that Nvidia rebranded a 400 mm², 256-bit die as the 4080 instead of following the 970, 2070, 3070, etc.).
What I wasn't surprised about was that a 70-class card as a first real public entry didn't do well.
Part of what surprised me about a 70-class card is that if you're going to go big, you have to go BIG: I'm talking 80 Ti/90-class, 550-600 mm², 384-bit bus. If you're not going to target the BigDikEnergy large-card market, you should spend that allocation on the segment that moves volume, the 100-300 mm², 96-192-bit market; if you're making the card that a lot of people buy because it's cheap with reasonable performance, devs are going to target it.
Instead, the A380, while acceptably priced, only has a 96-bit bus. The die that should have been called the A380, or A580, was instead only ever used in the Arc Pro A60: ~270 mm², 2048 shaders, 192-bit bus.
That should have been not only the top card but the only die of the generation, priced around $169-$199 (that card costs less to manufacture than the actual A580 at $179). Give the A580 the full 2048 shaders and 12GB of RAM; give the A380 1536 shaders and 6GB of RAM but keep the 192-bit bus; give the A350 1024 shaders and 8GB of RAM on a cut-down 128-bit bus; give the A330 768 shaders on a 192-bit bus. Oh, and make sure the A350 and below don't need a 6/8-pin supplemental power connector.
Intel has nothing to compete with AMD for consoles; never has and never will, lol.
If they do go Intel, it will be interesting to see how Microsoft will efficiently use the E-cores in the Intel APU.
Intel would be ideal for the potential thin-client Xbox. I think Intel still has the best media engine, which is ideal when all you're doing is streaming video from some server somewhere.
I seriously doubt Xbox will use non AMD hardware for their next console.
For starters, it'd essentially kill their whole Back Compat library and push.
We'd be back to waiting for updates for batches of games.
They'll most likely opt to go for less custom arch and more standard RDNA 5 or 6 + Zen 6
They ended the Back Compat program years ago, and haven't added any more titles.
Honestly, why is AMD letting Microsoft toy with them this much? AMD is currently in a position where they can talk back to MS and make some counter demands. Why don't they?
Actually, I would like to see an Intel Xbox, just so we end up with more competition in the GPU space in the long run.
We have those, they are called PCs. 😜
Regardless of what chip they use. If the next Xbox is all digital it will fail. Massively.
They should release a 16-core X3D AM4 processor for the AM4 10Y anniversary.
If you really think about it, all 8000-series CPU cores are Zen 4c cores, since they get 2MB of L3 cache per core compared to 4MB per core on the 7000 series.
L3 seems to be crucial for gaming loads. So why don't they utilize their 3D cache technology to save horizontal die space for more CUs on their APUs?
Why not use 3D cache for the CUs as well, since DDR5 is also a huge bottleneck (at least compared to GDDR6)?
I know it's not something like copy and paste, but why isn't it in development?
If I boot up my Xbox right now, I don't see any AMD logo, so no, we wouldn't see Intel logos on the Xbox. And I really doubt Microsoft is reluctant to launch another console. They are committed to gaming, and they need a box for those that don't want to buy a PC or handheld or whatever.
Microsoft is just the first of many to jump on board with Intel. Idk why people are shitting on Intel, like AMD wasn't being shit on for a while before it changed things up. Xbox's next console will be a handheld streaming device for Game Pass.
I think people need to go to Wikipedia and check the GeForce 30 series: all GPUs below the top die usually have two variants with different dies, just to get rid of the bad yields. The 90% yield that TSMC and everyone else quotes for its nodes is for fabbing tiny 37 mm² test chips, not for 400-600+ mm² monster dies, which have a much, much lower yield; only by down-binning can a real customer achieve anything near 90% in reality. Examples: GA102 is used as low as the 3070 Ti with 40% of the cores disabled; GA104 is used in the 3060 with 40% of the cores disabled.
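The area effect is easy to ballpark with the classic Poisson die-yield model (a sketch; the 0.2 defects/cm² density is an assumed illustrative number, not a quoted TSMC figure, and real fabs use more elaborate models):

```python
import math

def poisson_yield(die_area_mm2: float, defects_per_cm2: float) -> float:
    """Poisson die-yield model: Y = exp(-D0 * A), with A in cm^2."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100.0)

D0 = 0.2  # assumed defect density in defects/cm^2 (illustrative only)

for area in (37, 200, 400, 600):
    print(f"{area:>4} mm^2 die: {poisson_yield(area, D0):.0%} defect-free")
```

At that assumed defect density, a 37 mm² test chip comes out ~93% defect-free while a 600 mm² die is only ~30%, which is exactly why the big dies ship so many heavily cut-down bins.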
The 7900 XTX should have been $800 from the beginning. For RDNA 4 to be competitive, there has to be a card with 7900 XTX-level performance for around $600: a ~300 W monolithic GPU with 20GB of 21 Gbps memory on a 320-bit bus. Basically 2x performance per dollar compared to RDNA 2 SKUs at MSRP, but with much faster ray tracing, at least beating the 4070 Ti Super.
@Tom: Gerald Undone had the same crackling issue. The only solution was to restart the stream in OBS (not in YouTube itself). Seems many people have this exact issue right now.
Have you looked into the mod that enables DLSS 3 frame generation for all RTX cards? It converts the pipeline to FSR 3, so it's a little fuzzy, but it's a great FU to Nvidia for their gatekeeping BS to try and force people onto the 40 series. Imagine how pissed you'd be if you bought a 3090, or especially a 3080 Ti or 3090 Ti, just to have Nvidia give you the middle finger months later?
Intel is better at ray tracing than AMD, maybe even better than Nvidia.
It's good for competition.
I wish the PS6 would use Nvidia for DLSS and path tracing. AMD is so behind on GPU tech. Only the size of the transistors saves them.
High TDP ruins the laptop experience so much that you basically disable boosting and undervolt. APUs should be non-boosting ~2.5 GHz all-efficiency cores... or one core at ~3.5 GHz and the rest at ~2.5 GHz.
I would honestly be very happy for Intel if they got this win; they need it. I say that as an AMD CPU and GPU user.
Pretty sure you just straight up missed that tipped question from AnonimuzzStarz here. Ouch.
If they go with Intel for Xbox, it will probably be with Arc Druid, not Celestial as you referred to, and Druid would be an even bigger focus. It also makes complete sense for Xbox to shift over to Intel, seeing how they are unable to beat PlayStation while working with AMD. And finally, given how the Intel handhelds are beating the AMD handhelds at the moment, they have shown that they can do it, even with the wildly inferior Alchemist.
What have you heard about Bartlett Lake? RGT mentioned it and I was wondering what you know.
I do see a probability of MS going with Intel for a new console. Nvidia, though, would be a very stupid move IMO.
Good leaks
Finally, no more dumbed down PCs disguised as "consoles" (with the exception of the Switch of course).
Haha, Intel integrated graphics would be a nightmare for game developers, I think.
Remember - I said INTEL wants XBOX...
Microsoft is now the most valuable company, ahead of Apple. If they choose someone other than AMD, they can certainly afford it. Even Hamilton will race for Ferrari in 2025. Nothing is impossible.
My guess is MS probably went to Nvidia first for the Xbox portable because of the Switch's success and Nvidia's expertise. Plus, DLSS is much better than FSR, etc. That's probably why there was talk of ARM CPUs in the documents.
I don't think it will be an Intel and Nvidia combo for the next Xbox, but an Intel CPU and GPU, with a GPU that comes really close to what AMD will offer. Something that proves to the world that Intel can do good graphics.
AMD's Strix Halo won't compete with M2 Ultra/M3 Ultra 🙏 Remember, Ultra is the huge dual-die configuration (>130bn transistors, 32 x 16-bit memory channels, 76-core/9728-ALU GPU, 16 P-cores with 4MB L2 per core, 8 E-cores, an AMX SIMD block, 96MB of SLC, ~7 display controllers, 2 NPUs, multiple video encode/decode engines, an SSD controller, several Thunderbolt controllers, etc.). It's as exotic as silicon gets 😍
Strix Halo will be competitive with M3 Max in some cases, but it won’t be an outright win by any means.
The gaming performance of Apple GPUs is hard to place given the smaller number of Mac native games, but the compute side of things is much easier to measure.
When it comes to compute, Apple has a huge advantage over AMD and Intel because Metal as a whole is more mature and well supported. For example, Apple GPUs are already well supported for ML workloads, and even old base M1 Macs can run them well (outside of Nvidia, Apple GPUs are the only ones that can be considered “just works” for e.g. local LLMs).
Another example is Blender - Apple’s new ray tracing hardware allows M3 Max (~50W) to match the rendering performance of a 7900 XT! (again, aside from Nvidia, Apple is the only good option for Blender).
Honestly, people don’t like to hear this, but I think Apple is doing a better job in graphics than AMD or Intel. They’ve invested in a great GPU architecture and (most importantly) a great software stack. Does it matter to gamers? Maybe not today, but who knows what the future will bring.
(Not hating on AMD here btw - just giving perspective 🙏)
Strix Point Halo will give AIB & laptop customers upgradable memory. And how many gamers play Cyberpunk on an M3 Ultra?
Will AMD increase core counts with Zen 6? Or at least make hybrid CPUs?
8500G is a hybrid APU. Perhaps you meant something else?
A hybrid CCD combination was also a thing with the 7950X3D.
The question is a 16-core Zen c CCD plus either a regular 8-core Zen CCD or an 8-core Zen X3D CCD, for 24 cores.
I advise Microsoft to stick with AMD for future projects, a couple of reasons being backwards compatibility and that AMD's tech is still in motion and getting better!!
What AMD needs to do is give more value in software so they can justify charging higher prices, and support more software. It's a virtuous cycle; it's what Nvidia did. I told you repeatedly YEARS ago that Nvidia was giving more value by supporting machine learning even on consumer cards, whereas even today AMD refuses to officially support ROCm drivers on consumer cards except for the higher-end 7900s. And AMD fanboys constantly make excuses about how AMD can't afford to hire developers or needs time, when they have had more than a decade and could have easily done it if they had started years earlier. AMD management simply refused to make money and got bailed out by Intel misapplying Jack Welch's GE management style to stack-rank their company into technical oblivion.
And years ago, we said the same thing about software...so idk who you are haha, but we've been saying everything you've said for a long time.
So is the next-gen Xbox going to need an 800 W power brick and water cooling for the 400 W CPU? Or is it going to be a bunch of Atom cores and a UHD 650?
Will Granite Ridge really be named Ryzen 9000? It makes no sense; they changed the laptop naming scheme so it doesn't overlap with the desktop one. Sure, Ryzen 8000G is Zen 4, but before Zen 3, that's how AMD named APUs.
In their new mobile naming scheme the first digit is now the year of release, not linked to the gen anymore. If they want to make that consistent on desktop too, then the 8000Gs are 8000 because they came out this year. Granite Ridge released this year would then also have to be 8000. Though they could still decide not to apply that same logic to the desktop line up and instead keep using odd numbers for Ryzen desktop, even numbers for APUs.
Intel wouldn't need to optimize legacy drivers in the next Xbox; all games would use the latest DirectX, so that's all they'd need to optimize.
Is it possible to use hyperthreading WITH rentable units? What is keeping AMD from using SMT-4?
I think with SMT-4, the juice may not be worth the squeeze
Killing hyperthreading would be bad. Moore's Law is dead; a 19-25% IPC increase is pretty hard, and 30% isn't possible.
If they mandate that, wouldn't it limit a LOT the development on PS5 platform, like what happened with the Xbox Series S?
1:02:07 Is it the memory bandwidth that they care about most in AI, or is it memory capacity? Because with 256-bit DDR5/LPDDR5, in theory you could get 1TB of ECC UDIMMs on Strix Halo.
And while Apple's solution could support ~512GB according to the sparse specs, Apple only offers 192GB, presumably to maintain that very high bandwidth, which I argue is the wrong thing to do with the Z1X. C'mon, Asus and Lenovo, I don't care if you have DDR5-25kCL3 if the GPU ends up using the SSD as RAM because you only have 16GB to share between the CPU and GPU, and all of the background tasks require 6-8GB.
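For the bandwidth half of that question, the napkin math is simple: peak bandwidth is bus width (in bytes) times data rate. A quick sketch, where the configurations below are assumed illustrative figures rather than confirmed specs for any product:

```python
def mem_bandwidth_gbs(bus_width_bits: int, data_rate_gtps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width / 8 bits per byte) * data rate."""
    return bus_width_bits / 8 * data_rate_gtps

# Assumed illustrative configurations (not confirmed product specs):
configs = {
    "256-bit LPDDR5X-8533 (Strix Halo guess)":   (256, 8.533),
    "1024-bit LPDDR5-6400 (M2 Ultra class)":     (1024, 6.4),
    "320-bit 21 Gbps GDDR6 (card from above)":   (320, 21.0),
}

for name, (width, rate) in configs.items():
    print(f"{name}: {mem_bandwidth_gbs(width, rate):.0f} GB/s")
```

That's roughly 273 GB/s vs ~819 GB/s for the first two rows, which shows why a wide soldered-memory package keeps bandwidth that socketed DIMMs can't match, even if DIMMs win massively on capacity.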
I've never understood why 6-8 cores is what AMD offers as the X3D option. Beyond that, surely you're buying your CPU primarily for productivity.
Because AMD doesn't use big.LITTLE on x86. AMD has nothing against my 12700K and 13600K; those are way too powerful, and Intel is losing money selling them for cheap.