Why Microsoft switched from Intel to Power PC for the Xbox 360 | MVG
- Published 21 Nov 2024
- Microsoft and Intel's partnership stretches back to the early '80s with MS-DOS and Windows, and Microsoft used Intel to power the original Xbox in 2001. Yet in 2005, with the next-generation Xbox 360, they famously split from the chipmaker in favor of IBM and its PowerPC architecture - made famous by Apple and its PowerMac line of computers. In this episode we take a look at why Microsoft dumped Intel for the Xbox 360 game system.
► Consider supporting me - / modernvintagegamer
Sources :
► www.neoseeker....
► venturebeat.co...
► www.extremetec...
► www.edn.com/at...
Music in this Episode :
► LLS Main Menu Theme Gensokyo - Lotus Land Story
► Axelay - Stage 01 Music - Taro Kudo
Social Media Links :
► Facebook : / modernvintagegamer
► Twitter : / modernvintageg
► IG: / modernvintagegamer
► BandCamp : modernvintageg...
► The Real MVP Podcast : player.fm/seri...
#Xbox360 #PowerPC #Xenon
The *real* winner of the seventh generation of consoles?
IBM.
And the next was AMD. You know what was the same? Lisa Su. Coincidence? Maybe, I don't know. Just a fun fact.
@@johanlundberg8449 And is it a coincidence that Nvidia has a ray-tracing card out and AMD does not? I would think not, in the same way IBM could not or would not provide Apple with better CPUs until after Apple left. What I'm saying is that both these shifts came at a cost in the long term: IBM lost a huge long-term business partner, and by the looks of it AMD's next-gen GPU is now delayed until next year, when the Nvidia 3000 series is about to be released (maybe they will both hold back now).
I fully appreciate that working with Microsoft and Sony will pay off long term in desktop cards, but that won't be for a few years yet.
@@Nossieuk I am not so sure about RDNA2 being delayed, AMD and Lisa Su herself have stated multiple times and right up till recently that both Zen3 and RDNA2 are still 100% confirmed for this year.
There have been many rumors going around about delays, but AMD and Lisa Su have shot them down repeatedly. AMD do keep stating that RDNA2 is releasing before the next gen consoles and so is Zen3. So I do think they will release this year still, unless AMD and Lisa Su make an official announcement saying otherwise, I wouldn't believe any other information.
There have just been too many rumors about delays that keep getting debunked by AMD and Lisa Su, so I am just not believing anymore new rumors about delays. I will believe official word on the matter from AMD though.
Yeah, but Apple.. Apple used IBM PowerPC RISC CPUs in their Macs, then they woke up to reality and switched to Intel x86 CISC for ultimate capabilities.. But behold, they've recently become stupid again and have now dumped CISC to go back to RISC CPUs using custom ARM chips.. The good thing, though, is that Apple users really can't complain now when PC users refer to them as consolers.. Hah
At the time it was really amusing how Microsoft bought PowerMac G5s in bulk to send out as the original Xbox 360 pre-release devkit
It kinda made sense, so at the very least developers would get familiar with the architecture
@@andreiarg *made
alvallac21 💀💀💀
well... it just hackingotich cheapst production....
That has always been Microsoft's strategy. They are one of the top vendors of Mac software (Office).
Man. The only E3 I ever went to was 2005, one month after spending my part-time job in college savings on a Dual 2.0GHz PowerMac G5 (it's still running in my house as a media server and RAID...right now). The only next-gen system playable at that E3 was the 360, yet to my surprise, it was nothing more than a bunch of my exact towers. Just felt so cool to be like "I bought this new workstation for music production, but hey, I also apparently bought an Xbox 360 Dev Kit!"
Microsoft will always be corruption!
Evil company, big and evil Epstein Bill gates people!
Hell yeah I used a dual cpu g5 for a long time. I still have it and play Mac OS 9 games on it from time to time.
So, Microsoft took a RISC with PowerPC....
*badummtssss*
Har har har I see what you did there
That is a terrible joke that would go over a lot of people's heads, lol...
Get out
*HAHAHAHAHAHAHA* HAHoooo ha...
10:19 - you probably meant to say "AMD", not "ATI" :)
Also, according to an interview with Nicholas Baker, the lead architect behind the Xbox 360, one of the biggest reasons Microsoft chose PowerPC over x86 from Intel or AMD was that his team figured out that CPU clock speeds weren't going to get much higher (especially when you take into account that power consumption and heat dissipation play a very large role in the design of a game console) and that the solution for increased performance in the long run is parallel execution - in other words, a multicore CPU. The problem was that neither Intel nor AMD had anything like that on their roadmaps (at least at the time) while IBM did, so Microsoft chose to partner with IBM.
EDIT:
For those interested in more details, here's the full interview with Nick Baker:
ua-cam.com/video/JP9TDLxq_1U/v-deo.html
Oh, and thanks for all the thumbs up :)
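To make the parallel-execution point above concrete, here's a minimal, generic C++ sketch of splitting one workload across hardware threads instead of chasing a higher clock. It's purely illustrative - nothing from the Xbox 360 SDK - and the array size, values, and use of std::thread are arbitrary assumptions.

    #include <algorithm>
    #include <iostream>
    #include <numeric>
    #include <thread>
    #include <vector>

    int main() {
        // Illustrative workload: sum a large array of arbitrary values.
        std::vector<float> data(1 << 22, 1.0f);

        // Use however many hardware threads the machine reports (Xenon exposed 6).
        unsigned n = std::max(1u, std::thread::hardware_concurrency());
        std::vector<double> partial(n, 0.0);
        std::vector<std::thread> workers;

        // Each worker sums its own contiguous slice: no shared writes, no locks.
        std::size_t chunk = data.size() / n;
        for (unsigned t = 0; t < n; ++t) {
            std::size_t begin = t * chunk;
            std::size_t end = (t + 1 == n) ? data.size() : begin + chunk;
            workers.emplace_back([&, t, begin, end] {
                partial[t] = std::accumulate(data.begin() + begin,
                                             data.begin() + end, 0.0);
            });
        }
        for (auto& w : workers) w.join();

        double total = std::accumulate(partial.begin(), partial.end(), 0.0);
        std::cout << "sum = " << total << " using " << n << " threads\n";
    }

The point isn't the sum itself, but that past a certain clock speed the only way to keep scaling is to split work like this across cores - which is exactly the bet the Xenon design made.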
Amazingly accurate considering that we've barely increased clock speed since the mid 2000s. Multicore's become the focus of new chips, even though Intel got to 5ghz on their processors recently, with the 9900K coming out over a year ago.
AMD's managed to take IBM's place in consoles, being involved in every system for most of the 2010s until the Switch came out.
Thanks for the info! Very interesting. 5GHz is pretty much a hard wall for PC users to push past in most cases, or at the very least requires a lot of money spent on the CPU or on cooling it... hell, both really. So expecting consoles to ever reach that level comfortably is still unrealistic, let alone back in the mid 2000's. It's actually pretty cool that they had the foresight to recognise this - isn't there some 'law' quoted about this very subject?
@@TheMarc1k1 Moore's law.
AMD did hit 5GHz with the FX-9590, but it ran really hot (220W TDP!). Plus the failure of Bulldozer on top of that (basically the Pentium 4 repeating itself).
@@YoureUsingWordsIncorrectly That's why some FX-9590s came bundled with water cooling. The TDP was close to a GTX 480's.
@@bitelaserkhalif When I saw that processor, I felt like "...they... just literally grabbed one of their processors and sold it pre-overclocked right?"
RE: The question at the end on how did I feel about the move to PPC? I remember Apple abandoning PPC because of heat, and then when the whole "red ring" thing started happening on the 360, I was like...yep. There's the magic.
Dener Silva And to add a bit of irony (or going full circle), Apple's RISC-based ARM processors have more in common with PowerPC than with the CISC-based x86-64 processors from Intel and AMD.
Great info, thanks man. I feel like such posts are underappreciated.
the Wii U had a PPC chip that wasn't susceptible to overheating; it was more powerful than 7th-gen chips as well, despite having a low clock speed.
Yoloer Boy No, believe it or not the WiiU paled in comparison to the much faster Xbox 360 chip and of course the CELL. Look it up, even Metro 2033 devs said the wiiu sucked balls due to its weak cpu, but the 360 and whatnot was plenty even still.
It only had more ram and a somewhat beefier GPU, but better results still came out of the previous gen machines due to better cpu, not worse.
Dener Silva No, it's a GPU issue
I remember when I was an Apple true believer arguing against Intel fans, advocating for the superior performance of PowerPC RISC chips even though they had lower clock speeds than Intel Pentiums. "Wintel" (Windows and Intel combined) were the enemy. Then not only did Apple leave PowerPC for archenemy Intel, Microsoft left Intel for PowerPC. Boy was I thrown for a loop.
Well, Microsoft only left Intel on the console side. They obviously still work with Intel. lol
But yeah, I also had a friend who said the same thing about Apple. And at the time the same was true of AMD processors, which were faster than Intel's despite running at lower clock speeds.
Intel was heading downstream fast, but it was their mobile division that developed the Core Duo that saved them. I think that was also around the time that Apple went with Intel. The Core Duo is what put them back on the map.
Apple for the longest time used RISC in their computers. From 2006 to 2020, they used x86 Intel chips before going back to RISC with the Apple M chips.
The dual cpu g5 was quite amazing for its day. I have one in non working condition, a single chip g5 I used to dual boot tiger and Mac os9, and a 1st gen Mac Pro that both still work and get used from time to time.
Started with an Athlon XP 1800+ on an nForce mobo with a GeForce 2; never bothered with Wintel, because that was officially the best bang for the buck on early Tom's Hardware.
Lesson learned, simping for companies is a fruitless endeavor
Thought the first song was familiar. Turns out it's from Touhou 4! Good choice
same lol
2hu gang
lotus land story....
PC98 Touhou is godly.
Best Touhou
nintendo: we want a cheap, cool running, fast chip.
chip manufactures: OH is that what people want. we thought they wanted expensive, slow space heaters.
Intel when they designed the Pentium 4.
Bulldozer says hi
At least bulldozer ended up being cheap
AMD FX: thermal generator
@@KaitouKaiju the funny thing is that the Xbox One and PS4 use the same architecture
It's funny that they basically ended up selling the same heat monster to all three competitors.
Nintendo's CPU's were based on the very efficient PPC 750 en.wikipedia.org/wiki/PowerPC_7xx
The GCN from the wall pulls only about 40 watts of power. And that is everything combined. Graphics, cpu, ram, drive, controllers, blinkenlights, all combined drew half the power of Xenon by itself... And that was even more so with the Wii
IBM did a Microsoft against Sony, by licensing the chip to Microsoft.
@Officer94 a 10W chip can get very hot without sufficient cooling and in a small box ;) fortunately my Wiis still run hot but fine...
@@Locutus people have to know the history of DOS to get that, I think. IBM rushed to get PCs out, initially as a "loss-leader" to get businesses into "real" hardware. They never imagined Microsoft would license to companies like Compaq. I guess IBM was looking at DEC's terminals for a business model. Bad move. Apple deserves credit for popularizing the idea that individuals should own/be in charge of their own computers. Commodore64s were too underpowered, and Amiga too late. Being older, I know history of "home" computers (now nearly all called PCs). Funny how many parallels there were in 21st century consoles. I imagine the same is true for smartphones.
Apple in 2005: “We’re moving from PowerPC to Intel”
Microsoft in 2005: “We’re moving from Intel to PowerPC”
Apple in 2020: "We're moving from x86 to ARM"
@@Volodimar what's microsoft gonna say
Microsoft staying as far away from Apple as possible.
Apple in 1994 "We're moving from Motorola 68k to PowerPC"
@@matthewrease2376 "Now announcing Windows RT 2"
One oddity of the Xbox 360 going PowerPC - when Microsoft made the original SDK for the 360, they went to the only major provider of 970-based systems ... Apple, namely the PowerMac G5 (G5 being the Apple name for the 970, and the PowerMac being the forerunner to the Mac Pro)
@AnEn Apple's Quad 2.7GHz G5 was MUCH quicker than the Xbox 360, They can compete with a 2009 Core 2 Quad, you can't compare their single 1.6 model to an Xbox. Even the Dual 2.5GHz had pretty even performance
@@WalnutSpice I think you mean quad 2.5 and dual 2.7. I have a quad 2.5 in my closet :P
I remember the early development kits were literally a PowerMac G5 with a Radeon X850 XT (PE?).
AnEn one thing also well worth taking into account is that AMD seems to be very willing to work with their customers to design custom products, Xbox One, PS4, and even current Mac computers all have custom non standard AMD silicon in them (most recently the MacBook Pro 16” has a very custom GPU option, a 5600 equipped with HBM2)
As an ex-IBM employee, this puts a smile on my face.
@referral madness Hardly. I worked on SPSS Statistics.
One of the devs at Microsoft had confessed that for the OG xbox, they turned down AMD at the last minute for their cpu and went intel. I wanted to also add that the Gamecube, xbox 360, XSX and PS5 are well built and balanced in general. PS4 and Xbone were unbalanced by the Jaguar arch limiting most games. PS3 was hard to program but very powerful. N64 was an engineering disaster that worked out.
now they all need TSMC chips; back then Intel could only offer them GHz clock speeds
AMD is gone, was never a big party
The N64 was a tradeoff among the best solutions available at the time.
@A World Under A Spell We'll have to agree to disagree on everything you just said. I'm not even sure how you are coming up with your games list.
But I did some PC gaming as well as console gaming. None of the consoles could match running a PC with Voodoo 2 in Glide. But the N64 came somewhat close. The textures were laid out right, the animations were good for its time, the lighting was appropriate. The edge antialiasing it used wasn't as good as the FSAA that came later (post 2000), but I doubt the difference between the two would even have been noticeable with the N64 because of its intrinsic blurriness. It was like a low textured version of a PC.
The PS looked absolutely horrendous in comparison. The textures strangely contorted because it couldn't correct for perspective. And pixels shimmered, and polygons popped in and out because it lacked a Z buffer. It rendered artifacts galore. And its lighting ability was lackluster. Even at the time I knew it couldn't really do 3D.
I'll agree with you that the audio quality on the N64 wasn't good. But the N64 sounded better than the PS looked. The PS had the better controller, I'll give it that.
All the consoles of that generation were a tradeoff to meet the transistor budget. The N64 had a small texture cache size because that's all they could fit after implementing the rest of the hardware. The PS lacked a Z-buffer and a proper texture wrapper because of the same reasons. And the Jaguar didn't even have a texture wrapper.
But they were all tradeoffs of the best engineering solutions available at the time.
Apparently AMD wasn't even aware that the Xbox wasn't running their CPUs. Must have felt like a serious betrayal at the time.
@@lucasrem1870 AMD is gone? What? PlayStation and Xbox have been using AMD hardware exclusively for their CPUs/GPUs since 2013. TSMC is just a foundry.
If you played a 360 as it started to red ring, it had extreme graphical freak outs. To this day, I still have PTSD and assume my console is dying whenever I see a graphical bug in a game, especially on Switch with how hot that thing gets.
Edit: To those asking, yes my Switch is an original 2017 version.
Yea, my white 20GB Xbox 360 went RROD 5 times; I was able to fix it 4 times and just gave up on the 5th... picked up a brand new 250GB Slim, and to this day it boots up fine. I still play games that aren't on backwards compatibility, and there is still a ton of them...
I remember there were a few games that would actually mimic a RROD and some that just had weird bugs or quirks that looked like it. I actually had one of the first RRODs and had no idea what was going on. It was only PGR3, and I figured it was just some weird game crashes, but then I started to see stuff on the internet about the RROD. So I sent my console back for the first of 3 times. Eventually I just bought a Slim.
Must be a Switch 1.0 issue. I have the Switch 1.1 (same specs, but gets like 2 hours of extra battery life from a better-manufactured APU), and it doesn't get hot at all or glitch out.
@@awilliams1701 my Switch 1.1 has glitched out on me while playing SSBU a couple of times, I decided to move all the save data from that game to the console an so far it has been behaving properly... Of course there is no way of truly knowing if it was a MicroSD card fault, or an update, but I've been happy with it ever since (also I managed to fix my joycon drift with electrical spray cleaner)
My one friend has an Xbox 360 that has been having constant graphical glitches for a long time but his console didn't red ring yet
Seems like a good choice when they were planning the consoles in the early-mid 2000's.
Back then the chips were very underpowered... if you wanted power like the PS3 and Xbox 360, you had to deal with chips that didn't have much GPU grunt and heated up like hell... it took 10 years before they came up with a new gen, while the PS4 and Xbox One took only 3 years to get a hardware refresh.
One small mistake: The CPU wasn't a Celeron when it released, but a slightly stripped P3. It wasn't until Coppermine Celerons that they had the same cache design as the Xbox chip.
Not just that - they didn't leave IBM for ATI, they left for AMD, which had bought ATI before then; ATI was no longer ATI by that point. There's more than one mistake in there.
@@raven4k998 Indeed, IBM was at the top of their game CPU-wise back then. It wasn't until Apple went Intel that that started to wane. A PowerPC-based CPU was really a great processor to have in a console. In fact, the Xbox One CPU is not that much faster, but it has branch prediction and out-of-order execution, which makes it more powerful. Theoretically, though, the Xbox 360's PPC should be able to keep up. (Graphics is a whole different matter of course :P )
You never mentioned that, before a working system was developed, developers were sent a special PowerPC-based Mac to build their games on until a proper devkit was ready.
It actually wasn't anything "special", it was just a Dual 2.0GHz PowerMac G5 since that was the chip they used stock at the time. I left a comment somewhere on this thread, but I had purchased that exact computer before going to E3 the year the 360 was playable (and the PS3/"Revolution" Unveiled). Every single demo station had an empty shell of the launch 360 right next to the exact computer I bought. I had to ask a few of the demo drivers before someone with the right knowledge was able to answer (I think it was someone from Lionhead who was showing off a Fable tech demo). They told me they had a few of these at the studio and other than specific specs on each unit (which were the stock "build to order" G5s from Apple's site), it was indeed just a mostly off-the-shelf PowerMac running the SDK. Pretty cool stuff. It was so fun getting back home, looking at my tower and thinking "huh...I bought this thing for music/video production, but actually have an Xbox 360 DevKit, too" :)
2005 was such a weird time in the tech world. Microsoft went from Intel to Power PC right when Apple was switching from Power PC to Intel.
Those IBM PowerPC processors plus ATI graphics were a match made in heaven for Nintendo with the GameCube, Wii, and Wii U. They weren't the most powerful chips, but they ran stable, cool, and fast with almost unheard-of hardware failure rates, certainly much lower than the others at the time.
I'm pretty sure the CPU in the classic Xbox was a Pentium III "Tualatin" and not a Celeron. It gets murky because at that time the traditional difference between the Pentium and the Celeron was cache size, with Celerons being made from Pentium III chips that had quality issues. This version had the same cache size.
Also, Intel was being embarrassed big time because that generation of overclockers discovered that Celerons could outperform PIIIs. Basically, cache speed was more important than cache size. PIIIs had twice the cache at half the speed of the Celeron but performed worse. I believe this is why the Tualatin versions of these chips were so similar.
Anyway, like many generations of Intel chips at the time, the LAST generation of chips outperformed the first generation of the NEW chips. I believe this had something to do with chip yields always starting off poorly. That Tualatin, and even the mobile version, was able to reach P4 performance levels with the right overclock. It was some dark times for Intel. Unfortunately, Microsoft ran its CPU at 733MHz and not at the chip's 1.4GHz max speed. My guess is, once again, yields. Many more of the chips would run stable and cool at 733 than at 1.4, and it was cheaper.
All this to say that by the time they were developing the 360, all Intel had was the P4. Everything else was basically the same PIII chip Microsoft had already used.
The P4 was such an engineering failure that the 'Core' lines were based on a modified PIII core and the P4 'NetBurst' architecture was brushed under the rug.
This is mostly from memory so I may have missed a few things. I had an AMD Athlon XP at the time and really enjoyed my gaming experience.
Edit: Thanks to everyone in the comments for their feedback. It turns out the Tualatin CPU was the fastest you could swap into the original Xbox, not the stock CPU. It's still a PIII and not a Celeron. It's been almost 20 years now and I was bound to be a little wrong. You all are great.
Wicked long and well written! In closing, Intel sucked hard and PPC was the better choice. Didn't know about the Celeron performance delta though, interesting stuff.
it was a Frankenstein CPU, a Tualatin core with half the cache running on a 200 MHz bus...
@@FeeLtheHertZ I had a Celeron A that mopped the floor with a friend's Pentium II
@@omegarugal9283 Yes, the Celeron A! That's the name of the first Celeron chips with cache memory. I remember people in school having dual Celeron As at 1.1GHz, while I had an AMD K6-350.
"Microsoft's Xbox game console uses a variant of the Pentium III/Mobile Celeron family in a Micro-PGA2 form factor. The sSpec designator of the chips is SL5Sx, which makes it more similar to the Mobile Celeron Coppermine-128 processor"
It's a mixture of all sorts.
I love how everyone is abandoning Intel now. These prices are ridiculous
That's only if you want the latest and greatest. Modern processors last years for gaming these days. (most likely consoles stagnate the gaming processor needs until a new gen)
@@aaron1182 "Latest and Greatest." Intel still can't downsize their process. Still stuck at 10nm. Sure, their CPU's are still beating out AMD in terms of Single Core performance, but almost ALL modern applications support some form of multi-threading. AMD's SMT beats Intel's HT every time, while having slightly lower TDP's and lower manufacturing costs, and by extension can be marketed cheaper. Also, I seem to recall some malicious code being inserted into Intel firmware and software that crippled some AMD products intentionally. Something AMD has never resorted to.
Let's not forget that by putting the pins on the CPU, you also lower manufacturing costs for the motherboard. There's also no such thing as "mounting pressure" in that case, as pins don't need to be pushed against something vertically. Instead, they're clamped and contact is made laterally. A cheaper solution.
@@aaron1182 Intel's prices are way too high and AMD hasn't performed badly. Price/performance for Intel has been awful for ages, and since last year AMD has been very competitive with Intel at around half the price.
Intel's performance results are mostly advertising nowadays; with their awful Retpoline/Spectre issue they lost ~40% performance with the first mitigation, and nowadays it's ~20% with improvements. Intel CPU security issues have been increasing every year, like CVE-2020-0543.
waltercool I think 40% is a bit over the top (I heard it was only a few percent)... If that’s true I’m turning them off though
I went through three Xbox 360s due to the red ring.
One of the things I like about retro gaming: I don't ever have to worry my NES or GameBoy CPU is gonna get so hot it causes a hardware failure just by playing them.
I went through like 4 ps2s. Not fun.
@@OliverNorth9729 was why it was my last console generation
22 years later and Cell still haunts Playstation in terms of PS3 backcompat
That's unfortunate, wasn't it the most powerful console at the time?
Seems like they should have developed that architecture so we wouldn't have all the low-end x86 trash we have today.
In that case wasn't it IBM that pushed the Cell architecture to Sony in order to push a "home supercomputer" platform?
@@stevenswall x86 is way more powerful than PowerPC ever would have been. Why would you put supercomputer architecture in a console? They're the last thing I'd expect to be running one math equation a billion times.
I'm so confused, ps3 dropped way after the year 2000
@@PMARC14 That's pretty much what GPUs do. Modern integrated GPUs are basically the same idea as Cell.
Apple had to water cool the fastest G5 Powermac, and never managed to release a G5 laptop. PPC970 cores did run very hot. It's why they jumped ship to Intel, and now the same thing is happening again, hence the jump to ARM. MS and Sony knew this, and took the risk of heat problems.
I think the ARM jump has a lot to deal with the iPhone and iPad running on ARM, so they can unify platform development.
@@comicsans1689 Pretty much so, I'd say. Even though Apple announced here in 2020 that by 2022 everything will be ARM, I'm fairly sure that choice was made back around 2016-2017. Intel was the main name in x86-64 (aka x64) CPUs at the time and AMD had yet to start laying down the smackdown with the Zen architecture (they were just coming off the relative flop that was AM3/+, with its high heat output, poor IPC and pseudo-multicore design sharing key FPUs between what were otherwise pairs of true cores - the design family that powers the XB1/PS4). Apple saw the problems Intel was having and was going to keep having at the time, so they gave themselves about 5 years of head start to move macOS from x86 to ARM and have it ready for a 2022 launch.
If Apple waited for say 2018/2019 to make the move from Intel for MacOS, they might have chosen to go with AMD for Zen-family CPU's to power future Macs, or they might have still made the choice to go with ARM and move from x86 anyway. We don't really know.
Yeah, and funnily enough, Power arch hasn’t been good for general compute efficiency since G4. They’re still useful, but mainly for enterprise server workloads as I/O, reliability/live failover, and security are greater concerns there (and Intel has been dropping the ball on those, leaving an enterprise server niche). There’s reason to believe IBM’s s390x and Power arch’s have been drifting into each other with s390x maxing out reliability and Power being more balanced. Right now, the best are:
Dedicated HPC: Fujitsu weirdly enough, using their newfangled ARM A64FX chips which are absolute vector compute demons, basically the modern-day Cell except not a nightmare to program
General-purpose/commodity: AMD because Zen2 arch is hard to go wrong with
Enterprise workloads: IBM, Power for better cost/performance or s390x for absolute maximum reliability
That's a true statement, but when you compare to the time era it doesn't mean much. AMD wasn't able to push 3ghz clocks on multi-core processors without using even more heat, and Intel wouldn't have anything heat/power competitive until 2006. If things had waited just a year it's easy to see how different this generation might have been though. Apple likely had advanced knowledge of how much better the core2 was compared to outgoing P4's, and they would have delayed their move from G5 to intel to match with this vast improvement. OTOH, Sony pushed back their ps3 launch due to blu-ray AACS fuckery, and if they had known the ps3 was going to be so far delayed, they could have used that time to gain a huge compatibility and performance advantage by leveraging a core2 based processor on their platform.
Given how far below competitive clocks the wii, u, and switch have been I doubt very much that their choice of platform matters in terms of having a good thermal performance. They just need to ensure they meet the bare minimum cooling requirements spec'd by the chip and hope that their supplier isn't lying about thermal/power properties.
but the RROD issue was because the GPU was overheating. It was never because of the CPU... how are people getting this wrong?
While the 360 RROD was very common, it was almost always related to the GPU. The CPU rarely had anything to do with those failures.
Yep, the X1900 overheated and broke the solder.
@John Hooper You should say typical nVidia too. They had serious soldering issues as well, they lost the apple partnership because of that and laptops were dying with nV chips in the thousands. So this is not typical ATI, it was a common problem for all players when they switched to more environment friendly solders.
urrrrgh, now you tell me
@@hatesac1 Nope. The temperatures are not hot enough to melt the solder. The GPU die would detach from the chip substrate due to poor design. Louis Rossmann has a video all about this.
@@hatesac1 Not the solder. It was more likely inside the GPU. MacBook Pros had a similar fault: the GPU dies itself, and not just from heat. It was a manufacturing problem with the GPU itself.
Apple suffered too with the PPC 970 heat, the Quad Power Mac G5 used liquid cooling and is famous for leaking and the machine itself being unreliable.
You seem to have some mixups. The Pentium 4 that used RDRAM was the Willamette core, socket 423 (and later 478), which maxed out at 2.0GHz without HT. It wasn't until the Northwood core that HT was introduced on consumer Intel CPUs, with the 3.06GHz Pentium 4 w/HT in late 2002. Later a Pentium 4 3.0GHz HT was released with the higher 800MHz FSB (whereas the 3.06GHz was based on a 533MHz FSB).
They hit the limit of a single core... and the end of Moore's law... after that it's endlessly more cores and pushing the chips harder... in the end either you change from a chip to a tower, or you just end up finding endless ways to use less power to reduce heat...
RDRAM was very short-lived
Microsoft- we lost money on each original Xbox. We need to cost cut a bit on the next console to make some money back.
Red ring of death.
According to SSFF, they also cut costs on test devices and test procedures. Hence, high failure rate and shipping faulty units. There was a guy who was sent 3 faulty 360s in a row. The PPC problem just adds a tile to the jigsaw puzzle.
Microsoft gladly burned billions on the Xbox division in order to maintain mindshare and consumer loyalty.
@@GiuseppeGaetanoSabatelli particularly to children who aren't discerning. The way Camel Joe sold cigarettes. That's why I find it hilarious to get linux running on these things, even though it's usually highly impractical.
and you wonder why Bill Gates hated the idea... he only allowed it so Japan wouldn't corner them on gaming.. since if people only buy consoles, they don't buy PCs... and then the PC is nothing more than a workstation for the office... not a real consumer household device...
@@waltercomunello121 the PS3 had extreme Heat Problems too dude
The PowerPC CPU in the X360 was great. Problems with the "red ring of death" on the X360 or the "yellow light of death" on the PS3 were because of lead-free solder, bad heatsinks, bad case design and bad cooling.
That problem in the PS3 was because of the NEC/Tokin capacitors, and it has been proven. Two PS3s I had from 2006 with YLOD ended up working AGAIN after changing the capacitors in both. How did I find this out? Whenever I blew hot air around the CPU and GPU, where the NEC/Tokin capacitors are, the console turned on for a few seconds or minutes, and once the console temperatures dropped (while playing or just idle), it turned off. Later I found a Brazilian forum talking about the capacitors and changing them to fix the issue. I tried it and never had the YLOD issue again. Went out of my way to do some testing with YLOD motherboards online and now own seven of them, all working lol.
UPDATE:
Edited to add this link www.psx-place.com/threads/tutorial-nec-tokin-capacitors-replacement-ylod-fix.25260/
@@robertvuitton Good comment. This is the real reason for YLOD. For all these years people blamed the RSX and Cell (PowerPC) for bad design and overheating, but the main problem was the NEC caps. Also, that shitty thermal paste under the IHSes made everything even worse: the PPC was already hot, but that paste helped kill those chips and those caps. The PS3 was very hot, but after you delid that 90nm Cell and replace these caps, you will never get YLOD again if you take care of the console. Slims were redesigned, had no NEC caps, and the process was shrunk to 45nm - that's why Slims are so reliable and fatties are not. But at least now we know the fix for those "seemingly dead" CECHA-CECHE models. They stopped putting that shitty thermal paste on the 3xxx-4xxx units and soldered the chips to the IHS, and those systems never fail. That was the real problem: because of the bad paste the chips got hot, because the chips got hot those caps overheated and died, and the caps got even worse with all of that heat.
Yep, went through 3 ps3's myself bc of ylod
R. Atuey - yup - I believe it may be the same kind of tantalum capacitor (possibly even the same rating?) that causes the majority of MacBook Pro 15” 2010 and 2011 “gpu” failures...?
No idea about the quality of manufacturing of other components, but everything I've heard from game developers is that the PPC chips in the PS3 and 360 were pretty bad to program for, with all kinds of performance pitfalls.
One could say that Nintendo was actually ahead of its time, with both the PowerPC based GameCube (used in the Wii and Wii U, as well as the 360 and PS3), and the ARM based Gameboy Advance (used in all their handhelds since, including the Switch).
Since the N64, Nintendo had been using RISC CPUs. Sony also always used RISC; they used MIPS chips. Microsoft's was the first console in a while to use CISC.
I wouldn't say Nintendo was ahead of their time with the Gameboy Advance being based on ARM, it was probably more of a necessity because an x86 or PowerPC handheld wouldn't have made much sense (power efficiency and thermals)
They probably are ahead of their time with the Nintendo Switch being ARM based though
@@mirac_ The Switch being on ARM is not ahead of its time, it's with the times, as mobile devices tended to use ARM before the Switch came out.
@@Luke357 I was talking about home consoles specifically, my bad
I wouldn’t be surprised if the PS6/7 or future Xbox‘s would be based on ARM
I'm a huge Mac fan and love PPC. I still have a last-gen (late 2005) G5 tower with dual 2.3GHz CPUs and 10GB of RAM; that thing still flies and is actually used as a daily driver.
POWER9 is still very relevant in the workstation space; it just means moving over to Linux to stay in the PPC ecosystem.
16gb G5 Quad here 🙂
kjjustinXD
I am staying away from that one because I am a bit scared of its liquid cooling. I do have a Quadro FX4500 in my G5
@@theshadowman1398 I got lucky, got it locally for only 50€, no leaks, but I replaced everything that may fail and refilled it with non-corrosive liquid that won't rip and tear everything apart when it fails.
@@kjjustinXD I'd rather jerry-rig a Noctua cooler instead of chancing an aging AiO, especially on a maxed out G5
AMD is really knocking out the competition by supplying the CPU and GPU for consoles this generation and the next.
AMD technology enables backwards compatibility with current gen and next gen.
Really, really looking forward to a video about AMD supplying this console generation.
They'll be using both cpu and gpu from AMD? that sounds damn great.
AMD is not the main reason for backwards compatibility.
@@SuperAmazinglover you're right, common x64 architecture is. But since consoles could never go Intel due to poor price/performance right now the only other option would be ARM, effectively killing backward compatibility once again. So... Thanks, AMD.
Fadex Price-to-performance on Intel is that way specifically because they choose it to be. They have no reason to lower their prices as they still have a fair share of the consumer market, and most laptops and prebuilts are Intel as well.
They also wouldn't go ARM because it's not powerful enough for next gen. We're getting backwards compat because of x64, and Xbox has worked really hard on their emulation. If it were all thanks to AMD, Sony would have a lot more games ready at launch.
We really should give MS's emulation team a lot more credit.
Not like they have an alternative. Nvidia has had petty bitch fights with anyone but Nintendo (and I'll bet they will at some point) and intel can't provide an all in one custom solution with proven tech, good yields and good thermal and power efficiency like AMD, especially on the GPU side.
The part about the Pentium 4 is wrong. The Pentium 4 only supported RDRAM at launch in 2001 with the Willamette architecture. The Northwood architecture introduced in 2002 had already switched to DDR SDRAM (or what you incorrectly label as regular DRAM, since there are multiple variants). The P4 HT processors were introduced with the Northwood architecture (3.06GHz in 2002 and 3.0GHz in 2003 with a faster FSB) and did not support RDRAM.
Not only that, but Northwood was pretty tame temperature-wise through the 2.8 GHz models. Pushing past that frequency and switching to 90 nm with Prescott is where the heat issues really began. Netburst wasn't an ideal architecture for IPC but the Northwood chips were competitive with AMD's offerings at the time. Both consumed about the same power. I'm guessing power consumption and price per unit were a larger concern than thermals or performance in Microsoft's decision to move on from Intel. Microsoft must have sunk outrageous amounts of money into redeveloping their SDK for PPC. You would think they would have stuck with x86 if the price had been right, based on what x86 hardware was available at the time.
Pentium 4 Willamette chips were first released supporting RDRAM only (socket 423). Later on, before the Northwood release, Intel launched Willamette cores (Pentium 4 and the first NetBurst Celerons) supporting regular SDRAM (socket 478). With the Northwood core, DDR became the standard.
@@KonjonoAwesome Yeah, they really made significant changes with the Prescott release. The major problem was (once again) the pipeline size increase, from 20 to 31 stages. That's literally more than 3 times the Pentium III stage count! At least in the 478 platform, upgrading from Northwood to Prescott didn't offer much advantages...
@@yukinagato1573 I bought one of those. Huge mistake. I even knew that AMD was better at the time, but I had 3 years earlier gotten a pretty good prebuilt PC with a Pentium 4 in it for a low price, and all things considered I didn't want to complain. I wanted to upgrade it, but didn't understand how meaningless it is to upgrade to something of the same generation, so I went from a 2.4 GHz Northwood to a 3 GHz Prescott. It had HT, but that didn't work in a home desktop environment.
No, it was not the Pentium 4 itself that did or didn't support DDR SDRAM or SDRAM; the north bridge was a separate entity from the CPU up until the Core i7 9xx series chips. Intel bet the farm on RDRAM and decided not to make a chipset for DDR SDRAM until 2002. They even designed a Pentium Pro (P6) chipset with RDRAM because they thought that was the future, but those chips never had a DDR FSB so they could not use it, except in dual-processor servers. In 2002 Intel gave up on RDRAM and dropped all the support and chipsets for it.
Love this video. A minor correction: those 3Ghz Pentium 4 chips with Hyperthreading used DDR RAM. Only the original P4s, the "Willamette" chips utilized the fast but expensive and inconvenient (had to be installed in symmetric pairs) RD-RAM. About 8 months after the initial launch of the first Pentium 4 processors, new budget motherboards with a new chipset were released that utilized the much cheaper (but slower) PC133 SDRAM, followed about 6 months after that by another chipset and set of motherboards that replaced those that used DDR RAM, which Intel stuck with moving forward, joining the rest of the industry.
Heat wasn't the only issue with IBM+Toshiba's Cell architecture.
The largest problem point for these consoles was that they released very early in the era of the move to lead-free solder, and in those early days the mix of metals in the solder wasn't fully figured out yet, and the techniques and temperatures you have to use for lead-free solder were just different enough that things were bound to go wrong under the right conditions.
And in these consoles, right conditions they were. Early failures were mostly due to cracking of brittle joints, that couldn't deal with the flexes and thermal differences between the base PCB and the CPU substrate PCB. But later on, the consoles that had come out of that era without issue had issues with the tin slowly starting to fringe outwards, eventually, the little hair-like leads coming off of them bridging with other joints, shorting the connections and interrupting the data processing flow or voltages, depending on what got shorted. If the consoles had released in the era of leaded solder, where the techniques and manufacturing process had already been in use with little failure for decades, this would likely have never been an issue, as there would not have been any stringing solder, or brittle joints, and the heat load would have been much less of a problem.
Even now, leaded solder, while generally being seen as less environmentally friendly for some reason (likely more related to the eWaste industry than just having it in a computer), still proves easier to handle and generally better for getting a good, strong and reliable joint, although manufacturing and design has long since shifted to account for the potential issues that tin based solder brings to the table.
PCB fabs had years to prepare for RoHS, but they stuck their heads in the sand and got caught with their pants down when it took effect.
KillerMemz PPC is also IBM...
@@bizzzzzzle The Cell architecture is co-developed between IBM, Toshiba, and Sony. In the full version of the Cell architecture, the version that the PlayStation 3 had/has, there was one PowerPC core with a few instructions tacked in to handle Cell specific workloads, that served as the master core, and several slave cores that ran on an offshoot instruction set that wasn't PowerPC compliant. The Master core served to handle all of the IO, the base processing power for the game, as well as the scheduling of the slave cores, while one of the slave cores got used for system UI, and the rest were reserved purely for games to use.
The Xbox 360 did get the Cell architecture, but it got a modified version of it. Instead of one master and several slave cores, the Xbox 360 received a version that repurposed the core-communication instructions of the main PowerPC core to communicate with other full PowerPC cores - three of these full cores, in what is now a much more traditional multi-core configuration, which would likely never have happened if not for the core-communication instructions co-developed by IBM, Sony, and Toshiba.
Imagine a single core XBox 360. What a different world *that* would be.
@@kmemz What if Sony's current gen had used the better and smaller CELL (used in the Super Slim model) architecture, but with the current gen's memory size? I would have loved to see how powerful that could've been.
@@robertvuitton While a new CELL style master-slave core processor using POWER10 and a newer manufacturing node would be great, I don't think it would have as much "Unlockable potential" as the original CELL did. The difference between then, mid PS4 cycle, and now, is:
Back then, single-socket multi-core processing was still a very, very new thing, and not many people knew how to code for it properly.
In the mid cycle of the PS4, most developers didn't properly utilize the eight jaguar cores of the consoles of that generation, because of AMD's whole thing about having combined important sections of their cores, leaving four floating point units and eight integer units, in practice working out as a strange hardware hyper-threading, and on the Intel side, we were still in the middle of the quad core monopoly era.
Now, The Quad core monopoly era is completely and entirely over, and most decent developers practically fully understand how to work with six to eight cores in practice. As long as they can understand both the master and slave architectures, have a relative understanding of how the master core schedules the slaves, and actually make sure to code their program to take advantage of the fact that the hardware is there, then I don't imagine it posing the same intimidating threat that it did back then.
On top of that, Cross communicating master cores have come a long, long way since then, and Intel's Ring Bus has proven particularly impressive for its theoretical limitations, with AMD's shared cache mesh bus, the "infinity Fabric", proving to be an absolutely incredible work, although its ties to system RAM clock hold it back a bit unfortunately. While cross communicating master cores still poses some theoretical limits, it seems to generally have more flexibility in terms of how well it scales depending on its implementation, and can be implemented in a larger variety of ways.
in short, I think that we've finally reached the era where what made the CELL hard to code for yet such a wonder of unlockable performance is long over, and a new version with updated manufacturing and architecture wouldn't look nearly as impressive as it did back then.
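To make the master/worker split described in this thread a bit more concrete, here's a rough C++ sketch of the dispatch pattern: one "master" thread queues jobs and a fixed pool of "worker" threads pulls them. This is only an analogy for the PPE/SPE idea - the real Cell used SPE-specific libraries, local store and DMA, none of which appears here - and the class names and worker count are made up for illustration.

    #include <condition_variable>
    #include <functional>
    #include <iostream>
    #include <mutex>
    #include <queue>
    #include <thread>
    #include <vector>

    // Toy job queue: the "master" pushes work, "workers" pop and run it.
    // This mirrors only the scheduling idea, not Cell's SPE mailboxes/DMA.
    class JobQueue {
    public:
        void push(std::function<void()> job) {
            { std::lock_guard<std::mutex> lk(m_); jobs_.push(std::move(job)); }
            cv_.notify_one();
        }
        void close() {
            { std::lock_guard<std::mutex> lk(m_); done_ = true; }
            cv_.notify_all();
        }
        bool pop(std::function<void()>& job) {
            std::unique_lock<std::mutex> lk(m_);
            cv_.wait(lk, [&] { return done_ || !jobs_.empty(); });
            if (jobs_.empty()) return false;      // closed and fully drained
            job = std::move(jobs_.front());
            jobs_.pop();
            return true;
        }
    private:
        std::mutex m_;
        std::condition_variable cv_;
        std::queue<std::function<void()>> jobs_;
        bool done_ = false;
    };

    int main() {
        JobQueue q;
        const int kWorkers = 3;                   // arbitrary worker count
        std::vector<std::thread> workers;
        for (int i = 0; i < kWorkers; ++i)
            workers.emplace_back([&q] {
                std::function<void()> job;
                while (q.pop(job)) job();         // worker loop
            });

        // The "master" schedules some trivial jobs, then shuts the pool down.
        for (int j = 0; j < 8; ++j)
            q.push([j] { std::cout << "job " << j << " done\n"; });
        q.close();
        for (auto& w : workers) w.join();
    }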
First thing MS and IBM did together since the massive drama that was OS/2!
@referral madness They also tried a different hardware standards of Personal System/2 (PS/2, but that's confusing in this context with Playstation!) to fight ISA (later EISA), which was a standard agreed to by "clone" makers. That failed spectacularly, worse than OS/2. The PC clones won out over IBM. There was a lot of drama at the time. IBM didn't even want "home computers" to exist, they wanted to use the PC to tempt growing businesses into buying "minicomputers" or Big Iron. It was a bad miscalculation.
There's a key part of this you missed - 3DO. The 3DO M2 was PowerPC based. Although it never shipped as a console (there's a huge story about what happened between Panasonic, Sega, and 3DO), the system was finished and was used in arcade machines and kiosks. 3DO showed the M2 to Nintendo, and Nintendo was impressed, but not enough to buy the MX successor. For the GameCube they ditched the MIPS processor (used in the N64 and the PS1) and went with the impressively scalable PowerPC. Apparently one thing that impressed Nintendo was how easily the M2 was able to have two CPUs work together with almost no overhead and no tricky programming compared to the Sega Saturn. 3DO people ended up working both at WebTV and on the original Xbox, so there were people familiar with PPC and with dealing with IBM, since they had already worked with them to create a low-cost PPC that was also tuned to be especially good for the needs of 3D games.
The 3DO M2 used two PowerPC 602 CPUs, but the console was inferior to the Dreamcast in every way. It would have been dead in the water if it had been released.
The lead designer of the Xbox 360 worked on the 3DO, so there is a link.
As Sony and Microsoft were transitioning their consoles to PowerPC, Apple were moving to x86. Now as Sony and Microsoft are adopting a decent x86/x64 architecture, Apple are migrating their Mac products to custom ARM CPUs. It'll be interesting to see if the following generation of consoles (PS6 + Xbox ?) follow suit again.
Microsoft already started migrating Windows to ARM last year. It's not the best solution at the moment, but personal computers will most likely move to ARM over time. If there is another generation of consoles after the PS5 and Xbox Series X (which I hope), I wouldn't be surprised if there is another switch in architecture.
AMD's got an ARM license, I bet Sony and Microsoft are already hashing out the details with AMD. They've both been restricted with design in order to maintain good heat dissipation, moving back to RISC would allow them to not compromise on aesthetics in order to get great performance.
@@amirpourghoureiyan1637 Yeah, didn't Jim Keller also design an ARM architecture while at AMD? K12 or something, but it was put on hold because of financial constraints and making sure Zen got released.
Once Sony and Microsoft transition to ARM (it'll probably happen, when the single-core performance gets there), hipster Apple will probably move again. Currently the best bet would be RISCV (no licensing cost to pay, meaning Apple keeps more of your money), but RISCV has issues at the moment. Maybe RISCV will sort out its issues, maybe something else will become available, maybe Apple will transition to OpenPOWER (IBMs latest PowerPC ISA iteration) instead. Who knows?
@88oscuro K12 got announced and then forgotten about. Maybe it will come when AMD has the cash flow to do so, maybe it'll stay dead, maybe AMD has found some reason why ARM won't be able to outperform x86 in the desktop market.
@@88oscuro The K12 design is complete. AMD's original plan was to use Zen for high performance (the Bulldozer replacement) and K12 for low power (the Bobcat/Jaguar replacement), but since Zen turned out to scale so well they shelved K12.
According to MS, the overwhelming cause of 360 failures was separation in the solder between the silicon and the substrate on the CPU. It wasn't so much that it was too hot, it was because of repeated heating and cooling cycles. Their stress testing (albeit rushed) didn't factor in how consoles are turned on and off a lot more than something like a PC.
@@deansmith6924 Blame the PPC 970, not ATI. Every old Mac user hates how unreliable G5 computers were.
@@deansmith6924 I am surprised that ATI was making substandard GPUs. Anyway, I use an AMD CPU on the Wintel platform, and Nvidia GPUs are a de facto monopoly.
I am a Mac person. The 970 is what caused Apple to switch to Intel in the first place.
Holy shit! Touhou music
I love Patchouli!!
@@C00L9UY me2
@@C00L9UY Flandre Scarlet is my fav (Yeah I know she is overrated)
The Red Ring OF death and the YLOD are not CPU related.
That's the GPU, from 2 different manufacturers causing the problem.
Mainly related to the RoHS solder used around that period of time being over-stressed by the heat emanating from these chips.
The CPUs, on the other hand, are hot, that's for sure! But they are designed to throttle correctly when overheating.
Also, the Xbox 360 GPU, is suffocating under the DVD Drive.
Lead free solder has a higher melting point than leaded solder, so how can it be 'over-stressed' by the heat from the GPUs?
RoHS is the cause of the problem.
@@ricky302v8 It has a higher melting point, but it's not as malleable as leaded solder.
Another problem is that these GPUs could not handle the extra heat necessary to solder that RoHS solder properly.
So they came out of the factory with balls of solder making physical contact with the motherboard but not really soldered.
Then, as time goes on and oxidation occurs, they lose electrical conductivity.
@@ElectronikHeart Nice to see you here! :)
It was actually a power supply issue: the power supply for the original Xbox 360 ran too hot and would damage the GPU. It was related to the solder, but the original GPU just wasn't up to par heat-wise to begin with.
Man, I sure love the way you present this information about the consoles and their architecture and exploits. Thanks for doing this; it is both entertaining and informative.
Props to MVG for using Unkai from the Axelay OST, one of the best level 1 tracks of all time!
Yes, I was like. Wait thats Axelay!!! Also props for Salamander Avatar there D4Disdain. Though I think we both know what shooter has the best 1st stage music. :D
Ay my man MVG with the Touhou 4 LLS Main Menu Theme Song at the beginning of the video, nice. I've never expected to hear Touhou music in one of your videos, what a pleasant surprise~
Sometimes I wonder what the Xbox-scene would have looked like if they were able to stay with Intel/nVidia. It could have been another explosion in homebrew, emulators, and backup capability at a much faster pace and maybe saved us from the RROD.
The original Xbox originally had an AMD CPU, which is why the CPU erroneously rolls over to addr 0 at one point (AMD CPUs throw an exception), they switched to intel relatively late in development
The day before announcement to be exact
@@DripDripDrip69 Really? I've never heard that before. What AMD CPU was considered?
Huge thank you for Touhou PC-98 music.
I don't know why, but every time I watch one of your videos, I just imagine the executives of whatever console company you're talking about banging their heads against their desks in rage when they hear people are loading homebrew on their consoles lmao.
..."console company" -> Why would they care that the old systems can run games no longer for sale without purchase, when the game publishers/developers don't care anymore either? (other than Nintendo, since they can't make a dime without nostalgia these days)
@@DxBlack I imagine it's during the time when the console is fresh and new.
@@DxBlack you kinda lost me at "[...] they can't make a dime without nostalgia these days"... I'm pretty sure the Switch is selling pretty well and not just on retro stuff...
Now if you're counting old franchises or characters making money, that's a bit too broad, and we could say the same about Disney or even Marvel.
There were a number of incorrect statements in this video but I'm only going to mention one. The original Xbox did not have a celeron processor. It actually had a modified Pentium III Coppermine processor (modified down to reduce cost but preserving the key aspects of coppermine).
I've just missed the Xenon era, getting into console game development shortly after XB1 release. But I've seen enough #ifdef XENON over the years to have a pretty good idea of just how many workarounds had to be put into place to get the games to actually perform at an acceptable level on the 360. Sure, you could just take your Windows DX8 game and compile it for 360 with minimal changes, but you weren't shipping a competitive AAA title this way. It's good to know there were good, legitimate reasons to go PowerPC in that generation, and it wasn't just a case of mass insanity in the game development world.
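For anyone curious what those guards look like in practice, here's a hypothetical, stripped-down illustration of an #ifdef XENON-style platform path in C++. The macro name comes from the comment above; the specific workaround shown (keeping the inner loop branch-free because unpredictable branches hurt a long in-order pipeline) is just one plausible example and isn't taken from any real SDK or shipped game.

    #include <cstddef>

    // Hypothetical platform-guarded routine in the spirit of the "#ifdef XENON"
    // workarounds mentioned above; both paths return the same result.
    float clamp_sum(const float* v, std::size_t n, float lo, float hi) {
    #ifdef XENON
        // Assumed in-order PPC path: keep the loop branch-free so unpredictable
        // data doesn't cause costly mispredictions on a long pipeline. The
        // ternaries can map to conditional-select instructions rather than jumps.
        float total = 0.0f;
        for (std::size_t i = 0; i < n; ++i) {
            float x = v[i];
            x = (x < lo) ? lo : x;
            x = (x > hi) ? hi : x;
            total += x;
        }
        return total;
    #else
        // Generic path: plain branches, fine on out-of-order desktop CPUs.
        float total = 0.0f;
        for (std::size_t i = 0; i < n; ++i) {
            float x = v[i];
            if (x < lo)      total += lo;
            else if (x > hi) total += hi;
            else             total += x;
        }
        return total;
    #endif
    }

Multiply that by hundreds of hot loops and you get a sense of why "just recompile your DX8 game" wasn't enough for a competitive 360 title.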
HOLY SHIT PC98 TOUHOU MUSIC THAT WAS ABSOLUTELY UNEXPECTED
I love Patchouli!!
@@C00L9UY yup, TOUHOU! Fatchouli is my fatfu
Zun is best guy
Zun art is best art
Immediately noticed it :D
Mistakes were made: at 2:11, the Xbox had a Pentium III CPU, not a Celeron.
The chips ran hot, and that was thought to be one of the causes of the PS3 YLOD. It's crazy that the real reason for the PS3 YLOD was the NEC capacitors failing. Replacing them should fix the system for good.
Did Sony ever own up to this? I remember when my phat PS3 YLOD they didn't want to know, even though it was a common problem, MS eventually took responsibility for the RROD although that was after my launch 360 had developed the problem...
@jvalex18The 'phat' models are notorious for it. Not as bad as RROD on the 360 but still a well known issue.
I hope you read this... I absolutely love your content. You should make a DVD/Blu-ray set of these documentaries, as I would buy it in a heartbeat!! Very interesting, and I would rewatch them. Thank you for your hard work in making these kinds of videos.
These kinds of technical history videos are so cool, and watching them is fun!
Nice job.
I find it interesting to see the shift that hardware has taken over the years, from excessive diversity to a rigid monoculture. Back in the day it seemed like every console and arcade machine had its own customized hardware designed specifically for it. But as games became more technically advanced and the costs of making them increased, it just doesn't seem financially viable anymore for console manufacturers to spend exorbitant amounts of money on custom designs; instead they reach for tweaked off-the-shelf solutions. Which is why the PS4 and Xbox One run x86-64 (like standard PCs) and the Switch runs ARM (like mobile phones/tablets). I feel like for the 360, the PPC deal with IBM was simply the most financially viable deal for them at the time. Look how much money Sony lost on engineering the Cell - enough to offset pretty much all the profit the PS3 would have made in its lifetime. As interesting a design as the Cell was, I don't think we'll ever see such a radical and risky design again, because it just isn't financially viable anymore.
You have it a bit backwards, they made custom hardware because it was required for the games, commodity hardware was not capable. They never made custom hardware just for the fun of spending money.
Was working at IBM Bromont in Quebec when they were making those chips. Fun times.
Always a fan of the PPC architecture. I wonder what the current gen would look like if they had stuck with modern PPC designs.
The Race for a New Game Machine book was impressive. It said that the 360's PPE was originally designed for the PS3. IBM showed Microsoft the PPE, and Microsoft was impressed. Microsoft did add their own special sauce to that chip and extended the vector register file. It may have had 2 vector units per core (6 threads, 6 vector units), which knocked down the advantage of the Cell CPU a little. Around that time, even PCs were going multi-core, so the risk for these multi-core CPUs was relatively low.
IBM was also a good choice because IBM developer documentation was actually pretty good at that time (it's been a while since I have seen IBM developer docs). Microsoft is a compiler maker, and the in-order nature of the CPUs could be partly worked around with compiler optimizations. That probably sounded really good to Microsoft. The other advantage was that to learn how to develop an Xbox 360 game, you just needed a multi-core PowerPC system with an ATI graphics card, and Apple's PowerPC Macs were available at the time.
I also feel that the likes of Unreal Engine 3 taking off helped, since Unreal Engine is still being used, and at the time the praise was often "that system will run UE3 games very well!"
Considering their options, outside the RRoD, the CPU itself wasn't a bad choice. They probably should have tested the CPU more before releasing (they released an early version instead).
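For anyone wondering what code for those vector units looked like: below is a tiny sketch using the standard AltiVec/VMX intrinsics from altivec.h. The data and scenario are made up, it needs a PowerPC compiler with AltiVec enabled (e.g. GCC with -maltivec), and the extra registers/instructions of Xenon's VMX128 extension aren't shown here:

    #include <altivec.h>
    #include <cstdio>

    int main()
    {
        // Each "vector float" is one 128-bit VMX register holding four floats.
        vector float pos  = {1.0f, 2.0f, 3.0f, 4.0f};
        vector float vel  = {0.5f, 0.5f, 0.5f, 0.5f};
        vector float gain = {10.0f, 10.0f, 10.0f, 10.0f};

        // One fused multiply-add does four lane-wise ops: r[i] = pos[i] * gain[i] + vel[i]
        vector float r = vec_madd(pos, gain, vel);

        // Store to a 16-byte aligned array so we can print the lanes.
        float out[4] __attribute__((aligned(16)));
        vec_st(r, 0, out);
        std::printf("%f %f %f %f\n", out[0], out[1], out[2], out[3]);
        return 0;
    }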
10:20 Come on, ATI was integrated into AMD 7 years before that...
I don't know much about hardware and coding, but I love your videos so much. I watch them all the way through and even come back to watch them over again. You describe and articulate these things so well.
Ooooh Axelay theme! Haven't played that game in years!
My dude coming through consistent with the good content
8:00 I don't know why this is framed as "unfortunate for Sony" that they didn't have exclusive access to PowerPC architecture and Microsoft was also able to use it. Nintendo had been using it since the previous generation and went on to use PowerPC for 3 generations in a row (GameCube, Wii, Wii U). Also Apple had been using it in their computers at that time. It was never a Sony thing. Sony only had the arrangement of special-use cores as the "Cell" processor. Microsoft and Nintendo didn't really try to do that. I don't think it was any more "unfortunate" for Sony than it was for Nintendo or Apple. They probably didn't care one bit.
I still remember all the red rings my friends got. I had a PS3, but a couple of my friends had Frankenstein machines: holes cut in the side, fans hot-glued in to suck the heat out. One guy had his mounted straight to a piece of wood instead of in a case at all.
Strange times to be sure.
I played the crap out of my ps3, never had any yellow lights. I know they were out there, I'm not naive, but I think the failure rates were 10% of the 360s.
I kept the shell off of mine and would just take the top of the disc drive off, remove the magnet, then insert the disc & put the drive lid back on. Basically hotswapping, which would also allow for ISO mods on unmodded 360s :)
Engineer: It gets hot
SMN: Ship it!
One thing that Nintendo never did was move away from RISC.
Sony switched to the x86 architecture with the PlayStation 4.
Quick correction here, the red ring of death was not at all related to heat. It was a defective graphics chip from ATi. Replacing this chip with a newer fixed chip repaired the console.
Heat caused louder fans and such, but didn't cause RROD.
Also, by the time the Pentium 4 HT was released, Intel had long abandoned RDRAM in favor of traditional DDR.
Red ring isn't caused by high temperatures and, in almost all cases, isn't related to the IBM CPU at all. The underfill used by AMD/NEC for the GPU was a poor match in terms of thermal expansion coefficient and glass transition temperature. After many thermal cycles it causes fatigue in the solder bumps attaching the silicon dies to the interposer. Doesn't matter how hot they run, it'll fail no matter what. It's also why reball/reflow is only a temporary fix. They're just going to crack again.
This was resolved at some point during the Falcon revision of the 360 with a move to a different underfill material. Take a look at xenonlibrary and ripfelix for some good deep-dives.
I saw an MVG video about old consoles; I like it.
'old consoles' RIP me
The thermal issues of the PowerPC 970 (G5) became apparent from the fact that Apple never released a G5-based laptop and that the dual-CPU G5 Macs used water cooling with copper blocks.
Can you make a video on what development of a simple hello world for an Xbox/PS3 looks like? The tooling/SDK/docs would be interesting, and how it differs from "normal" development with non-proprietary libs/frameworks.
I was working tech support for the original Xbox back in the day. In the months leading up to the 360's release, the hype and hoopla around it was astounding. When it came out, the 3RL issue gave me enough overtime to buy my first car.
Only the VERY early Pentium 4 used RDRAM, and that one never reached above 2.0 GHz and never had HT.
Pretty sure there were Pentiums up to 3 GHz that used RDRAM, but the option to use DDR RAM came after a year or so. It made the CPU much slower though. Still worth it over the overpriced piece of shit that was RDRAM.
found an article:
www.tomshardware.com/reviews/intel-ddr,403-4.html
Since the memory controller was in the northbridge, not in the CPU like nowadays, it was really a motherboard requirement.
@referral madness RAM tech developed by RAMBUS that uses serial rather than parallel communication. It was faster than SDRAM at the time, but because RAMBUS is such a patent troll, nobody uses their stuff anymore.
@referral madness It probably means you can't be bothered to use Google.
Socket 478 motherboards with RDRAM are really rare. Almost all of them use DDR memory
I think the tragedy of PowerPC is the sheer number of terrible PC ports we got for many games, and the lack of ports for many excellent games that are now stuck on 13+ year old consoles.
PowerPC just had a great price/performance ratio at the time; x86 could not match it. Remember, the Xbox had to hit a $299 price tag and perform well in gaming, and at the time you could not get anything from Intel or AMD that would perform like that at that price. IBM was the best partner to do this: reliable, strong, with decades of experience in custom designs. They were doing it very well with the GameCube, and then they did it again with the Xbox 360, PS3 and Wii. There is a reason manufacturers chose IBM.
I do not think IBM is to blame for the overheating; the designers at Sony and Microsoft are. The GameCube/Wii/Wii U did just fine.
Yes, that PPC 970 chip was hot as hell, but most fat PS3s failed because of the NEC/TOKIN capacitors and not because of a failing Cell. The other real problem was the terrible thermal paste under the IHS, which dried up after long use. A lot of people think the RSX is to blame for YLOD, but it is actually the Cell that was overheating because of that paste. Sony designed the PS3 poorly: they jammed that hot 380W PSU into a small case, right next to the chips, which added even more heat to those heat monsters. And even though the fat PS3 design is bad, if you took care of your system, delidded it and replaced the failing NEC/TOKIN capacitors, even CECHA models work to this day. Sony only solved their problems with the PS3 Slim; of course the 45nm redesign lowered the heat drastically, but that was the price. Remember, they designed that chip in 2002-2005, and Intel had a heat monster of their own creation as well, called the Pentium 4 and later the Pentium D.
Same story with the Xbox 360: they jammed all of those components into an even smaller case than Sony did and expected it not to overheat? MS is to blame, not IBM.
Both Sony and MS sold their consoles below manufacturing cost. At the time, PC prices were just too high, and even though PPC was cheaper, it still cost a lot.
The Wii U was a failure, to be honest; that chip was just too old at that point. IBM did not have anything acceptable in 2010-2011, which is strange considering PowerPC chips were in something like 200-300 million gaming devices. Why the hell did IBM not invest some money and effort in R&D to improve their PPC gaming architecture? They had the market share and the money; they got paid for every one of those 200-300 million units those companies sold, and yet they sat and did nothing with their architecture for 12 years. That's why x86 won: IBM fell asleep. They had Apple and the consoles in their hands and they dropped the ball. Apple predicted this and foresaw what would happen; the Core 2 Duo architecture was such a huge leap that even today those chips perform decently, which is kind of mind-blowing. For example, my MacBook Pro and iMac, both with Core 2 Duo chips, run so well with a modern SSD. A 14-15 year old chip working well even to this day is amazing.
To be honest, the Cell was a technical marvel. It could not be matched for some time; it was created in 2005 and was amazingly powerful, and it was a creation of Sony, IBM and Toshiba. The bottleneck of the PS3 was the Nvidia GPU; with a better GPU I believe it could even play modern games. Look, they went the safe route with the PS4, but the CPU power of the PS3 is stronger than that low-end x86 CPU in the PS4, which just shows that PowerPC was amazing and really could have remained an option if not for the lack of interest from IBM. Both Sony and Microsoft decided to offload most of the processing to the GPU side and chose a weak x86 CPU to do the other calculations that could not be done on the GPU. I do not buy the argument that developers were more familiar with x86 than with PPC: they sold 300 million consoles from various manufacturers, plus Apple users were on that platform, so PPC certainly had developer interest. They chose x86 because it was easier than PPC, but most importantly it had better performance per price; IBM just dropped the ball. Manufacturers chose to go the code optimization route, which Nintendo had done for years starting with the GameCube; they did not care that the console was weaker, but they made some of the best games. Look at late-cycle PS3 games, they were amazing. Even on the Wii U, which was a failure, they managed to make some amazing games. It's all about code optimization: hardware is nice, but what really matters is price/performance plus code optimization, and that's why we got x86 APUs in the 8th gen, and again in the 9th gen, because for the price those APUs cannot be matched, just like x86 could not match PPC in the 6th gen.
To conclude, it is kind of amazing that IBM had such an impact on gaming and content creation history; they had it all and they dropped it all, which is sad, because we need competition for x86. Competition is good for us consumers and it makes technology move forward. Now we see the same thing happening with x86: Apple has already announced they are dropping x86 for ARM, which I think is a good call. x86 was a PPC killer and now ARM is an x86 killer, with the same old problems of too much heat and incremental improvements. We really have not gotten anything major since Sandy Bridge was introduced in 2011; we only got more cores, higher clock speeds, faster RAM and faster storage on these new platforms. At least AMD fought back and provided chips like the R5 1600 AF, R5 3600 and the new R5 3100, which are amazing for their price; Intel is down on its knees now. I would not be surprised if at some point even the consoles drop x86 and move on, but for now, and probably for the PS6/XBOX1_V3, it's going to be x86, because the price/performance is unmatched at the high end. At the lower end, lower-wattage ARM is already better than anything x86 can offer; look at Apple's A13 chips, they are cheap yet they can match a 15W Intel CPU at 7.5W. It's just a matter of time until they learn how to scale those ARM chips up and outperform x86 at the higher end of the market too.
^this
That generation was the last true leap in video game technology. 😮
Again an interesting video - Thank you
At 10:30 I see a modified Xbox 360 (with an external power cable for the DVD drive?). Can you give more information on this? Where can I buy this long cable?
Nice nod with the touhou intro music, started chuckling at work when I realized what it was
I was too young to appreciate the reasoning for all of these designs at the time. But looking back, these decisions were the right moves. Console gaming has always had to find a way to be more competitive and cheaper than PCs.
Consoles were like $300 back then, when a PC CPU alone could easily be $100-300.
@@maggiejetson7904 yeah but nothing beats the comfort of plopping a disc into your console and pressing A to start your game
10:00 Oh wow, a Burger Becky article about perf and Load-Hit-Store and coaxing the C++ compiler into not generating garbage. There's quite a bit of difference between early and late X360 and PS3 games, and counting cycles and pipeline stalls made the difference.
IIRC late-generation games like Gears of War 3 were known to cause overheating issues because efficient use of the hardware means everything is being used at the same time; those 40 cycles of pipeline stall in that article meant a core wasn't being fully used, and thus less heat was emitted.
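For anyone who never hit Load-Hit-Store: here's a hedged sketch of the classic case on those CPUs. The function names are my own invention, but the pattern, a float-to-int cast that secretly stores to the stack and immediately loads it back, is the one those era articles keep warning about:

    // PowerPC has no direct move between the float and integer register files,
    // so a float-to-int cast compiles to: convert in an FP register, store the
    // result to the stack, then load it straight back into an integer register.
    // On the in-order Xenon/Cell PPU cores that back-to-back store/load is the
    // load-hit-store: the load waits ~40+ cycles for the store to drain.
    inline int WorldToTile(float x)
    {
        return static_cast<int>(x);   // hidden store-then-load on PPC = LHS stall
    }

    // Usual mitigations: stay in float math as long as possible, and batch the
    // conversions so the result can go straight from the FP register to memory
    // (the compiler can often do this) instead of bouncing through a GPR.
    void WorldToTiles(const float* xs, int* tiles, int n)
    {
        for (int i = 0; i < n; ++i)
            tiles[i] = static_cast<int>(xs[i]);
    }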
@referral madness one of the co-founders of Interplay, industry vet going back to the 2600 and Apple IIGS
Probably want to read "Masters of Doom" for the burger story. It's a name that keeps popping up around PC and 16 bit era games.
The funny thing is that while the consoles were moving to PowerPC, Apple turned their backs on IBM PowerPC and used X86.
Now we have similar times again. Apple is going to ARM, Nintendo uses it too, and PlayStation and MS use AMD APUs.
The question is not which CPU architecture is used but which graphics acceleration.
AMD made a really great purchase with ATi. Their current success is built on that purchase.
Intel can't get GPUs right, as we already saw with the i740; and Larrabee, we're still waiting for it 😄
We now need to see how ARM will evolve and if Apple is really getting performance out of ARM. But more important is the GPU performance and if there will be more competition from maybe NEC.
Well, Apple moving to Intel was a huge loss for PowerPC. I'm sure Microsoft and Sony got a great deal on their PPC chips... because you need customers to keep your fabs working, you know.
@referral madness The NEC PC98 series were x86 IBM PC clones with a different BIOS... not a big deal. The PC88 was really interesting though.
@referral madness NEC, or rather PowerVR, is still active in the mobile sector. We have the successors of the Kyro graphics cards, for example the PowerVR GE8320. The PlayStation Vita also had PowerVR built in.
NEC has nothing to do with PowerVR; they left them in 1999. Imagination then jumped in with STMicro, only for them to bail and have the Kyro 3 cancelled in 2002. It hurts even today when I think of it. I know Apple's custom GPU uses Imagination patents, but I don't know much more than that.
Although they shared the same architecture, harnessing the power of those processors was difficult and different between all 3 consoles. Making a cross-platform game at that time was a nightmare.
My 360 was cursed. Out of all the consoles I've ever owned, the 360 was the one I took apart the most which was always for repair. The 360 was a such a furnace that the cooler mounts would deform under the heat. When I finally perfected my cooler mods and had it running seamlessly, I accidentally flashed the wrong firmware onto the dvd drive and bricked it. When I found a cheap compatible replacement, I spilled an entire diet coke into the console as soon as I took the cover off. At that point I took it as a sign and shelved it.
Argh! Pentium 4, bring the pain!
Good thing I only used a mid-level chip back then... the Core 2 Duo also started heating up after a few years of use.
I'm digging the Cave shirt and Shmup music in this vid. I've been grinding out Dai-ou-jou for the past week or so, so these are timely references to pop up here.
2:47 The original plan for the Xbox was for the console to have 128MB of RAM, but Microsoft cheaped out at the last minute. It makes me mad even today. I always wondered why the original Xbox never got a CPU/GPU die shrink (180nm CPU and 150nm GPU), but I have never found any article from MS, developers or the game media stating why this didn't happen. MVG, do you know? Intel did shrink the Pentium 3 to 130nm, and Nvidia had their GPUs made by TSMC, so I am sure they could have moved the Xbox GPU to 130nm.
They realized it would be too expensive and left no room for a hardware refresh.
Yes, the Sega Chihiro is essentially an Xbox with 128MB. You can upgrade an Xbox to 128MB, or have someone do it for you for about $100. It basically allows you to play Sega Chihiro arcade games; there's no other real benefit, considering Xbox games were designed and optimized to only use 64MB of memory. It does help with other emulation as well, but by now we have modded 360s/PS3s or a PC that can emulate the higher-end stuff.
I don't understand any of this technical stuff, but for some reason, I really enjoy listening to it. Maybe it'll all come together someday in the future!
And now Apple's jumping ship from Intel! (For good reasons, too...) It's funny: AMD's chips are soaring, Apple's SoCs are going to be super exciting... and Intel is having die shrink trouble and thermal issues similar to what the PowerPC line had... What a wild moment in time!
Intel sat at the top of the hill and abused the shit out of us consumers, and even more so enterprise and industrial customers. They milked us for almost a decade, ever since they won against the Athlons and Phenoms. I'm the happiest person to see them go.
Well if former WebTV staff were directly involved, that would explain that one picture with the guy in the WebTV jacket standing next to all the 360s connected to Lamprey boards.
You'll likely find that image if you search for "Xbox 360 Lamprey"
I really miss when each console manufacturer tried to use a completely different design. Today everything is the same. SO BORING.
I don't understand that. When you play the game you cannot even tell what CPU architecture it is running on. I am more interested in the ability to load Linux or custom-upgrade the hardware.
What's really bad is that all the case designers for the 360 and the PS3 had to do was improve airflow and use more active cooling with better thermal paste. I've modded both console cases by simply adding cutouts to improve air intake, and on the PS3 it dropped CPU temps by 8 degrees C. The 360 relied on basically trying to passively draw air across the heatsinks from 2 incredibly small bands of holes. Stand the console up? You block the bottom intake. Add on an HDD? You block the top intake. The 3rd and 4th bands of holes? Blocked by the presence of the motherboard itself. All I did was cut a huge hole in the case and stand off a sheet of plexi to protect the components, and temps dropped dramatically for that system. Add in the 2 small fans, one on the old-style CPU cooler to push air through it and one on the extended arm of the GPU cooler to push air down over it, and it runs without a single stutter in fps.
I get tired of morons dragging PS3s into the problem Microsoft had with their POS 360s. They never bothered fixing the problem in the first place by using better thermal paste and making some minor adjustments to their fans. Sony never really had a problem until after a lot of heavy use over a long time with the PS3. I never had any problems with my PS3, and I used it just as much as my 360 and still have no problems. The 360 issue with me is very different: I didn't buy one until the big lie was going around that Microsoft had finally fixed it with the Halo 3 Edition 360. Mine lasted 3 weeks. Microsoft has always been last because they still sell 💩 to their customers.
Also, personally I don't think Sony's PS3 ran that hot, tbh... A lot of the YLOD issues weren't to do with heat and solder balls like the Xbox 360 had.
I bought a job lot of YLOD PS3s on eBay the other week. None were solder ball issues... I repaired them all myself, simply by replacing the NEC/TOKIN capacitors with tantalum capacitors...
They ran extremely hot, dude, what the hell are you talking about!? They were toaster ovens, with heat that would make your hands sweat if you held one by the vents. Maybe if you only worked on non-launch models, then yeah, your statement is accurate.
No, those things could produce heat signatures seen from space.
Could you help me out? Tantalum caps aren't available where I am, and the ones that are available are both expensive and low in capacitance.
Would a regular electrolytic capacitor work? Tantalums basically should operate like electrolytics. If you could try it out so I can fix my PS3, that would be great :D
Man, it ran ridiculously warm.
@@ayuchanayuko They technically would, yep. Problem is, standard capacitors are too big and bulky; they need to be surface-mounted to the motherboard.
I used to surplus computer hardware at an old job I had. I had to open up printers to verify there were no hard drives that needed to be destroyed. I can't remember the model anymore, but there was one printer series we had that used a PowerPC 750. Always got a kick out of that.
In one machine it's playing state-of-the-art games; in another it's just running a web page and listening on the network for print jobs.
Here at less than 10 views! Great work keep it up!
I love PPC and ARM and MIPS. Similar instruction sets and way more pleasurable to program in than x86
Exactly, PPC gets hugely misunderstood nowadays. It didn't disappear from the consumer market because it was bad; it was 90% due to mishandling of its potential and 10% due to Intel getting their sh*t together with the Core CPU lineup 🤷🏼♂️
It kind of makes sense that the Xbox 360, PS3, GameCube and Wii had PowerPC processors similar to the IBM-powered Macs. How great could it have been? Well, the difference here is history...
Unf that classic PC-98 Touhou music hits you like a truck, so good.
Complaining about PowerPC thermals is a bit strange, given that the problem with the Pentium 4 was also thermals, and PowerPC thermals are laughably low compared to much of what followed. It just happened to be around the time that computer cooling started to become a big thing, and therefore it was comparatively new to customers and manufacturers. Nowadays nobody bats an eye at monster tower coolers, 3-4 slot GPU coolers or regular AIO and even custom liquid loops. In the Pentium 3 era, a shoddy aluminium heatsink and a mini fan were enough.
1:48 A PC that would play games? =o wow really impressive