The One Chip Future
- Published Dec 18, 2024
- Should more manufacturers include RAM as part of the SoC, like Apple's M1 and M2?
Watch the full WAN Show: • I'm Furious But NVIDIA...
► GET MERCH: lttstore.com
► LTX 2023 TICKETS AVAILABLE NOW: lmg.gg/ltx23
► GET EXCLUSIVE CONTENT ON FLOATPLANE: lmg.gg/lttfloa...
► SPONSORS, AFFILIATES, AND PARTNERS: lmg.gg/partners
► OUR WAN PODCAST GEAR: lmg.gg/wanset
FOLLOW US ON SOCIAL
---------------------------------------------------
Twitter: @linustech
Facebook: @linustech
Instagram: @linustech
TikTok: @linustech
TikTok (LMG Clips): www.tiktok.com/@_lmgclips_
Twitch: @linustech
Soldered (reasonably priced) memory makes sense, but soldered storage is terrible for the consumer, and the manufacturers know this.
Hard-to-replace batteries are also a bad idea, since those are the parts most likely to go bad first.
The thing is, there can be a performance impact, since every extra millimeter of solder and trace does impose a delay. However, with current SSD technology that likely doesn't matter as much as it would for memory. NVMe/M.2 is good enough for now, but who knows whether we'll need more speed, and whether we can supply it, in the future.
It's exactly what I was going to say. It's the worst tech trend ever. It should straight up be ILLEGAL. It's only good for the manufacturer because of the price markups. The SSD is kind of a consumable, and it's the most-upgraded part in a laptop. ~maybe the EU should look into this~ 👀 I want to upgrade my M1 MacBook SSD 😭
I think it should only be allowed if you also have a SATA or M.2 expansion. Imagine a Gen 5 SSD and DDR5 soldered next to your die, with an extra 16GB in the DIMMs and 2TB in the M.2. Crazy fast (on paper). I'm no hardware engineer, though.
@@lillee4207 If you have space for all of that, then you have space for one SSD mounted in an M.2 socket. The Steam Deck's SSD is an M.2, and so is the ROG Ally's. If those crazy-compact PCs have space for a socket, I'm sure you can find space in a 13" laptop.
Tighter integration has been a trend for as long as there have been computers. Upgradable math coprocessors moved into the CPU, same as upgradable L2 cache on the board. You used to have to put in a USB controller card, which moved to the motherboard and is now even in the CPU on AMD.
Math Coprocessor was upgradable?
@@uis246 they were optional; you'd socket one in to accelerate floating-point operations
@@uis246 Yep, it was a big day in my school's computer room (literally, "the room with the computer in it") when Mr Enright plugged in the new floating point co-pro. =:o}
@@uis246 Yep, look into the 487
@@LordApophis100 exactly. They are pluggable, but not upgradable.
I am fine with a CPU with on-package RAM, but leave the DIMM slots as well. Imagine a CPU with several gigs of HBM in the package and the ability to add many more gigs of DDR6 or whatever; either use the HBM as a huge cache, or as part of the total RAM available.
Which is kinda already available in the data center via Xeon Max. Just gotta see if it ever comes to desktop.
That would make the whole thing go over easier, as you'd have the best of both.
This would have been perfect for the M2 Mac Pro, which actually has room for 16 DIMM slots (the number of memory channels on the M2 Ultra).
There isn't going to be "RAM" as we know it in the future. All system memory will be replaced by large, high-bandwidth, low-latency on-die/chiplet memory caches which are scalable and stackable.
At that point you're essentially adding additional memory controllers. And the two memories are going to have different performances.
That "embedded DDR6, hbm etc" will just be another cache but the designers still have to allocate expensive area to make memory PHYs and valuable pins to create the slots.
If consumers show they are willing to pay more for that option, then they will make it.
0:58 I am 100% with Dan there.
"Should we run 10 year old CPUs". Should people have enough money to buy a complete new system every three years or so? They probably should, but many don't.
Not to mention "being green", "the environment", and "sustainability"… every company talks the talk; pretty much no company actually means it.
The reason I haven't upgraded my PCs (outside of graphics cards) in the last 10 years, is simply because standards change too fast. By the time I decide it's not enough speed, or I saved up enough, I realize that I can get more bang for buck by selling and buying new than buying the best processor&ram that will fit. Storage thankfully hasn't changed standards too far, and graphics cards are compatible enough (within a power budget), but I think this makes sense. It's just unfortunate that much of it will be wasted no matter what we do.
Storage standards haven't changed that much; I still use a hard drive from 2008. Sure, it's SATA2 and thus not very fast even by HDD standards, but it's still in perfect condition and works just fine for movies or old games. I have an M.2 NVMe for the system, though.
It hurts to hear this as the techy guy of the family, but yes, everyone in my family says "can you fix this for me?" and I do. Having everything on the system be one part bothers me, because what if it's just a bad DIMM of RAM? Is the entire system just scrap now? I don't have the patience for soldering, so chances are, if a family member has a bad DIMM, I'll just say "yeah, the laptop's dead".
Yes. Any single failure and the entire system is e-waste, even from a single bad RAM chip.
the one chip future doesn’t sound too good to me. it’s a bit too spicy, maybe without the ghost pepper i’d be okay with it
The main problem with this trend of coupling compute and memory is that it lets chip manufacturers artificially segment markets based on memory requirements. I want to do AI at home on consumer GPUs, but Nvidia insists 24GB of VRAM is the max you can get before the $40,000 hop to an H100.
Everything built on the chip is great for performance, but it depends on the type of system being built. Phones, Laptops, Micro Desktops, sure put an all-in-one package in there for better battery/power because you don't expect to upgrade the RAM. But desktops with expansion cards having modularity are great. However, the manufacturers need to keep the cost of the increased size of things in line with market value; none of this $400 8GB of RAM stuff Apple and others do. But I will buy a laptop with replaceable RAM and SSD as long as possible because I change the parts myself. And after 3-5 years, people should plan to upgrade because of technology improvements.
Admiral Hopper would have been aghast to know her name was being used on a system that didn't have the shortest paths between components.
Reliability is a non-issue, but consolidation leads to bundling and aggressive product segmentation, neither of which is good for the consumer.
Then again, the price might make it a moot point.
Even regarding the question: I'm fully in favor of putting RAM on the processor! However, it should also be possible to expand the RAM the traditional way. This has already been done with one of the newer Xeon chips, right? The one with 64 GB of RAM inside the CPU.
2 tier ram is an interesting idea
It's essentially adding another layer of cache
@@scaredscorpion No, since the on-chip RAM is actually registered as real memory by the OS. You can literally run one of those systems without any RAM sticks.
If we get tiered ram, then sure it might be "just another layer of cache" but having an entire 64GB of cache is something entirely different from "just another layer"
I just started a local PC repair/custom build side gig in my town. I charge 25 dollars to install individual parts for people (GPU upgrade, fan install, etc.); that honestly just covers my time. If they're doing multiple upgrades I compound it with discounts: fans, CPU, cooler and GPU would be 75, which is my average build fee for a custom build, so I typically urge people to just get a fresh build. However, some people already have the parts and just can't be arsed to learn how to do it.
As for the "Datacenter part swap" bit, at least where I work we do that with anything that has ever had any amount of customer data go through it. It's a bit ridiculous for RAM but the chance of a data leak no matter how small of a chance outweighs part cost.
Honestly, I feel as long as there's the option to SUPPLEMENT this SoC ram, then most won't be that upset about it imho
yeah agreed, idgaf about soldered ram or even storage, as long as it has the option to physically upgrade it too
But this memory would be slower, I guess, so this will need new kernel memory-management support. You could keep low-priority, inactive tasks in the external RAM and active or high-priority ones in the main RAM.
@@vanhalenbr that's called NUMA and has existed in OSes for a while (Win7 and up, Linux 3.13 and up). It's used a lot on multi-CPU servers where each CPU wants to prioritize closest/fastest memory regions. Desktops are typically a single NUMA node, but some laptops with integrated GPUs also run multiple NUMA zones.
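For a concrete sense of how that looks from software, here's a minimal sketch using libnuma on Linux (the node numbers, sizes, and the idea that node 0 is the fast tier are assumptions for illustration; build with -lnuma):

```c
// numa_demo.c - list NUMA nodes and place an allocation on a chosen one.
// Build: gcc numa_demo.c -lnuma -o numa_demo
#include <numa.h>
#include <stdio.h>

int main(void) {
    if (numa_available() < 0) {
        fprintf(stderr, "NUMA not supported on this system\n");
        return 1;
    }
    int max_node = numa_max_node();
    for (int n = 0; n <= max_node; n++) {
        long long free_b;
        long long total = numa_node_size64(n, &free_b);
        printf("node %d: %lld MiB total, %lld MiB free\n",
               n, total >> 20, free_b >> 20);
    }
    // Pin a 64 MiB buffer to node 0. On a tiered system this could be
    // the fast on-package memory, with slower DIMMs on another node.
    size_t len = 64 << 20;
    void *buf = numa_alloc_onnode(len, 0);
    if (!buf) { perror("numa_alloc_onnode"); return 1; }
    /* ... use buf ... */
    numa_free(buf, len);
    return 0;
}
```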
The CPUs from Intel that have RAM on package still support 16 slots of DDR5. So 16GB of HBM, or 64GB depending on how you set it up, plus up to 5TB of DDR5. The consumer version will probably have a much lower system RAM cap and only use one of the HBM chips, so only 16GB of HBM. That's still more HBM than most people have DDR5 in their computer.
After Intel recoups the money they spent on getting it to work, I can see this coming to the high-end consumer market, as Xeon and the Core i series share a lot in common, to the point that I'm sure some Core i series chips were made to be Xeons but binned down to consumer parts. The problem will be cost: making consumers accept +100 USD on the chip as "normal" will be hard, and many will not like that the minimum is 16GB of system RAM. On the lower end, though, this means we'll get boards with zero DIMM slots. For example, ITX boards might drop the DIMM slots and replace them with an M.2 PCIe x4 storage slot, since the RAM would be on package; that lets the board be smaller. For full-size ATX, the only reason we still have 4 DIMM slots is that it sells better than 2, even if it's rare for a consumer to fill all 4, so ATX will keep them and just gain the ability to boot without any additional RAM installed. For most users this will be good: most won't notice, and if the memory-management side is coded correctly, which tier is in use should be seamless, while DIY people get more RAM and another way to test whether a RAM stick is bad, IF they pick a board that still has DIMM slots.
Turning on-chip RAM into a sort of L4/L5 cache might not be horrible, assuming there is still room for more RAM.
I'd love to see chips with RAM as another option alongside current models, like AMD does with 3D cache. Surely there are a lot of useful applications for it: cheaper laptops (the ones with Pentiums) and ultrabooks might be even cheaper and more effective, since they already have RAM soldered on the board, and their cooling systems would be more compact, which means less weight. Handhelds could be smaller, lighter and more efficient, or use the freed real estate for battery or whatever else. On desktop, you'd have smaller NUCs and x86-64 SBPCs, or enthusiast versions for extra power. Maybe they could even be scalable; we could see a return of multi-CPU motherboards.
Socketed desktop CPU with unified RAM sounds like a good idea, as it's already proven by latest gen Xeon server CPU
@@sihamhamda47 as long as they still have an option to expand ram on board I'm all for it
@@TacticalHawk34 Yes, that's what I think. High speed unified memory with the option for DIMM RAM slots expansion
Loved Laserdisc. My roommate had one back in the 90s. And back in the eighties my neighbors had the older CED video disc system, back when I still had Beta tape machines.
I recently had a stability problem with my desktop machine, and it turned out to be an easy fix - I swapped out the AIO water cooler with an air solution and the machine stopped crashing. However, it can easily turn into a diagnostic nightmare - is it the CPU, is it the RAM, is it the PSU, is it the MBO, is it a loose cable somewhere, or some crazy thing. How much time and mental space can you afford to waste on trying to diagnose and fix the problem, if you have other things to do and this is really something you want fixed yesterday? I completely understand the concept of just swapping out the machine, restoring it from backup and getting up to speed with your actual work, because the cost of the new machine might be trivial in the overall scope of things.
It's not about repairability or future upgrades: a midrange CPU with an integrated GPU does the job for me. But in my use case as a developer I sometimes use more than 20 GB of RAM. That's where the issue starts: in this scenario I'd have to buy a high-end PC just because I max out the RAM, while the CPU and GPU sit idle.
That is a good point. If it's integrated, you could be forced to buy an i7 if you want more than 8 or 12 GB of RAM, since lots of memory is seen as gamer-y or production-grade and something you can supposedly afford, while an i5 is more than powerful enough for you. I could even see the entire bottom of the CPU stack becoming worthless due to only including bottom-barrel quantities of RAM. My laptop is maxed out at 8GB and it runs into that limit often.
"most people treat their computer like a microwave"
* me, fixing my microwave * : what?
Everyone out here discussing pros and cons like this isn't the ULTIMATE way for Apple to completely and utterly force all repairs, and replacements, and fixes to go through them for proprietary parts only available through them. This is why modular computer parts will always be the consumer friendly choice.
There are certainly good things about this. For example, consoles are integrated, and while I know they do break, I have never had a gaming console malfunction. I've seen lots of dead parts in PCs, or just GPUs and RAM acting up. Here in Brazil we have a particular issue with power supplies, because people sell crappy power supplies and then sell even crappier voltage stabilizers, even when there's no need for one at all... so you can imagine the number of dead power supplies; some tech shops used to just live off of that. (It's so crazy: walk into any home with a desktop PC and nothing else in the home has a voltage stabilizer, not the laptop, phone, or TV, but the desktop PC does, lol.) Same with phones and so on. The more moving and swappable parts you have, the easier it is for stuff to break or not work. I used to be heavily against phones having batteries that weren't easily swappable, but a decade later I've only had to change one battery, and I never had any issues with a battery not connecting properly.
I can't like this trend as a person who enjoys building computers. Maybe it could become the final push for me to take the next step and learn to properly solder, but it's more likely it would drive me out of the hobby entirely. Even if DIY continues, it will either require way more knowledge or just become an even more expensive hobby. On the other hand, for the average consumer this... is probably a really positive change, and as long as companies don't start making everything proprietary, we will get better PCs and more stable software.
In the DC, memory-on-chip is for cache. Full memory-on-chip is difficult unless you can have SKUs for all memory configs, which will be difficult across server configurations, and it gets even worse with massive amounts of RAM, where the die becomes huge and errors and failures climb. You'd need chiplets, and then assemble them.
Laptops are less problematic. I think 8GB laptops would become rare if you had RAM on chip, because that's a big risk. I could see 16GB, 32GB and the occasional 64GB. Same with desktop CPUs, though ARM may then clean up at the low end, so that might be counterproductive for Intel.
It will be interesting to see if iphone gets full TB3/4 connections and then a dock with screen and keyboard.
It also makes sense to replace the whole thing, because if one part fails, then at that scale something else will probably have a higher chance of failing too. Also, you can upgrade to a more efficient/powerful unit.
The issue with a one-chip design is that professionals (like myself) are completely fine with the same CPU or GPU for multiple years, whereas things like the amount of memory required to do our jobs can double or triple as you adopt more specialized tools. I recently started editing and color-grading photos from my camera, which are 12k by 9k in dimensions. That requires literally 20GB of memory per layer in programs like GIMP or Photoshop. I wouldn't replace effectively the entire computer just for that. That would be stupid.
@@shapelessed I hope that in the future, especially with the new competition in the CPU space due to open architectures like RISC V, we can see a good compromise for workstation - a fixed amount of memory directly on the die for high efficiency, and some DDR(X) slots for upgradeability - kind of the same approach as Optane based swap drives, but using actual DRAM as the swap drive.
Same with just exposing PCI-E lanes so that a new GPU can be added alongside the built-in one.
@@shapelessed Extreme use case scenario isn't a counterargument to regular-user-focused technological change, though.
@@kalex22029 Yes. Having on-chip RAM doesn't mean external RAM stops existing; for tasks with lower performance needs, there's no reason you couldn't still have external memory, right?
@@ImplyDoods Kind of. It wouldn't be perfectly integrated, because handling different memory regions at different speeds is a stability nightmare, but it would work like a RAM disk with a swap partition on it.
In the 90's you could buy RAM modules for video cards. Integrated RAM has become an easy pill to swallow for many because you just take the amount you need and double it and that's what you buy, and it will last for as long as a board normally would. ITX owners already do it. I don't think GPU boards are going anywhere anytime soon though because the upgrade cycle is shorter.
2:15 Worked in the DoD as a Sys Admin.
We did that even if it was running perfectly. The reason was the warranty expiring.
They wanted the warranty so if anything broke, they could get a replacement part next day air for free.
Once it expired, we bought a new one, migrated and sent the still working one with NO DRIVES or DRIVE SLEDS to a place where they got recycled or resold.
Talk about wasteful govt spending. It infuriated me so much how stupid it was
This is manufacturers pushing their desires on us under the banner of efficiency. Now, when you take on a new hobby like machine learning or get into virtualization, you'll have to buy a whole new system with a soldered-on chip compatible with your workload instead of replacing just the parts you need. That is wasteful and inefficient.
Would tiered RAM be feasible?
We already have different levels of cache & the ability to use both NVME and SATA, why not have faster on-chip RAM as well as slower, upgradable RAM slots?
We kind of had this with that one weird Nvidia card (the GTX 970), where 3.5GB had full bandwidth and 0.5GB was slow. Any game that used more than 3.5GB of VRAM tanked in performance.
We kinda have this in the data center with Intel Optane. You can mount an Optane Persistent Memory module on a regular RAM DIMM and it'll act like a massive _slower_ RAM stick
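As a rough illustration of how software touches that slower tier: on Linux, persistent memory set up in App Direct mode can be exposed through a DAX filesystem and mapped straight into a process. A minimal sketch, assuming a hypothetical pre-sized file /mnt/pmem/buf on a DAX-mounted pmem filesystem:

```c
// pmem_map.c - map a file on a DAX (pmem) filesystem directly into memory.
// MAP_SYNC guarantees the mapping is a true direct map of the device.
#define _GNU_SOURCE
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>
#include <unistd.h>

int main(void) {
    size_t len = 1 << 20;                   /* 1 MiB region */
    /* Hypothetical path; the file must already exist and be >= len bytes. */
    int fd = open("/mnt/pmem/buf", O_RDWR);
    if (fd < 0) { perror("open"); return 1; }
    char *p = mmap(NULL, len, PROT_READ | PROT_WRITE,
                   MAP_SHARED_VALIDATE | MAP_SYNC, fd, 0);
    if (p == MAP_FAILED) { perror("mmap"); close(fd); return 1; }
    // Loads and stores now hit the persistent-memory tier directly,
    // with no page cache in between: slower than DRAM, far faster than NVMe.
    strcpy(p, "hello from the slow tier");
    munmap(p, len);
    close(fd);
    return 0;
}
```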
Gone With the Wind was so long that there is a built-in intermission period in the movie both in the original theater release as well as all the copies I've ever watched.
things have changed so much. in short, we're running a virtual desktop server farm on HP hardware and the thing I picked up from my boss is basically: 'if you need more power, no prob, just add some more cores to your virtual machines, it will keep us in the cheaper tier.'
Mac firmware is reliant on soldered NAND, which is known to fail prematurely. Earlier you could keep using the machine with an external drive, but new ones are wasted if their storage fails. Integration has its pros, but it comes with some major cons too.
0:10 You mean SoC; so we no longer talk about processors but about SoCs.
People might not like it for upgradability, but it has massive benefits for the user in performance and responsiveness, so it's going to be the norm eventually. Separate components are going away.
Please don't take away our upgradeability...
Intel CPUs have a binary translator from x86_64 to an internal RISC-like ISA, and then a purely RISC core executes the instructions.
So what if we could compile binaries for the native ISA, for example on Gentoo?
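The internal micro-op ISA isn't exposed, so you can't target it directly, but Gentoo's closest real-world equivalent is compiling every package for your exact CPU. A minimal sketch of the relevant /etc/portage/make.conf lines (flag choices are illustrative, not the only sane ones):

```
# /etc/portage/make.conf - build all packages for this machine's exact CPU.
# -march=native lets the compiler use every instruction-set extension
# the local CPU reports, instead of a generic x86_64 baseline.
COMMON_FLAGS="-march=native -O2 -pipe"
CFLAGS="${COMMON_FLAGS}"
CXXFLAGS="${COMMON_FLAGS}"
```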
This wouldn't be a problem if the memory weren't laughably low on the base models. These companies are trying to say these products are for professionals, and then give you 16 gigs of RAM. It's 2023; 32 or 64 gigs should be the smallest tier.
Had storage die in my gaming laptop after less than 2 years. Yeah on package storage would be bad
'Risc architecture is going to change everything' - Acid Burn 1995
It makes sense for server usage, to have things soldered on, but that will impact resale value of the hardware after they replace it.
06:10 Not a direct apples-to-apples comparison, not even close, but I once had an automatic car that I managed to crash. The front end got messed up, but the engine somehow remained in working condition. I happened to have another car, a manual, of the same make and model as the one I crashed, except it had a bad engine that didn't work, so I got the repair shop to fit the automatic's engine into the manual car, and thought everything was fine and dandy.
But no, I had to repeatedly take the car in for repairs after that, and before I knew it, I had probably spent 3 times the worth of the car when I bought it just on repairs alone, so I finally wised up and decided to take it to the junk yard and get my recycling money for it instead.
So, yeah, repairing things isn't always the best thing.
I think we should take nature as an example here. Organisms die and give resources to other organisms.
If we applied that concept to computers, or consumer goods in general, it would translate into buying new BUT releasing the resources for recycling.
I think the most ideal way would be to integrate as much as possible (so you get all the advantages of integration like speed, efficient resource utilisation, ...) but recycle every bit that's cost-effective to recycle, to give resources to newer generations of a product.
If you think about it, Apple is exactly doing it with the iPhone and soon other devices, except for the cost effective part. 😅
Having previously owned a cloud computing company, I am not as worried by this "in the datacentre". Our biggest cost, other than staff, was electricity, so I would have happily paid for something that could not be upgraded but used far less power. We never upgraded server components - we always bought them maxed out, anyway.
So for many years I have been a fan of DIY, and also help out friends with how to replace or upgrade pc stuff.
I have also done a lot of small upgrades myself for my own PC.
But in the past 5-10 years I think it's been getting harder; bigger upgrades just make more sense.
As soon as you think more DDR4 RAM is good, boom, DDR5 comes out. (So why not just wait for the next build instead of buying more DDR4 now?)
Anyway, in my own case I have done a few things to get ahead of it.
So I have a "be quiet 1200w" PSU. (I can enjoy it for many years, and I got it as a demo product so it was half the price. Great deal for me)
Now I'm on an i9-10850K. I got an RTX 3080 last Christmas, and I just upgraded my M.2 OS drive to PCIe 4.0.
So I have bought a new motherboard, RAM and a CPU. For now it's a used i3-12100.
I got the i3 used at half price. It will only be for a few weeks.
I will wait for more info on 14th gen, and if the price is right in my opinion, I will go for an i9-14900K (or a 13900K, time will tell).
I have also bought an adapter for my CPU fan (still rocking my Noctua NH-U14S with an extra fan).
I'm also looking forward to seeing what the i3 can do; it's kind of like my old i7-4770K.
Quite a big step happened in those 8-10 years.
But all my drives are just M.2s nowadays.
So there's not much room for small upgrades any more.
(I don't even have a real pc case. just a homemade wood plate. Not sure if I can share a photo in here)
You will also see a rise in subscription-based hardware with this: you can access 16 out of 32 GB for X dollars, etc.
2:00 Linus is kind of implying here that it's an industry norm to just toss the server. There are lots of layers to this. It's not uncommon to have warranty contracts so that smart hands (IT staff) are dispatched to replace the part. For most common server repairs you don't even need to shut down the server: you slide it out of the rack, swap the part, and no one will know you were there. In some cases, Microsoft for example, there are entire preconfigured shipping containers full of servers, pre-wired, HVAC already taken care of. If something dies in there, oh well, humans don't go in there; if the whole thing goes up in flames, it's in a box, who cares, ship in another container, throw out the old one, no effect on the rest of the data center. But repairing things is still pretty common...
The name of the game for most data centers is how much compute they can get per square foot; they want to go as dense as possible. So a single server (YES, a single server, not a rack, just one server) can easily cost $100,000. You're probably not throwing that out over a simple stick of memory or a drive that can be swapped seamlessly without powering things down. Scenarios where throwing out an entire server makes sense are going to be few and far between... I'm not saying it's not possible, and I'm not saying Linus is lying, but even for a large data center, throwing away $100k worth of equipment... nah.
HAHAHA The lead paint joke had me in stitches, so aggressive!!
RAM on a SoC is totally fine. You're always going to need RAM, it's going to perform better than anything expandable, and the two can work in tandem. I am more bothered by things like integrated graphics where I'm paying for silicon that will literally go unused because my dedicated graphics card fully replaces it.
Windows latest update addresses that exact problem
I would support this if they also put a slot on the motherboard for upgradability. It would be like my old system's RAM config, where it used the original 8GB in dual channel, then once that filled up it used the extra 4GB I put in. Of course that extra 4 wasn't as fast as the 8 when accessing data on it, but it's a lot better than being stuck with only 8GB.
How about just buying the right amount at the beginning?
@@rewardilicious A: not everybody has the money to do that, especially because it's more economical to upgrade it yourself later. B: it destroys the device's ability to be reused and recycled later, because it can't be upgraded whatsoever. C: sometimes your workload and the things you run change. Today you might need 8 GB, but tomorrow you might get a new program that needs 32 that you never thought you'd use before. With a computer that's not upgradable, you just have to buy a whole new device. D: sometimes you want to upgrade beyond what the manufacturer will sell you at purchase time.
For instance, when I bought my eighth-generation laptop, I needed a computer that had at least 50 gigabytes of RAM. The max the manufacturer would give me was 32 GB. But because it was upgradable, I could just swap out the RAM and upgrade it myself to what I needed after I bought it.
I contacted Lenovo support recently to get a quote to repair my wife's laptop. They asked $2,500 to replace the mobo (with CPU/RAM soldered) in a 5-year-old machine. For that money I could buy a new, better laptop.
So I just bought a second-hand mobo from AliExpress for $180, which was even a slight CPU upgrade. Somehow nobody around expected the laptop to work again, but it perfectly does.
The "broken = trash" mentality :(
Oh, and I saved the old mobo, because it's just a south bridge or some IC like that overheating, leading to power-off, all due to a short circuit in a Type-C port.
I expect it to work perfectly as a little server, if I manage to at least desolder/disconnect that chip :)
Having worked with designers of similar technologies: the engineers themselves are excited to put memory on the chip to improve performance. As for the motives of company executives and marketing managers, the inability to upgrade is a silver lining of its own, one that also lines their pockets.
The ASUS 14-inch has a single DIMM soldered on. The rub was a blank pad right next to it.
10:40 LOTR Extended Edition on LD would have been quite a collection.
Now time to wait for LMG to explore CEDs, the actual vinyl-record (unsuccessful) video disc format.
The last 20 minutes of The Return of the King extended edition is actually just credits.
An economically built board as a unit replacement: fewer contacts, fewer failure points.
I think we're all in agreement that soldered RAM is an unfortunate necessity and soldered storage is stupid
Anyone who has had their phone repaired will know: your phone just isn't the same afterwards. If you send your phone off for repair and it comes back feeling as good as it did before you broke it, you've found the holy grail of repair shops. Most people get it back and it's just worse, and they want a new one anyway, and that experience affects how they deal with other repairs.
I'd like the options. I like stuff like Apple Silicon. If someone made AI-centric chips, or videography chips without Apple-sized prices, that would be great. But for real workhorse setups I'd like the ability to tinker, fix, and upgrade. I'm not sure why you wouldn't just keep the slower legacy options for expandable RAM and storage, unless it's just greed and evil.
In 2012 I bought a machine with 16 GB of RAM and 512 GB of storage. In 2021 I bought a machine with 32 GB of RAM and 1 TB of storage. Performance is developing so slowly that, for anyone other than the top 1% of users, you'll never need to upgrade until the rest of the machine is shot anyway.
Still using an i7-3770 I found in a trashed PC a few months ago; before that it was an i5-3470.
It's the same way with most things. My washer just started acting up and I paid someone $85 to tell me it's $650 to repair a 10 y/o washer.
The answer is that they are. Intel is already testing HBM on package in server settings, and those chips also accept regular RAM in addition, which is the CORRECT way to do it, but consumer chips probably won't offer that. What I'd really like is a Pi-card-type setup for the CPU, so you can upgrade the whole thing regardless of socket etc.; then you could still upgrade just a portion.
As long as I can install additional memory to a motherboard I don’t mind on chip ram. If the on chip capacity is the limit then it may be an issue.
The other thing about an at-scale, working data center is that you're paying for all of it: rent and power and bandwidth. Every computer in the building will need to be replaced on some CapEx schedule, otherwise you're wasting money. So how do you determine which machines get replaced? The oldest first, unless you have a dead (for whatever reason) machine. Then it's not just renewing the warranty with the manufacturer, it's also a speed bump (which means you can make more money). Yeah, a 2-year-old box isn't worth a RAM swap. If that guy has to go out with a cart anyway, may as well mark it as part of this year's deployment.
Listen to me: tiered RAM. Use the on-package memory as a low(-ish) latency tier and keep the slots for another tier. We used to have CPU cache on the board, after all.
I can see on-package RAM acting like a level-4 RAM cache, with slower on-board RAM behind it.
I mean, how often do you want to upgrade your GPU memory for example? It's always sold as one package. Storage is a different ball game.
I remember, as a Nuke ET in the USN, the systems were older but they were still rack systems, and we were required to diagnose a problem down to the individual component that went wrong. The solution would still be the same, though: submit that report along with a complete swap of the rack.
Yeah, I feel the same when it comes to storage and memory: I keep thinking, right, I'll fork out for the larger amount of memory and storage up front, because when it comes to upgrading there are only so many PCIe lanes, and your RAM benefits from being in those two slots at X speed with the other two slower (maybe not anymore). For the things that affect overall performance, I don't want to leave anything on the table.
The less choice we have, the more we allow companies to monopolize hardware. It's just bad for consumers. If Intel, AMD, and other CPU manufacturers decide to upcharge for the RAM on CPUs, and they (or other companies) make their products (or software) incompatible with traditional RAM, then guess what? We, as consumers, have to shell out the money to have functioning computers. No more choice.
Before something like this happens, we need strict standards to rein in price gouging and the monopolization of computer parts, as well as standards that specify how new hardware works with old hardware.
There is another aspect to remember: quantum tunneling. At a certain process size it becomes impossible to avoid tunneling, so gains are made by building tightly integrated, vertically stacked SoCs instead.
This is a problem for the chiplets themselves; I doubt the bonding process is subject to it.
I'm actually down for a large SoC for desktops, assuming they're still socketed. It would make building SFF PCs easier and help new users building a computer.
I can get behind a replaceable soc processor+memory combo, but it would be nice to be able to access that system if you wanted to.
Until I was 20 I did occasional gaming, I was interested in getting the best, prettiest PC I could, and I had time to tinker with it.
Now at 29 I just wanted the lightest, smallest laptop with the longest battery life, which ended up being a MacBook Air M2 for me. For others it might be some bigger, more powerful version. What I'm curious about is: percentage-wise, how many people want a customizable PC they'll later upgrade, versus the "just give me a thing that works" kind of people?
It's a bit of a tech nit but makes a world of difference: soldered-on isn't the killer for repairability, it's the use of BGA. The problem is that BGA is probably unavoidable for the parts that matter...
And even then, how many of you do incremental upgrades anymore? For me it's "Well I bought this computer 4-5 years ago and upgrading one part makes no sense so might as well upgrade it all"
LaserDisc has two different disc types: CLV (60 minutes per side) and CAV (30 minutes per side). Most special editions of movies came on CAV instead of CLV because CAV has a frame-by-frame mode. If the Fellowship of the Ring Extended Cut had made it to LD as a CAV release, it would be on 4 discs. If your LaserDisc player couldn't flip the disc by itself, you'd have to get up at least 8 times. The extra room on the last disc would be used for bonus content.
Fun fact: the Godfather Trilogy has 7 discs, which is 13 sides. Imagine getting up 12 times to flip the disc!
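That disc count checks out as rough arithmetic (assuming the commonly cited ~228-minute runtime for the extended cut, a figure not stated in the comment itself):

```
228 min / 30 min per CAV side = 7.6  ->  8 sides  ->  4 double-sided discs
```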
Luke casually dropping a hacker's movie reference
I'm all for right to repair and all that, but if I wasn't a techy person and walked into a repair shop to repair a device, and the guy there quotes me for half of what I paid for the device years ago - I'm sorry, but I'm buying a new one.
Is there any reason we can't have both? Like keep the RAM slots but just also give us like 16 GB of L3 cache (or L4, whatever you want to call it)? I guess the CPU will cost more, but it'll be faster in many cases so the price per performance will still be comparable (possibly). And then the regular RAM is just for higher capacity at a slower speed (but still faster than an SSD). 🤷♂️
Industrial downtime is almost always more expensive than replacement parts or even new equipment, not to mention that a repaired device may need to be re-certified for reliability. I'm not defending the practice from an ethical or even environmental perspective, but at scale it's what makes economic sense.
Upgradability is cool, but performance is simply the number 1 priority. I mean, sure, we could also have individually upgradable CPU cores, but that would make them suck. The advantages we can gain from an extremely fast interconnect are simply not replicable in an upgradable system.
I don't mind tightly coupled chips, but Apple could absolutely have had e.g. 16GB or 32GB "unified memory" but allowed you to use SODIMMs... but then no one would buy the 96GB M2 Max stuff. I get why they wouldn't do this with a MacBook Air, maybe even a MacBook Pro... but there's no excuse for the lack of expandability on the Mac Pro.
This is called hybrid bonding. BE Semiconductor Industries NV (BESI), based in the town of Duiven in the Netherlands, develops and builds these systems.
It only makes sense if the power is WAY more than people would need, so companies can run the computer at lower speeds for better efficiency and longevity of the product. But companies won't do that. I mean, Surface tablets have failing SSDs, which is theoretically near-impossible, yet it ruins the Surfaces with soldered SSDs. Offer an extended warranty for parts that shouldn't fail, or allow them to be replaced without soldering.
Failing SSDs are theoretically impossible? Who told you that? You're seriously misinformed. Parts that shouldn't fail? There's no such thing.
@@giornikitop5373 Mean rated usage is about 5-10 years of run time. Spread that over an average of 10 hours of actual usage per day and that's 15-30 years, all much longer than the average use case for any device with soldered SSDs.
@@RealWorldReview lol, yeah, 30 years. And "theoretically" the world is great. In reality, SSDs usually fail due to hardware issues (controllers, voltage regulators, etc.), so, like I said, there is no such thing as parts that won't or shouldn't fail. "Theoretical" doesn't mean much in computer parts.
It actually makes more sense in a datacenter, but yeah, no doubt that when something applies to mobiles, to datacenters and to laptops, it's probably coming to desktops eventually, even if it might take time, possibly a decade or two.
I'd think repairs are more viable now that a GPU alone can cost $1000+ and Macs cost $2000+; they don't drop to $300 in value that fast. For most products, the moment something is used it can lose half its value, but not computers, and especially not GPUs. I'd think repair is more viable than ever now that hardware has stopped dropping in cost from scale and prices are just exploding.
Embedded RAM needs to be just an extra-cache kind of thing, with regular RAM to hold the excess... It would really suck to not be able to upgrade or buy any config you want...
I will always keep buying computers/laptops that are upgradable. As long as I have the opportunity..
I don't see too much of a problem IF and only IF the on-package RAM banks can be selectively disabled when they go bad and you can still add more motherboard RAM. It's just another cache layer.
Build HBM into the CPU but also include DIMMs as an extra, larger cache, and honestly I wouldn't complain.
In the old days, before everything shipped with a password, I used to find open home WiFi routers, and it was so tempting to change the settings for a laugh.
People in enthusiast groups, I feel, often overestimate their impact on the market. I'm a car guy, been a Subaru enthusiast forever; I'd do anything for a modern WRX hatchback. And I know that in the groups I'd have dozens, maybe even hundreds of people with the same desire. But at the scale of the market we are still a drop in the bucket, and they can't pander to us. Same thing with wagons: the US just wants SUVs and trucks... And enthusiasts sit here and see all the cool things Europe gets that we'll never see, because enthusiasts just don't have that much sway in the overall market.
I feel like decisions of this gravity should be reserved for the higher tiers of technology, e.g. Enterprise, Corporate, Small/Medium Business, etc.
Manufacturers should offer a discount for returned chips that can be recycled. It's in their best interest, because a chip that's not on the second-hand market is a chip sold.
I think that a transition to gigabyte scale 3d cache chiplets/tiles would be the way to go for most processors, with external memory controllers for high end and enthusiast tier products. Heck, you could make a relatively small memory controller chiplet that would simply not be included in lower SKUs like how AMD just cuts the CCD count from the 7900/50 to the 7800. I doubt that would happen though, too much focus on profit these days I think.
Other than a school demo showing what a LaserDisc was and some of the features it had, I've only seen one movie all the way through on LaserDisc, and that was Return of the Jedi. I was going to say "obviously", but there's a chance the special editions are on LaserDisc too; it was before the special editions even came out that I saw it.
Imagine trying to watch The Stand on laserdisc at over 6 hours lol
As far as laptops go, I'm mostly fine with the SOC trend as long as the base model is something sensible and not just BARELY keeping up with the current generation. Storage needs to be upgradable though. Fixed storage laptops just need to die. If a base model SOC laptop started at 16 GB of RAM with a decent hardware, with upgradable storage, support for an optional eGPU, and an easily replaceable battery, I would be happy with that.
I've actually ordered a non-default MacBook Air and it took 2 weeks to arrive. No one is buying the non-default machines.
Even ordinary LPDDR memory doesn't hold a candle to the memory Apple has on the M2, which on the Max / Ultra runs at 400 / 800 GB/s of bandwidth.
If you truly have onboard SoC memory, it's a game-changer.
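For a back-of-the-envelope check of those figures (using the publicly quoted LPDDR5-6400 speed and bus widths of the M2 family, which are assumptions added here, not stated in the comment):

```
peak bandwidth = (bus width in bits / 8) x transfer rate
M2 Max:   (512  / 8) bytes x 6.4 GT/s = 409.6 GB/s  (marketed as 400 GB/s)
M2 Ultra: (1024 / 8) bytes x 6.4 GT/s = 819.2 GB/s  (marketed as 800 GB/s)
```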