TLDR: developers creating infinite abstractions over abstractions, using thousands of dependencies, using virtualization + containers, refusing to learn more efficient stacks for the work they're doing, never optimizing, never learning better algorithms, forcing features you'll never use; all to make their life easier and their work more productive at your cost, and if you don't like it they tell you "it's your fault for not having the newest hardware anyway". We like to pretend it doesn't matter, but it does. And yes, management requirements do have some fault in that too.
I'm old enough to remember when Visual Studio not only didn't take a full minute (or more) to load a medium size project, but when it did, it even worked properly.
@@alexeysamokhin9629 even this is a stretch in my experience. I just tried Visual Studio on my 64GB ram machine, all top hardware and it still took a solid 20 seconds just to launch the program, forget about opening a project. Also it spat out some weird errors and crashed when I tried to just open a json file. vscode and vim had no such issues (LOL)
Yes. And now think about all the features you would miss. If we would develop with an old VS Version from 2005 or older. No thanks. And i dont think anybody really remembers how big the difference is now.
One day aliens will visit our planet completely depleted of energy and resources, use advanced technology to find out what went wrong it turned out to be chatbots and electron apps
Joplin, a note taking app uses 1.2Gb of ram on my machine. 1.2gb for taking notes. That's insane. And if you mention that to the devs they are like "F you"
"1.2gb for taking notes" Lol. If just need to take notes then use notepad? That app way more complex than "just taking notes". I guess thats basically a webview ( a webbrowser) with incredible amount of features and formatting and everything..
@@jungervin8765 nor really a very good argument since there are other apps with similar features that require much less ram, like Trilium, which uses 200mb.
@@balsalmalberto8086 - Yes it was. No one had the processing powers, etc. in the mid 90's to do all the texturing needed to make realistic graphics. Part of the reason after the N64, Nintendo used 32-bit cpus. Even with reaching the 4 Gb RAM threshold where 64-bit would make a difference, 32-bit was hardly to the limits of its capabilities, and many 32-bit programs would still be adequate today if not for the push of planned obsolescence that demands everything be 64-bit, at least until they're ready to force everything to be 128-bit, possibly in 10 years, maybe even less, despite scratching the surface of 64-bit.
The thing is, programming is fucked up right now. Like Jonathan Blow says, everything is more complicated for no sensible gain. You can't even put a pixel on the screen without going through N layers of cruft... Hardware has evolved a lot, but software did in fact degraded and we're not doing enough to stop it.
@@rhone733 It's not the developers fault to be honest, its all the ecosystem... This means, mostly companies, which are the ones that end up pushing technologies. But this includes Hardware manufacturers, Operating Systems, Programming Languages and everything that gets build up on top of this.
I definitely get a feeling that a lot of programming has no engineering mindset. Code is simply a tool to reach a goal, not an ends to a mean. Really the last place where performance truly exists as a culture is embedded systems, and even that is being invaded by the per kloc programmer who writes their AbstractBuilderFactory classes.
@@monkev1199 Thankfully there's *some* kind of awareness about these issues, and that's why I'm interested in the subject. I started my programming journey formally around 2014, with some bumps on the road and the pandemic but I never stopped learning and doing my own stuff (mostly through a overscoped game development project so I don't have much to show for it hah) I had been aware of the issue for some time and recently I got interested in actually going lower level, but I'm not sure where should I put my efforts into. This year I toyed a bit with Rust but recently I've been considering C, Zig, and C++. On the other side I've been learning about Cyber Security too.
@@monkev1199 Damn, youtube disappeared my comment. I said something along the lines of: Thankfully there's *some* awareness of the issue. I started my formal coding journey around 2014 and some stuff happened. I first saw J.B. talking about this a few years ago and I recently decided to try and go lower level. I did played and contributed to OSS with some Rust this year for about a month. And I'm currently looking where to put my efforts in, I'm considering C, Zig, C++. And also learning about CyberSecurity.
Yes, that's crazy, but do you know what's crazy? You can still take these old machines and write your texts on them and I think very few people do that. You don't have to buy a new machine to do something like that. I have computers here that are 40 years old and older and still work and are still an option when it comes to just writing text.
Most people do more things than just writing text. Did you watch the video? I'd like to see a computer with 128 kb of ram run a youtube video at 1080p.
@@MegaLokopo Yes, I watched the whole video and only referred to the text section. But yes, you can do it with a trick via Raspi hardware, that you build a module for a C64 that even conjures up such a UA-cam video on your CRT. And we all know that the resolution is not everything, especially with an analog device like an old CRT that I have here just like my C64s. So technically you can do it with old hardware and a module. And the whole thing is omly 64kb for display this, postprocessing with the a little help of a cheap Raspi.
You are cherry-picking contexts. Of course you can write text with an old machine, and you need heavy power for high-res video. But the point you are deliberately dodging was the problem with in-between. You cannot get modern daily tasks done with mere old word processor, but you shouldn't need high-res-video hardware for any of them. For example, reading news and weather forecasts, booking train tickets, doing online shopping, and basically *all* other basic daily tasks, none of them is functionally any more demanding than they were in 90s, yet due to unprofessional implementation, they *all* are now requiring GHz processor and GB of RAM, which they shouldn't need. Actually, text processing is a prime example - nowadays all text files you get from "out of your house" - despite being pure text - are in docx format, which needs a modern word processor, which tends to be unnecessarily bloated. Sure there are "niche ways" to do some of these, but they are too difficult for those most fond keeping old systems in use, and fundamentally no today's bloat should be tolerated on modern hardware either - new hardware is bought explicitly to run heavy payloads, not the useless overhead code which represents majority of the resource usage with today's bloatware on these functionally simple basic tasks (and often even on more advanced, resource-intensive uses), and this should be explicitly a concern for those who buy modern high-performance hardware, as it causes a lion's portion of the bought resources to be bought in vain. Imagine having a passenger car with body weighing a thousand metric tonnes. You wouldn't tolerate it no matter how cheap, powerful, and efficient semi-tractor engine you could have on it. And especially not if you want a high-performance one. Instead, the more advancement we have with car engines, and especially the more performant one you want, the lighter weight have also the car bodies been designed and wanted. The same should be the case with computers and software.
I just feel this jump to 16gb is not a jump that gives us something extra we are gonna need, unlike in the past. I think it's mostly gonna be used for sloppy programming and (so far) not very useful "AI" features. As example: I recently noticed my Dropbox Windows client used 750 mb ram idle. And my Adobe Lightroom / Camera Raw has exploded in ram and gpu ram usage (and requirements) after they added AI features (which I never touch). I'm a .NET developer myself and I noticed that after the Nuget package manager was introduced ram usage has exploded since ever framework now loads it's own dependencies (with sub and sub sub depedencies) almost always in different versions, often results in hundred mb of use just to run simple consoles app.
Exactly the issue. It's not the availability that's the issue. I'm HAPPY that I can easily build a machine with 64GB of RAM. That's great. What I'm not happy about is, as you said, something like dropbox using almost a gig of ram to do... nothing. Software is getting heavier and heavier, but adding nothing of value in the way of features. It's insanity.
16mb for win3.1? Lunduke you gotta be young. I used to run 3.1 just fine with 1mb ram on my 80286 machine while 384kb of that 1mb was reserved as graphics memory. 16mb was over the top even for win95/98.
yeah win 95/98 ran at like 4mb smoothly iirc (been awhile since i touched a win98 pc). Hell i think the first pc i got after i moved to japan, a Dell running XP was only like 1gb of ram
I specifically remember purchasing a new 386 PC a few months prior to the release of Windows 3.0. 1 MB of memory was sufficient to run the upcoming Windows 3.0, but I splurged and upgraded to 2 MB.
Let's not forget that bloat has a cost, and it's not just the money you pay for a new computer. Big tech companies are condemning millions of perfectly working computers to the scrap, and e-waste is not exactly easy to process. And the computers currently working around the world probably waste TWh of electricity just because of software bloat. But don't worry Apple, Google, Microsoft etc. are really serious about protecting the environment!
Web technologies are the worst offenders, but don't fool yourself, everything is in a sorry state... That's why Electron ends up getting used in the first place. Cross platform development comes with such a heavy baggage that software ends up being developed web first and WE SHOULD THINK WHY. Everything is so complex when it doesn't need to be. Software needs to change, it's not acceptable that in 2024 you can't even put a pixel on a screen on multiple platforms without going through unneeded layers of complexity. Just because of Apple, if you want to support Mac you need to use MoltenVK instead of Vulkan even if you're able to work with VK. Up until Vulkan 1.2 (2020) you couldn't use the same shader language for desktop platforms. This doesn't happen only with graphics sadly...
@@xDJKeroxFlutter is brilliant for modern dev using sane amounts of system resources. I hope the other OS vendors contribute to optimizing the experience like Canonical has done.
We had a 3dsmax plugin with 4kb ON DISK size written in C++, it got an electron "app" that is one of its screens, because it was "faster this way to do". They wanted to add a completely different electron app for a different screen because.... compatibility.... imagine that: just a plugin for an already huge program had full fledged two browsers for two of its gui windows... Thank God I could unify at least the two so there is only one... Still with that the plugin became like 200 megs up from 4kb and with nearly same functionality.
@@hashtag9990 No, electron in and on itseld is already a "fából vaskarika" bad idea in itself - and why the web is like it is now is exactly a series of similar decisions to be honest.
Within the last few months, Chrome has changed its algorithm for loading the last session's tabs at startup. It's much more efficient now, so RAM usage will go back down to where it was when people used bookmarks, and then start to grow from there. I saw someone on Reddit asking how they can recover 7000 tabs that didn't get restored. Unfortunately for them, they found a way.
And it sells computers to normies who would ask too many questions if their new system has the same oversized amount of RAM as their system from 5 years ago. But yes.
The problem is, that same methodology applies to the operating system itself! When you ultimately think about it, what can, for example, a modern word processor do that a word processor from 25 years ago can not do? But the ram requirements have gotten nuts.
i had 32 gigs of ram in 2014 its only just become the new optimal. Things have really slowed down. My gtx 1080 and 5820k from the same time period can still run every new game playable at 60 fps as well. Remember the days that a cutting edge pc would only last 2 years before it needed an upgrade or likely a whole new pc to keep up.
For the case of Apple, I have a sinking feeling they are doing this just to force their users to buy all new equipment. Every single iPhone except for the top end of the latest was instantly obsoleted. The base model of almost every Mac sold up to now was 8GB at most, so Apple not only has obsoleted all of their past Intel Macs, they now have taken out the base of all of their own silicon Macs sold. I think for iPhones the Apple minimum will be 8GB, but how long will that hold before they obsolete all of the 8GB models once they have higher ends ones with more memory, so that Apple can go for phones that eventually will sell for thousands of dollars.
I've had 16GB in my main computer since 2009. I've had 16GB in all my laptops since 2018. This is not a revolutionary amount of RAM in any realm except for Apple's reality extortion field, where RAM costs 8X what it costs on any other computer.
No matter how cheap the steel and semi-tractor engines are, I definitely wouldn't want my passenger car body to weigh a thousand metric tonnes. Any apologism for such is nothing but foolish. It's not revolutionary, it's disrevolutionary.
I remember going from 512MB to 1GB around 2004. That was crazy. So smooth. Most significant upgrade I’ve ever made, more significant than purchasing an SSD, and I’m pretty sure even today I could slim down my setup to 1GB if I really wanted/had to.
RAM requirements increasing, but the features aren't. People aren't working with larger files or anything. Word documents, spreadsheets, power points, they're all the same size as they have been. Why do RAM requirements keep increasing? This is a rhetorical question--we know the answer.
Induced demand, demand always grows to fit the capacity of a given system. This is as true for computers as it is for public transport, highway infrastructure, etc;
Maybe it's just an age thing, but I have a completely opposite opinion here. It seems wild to me, we are still seriously talking about 16 or even 32 GB of ram on a personal working machine. 64 should be a bare minimum, as it's not about "how much RAM do I need?" but about "how much RAM can I use?". The answer is - all of it. If I have more resources, I would find more ways to utilize them to make myself more productive. I don't need to have 3 different Linux distros, and a Windows running in a distrobox container at the same time, but If I can they are pretty useful to separate home/work/gaming systems. I don't need to have multiple workspaces with dozens of Firefox instances opened on them, but if I can, it would be useful. I don't need to have several personal ML models running on my machine at the same time, but if I have an option, I would find a way to use it to my advantage.
Yeah I can't complain that RAM is cheap enough to have a hard drive from 10 years ago worth of it hanging off your CPU. I bought 64GB of DDR4-ECC for less than $80 the other day.
I installed 64GB of RAM in my friend's laptop for him and it blew his mind. He didn't expect it to do anything, but never needing to hit the pagefile and Windows just caching everything really keeps the system responsive. RAM is so cheap now that it's incredible, I remember paying around $100 for 8GB of DDR3 back in the day. These days you can get 32GB of DDR5 for $100, sometimes less, and that's even with the massive amount of inflation we've had since the 2000s.
@@luizmonad777 Why shouldn't you use a desktop like a mainframe? RAM and CPU cores are both incredibly cheap now compared to 10-15 years ago. Running multiple virtual machines and having 200 tabs open is nothing when you can get 64GB of RAM for under $200 and Intel is selling 14 core CPUs for under $300.
I can't even imagine how much less efficient apps could even get, I think we've already reached peak inefficiency with Electron - but I'm sure that as available RAM continues to increase, the developers of the future will find ways to fill it (without making things any faster)
Ah, well then you should learn about microservices. You take your application, and split it into multiple web servers that talk to each other. That way a simple function call will take way longer. Then you add security and resiliency to that. You'll find you need a bunch of different computers to run it on because it takes up too many resources running on one. After running on a cluster, you'll find you need a larger cluster because the overhead of the communication is taking up too many resources. You'll need lots of retry logic and extra complexity to ensure the system doesn't go down now that you've multiplied the ways things can break. Now, your software sorta runs ok but is so complicated no one know if it really is or not. But that's ok because you use continuous integration to push out code before it is ready.
@@username7763 That's a great point. We definitely need to harness this slowness and inefficiency on the desktop. I wonder how long it will take the node.js folks to reinvent COM, except 1000x slower and hungrier for resources.
Things went downhill when javascript/web got pushed everywhere... In 2001, I used to browse the web, do text editing, and multitask on Windows 2000 machine with 128 MB RAM.
Back then, Javascript interpreters were way slower and less efficient than they are now. So it is telling that we've gone way past the performance improvements we made to be so much worse.
Years ago programmers had no issues doing their work in a simple text editor. Modern code-monkeys won't write a "Hello, world!" without an IDE which links some "very necessary" multimegabate libraries by default. Ah, and the best way to make cross-platform software nowadays is to use Electron, i.e. to include an entire browser in every piece of s.
We should be pushing towards better cross-platform development. It is so sad that the ecosystem is so f'd up that someone ended up "solving the issue" in this way. Web technology is horrendous to me, I've been praying for WASM to take over the web but it honestly talks about the failure of the ecosystem when it ended up being like this.
@@xDJKerox Web tech is amazing, it is just a skill issue. I am using a 6 years old, slow (max 2ghz core clock), 8gb ram notebook, never had problem with complex web apps or electron apps.
Yeah using Electron to be cross platform is like shipping a Windows VM with your software claiming to be cross platform. Actually, I know some linux software that did that with a Linux VM. Your software is not cross platform if the platform is runs on is as big as an operating system.
That's kind of the point. You need system resources for the convenience of devs who don't care about efficiency and for the benefit of companies that want to abuse those resources, not your actual necessity.
is that really so? good heavens. i know windows 10 uses 2gb, which is already horrendous. though i'm sure it also depends on how much ram is in your system
@@shallex5744 If you run Linux with KDE and uninstall PIM nonsense, it needs like 1.6GB idle. 2GB are barely enough to run even openbox once you launch either firefox or chrome based browser.
AI was the reason for my most recent RAM upgrade. I was doing fine with 32 GB. More than enough actually, with plenty of wiggle room. I got in to running AI language models locally though, so I upped my RAM to 64 GB so I can fit larger models in there.
better get board with 256Gb support of DDR4 (it's twice cheaper than new DDR5) my 128Gb is barely enough now, new merged mixed 100+ billions models using 120Gb+ Ram
In the early 90s my Amiga 500 I expanded it to 2megs RAM so I have 2.5 megs ram and got a 52 Meg HD. I think the key reason more ram is needed because we are using 64bit cpus with special floating point registers XMM 128bit and AVX 512 bits (64 bytes). Another reason is because the lazy development using high level languages that require a lot of ram; bloatware. You are right Lunduke I noticed my youtube tab was 3.6 Gigs ram needed heh... So 5 years from now; I think 64 GB ram and 10 years probably 128 GB to 1TB. BTW I think a new Babylon 5 is coming out soon; use to love that show.
I have a system I've been working with for a while, it runs Linux kernel 6.0 with complete GNURadio support, Numpy, Python3, hardware accelerated vector math libraries, and some custom drivers for an FPGA. The whole OS fits in a 32MB flash, and runs on a computer with 512MB RAM, the whole filesystem is decompressed from a 26MB image to become about 200MB and it runs in RAM, this leaves about 300MB for everything else. My desktop (kde crapma on arch) has 32GB and uses about 1GB at idle, my web browser (foxfire) uses around 7GB (granted I have a little under 150 tabs open right now). I only have 32GB of RAM so I can allocate 24GB to modded minecraft and have a playable experience, and yes, the modpacks I play actually require 16GB of RAM after optimizing, but I play some stupid complex modpacks.
This UA-cam tab alone taking 545 MB of RAM by the end of the video... And sometimes I have half a dozen to-be-watched YT tabs open in the background. For a comparison: Jellyfin open in browser streaming high quality FHD video from my NAS occupies just ~170 MB.
look into zram swap, by which you store compressed swap pages in ram, compressed at approximately a 4:1 efficiency ratio. it's like downloading more ram, but real. assuming a 4:1 compression ratio, if you had, say 8 GB of ram, you could dedicate 4 GB of it to zram-based swap, which would give you 4 GB of regular memory, and effectively 16 GB of swap, giving you an effective 20 gb of ram with only 8 GB of actual ram. but unlike a partition, this swap space is only filled as it's needed, so until you start swapping, you'll have access to 8 GB of ram like normal, only using up swap ram as it gets filled
It's cute of you to assume swap under windows works like the old ways and doesn't already demand a coefficient based on the installed memory just to duplicate everything to disk. Try setting up some windows vms with small emulated disks, and try to daily drive them without issues, ill wait. youre lucky if those devenv vm images work
there is a talk on CPPcon about computing a median of two numbers - the guy started with one line of code that computed (a+b)/2 and then considered all corner cases and ended up with a library function that has several hundreds of lines of code. When programming something complex you need to use libraries, or your program can crash because you forgot some corner cases - libraries are huge; but they give you a guarantee that your code will be stable and do what is expected
It would be interesting to see the increase related to technology. Here is my expectation: Artificial "Intelligence" is probably related to the latest jump. Necessary to get user private information "legally". The memory requirements for graphics has risen. From 800x600x8 bits to 3840x2160x32 bits + z-buffer for 3D graphics. Language technologies demanded more and more memory. From Assembler to C++ with its templates, Java-JIT, Javascript-engines, etc. All these technologies did not change much of the basic functionality of the computer.
@@brianmiller1077 I'd rather have 1080px at 30Hz than 480px at 144Hz, but I'd rather have 720px at 60Hz than either. No game ever needs more than 60 fps.
@@kpcraftster6580 The faster the frequency the more pleasant and calm it is to look at, that is why 100+ is much much better than 60, and why it is soooo comfortable to use an e-book reader. In the e-book readers case, there is not a frequency, the image is completely still! So no game needs it, but your eyes and mind need it more than the games.
@@viciouswaffle I realize that I am likely the exception to the rule, but 144 and 240 give me migraines and nausea, while I can look at 60 or even 30 for hours on end without any discomfort.
@@kpcraftster6580 That is definitely odd. I can't bear to look at anything below about 100FPS without noticing stutter. Not so bad when watching TV, but still noticeable. It's a lot worse when gaming due to the interactive element.
Something I've noticed is that older people are absolutely obsessed with memory usage lol. It's pretty clear we're going to plateau somewhere between 16GB to 32GB, assuming there's not another dark horse like AI. As a gamer, 16GB has been the real minimum spec for probably a decade at this point. If you want high quality high resolution assets on everything, memory and storage usage is just going to go up. CPUs are so fast now that they can actively compress and decompress memory, storage is insanely fast now so going out to the page file doesn't hurt that bad anymore, etc.
I would agree with you but why does everything need to be so bloated. Games are different in a sense they need to load visual assests and use a lot of memory. We can't just let these companies run with everything. They already collect a lot of our data. Things can be done differently and should be. Privacy and careful development of AI should be a priority.
Well, I completed Witcher 3 just fine back in 2015 on i5-3330, r9 270 with only *4Gb of RAM* (Win 7) and that was more than enough back in the day for any gaming machine. Nothing was freezing or lagging, smooth 45-50fps on high-med at 1080p. There's always gonna be 'that one game/app' that for no reason raises the bar for system requirements so I rarely ever bothered upgrading above the required minimum since you can't really even 'catch up' for too long. Installed 24Gb on my main machine, 20GB on a laptop - won't upgrade in at least 4-5 years.
@@Ben333bacc Notice how I added the qualifier "as a gamer". Yeah grandma checking her email doesn't even need a computer at this point, she needs an iPad lol
@@justinJ689 No company is going to release software that cant run on the majority of computers, unless it's so advanced that they quite literally just cant get it running. If hardware slows down maybe then they'll take the time to really dig into optimization to squeeze out that last ~20% or whatever, but until then it's not worth the dev time.
There's a lot going here... We've been trading memory usage for IO and/or CPU for the duration of the computer industry. I remember CRC-16 lookup tables on the C64: they were coo- to-haves at 300-600 baud, but they were necessary at 2400: the 6502 just couldn't keep up with the serial line if it had to do the full regime of bit-shifts with every byte. But then Z-modem came along with the PC, and it was using CRC-32! The C64 didn't have enough ram for that lookup table. Where are we now? Harddrives are getting bigger, but they're not really getting faster. Industry moved from RAID5 to RAID6 because of it. Its not like you got away from that for free because you're on a desktop. Your computer is caching fonts, images, libraries, configs and databases. Computers take a few minutes to boot compared to the seconds that a home-computer used to. I don't remember the source, but I recall a study from the mid-90s that concluded that more the half of all computer upgrades were faster CPUs but that should have been more RAM. Most of our software these days is still single-threaded. Computers do as much as they do because they can pre-compute and cache.
As a counterpoint, 10 years ago I wasn't streaming UA-cam at 1080p or 4k resolutions on a regular basis. Also, while there are definite inefficiencies in algorithm design, remember the trade off between time and space efficiencies. Sure we use a lot more space, but often it is done to maximize time. One thing I did learn recently is that unused ram is wasted ram. If you don't need it at the time, an OS predicting what sections of code you might use and loading it into ram will help speed up your system. If itpredicts wrong or you need the ram? The cost was probably less than what you saved from when it's predictions were right and you weren't using the ram.
The Google search function installed on my android phone uses 644 MB disk space. It displays an input box. Meanwhile, I wrote a document & training registration system for manufacturing plants. It was around 10 MB.
My Commodore PET 2001 had about 16 kilobytes of RAM initially. I later upgraded to a Commodore 64 with 64 kilobytes, which was massive. Even on the Commodore PET, I was able to program simple text to speech output using phonemes around 1981.
I feel similar looking back how much RAM my computers had in the past. But when I finally upgraded to 16gig a year ago, I was actually surprised that it came so late. And of course, this is just a personal experience, but this was, that I had 8 gigs since at least 10 years in my computers and it was totally fine and only upgraded basically cause I needed RAM and the 16gig were so cheap. So my personal RAM size graph does not look like it is still increasing exponentially as it was in the 90s and 2000s. I am still hopeful, that RAM will cap at around 64gigs. But hell yeah, the amount of memory a browser needs...but also how people use browsers. It seems people are under the impression, that they constantly have to have at least 10 to 15 tabs open and then maybe more at times. I think my average is maybe 3 tabs open at the same time, because I close tabs, that I do not use atm.
heck Xcode 16 Beta apple's development tools state you need 16 gas or more, so if they say 8gbs is enough then I think they are wrong, so I reckon the next Macs with M4s will have a min of 16gbs
4:27 4 MB RAM in 1989 was sorta high-end. The Mac II and SE/30 both typically shipped with 1 MB or 2 MB. Not sure exactly when 4 MB became typical, but maybe not until 1992 or 1993? And a Mac SE/30 ran fine with 1 MB.
Fascinating, I remember 48k spectum etc. And to 1MB on the Amiga, then maybe 1MB on the PC. Now I usee collossal machine in the cloud. Mad stuff really.
This really started with games. You started to want 16GB, then some games suggested that as a requirement. If RAM is available then it will be used. Now you can be less efficient.
When I came to Uni back in 1992 the CS department had a server with 128 Gb of ram. That was an unfathomable amount of ram for me back then. Today my refrigerator has more memory than that.
Roughly 20 years ago we discussed the same. It was just Megabytes instead of Gigabytes back then. This is just how technology works - but it will surely plateau at some point.
Totally agree with this. It is completely out of control. Just browsing Firefox for a couple of hours, opening a few tabs, results in it consuming 6+ GB. What the heck is even going on?
But that's the thing, did it make you stop? Did you decide to run a different browser or visit different sites? Unless it causes people to change behavior, performance will be prioritized below everything else.
@@username7763 Well, yes. It did make me stop. As for Windows becoming a totalitarian pain in the xxx OS, I created a Debian spin called Kumander Linux. I am only using that for over 2 years now. And as for Firefox being such a memory hog, I am in the process of finding a better alternative. Might give Opera a go, heard good things about it.
For me idle is 20 tabs and two browsers open at any time and that uses 37% @ 64GB ram. I approach 50% when I have Krita open with ComfyUI/Stable Diffusion going on. I sent out a mini pc to a friend that wanted 96GB, I just explained 96GB is really for some server boards running virtual machines. Usually the OS won't allow software to break out of 8/16GB for technical reasons. It's nice to know you can buy a mini pc now and not need to upgrade for 10 years.
I'd be interested to see how this graph correlates with recommended hard drive space and average disk and Internet bandwidth. I'd hazard that at least some (but certainly not all) of these RAM increases come from the amount of data we're expecting our computers to process every day, probably mostly from video streaming.
I regularly ran into oom-kill issues when transcoding hdr files in an lxc that was given 16gb to play with. Memory efficiency just isn't something devs bother to consider nowadays. Just load everything into RAM, uncompressed, and don't bother to unload anything until your whole program closes. That seems to be standard operating procedure.
Bryan, I think eventually we're gonna tune the operating system to load software that you commonly do every day so everything's gonna be in memory so we're gonna need tons of memory
Local run LLM require a lot of memory, however maybe they can be put to use to make application more memory efficient . Programs used to be written in low level programming languages which is part of the reason why they used to be less resources intensive.
I run a lot of AI models locally for fun, and I have to say even with 64GBs of RAM at points with the much bigger models like Grok for example even that has started to feel limiting. Good for me since Grok is bloated for it's performance, but not being able to run models well like WizardLM 2 has kinda hurt a little. I want 128GBs when I upgrade my CPU, but I feel like soon even that might be too little if I want to run the absolute bleeding edge models. At least with AI doesn't feel like bloat is at fault, these AI models are just really hard to run and actually need all of that RAM unless you find new architecture, some weird bitnet shenanigans, or just find out how to train them and filter data better etc. However, I would be concern if normal non AI related software starts taking up that much. But I somewhat hopeful it won't, but again, I could be acting foolish. I think that more memory will become standard for sure, but not for running basic programs unless those programs have AI junk in them now which I think most of that is a fad. AI is extremely useful, but it doesn't need to be in everything. For instance a terminal iTerm 2 added AI integration, I don't mind it but why? I could just have a program on the terminal to write commands for me with AI, I do not need intergration into the terminal itself it seems pointless when programs on the terminal exist
Finally, AI will be the excuse needed for younger gamers to get their hands on high end hardware. These kids will be playing high end games on their family computer
there's nothing wrong with getting more RAM to do tasks that inherently, at their most primitive function, require, or can take advantage of that much ram to do their bidding. same with CPU speed. think of something like encoding a video, that's a task that more or less, has already been broken down to its most efficient primitive computations, and, more or less, the only thing we can do to make that task faster or more efficient, is by upgrading your CPU speed/RAM count. so for those kinds of tasks, it is appropriate to upgrade your hardware if those tasks are important to you. there are certain things , certain types of computations and tasks that simply cannot be done without certain levels of CPU speed or memory, at least not in a timely fashion. an inappropriate case of needing to upgrade hardware, however, would be when software becomes so inefficient or bloated that it takes vastly more CPU cycles or memory to achieve basically the same type of computation or end result as you used to be able to achieve with much less resources on more efficient software
I predict that memory itself will radically change and that in the not too distant future memory will then have very little in common with what it is today. This will be driven and motivated by the need to keep costs obtainable and to be able to quickly scale up as technology advances.
That Babylon 5 thing... There was likely some prototyping that can be done initial modelling, but in production I've read that they got pile of Amigas, each having 32Mb of memory for rendering. That is also considered cheap solution instead of expensive SGI/wavefront workstations and the trick how it worked is that there was limitations in assets: Each 3D-model (mesh and texture) need to fit in 2Mb of memory. They also got 65k vertex limit on model and 1k texture limit but memory limitation for model was controlling their workflow. They were able to render scene with some number of models and trick to add complexity was multiple layers. Same tricks works today. You can render movie with Raspberry Pi if you want to with lower asset fidelity. And actually it would be very fast because of 3D hardware. Just keep assets fit in memory and don't use fancy raytracing/pathtracing. Rendertime would be bottlenecked by 4267Mb/s memory bandwidth. Todays movies CGI scene can be size of 256Tb. They have been pushing boundaries in fidelity since Babylon 5 times.
My 2009 laptop currently runs Debian 12 with LXDE, in under 400 MB RAM. With the 4GB it has I can browse the web, watch FullHD movies, play games, etc. Even when it runs Linux Mint with Cinnamon, taking 1.5GB RAM, there's enough left for the same things. The laptop runs great with the SSD I put in, and I'm pretty sure it will be usable for a while longer. Of course it's no longer a daily driver, which is why I don't need it for everything, but it can do a lot anyway. On my main PC I already have 64GB RAM, so I can do a lot more.😆
In 2002 I built a Windows PC with 512 MB of RAM and some time later I added 256 MB, so the total amount of RAM was 768 MB. On that PC I ran Windows 95, Windows 98, Windows ME and Windows XP
In many workplaces you need to keep open and switch between: a dozen browser tabs, a word processor, a spreadsheet editor, some chat/meeting client and maybe some specific company software. Many companies haven't upgraded from 8GB and it been slow, unstable and painful for like 4 years now. People are struggling and I blame the software companies and their awful practices. No one cares about optimizing anything anymore.
A modern CPU has roughly the same amount of L1 cache as my first Apple II computer had RAM, about the same amount of L2 cache as my first 286 had RAM, and as much L3 cache as my first hard-drive. The idea regularly boggles my mind that my first Wolfenstein machine could store its hard-drive in L3 and play the game entirely from L2 without ever touching physical RAM let alone the actual HDD/SSD.
I was using a crappy £200 laptop that had windows on it for like 4-5 years, it had 4gb ram and 32gb drive 😆(this is whilst i made the videos on my youtube channel). I had to rely on making virtual memory using disk so i could use applications without them crashing (i would add like 20gb page file on disk, etc). I also could not update windows due to the small 32gb drive.... and i relied basicaly on my 64gb sd card as my "home" folder... and using vps a lot as well. Now i am on a decent laptop with 16gm ram, 16 inch screen, 16:10, etc.... LG Gram 16 😃
Developpers: We've got 16 gigs of RAM to work with. Let's add some fancy features and eye candy to our app in order to improve the user experience. Hardware manufacturers: nowadays most modern programs require 16 gigs of RAM Let's double that amount on the platform we're developing now in order to stay ahead of the curve...
Honestly the average office worker already needs 32gb to manage the stuff they have open at once, and that's just a few apps, bunch of tabs and a video meeting.
Recenty, I was able to install the newest Debian 12 on my "vintage" PC, which only has 256 megs of RAM and a Pentium II CPU. Granted, it is only in text mode, and no interesting services are loaded, but the system after boot takes only around 26 MB of RAM.
I am a grown man and power user, yet I never used more than about 1.4GB RAM. The exception was when about 40 tabs in Firefox ate 7.8GB. I often only use about 700MB RAM, developing financial models and doing financial engineering. People waste their lives and too much resources on non-essentials.
RAM requirements have lagged behind what computers typically need to not be bottle-necked for yearrrsss. 16GB is the minimum for even the 2014 shop PCs at the company I work at, 32 for the new ones, and 64gb for the power users.
I had 8GB of ram in a PC I built back in 2007. I put 32GB of ram in my current PC I built in 2012. While yes, software has gotten crazy bloated, computers have not kept up with the dropping prices and increasing density of ram. I don't think it is unreasonable to have 32gb or more be a minimum. 32GB costs $50 these days. I remember buying 1MB SIMMs for that price each. It just doesn't make sense to save the $5 or $10 you'll save by cutting that in half for a machine that will last you a decade or more.
We need more space for ads, telemetry, and surveillance.
"Unused RAM is wasted RAM!"
😂😢
Electron
@@Demopans5990 Yea that's what he said "ads, telemetry, and surveillance" also known as electron. lol
ever wonder where all that extra super speed and bandwidth went when 5G rolled around to no difference o.0
React + Electron in every "desktop" application yummmm 😋
🤮
That's why every idiot saying "Unused RAM is wasted RAM", and "What's the harm in another game launcher ransoming us" are just useful idiots.
There are some computers still using yum. 😏
@@JohnCrawford1979 still waiting for the libdnf rewrite huh?
@@bruwyvn - No one's gonna yank my yogurt. 😏
TLDR: developers creating infinite abstractions over abstractions, using thousands of dependencies, using virtualization + containers, refusing to learn more efficient stacks for the work they're doing, never optimizing, never learning better algorithms, forcing features you'll never use; all to make their life easier and their work more productive at your cost, and if you don't like it they tell you "it's your fault for not having the newest hardware anyway". We like to pretend it doesn't matter, but it does. And yes, management requirements do have some fault in that too.
For some recent games, even having the latest hardware is not enough to compensate for the developer's lack of optimisation
@@mathmagician8191 Kof Kof *elden ring* kof kof
so our new generation developers are incompetent and inefficient?
@@serversC13nc3 No, the problem is cost. It cost too much to change it. Cheaper to have the end user upgrade.
Well, yes Java developers are.
I'm old enough to remember when Visual Studio not only didn't take a full minute (or more) to load a medium size project, but when it did, it even worked properly.
Nvme + 32gb ram = VS IS FAST
@@alexeysamokhin9629 even this is a stretch in my experience. I just tried Visual Studio on my 64GB ram machine, all top hardware and it still took a solid 20 seconds just to launch the program, forget about opening a project. Also it spat out some weird errors and crashed when I tried to just open a json file. vscode and vim had no such issues (LOL)
Yes. And now think about all the features you would miss. If we would develop with an old VS Version from 2005 or older. No thanks. And i dont think anybody really remembers how big the difference is now.
@@moritzbecker131 such as...
@@moritzbecker131 Name 5 things you use in 2022 that weren't present in 2005.
128 GB Clippy, thanks for the nightmares
Five years later, 2TB Clippy, no difference just hungrier.
One day aliens will visit our planet completely depleted of energy and resources, use advanced technology to find out what went wrong it turned out to be chatbots and electron apps
@@s1nistr433 That's one way AI will use Humans against each other; transitioned from Humans using AI to use Humans against each other.
Joplin, a note taking app uses 1.2Gb of ram on my machine. 1.2gb for taking notes. That's insane. And if you mention that to the devs they are like "F you"
"1.2gb for taking notes" Lol. If just need to take notes then use notepad? That app way more complex than "just taking notes". I guess thats basically a webview ( a webbrowser) with incredible amount of features and formatting and everything..
@@jungervin8765 nor really a very good argument since there are other apps with similar features that require much less ram, like Trilium, which uses 200mb.
Thing is mining some coins probably
The gigabyte is the new megabyte
And getting closer to being the new kilobyte. 😏
8 bit to 64 but addressing. Expotential address space = exponentially more addressable ram
@@Pressbutan - and they're already talking about 128 bit when we haven't even come close to reaching the limits of 64 bit.
@@JohnCrawford1979 So you're saying N64 was so far ahead into the future we just couldn't see it?
@@balsalmalberto8086 - Yes it was. No one had the processing powers, etc. in the mid 90's to do all the texturing needed to make realistic graphics. Part of the reason after the N64, Nintendo used 32-bit cpus. Even with reaching the 4 Gb RAM threshold where 64-bit would make a difference, 32-bit was hardly to the limits of its capabilities, and many 32-bit programs would still be adequate today if not for the push of planned obsolescence that demands everything be 64-bit, at least until they're ready to force everything to be 128-bit, possibly in 10 years, maybe even less, despite scratching the surface of 64-bit.
This graph desperately needs a logarithmic Y scale
Yeah, the jump from 128K ram to 4 MB looks tiny, but it’s actually a 3100% increase I think? No different than going from 128mb to 4 gigs.
Yes this.
The increase doesnt ramp up. It slows down.
I blame javascript devs
fax
I miss flash
Yeah they really need to suck_less
very valid
I blame them every time I open youtube. It is the epitome of JS hell. I can wait 2 secs to fetch some data.
Your daily reminder that building Hello World in Electron produces gigabytes of binary artifacts.
The thing is, programming is fucked up right now. Like Jonathan Blow says, everything is more complicated for no sensible gain.
You can't even put a pixel on the screen without going through N layers of cruft...
Hardware has evolved a lot, but software did in fact degraded and we're not doing enough to stop it.
Yep. Crappy developers don't know how to write clean, efficient code.
@@rhone733 It's not the developers fault to be honest, its all the ecosystem... This means, mostly companies, which are the ones that end up pushing technologies.
But this includes Hardware manufacturers, Operating Systems, Programming Languages and everything that gets build up on top of this.
I definitely get a feeling that a lot of programming has no engineering mindset. Code is simply a tool to reach a goal, not an ends to a mean.
Really the last place where performance truly exists as a culture is embedded systems, and even that is being invaded by the per kloc programmer who writes their AbstractBuilderFactory classes.
@@monkev1199 Thankfully there's *some* kind of awareness about these issues, and that's why I'm interested in the subject.
I started my programming journey formally around 2014, with some bumps on the road and the pandemic but I never stopped learning and doing my own stuff (mostly through a overscoped game development project so I don't have much to show for it hah)
I had been aware of the issue for some time and recently I got interested in actually going lower level, but I'm not sure where should I put my efforts into.
This year I toyed a bit with Rust but recently I've been considering C, Zig, and C++. On the other side I've been learning about Cyber Security too.
@@monkev1199 Damn, youtube disappeared my comment.
I said something along the lines of: Thankfully there's *some* awareness of the issue. I started my formal coding journey around 2014 and some stuff happened.
I first saw J.B. talking about this a few years ago and I recently decided to try and go lower level. I did played and contributed to OSS with some Rust this year for about a month.
And I'm currently looking where to put my efforts in, I'm considering C, Zig, C++. And also learning about CyberSecurity.
Yes, that's crazy, but do you know what's crazy? You can still take these old machines and write your texts on them and I think very few people do that. You don't have to buy a new machine to do something like that. I have computers here that are 40 years old and older and still work and are still an option when it comes to just writing text.
Most people do more things than just writing text. Did you watch the video? I'd like to see a computer with 128 kb of ram run a youtube video at 1080p.
@@MegaLokopo Yes, I watched the whole video and only referred to the text section. But yes, you can do it with a trick via Raspi hardware, that you build a module for a C64 that even conjures up such a UA-cam video on your CRT. And we all know that the resolution is not everything, especially with an analog device like an old CRT that I have here just like my C64s. So technically you can do it with old hardware and a module. And the whole thing is omly 64kb for display this, postprocessing with the a little help of a cheap Raspi.
George R. R. Martin still uses WordStar 4.0 for MS-DOS.
You are cherry-picking contexts. Of course you can write text with an old machine, and you need heavy power for high-res video.
But the point you are deliberately dodging was the problem with in-between. You cannot get modern daily tasks done with mere old word processor, but you shouldn't need high-res-video hardware for any of them.
For example, reading news and weather forecasts, booking train tickets, doing online shopping, and basically *all* other basic daily tasks, none of them is functionally any more demanding than they were in 90s, yet due to unprofessional implementation, they *all* are now requiring GHz processor and GB of RAM, which they shouldn't need.
Actually, text processing is a prime example - nowadays all text files you get from "out of your house" - despite being pure text - are in docx format, which needs a modern word processor, which tends to be unnecessarily bloated.
Sure there are "niche ways" to do some of these, but they are too difficult for those most fond keeping old systems in use, and fundamentally no today's bloat should be tolerated on modern hardware either - new hardware is bought explicitly to run heavy payloads, not the useless overhead code which represents majority of the resource usage with today's bloatware on these functionally simple basic tasks (and often even on more advanced, resource-intensive uses), and this should be explicitly a concern for those who buy modern high-performance hardware, as it causes a lion's portion of the bought resources to be bought in vain.
Imagine having a passenger car with body weighing a thousand metric tonnes. You wouldn't tolerate it no matter how cheap, powerful, and efficient semi-tractor engine you could have on it. And especially not if you want a high-performance one.
Instead, the more advancement we have with car engines, and especially the more performant one you want, the lighter weight have also the car bodies been designed and wanted. The same should be the case with computers and software.
I just feel this jump to 16gb is not a jump that gives us something extra we are gonna need, unlike in the past. I think it's mostly gonna be used for sloppy programming and (so far) not very useful "AI" features. As example: I recently noticed my Dropbox Windows client used 750 mb ram idle. And my Adobe Lightroom / Camera Raw has exploded in ram and gpu ram usage (and requirements) after they added AI features (which I never touch). I'm a .NET developer myself and I noticed that after the Nuget package manager was introduced ram usage has exploded since ever framework now loads it's own dependencies (with sub and sub sub depedencies) almost always in different versions, often results in hundred mb of use just to run simple consoles app.
Exactly the issue. It's not the availability that's the issue. I'm HAPPY that I can easily build a machine with 64GB of RAM. That's great. What I'm not happy about is, as you said, something like dropbox using almost a gig of ram to do... nothing. Software is getting heavier and heavier, but adding nothing of value in the way of features. It's insanity.
16mb for win3.1? Lunduke you gotta be young. I used to run 3.1 just fine with 1mb ram on my 80286 machine while 384kb of that 1mb was reserved as graphics memory. 16mb was over the top even for win95/98.
yeah win 95/98 ran at like 4mb smoothly iirc (been awhile since i touched a win98 pc). Hell i think the first pc i got after i moved to japan, a Dell running XP was only like 1gb of ram
9x needed 8mb to really be usable. 4mb was the listed minimum but even MS internally admitted that was borderline.
Nah I ran Win 3.1 on a 386SX with 8MB of RAM and it made a huge difference going to 16MB. Similar to going from 8GB to 16GB today.
I specifically remember purchasing a new 386 PC a few months prior to the release of Windows 3.0. 1 MB of memory was sufficient to run the upcoming Windows 3.0, but I splurged and upgraded to 2 MB.
@@pilotamurorei no way... for 9x you needed 8mb
Let's not forget that bloat has a cost, and it's not just the money you pay for a new computer.
Big tech companies are condemning millions of perfectly working computers to the scrap, and e-waste is not exactly easy to process.
And the computers currently working around the world probably waste TWh of electricity just because of software bloat.
But don't worry Apple, Google, Microsoft etc. are really serious about protecting the environment!
I blame Electron
True. Witnessed Teams eat 9GB of ram once.
Web technologies are the worst offenders, but don't fool yourself, everything is in a sorry state... That's why Electron ends up getting used in the first place.
Cross platform development comes with such a heavy baggage that software ends up being developed web first and WE SHOULD THINK WHY.
Everything is so complex when it doesn't need to be.
Software needs to change, it's not acceptable that in 2024 you can't even put a pixel on a screen on multiple platforms without going through unneeded layers of complexity. Just because of Apple, if you want to support Mac you need to use MoltenVK instead of Vulkan even if you're able to work with VK.
Up until Vulkan 1.2 (2020) you couldn't use the same shader language for desktop platforms.
This doesn't happen only with graphics sadly...
@@xDJKeroxFlutter is brilliant for modern dev using sane amounts of system resources.
I hope the other OS vendors contribute to optimizing the experience like Canonical has done.
We had a 3dsmax plugin with 4kb ON DISK size written in C++, it got an electron "app" that is one of its screens, because it was "faster this way to do". They wanted to add a completely different electron app for a different screen because.... compatibility.... imagine that: just a plugin for an already huge program had full fledged two browsers for two of its gui windows...
Thank God I could unify at least the two so there is only one... Still with that the plugin became like 200 megs up from 4kb and with nearly same functionality.
@@hashtag9990 No, electron in and on itseld is already a "fából vaskarika" bad idea in itself - and why the web is like it is now is exactly a series of similar decisions to be honest.
Within the last few months, Chrome has changed its algorithm for loading the last session's tabs at startup. It's much more efficient now, so RAM usage will go back down to where it was when people used bookmarks, and then start to grow from there.
I saw someone on Reddit asking how they can recover 7000 tabs that didn't get restored. Unfortunately for them, they found a way.
Putting more ram into computers only enables developers to be less efficient
bingo
I bet you watched it in 144p after first downloading it to a hard drive without anything else running in the background
And it sells computers to normies who would ask too many questions if their new system has the same oversized amount of RAM as their system from 5 years ago. But yes.
The problem is, that same methodology applies to the operating system itself! When you ultimately think about it, what can, for example, a modern word processor do that a word processor from 25 years ago can not do? But the ram requirements have gotten nuts.
@@slaapliedje In the case of modern MS Word, quite a lot. But MS Word is overkill for 95% of people.
@10:31 LOL! I love that _"...minimum viable operating system where you can actually do word processing"_ line.
Good job, Byran Lunduke.
i had 32 gigs of ram in 2014 its only just become the new optimal. Things have really slowed down. My gtx 1080 and 5820k from the same time period can still run every new game playable at 60 fps as well. Remember the days that a cutting edge pc would only last 2 years before it needed an upgrade or likely a whole new pc to keep up.
For the case of Apple, I have a sinking feeling they are doing this just to force their users to buy all new equipment. Every single iPhone except for the top end of the latest was instantly obsoleted. The base model of almost every Mac sold up to now was 8GB at most, so Apple not only has obsoleted all of their past Intel Macs, they now have taken out the base of all of their own silicon Macs sold. I think for iPhones the Apple minimum will be 8GB, but how long will that hold before they obsolete all of the 8GB models once they have higher ends ones with more memory, so that Apple can go for phones that eventually will sell for thousands of dollars.
I've had 16GB in my main computer since 2009. I've had 16GB in all my laptops since 2018. This is not a revolutionary amount of RAM in any realm except for Apple's reality extortion field, where RAM costs 8X what it costs on any other computer.
No matter how cheap the steel and semi-tractor engines are, I definitely wouldn't want my passenger car body to weigh a thousand metric tonnes. Any apologism for such is nothing but foolish.
It's not revolutionary, it's disrevolutionary.
I remember going from 512MB to 1GB around 2004. That was crazy. So smooth. Most significant upgrade I’ve ever made, more significant than purchasing an SSD, and I’m pretty sure even today I could slim down my setup to 1GB if I really wanted/had to.
This data should be displayed logarithmically.
Log scales are for quitters who can't find enough paper to make their point PROPERLY.
RAM requirements increasing, but the features aren't. People aren't working with larger files or anything. Word documents, spreadsheets, power points, they're all the same size as they have been. Why do RAM requirements keep increasing? This is a rhetorical question--we know the answer.
Induced demand, demand always grows to fit the capacity of a given system. This is as true for computers as it is for public transport, highway infrastructure, etc;
Maybe it's just an age thing, but I have a completely opposite opinion here. It seems wild to me, we are still seriously talking about 16 or even 32 GB of ram on a personal working machine. 64 should be a bare minimum, as it's not about "how much RAM do I need?" but about "how much RAM can I use?". The answer is - all of it. If I have more resources, I would find more ways to utilize them to make myself more productive. I don't need to have 3 different Linux distros, and a Windows running in a distrobox container at the same time, but If I can they are pretty useful to separate home/work/gaming systems. I don't need to have multiple workspaces with dozens of Firefox instances opened on them, but if I can, it would be useful. I don't need to have several personal ML models running on my machine at the same time, but if I have an option, I would find a way to use it to my advantage.
Yes. This. All of it.
Yeah I can't complain that RAM is cheap enough to have a hard drive from 10 years ago worth of it hanging off your CPU. I bought 64GB of DDR4-ECC for less than $80 the other day.
I installed 64GB of RAM in my friend's laptop for him and it blew his mind. He didn't expect it to do anything, but never needing to hit the pagefile and Windows just caching everything really keeps the system responsive. RAM is so cheap now that it's incredible, I remember paying around $100 for 8GB of DDR3 back in the day. These days you can get 32GB of DDR5 for $100, sometimes less, and that's even with the massive amount of inflation we've had since the 2000s.
if you have to use a desktop like a mainframe that only says how bad are operating systems failing at their task
@@luizmonad777 Why shouldn't you use a desktop like a mainframe? RAM and CPU cores are both incredibly cheap now compared to 10-15 years ago.
Running multiple virtual machines and having 200 tabs open is nothing when you can get 64GB of RAM for under $200 and Intel is selling 14 core CPUs for under $300.
I can't even imagine how much less efficient apps could even get, I think we've already reached peak inefficiency with Electron - but I'm sure that as available RAM continues to increase, the developers of the future will find ways to fill it (without making things any faster)
Ah, well then you should learn about microservices. You take your application, and split it into multiple web servers that talk to each other. That way a simple function call will take way longer. Then you add security and resiliency to that. You'll find you need a bunch of different computers to run it on because it takes up too many resources running on one. After running on a cluster, you'll find you need a larger cluster because the overhead of the communication is taking up too many resources. You'll need lots of retry logic and extra complexity to ensure the system doesn't go down now that you've multiplied the ways things can break. Now, your software sorta runs ok but is so complicated no one know if it really is or not. But that's ok because you use continuous integration to push out code before it is ready.
@@username7763 That's a great point. We definitely need to harness this slowness and inefficiency on the desktop.
I wonder how long it will take the node.js folks to reinvent COM, except 1000x slower and hungrier for resources.
Things went downhill when javascript/web got pushed everywhere...
In 2001, I used to browse the web, do text editing, and multitask on Windows 2000 machine with 128 MB RAM.
Back then, Javascript interpreters were way slower and less efficient than they are now. So it is telling that we've gone way past the performance improvements we made to be so much worse.
Years ago programmers had no issues doing their work in a simple text editor. Modern code-monkeys won't write a "Hello, world!" without an IDE which links some "very necessary" multimegabate libraries by default. Ah, and the best way to make cross-platform software nowadays is to use Electron, i.e. to include an entire browser in every piece of s.
We should be pushing towards better cross-platform development. It is so sad that the ecosystem is so f'd up that someone ended up "solving the issue" in this way.
Web technology is horrendous to me, I've been praying for WASM to take over the web but it honestly talks about the failure of the ecosystem when it ended up being like this.
@@xDJKerox Web tech is amazing, it is just a skill issue. I am using a 6 years old, slow (max 2ghz core clock), 8gb ram notebook, never had problem with complex web apps or electron apps.
@@jungervin8765 Please don't shill HTML and JavaScript for the love of god...
Yeah using Electron to be cross platform is like shipping a Windows VM with your software claiming to be cross platform. Actually, I know some linux software that did that with a Linux VM. Your software is not cross platform if the platform is runs on is as big as an operating system.
@@username7763 Stop it. You all have no clue about tech. Please stop.
All of the ram. Just... all of it.
That's kind of the point. You need system resources for the convenience of devs who don't care about efficiency and for the benefit of companies that want to abuse those resources, not your actual necessity.
Win 11 uses almost 8gb of ram just sitting idle.
"""idle"""
is that really so? good heavens. i know windows 10 uses 2gb, which is already horrendous. though i'm sure it also depends on how much ram is in your system
@@shallex5744 If you run Linux with KDE and uninstall PIM nonsense, it needs like 1.6GB idle. 2GB are barely enough to run even openbox once you launch either firefox or chrome based browser.
@@pluffcrock3438 it’s busy sending telemetry data needs a lot of ram for that
@@shallex5744 2? my windows 10 sits comfortably at 4.5GB when it's doing nothing...
AI was the reason for my most recent RAM upgrade. I was doing fine with 32 GB. More than enough actually, with plenty of wiggle room. I got in to running AI language models locally though, so I upped my RAM to 64 GB so I can fit larger models in there.
better get board with 256Gb support of DDR4 (it's twice cheaper than new DDR5) my 128Gb is barely enough now, new merged mixed 100+ billions models using 120Gb+ Ram
Im totally here for the Babylon 5 references.
In the early 90s my Amiga 500 I expanded it to 2megs RAM so I have 2.5 megs ram and got a 52 Meg HD. I think the key reason more ram is needed because we are using 64bit cpus with special floating point registers XMM 128bit and AVX 512 bits (64 bytes). Another reason is because the lazy development using high level languages that require a lot of ram; bloatware. You are right Lunduke I noticed my youtube tab was 3.6 Gigs ram needed heh... So 5 years from now; I think 64 GB ram and 10 years probably 128 GB to 1TB. BTW I think a new Babylon 5 is coming out soon; use to love that show.
Windows is pretty bloated now to and their gonna bloat it some more with AI that spies on you.
I have a system I've been working with for a while, it runs Linux kernel 6.0 with complete GNURadio support, Numpy, Python3, hardware accelerated vector math libraries, and some custom drivers for an FPGA. The whole OS fits in a 32MB flash, and runs on a computer with 512MB RAM, the whole filesystem is decompressed from a 26MB image to become about 200MB and it runs in RAM, this leaves about 300MB for everything else.
My desktop (kde crapma on arch) has 32GB and uses about 1GB at idle, my web browser (foxfire) uses around 7GB (granted I have a little under 150 tabs open right now). I only have 32GB of RAM so I can allocate 24GB to modded minecraft and have a playable experience, and yes, the modpacks I play actually require 16GB of RAM after optimizing, but I play some stupid complex modpacks.
This UA-cam tab alone taking 545 MB of RAM by the end of the video... And sometimes I have half a dozen to-be-watched YT tabs open in the background. For a comparison: Jellyfin open in browser streaming high quality FHD video from my NAS occupies just ~170 MB.
And people will still be augmenting that with a 64GB swap partition 💪
because i have 64 gigs of ram i have accidentally allowed distros im installing to automatically make 64 gig swaps.
Noswap masterrace right here.
It would only make sense if you hibernate
look into zram swap, by which you store compressed swap pages in ram, compressed at approximately a 4:1 efficiency ratio. it's like downloading more ram, but real. assuming a 4:1 compression ratio, if you had, say 8 GB of ram, you could dedicate 4 GB of it to zram-based swap, which would give you 4 GB of regular memory, and effectively 16 GB of swap, giving you an effective 20 gb of ram with only 8 GB of actual ram. but unlike a partition, this swap space is only filled as it's needed, so until you start swapping, you'll have access to 8 GB of ram like normal, only using up swap ram as it gets filled
It's cute of you to assume swap under windows works like the old ways and doesn't already demand a coefficient based on the installed memory just to duplicate everything to disk. Try setting up some windows vms with small emulated disks, and try to daily drive them without issues, ill wait. youre lucky if those devenv vm images work
there is a talk on CPPcon about computing a median of two numbers - the guy started with one line of code that computed (a+b)/2 and then considered all corner cases and ended up with a library function that has several hundreds of lines of code.
When programming something complex you need to use libraries, or your program can crash because you forgot some corner cases - libraries are huge; but they give you a guarantee that your code will be stable and do what is expected
Just buy OEM shares and trust that developers' incompetence will connive with planned obsolescence and AI bloat to give you good returns.
Titanic was created on an Amiga 4000 too, and nobody’s saying their SFX are cheesy. Incidentally, I loved Babylon 5.
I think Amiga was used for camera control and quick preview of recorded shots. There is a youtube video about restored Amiga from production set.
It would be interesting to see the increase related to technology.
Here is my expectation:
Artificial "Intelligence" is probably related to the latest jump. Necessary to get user private information "legally".
The memory requirements for graphics has risen. From 800x600x8 bits to 3840x2160x32 bits + z-buffer for 3D graphics.
Language technologies demanded more and more memory. From Assembler to C++ with its templates, Java-JIT, Javascript-engines, etc.
All these technologies did not change much of the basic functionality of the computer.
we need all of it😁
12:45 This video had major "Old man yells at clouds" vibes... but now you have my attention sir.
Lunduke's Law : Minimum RAM requirement for desktop OSes doubles every 5 years
I wouldn't be surprised if in 20 years min requirements could be 64gbs or 128gbs
Gen z "gamers" these days talking about 64gb as a bare minimum, when every game worth playing runs smooth on a 10+ y/o win7 laptop with 6gb. Smh.
Well, they gotta have 240 fps to be competitive.
@@brianmiller1077 I'd rather have 1080px at 30Hz than 480px at 144Hz, but I'd rather have 720px at 60Hz than either. No game ever needs more than 60 fps.
@@kpcraftster6580 The faster the frequency the more pleasant and calm it is to look at, that is why 100+ is much much better than 60, and why it is soooo comfortable to use an e-book reader. In the e-book readers case, there is not a frequency, the image is completely still!
So no game needs it, but your eyes and mind need it more than the games.
@@viciouswaffle I realize that I am likely the exception to the rule, but 144 and 240 give me migraines and nausea, while I can look at 60 or even 30 for hours on end without any discomfort.
@@kpcraftster6580 That is definitely odd. I can't bear to look at anything below about 100FPS without noticing stutter. Not so bad when watching TV, but still noticeable. It's a lot worse when gaming due to the interactive element.
Something I've noticed is that older people are absolutely obsessed with memory usage lol. It's pretty clear we're going to plateau somewhere between 16GB to 32GB, assuming there's not another dark horse like AI. As a gamer, 16GB has been the real minimum spec for probably a decade at this point. If you want high quality high resolution assets on everything, memory and storage usage is just going to go up.
CPUs are so fast now that they can actively compress and decompress memory, storage is insanely fast now so going out to the page file doesn't hurt that bad anymore, etc.
I would agree with you but why does everything need to be so bloated. Games are different in a sense they need to load visual assests and use a lot of memory. We can't just let these companies run with everything. They already collect a lot of our data. Things can be done differently and should be. Privacy and careful development of AI should be a priority.
Well, I completed Witcher 3 just fine back in 2015 on i5-3330, r9 270 with only *4Gb of RAM* (Win 7) and that was more than enough back in the day for any gaming machine. Nothing was freezing or lagging, smooth 45-50fps on high-med at 1080p. There's always gonna be 'that one game/app' that for no reason raises the bar for system requirements so I rarely ever bothered upgrading above the required minimum since you can't really even 'catch up' for too long.
Installed 24Gb on my main machine, 20GB on a laptop - won't upgrade in at least 4-5 years.
16 gig has NOT been the minimum spec for ten years dude. That is nuts. 16 gb is still plenty if not overkill for most people today.
@@Ben333bacc Notice how I added the qualifier "as a gamer". Yeah grandma checking her email doesn't even need a computer at this point, she needs an iPad lol
@@justinJ689 No company is going to release software that cant run on the majority of computers, unless it's so advanced that they quite literally just cant get it running. If hardware slows down maybe then they'll take the time to really dig into optimization to squeeze out that last ~20% or whatever, but until then it's not worth the dev time.
There's a lot going here... We've been trading memory usage for IO and/or CPU for the duration of the computer industry. I remember CRC-16 lookup tables on the C64: they were coo- to-haves at 300-600 baud, but they were necessary at 2400: the 6502 just couldn't keep up with the serial line if it had to do the full regime of bit-shifts with every byte. But then Z-modem came along with the PC, and it was using CRC-32! The C64 didn't have enough ram for that lookup table.
Where are we now? Harddrives are getting bigger, but they're not really getting faster. Industry moved from RAID5 to RAID6 because of it. Its not like you got away from that for free because you're on a desktop. Your computer is caching fonts, images, libraries, configs and databases. Computers take a few minutes to boot compared to the seconds that a home-computer used to.
I don't remember the source, but I recall a study from the mid-90s that concluded that more the half of all computer upgrades were faster CPUs but that should have been more RAM.
Most of our software these days is still single-threaded. Computers do as much as they do because they can pre-compute and cache.
If you don't need 16 gigs, how can they charge more for their machines?
That, and lazy programmers
Thanks for finally some senseful commentary on the prevailing horrendous state of software industry!
We'll probably be up to 1Tb RAM minimum within 5 to 10 years.
That RAM sounds like the GOAT, yo.
Justine have that much on her Mac i think
1.5 Tb and it was 4 years ago
Nah, memory isn't really scaling on process nodes like it used too.
@@crazybeatrice4555 cool
As a counterpoint, 10 years ago I wasn't streaming UA-cam at 1080p or 4k resolutions on a regular basis.
Also, while there are definite inefficiencies in algorithm design, remember the trade off between time and space efficiencies. Sure we use a lot more space, but often it is done to maximize time.
One thing I did learn recently is that unused ram is wasted ram. If you don't need it at the time, an OS predicting what sections of code you might use and loading it into ram will help speed up your system. If itpredicts wrong or you need the ram? The cost was probably less than what you saved from when it's predictions were right and you weren't using the ram.
The Google search function installed on my android phone uses 644 MB disk space.
It displays an input box.
Meanwhile, I wrote a document & training registration system for manufacturing plants. It was around 10 MB.
There are Silicon Graphics machines that used 16MB of memory and could do anti aliased 3d graphics.
My Commodore PET 2001 had about 16 kilobytes of RAM initially. I later upgraded to a Commodore 64 with 64 kilobytes, which was massive. Even on the Commodore PET, I was able to program simple text to speech output using phonemes around 1981.
it's a combination. developers keep using more resources and consumers keep buying new computers. it's a system and we're all part of it
I feel similar looking back how much RAM my computers had in the past. But when I finally upgraded to 16gig a year ago, I was actually surprised that it came so late. And of course, this is just a personal experience, but this was, that I had 8 gigs since at least 10 years in my computers and it was totally fine and only upgraded basically cause I needed RAM and the 16gig were so cheap. So my personal RAM size graph does not look like it is still increasing exponentially as it was in the 90s and 2000s. I am still hopeful, that RAM will cap at around 64gigs.
But hell yeah, the amount of memory a browser needs...but also how people use browsers. It seems people are under the impression, that they constantly have to have at least 10 to 15 tabs open and then maybe more at times. I think my average is maybe 3 tabs open at the same time, because I close tabs, that I do not use atm.
Could have put logarithmic scale on the graph. it'd show multiplications easier.
Log scales are for quitters who can't find enough paper to make their point PROPERLY.
@@rodrigogirao8344 :)
heck Xcode 16 Beta apple's development tools state you need 16 gas or more, so if they say 8gbs is enough then I think they are wrong, so I reckon the next Macs with M4s will have a min of 16gbs
6:00 You should've used a log scale. Also that's not a bar graph, that's a line graph.
well if you wanna run llama 70b you need like 80gb of vram, which I def want. Hurry up nvidia
4:27 4 MB RAM in 1989 was sorta high-end. The Mac II and SE/30 both typically shipped with 1 MB or 2 MB. Not sure exactly when 4 MB became typical, but maybe not until 1992 or 1993? And a Mac SE/30 ran fine with 1 MB.
I swear I read the upload time as "7 years ago" after reading the thumbnail and I got way too confused
Fascinating, I remember 48k spectum etc. And to 1MB on the Amiga, then maybe 1MB on the PC. Now I usee collossal machine in the cloud. Mad stuff really.
This really started with games. You started to want 16GB, then some games suggested that as a requirement. If RAM is available then it will be used. Now you can be less efficient.
When I came to Uni back in 1992 the CS department had a server with 128 Gb of ram. That was an unfathomable amount of ram for me back then. Today my refrigerator has more memory than that.
You clearly meant MB, not GB.
Roughly 20 years ago we discussed the same. It was just Megabytes instead of Gigabytes back then. This is just how technology works - but it will surely plateau at some point.
Totally agree with this. It is completely out of control. Just browsing Firefox for a couple of hours, opening a few tabs, results in it consuming 6+ GB. What the heck is even going on?
But that's the thing, did it make you stop? Did you decide to run a different browser or visit different sites? Unless it causes people to change behavior, performance will be prioritized below everything else.
@@username7763 Well, yes. It did make me stop. As for Windows becoming a totalitarian pain in the xxx OS, I created a Debian spin called Kumander Linux. I am only using that for over 2 years now. And as for Firefox being such a memory hog, I am in the process of finding a better alternative. Might give Opera a go, heard good things about it.
For me idle is 20 tabs and two browsers open at any time and that uses 37% @ 64GB ram. I approach 50% when I have Krita open with ComfyUI/Stable Diffusion going on. I sent out a mini pc to a friend that wanted 96GB, I just explained 96GB is really for some server boards running virtual machines. Usually the OS won't allow software to break out of 8/16GB for technical reasons. It's nice to know you can buy a mini pc now and not need to upgrade for 10 years.
I'd be interested to see how this graph correlates with recommended hard drive space and average disk and Internet bandwidth. I'd hazard that at least some (but certainly not all) of these RAM increases come from the amount of data we're expecting our computers to process every day, probably mostly from video streaming.
I regularly ran into oom-kill issues when transcoding hdr files in an lxc that was given 16gb to play with.
Memory efficiency just isn't something devs bother to consider nowadays. Just load everything into RAM, uncompressed, and don't bother to unload anything until your whole program closes. That seems to be standard operating procedure.
The system I am using now, Lenovo P520, is configured with 128GB of RAM, soon to be 256GB.
Bryan, I think eventually we're gonna tune the operating system to load software that you commonly do every day so everything's gonna be in memory so we're gonna need tons of memory
Before refreshing this webpage, Vivaldi was telling me that it was using up 2 GB of RAM.
alpine linux with xfce needs 200~400mb idle
i saw Mental Outlaw on youtube make his gentoo system with dwm use about 50 mb if i remember correctly
I'm old enough to remember a time when operating systems were judged by how little Ram they could get away with.
I've had 32GB for the last 5ish years and been slowly feeling like I'm running out of it and need at least 64GB to be able to self host my own LLM.
Local run LLM require a lot of memory, however maybe they can be put to use to make application more memory efficient . Programs used to be written in low level programming languages which is part of the reason why they used to be less resources intensive.
I run a lot of AI models locally for fun, and I have to say even with 64GBs of RAM at points with the much bigger models like Grok for example even that has started to feel limiting. Good for me since Grok is bloated for it's performance, but not being able to run models well like WizardLM 2 has kinda hurt a little. I want 128GBs when I upgrade my CPU, but I feel like soon even that might be too little if I want to run the absolute bleeding edge models. At least with AI doesn't feel like bloat is at fault, these AI models are just really hard to run and actually need all of that RAM unless you find new architecture, some weird bitnet shenanigans, or just find out how to train them and filter data better etc.
However, I would be concern if normal non AI related software starts taking up that much. But I somewhat hopeful it won't, but again, I could be acting foolish. I think that more memory will become standard for sure, but not for running basic programs unless those programs have AI junk in them now which I think most of that is a fad. AI is extremely useful, but it doesn't need to be in everything. For instance a terminal iTerm 2 added AI integration, I don't mind it but why? I could just have a program on the terminal to write commands for me with AI, I do not need intergration into the terminal itself it seems pointless when programs on the terminal exist
Finally, AI will be the excuse needed for younger gamers to get their hands on high end hardware. These kids will be playing high end games on their family computer
there's nothing wrong with getting more RAM to do tasks that inherently, at their most primitive function, require, or can take advantage of that much ram to do their bidding. same with CPU speed. think of something like encoding a video, that's a task that more or less, has already been broken down to its most efficient primitive computations, and, more or less, the only thing we can do to make that task faster or more efficient, is by upgrading your CPU speed/RAM count. so for those kinds of tasks, it is appropriate to upgrade your hardware if those tasks are important to you. there are certain things , certain types of computations and tasks that simply cannot be done without certain levels of CPU speed or memory, at least not in a timely fashion. an inappropriate case of needing to upgrade hardware, however, would be when software becomes so inefficient or bloated that it takes vastly more CPU cycles or memory to achieve basically the same type of computation or end result as you used to be able to achieve with much less resources on more efficient software
i was using 12GB of ram 14 years ago, 16 does not seem that bad. With that said, I hate javascript
I predict that memory itself will radically change and that in the not too distant future memory will then have very little in common with what it is today. This will be driven and motivated by the need to keep costs obtainable and to be able to quickly scale up as technology advances.
That Babylon 5 thing... There was likely some prototyping that can be done initial modelling, but in production I've read that they got pile of Amigas, each having 32Mb of memory for rendering.
That is also considered cheap solution instead of expensive SGI/wavefront workstations and the trick how it worked is that there was limitations in assets: Each 3D-model (mesh and texture) need to fit in 2Mb of memory. They also got 65k vertex limit on model and 1k texture limit but memory limitation for model was controlling their workflow.
They were able to render scene with some number of models and trick to add complexity was multiple layers.
Same tricks works today. You can render movie with Raspberry Pi if you want to with lower asset fidelity. And actually it would be very fast because of 3D hardware. Just keep assets fit in memory and don't use fancy raytracing/pathtracing. Rendertime would be bottlenecked by 4267Mb/s memory bandwidth.
Todays movies CGI scene can be size of 256Tb. They have been pushing boundaries in fidelity since Babylon 5 times.
I have 16GB in VRAM and 16GB of regular RAM kind of crazy where things are headed.
My 2009 laptop currently runs Debian 12 with LXDE, in under 400 MB RAM. With the 4GB it has I can browse the web, watch FullHD movies, play games, etc. Even when it runs Linux Mint with Cinnamon, taking 1.5GB RAM, there's enough left for the same things. The laptop runs great with the SSD I put in, and I'm pretty sure it will be usable for a while longer. Of course it's no longer a daily driver, which is why I don't need it for everything, but it can do a lot anyway. On my main PC I already have 64GB RAM, so I can do a lot more.😆
because no one programs with memory management in mind anymore because we have so much to play with. EVERYTHING IS BLOATED.
In 2002 I built a Windows PC with 512 MB of RAM and some time later I added 256 MB, so the total amount of RAM was 768 MB. On that PC I ran Windows 95, Windows 98, Windows ME and Windows XP
In many workplaces you need to keep open and switch between: a dozen browser tabs, a word processor, a spreadsheet editor, some chat/meeting client and maybe some specific company software. Many companies haven't upgraded from 8GB and it been slow, unstable and painful for like 4 years now. People are struggling and I blame the software companies and their awful practices. No one cares about optimizing anything anymore.
think how much power that will use on a global scale.
A modern CPU has roughly the same amount of L1 cache as my first Apple II computer had RAM, about the same amount of L2 cache as my first 286 had RAM, and as much L3 cache as my first hard-drive. The idea regularly boggles my mind that my first Wolfenstein machine could store its hard-drive in L3 and play the game entirely from L2 without ever touching physical RAM let alone the actual HDD/SSD.
I was using a crappy £200 laptop that had windows on it for like 4-5 years, it had 4gb ram and 32gb drive 😆(this is whilst i made the videos on my youtube channel). I had to rely on making virtual memory using disk so i could use applications without them crashing (i would add like 20gb page file on disk, etc). I also could not update windows due to the small 32gb drive.... and i relied basicaly on my 64gb sd card as my "home" folder... and using vps a lot as well. Now i am on a decent laptop with 16gm ram, 16 inch screen, 16:10, etc.... LG Gram 16 😃
I’d like to see this but for mobile phones from 1995 to now.
Developpers: We've got 16 gigs of RAM to work with.
Let's add some fancy features and eye candy to our app in order to improve the user experience.
Hardware manufacturers: nowadays most modern programs require 16 gigs of RAM
Let's double that amount on the platform we're developing now in order to stay ahead of the curve...
...Improve the user experience?
Honestly the average office worker already needs 32gb to manage the stuff they have open at once, and that's just a few apps, bunch of tabs and a video meeting.
Recenty, I was able to install the newest Debian 12 on my "vintage" PC, which only has 256 megs of RAM and a Pentium II CPU. Granted, it is only in text mode, and no interesting services are loaded, but the system after boot takes only around 26 MB of RAM.
I am a grown man and power user, yet I never used more than about 1.4GB RAM. The exception was when about 40 tabs in Firefox ate 7.8GB. I often only use about 700MB RAM, developing financial models and doing financial engineering. People waste their lives and too much resources on non-essentials.
“Greetings, I am a time traveller from 2050! My computer has 2 TB of RAM, but Windows is still running slow!” 🤣
You provide remarkable little insight into technology.
RAM requirements have lagged behind what computers typically need to not be bottle-necked for yearrrsss.
16GB is the minimum for even the 2014 shop PCs at the company I work at, 32 for the new ones, and 64gb for the power users.
I had 8GB of ram in a PC I built back in 2007. I put 32GB of ram in my current PC I built in 2012. While yes, software has gotten crazy bloated, computers have not kept up with the dropping prices and increasing density of ram. I don't think it is unreasonable to have 32gb or more be a minimum. 32GB costs $50 these days. I remember buying 1MB SIMMs for that price each. It just doesn't make sense to save the $5 or $10 you'll save by cutting that in half for a machine that will last you a decade or more.