3 Outdated Tech Terms We All Keep Saying
- Published Nov 17, 2024
- Check out the K65 Plus Wireless at: lmg.gg/corsair...
Learn about common tech terms we still use that have their origin in very old devices.
Leave a reply with your requests for future episodes.
► GET MERCH: lttstore.com
► GET A VPN: www.piavpn.com...
► GET EXCLUSIVE CONTENT ON FLOATPLANE: lmg.gg/lttfloa...
► SPONSORS, AFFILIATES, AND PARTNERS: lmg.gg/partners
FOLLOW US ELSEWHERE
---------------------------------------------------
Twitter: / linustech
Facebook: / linustech
Instagram: / linustech
TikTok: / linustech
Twitch: / linustech
The fun part with the IBM PC is that since a bunch of machines had already been referred to as "Personal Computers" before the PC was made, it was basically impossible to defend a trademark on "Personal Computer." As a result, IBM's follow-up platform was the "Personal System/2" or PS/2, which is where those old "PS/2" keyboard and mouse ports come from.
So yeah, the old joke "if PCs are so good, why isn't there a PC2?" Well, there was, but they had to change a letter to make a more awkward name they could actually trademark.
Bro, IBM is really one of the OGs when it comes to R&D. I've seen even Honeywell following IBM in some ways.
I remember my ps/2. I think my mom bought it when I was about 6. Played a lot of Castles, Duke Nukem, ZZT, and Wolfenstein on that beast. Also taught myself Basic on it.
What? IBM literally was paid 1 dollar by all manufacturers for each computer made for a period of time. In other words, they defended it through a legal case that was only being upheld for that time period. As part of the case, they weren’t allowed to continue with the registry of PC for them exclusively. Think car, automobile, etc…
@@svBlackManta I remember my PS2. I think my dad bought it when I was around 10. Played a lot of Tony Hawk, Need for Speed, FMX and Bully on that beast. Also taught myself pirating on it.
Because I RAM the stick in the socket
These things go into a socket??
@@jarde1989 I'd call it a slot but I guess calling it a socket works too
@drew2626 Anything with a slot or an outlet can be considered a socket.
Giggidy
I didn't expect a dad joke... but I got a dad joke. lol
My favorite naming conventions:
* BIOS. It's not a BIOS anymore, it's a UEFI, but everybody still calls it the BIOS
* We still call storage drives "disks", despite SSDs being commonplace and many PCs having zero spinning parts
OKAY ASIDE FROM FANS
UEFI is a Basic Input/Output System, technically. It's just not an IBM compatible BIOS. So that's technically the term we should be using: "IBM Compatible BIOS" vs "UEFI", or "BIOS" for when the distinction doesn't matter. Nobody would do that though.
@@teknixstuff BIOS is a specific firmware specification that allowed the OS to interact with the hardware, it's not a generic term. UEFI is the specification that replaced it. If you want a term that refers to both, then I think firmware would be most accurate.
@@Geerice BIOS is an acronym for Basic Input/Output System, which certainly sounds like a generic term. And it's used like a generic term in many cases, for instance console firmware (often required for emulation) is typically referred to as the console's BIOS.
@@teknixstuff Sure, it sounds like a generic term, but it's not. It's a formerly proprietary technology by IBM that was later reverse-engineered into a de facto standard firmware architecture. That's what makes it fall into the category of "outdated terms"
@@Geerice I'm fairly certain the term BIOS was used before the IBM PC was even released.
CMOS (Complementary Metal Oxide Semiconductor). In the 1980s it described low-power chips that could run on batteries, with memory that could keep its contents on minimal battery power, so the term is still used to refer to the clock that keeps the time on a PC. CMOS used to be slower than the NMOS used in computers.
However, nowadays CMOS is the standard technology for chips, so the term is somewhat strange.
I first learned the term 'CMOS' for what we now refer to as BIOS (or UEFI) settings.
Before the IBM PC, personal computers were often referred to as microcomputers. "Micro" compared with the older machines that filled up an entire room and "mini" computers the size of a washing machine.
They were called "micro" not because of case size, but because they used a microprocessor.
I still like to call modern desktops micro computers occasionally, just for its vintage outdated charm lol
@@pensivepenguin3000 Well... that's the definition I recall from my first computer course about 30 years ago!!!
I don't think that's actually correct.
@@vbifusful nope. There were mainframes, minicomputers and microcomputers, which were the first computers you could use at home or in a small office. So to say, a personal computer or home computer. Just look up a box of the C64. It says Micro Computer.
I like how companies still talk about "SSL certificates", while the underlying tech is using the TLS protocols (having deprecated the SSL protocols).
But it is still a valid certificate for the SSL protocol
Technically speaking, they should be called SSL/TLS certificates
They are really X.509 certificates.
Also, TLS 1.0 is *really* old. For some reason it existed alongside SSL 3.0 for a long time, with nobody caring, until the array of vulnerabilities that killed SSL3 were discovered.
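You can actually watch that name mismatch happen live. Below is a minimal sketch using Python's standard ssl module (example.com is just a placeholder host): the server presents what everyone calls an "SSL certificate", yet the negotiated protocol comes back as TLS.

```python
import socket
import ssl

# Connect to an HTTPS server and report the negotiated protocol.
# Despite the everyday name "SSL certificate", a modern handshake
# negotiates TLS; the SSL protocols themselves are deprecated.
ctx = ssl.create_default_context()
with socket.create_connection(("example.com", 443)) as sock:
    with ctx.wrap_socket(sock, server_hostname="example.com") as tls:
        print(tls.version())                 # e.g. 'TLSv1.3', never 'SSLv3'
        print(tls.getpeercert()["subject"])  # the so-called "SSL cert"
```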
I like how we still often use a pictogram of a floppy disk for a "Save" button. We ditched floppy disks about 15 years ago, so there is a generation of computer users who never saw a floppy disk.
Honorable mention: "debugging" originally referred to the process of going in and removing actual bugs/insects from the computer (back when computers were the size of rooms)
The most infuriating to me is when storage space on phones is called ROM. As in 4GB RAM/128GB ROM. That is totally not how ROM works.
The 128 GB cannot be written to, it can only be read... hence the name Read Only Memory. If you CAN write to it, however, it's not ROM.
@@laurendoe168 Is your phone's entire 128GB storage read-only?
@@laurendoe168 um, What? That would make iPhone even more useless wouldn't it?
@@Cronic318 That's what I'm saying. I don't understand why they even mention how much ROM something has - it's basically useless.
@@puilp0502 I have no idea how much ROM my phone has. If it ever stated, I ignored it. Mine has 16 GB RAM I believe.
When writing, SSDs or flash memory in general is only _kinda_ random access. You have to re-write the entire block, which makes random operations slow. Contrast this with Intel's now-dead Optane technology, which did not suffer from this restriction.
I guess the point is that it's abstracted away by the drive's flash translation layer, from the system perspective you specify an exact address and you get the data back. Even RAM doesn't have uniform latency for accesses, it's much faster to read from a memory location from an adjacent column if the row is already open, which is why CPU caches fetch a whole bunch of bytes in one go.
@@ricequackers fair point about the RAM. But then I guess it's a matter of scale. If my tape driver also abstracts away all the rewinding and seeking on the tape, does it also count as random access?
You can make flash memory with single byte access but it's comparatively slow and is usually reserved to store binaries for microcontrollers
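To make the "kinda random" point above concrete, here's a toy model of NAND-style writes (illustrative names and numbers only, not any real drive's firmware): reads hit any byte directly, but flipping a single byte costs a whole-block rewrite, which is roughly what the flash translation layer hides from the system.

```python
BLOCK_SIZE = 4096  # toy block size; real NAND erase blocks are far larger

class ToyFlash:
    """Illustrative model: byte reads are direct, but a one-byte
    write forces an erase-and-program of the whole containing block."""
    def __init__(self, blocks=4):
        self.data = bytearray(BLOCK_SIZE * blocks)
        self.bytes_programmed = 0  # crude write-amplification counter

    def read(self, addr):
        return self.data[addr]  # random reads are cheap

    def write(self, addr, value):
        start = (addr // BLOCK_SIZE) * BLOCK_SIZE
        block = bytearray(self.data[start:start + BLOCK_SIZE])
        block[addr - start] = value
        self.data[start:start + BLOCK_SIZE] = block  # whole-block rewrite
        self.bytes_programmed += BLOCK_SIZE

flash = ToyFlash()
flash.write(10, 0xFF)          # one logical byte changed...
print(flash.bytes_programmed)  # ...4096 physical bytes programmed
```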
I don't care why it's called RAM I just want to know how I can download more of it?
You don't
Just search up download ram, you’ll find a good site, I recommend downloading 128 gigs and maybe 256 gigs if you need a lot
@@lilguygamingandmore wdym? I downloaded 128gb ram this morning.
@@lilguygamingandmore party crusher
Meanwhile I just want to know how much I have to have dedotated to my Minecraft server
I may randomly access this memory
I may sequentially access this memory
Triggered Access Memories ooof
The role of RAM is still memory, not storage as the SSD is. Memory and storage are still two separate systems. RAM is still a useful term.
True, but I wonder how much longer. The role of a SATA SSD has been usurped by M.2 cards. What is the next step in miniaturisation? DDR8 that's large enough to be the entire virtual storage paging file (i.e. literally real storage).
Part of what makes RAM useful separate from NAND storage even as m.2 capabilities increase is that the multiple channels are further divided across CPU cores. This parallelized paradigm allows for more efficient usage than shoving everything through one bottleneck on an m.2 interface. Not to mention NAND storage suffers more from high usage than RAM does, offloading the most frequent operations to separate physical memory prolongs the life of SSD storage. For these reasons I don't foresee a combination of these functions for a long time yet.
@@brianvogt8125 ...You mean the venerable RamDisk (for you kids, that was a section or memory partitioned off to act as a virtual storage drive, though it needed to be copied from actual storage at boot, and written back before shutdown)?
@@VulpisFoxfire No, I speculated about any device that performs the function of the current DDR5, but is so cheap that it's large enough to make virtual storage & paging redundant. All addresses literally real, just like in 1964. Nothing to do with the boot & shutdown processes, so can be volatile.
@@brianvogt8125 ...Problem being, RAM *is* volatile. Unless battery backed up (like was used on some old game cartridges), you're going to lose the contents on every power-down.
1:11 Old, mechanical spinning hard drives were also technically random access since the drive heads could move directly to the specified platter/track/sector. Yes, it's a lot slower than SSD... but it's not sequential like a tape drive.
My first home computer had a cassette player for SAM memory.
Good old Commodore 64!
@@MonkeyJedi99 Right on! My first computer was also the C64 back in '85. Then bought the C128 the following year. I still have a newer 64-C model, but sadly it's in the box these days - the Vice emulator is a pretty decent alternative that fits the bill when I have the urge to crunch eight bits. 😊
The heads in rotating disk units do not move "directly" as in "instantly." They take time to move sequentially across the tracks. Having arrived at the required track, they engage in rotational position sensing (a sequential process of reading sectors). It's all measured in milliseconds. I agree that nothing is as time consuming as a tape.
@@brianvogt8125 And even though hard drives are random access, they are significantly faster to access sequentially, because for each sequential stream, you only had an initial seek to the proper cylinder, an initial rotational delay, and a short track-to-track seek and rotational delay when you fill the cylinder.
@@jimrafert7372 That's true for the first few hundreds or thousands of files written on a disk device, and for short files. After that, fragmentation becomes a problem, and the smooth operation you described stumbles because the next fragment can be on any cylinder. In any case, for the past 30+ years, it's all fronted by high speed cache memory partly filled by a sequential read ahead process (hoping that the app is processing data sequentially).
I remember as a kid my dad shopping for a computer and making sure that it was "IBM Compatible" now I feel old.
Colossians 3.
1. If ye then be risen with Christ, seek those things which are above where Christ sitteth on the right hand of God.
2. Set your affection on the things which are above and not on things which are on the earth.
3. For ye are dead and your life is hid with Christ in God.
4. When Christ, which is our life shall appear, so shall ye also appear with him in Glory.
********
Jesus is calling you today. Come to him, repent from your sins, bear his cross and live the victorious life
********
HAIL SATAN
@@DryPaperHammerBro okay dude no need to go *that* far
@@terdik36 There may be no need, but it was decently funny. Personally I'd have gone with a flying spaghetti monster parody, or possibly discordianism, but "Hail Satan" does have a flair to it that is hard to match.
@@JesusPlsSaveMe not ideal way to spread religion. i prefer non religion
DYK, Natural gas is "natural" because before we figured out how to transport natural gas long distances, it used to be "manufactured" in a factory from coal.
huh, TIL!
Everything is natural, cuz the universe is deterministic, hence whatever we do is already written, including every single atom of oxygen being breathed, carbon emitted, etc.. 🗿
@@nocturn9x Pfp (Profile Picture) and / or Banner Sauce (Source [Artist])? 🗿
Quantum mechanics abhors a deterministic universe @@SimoneBellomonte
@@SimoneBellomonte there's no evidence that the universe is deterministic, and Quantum mechanics actually suggests (if not proves) the opposite
Fun fact: in Russian and English the meaning of the "RAM" abbreviation is completely different. In Russian it's "operational memorizing device" ([operativnoe zapominajushee ustrojstvo], [OZU]); there's also the term "memory with arbitrary access", but it's barely used and doesn't even have an acronym.
fast memorizing device even
@@ceiling_cat lol, interesting word usage you got there
In German it's also different, we say "Arbeitsspeicher" which means "working memory". IMHO a term that describes its function far better.
@@bywonline It's true, that "оперативная память" is way more common and describes the device perfectly. But I've seen ОЗУ in more technical fields and nerd circles. I think it's the original term from the ancient times.
@@bywonline I didn't say "оперативное запоминающее устройство" is said often. But "ОЗУ" is the most (only) used abbreviation in Russian, and "оперативное запоминающее устройство" is the expansion of it. I also often see it in programs (most notably in the PC properties of Windows 7 and XP), because "ОЗУ" is much shorter than even "память" (the shortest way you can say it without abbreviations). And it's not rare for me to hear "ОЗУ" (although "оперативка"/"память" is more common). You've probably never spoken with people 30+ if you've never heard someone say "ОЗУ"... (I'm joking, you've definitely heard it but didn't notice, just because you understand the word and because you don't walk around with a notebook writing down every word you hear.) But again, my comment was about the abbreviation, not about the most-used word for a memory stick.
I'll call my computer wtf I want, Apple can't stop me.
I built a waddodinkle on a budget of 1k
@@BlueTable-t6k Waddodinkle Pro Max baybeeee
"wtf I want, Apple can't stop me" is a really weird host name. Will Windows even allow it? I know Linux won't.
A Mac is a PC, unlike what Apple may have us believe.
@@asadfarraj No, it's not. It's a home computer, not a personal computer.
Also, in Win 95/98 there was the "It's now safe to turn off your computer." screen, displayed even though you had shut down the computer. Being a young kid, I was like, "Are you sure? Cuz you're still displaying some text."
Pretty sure that was only on the older AT systems that had a hard on/off button rather than ATX that could do a soft turn off - hence not seeing that screen 🤔
Yep. Technically on those old boards it was just shutting down Windows itself, since it couldn’t actually control the power supply!
BIOS...everyone still calling it BIOS instead of UEFI.
Because UEFI is a type of BIOS. One where Trojan horses are a feature, not a bug.
@@davidwuhrer6704 It is not; UEFI is a replacement for BIOS, and the two are different types of firmware. It's a common misconception :3
@@micropanda7916 Firmware is just another word for software. (Hardware manufacturers found that universal computers programmed for a task are a lot cheaper to build than application specific integrated circuits, so they called the software "firmware" so they could pretend that it is part of the hardware. Ironically, hardware is much cheaper and easier to replace than software.) Any software downloaded onto any type of microchip can be called "firmware".
UEFI was intended to replace BIOS. They are the same type of software: First stage loaders.
If they were different types of firmware, one could not replace the other. Do you think the firmware for a graphics card could replace the firmware for a network router? Or that a climate control firmware could replace the firmware for a loop station?
UEFI never fully replaced the older BIOS. Most UEFI implementations provide a backwards compatible BIOS implementation as a fallback or alternative. Reasons are that the previous BIOS spec is more secure (by not providing Trojans as a service) and less restrictive (by not requiring a cryptographically signed third stage (of which there are thousands for UEFI, many of which have security holes)). It's also smaller and thereby faster, but that's rarely an issue.
The only advantage UEFI has over the older BIOS is that it isn't limited to four primary partitions. That, too, is rarely an issue.
Most computers don't need a BIOS anyway. Microcontrollers have their own first stage loaders, as did almost all home computers. SBCs have their own first and second stage and can boot directly into an OS, or optionally a third stage loader like GRUB or LiLo or syslinux.
Only PCs (and Intel Macs, where EFI is from originally) needed a BIOS, but there are free implementations like uboot and LibreBoot, and ASUS have shown that a Linux kernel can be used as the first stage directly (if the board supports it).
It's one of the needless complications that people who grew up on PCs think are normal.
BIOS just rolls off the tongue better than UEFI
@@micropanda7916 I'd argue that BIOS is still an acceptable term because the UEFI still does the same job. Like going from an F-150 to a Silverado, it's still a pickup truck. Even on motherboard websites they still call it BIOS or UEFI-BIOS.
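For what it's worth, you can check which firmware interface your own machine booted with. On Linux the kernel exposes /sys/firmware/efi only on a UEFI boot, so a minimal check (Linux-specific sketch) looks like this:

```python
from pathlib import Path

# /sys/firmware/efi exists only when the system booted via UEFI;
# its absence implies legacy BIOS (or UEFI running in CSM mode).
if Path("/sys/firmware/efi").exists():
    print("Booted via UEFI")
else:
    print("Booted via legacy BIOS (or CSM)")
```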
The "I'm a freak in the sheets" line was hilarious 🤣
I think the technical term for a non-IBM "personal computer" in 1981 was "microcomputer," contrasting with "minicomputers" that were the size of refrigerators, and regular old computers a.k.a. mainframes, that were the size of entire rooms or office suites. If IBM hadn't enjoyed such a meteoric rise in business space, and later, the consumer space, we might still be calling them "microcomputers" to this day. Instead, we got "PC".
Although the development of computers in the Soviet Union followed the path of copying Western technology, they transferred that technology to their own theoretical base. Therefore, in Russian, most computer terms differ from English ones. So RAM in Russian is called 'operational memory', and that is what it is still called today.
In everyday life 'operational memory' often shortens to just 'operational', because essentially everyone knows what it means, but occasionally that creates little confusions when it comes to VRAM. So to avoid this, VRAM can be called 'video memory' or just 'memory' when put into context, but it's never called 'operational'.
Interestingly, in Russian a PC is also called a PC (but it sounds different, like 'PK') and has the very same meaning: personal computer.
I wish their ternary computer had beaten out the x86.
Here in Poland you can also find „Pamięć Operacyjna", which is just 'operational memory'; it's used interchangeably with RAM.
Learned what RAM wad from Daft Punk album
Probably a typo, but I've spent like 10 minutes imagining what a "RAM wad" might be.........
For example, does one RAM the wad or does one Wad the RAM??
@@scottcole1881 You talk too much.. It’s a simple mistake bro
@@comicalacurate I know, but it was funny 🤣🤣
@@scottcole1881 cmon bro its not THAT funny😭
Could you make a video about how neither Intel nor AMD have made a true x86 CPU since 1995 and 1996 respectively? With the introduction of the Pentium Pro and AMD K5, they both switched to RISC cores, with x86 serving more as an intermediary for compatibility. The x86 instructions are broken into smaller operations to be executed by the RISC cores. I think it would be a very interesting topic, especially with the current popularity of ARM and RISC-V in new devices.
What? The fact that modern x86 CPUs now perform internal translations to a simple-ish RISC-like dialect before breaking it into micro-ops doesn't make those CPUs "not x86" lmao
@@nocturn9x Those micro-ops are the RISC instructions. As far as I am aware, neither manufacturer has ever disclosed what they are, only that they exist and are used. The point is that people look at x86, see that it's an old CISC instruction set, and think that AMD and Intel are completely screwed because "RISC is taking over computing". RISC has taken over desktop computing back in the 90s, people just don't know it because there wasn't much point in marketing it that way.
@@Creepus_Explodus they are x86 :-/
x86 is the baseline. Everything else is stuff they tacked on after that; that's what that microcode is: additional functionality on top of the baseline.
even the original 8086 had microcode, what is so surprising?
@@nocturn9x why did you respond?? Bros gonna send you so much
Competitors did not reverse engineer the original PC's BIOS. IBM released the IBM 5150 Technical Reference manual in August, 1981. It included the fully commented source code listing for the BIOS.
They still had to reverse engineer how anything actually worked. The reference only specifies how to use it.
For legal reasons they had to use "virgins" who had never worked at IBM so they couldn't be sued for stealing company secrets.
Those were the days when they gave you the full schematics and source code when you bought a piece of electronics. Now, they treat fixing your own stuff as an offense and good luck finding any schematics/source code.
They also called x86 computers "IBM Compatible".
IBM Compatible was more about the BIOS being able to boot MS-DOS. There were plenty of x86 systems that couldn't run DOS.
I guess a PC is a PC, regardless of chip architecture, just like with the term laptop. Maybe I am wrong here but when I use the term PC I generally refer to the concept of a DIY machine for the home, not a specific architecture, brand or OS
Same here. “PC”, IMO, means exactly that. A “Personal Computer”. Brand, OS, form factor, or any of that other stuff is pretty irrelevant to what is your “personal computer”.
Modern phones could also be considered PCs as they are personal computing devices.
same here, at least in chile, it means computer, or any of the synonyms we use for it, like Tarro (bucket), compu, torre (desktops) or whatever we use to call them around here.
@@abbe9641 you've just reinvented PDA and PPC 😄
Even Apple calls their Macs PCs…
nerd pickup line "we should excel sometime, spread some sheets"? 😂
Where are you going after you die?
What happens next? Have you ever thought about that?
Repent today and give your life to Jesus Christ to obtain eternal salvation. Tomorrow may be too late my brethren😢.
Hebrews 9:27 says "And as it is appointed unto man once to die, but after that the judgement
@idehenebenezer
Did you want me to actually answer these questions for you? Or is this rhetorical rhetoric?
Propagandhi 3:8 - Jesus saves, Gretsky scores!!!
Oh no. That's good, but you just reminded me: way back last century I got a nerd friend out on a blind date with one of the work secretaries. The next day she came in and asked me what I was thinking... apparently the conversation between the two had descended to "what's your favourite word processor, Word or WordPerfect".
@@JesusPlsSaveMe To your mom
Self-explanatory. Yes
@@JesusPlsSaveMe Also, for the last 1, no. 🗿
Yep. And _floppy discs_ stopped being *floppy* after the 5.25" format was succeeded by 3.5" discs in _rigid_ casings.
But that didn't stop us from calling them "floppies". Heck, we even called the disc *_drives_* "floppies".
To be fair, the magnetic medium inside a 3.5" hard plastic casing is still floppy.
Dlevi's right, 5.25" floppies and 3.5" floppies both store data on internal magnetic disks that really are floppy. 3.5" floppies just have a rigid case.
And Hard Disk Drives really do store data on rigid metal disks - I've opened them up too.
Calling RAM an antiquated term is like saying a modern car isn't a car because it isn't a Model T
Wikipedia says that the first car was the one Carl Benz built, ignoring the earlier Flocken Elektrowagen, probably because it was electrical, like 40% of the cars in the Americas when Ford built his Model T.
But those weren't _real_ cars. Electric autos weren't cars until Elon Musk bought Tesla, right?
Mr Cugnot seems to have been the first one to produce a "steam car", although you could call his contraption a tractor. It was supposed to tow several tons of artillery at roughly a walking pace. His was built in the 1700s, so he beat the Flocken Elektrowagen by well over a century... but nobody ever intended his "car-like object" to operate with the primary purpose of moving humans about.
@@julianbrelsford That's not a carriage at all, is it.
I had no idea about most of this. While we're pointing out odd things a lot of us get wrong without realizing it: at around 2:37, "What we all need is more acronyms." While I agree we have too many, none of the "acronyms" swirling around were, in fact, acronyms, outside of "R.A.M." or 'ram'. An acronym is an abbreviation that is pronounced as a word. PSU, GPU, CPU, HDD, SSD are all abbreviations. R.A.M., N.A.S., R.A.I.D. are acronyms. Now we both learned something :3
The proper term for those other abbreviations is 'initialisms', but that distinction is mostly dead. For most purposes now, 'acronym' already covers both acronyms and initialisms.
@@patheddles4004 I haven't heard anyone use "initialism" in ages. Good to see it's lost but not forgotten! And even though "acronym" has effectively taken the place of both, in my head I will always be correcting people lol.
I refer to all endpoint machines as PCs, compared to servers, NAS or other more specialized tech. Yes, that includes Mac and Linux, so long as it's designed for an individual to use it as a daily driver.
Is my Amiga a PC? What about my Raspberry?
@@davidwuhrer6704 is it designed for use as a daily driver by an individual? Is it designed to carry out a specific function on its own?
@@nonamesleft136 It is a universal machine, not limited to one specific function.
I use one of my Raspberries as a desktop computer. I use another as a web server.
And there are still things that only Amiga makes possible.
That is actually probably more accurate to the name "Personal Computer" than how it's used now.
When I was a kid I "loaded" a "programme". I still say "load a programme". Apparently now I'm supposed to say "click on the app".
Eh, whatever.
I still object to calling a full desktop application an 'app' (in part because when I was growing up an "app" was something much smaller and also more limited functionally). And I much prefer the words 'run', 'launch', or 'execute' to describe the action.
There are after all other ways to do so that don't involve clicking or tapping, at least on anything besides a smartphone.
I was there 3000 years ago, when you had to manually assign IRQ and DMA values to your hardware
I remember the Conventional Memory, the Upper Memory Area, the High Memory Area, and the Expanded AND Extended memory.
...I was there when the FPU moved on to the CPU...
Good ol’ MS-DOS. I remember the frustration of trying to find a free IRQ for my Sound Blaster and other cards I installed.
@TurboLoveTrain I still have nightmares about exp/ext memory and trying to set it up and make it work.
@@danjitheman
What is really interesting is that memory addressing never really changed--the handlers were moved onto the main chip die and everyone kind of forgot about extended/expanded memory... but it's still the same as it was in the 80s/90s.
Hell, we're still using the antiquated SDRAM and the associated antiquated buses and addressing methods... that, again, came out in the 90s.
you kids are cute. I remember when you had to take the computer to a shop if you wanted to max your ram out to a whopping 128K.
I was fully expecting debug to be on this list too. Surprised it wasn't, since it completely changed its exact meaning but kept the same overall meaning after it became a term.
Yeah it used to mean removing actual bugs from mechanical computer components
Today is CrowdStrike day!
We should declare it international holiday. Imagine wishing happy CrowdStrike day to your colleagues and staying home that day.
I actually hate how many people refer to Windows as PC (Personal Computer), but a Mac as a Mac. For example, I have seen people say something like "here's how to switch from PC to Mac". My annoyance is that a Mac _is_ a Personal Computer, not specifically Windows. I prefer to define a PC as just a desktop (or laptop) computer that has a physical mouse and keyboard and is meant to be _personal_ to the user, which does not restrict what operating system you use; whether it's Windows, Mac, or Linux, it's still a PC.
Oh, and cell phones and tablets are different, because they're mobile devices that are made to be used with just a touchscreen, and are usually not used for stuff that a PC is used for (by the way, a laptop that can fold into a tablet is still a PC).
I agree with you, but 30 years ago got used to the term "PC" to mean an IBM clone. I "fought" that battle in the 80s; my Atari was also a personal computer; I was offended by the dominance of the use of the term "PC" to mean a specific genre of personal computer.
Well now you know why that is the case. For the longest time apple themselves never called their machines PCs and neither did anyone else making non IBM computers. Like try convincing someone in the 80s that their commodore 64 is a PC
Not every home computer is a PC.
@@KaitouKaiju C64 is a home computer and therefore a type of PC.
Not an IBM compatible one though.
@@davidwuhrer6704 While not every home computer can run IBM software, every home computer IS a personal computer - even Macs.
We would love a video on the origin of CrowdStrike
A man of culture
They’re a…….weird company…. Like tech wise
In 1983, I had a colleague (we were MVS System Programmers on IBM mainframe computers) who still referred to main memory as "core" because in the 1960s, main memory consisted of magnetic rings with wires criss-crossing through them.
As an older UK PC builder, we used the term RAM (Random Access Memory) to differentiate it from ROM (Read-Only Memory). ROM was where the BIOS and its precursors were stored and could not be accessed or adjusted by end users in those earlier days. While there are still some uses of SAM today (magnetic tape being one), much faster memory types have since come into play.
Technically, ROM vs. RAM doesn't make so much sense. A memory can be both ROM and RAM, and it could be neither of those.
Consistency is not to be looked for. ROM is random-access, but it is not RAM.
Daft Punk Album RAM!?
Mechanical HDDs were always "random access", that's kind of the entire point of the moving head!
No, that's "direct access".
@@davidwuhrer6704 _"No, that's 'direct access'."_ What a moronic, stupidly obtuse, and basically incorrect, reply! The categorically understood difference between "random" and "linear" is the _requirement_ to have unneeded data pass the read/write heads to position the media to the point of the correct data, ie: *TAPE*. HDDs are completely "random" in the ability to position the head at exactly where the required data is, skipping the vast majority of other data that may exist on the platter.
There is essentially *no* practical difference between a hard drive, core, and later static "random access" memory. An address (track/sector or binary, the difference is meaningless) is selected and a bit (or byte, depending on address width) is read. You could argue that SRAM was *MORE* "sequential" than an HDD, since related data was always assumed to be in the next sequential address (ie: a string, etc.), whereas an HDD has no such limitation; related data _could_ be scattered one byte at a time across the platter (fragmentation, basically).
Core was unique since it used a destructive read and needed to be re-written. DRAM requires refresh cycles to stay persistent.
@@awebuser5914 No, hard disks are not random access. You position the read at the track you need, but you still have to read the track until you get to the block you want.
So a hard disk is not entirely linear like a tape, nor entirely random access like core memory. The technical term for it is direct access.
@@davidwuhrer6704 That is a modern-day, bullshit, "re-definition", that has absolutely *no* meaning in the context of when the term was _created_ . The term was created to distinguish from ubiquitous magnetic *tapes* , nothing more.
The _concept_ of RAM reaches back to 1948, being the first tube-based electronic storage of a program, versus hard-wiring or other mechanical means.
Soon after, in 1951, magnetic tape was introduced as a way to store programs and data in a high-density and inexpensive manner, albeit linear and fairly slow. Thus the term "random access memory" was coined to distinguish between the two. Hard disk drives didn't appear until five years later and IBM (not so) ironically called it the "IBM 305 RAMAC (Random Access Method of Accounting and Control) system". Hmmm, I seem to see the term "random" in there...
@@awebuser5914 The Zuse Z3 used telephone relays rather than vacuum tubes for storage. Tubes were first used in Bletchley Park, and they also used relays at first. Nobody called any part of a computer "memory" back then, although the Z3 could already store parts of programmes read from paper tape.
John von Neumann was the first to use the term "memory" for the data store.
Random access memory was used to distinguish from stack memory. Paper tape was not regarded as memory yet.
Calling magnetic tape a linear memory is a retroactive redefinition, but accurate. In pure theory, all memory is one-dimensional like the magnetic tape of the Turing machine. In practice, a number of different storage technologies were used, some of which could be addressed directly, like magnetic core memory, and others which could not, like bubble memory.
Hard discs are magnetic. Like with bubble memory, you can select a track, and then spool until you reach the address you want. Not random access. Not a stack. Not linear either. The term "direct access" was used.
I don't care what buzzwords IBM used in their marketing material.
I had a low tech friend years ago who asked me what he should get on a new computer. So I told him the specs he should ask for including how many Meg's of RAM he should need. He later forgot the number and asked me again how many Mega-Rams he needed. I couldn't stop laughing. Baaaa!
Why didn't they just call it Direct Access Memory? Random isn't really accurate, as it isn't random information getting fetched. It's specific to the information requested by the CPU. Is it because RAM would be called DAM?
Direct Access Memory is already a thing. It's how Hard Drives operate since you can go directly to where the data is stored, but you still have to read it sequentially and wait for the hard drive platters to spin over where the data is.
Random Access Memory means the information can be stored anywhere on the device and can be retrieved in any order. It is, in effect, random in how the data will be retrieved.
What's "random" is the access pattern. You can't predict it so it's effectively random. Hence you need byte addressable memory, unlike other memory technologies like NAND flash that only support block level operations by their very nature, and while you can definitely do "random reads" from an SSD or HDD, those are still not quite the same as RAM accesses. Since RAM is optimized for being accessed at any point in any order, that's where the random is coming from
Because saying "DAM" would be too vulgar
A disk can be "direct" access too. Random means *both* any location and any order.
@@justicefool3942 thanks
RAM is random due to its byte-addressable capability, while HDDs are sector-addressable. On an HDD, to read a byte you first need to read the sector and then sequentially read to the byte, which makes it non-random. Some SSDs can do byte-addressable access but a lot slower, as they are not designed for that. In this context RAM is still relevant for randomly accessing memory locations. 😊
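A rough sketch of that addressing difference (toy sizes, not any real device): RAM-style access indexes a byte directly, while disk-style access pulls in a whole sector and then picks the byte out of it.

```python
SECTOR = 512  # toy sector size

ram = bytearray(1 << 20)   # byte-addressable: touch any cell directly
value = ram[123_456]       # one operation, any address, any order

disk = bytes(1 << 20)      # sector-addressable: fetch a whole sector
addr = 123_456
start = (addr // SECTOR) * SECTOR
sector = disk[start:start + SECTOR]  # read 512 bytes...
value = sector[addr - start]         # ...just to get 1 byte
```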
because that's how you install it
underrated comment
We just continue to use these terms simply because people would know what we're referring to. Like how we continue to say "Q-tips" when the actual name is "cotton swabs". Just up and saying something different the next day will start to confuse people.
Though we STILL have people today calling storage space "memory". The conversation goes like...:
"Dang.. I'm low on memory..."
"Then delete some stuff!"
"I said MEMORY, as in RAM! Not storage space! I have plenty of free storage space!"
"Oh...then.. what's the difference..?"
Imagine how chaotic your computing experience would be if the processor did indeed just access memory randomly 😂
That reminds me of bogosort. But with that memory behavior, you don't even need to randomize the set; you just have to check it again.
"RAM" is still useful because if you say "memory" to a user that doesn't know much about computers, they'll think you're talking about hard drive storage. They may not know what RAM is exactly that they know it's not storage for your files etc.
Time to blow someone's mind with CD-ROM standing for Compact Disc - Read Only Memory.
And DVD-RW being "Digital Video Disc - ReWritable"
Nice try bud 👍
Don't forget DVD+RW
Also don’t forget DVD-RAM. 🫢
Or Digital Versatile Disk, rather. (Movie-formatted DVDs were DVD-Video.)
@@doujinflip Yup, because Sony's gonna Sony 🙂
All great info, except for you mistakenly referring to terms as acronyms when they aren't. RAM and BIOS are indeed acronyms. Acronyms are strictly pronounceable words. CPU, PSU and AMD are not acronyms; they are initialisms. You have to say the letters individually. I mean, if we are a bunch of geeks getting pedantic about the names of things, at least get it right! 🤪 Thanks for attending my TED talk.
it’s named after a Daft Punk album
Modem is supposed to be a device that modulates/demodulates: converts a digital signal to analog and vice versa. This made sense back when we got the internet over analog phone lines and converted the signal to digital for our PCs. Modern high-speed connections send and receive digital signals, which do not need to be modulated or demodulated.
Therefore, unless you are using dial-up, you do not have a modem in your system, regardless of what it is named.
Excuse me! I have a modem and I don't use dialup! But then, you probably weren't taking ham radio operators into consideration for good reason :)
Modem is still apropos for cable broadband as it is indeed modulating the digital signal, just using a very high frequency and 4096-QAM rather than PSK/PCM of dialup modems. But for fibre, I see people tend to call it an ONT which is the far more accurate term.
I don't hear modem a lot anyway. Probably because the device (now converting between electricity and light on a fiber line, but performing the same role) is also in the same unit as the router, firewall, and wireless access point. And everyone just says Router.
I haven't heard modem in a long time. Most people have a router even though the thing we call a router nowadays is actually a gateway
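Since QAM came up: modulation just maps groups of bits onto amplitude/phase states of a carrier. Here's a minimal sketch of 4-QAM (QPSK) mapping, purely illustrative; real cable modems use far denser constellations like 4096-QAM, plus error correction.

```python
# Map bit pairs to points in the complex plane (one symbol per pair).
# Each point encodes 2 bits in the carrier's amplitude and phase;
# 4096-QAM applies the same idea with 12 bits per symbol.
CONSTELLATION = {
    (0, 0):  1 + 1j,
    (0, 1): -1 + 1j,
    (1, 1): -1 - 1j,
    (1, 0):  1 - 1j,
}

def modulate(bits):
    pairs = zip(bits[::2], bits[1::2])
    return [CONSTELLATION[p] for p in pairs]

print(modulate([0, 0, 1, 1, 1, 0]))  # [(1+1j), (-1-1j), (1-1j)]
```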
I remember when video games and software used to say "IBM PC or compatible" in the system requirements.
"I'm a freak in the sheets" -thing I'm going to say at work next time someone compliments my Excel skills.
What color do you want that database? I think mauve has the most RAM.
0:58. No. It's still storage, not memory. Car radio technology has changed in the last 20 years but it's still a radio, not a wheel.
1:45. Yes, the name was because it was physical chips but the name remains because what it refers to still serves the same function, even though the technology has changed.
Storage is memory, the only difference is persistence. The term RAM could apply to both. There is nothing to say that your computer's memory has to be RAM anyway, you could use sequential memory in place of RAM and it would still be possible to have a functional system, it just wouldn't perform as well. The actual memory or storage type and technology used doesn't actually matter, the only thing that changes is performance and how you access it. You could have a system with only SSDs and no traditional RAM and it could work absolutely fine.
As for chipsets the name remains but it isn't accurate anymore so why shouldn't it be changed?
In some countries, you can't put a trademark on words that are too basic or too close to natural language. It would have been difficult for IBM to keep the term « personal computer » for itself, while the thing was actually nothing other than... a personal computer. It would be the same if some random car company tried to get an IP on a car called « small car ».
Fun fact: you can't get a trademark on numbers either.
That's why Intel went from 286, 386, 486 to Pentium... like 586, "Penta-" being a prefix for 5.
IBM built the Personal Computer, as opposed to everyone else's home computers. ("A mini for your home!") They were a late entry to the market. The reason why they allowed others to clone them was because there was an anti-trust (monopoly abuse) case running against them at the time. Same reason they let Microsoft keep DOS. (The surprising thing was that they bought it from MS at all instead of licensing Digital Research's CP/M.)
It was built from off-the-shelf components, designed to be as cheap as possible, causing all kinds of headaches for users having to configure the interrupts themselves. It sold so well because, well, nobody had ever been fired for buying IBM. Even though the thing was trash. But being thrown together from existing components, it was also modular, so it offered tinkering opportunities and upgrade paths that didn't require throwing the whole thing out, like Apple's products did and still do.
The anti-trust suit against IBM was dropped by presidential decree a year later.
RAM still works because it's a generic description of functionality, not a specific technology.
Long ago, I predicted that we would use "core memory" long after the origins were lost, retronymically considering it to be the (random access) memory closest to the CPU, ie: at the core of the computer. Of course, the actual origin referred to very tiny ferromagnetic donut shaped "cores", which had been used for some while, but were being replaced with other technology (which today we would generically call RAM) at the time of my prediction.
My prediction turned out wrong, although the term "core dump" lasted for some while (as a dump of memory contents).
memory closest to the CPU would be the L1 SRAM cache. Although it's transparent :).
As a kid in the 80s I always thought “floppy disks” were those larger flexible discs and “hard disks” were the smaller discs with the hard plastic cases. I can’t be alone in having thought that lol
Many people thought that. I had to explain to them that a "hard disk" was inside the computer and those 1.44 MB 3.5 inch disks were also called "floppy disks" or "diskettes".
The actual disk inside a 3.5 inch floppy is just as floppy as the disk inside a 5.25 or 8 inch floppy. It just has more protection.
As a kid in the 80s, we didn't even have the 3.5" floppy disks. Everything we used was on the 5.25" disks instead. It wasn't until late 80s that I even saw a 3.5" floppy, and by that point I was already familiar with hard drives since we had some computers with them, so it was only natural for us to go on to call the 3.5" disks floppies as well.
@@alanlafond9705 And indeed they were floppy, on the inside!
@@allanrichardson1468 Oh, indeed! I've taken a few of them apart, back in the day! Pretty easy to separate the casing if you're careful. If not, a 3.5" disk doesn't run you too much to buy a replacement. 😁
Hobby? After 20+ years in corporate IT it's hardly a hobby. I see no issue with the old terms. Most everybody understands what they mean and the key to effective communication is to be understood.
Q: Why is it called RAM? A: To keep ewe happy?
I was fed up with acronyms, especially "PCs", as far back as a job interview in Aug of 1983. As an electrician/electronics tech, the owner/interviewer asked me if I was familiar with PCs, and I had to ask which type of PCs he meant, and he did not know what it stood for. I said, to me it could be "printed circuits", "programmable controllers", or "personal computers". He still didn't know which, so I let him off the hook and said... I can do all 3. 🤭
3:53 let's just be happy that acronym didn't take off...
The thing about an SSD (Solid State Disk) is that you have 'random BLOCK access' whereas RAM gives you 'random BYTE/WORD access'.
So they work a little bit differently in practice.
Why do they call it RAM when you RAM in the cold data of out hot read the data?
Exactly. I just refer to my car as a "fast thing with wheels," but there are other fast things with wheels, like skateboards. Thus, my car is actually a skateboard.
"I'm a PC"
"And I'm a Mac"
"And I'm a Linux user"
"I use Arch by the way"
@@scialomy " I use Dex btw"
And technically they were all PCs (Unless you had a really old mac or a raspberry pi or a linux phone, in which case they weren't x86 which might disqualify them depending on your definition of PC)
Linux guy would be sporting a neck beard and fedora
"Ram is outdated term."
"Ram is correct and accurate term still today. I just don't like the term."
Never seen someone prove themself wrong so fast in their own video.
0:08 Does "To the pit with him" mean the same as "drag and drop"?
RAM vs DASD (hard drive, not SSD). Random Access Memory basically means that the access time for memory at any address is basically equal (ignoring cache). DASD is Direct Access Storage Device (like an old hard disk). You can get any unit of data without reading other units of data. The access time depended on a lot of factors, but mainly on where the data was on disk vs. where the read/write head was. Moving the head was relatively slow. SSDs are more like RAM because the access time for any unit of data is basically the same. Sequential access is the need to read from where you are physically positioned to where the data you need is positioned.
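As a worked example of why DASD access time varied (generic figures, not any specific drive): the average rotational delay is half a revolution, so the platter speed alone sets a floor of a few milliseconds before seek time is even added.

```python
def avg_rotational_latency_ms(rpm):
    """Average rotational latency = half a revolution.
    One revolution takes 60000 / rpm milliseconds."""
    return (60_000 / rpm) / 2

for rpm in (5400, 7200, 15000):
    print(f"{rpm} RPM: {avg_rotational_latency_ms(rpm):.2f} ms")
# 5400 RPM: 5.56 ms / 7200 RPM: 4.17 ms / 15000 RPM: 2.00 ms
```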
Because of Daft Punk.
I remember looking at old game boxes and seeing that "IBM AT/XT Compatible"
Another outdated term may be BIOS - today, a new computer doesn't have a BIOS but a UEFI; however, branding inside UEFI settings screens still often refers to itself as "BIOS", because the term is just so commonly understood (whereas UEFI isn't).
UEFI is a Basic Input/Output System, technically. It's just not an IBM compatible BIOS. So that's technically the term we should be using: "IBM Compatible BIOS" vs "UEFI", or "BIOS" for when the distinction doesn't matter. Nobody would do that though.
UEFI is still a BIOS. Probably even more so than the IBM BIOS.
I think it's just because BIOS rolls off the tongue more easily. The correct generic term would be firmware.
And BIOS can be pronounced as a word, but UEFI can't. That's why it will always be called "bios".
@@ricequackers Even firmware isn't really a good term, because there are a lot of firmwares that aren't at all a BIOS. Any piece of software stored permanently on a chip is firmware; for example, the OS of a 3DS is firmware.
The Personal Computer naming is wrong.
The term Personal Computer predates the IBM PC's release in August of 1981 and it took a long time for the term to only mean what we would call the "IBM", or "IBM PC".
***
A quick check of old Compute! Magazines shows personal computer was common in 1979, although "Microcomputer" was still more popular, and it shows that "Personal Computer" was used for other systems for years later. I can quickly find references to "your Apple or Atari Personal Computer".
We wouldn't use the term "DOS" to describe the machines. "It's an IBM" (even if it had no IBM parts) vs " It's an Apple II" vs "It's a Macintosh" (the full name was common for a long time) vs "It's a Commodore 64" ("C64" was only used when there was no space).
What we now call PCs, box specs would call something like "IBM", "IBM PC and Compatibles" or just "IBM PC-Compatibles", with occasional references to PC models. From the mid-to-late 1980's into the 1990's, the "Tandy" (Tandy 1000) also appeared on the spec box.
For example, the original 1991 Civilization box specs say:
"IBM PC/XT/AT/Tandy and most Compatibles"
The goal would be to emphasize compatibility with the PC "IBM PC and 100% compatibles". Realistically, the best home computers and the most powerful PCs were not IBM, and all those models were dated. For that reason, when referring to the IBM it actually made sense to not use the term IBM PC, since the original PC, the IBM 5150, would not run PC games, and you needed to refer to CPUs
If you look at the specs on 1992's Ultima VII: The Black Gate, you see
IBM 100% Compatible: 386, 386SX, 486 PC System.
In theory, this should make clear that the "IBM PC" will not run this.
As a side note, since this is Ultima VII, there's a good chance your 386, 386SX, and 486 will not run it until you have performed a sacrifice to the memory manager to fit all of your drivers and the game under 1MB. You usually needed a boot disk for U7. Mine had an ultra tiny mouse driver.
It isn't until the old 8-bit machines go away that PC starts to mean just the IBM PC Compatibles.
***
No sane person used Wintel in the 1990's. It was a way to embarrass yourself. Just because it was done didn't mean it was a thing.
Please stop trying to make Wintel happen.
PC is just a generic term. It doesn't matter about the specifics very much. Apple makes PCs. They just don't like to call them that.
he literally just explained that it wasn't and how it wasn't.
The point the video is making is that that was not always the case. The apple ii for example was not a PC even though it is a home microcomputer
@@KaitouKaiju Apple didn't start drawing a distinction between Macs and "PCs" until much later. At the time of the Apple ][, there were plenty of other PCs that were all incompatible with each other. There's a reason the phrase "IBM PC compatible" survived until Windows 9x came along(rendering it unnecessary).
@@angrymokyuu9475 The reason for that phrase is that an IBM PC compatible machine is not an actual IBM, and while it will run MS-DOS, it is not hardware compatible, as opposed to an IBM PC clone (which is a PC for all intents and purposes, but not an actual IBM either).
An Apple, Atari, Sinclair, Commodore, Acorn, or TRS won't even run MS-DOS. They were home computers, but not PCs.
"Random" is correct, random just means that each value is unrelated to the value that came before, ie, not a pattern, not predictable... contrast this with a spinning disc, where you can move the head to a random track, but then the sectors on that track have to be read sequentially, as they spin past the head.
Of course, one of the oldest terms in computing is "computer", which used to be the profession title of a person whose job was to perform calculations. A room full of computers was a room full of people (usually women) with pens 'n paper, doing maths for stuff
I disagree with your interpretation of RAM and your thesis that the term is outdated. It's random access memory from the processor's point of view. That is, the processor directly addresses and can manipulate RAM, even individual bytes, by direction of the program code it's executing. An SSD storage device might have grids for randomly accessing cells by its built-in controller, but the processor can't see the cells; it must request data in 4 KB blocks (or larger) through mass storage protocols, which is then transferred into RAM.
DRAM can only be read in a whole word line and then needs to be written back. SDRAM transmits a sequence of bytes from these. The CPU is indeed free to specify the starting address.
@@ArneChristianRosenfeldt Irrelevant distinction, but thanks for playing.
@@beakt lol he really thought he was on to something with that
RAM is volatile memory used for temporarily storing data that the CPU needs quick access to while performing tasks. When the power is turned off, all data in RAM is lost. RAM is much faster than SSDs and is designed to handle the high-speed demands of active processing and running applications.
An SSD is non-volatile storage, meaning it retains data even when the power is off. SSDs are faster than HDDs but are still much slower than RAM. They provide quick access to stored data but are not optimized for the rapid read/write operations required by the CPU during active tasks.
Despite both being forms of random access memory, RAM and SSDs serve different roles in a computer system. RAM is still the appropriate term for the type of memory that provides the CPU with fast, temporary data storage. The speed, volatility, and purpose of RAM distinguish it from SSDs. The term "RAM" accurately describes its role in computing, and conflating it with SSDs would blur these important distinctions.
@@herbie_the_hillbillie_goat you sound like a business accountant who briefly read into the topic. Can we make this topic more fascinating please? More about physics? What about MRAM? Core memory? DRAM and EEPROM both store charge on a capacitor. Some people claim that charge is distributed over a bulk material? I learned in school that charge sits on the surface of a conductor. Later I learned that even an insulator has a conduction band, but doping does not work well. But if you shoot electrons into the conduction band by sheer force (voltage >5 volts), they are free to move.
A program can randomly select the address to access the main memory.
And it can also randomly decide if it wants to read or to write the main memory on a selected address.
So this was the meaning of "random-access" in RAM.
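A minimal sketch of that distinction (illustrative only): random access touches any address in any order at the same cost, while a sequential (tape-like) device has to wind past everything before the target.

```python
memory = list(range(1_000_000))

# Random access: any address, any order, read or write, same cost.
x = memory[737_421]
memory[12] = 99

# Sequential access: to reach item n you pass items 0..n-1 first.
def tape_read(tape, n):
    for pos, value in enumerate(tape):  # wind through the "tape"
        if pos == n:
            return value

x = tape_read(memory, 737_421)  # same data, far more work
```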
One of the few acronyms I remember from college is: PCMCIA (People Can't Memorize Computer Industry Acronyms)
PEBCAK
@@FracturedPixels I still use the acronym PICNIC - Problem In Chair Not In Computer
But that is not even an acronym though.... How exactly do you pronounce PCMCIA?
We always called them IBM-compatibles.
3:17 skip ad
Ads? What are those?
Use SponsorBlock instead
How about thumb drives? Some of them are more like thumbnail drives these days. I know, they are more correctly referred to as flash drives.
If we keep using the terms, they are, by definition, not outdated.
The "random" in random access memory meant that you really could access any random spot in memory directly, not that the accesses would really be random. This is just being silly.
As far as "chipset," he says that modern computers have only one chip, the CPU. That is incorrect. I just looked at a motherboard on Amazon and counted over 20 integrated circuits or chips. It sure looks like a set of chips to me.
"PC" was short for personal computer before IBM built one. I had a PC that was by Timex-Sinclair, and several by RadioShack Color Computer. Then in 1989, I bought an "IBM PC compatible," meaning a computer that would run the same software as the computers IBM was making. Mine was a Tandy 3000NL from Radio Shack. I have never in my life owned a computer from IBM. I built my current computer from scratch, so it has no brand name at all.
Listening to this "child" talk about things that happened before he was born is amusing. I started my computer journey over 50 years ago by accessing the mainframe at the Research Triangle via acoustic modem from UNC at Chapel Hill as part of my astronomy homework.
The key is the significance of the identifier. RAM has always been RAM because the speed at which it accesses information is what's most important, just as the most important identifier in SSD is that it's Solid State rather than a Hard Disk. Chipset is slightly different, as it also became used to refer to a series of chips from a given brand, such as Intel's i5, i7, and i9 or AMD's Ryzen series, since there are typically 3-4 processors released at the same time.
And how does this help me with cloudstrike… lol
Gotta download more RAM bro
just boot into safe mode and update the system.
@@hatyyy the joke went over your head lol
You got cloudstruck
@@Hako_exe wheres the joke
I remember the days when any computer that could fit on top of a desk was designated a 'microcomputer' or 'micro', as they were smaller than 'minicomputers' which were small enough that they didn't fill an entire room.
we should indeed call apple devices PC.... pricey crap.
that sooooo trueeeee!
I don't think their hardware is crap at all, I think most tech people would love to have Apple HW, if it just came without Apple trying to control everything.
@@owlstead as a tech person myself (who works with other tech people) I can assure you that your assumption is incorrect. We're efficient. You can get much more powerful hardware for the same money, with much better features (and without the Fisher-Price OS).
@@owlstead Actually no. Apple's locked down firmware etc and confusing OS isn't something I'd ever want to deal with.
@@teknixstuff Oh, that's the reason that I don't buy them either. But I won't call their hardware crap.
Ram is random access, meaning you can address a cell randomly. In contrast with serial memory which can only be read in sequence. (I2C, SWP, SPI)
L take video. RAM is still RAM, chipsets are still chipsets and personal computers are still personal. So what is "outdated" exactly, faux hipster?
Personal computers have been around since the '60s. IBM just reused the term. There was even Personal Computing magazine as far back as 1977, covering Apple, Atari, Commodore, Tandy and others. The trademark is "The IBM Personal Computer".
So none of these terms is really outdated. Any mathematician would tell you: a set containing one item is still a set, so chipsets are still relevant, computers are still personal and RAM is still accessed randomly (as opposed to sequentially).
4:12 Actually, the term WinTel was used to describe the idea that the two companies colluded to obligate users of the latest software to require the latest hardware and vice versa. The hardware would be needed for the software to run fast enough, and the software would be needed in order to have the latest drivers for said hardware.
BIOS is the same now, isn't it? I'm pretty sure motherboards switched to a new boot system with its own acronym?
SSDs are not that random. Reads are still done a page at a time, writes are usually 3 pages for TLC, and erases are done a block at a time. This is why random benchmarks are slower than sequential.
IBM has done so much to transform the world into the digital age if you think about it. Kinda sad they did not continue their legacy and just decided to stick to the comfort of enterprise solutions. They had some incredible people in the R&D back in the day.
They got outcompeted in the retail space and ultimately it was better for them to license the tech to other manufacturers than try and pursue every company cloning their machines in court
Growing up my dad always had the latest tech. He helped me build my first few PCs which he always referred to as "IBM compatible"
"Aspirin", like "Kleenex" and "Thermos," is trademarked but legally became a generic term because Bayer did not enforce the trademark strongly enough in the USA.
It's funny you say "working memory" because that's literally what RAM is called in German. "Arbeitsspeicher" literally translates to "work memory"