I've been looking for this comment. This guy not only has a great sense of humour, but he does his research. Either that, or he's a lot older than he sounds.
In the context of Unix, it really should be "vi" not "vim". I remember taking the source for it and porting it other operating systems, notably OS/9 a RTOS that I don't people even remember.
Denis Ritchie is an underappreciated legend of IT history, nice to hear his name mentioned here. He and Jobs passed away on the same month and of course Jobs got all the mentions and credits while Ritchie was barely mentioned at all. Thanks for this.
Within the IT community however, Ritchie and Thompson were at least as famous as Steve Jobs, if not more. Whereas Jobs (along with Wozniak) was most of the time acknowledged for the Apple I and II computer and the introduction of GUIs with Lisa and the Macintosh, Ritchie and Thomson were famous for both Unix and C, and during my Linux phase beginning with its appearance in 1992 all through the nineties I never met someone in that community who hadn't at least one book of those guys, mostly "The C programming language.". Only a few other guys were that popular: Donald Knuth, Bjarne Stroustrop, Linus Torvalds and Niklaus Wirth (for Pascal and Modula). And maybe Weizenbaum for Eliza and making us rethink our stance on computer technology whenever the consequences of something new in the IT business became apparent. We invited him once to a conference on data security in Kiel and he gave us a great talk on the ethics of IT engineering.
People keep harping on this like they are the same. They could not be more different. Unix was 60's technology that had no relevance anymore after 2000 for anyone but IT nerds. Jobs successfully built and marketed products (not based on Unix) that STILL have relevance today. For literally billions of people. If you want to complain about people not getting recognition, maybe you should focus on Linus Torvalds that stole Unix's thunder by basically porting it to X86 and (initially) adding nothing to it himself? Google then used the Linux kernel and stole the Java API to build their iOS copy, Android. But that is another story...
I appreciate that you compared the cost of computers to the cost of a graduate student - that is an accurate way to depict what that relationship looks like
Now the computers cost nothing, the software costs nothing, IT professionals are expensive as hell and companies pay so much money in opportunity costs to force people to use Windows in some weird power trip.
@@svr5423 The Windows thing is about HR. In most companies regular users outnumber IT by 100:1 and since most applicants only have experiance with Windows it is easier to use Windows then to retrain every new hire. (I know is seems trivial but a lot of worker drones are... a bit slow to figure things out.) This was (and is) a big part of MS strategy when it came to dumping OEM windows into every home-computer on the market. I mean OEM windows licenses have always been dirt cheap, they weren't trying to make much on it directly but they know such market coverage locks in third party software and enterprise customers who then slowly get sucked into a walled garden of exchange servers whatnot that is needed to support all of the Windows nodes. MS also dumps cheap OS licenses on schools and colleges for the same reason.
@@mytech6779 My 85 Macintosh Lisa was ~$5500 brand new, bought by the Art Institute Of Chicago, and was replaced in 1987, where I bought it for a mere $200! Graphics editing was still just starting to get capable, and since they were a well funded institution I guess they wanted the best of the best, and grew out of it pretty darned quick. I guess Apple didn't take the old ones back to refurbish them, and the School could just write them off and give them them to the students, one who resold it at a profit to me!
@@liam3284 Even consumer grade software is not cheap, and all of that subscription stuff costs even more in the long run, by "costs nothing" I'm sure they meant FOSS software Like what I run: I ditched windows ~15 years ago, and haven't touched it again since. I'm on Arch Linux and use all FOSS software, and haven't paid for software in the same time either, I just donate to the developers of often used and highly appreciated software's I use here and there, and take none of the abuse I would by using anything from Microsoft, where you hand over near total control of your PC and everything on it to them and they spy on and track you, and are a prime target for hackers, scammers... and you pay them to do so instead of them paying you! It's like being a pimps bitch!
Asianometry is the very paradigm of what a You Tube channel should be. Really top quality content. Thanks again for another excellent essay on computing history.
As a fledging EE in 1974, I was fortunate to join Digital in Maynard where I worked with the PDP-11 team. I enjoyed the vintage photos of the hardware and of some of my colleagues that I enjoyed working with. I left Digital to become a software engineer in 1982, joining a Pittsburgh startup called "3 Rivers Computer Corporation". Our hardware was an innovative workstation called the PeRQ. Gordon Bell joined us briefly after leaving Digital. One of our major projects at 3RCC, done under contract in Edinburgh for ICL (a major UK supplier at the time), was a port of Unix using "C-Codes". Our hardware was a bytecode machine (very fast for its day) and we developed a bytecode interpreter for C. We used a portable assembler to generate C-Codes from standard C. That let us run Unix on our hardware. This piece reminded me how much fun we had in the early years of the workstation world.
@@lenkapenka6976 : Yes indeed, the machine was marketed by ICL. Thankfully, I had nothing to do with any of that. My job was to make the Unix port run. I loved the architecture of the machine. It was basically a clone of the Xerox Alto (one of the founders of TRCC worked with the Alto at PARC). The internal working name of the TRCC machine that became the PeRQ -- even when I got there -- was "Pascalto". It was an Alto with microcode optimized to run Pascal bytecodes. The PeRQ shipped with an OS written in Pascal. I had just been introduced to Smalltalk at the time and several of us also worked on a very nice Smalltalk port for the machine. The bytecode architecture of the machine an excellent fit with Smalltalk. The WCS meant that the entire "rasterop/bitblt" behavior from the Alto worked just fine on the PeRQ (with the able assistance of at least one ex-Parc contributor who was intimately familiar with that particular code). For those who don't know, the PeRQ was built around the AMD 2901 bitslice chips (very advanced for its day) and had a writable control store (WCS). That meant that that if the developers knew that a particular task warranted, then a specific bytecode interpreter optimized for that task could be pushed into the WCS while the task was active and popped upon completion. The original task could then continue with the original bytecode interpreter in the WCS. The result was machine with absolutely blinding performance (for its day).
Ehhhhh.... I didn't care for it. A lot of the technical detail was kinda confused. It felt like someone talking about cars that isn't really into cars if you know what I mean.
Good work so far. I have one disagreement. Take macOS, still a Unix (and OpenSource since its beginning) certified system, being the successor to NextStep. And all other Apple OSes (iOS; iPad, WatchOS, tvOS and soon VisionOS) also share the same fundamentals, kernel and framework. Also, though seen as different, Linux is, on purpose, a close open source variant. Let's not argue about the variation of kernels... What I'm saying is that all the concepts of Unix are still widely present in all those Operating Systems. Unix has not fallen at all, it is the dominating philosophy of all modern operating systems. Windows is extremely marginal in fact, as all the operating systems I mentioned are present in many more devices than PCs. Think mobiles phones, access points, switches, routers, IoT, cars, etc.... Unix variants are really dominating.
I was going to make this comment if no one else did. Far from being on their way out, operating systems based on Unix or Unix-like actually are used in most servers, most mobile devices, and in desktop in the case of MacOS (and a little desktop Linux). Most routers, set top boxes etc. are usually based on Linux. People have been predicting the demise of unices for decades.
Depends on how you define the main concepts of Unix supposedly making up the dominating philosophy of all modern operating systems. NextStep and NT were pioneers of the object oriented approach which you may call central to modern systems. Unix had none of that. Its "philosophy" relied on creating virtual file systems to loosely mimic objects and parsing of output instead of the direct access to objects. Piping made "everything is file/text" an interesting concept, but PowerShell's object piping is even more elegant. You don't have to bother with "everything is file/text" when dealing with macos. Even in the Linux world POSIX utils are being given up as scripting tools for the sake of Python libraries and thus an object oriented approach. Virtual file systems are also becoming somewhat of a dead soil upon which everything is being actualy handeled via D-Bus and udev. Text configs are more and more an auto-generated sham. Some parts of POSIX stay kind of universally relevant, but only until C gets fully replaced by Rust in system programming.
one way to feel old: you see a video of a historic review of things that happened during your professional lifetime. I fondly remember playing with a multics system in the ep500 scouts Wednesday nights after school. It had variable length segments of memory and we wrote code for it in PL/1.
Multics had a far more advanced and user-friendly shell than UNIX. In fact, some of its features are just beginning to appear on UNIX via non-official shells and shell extensions. Multics command-line supported autocompletion, inline help, and execution confirmation. It also supported random accessible character coordinates on CRT terminals.
Correction: 'ex' was not WYSIWYG, but a line editor always displaying only the line currently edited in a file, just like 'ed' on which it was based. Billy Joy then used 'ex' as a base for his first 'vi' editor with 'vi' derived from 'visual' standing for the at that time pretty revolutionary new 'visual' mode that by using terminal commands let the user navigate through a text file on screen as we know it today. 'vim' came much later and isn't even based on the original 'vi' source code (according to Wikipedia its forefather was a vi clone called 'stevie').
There was also ED (pronounced "ee-dee") in linux, but also a simple text editor for DOS namee Ed (pronounced "ed"). Many CS students got the two confused and even thought they might be the same app ported to different OSes. Come to think of it DOS Ed may have come from CP/M! Thank goodness we moved on to "vi and emacs", and the editor wars broke out!😂
@@squirlmy most likely, the first version of MS-DOS was a CP/M clone that Microsoft bought. Gates had initially sent IBM reps to the creators of Dr DOS but the guy refused to sign IBM's NDA's so Bill Gates seized the opportunity the other guy discarded.
@@adamboggs4745I know few of the letter commands, but having first used Unix via a DECwriter (an upper-case-only output one), I know most of the ed commands, and generally only use them in vim. I know what will happen when I type :g/foo/s//bar/ Plus, using regular expressions makes it more powerful
This video helps me understand some things that happened in the software world while I was otherwise occupied working in OS/360, VMS, MS-DOS, Windows 95, and Windows NT environments. I hope there will be a next video, or maybe it's two videos, about the proprietary Unix wars and the subsequent rise of open source and Linux. One of the most fascinating things to hear in this video was that the Bell Labs inclination toward open source was the result of an antitrust action taken by government.
I had always wondered why ATT practically gave away Unix, initially not even copyrighting it, to Universities. I hadn't been aware of the consent decree, but that explains it pretty well.
The funny thing about this "anti-trust" agreement is that it was practically the reverse: the government gave a monopoly to AT&T in exchange for giving up something no one really valued.
It's a shame that the Government didn't break Microsoft up for their antitrust violations. They should have separated Microsoft into three separate businesses. OS, Networking and Applications. MS ability to embed and interlock all three facets of computing gave them massive power to manipulate the market for personal and later business computing. We're still suffering the consequences of that failure of our government to understand technology and be swayed by lobbying technocrats.
When I worked at AT&T in the 80's, I went to the Murray Hill Bell Labs facility for a day-long conference. At lunch, my wife (who was also an employee), one of my co-workers, Martin, and I decided to wander around looking for "famous people". We wandered into the lab in area 11 where Unix was born, and Dennis Ritchie was sitting at terminal typing. We told him we were there to see "famous people". He laughed and asked if we wanted to see the first Unix bug. When we said yes, he held up a glass jar containing a dead cockroach.
@@hello-cn5nh I can't prove it happened, since the only thing I took from that day was the memory. I worked at AT&T Communications for a few months, then got promoted to Bell Labs, where I was part of the C compiler team that developed the System V Release 4 compiler. I cowrote the link editor (ld), the dynamic linker (that loads shared objects into the kernel) and helped with the development of the original ELF specification that is still used in Linux. While at Bell Labs, I met Ken Thompson and Dennis Ritchie when they came over to our building (4 miles from Murray Hill) for a noon talk. I also regularly saw Bjarne Stroupstup in the hallway since he came over frequently to consult with the C++ team, although I never had much interaction with him. Believe my story or not, but it happened as I described.
@@norbert.kiszka the UNIX war of today is GNU / Linux versus everyone else, but mostly the last two standing, FreeBSD and illumos. Yes yes, NetBSD and OpenBSD still exist, but they are niche within a niche. I am happy that GNU / Linux is, slowly but steadily, losing its grip on the IT industry in favor of FreeBSD and especially illumos.
When I was in the navy all of the weapons and sonar systems ran on some flavor of Unix or Linux. Windows was just for email. This is still true today because of the flexibility it offers. In fact I'd say that the US military is probably the largest user of unix/Linux. It's not going away any time soon
As a fellow engineer, I got emotional from the efforts of programmers and engineers to provide foundation for the computer industries nowadays. Kids have no ideas what we went thru with the limited hardware back then.
Thanks for the nod to BCPL, the first language that I used in professional capacity back in 1981. It's a beautifully simple language that can work nicely where memory space is very limited.
BCPL - Ugh! Everything in the Amiga operating system was written in 'C', EXCEPT for the Disk Operating System, which was written in BCPL. This was because Commodore couldn't finish their planned DOS in time, so they bolted on a port of Metacomco TRIPOS, which was written in BCPL. It was difficult to get the two brain-halves to communicate, as they could not share Structs.
3:47 file system 5:27 PDP-7 Unix 6:17 PDP-11 roff typesetting 10:01 Berkeley Unix 14:00 summer 1982 Bill Joy SUN Microsystems 15:32 SCO Unix. 15:56 Microsoft Xenix [ran on TRS-80 Model II/12/16/6000 family]. NeXT NeXTSTEP [predecessor to macOS]
My father worked at Murray Hill. Around 1976, I joined a Boy Scout Explorer Post sponsored by the great Walter Brown. On Monday nights we would meet in the Bell Labs lobby and be escorted to the Unix development area at the Labs. There we learned the basics of shell programming, C and text formatting with roff and troff. Some of us printed out out high school term papers on the incredibly huge phototypesetting machine that resided near the Cray-1. I was lucky enough to be introduced to unix at its birth and made a 40+ year career out of it. UNIX will NEVER die!
Where I work there's a bell labs facility down the road , it's still open they work on cell phone tower equipment there . A few of my customers were engineers there for years .
Ahh yes, Bill Joy. Beyond the nerds and the hobbyists, he's criminally underrated among techies. It's a damn shame more people in the Google era of internet don't know about Sun as much as they do early Microsoft or Apple, maybe even Commodore and Atari.
Amazing video! Thank you so much, it dusted off a lot of cobwebs of memory in my mind. I worked at AT&T in the early 80s and witnessed many of these developments in the UNIX community firsthand. As someone previously said, Dennis Richie, and his contemporaries get so little recognition for their contribution to the modern world. As we used to say, with grep, sed, and awk, we can rule the world!
Unix is at the core of every Mac, iPhone, iPad, Apple Watch and AppleTV made today, ever since Apple bought Next in the ‘90s and released MacOS X (which was originally NextStep).
This begs for a part 2!! Part 2 should go from 1983 to about 1993. Part 3 should go from 1993 to 2003. Part 4 should go from 2003 at least until Android, until 2013. Part 5 would also be interesting going from 2013 to 2023....
Except that -Linux became much more influential in the 90s, particularly on internet servers. And Android came from a specific Linux distro(Gentoo?), not directly from Unix or any other similar systems (like BSD). Yes this could smoothly transition into the story of linux, but the stories of copyright Unix, Sun OS and BSD, even Xenix and Minix, are distinct stories of their own, and a decision whether to follow those other lineages would need to be made. And also how much would be dedicated to "Free Software" and "Open Source", as opposed to the more technical Operating System stories.
Yes, although the discussion about Android could demonstrate that while the *ux kernel is totally robust and versatile, from a user point of view the underlying Linux structure is totally insignificant and even worthless. Heck, my Android is so closed I can't even get my primary and most powerful tool, a shell.
Correction: Unix was not the first OS written primarily in a high level language. The first such OS is generally acknowledged to be MCP written in ESPOL in 1961. From what I can deduce, UNIX was originally implemented in assembly language and wasn’t re-implemented in C until 1973.
Some of the best software engineering I learned was learning to use the unix command lines. The way the tools are organized (The UNIX Way) are so incredibly powerful and such a great example of composition.
While other engineers in my profession struggled to learn the massive language Perl, I used a handful of Unix utilities strung together with pipes and committed to shell scripts. I could test and debug any utility I needed in usually less than 20 minutes while they were still fishing for the 6-inch thick Perl documentation. I never needed to learn Perl and never have. I am learning Python, just to keep current.
@@GH-oi2jfHow? I've been writing a shell and have find myself constantly migrating back to UNIX-y ways of doing things. So what exactly would be better?
This was great to watch! I was a student at UC Berkeley during the decade of the 70s, at a time when "computing in the humanities" was a thing. I was introduced to Unix when it was running on the VAX and had just started being available for student use. Good fun...
Decade of the 70s huh? So how many boxes of cards did you go through? lol In just 3 semesters of Fortran and COBOL I went through I cannot remember how many boxes of cards. Did you ever write programs to play tricks on the guys running the computer?
I still use nroff and troff, and while I'm still learning it, I use them very heavily to write documentation for any software I write, even if it's just a shell script, and then I package everything into an OS package. (n)roff has been called the assembler of typesetting, but it's really powerful. If you've ever used LaTeX, it's like that, and before you jump in, it is possible to both generate and embed images into manual pages. The man command simply won't display graphics if it detects that the target device doesn't support them; it truly is a document rendering engine, not just simple ASCII text formatter.
Thank you for producing this; it's a truly wonderful video. Like numerous others here, I've also forged a lengthy and remarkable career based on UNIX and C starting in the early 80’s, and still plugging away at it.
I was there from the late 70's. C and Unix in college at UNCC, SVR and BSDlater on paid my bills. The PDP/11 was my toy. After many years in Unix and C land, I ran the Software Labs for Tandem Computers in Austin many varieties of Unix hardware, Sun, SGI, others plus our own. Additionally the baby OS called Linux. Later on I was an initial founder of GST which brought me to Vancouver , Washington Managed a start up, GST. A CLEC and Internet Company. Unix and Data Center Operations Manager. We went bankrupt unfortunately... Got a job working on Disaster Recovery and Business Continuity for ACS, prime contractor for Nike. Sun, HP, Linux and Windows. I was Senior Data Analyst. Long time ago but one hell of a fun ride...
I think you can confirm what I think is urban legend. The Tandem owner was showcasing the Tandem Nonstop and shot at the computer with a gun and it kept working. Myth or True?
SCO, that's a name I haven't heard in some time. The history around the SCO-IBM lawsuits really got a lot of technology people interested in the law. I remember spending countless hours on Groklaw reading filings. There's a really good story for the channel in all that, both the lawsuits themselves and the way it drove a ton of changes to the tech world and people operating in it.
same, I recall something about them going after IBM, then some government "National Laboratories" for using Linux on some experimental supercomputers (o_O), then silence. I recall looking around to see what happened with the IBM case few years later, and I guess it was "compounded" by System V vs BSD code-sharing mess that goes way back to the dawn of pain. As for the rest, well, SCO Group mobility and Unix software assets were sold off in 2011, guess no one wanted anything to do with their alleged patent-troll lawsuit shenanigans. SCO was Renamed to The TSG Group then Chapter 7 bankruptcy in 2012. reads like a book/movie on how to burn your name and reputation instantly.
These were a fun trip down memory lane, and I learned a few tidbits I hadn't been aware of. I started as a UNIX System Admin in 1986, which was right at the end of this video, and experienced the entire UNIX Wars you documented very well in the next part. Thanks for this!
Fantastic episode Jon, already laying in a stock of popcorn for part 2 aka "Time to lawyer up!" Hoping to see a part 3 where some dude from Finland eats their lunch.
Awesome run through history. Brings back lots of good memories. For the text editors, there's 'ex', then 'vi', and then much later 'vim', which we all love.
Vi is still one of my favorite text editors of all time. It’s beautiful not to have to rely on a mouse to navigate a document, integrated reg expressions all operating on the most basic terminal, which is not sensitive to any network delays. Thoroughly enjoyed the history and some memories of my childhood. Can’t wait to hear the sequel.
It took me awhile to understand why I kept coming back to vi: built-in regex support on the command and simple searching. Very efficient. What's funny to me is that everyone I worked with knows a slightly different subset of vi commands. (TAMTOWTDI, as in Perl.)
@@davidcarson4421emacs was to replace vi, for easier usage and more options, and it was extensible. It is still popular in some areas, but to be honest, notepad++ is a worthy replacement. Emacs was also kind of an os in itself, you could do your email in emacs, ftp, debug c programs, etc. I once ported micro-emacs to a vax vms system because it did not have a decent editor. Today, we remember emacs as the start of the big wave of public domain open source applications and the copyleft licence
I started UNIX life at university in the mid 80s on a PDP, and today I use a Raspberry Pi as my home machine. It's remarkable both how much and how little has changed in 40 years; it's been bleeding edge every step of the way, but we never really stop and think about it.
Once again, thank you AsianOmetry ~ your clips are brilliant. This is probably the single best telling of the birth of UNIX I have seen. I note toward the end, you mentioned the Santa Cruise Operation. By about '93 or '94, Linux had been built (at least the earliest versions of it) and the huge SCO lawsuit was ongoing, consuming money at a rate like a national debt, and it would drag on for about 5 years or so. As a 1st year college level student of computers, nobody seemed able to explain in a way a normal person could understand, *WHY* that one legal battle over the copyrights to UNIX / Linux were so Earth-shaking important. Nearly 30 years later, and today I've got a pretty fair idea why the SCO battle was so important, but the industry at that time did a completely terrible job of explaining it, not only to Mum & Dad outsiders, but even college level computing students. The question was ~ Linux is Unix, even if it doesn't share identical source code. Ok, so if the source code is slightly different, even if it then performs the same tasks in the same way, can the commercial copyright owners, (Lining up behind SCO ~ who are carrying the Flame of Justice on this question) claim Linux is a copyright violation and therefore theft? There was no point trying to float Red Hat or any other Linux based business, until this matter was resolved and a precedent was set. The implication of this, would be Does Linux belong to people like Richard Stallman and Linus Torvalds, and the thousands of volunteer developers, or are they 'hackers' who should be jailed for forgery? The courts basically said Linux is free and should stay that way, but the SCO and the Lawyers, kept dragging it back in with 'new evidence' and new points and new arguments, and seemingly bottomless pockets. There was a very large amount of money at stake, and they were not about to simply give up & walk away ...
The reason is simple. Linux contained "stolen" Unix source code. Not saying it was done intentionally by those with authority over Linux. When so many people from so many organizations, or just plain individuals, contribute to something that large it's not surprising that things can creep in. The legal stuff stopped when it was ruled that SCO did not own the rights to Unix. They screwed up when acquiring Unix assets from Novell. Novell still owned the rights and THEY refused to uphold them. Not that Unix source wasn't in Linux. en.wikipedia.org/wiki/SCO-Linux_disputes
Unix wasn't the first OS written in a high level language: Burroughs MCP was written in Algol for their large systems, and Multics was written in PL/1.
@@JonathanMaddoxit was written in C. I still have C language manual in my home somewhere. Also, I still have Unix V manual too. Yes, I’m dating myself.
00:00:30 Creation of Unix as a versatile, cost-effective software platform. 00:03:46 Development of Unix's file system for hardware abstraction. 00:06:48 Unix's spread due to affordability, portability, and open-source nature. 00:12:34 Unix's pivotal role in the development of the Internet. 00:13:58 Transition of Unix from a hobby to a commercial industry. 00:16:06 Unix's significance in shaping the software industry's trajectory.
Yes. And everything works pretty much the same, from your raspberry pi HTPC over your mac laptop to the server in the company. Linux is for lazy people :)
@@Teluric2 Maybe not, but for most productivity tasks Mac OS blows Windows out of the water. I’m a writer, and no writer I know who runs Mac has lost work. The Windows based writers do have issues this way, not all but enough that there’s no damned way I’d trust Windows for anything other than gaming.
Thank you for sharing. I wonder how the YT algo figured out how to surface this vid for me. I contracted at AT&T in 1985 as a tech writer. One project we worked on was the Unix PC manufactured by Olivetti. Wedge shaped and an early version of marketecture in hardware. One of the many footnotes in the Unix story. Best unix command was "write" where you could blow away the text on someone else screen and send them into a panic.
The funny thing is, if a video is mostly watched on mobile and TVs, then the video is streamed to more unix-like systems than NT. I do hope linux systems will become the norm for average user, slowly but surely. Writing this from a linux system.
Statistically speaking, Windows is on a slow death curve; as Android, iOS, MacOS, datacenter Linux, and academic Chromebooks slowly increase their domination. All office-suites now firmly in the cloud, video games are the last good reason to run Windows.
Not really in the way that most Linux users like, but Linux already has conquered the computer world. 80% of phone users use Android, an overwhelming amount of smart TVs run Android or another Linux-based OS, many schools are now using Chromebooks and an overwhelming amount of web servers run Linux. The only common device that doesn't is the PC, in which OS usage is largely superficial, basically a tool to use the web browser.
@@truejimAnd even gaming is becoming less attractive for Windows with the advent of Proton. Windows is on life support, Linux just needs hardware with it preinstalled to reach out to normies
@@mgord9518 And Visual Basic was once the worlds most dominant programming environment. Linux is very popular at the moment but I wouldn't say conquered. Much of that popularity is under the covers embedded within a device or serving web pages. Not recognized by users. Very easily replaced in the future by the next hot mess. BTW, Apple products are certainly common devices and they don't run Linux.
The part about Pascal being part of Bill Joys work and Unix is new to me. Pascal came from Niklaus Wirth. I used Pascal on CP/M and C on BSD 4.2 but I didn’t know there was Pascal on Unix other than the ones derived from Wirth’s work.
1:35 Virtual memory, in the context of modern operating system, means the mechanism that OS configures the mapping between virtual address and physical address where some parts of virtual address might not be mapped to physical address, so user programs can access memory like there's its own full e.g. 4GB address space. If the user programs access virtual addresses that's not mapped to physical address, OS would run "page fault handler" which tries to find free physical memories and fills the missing virtual-physical address mapping, then returns to the user program. In page fault handler, if there's no free physical memory, OS might delete an existing virtual-physical mapping, copy the content of the physical memory to disk, and create a new virtual-physical mapping. If the user program accesses a virtual address whose mapping to physical address was deleted previously, page fault handler starts again and copy the memory content back from the disk. This is called memory swapping, and sometimes called "virtual memory" (like the setting in Windows). This is the explanation of today's "virtual memory", but I don't know the concepts of Multics.
Due to the mentioned NextStep computer, every Mac since about 2000 is Unix at heart and since 2007 : UNIX 03 compliant. Programmers use a lot of that under the hood Unix to do a lot of tasks too. I'm not sure how much "Unix" is in iPhone and other non-Mac products but I'd assume: a lot - though not certified to UNIX-03. So not sure how much "falling" has occurred here.
macos is basically freebsd with an apple userland, altho nearly all of the basic freebsd commands/services are still present so it is mostly quite easy to port apps to it from other unix versions
@@brucebecker7097Pretty sure it's the opposite. macOS is freeBSD userland with an Apple kernel (and the aforementioned macOS GUI tools slapped on top)
Great video, brings back memories. I started working on Unix in the mid 1980s, I was still coding for it when I retired in 2020 and now I use it on my personal Raspberry Pi projects. I still use vi/vim 🙂
Thanks for sharing! My heros were those Silicon Graphics and Sun guys which rolled out fantastic workstations to Universities. I loved those Sparcstations and even got one for home use in the 90ties on which I developed Opensource stuff in C for X11. But one more thing: In web3 & blockchain nowadays, I start to feel the sames exciting distributed client/server computing spirit again. Very pioneering scientific opensource stuff around. Let's go for it and write fantastic web3 code on Linux workstations. I also see that RISC processors starting to have their comeback in a more competitive way. Very promising future.
Was an early member of Hydra Computer Systems, parallelized 4.2 and sysv unix on a multiprocessor (shipped 20 cpu system in 1985). Sweet speedup, could “gang” schedule cpus to applications, etc. First cpu family was Nat Semi 32k series, later moved to 88k. Also delivered the Annex terminal server, later sold (w engineers) to xylogics. Best 8 years of my working life (Hydra was a wholly owned sub of Encore Computer Systems).
Encore was started by Ken Fisher, Gordon Bell, and Henry Burkhardt, (prime, dec and data general respectively). They wanted to buy a number of startups and release a family of products simultaneously. Add “resolution”, a multi processor workstation, and a software company cant recall the name). Encore would cover all mfg and service, with the subs being independent.
Fascinating! It sounds like this work ought to be more widely known. The CPU choices were a bit unfortunate, given the way everything played out, though.
That AT&T story makes me wonder what sort of utopia we would live in if companies were still regulated today, rather than let the companies control government. It almost looks like the entire idea of open-source was almost inspired partly by that AT&T decree. Incredible what great things regulation can do for the world!
Recently I've gotten much more invested in Linux. This led me to going through quite a few videos covering the history of Unix and Linux. Great to see you cover this topic! Part of the reason I've gotten much more involved with Linux is I'm in the middle of going through a computer engineering program. Though I'm still a freshman you can't ignore Linux if you want to become a computer engineer.
It's so weird to think there was a time where there weren't "files" as we know them today. It's hard to even comprehend how many layers of innovation have led to the computers and software we now have.
After I spent some time working with z/OS and its so-called "data set" - which is essentially a file that contains one or more records - I finally understood why the invention of the concept of "files" was so revolutionary. Computing before UNIX is so... weird, is all that I can say.
"Creating files is so easy in UNIX systems that the only standard shared by all of them is the System Administrator's MOTD telling users to clean up their files."
This is seriously like the Revolutions podcast by Mike Duncan . Which is to say it was great. I like my history with some facts, a bunch of context, and just a bit of humor - all supporting a reasonable argument. Very well done.
Charles (Chuck) Haley was an incredible linebacker for the 49ers, and, more importantly, the Dallas Cowboys during their 90s Dynasty years. So nice to see a video featuring him. :)
12:34 I see where you’re coming from. The use of the term “lame categorization” to describe the VAX-11/780 as a “minicomputer” or “superminicomputer” does seem dismissive, particularly given its significance in computing history. DEC’s VAX-11/780 was a groundbreaking system that straddled the line between smaller, more affordable systems like PDP-11s and the traditionally much larger and costlier mainframes, carving out a unique market niche. However, as you suggest, IBM mainframes of the era had significant architectural advantages that the VAX could not match-particularly the use of dedicated I/O channels (which offloaded I/O operations from the CPU) and their ability to handle massively parallel workloads in a way that even DEC’s most advanced systems couldn’t. These features underscore why IBM mainframes occupied an entirely different tier of enterprise computing. Calling the categorization “lame” might be an oversimplification or misunderstanding of the VAX’s place in the computing hierarchy. It ignores how the VAX essentially defined the minicomputer market in its era while providing some features (like virtual memory) that were competitive with larger systems. At the same time, it doesn’t account for the stark differences in design philosophy and capabilities between the VAX and true mainframes like the IBM System/370 or 3081. The VAX deserves respect for democratizing computing power and pushing technological boundaries for smaller organizations, but it would be inappropriate to lump it in with mainframes, given the latter’s scale and design focus. It seems the “lame” critique might oversimplify a much more nuanced comparison.
It’s odd to title this as UNIX “falling” when it’s now at the core of a huge percentage of our devices, and UNIX-like OSes making up a significant share of the remainder. If anything, UNIX won. (I mean, literally 100% of our mobile phones and tablets use a UNIX or UNIX-like OS.)
@@don_n5skt Absolutely. And of the remainder, a significant portion is Apple products, which all run UNIXey OSes. (Heck, even some Apple _accessories,_ like the Lightning to HDMI adapter, actually have a stripped-down version of the XNU kernel used in iOS!) And in the case of macOS, it’s literally UNIX certified, so it’s not just UNIX-like, it _is_ officially a UNIX.
Well, NO phones or tablets use copyrighted Unix!!! While Linux distros (including Android with a Linux kernel) run all those things, the Open Source Linux kernel makes all of those distinct from Unix. In that sense, Unix is pretty much dead. If you want to include BSDs and other Uni alike OSes, you could call them un*x, but that essential feature of Open Source/Libre code makes that quite distinct!!!
@@squirlmy macOS, HP-UX, and AIX are all certified “real” UNIX. iOS doesn’t have the certification, but is extremely closely related under the hood. Regardless, I said “UNIX or UNIX-like”, the latter encompassing Linux and Android.
@@TheOwlGuy777 If you mean that Unix wouldn't be a footnote because of Apple - I say ehh. By the time OSX came out in 2001 the tech field already knew what Unix was and Linux was already well established by that point. The rest of the Unix world was going nowhere compared to what was going on in the Windows world (except MAYBE IBM but only because they had the install base). Sun was perpetually on life support, HP-UX was hot garbage, Cray and SGI were niche.
@@TheOwlGuy777 Despite NextStep's Unix heritage, most of OSX's SUS (Single Unix Specification) compliance, actually comes from FreeBSD code. If it weren't for FreeBSD, I highly doubt Apple would have put the necessary time, effort, or money into achieving Unix compliance without it. So, the OP's comment is still quite accurate even if you want to include Apple, which their products are technically Unix, but it's really more of a side note.
@@TheOwlGuy777Were it not for the massive success of Linux by that time, Apple likely would never have looked at a Unix anyway. They only chose BSD because they could keep it proprietary.
11:15 "vim, a text editor that some people like" it's still super usable for 99% of quickie-fu editing needs in 2023. No nonsense and quick to do the most common things.
Interesting times. I was at Berkeley during the construction of the new EECS building. They were in Evans Hall before the move. Sharing space with mathematicians. Back when housing was cheap and computers were expensive.
I had to use a Sun workstation at one time in my career. Their mantra was “the network is the computer.” What that meant was that persons on other workstations could do things to mess with your workstation as a practical joke, and that did happen. I did like two things about it. It used “Life” as a screen saver, which was a good choice. It had a chess program which I found possible to beat, occasionally, so just hard enough to be challenging, not too hard to be pointless playing it.
@@GH-oi2jf That mantra was stolen from DEC. DEC's minicomputers did WAN and LAN networking long before Sun even existed. And then there is VMS clustering, never equalled.
We had 4 Sun workstations shared between a dozen developers. When a colleague got a new job I told him that "it wasn't so much like losing a colleague as gaining a Sun".
The term Virtual Memory for Swap Space isnt quite right. You can blame Microsoft for that. Virtual Memory is just a virtual address space. On 32bit systems it was 4gb, on 64bit i think it goes up to several exabytes but I believe on x86 64 it's capped at some reasonable limit. Anyway the system VM subsystem manages how this is allocated and used, and it can be run on top of real physical memory, a swap space on disk allocated to supplement the systems total memory, or even completely on disk, though it would be so slow you never would want to run everything from a disk memory even if the OS let you. For a while there was some experimental support in Widows and I think Linux that would allow you to also use a swapfile on flash whenever a thumb drive was plugged in to dynamically supplement the total conventional memory the system could shove the prcoesses around in, but nand and even early ssd speeds made this not worth it and you get other problems anyway like cell wear if you have a swapfile or swap partition on ssd. I don't run one even though most modern Linux distros still try to insist you need one.
@@tracyrreedAh ok so it's a VMS carry over that Cutler and his team did. That makes sense since they all came from DEC and would use similar nomenclature.
Exacly that. The CPUs sees only the virtual address space. Swap is just a region of a block device that is mapped into the virtual address space. Whenever data is accessed, a page fault occurs, prompting the memory management to copy the data from block storage into the actual physical memory. Swap space is generally not needed*, it's basically and very slow cheap memory - sometimes people don't want to pay for RAM, sometimes you cannot install more. I also never understood why some OS or applications insist on having swap. It is usually never explained. Other useful mechanism to safe memory are memory overcommit (software developers tend to allocate huge amounts of memory they never use), memory compression and memory deduplication (available on select hardware plattforms, does great with plenty of similar workloads and small memory pages). * there are a few platform dependent exceptions. E.g. part of the kernel dump mechanism, or linux uses the swap mechanism to compress memory into a ramdisk. Edit: Fun fact: 32bit x86 could address more than 4GB total with the PAE feature. If you use Microsoft, you had to pay an extra license (usually a more expensive version of the OS) to use it. Software developers, especially those who clinged to 32bit binaries in the 64bit era, were often totally inept in using more than 4GB of RAM by spawning multiple processes and using IPC mechanism to shift data around. They often spectacularly went OOM instead. Edit: Fun fact 2: If you have Windows and an nVidia GPU, the OS maps the VRAM into the virtual address space WITHOUT increasing it's size. So the total VM available is less than the physical RAM + VRAM, leading to OOM situations when the GPU's VRAM is full, even when physical RAM is available. This is one of the cases where you might add Swap as a hack to increase the VM size in modern windows.
I started programming VAX computers using FORTRAN to support CAD applications back in 1984. I switched to a Unix based system in the early 90s. That was a very dynamic time in the world of computing.
I hate to be THAT guy, but as a former Teaching Assistant who taught Pascal, it’s pronounced “pass-CAL” (like CALifornia), not “PASS-kull.” Other than that tiny quibble, great video!
I've never heard important this tale told with such clarity and focus. This would make a good chapter for a book about computers on top of the Bletchly Park story.
In 1980 I became part of an engineering project to allow MCI Communications Corp to rapidly ramp competiveness in the long distance switching telephone biz vs AT&T. Our s/w was developed on a PDP-1170 running UNIX, with Bell Labs just a few miles away. Small ironic world.
I love the stories you write!! I wonder when you sleep. Or if you even do?? I can't kick more than a lyric out in a day and you drop this fascinating shit all the time.
Long time ago did entry level COBOL (followed by C) course. 14 students on green screen terminals, all hooked up to a single 386SX with 2GB RAM. The OS was Xenix. The Editor was Vi. The company was The Kalamzoo Computer Company who leased an office in State House down Dale St in Liverpool. If anyone else went to that course or worked there would love to hear from you, especially Julia, Andy or Stuey!
I'm looking forward to the next video, where it really kicks off. I had a job porting software to all the different UNIXs and keeping track of the differences was quite a challenge. I wonder how many names I can recall? BSD 4.1, 4.2 SVR 5 System 7 Sinix A/UX DG/UX AIX Ultrix Xenix (shudder) All with their own little idiosyncrasies. Ever had a compiler throw a "Too many shapes" error at you? Fun times.
This is history I like. I’m an electrician, never really into I.T., but I like history. I was my understanding that many OS’s were of Unix based, but never really dug into who started it all. Good vid man
I would argue that it didn't need _commercialisation_. It needed a formal persistent organisation to coordinate work, make and distribute updates, and provide support. Commercialisation was detrimental because it resulted in proprietary incompatible versions, closed shops, and high costs.
"Vim, a text editor, that some people like" this is the kind of diplomatic phrase that starts world wars. Wonderfully stated.
I've been looking for this comment. This guy not only has a great sense of humour, but he does his research. Either that, or he's a lot older than he sounds.
😆
In the context of Unix, it really should be "vi" not "vim". I remember taking the source for it and porting it other operating systems, notably OS/9 a RTOS that I don't people even remember.
vi, emacs, nano, yup, all starts by changing the default editor.
@@michaelhoffmann2891 Remember it, I've never heard of it. I stopped at OS/2. 😉
Denis Ritchie is an underappreciated legend of IT history, nice to hear his name mentioned here. He and Jobs passed away on the same month and of course Jobs got all the mentions and credits while Ritchie was barely mentioned at all. Thanks for this.
Within the IT community however, Ritchie and Thompson were at least as famous as Steve Jobs, if not more. Whereas Jobs (along with Wozniak) was most of the time acknowledged for the Apple I and II computer and the introduction of GUIs with Lisa and the Macintosh, Ritchie and Thomson were famous for both Unix and C, and during my Linux phase beginning with its appearance in 1992 all through the nineties I never met someone in that community who hadn't at least one book of those guys, mostly "The C programming language.". Only a few other guys were that popular: Donald Knuth, Bjarne Stroustrop, Linus Torvalds and Niklaus Wirth (for Pascal and Modula). And maybe Weizenbaum for Eliza and making us rethink our stance on computer technology whenever the consequences of something new in the IT business became apparent. We invited him once to a conference on data security in Kiel and he gave us a great talk on the ethics of IT engineering.
@@olafschluter706 I like this comment so much I read it twice lol... thanks for the further insight
I've been using C my entire career and I owe it all to him.
I felt the same, Jobs was overated
People keep harping on this like they are the same. They could not be more different. Unix was 60's technology that had no relevance anymore after 2000 for anyone but IT nerds. Jobs successfully built and marketed products (not based on Unix) that STILL have relevance today. For literally billions of people. If you want to complain about people not getting recognition, maybe you should focus on Linus Torvalds that stole Unix's thunder by basically porting it to X86 and (initially) adding nothing to it himself?
Google then used the Linux kernel and stole the Java API to build their iOS copy, Android. But that is another story...
I appreciate that you compared the cost of computers to the cost of a graduate student - that is an accurate way to depict what that relationship looks like
Cars would be another option, you could get a new commuter car for $2000, a Cadillac was about $6000.
Now the computers cost nothing, the software costs nothing, IT professionals are expensive as hell and companies pay so much money in opportunity costs to force people to use Windows in some weird power trip.
@@svr5423
The Windows thing is about HR. In most companies regular users outnumber IT by 100:1 and since most applicants only have experiance with Windows it is easier to use Windows then to retrain every new hire. (I know is seems trivial but a lot of worker drones are... a bit slow to figure things out.)
This was (and is) a big part of MS strategy when it came to dumping OEM windows into every home-computer on the market. I mean OEM windows licenses have always been dirt cheap, they weren't trying to make much on it directly but they know such market coverage locks in third party software and enterprise customers who then slowly get sucked into a walled garden of exchange servers whatnot that is needed to support all of the Windows nodes. MS also dumps cheap OS licenses on schools and colleges for the same reason.
@@mytech6779 My 85 Macintosh Lisa was ~$5500 brand new, bought by the Art Institute Of Chicago, and was replaced in 1987, where I bought it for a mere $200! Graphics editing was still just starting to get capable, and since they were a well funded institution I guess they wanted the best of the best, and grew out of it pretty darned quick. I guess Apple didn't take the old ones back to refurbish them, and the School could just write them off and give them them to the students, one who resold it at a profit to me!
@@liam3284 Even consumer grade software is not cheap, and all of that subscription stuff costs even more in the long run, by "costs nothing" I'm sure they meant FOSS software Like what I run: I ditched windows ~15 years ago, and haven't touched it again since. I'm on Arch Linux and use all FOSS software, and haven't paid for software in the same time either, I just donate to the developers of often used and highly appreciated software's I use here and there, and take none of the abuse I would by using anything from Microsoft, where you hand over near total control of your PC and everything on it to them and they spy on and track you, and are a prime target for hackers, scammers... and you pay them to do so instead of them paying you! It's like being a pimps bitch!
Asianometry is the very paradigm of what a You Tube channel should be. Really top quality content. Thanks again for another excellent essay on computing history.
As a fledging EE in 1974, I was fortunate to join Digital in Maynard where I worked with the PDP-11 team. I enjoyed the vintage photos of the hardware and of some of my colleagues that I enjoyed working with. I left Digital to become a software engineer in 1982, joining a Pittsburgh startup called "3 Rivers Computer Corporation". Our hardware was an innovative workstation called the PeRQ. Gordon Bell joined us briefly after leaving Digital.
One of our major projects at 3RCC, done under contract in Edinburgh for ICL (a major UK supplier at the time), was a port of Unix using "C-Codes". Our hardware was a bytecode machine (very fast for its day) and we developed a bytecode interpreter for C. We used a portable assembler to generate C-Codes from standard C. That let us run Unix on our hardware.
This piece reminded me how much fun we had in the early years of the workstation world.
OMG The PERQ! Didn't ICL market it? That machine itself was absolutely legendary and deserves its own episode!
@@lenkapenka6976 : Yes indeed, the machine was marketed by ICL. Thankfully, I had nothing to do with any of that. My job was to make the Unix port run. I loved the architecture of the machine. It was basically a clone of the Xerox Alto (one of the founders of TRCC worked with the Alto at PARC). The internal working name of the TRCC machine that became the PeRQ -- even when I got there -- was "Pascalto". It was an Alto with microcode optimized to run Pascal bytecodes. The PeRQ shipped with an OS written in Pascal.
I had just been introduced to Smalltalk at the time and several of us also worked on a very nice Smalltalk port for the machine. The bytecode architecture of the machine an excellent fit with Smalltalk. The WCS meant that the entire "rasterop/bitblt" behavior from the Alto worked just fine on the PeRQ (with the able assistance of at least one ex-Parc contributor who was intimately familiar with that particular code).
For those who don't know, the PeRQ was built around the AMD 2901 bitslice chips (very advanced for its day) and had a writable control store (WCS). That meant that that if the developers knew that a particular task warranted, then a specific bytecode interpreter optimized for that task could be pushed into the WCS while the task was active and popped upon completion. The original task could then continue with the original bytecode interpreter in the WCS.
The result was machine with absolutely blinding performance (for its day).
lost it at "vim - a text editor some people like" 🤣
same lol
μEmacs is The Best! Just ask my old buddy, Linus.
I came here to say this.
That was peak dry humor.
So people like moving the cursor with "hjkl" or "arrows," LOL.
First video of the year and already hitting it out of the park. Great video.
Ehhhhh.... I didn't care for it. A lot of the technical detail was kinda confused. It felt like someone talking about cars that isn't really into cars if you know what I mean.
Good work so far. I have one disagreement. Take macOS, still a Unix (and OpenSource since its beginning) certified system, being the successor to NextStep. And all other Apple OSes (iOS; iPad, WatchOS, tvOS and soon VisionOS) also share the same fundamentals, kernel and framework. Also, though seen as different, Linux is, on purpose, a close open source variant. Let's not argue about the variation of kernels... What I'm saying is that all the concepts of Unix are still widely present in all those Operating Systems. Unix has not fallen at all, it is the dominating philosophy of all modern operating systems. Windows is extremely marginal in fact, as all the operating systems I mentioned are present in many more devices than PCs. Think mobiles phones, access points, switches, routers, IoT, cars, etc.... Unix variants are really dominating.
Exactly. Even Microsoft Windows draws from Unix both in terms of philosophy and code. Windows networking code originally came from BSD.
@@roberttbrockway Microsoft only forgot what strong process control meant in the first place :)
I was going to make this comment if no one else did. Far from being on their way out, operating systems based on Unix or Unix-like actually are used in most servers, most mobile devices, and in desktop in the case of MacOS (and a little desktop Linux). Most routers, set top boxes etc. are usually based on Linux. People have been predicting the demise of unices for decades.
Depends on how you define the main concepts of Unix supposedly making up the dominating philosophy of all modern operating systems.
NextStep and NT were pioneers of the object oriented approach which you may call central to modern systems.
Unix had none of that. Its "philosophy" relied on creating virtual file systems to loosely mimic objects and parsing of output instead of the direct access to objects.
Piping made "everything is file/text" an interesting concept, but PowerShell's object piping is even more elegant.
You don't have to bother with "everything is file/text" when dealing with macos.
Even in the Linux world POSIX utils are being given up as scripting tools for the sake of Python libraries and thus an object oriented approach. Virtual file systems are also becoming somewhat of a dead soil upon which everything is being actualy handeled via D-Bus and udev. Text configs are more and more an auto-generated sham.
Some parts of POSIX stay kind of universally relevant, but only until C gets fully replaced by Rust in system programming.
@@eugrusOf course, PowerShell was written by the author of bash.
one way to feel old: you see a video of a historic review of things that happened during your professional lifetime. I fondly remember playing with a multics system in the ep500 scouts Wednesday nights after school. It had variable length segments of memory and we wrote code for it in PL/1.
Ey, as a young kid in this day, I've no doubt I'll feel the same looking back at what histories are written of today. ;p
What were you doing/what do you do now, was it just academia stuffs?
Wow, I always thought PL/1 was an IBM invention for their OS 360 systems. Filling in some gaps in my historical knowledge, which started in 1980.
Multics had a far more advanced and user-friendly shell than UNIX. In fact, some of its features are just beginning to appear on UNIX via non-official shells and shell extensions. Multics command-line supported autocompletion, inline help, and execution confirmation. It also supported random accessible character coordinates on CRT terminals.
As a young guy I find it amazing that OGs like you are still around. It's like you were there when the universe came into existence ...
Correction: 'ex' was not WYSIWYG, but a line editor always displaying only the line currently edited in a file, just like 'ed' on which it was based.
Billy Joy then used 'ex' as a base for his first 'vi' editor with 'vi' derived from 'visual' standing for the at that time pretty revolutionary new 'visual' mode that by using terminal commands let the user navigate through a text file on screen as we know it today.
'vim' came much later and isn't even based on the original 'vi' source code (according to Wikipedia its forefather was a vi clone called 'stevie').
Most people don't realize the colon commands in vim have roots in the early line editors.
There was also ED (pronounced "ee-dee") in linux, but also a simple text editor for DOS namee Ed (pronounced "ed"). Many CS students got the two confused and even thought they might be the same app ported to different OSes. Come to think of it DOS Ed may have come from CP/M! Thank goodness we moved on to "vi and emacs", and the editor wars broke out!😂
@@squirlmy most likely, the first version of MS-DOS was a CP/M clone that Microsoft bought. Gates had initially sent IBM reps to the creators of Dr DOS but the guy refused to sign IBM's NDA's so Bill Gates seized the opportunity the other guy discarded.
@@adamboggs4745I know few of the letter commands, but having first used Unix via a DECwriter (an upper-case-only output one), I know most of the ed commands, and generally only use them in vim. I know what will happen when I type :g/foo/s//bar/
Plus, using regular expressions makes it more powerful
@@adamboggs4745 I would like to state this even stronger, some vi commands ARE ed commands. Like :1,7 s/foo/bar/
This video helps me understand some things that happened in the software world while I was otherwise occupied working in OS/360, VMS, MS-DOS, Windows 95, and Windows NT environments. I hope there will be a next video, or maybe it's two videos, about the proprietary Unix wars and the subsequent rise of open source and Linux. One of the most fascinating things to hear in this video was that the Bell Labs inclination toward open source was the result of an antitrust action taken by government.
I had always wondered why ATT practically gave away Unix, initially not even copyrighting it, to Universities. I hadn't been aware of the consent decree, but that explains it pretty well.
Back when antitrust was still taken seriously, the public reaped the benefit. There's a lesson here for today.
The funny thing about this "anti-trust" agreement is that it was practically the reverse: the government gave a monopoly to AT&T in exchange for giving up something no one really valued.
@@vulpo Bell Labs also basically gave away the transistor.
It's a shame that the Government didn't break Microsoft up for their antitrust violations. They should have separated Microsoft into three separate businesses. OS, Networking and Applications. MS ability to embed and interlock all three facets of computing gave them massive power to manipulate the market for personal and later business computing. We're still suffering the consequences of that failure of our government to understand technology and be swayed by lobbying technocrats.
When I worked at AT&T in the 80's, I went to the Murray Hill Bell Labs facility for a day-long conference. At lunch, my wife (who was also an employee), one of my co-workers, Martin, and I decided to wander around looking for "famous people". We wandered into the lab in area 11 where Unix was born, and Dennis Ritchie was sitting at terminal typing. We told him we were there to see "famous people". He laughed and asked if we wanted to see the first Unix bug. When we said yes, he held up a glass jar containing a dead cockroach.
lol great story!
Yes, some of those Bell Labs people had a warped sense of humor! 😀
Cool story bruh. Evidence or it never happened.
@@BrettMonet sure. If you're not a skeptic. Where's the evidence any of this happened?
A skeptic would never believe a story like this.
@@hello-cn5nh I can't prove it happened, since the only thing I took from that day was the memory. I worked at AT&T Communications for a few months, then got promoted to Bell Labs, where I was part of the C compiler team that developed the System V Release 4 compiler. I cowrote the link editor (ld), the dynamic linker (that loads shared objects into the kernel) and helped with the development of the original ELF specification that is still used in Linux. While at Bell Labs, I met Ken Thompson and Dennis Ritchie when they came over to our building (4 miles from Murray Hill) for a noon talk. I also regularly saw Bjarne Stroupstup in the hallway since he came over frequently to consult with the C++ team, although I never had much interaction with him. Believe my story or not, but it happened as I described.
To think, Bell Labs nearly had a monopoly on one of the most important computer systems in the known world.
However, it wouldn't have gone on to become big if it had been jealously guarded the way all the other systems had been.
Thank God for anti-monopoly laws and governments that actively break up monopolies. We should have more of those.
@TheEvertw The same government: gives legal monopoly over telephone system to AT&T.
if Unix were not open, i doubt that it would become so popular.
The same thing is happening right now wlith car electronics systems. Proprietary head to toe.
@@TheGreatAtariozealot=/=jealous
Fascinating story, well presented. Thank you.
Congratulations on posting the first comment, dated two months before the video was published on UA-cam!
Saw that too.. strange@@cdl0
@@cdl0 how is this even possible
@@Finito54ify sponsoring the channel? Patreon or smting.
Patreon members get early access
haha "and pluto" nice clarification there.
I hope we get to listen to the Unix Wars soon. Thanks for the great production!
There are *nix wars today. FreeBSD, OpenBSD, Linux, Android, Chrome OS. Did I missed something?
The OG wars of BSD vs SysV and the later SCO shenanigans against Linux.
@@norbert.kiszka the UNIX war of today is GNU / Linux versus everyone else, but mostly the last two standing, FreeBSD and illumos. Yes yes, NetBSD and OpenBSD still exist, but they are niche within a niche. I am happy that GNU / Linux is, slowly but steadily, losing its grip on the IT industry in favor of FreeBSD and especially illumos.
Looking forward to a part 2, which is implied by the conclusion.
When I was in the navy all of the weapons and sonar systems ran on some flavor of Unix or Linux. Windows was just for email. This is still true today because of the flexibility it offers. In fact I'd say that the US military is probably the largest user of unix/Linux. It's not going away any time soon
All those playstations as well.
@@renaissancechambaraevery thing that's not windows is probably linux
@@metamorphis7reeeeeeeeeeeee os2 and BeOS and qnx exist
@@renaissancechambaratoo bad more and more of the internet is being centralized into just, facebook
@@cc-dtvno one care about facebook
As a fellow engineer, I got emotional from the efforts of programmers and engineers to provide foundation for the computer industries nowadays. Kids have no ideas what we went thru with the limited hardware back then.
Yup. You had to have a bit of brainpower to be able to use a computer.
Thanks for the nod to BCPL, the first language that I used in professional capacity back in 1981. It's a beautifully simple language that can work nicely where memory space is very limited.
BCPL - Ugh! Everything in the Amiga operating system was written in 'C', EXCEPT for the Disk Operating System, which was written in BCPL. This was because Commodore couldn't finish their planned DOS in time, so they bolted on a port of Metacomco TRIPOS, which was written in BCPL.
It was difficult to get the two brain-halves to communicate, as they could not share Structs.
3:47 file system 5:27 PDP-7 Unix 6:17 PDP-11 roff typesetting 10:01 Berkeley Unix 14:00 summer 1982 Bill Joy SUN Microsystems 15:32 SCO Unix. 15:56 Microsoft Xenix [ran on TRS-80 Model II/12/16/6000 family]. NeXT NeXTSTEP [predecessor to macOS]
My father worked at Murray Hill. Around 1976, I joined a Boy Scout Explorer Post sponsored by the great Walter Brown. On Monday nights we would meet in the Bell Labs lobby and be escorted to the Unix development area at the Labs. There we learned the basics of shell programming, C and text formatting with roff and troff. Some of us printed out out high school term papers on the incredibly huge phototypesetting machine that resided near the Cray-1.
I was lucky enough to be introduced to unix at its birth and made a 40+ year career out of it. UNIX will NEVER die!
Very jealous!
Where I work there's a bell labs facility down the road , it's still open they work on cell phone tower equipment there . A few of my customers were engineers there for years .
I'm sure that Cray 1 would've been a great toy to tinker with!
It already has died. No one has used it for decades.
Ahh yes, Bill Joy. Beyond the nerds and the hobbyists, he's criminally underrated among techies. It's a damn shame more people in the Google era of internet don't know about Sun as much as they do early Microsoft or Apple, maybe even Commodore and Atari.
As a former Java developer I am painfully aware of Sun’s existence
Thats because Oracle destroyed a good thing like all other tech billionaires.
@@jshowao I was a Sun for 20 years and sadly, in some ways, Sun destroyed itself first. Oracle just twisted the knife.
I worked at Sun for 20 years. When we were on the rise, it was an amazing place to work. You'll never see anything like that again.
Bill Joy is a certifiable genius.
Amazing video! Thank you so much, it dusted off a lot of cobwebs of memory in my mind. I worked at AT&T in the early 80s and witnessed many of these developments in the UNIX community firsthand. As someone previously said, Dennis Richie, and his contemporaries get so little recognition for their contribution to the modern world. As we used to say, with grep, sed, and awk, we can rule the world!
Unix is at the core of every Mac, iPhone, iPad, Apple Watch and AppleTV made today, ever since Apple bought Next in the ‘90s and released MacOS X (which was originally NextStep).
Yes, and Leia is Luke's sister.
Just before was BeOS, ran on 603e processors. I have an unopened copy..
Technically yes but the kernel name is XNU (X is not Unix) 😂
Posix…not unix.
Android is also Linux kernel based.
This begs for a part 2!!
Part 2 should go from 1983 to about 1993. Part 3 should go from 1993 to 2003. Part 4 should go from 2003 at least until Android, until 2013. Part 5 would also be interesting going from 2013 to 2023....
dude is obsessed with body parts
Don't overlook Minix (Mini-unix) from late 1980's and Linux from the early 1990's as microchips started to reach the masses.
Except that -Linux became much more influential in the 90s, particularly on internet servers. And Android came from a specific Linux distro(Gentoo?), not directly from Unix or any other similar systems (like BSD). Yes this could smoothly transition into the story of linux, but the stories of copyright Unix, Sun OS and BSD, even Xenix and Minix, are distinct stories of their own, and a decision whether to follow those other lineages would need to be made. And also how much would be dedicated to "Free Software" and "Open Source", as opposed to the more technical Operating System stories.
Yes, although the discussion about Android could demonstrate that while the *ux kernel is totally robust and versatile, from a user point of view the underlying Linux structure is totally insignificant and even worthless. Heck, my Android is so closed I can't even get my primary and most powerful tool, a shell.
@@AerialWaviator
Agreed!!
Correction: Unix was not the first OS written primarily in a high level language. The first such OS is generally acknowledged to be MCP written in ESPOL in 1961.
From what I can deduce, UNIX was originally implemented in assembly language and wasn’t re-implemented in C until 1973.
Or at least the kernel and the base utilities were originally written in ASM. The rewrite in C was a stroke of genius.
@@n7ekg MSDOS 2.1 was written in ASM. MSDOS 3.0 was written in C. It did not help.
@@AllenMorris3 3.11 was... "ok"
Some of the best software engineering I learned was learning to use the unix command lines. The way the tools are organized (The UNIX Way) are so incredibly powerful and such a great example of composition.
Odd. My opinion is that the command line interface to Unix is very poorly designed from a human factors standpoint.
its designed for programmers not humans@@GH-oi2jf
While other engineers in my profession struggled to learn the massive language Perl, I used a handful of Unix utilities strung together with pipes and committed to shell scripts. I could test and debug any utility I needed in usually less than 20 minutes while they were still fishing for the 6-inch thick Perl documentation. I never needed to learn Perl and never have. I am learning Python, just to keep current.
@@GH-oi2jfHow? I've been writing a shell and have find myself constantly migrating back to UNIX-y ways of doing things.
So what exactly would be better?
I love this channel so much. The breadth of topics and the quality leaves me astounded.
This was great to watch! I was a student at UC Berkeley during the decade of the 70s, at a time when "computing in the humanities" was a thing. I was introduced to Unix when it was running on the VAX and had just started being available for student use. Good fun...
Decade of the 70s huh? So how many boxes of cards did you go through? lol In just 3 semesters of Fortran and COBOL I went through I cannot remember how many boxes of cards. Did you ever write programs to play tricks on the guys running the computer?
Excellent summation of Unix history, highly appreciated; thank you!
On the same note,
Happy New Year 2024.
Greetings,
Anthony
Thanks!
Your best opening yet! Well done!
I still use nroff and troff, and while I'm still learning it, I use them very heavily to write documentation for any software I write, even if it's just a shell script, and then I package everything into an OS package. (n)roff has been called the assembler of typesetting, but it's really powerful. If you've ever used LaTeX, it's like that, and before you jump in, it is possible to both generate and embed images into manual pages. The man command simply won't display graphics if it detects that the target device doesn't support them; it truly is a document rendering engine, not just simple ASCII text formatter.
Dude your way of research and expertise mixed with topics like linux is the best blend for peeps like myself!
Thank you for producing this; it's a truly wonderful video. Like numerous others here, I've also forged a lengthy and remarkable career based on UNIX and C starting in the early 80’s, and still plugging away at it.
I was there from the late 70's. C and Unix in college at UNCC, SVR and BSDlater on paid my bills. The PDP/11 was my toy.
After many years in Unix and C land, I ran the Software Labs for Tandem Computers in Austin many varieties of Unix hardware, Sun, SGI, others plus our own. Additionally the baby OS called Linux.
Later on I was an initial founder of GST which brought me to Vancouver , Washington Managed a start up, GST. A CLEC and Internet Company. Unix and Data Center Operations Manager.
We went bankrupt unfortunately...
Got a job working on Disaster Recovery and Business Continuity for ACS, prime contractor for Nike. Sun, HP, Linux and Windows. I was Senior Data Analyst.
Long time ago but one hell of a fun ride...
Thanks for sharing!
I think you can confirm what I think is urban legend. The Tandem owner was showcasing the Tandem Nonstop and shot at the computer with a gun and it kept working. Myth or True?
I was there, Galdalf, 3000 years ago....
SCO, that's a name I haven't heard in some time. The history around the SCO-IBM lawsuits really got a lot of technology people interested in the law. I remember spending countless hours on Groklaw reading filings. There's a really good story for the channel in all that, both the lawsuits themselves and the way it drove a ton of changes to the tech world and people operating in it.
I once wrote an Ethernet driver for SCO Unix running on a 80386. It worked flawlessly at an exhibition the following week.
SCO was trashed by Gates, the demon who started life wrecking computing before moving onto people.
same, I recall something about them going after IBM, then some government "National Laboratories" for using Linux on some experimental supercomputers (o_O), then silence. I recall looking around to see what happened with the IBM case few years later, and I guess it was "compounded" by System V vs BSD code-sharing mess that goes way back to the dawn of pain. As for the rest, well, SCO Group mobility and Unix software assets were sold off in 2011, guess no one wanted anything to do with their alleged patent-troll lawsuit shenanigans. SCO was Renamed to The TSG Group then Chapter 7 bankruptcy in 2012. reads like a book/movie on how to burn your name and reputation instantly.
Kernighan and Ritchie! I haven't seen that book in 25 years! Thanks!
These were a fun trip down memory lane, and I learned a few tidbits I hadn't been aware of. I started as a UNIX System Admin in 1986, which was right at the end of this video, and experienced the entire UNIX Wars you documented very well in the next part. Thanks for this!
Fantastic episode Jon, already laying in a stock of popcorn for part 2 aka "Time to lawyer up!" Hoping to see a part 3 where some dude from Finland eats their lunch.
Awesome run through history. Brings back lots of good memories. For the text editors, there's 'ex', then 'vi', and then much later 'vim', which we all love.
Vi is still one of my favorite text editors of all time. It’s beautiful not to have to rely on a mouse to navigate a document, integrated reg expressions all operating on the most basic terminal, which is not sensitive to any network delays.
Thoroughly enjoyed the history and some memories of my childhood. Can’t wait to hear the sequel.
Your first sentence might require editing, as I think it does not reflect what you want to state. :wq
@@jlinkels thanks! Will be ready for the next update 😂
:w!
It took me awhile to understand why I kept coming back to vi: built-in regex support on the command and simple searching. Very efficient. What's funny to me is that everyone I worked with knows a slightly different subset of vi commands. (TAMTOWTDI, as in Perl.)
I also remember EMACS. How did that fit in?
@@davidcarson4421emacs was to replace vi, for easier usage and more options, and it was extensible. It is still popular in some areas, but to be honest, notepad++ is a worthy replacement. Emacs was also kind of an os in itself, you could do your email in emacs, ftp, debug c programs, etc. I once ported micro-emacs to a vax vms system because it did not have a decent editor. Today, we remember emacs as the start of the big wave of public domain open source applications and the copyleft licence
I can't wait for the next exciting episode! Happy New Year!
I started UNIX life at university in the mid 80s on a PDP, and today I use a Raspberry Pi as my home machine. It's remarkable both how much and how little has changed in 40 years; it's been bleeding edge every step of the way, but we never really stop and think about it.
Once again, thank you AsianOmetry ~ your clips are brilliant. This is probably the single best telling of the birth of UNIX I have seen.
I note toward the end, you mentioned the Santa Cruise Operation. By about '93 or '94, Linux had been built (at least the earliest versions of it) and the huge SCO lawsuit was ongoing, consuming money at a rate like a national debt, and it would drag on for about 5 years or so. As a 1st year college level student of computers, nobody seemed able to explain in a way a normal person could understand, *WHY* that one legal battle over the copyrights to UNIX / Linux were so Earth-shaking important. Nearly 30 years later, and today I've got a pretty fair idea why the SCO battle was so important, but the industry at that time did a completely terrible job of explaining it, not only to Mum & Dad outsiders, but even college level computing students.
The question was ~ Linux is Unix, even if it doesn't share identical source code. Ok, so if the source code is slightly different, even if it then performs the same tasks in the same way, can the commercial copyright owners, (Lining up behind SCO ~ who are carrying the Flame of Justice on this question) claim Linux is a copyright violation and therefore theft?
There was no point trying to float Red Hat or any other Linux based business, until this matter was resolved and a precedent was set. The implication of this, would be Does Linux belong to people like Richard Stallman and Linus Torvalds, and the thousands of volunteer developers, or are they 'hackers' who should be jailed for forgery? The courts basically said Linux is free and should stay that way, but the SCO and the Lawyers, kept dragging it back in with 'new evidence' and new points and new arguments, and seemingly bottomless pockets. There was a very large amount of money at stake, and they were not about to simply give up & walk away ...
The reason is simple. Linux contained "stolen" Unix source code. Not saying it was done intentionally by those with authority over Linux. When so many people from so many organizations, or just plain individuals, contribute to something that large it's not surprising that things can creep in. The legal stuff stopped when it was ruled that SCO did not own the rights to Unix. They screwed up when acquiring Unix assets from Novell. Novell still owned the rights and THEY refused to uphold them. Not that Unix source wasn't in Linux. en.wikipedia.org/wiki/SCO-Linux_disputes
Unix wasn't the first OS written in a high level language: Burroughs MCP was written in Algol for their large systems, and Multics was written in PL/1.
Unix was the first *portable* operating system, and it was portable because the implementation language lent itself to portability.
@@JonathanMaddoxit was written in C. I still have C language manual in my home somewhere. Also, I still have Unix V manual too. Yes, I’m dating myself.
@@frankchan4272 What is so odd about that? I code in C nearly every day.
@@frankchan4272 Have you proposed to yourself yet?
00:00:30 Creation of Unix as a versatile, cost-effective software platform.
00:03:46 Development of Unix's file system for hardware abstraction.
00:06:48 Unix's spread due to affordability, portability, and open-source nature.
00:12:34 Unix's pivotal role in the development of the Internet.
00:13:58 Transition of Unix from a hobby to a commercial industry.
00:16:06 Unix's significance in shaping the software industry's trajectory.
As soon as I switched to Mac in 2006 I became a Unix fan-mac , linux, bsd etc just because it seemed more intuitive than other OSes 🤷♂️
Reliable too. My work machines are Debian and OSX, I game on Windows.
Yes. And everything works pretty much the same, from your raspberry pi HTPC over your mac laptop to the server in the company.
Linux is for lazy people :)
@@svr5423
Yep. We like getting stuff done rather than reinstalling Windows. Again.
Mac os will never be able to do mission critical like a real UNIX unless you want another chernobyl.
@@Teluric2
Maybe not, but for most productivity tasks Mac OS blows Windows out of the water. I’m a writer, and no writer I know who runs Mac has lost work. The Windows based writers do have issues this way, not all but enough that there’s no damned way I’d trust Windows for anything other than gaming.
Thank you for sharing. I wonder how the YT algo figured out how to surface this vid for me. I contracted at AT&T in 1985 as a tech writer. One project we worked on was the Unix PC manufactured by Olivetti. Wedge shaped and an early version of marketecture in hardware. One of the many footnotes in the Unix story. Best unix command was "write" where you could blow away the text on someone else screen and send them into a panic.
The funny thing is, if a video is mostly watched on mobile and TVs, then the video is streamed to more unix-like systems than NT.
I do hope linux systems will become the norm for average user, slowly but surely. Writing this from a linux system.
It already is, Android is built on Linux.
Statistically speaking, Windows is on a slow death curve; as Android, iOS, MacOS, datacenter Linux, and academic Chromebooks slowly increase their domination. All office-suites now firmly in the cloud, video games are the last good reason to run Windows.
Not really in the way that most Linux users like, but Linux already has conquered the computer world. 80% of phone users use Android, an overwhelming amount of smart TVs run Android or another Linux-based OS, many schools are now using Chromebooks and an overwhelming amount of web servers run Linux.
The only common device that doesn't is the PC, in which OS usage is largely superficial, basically a tool to use the web browser.
@@truejimAnd even gaming is becoming less attractive for Windows with the advent of Proton.
Windows is on life support, Linux just needs hardware with it preinstalled to reach out to normies
@@mgord9518 And Visual Basic was once the worlds most dominant programming environment. Linux is very popular at the moment but I wouldn't say conquered. Much of that popularity is under the covers embedded within a device or serving web pages. Not recognized by users. Very easily replaced in the future by the next hot mess. BTW, Apple products are certainly common devices and they don't run Linux.
Absolutely fascinating Asianometry. Well done!!! Once you mentioned BSD the lightbulb went off in my head. I didn't know that was it's origin story.
The part about Pascal being part of Bill Joys work and Unix is new to me. Pascal came from Niklaus Wirth. I used Pascal on CP/M and C on BSD 4.2 but I didn’t know there was Pascal on Unix other than the ones derived from Wirth’s work.
1:35 Virtual memory, in the context of modern operating system, means the mechanism that OS configures the mapping between virtual address and physical address where some parts of virtual address might not be mapped to physical address, so user programs can access memory like there's its own full e.g. 4GB address space. If the user programs access virtual addresses that's not mapped to physical address, OS would run "page fault handler" which tries to find free physical memories and fills the missing virtual-physical address mapping, then returns to the user program.
In page fault handler, if there's no free physical memory, OS might delete an existing virtual-physical mapping, copy the content of the physical memory to disk, and create a new virtual-physical mapping. If the user program accesses a virtual address whose mapping to physical address was deleted previously, page fault handler starts again and copy the memory content back from the disk. This is called memory swapping, and sometimes called "virtual memory" (like the setting in Windows).
This is the explanation of today's "virtual memory", but I don't know the concepts of Multics.
Due to the mentioned NextStep computer, every Mac since about 2000 is Unix at heart and since 2007 : UNIX 03 compliant.
Programmers use a lot of that under the hood Unix to do a lot of tasks too.
I'm not sure how much "Unix" is in iPhone and other non-Mac products but I'd assume: a lot - though not certified to UNIX-03.
So not sure how much "falling" has occurred here.
macos is basically freebsd with an apple userland, altho nearly all of the basic freebsd commands/services are still present so it is mostly quite easy to port apps to it from other unix versions
@@brucebecker7097Pretty sure it's the opposite. macOS is freeBSD userland with an Apple kernel (and the aforementioned macOS GUI tools slapped on top)
I just have to say, your videos are top notch both in presentation and background research. Keep it up!
"...and Pluto." Nice😂
I love technology history and was always curious about the history of Unix. Thank you for making this!
Great video, brings back memories. I started working on Unix in the mid 1980s, I was still coding for it when I retired in 2020 and now I use it on my personal Raspberry Pi projects. I still use vi/vim 🙂
Man, this brings back memories...thanks!
Unix, or its inheritor, Linux, is still everywhere.
Exactly, MacOS is 100% POSIX compliant (in essence, UNIX) and it’s the #1 preferred OS for development out there today.
@@AdamPippert it's kind of a weird unix, though. super nice for developing web stuff, or remoting to linux boxen
..including in Android phones.
@@AdamPippertlol no
It's at most a nice unified development environment that can be emulated on Windows and Mac
Thanks for sharing! My heros were those Silicon Graphics and Sun guys which rolled out fantastic workstations to Universities. I loved those Sparcstations and even got one for home use in the 90ties on which I developed Opensource stuff in C for X11. But one more thing: In web3 & blockchain nowadays, I start to feel the sames exciting distributed client/server computing spirit again. Very pioneering scientific opensource stuff around. Let's go for it and write fantastic web3 code on Linux workstations. I also see that RISC processors starting to have their comeback in a more competitive way. Very promising future.
Was an early member of Hydra Computer Systems, parallelized 4.2 and sysv unix on a multiprocessor (shipped 20 cpu system in 1985). Sweet speedup, could “gang” schedule cpus to applications, etc. First cpu family was Nat Semi 32k series, later moved to 88k. Also delivered the Annex terminal server, later sold (w engineers) to xylogics. Best 8 years of my working life (Hydra was a wholly owned sub of Encore Computer Systems).
Encore was started by Ken Fisher, Gordon Bell, and Henry Burkhardt, (prime, dec and data general respectively). They wanted to buy a number of startups and release a family of products simultaneously. Add “resolution”, a multi processor workstation, and a software company cant recall the name). Encore would cover all mfg and service, with the subs being independent.
Encore went public before shipping any systems, raised 27mill IIRC.
Fascinating! It sounds like this work ought to be more widely known. The CPU choices were a bit unfortunate, given the way everything played out, though.
Your channel has such great historical content. I'm so glad to see how huge it has grown.
That AT&T story makes me wonder what sort of utopia we would live in if companies were still regulated today, rather than let the companies control government.
It almost looks like the entire idea of open-source was almost inspired partly by that AT&T decree. Incredible what great things regulation can do for the world!
So you support fascism...
I thank you so much for your videos. Your channel is my favorite on the many UA-cam channels that I subscribe to.
Recently I've gotten much more invested in Linux. This led me to going through quite a few videos covering the history of Unix and Linux. Great to see you cover this topic!
Part of the reason I've gotten much more involved with Linux is I'm in the middle of going through a computer engineering program. Though I'm still a freshman you can't ignore Linux if you want to become a computer engineer.
Great to hear. *nix skills will do you well in IT. FWIW, I've been using Linux for nearly 30 years and Unix for a bit longer than that.
Happy to know that my compatriot Ozalp Babaoglu made some contribution as well, thank you very much for your video!
It's so weird to think there was a time where there weren't "files" as we know them today. It's hard to even comprehend how many layers of innovation have led to the computers and software we now have.
After I spent some time working with z/OS and its so-called "data set" - which is essentially a file that contains one or more records - I finally understood why the invention of the concept of "files" was so revolutionary. Computing before UNIX is so... weird, is all that I can say.
"Creating files is so easy in UNIX systems that the only standard shared by all of them is the System Administrator's MOTD telling users to clean up their files."
@@RogerioPereiradaSilva77 Dataset is IBM speak. It predates Unix.
Programming in assembly is tedious, and is rarely rewarding.
@@lfrankow Quite right, except that the "in assembly" part is redundant.
This is seriously like the Revolutions podcast by Mike Duncan . Which is to say it was great. I like my history with some facts, a bunch of context, and just a bit of humor - all supporting a reasonable argument. Very well done.
Charles (Chuck) Haley was an incredible linebacker for the 49ers, and, more importantly, the Dallas Cowboys during their 90s Dynasty years. So nice to see a video featuring him. :)
12:34 I see where you’re coming from. The use of the term “lame categorization” to describe the VAX-11/780 as a “minicomputer” or “superminicomputer” does seem dismissive, particularly given its significance in computing history. DEC’s VAX-11/780 was a groundbreaking system that straddled the line between smaller, more affordable systems like PDP-11s and the traditionally much larger and costlier mainframes, carving out a unique market niche.
However, as you suggest, IBM mainframes of the era had significant architectural advantages that the VAX could not match-particularly the use of dedicated I/O channels (which offloaded I/O operations from the CPU) and their ability to handle massively parallel workloads in a way that even DEC’s most advanced systems couldn’t. These features underscore why IBM mainframes occupied an entirely different tier of enterprise computing.
Calling the categorization “lame” might be an oversimplification or misunderstanding of the VAX’s place in the computing hierarchy. It ignores how the VAX essentially defined the minicomputer market in its era while providing some features (like virtual memory) that were competitive with larger systems. At the same time, it doesn’t account for the stark differences in design philosophy and capabilities between the VAX and true mainframes like the IBM System/370 or 3081.
The VAX deserves respect for democratizing computing power and pushing technological boundaries for smaller organizations, but it would be inappropriate to lump it in with mainframes, given the latter’s scale and design focus. It seems the “lame” critique might oversimplify a much more nuanced comparison.
It’s odd to title this as UNIX “falling” when it’s now at the core of a huge percentage of our devices, and UNIX-like OSes making up a significant share of the remainder. If anything, UNIX won. (I mean, literally 100% of our mobile phones and tablets use a UNIX or UNIX-like OS.)
Yes, this was my issue with it as well. Falling how. Linux is EVERYWHERE.
@@don_n5skt Absolutely. And of the remainder, a significant portion is Apple products, which all run UNIXey OSes. (Heck, even some Apple _accessories,_ like the Lightning to HDMI adapter, actually have a stripped-down version of the XNU kernel used in iOS!) And in the case of macOS, it’s literally UNIX certified, so it’s not just UNIX-like, it _is_ officially a UNIX.
That was my first reaction - "But ... the descendants of UNIX lierally run the world!"
Well, NO phones or tablets use copyrighted Unix!!! While Linux distros (including Android with a Linux kernel) run all those things, the Open Source Linux kernel makes all of those distinct from Unix. In that sense, Unix is pretty much dead. If you want to include BSDs and other Uni alike OSes, you could call them un*x, but that essential feature of Open Source/Libre code makes that quite distinct!!!
@@squirlmy macOS, HP-UX, and AIX are all certified “real” UNIX. iOS doesn’t have the certification, but is extremely closely related under the hood. Regardless, I said “UNIX or UNIX-like”, the latter encompassing Linux and Android.
I can't wait for the next part! You have me hooked
Excellent video. More videos on Unix and Linux and the OS wars.
The original version of Unix was not written in C but in PDP-7 and then PDP-11 assembler. Unix wasn't rewritten in C until Version 4.
Unix would really be a footnote today if it weren't for Linux and FreeBSD. It's everywhere but people outside IT don't completely realize it .
Um, Apple.......
@@TheOwlGuy777 If you mean that Unix wouldn't be a footnote because of Apple - I say ehh. By the time OSX came out in 2001 the tech field already knew what Unix was and Linux was already well established by that point. The rest of the Unix world was going nowhere compared to what was going on in the Windows world (except MAYBE IBM but only because they had the install base). Sun was perpetually on life support, HP-UX was hot garbage, Cray and SGI were niche.
@@TheOwlGuy777 Despite NextStep's Unix heritage, most of OSX's SUS (Single Unix Specification) compliance, actually comes from FreeBSD code. If it weren't for FreeBSD, I highly doubt Apple would have put the necessary time, effort, or money into achieving Unix compliance without it. So, the OP's comment is still quite accurate even if you want to include Apple, which their products are technically Unix, but it's really more of a side note.
@@TheOwlGuy777Were it not for the massive success of Linux by that time, Apple likely would never have looked at a Unix anyway. They only chose BSD because they could keep it proprietary.
And Linux STILL isn't ready for prime time, yet Apple is mainstream.
11:15 "vim, a text editor that some people like" it's still super usable for 99% of quickie-fu editing needs in 2023. No nonsense and quick to do the most common things.
"A cheaper computer at just $65,000"
...in 1970s money...
That's like what - half a million now? Crazy. For less computing power than an Alexa puck. To share among a whole company.
Interesting times. I was at Berkeley during the construction of the new EECS building. They were in Evans Hall before the move. Sharing space with mathematicians. Back when housing was cheap and computers were expensive.
There was a time when in my life when having my own Sun workstation was the definition of "having arrived". I loved that technology.
I had to use a Sun workstation at one time in my career. Their mantra was “the network is the computer.” What that meant was that persons on other workstations could do things to mess with your workstation as a practical joke, and that did happen.
I did like two things about it. It used “Life” as a screen saver, which was a good choice. It had a chess program which I found possible to beat, occasionally, so just hard enough to be challenging, not too hard to be pointless playing it.
@@GH-oi2jf That mantra was stolen from DEC. DEC's minicomputers did WAN and LAN networking long before Sun even existed. And then there is VMS clustering, never equalled.
We had 4 Sun workstations shared between a dozen developers. When a colleague got a new job I told him that "it wasn't so much like losing a colleague as gaining a Sun".
Yes, I know you don't read YT comments; I don't blame you! :)
Thanks again for all your work, quality stuff as usual.
The term Virtual Memory for Swap Space isnt quite right. You can blame Microsoft for that.
Virtual Memory is just a virtual address space. On 32bit systems it was 4gb, on 64bit i think it goes up to several exabytes but I believe on x86 64 it's capped at some reasonable limit.
Anyway the system VM subsystem manages how this is allocated and used, and it can be run on top of real physical memory, a swap space on disk allocated to supplement the systems total memory, or even completely on disk, though it would be so slow you never would want to run everything from a disk memory even if the OS let you.
For a while there was some experimental support in Widows and I think Linux that would allow you to also use a swapfile on flash whenever a thumb drive was plugged in to dynamically supplement the total conventional memory the system could shove the prcoesses around in, but nand and even early ssd speeds made this not worth it and you get other problems anyway like cell wear if you have a swapfile or swap partition on ssd.
I don't run one even though most modern Linux distros still try to insist you need one.
Microsoft? I think you mean DEC and VMS.
@@tracyrreedAh ok so it's a VMS carry over that Cutler and his team did. That makes sense since they all came from DEC and would use similar nomenclature.
Exacly that.
The CPUs sees only the virtual address space.
Swap is just a region of a block device that is mapped into the virtual address space. Whenever data is accessed, a page fault occurs, prompting the memory management to copy the data from block storage into the actual physical memory.
Swap space is generally not needed*, it's basically and very slow cheap memory - sometimes people don't want to pay for RAM, sometimes you cannot install more. I also never understood why some OS or applications insist on having swap. It is usually never explained.
Other useful mechanism to safe memory are memory overcommit (software developers tend to allocate huge amounts of memory they never use), memory compression and memory deduplication (available on select hardware plattforms, does great with plenty of similar workloads and small memory pages).
* there are a few platform dependent exceptions. E.g. part of the kernel dump mechanism, or linux uses the swap mechanism to compress memory into a ramdisk.
Edit: Fun fact: 32bit x86 could address more than 4GB total with the PAE feature. If you use Microsoft, you had to pay an extra license (usually a more expensive version of the OS) to use it.
Software developers, especially those who clinged to 32bit binaries in the 64bit era, were often totally inept in using more than 4GB of RAM by spawning multiple processes and using IPC mechanism to shift data around. They often spectacularly went OOM instead.
Edit: Fun fact 2: If you have Windows and an nVidia GPU, the OS maps the VRAM into the virtual address space WITHOUT increasing it's size. So the total VM available is less than the physical RAM + VRAM, leading to OOM situations when the GPU's VRAM is full, even when physical RAM is available. This is one of the cases where you might add Swap as a hack to increase the VM size in modern windows.
I started programming VAX computers using FORTRAN to support CAD applications back in 1984. I switched to a Unix based system in the early 90s. That was a very dynamic time in the world of computing.
I hate to be THAT guy, but as a former Teaching Assistant who taught Pascal, it’s pronounced “pass-CAL” (like CALifornia), not “PASS-kull.” Other than that tiny quibble, great video!
I've never heard important this tale told with such clarity and focus. This would make a good chapter for a book about computers on top of the Bletchly Park story.
It didn't fall. It evolved and is now by far the most ubiquitous operating system, at least at the kernel level.
what fall?
In 1980 I became part of an engineering project to allow MCI Communications Corp to rapidly ramp competiveness in the long distance switching telephone biz vs AT&T. Our s/w was developed on a PDP-1170 running UNIX, with Bell Labs just a few miles away. Small ironic world.
and pluto….. how dare you..,
Thanks for adding actual captions for the Deaf - good history and explaination
I can't belive that Pluto it's not a planet...
Fake News...
I love the stories you write!! I wonder when you sleep. Or if you even do?? I can't kick more than a lyric out in a day and you drop this fascinating shit all the time.
I've run Apple X, SGI IRiX, Scitex BrisQ, NeXT, AiX,/AuX and Sun unix workstations. My God, it's so superior that trash produced by microsoft.
Long time ago did entry level COBOL (followed by C) course. 14 students on green screen terminals, all hooked up to a single 386SX with 2GB RAM. The OS was Xenix. The Editor was Vi. The company was The Kalamzoo Computer Company who leased an office in State House down Dale St in Liverpool. If anyone else went to that course or worked there would love to hear from you, especially Julia, Andy or Stuey!
Kernighan's name is pronounced with a hard "g".
Thank you. Great video. Enjoyed it. Takes me back. Cheers
I'm looking forward to the next video, where it really kicks off. I had a job porting software to all the different UNIXs and keeping track of the differences was quite a challenge. I wonder how many names I can recall?
BSD 4.1, 4.2
SVR 5
System 7
Sinix
A/UX
DG/UX
AIX
Ultrix
Xenix (shudder)
All with their own little idiosyncrasies. Ever had a compiler throw a "Too many shapes" error at you? Fun times.
Maybe episode 7 of this series will feature ifdef hell.
This is history I like. I’m an electrician, never really into I.T., but I like history. I was my understanding that many OS’s were of Unix based, but never really dug into who started it all. Good vid man
I would argue that it didn't need _commercialisation_. It needed a formal persistent organisation to coordinate work, make and distribute updates, and provide support. Commercialisation was detrimental because it resulted in proprietary incompatible versions, closed shops, and high costs.