Your interview technique is very good, and it's the first time I've seen it. You let Dave talk without cutting him off or stepping on him. You clearly have faith that the guy is switched on and interesting, and that your viewers will find his story as interesting as you do. Great work, Daves! It looks like there's a lot of mutual respect between you two.
So interesting. Working as a software developer on the financial side of the business, I can relate to this (on the user-application level, of course); many of the same issues arose from purchased code that I had to fix or black-box to make it work with our systems. Please, more of these kinds of interviews and stories; they're so interesting to those of us on the outside. I'm 75 and was an adopter of MS Windows at the market onset. Thank you, and thanks to Mr. Dave Cutler for taking the time to talk.
I worked at Muzak 10 years or so ago. We outsourced some stuff to a French team, who followed no process at all and pushed out an update to all worldwide video players (the devices that play videos in department stores and Times Square, etc), bricking them all for a day or so. That killed the entire product line everywhere and ended up causing us to close our HQ building, and contributed a lot to the bankruptcy in 2020.
Oh, I vaguely remember. XP 64-bit had that thing where it was forked off a different codebase, so a lot of XP software had issues running on it. That made Vista 64 a great leap. As for seeing bad code: "The only valid measurement of code quality: WTFs/minute." - Thom Holwerda, 2008. Love that logic from the server people: "We prefer x64, because it takes longer for our memory leaks to cause an OOM." When you are that far behind in fixing memory leaks, "just more RAM" is a valid solution.
This was also about the time Microsoft acquired Hotmail, which had been running its operation on some BSD variant, and then made a big noise about switching everything over to Windows NT. Only it's not clear that they got very far.
By service packs 3 and 4, XP x64 was the same as the 32-bit version. It ended up being much better and more stable than the 32-bit version. I used it for quite a while until Windows 7 was mature.
During this era the hardware that could take advantage of 64-bit was also pretty rare, so a lot of the outfits running it did so on decked-out Intel Mac Pros. The likes of HP and Dell were still dragging their feet on releasing 64-bit Xeons with 64-bit EFIs/BIOSes, so many went with Apple, then wiped OS X and stuck Windows on them. I used to work with a guy in the Mac section of PC World's business unit in the UK who did this quite a lot for corporate clients.
@@skilletpan5674 I think you are mixing things up. XP only had 3 SPs total, the last of which was never released for x64. 4 SPs sounds like you are talking about NT; it is the only one that got that many.
It's not exactly more RAM, just a bigger addressing range - there are more virtual memory addresses available to the process. Leaked physical memory gets swapped out to disk, since it's never touched again, and stops using up any RAM.
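To make that concrete, here's a minimal C sketch (my own illustration, nothing from the video): the loop leaks allocations until malloc fails, which a 32-bit process hits when its virtual address space runs out, typically after roughly 2-3 GiB; the identical leak on x64 runs enormously longer, and pages that are never touched again can be paged out instead of occupying RAM.

```c
/* Minimal sketch (illustrative only): leak memory until the address
   space runs out. A 32-bit build fails after roughly 2-3 GiB; an x64
   build of the same leak runs vastly longer before exhaustion. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void) {
    const size_t chunk = 1 << 20;   /* leak 1 MiB per iteration */
    size_t leaked_mib = 0;
    for (;;) {
        char *p = malloc(chunk);
        if (p == NULL)
            break;                  /* virtual address space exhausted */
        memset(p, 0xAB, chunk);     /* touch the pages so they commit */
        leaked_mib++;               /* pointer dropped: a genuine leak */
    }
    printf("leaked %zu MiB before malloc failed\n", leaked_mib);
    return 0;
}
```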
This is a fantastic channel. Great host, and Dave Cutler is a genius. What we are listening to is a history documentary about technology, something we will keep coming back to for decades.
This is so interesting; I cannot imagine a more interesting guest on the show, and I love hearing his development point of view. I feel like, despite his being such an integral figure, not much information is out there, nor the stories he loves to tell, so thanks for bringing this to light :)
As someone who grew up on PDP/RSX-11 (M and S) and VAX/VMS, and still has a copy of the VMS Internals and Data Structures manual (at least I think that's the title, going from memory), this is a fun interview. I've always thought Cutler was one of the better OS designers of the minicomputer era. I must say, though, that I still prefer his work in the DEC world over that later in the PC world. I still miss the file version numbers we had in the DEC world. 😁
Yes! Cyber 73/74 -> DEC-10 -> UNIVAC-1100 -> VAX-VMS ... and so on ... DEC had the best CLI OSes bar none! UNIX was mostly more arcane and a bit more flexible ... and developed on a PDP ... I miss the DEC-10 more than the Vaxen .. ;-)
@@HaroldSchranz I agree. UNIX is an abomination compared to VMS. RSX was more cryptic like UNIX, but most of that was corrected in VMS. I never used a DEC-10 so no experience there. The first major system I installed was a VAX-11/750 running VMS and 27 PDP-11/23s running RSX-11S connected via DECdataway. If memory serves, our DEC sales rep said that we had the highest node count DECdataway installation in the world at that time (1984ish).
He was a refugee from a nest of Unix-haters at DEC. He succumbed to the 1990s fashion of inextricably tying the GUI into the OS kernel, when the *nix systems maintained the GUI as a separate modular, replaceable layer. Think about why Windows still uses single-letter drive names even today, and why Microsoft felt the need to add WSL to try to turn it into Linux.
@@lawrencedoliveiro9104 It's ridiculous that today I still have drive letters embedded in text command files on Windows, while 30 years ago on VMS we abstracted away device names from the actual hardware using logical assignments. Even on the Apple II, ProDOS abstracted away the hardware names to logical names. Use the logical name and the OS would find the right device on its own.
00:00 🤖 Dave Cutler introduces the topic of Longhorn, hinting at its relationship with Vista.
00:35 🛠️ Windows 2000 served as the foundation for Longhorn's development, with both workstation and server sharing the same codebase.
01:05 ⏳ Diverging timelines: server development estimated at 3 years, while the consumer version aimed for 1.5 years due to different user expectations.
01:31 🐞 Consumer software faced significant build and run issues compared to the server branch, which focused on addressing security bugs.
02:25 🚫 Security concerns with XP led to a halt in development, particularly due to the emergence of buffer overflow attacks.
02:52 🧩 The x64 project emerges as an independent venture to introduce 64-bit extensions, initially not a sanctioned company-wide initiative.
04:00 💡 The x64 project adopts the server codebase, successfully creating 64-bit versions for both workstations and servers.
05:41 🌟 Successful launch of the 64-bit system, with the Microsoft website transitioning to it for its notable reliability.
06:58 🔁 Unified efforts: the decision is made to switch Longhorn's codebase to the x64 codebase for improved performance and security.
08:34 🐜 Extensive efforts to fix security issues in XP, with the discovery of challenging-to-mitigate overflow bugs.
Only note I'd add: as he mentioned, the "reliability" was really just x64 hiding the problems in their code. x64 allows a process with memory leaks to run longer before resource exhaustion.
"due to different user expectations" - unfortunately management always goes by this logic, i understand that its a reasonable thing to do to maximize profits, the big downside is that users become beta testers and in general even operating systems like most windows versions have a large amount of performance and UX issues and security holes.
In hindsight that almost seems like why Microsoft had such a big problem developing a successor to XP. If they'd spent more time getting XP right in the first place rather than having to stop and go back to do Service Pack 2, I wonder if successors would have flowed more naturally from that. I gather XP was so wildly successful mostly because it was the much more reliable NT architecture rather than the 9x architecture people were used to, with just enough improvements over 2000 to make it seriously viable for home and gaming use (though I recall there still being a LOT of games that didn't work out of the box on XP).
2000 was a better desktop OS. What features did XP have for "home and gaming", exactly? It had a shinier GUI, but that's about it. The only games that were compatible with XP and not 2000 were the ones that did arbitrary version checks and refused to run even though they theoretically could.
@@doltBmB I remember throwing Blizzard games (back when they were good) at Win2000 and they ran without a hitch. In fact IIRC they were the only games I had that would run under 2k out of the box. Other times like with GTA3, you had to jump through hoops to get it going.
@@doltBmB XP's 9x and even DOS compatibility was a vast improvement over anything in 2000. For instance, the virtual DOS machine in Windows 2000 does not emulate sound, whereas XP's does. A whole bunch of compatibility tweaks specifically for making 9x games work on NT systems were included in XP, but not nearly so many in 2000. I believe XP also had better support for legacy drivers, though I could be wrong on that point; certainly in practice hardware support was better. Away from compatibility, XP also had fast user switching, which was a bit of a game changer in terms of introducing multiple user accounts to the home, and from SP2 it also had a bunch more security features like a firewall (still fairly important in the days before most people had home routers) which were never available in 2000 in a consumer-friendly format.
@@snap_oversteer Windows 2000 did not have "wifi" support; it had network support. It was up to the wifi driver to handle the wifi part, just like it is up to any network card driver to handle a physical connection. What XP had was a standard driver for the most common wifi cards and a special GUI for wifi connections. In a way this is overstepping the bounds of what an OS should do, which is part of the reason why XP is shit: it tries to do everything, something that has only gotten worse and worse. Meanwhile, 2000 recognizes that the only thing it should do is let you run your software.
This actually makes a whole lot of sense. I knew Longhorn was buggy, but I didn't know it was so buggy that it had bugs that ended up being unfixable.
I was in IT at a Fortune 50 company when XP was introduced, and we summoned our Microsoft rep and his management to come and address the IT organization on how Vista and XP would be made less buggy. I remember sitting in that meeting; the Microsoft folks were squirming in their seats. All they could come up with was "Dave Cutler is working on it - the next OS should be great." This did *not* make us happy.
A project run by accountants - what else could it possibly produce? And Dave was never a frontend guy trying to satisfy the whims of accountants. And that approach pays off: when they ruined their branch, he was called in to save the world. 🙂
Fascinating how it all comes together. I love it. This was wonderful, @Dave's Garage. I never had this insight into the details, so these videos are real gems.
8:18 Having completed my software degree's international studies and internship in Japan in the 2000s, this is all too easy to imagine 😅 Most of the code they were writing, and the lack of standards and good programming habits/practices I saw back then, made me feel sick to my stomach.

The worst I saw was a huge, long-running international ML (way before it was mainstream) and robotics project that combined several advanced sensors and robotics to make a physically self-learning robot capable of self-reflection... and ALL of that code was in a single C file that couldn't even be opened on some of their older laptops, since it was so massive the editor crashed! Not only that, all the comments and variable names were written in different non-English languages with no uniform naming or indentation practices 🤢

Worst of all, one of my soon-to-be friends, a French ML genius (compared to me at least), was there to do his doctorate on it, only to realize it would be impossible, and he eventually had to rewrite/reformat/restructure/translate the whole thing before he could even get started with his own work. He had to extend his 6-month stay into a full year just to be able to do it; it took him nearly 6 months just to refactor that horror show of a project and its countless bugs. He got into a lot of trouble over it, since he had to change his plans back in France so much that he ended up losing a job that was already waiting for him at home. In the end he managed to fix it and made it all work as intended, and not only did he get his doctorate done successfully, he met his future wife during that stay and later started his own company. I still see that project mentioned on the internet sometimes (even in memes), but I'm not sure what results they got from it or what other projects it led to, since I was (suddenly) put in charge of another project.
@@s0ckpupp3t Fun fact: "Made in Japan" used to have the same connotation as "Made in China" does today. I can also confidently state that a significant amount of tech is the code equivalent of being held together with duct tape and superglue. Just look at Boeing software killing hundreds of people! The plane itself had to go through multiple physical tests to ensure the hardware functioned as expected, yet no third party actually reviewed the code - only that "standards" were followed for how it was written.
Interesting question; I definitely remember the bar being green on my PC back in the day. But I haven't seen XP Home in a looong time, especially not pre-SP2.
Interesting to hear what some of us have experienced and suspected all along: that the codebase of Windows XP and prior was a buggy mess. The last Windows I used was Windows 2000. In 2001 I switched to Linux exclusively and I haven't looked back since. To me, there is nothing appealing about Windows today; especially for developers, Windows is much inferior to Linux.
Your last sentence is nonsense. Visual Studio is literally superior to any development environment on Linux. Do you know how many apps I use on Windows that do not exist on Linux, because nobody wants to waste time porting them to an operating system with such a small market share compared to Windows?
Dave Cutler cuts a wide path through life. My Dave Cutler story happened many years ago when I was contracting at Tandem. For various reasons, this huge body of code that I had been working on got code reviewed by Dave&co. The result (either through the brilliance of my code or, more probably, politicking by Compaq) was that Microsoft bought into the code and I was out of a job 2 days later. (No tears were shed, to be clear!) I think at MS it was called Wolfpack, but I'm not too sure about it. I don't think it seriously went anywhere. The other amusing thing on that day was a demo - where one of my colleagues was demonstrating a process failing over from one machine to another - it took about 30 seconds to exhaust the retries and do the failover, as I recall. After about 20 seconds Dave or somebody said "Hm. Sure is taking a long time..." to which the guy giving the demo replied "YOU think it's taking a long time? Imagine how long it feels like to ME!"
I attended some initial design reviews of Wolfpack with Rob Short and John Vert. To be honest, they had a hard time explaining what they were trying to achieve and it took a while for it to get off the ground. I never did get to use it, because there were alternative ways to build reliable systems other than clustering. But it was always fun visiting Microsoft and participating in these things, I often came away feeling a lot smarter after interacting with these guys, and also dumber when I realized all the stuff they were doing that was way over my head!
Dave, you outdid yourself with this interview. I'm amazed that you got an interview with the great Dave Cutler. I first became acquainted with RSX11-M while doing some work at a cancer agency while in university. To my delight, RSX11-M shipped with the source code of the operating system, and I spent endless hours poring over PDP-11 code, reading and understanding the code written by Dave Cutler. Every now and then he would add some comments reflecting his wry wit. Looking forward to the entire interview! You have a fantastic channel ...... a fellow Canadian
"We can't wait that long" and "the consumer doesn't expect reliability"... It was at that point that he f-ed up. Turns out, people really do want their system to be stable. The operating system should be the last thing people think about on their computer. It should run stable and get out of their way so they can run the programs they need to run. If the system is crashing, getting in people's way, running poorly, etc. then the operating system is failing at it's most important job.
Thank you so much for the interview! It's extremely hard for me, in Russia, to swim against the current when I argue that Vista was revolutionary and gave humanity a big push forward in development. Unification, 64-bit, services, tasks, processes. It was all built very wisely. Great work!
I don't know that it's the worst I've ever seen, but I was amused when I discovered that the Z-80 editor assembler sold by Radio Shack (but written by Microsoft) had a bunch of code to check relative jumps, and it was wrong (would let you have a forward jump of 128 bytes, which the CPU saw as a backward jump).
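For anyone who hasn't hit this class of bug: a Z-80 relative jump stores its displacement in a single signed byte, so the valid range is -128 to +127, and an assembler that accepts +128 silently emits the bit pattern the CPU reads as -128. A tiny C illustration of the wraparound (my own sketch, obviously not the Radio Shack code):

```c
/* The wraparound behind that assembler bug: a signed 8-bit
   displacement holds -128..+127, so "forward 128" encodes as 0x80,
   which the CPU sign-extends to -128 (a backward jump). */
#include <stdio.h>
#include <stdint.h>

int main(void) {
    int requested = 128;                   /* forward jump of 128 bytes */
    uint8_t encoded = (uint8_t)requested;  /* the byte emitted: 0x80    */
    int8_t as_seen = (int8_t)encoded;      /* how the CPU interprets it */
    printf("requested %+d, encoded 0x%02X, CPU jumps %+d\n",
           requested, encoded, as_seen);   /* +128, 0x80, -128 */
    return 0;
}
```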
I was getting MSDN and TechNet regularly back when this was going on. We pushed a lot of people to NT 4 and later XP simply because the network stack was better.
I was working on a big multi-phased ERP project years ago. I was seeing tons of outsourced data-integration programs delivered (and supposedly tested) that were buggy and nowhere near spec. It had taken offshore an ungodly amount of time to produce this one specific program, and it was total fubar crap. It wouldn't even run. My star SE and I just shook our heads as we performed our standard code review of it. It was so bad he proposed to just write the data-integration program himself. I asked how long it would take. About a day, he said. Do it, I said. This was all under the table, because the implementation partner was getting paid millions every month to produce this crap. Ultimately, I made the call in order to maintain the critical path. About 50 outsourced programmers were contracted at peak for a moderate amount of time during this 5-year project. Their code was fugly crap. 7 of us employees could have done this area of the project, producing near bullet-proof code to spec, easily within the timeline. But, noooo... it was all outsourced to cheap green coders. 35 years as a software engineer, what do I know. Thankfully those days are gone. Clean green pastures for me.
Interestingly, the Japanese IME in Windows Mobile was excellent from the user's perspective. The kanji writing recognition was superb as far back as WM2003 (that is the earliest I have tried). I still use this and it is awesome. For reference, iOS and Android were terrible until about 2014. Partly because Apple & Google were too lazy to port their Chinese work to Japanese.
@@wrenchposting9097 I used the Japanese IME on Win7 and Win10 and thought it worked fine. I don't recall using older versions, so I can't comment. However, I would add that the kanji handwriting recognition on those old Windows Mobile devices was great (WM2003 forward).
It's funny; probably not a lot of your audience will be familiar with the Windows IME, but since I use Japanese regularly, I am well familiar with it. For me, bringing up that piece of software is like bringing up Task Manager, a highly familiar face for us all, so it's interesting to learn its history. In Japan, the Windows IME is of course available on every Windows machine, but I found that ATOK, another IME, was often more popular. I wonder if the history alluded to here, of Microsoft having a hard time getting that IME code ready, was a formative factor in this.
This is wild: hearing people on the inside of a gigantic company explain why I felt the way I did about all of those products 😅 I used 98, ME was my first OS install, loved 2000, skipped Vista, hesitantly went to 7 and stayed there a loooong time. 10 and 11 have been decent but increasingly intrusive and harder to de-nanny.
Search for how to disable spying and telemetry or increase privacy. Decrapifier is one of the handy tools. The list of things may or may not bother you but the stuff is there.
Linux doesn't nanny. I got fed up with the Windows 10 bullshit and set up Linux on a new laptop a year ago. It has everything in it and is much faster, like it always was. I think I'll also upgrade some of my PCs this way. With all due respect to Dave, shithead accountants are killing Windows. I'd install VMS, but unfortunately it's not for home PCs, and I don't have a nuclear plant to manage.
A company I used to work for outsourced code to India. One of the engineers traveled to India to see what was being done. There was a large room with 2 meter long tables spaced as if it was a classroom. People were sitting at the tables and writing code...by hand...into notebooks...using pencils. There was a _single computer_ on a table at the head of the room. When one of the coders was done writing, they would walk to the computer and enter their code into electronic format. Then vacate the seat for the next person in line and go back to their "coding" table and "write" more "code." Then some unlucky schmuck at some other location had to put all of that code together, compile it, and "run" it. The company had no idea that was happening. Do not outsource _anything._
It would have been the perfect time back then to start over: you should've downloaded the Linux/BSD code and seen how an OS should be done (not just the code, but the architecture as well). That could've saved your server products and your mobile division, and given you a better desktop (not like ME, Vista, W8 or W11), better file systems (ones that are fast, don't lock files and don't get fragmented), and better security and reliability. 30 years later MS still has no clue how to build an OS (they missed the internet/web search to find Linux) - c'mon, you can't put configs, executables and data in the same folder and call it a day, and don't get me started on the abomination called the Registry. Now mobile is dead and servers are almost dead (Linux saved the day).
Very well done interview. You are actually letting the person you interview speak without being interrupted. It's rare these days. Thank you!
Just realised that; I was wondering why it feels so good.
That's what 1.5x speed is for! :)
Why is this comment on every single interview on YouTube?
I praise that!
@YolandaPlayne Because the incentives are different; here it's mostly folks doing what they want and what they think an interview should be. When they look at the mindrot box in their living room, it's the diametric opposite, as the jokers on it don't want to inform but to dictate opinion.
Let's not underestimate the significance of these videos. He's not just reminiscing about the past, he's documenting history about some of the most important developments to ever occur in technology. This is like getting to listen to Ford talk about designing car parts with colleagues. The development of early operating systems has changed the world as we know it, much like the modern automobile.
Wisely said 🙏
I just want to know what the hell happened with Windows 10 and 11. >_> Talk about some atrocities of development. Even 8.1, at the very least, was salvageable.
I mean, 10 was still good compared to 11. Way to mess up everything that worked on 10.
@@michalsvihla1403 Definitely disagree there. Windows 10 had and has a ton of problems. Windows 11 was just the shitty continuation of it all.
@traida111 NDAs are NDAs..
the open source world is a different animal by design.
So glad to see you have Dave Cutler on. He's a living legend in the realm of coding.
Legendary Unix-hater, refugee from a nest of them at DEC.
He is the reason why Windows needs WSL today.
I bet Windows has a lot of "borrowed" Linux code, but the user license agreement is protecting MS's sins pretty darn well. Although there was a code leak in 2020 of all the XP-related branches, afaik. I'd love to see an in-depth technical analysis some day.
Legend? No, he isn't.
@@lawrencedoliveiro9104 Windows doesn't *need* WSL. It's just legacy cruft on top of a semi-modern kernel. Unix was the bare minimum needed to run software in the 70s. Linux is a bloated monolithic bare minimum needed to pretend Unix was a good idea in the first place.
@atlantic_love Oh yes he is - this guy makes Linus look like a script kiddie
"Does it Matterhorn" had me crack up. That's a brilliant name for the circumstance!
(4:18)
As someone who trained and certified on Windows 2000 and has followed through supporting present day systems, this was amazing. Thank you. Explains so many things. Like why I'm bald...
Two legends talking about Windows development. What a time to be alive. Thank you, Mr. Dave!
I think we live in an exceptional time, because there exists an ocean of fully functional free/open-source software in almost every field, giving huge freedom and power to every human being on the planet, for free.
**slaps roof of Windows XP Professional x64 Edition**
This bad boy can fit so many memory leaks in it
ROFLMAO
Solid reference exception 😂
Seriously though, memory leaks or not, the 4GB limit (which in most OSes was even lower for actual application code) was becoming a serious issue by the early 2000s, and it was the last real market for RISC workstations (including Itanium); x86-64 practically ended the RISC workstation market.
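For context on "even lower for actual application code": a 32-bit pointer addresses 2^32 bytes = 4 GiB, and the OS reserves part of that range for itself. Stock 32-bit Windows split it 2 GiB user / 2 GiB kernel (3/1 with the /3GB boot switch), so an application saw well under the headline 4 GB. A quick back-of-envelope sketch in C:

```c
/* Back-of-envelope for the 4 GB ceiling: total 32-bit address space
   versus what an app actually gets under the classic 2/2 user/kernel
   split (3/1 was possible with the /3GB boot option on Windows). */
#include <stdio.h>

int main(void) {
    unsigned long long total = 1ULL << 32;      /* 2^32 bytes = 4 GiB */
    unsigned long long user_2_2 = total / 2;    /* classic split      */
    unsigned long long user_3_1 = 3 * (total / 4);
    printf("total: %llu MiB, app share: %llu MiB (2/2) or %llu MiB (3/1)\n",
           total >> 20, user_2_2 >> 20, user_3_1 >> 20);
    return 0;
}
```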
This adds a whole lot of context to that era! Thanks for all these videos, can’t wait for the full interview
I was just thinking the same thing, and when you hear him describe what happened, that whole awkward 32-bit to 64-bit transition period suddenly makes a ton of sense. If you've ever researched those releases on Wikipedia, it's not necessarily obvious what happened or why they did what they did, so this totally clears some of that up, and I'm sure those wiki pages are going to make a lot more sense once someone adds this bit of info to them, now that there's a citable source.
Awesome interview, because this is the insider stuff you never hear about (anywhere else), and as someone who makes a living supporting these systems, it's enlightening to see what the actual lineage was. I also had a curiosity about it recently, because I watched someone boot up a semi-functional version of XP on a much more modern PC that was only a few years old. The guy actually managed to get the x64 version to load, since it was based on that Server 2003 version he mentioned them switching to in this video, but the PC was still so new that it required him to use the kernel or some other integral piece of a much later release of Windows Server 2003, because it shared the same codebase.
Now I'm kinda wondering if he could fix those last couple of things that he couldn't get to run, for various reasons, by using chunks from something even newer, now that we've just heard Mr. Cutler confirm that all later versions of Windows were built from that same unified codebase. It also explains why I spent the last four years supporting so many legacy systems still running some version of XP. Many of these were custom embedded systems, often running on Pentium 4s from 20 years ago. Those could've been swapped out for something newer, but it ultimately came down to the combination of legacy I/O that needed PCI or ISA cards to interface with the PLCs (I work in manufacturing IT) and proprietary software, and usually a vendor that had ceased to exist or would only provide limited support at a hefty price, which usually turned into a sales pitch for newer equipment that would have cost millions, very late in the lifecycle of a product that's only going to be manufactured until 2025. After that, they're planning to re-work that engine from the current ICE iteration to a PHEV hybrid based off of it, which is gonna be pretty cool, and that will likely be when they're able to make those kinds of necessary upgrades. Assuming the folks at corporate even decide to build it at that plant, because there's always internal competition between the various plants within the company for the products they build, and whichever plants come in last place in key areas, or can't compete with Mexico, or whatever arbitrary metric they've decided on, that determines where they build the next one.
My curiosity isn't always a common use-case, but if I had found a way to Frankenstein together a version of XP that could run on modern PC hardware, that could've been a very cheap solution compared to spending a fortune finding replacement hardware to run the software that controls millions of dollars of manufacturing equipment.
It just really sucks how long the manufacturers of the systems that run those PLCs hung onto XP, because we had manufacturing lines installed new in 2010 that were still running on XP, since the big scary "XP is actually EOL, please stop using this software" message only popped up in 2014. There were some newer machines brought in later, and a handful that were bold enough to be using Windows 7 or Windows Server 2008, and we were able to upgrade the software and/or hardware on those systems to Windows 10 years ago, so as you might imagine there have been many years of frustration for folks like me who are still dealing with legacy systems designed during that era. The parts inside of PLCs don't change very much, if they ever do, because that shit is all modular, and internal communications between the embedded PC and PLC are just serial connections carried via USB, RS-232, Ethernet, or some combination of those via adapters. The worst are manufacturers who use some sort of proprietary or obscure GPIO card/cable that you can't get from anywhere else, so you're vendor-locked to their system of hardware.
So, yeah, it's just a super niche reason for my piqued interest in this particular topic of discussion. The easiest solution is to just throw a ton of money at the problem, but whenever a discussion ends with suggesting that they just pay the vendor for an updated version of their embedded system, or some box that can convert or adapt newer hardware/software to their interface, it tends to stop the conversation, especially when they see a quote from the vendor. So, our department is constantly being asked to move mountains with a zero-dollar budget. As new people cycle in and out of roles in various departments, we have the same exact discussions like it's Groundhog Day, and folks like me are always keeping an eye out for a novel solution. The best-case scenario is when I enthusiastically present some untested, janky-as-fuck, zero-clue-if-it's-even-functional (let alone reliable) one-off solution involving some creative nonsense, and it results in upper management deciding that they should just pay the fucking vendor what they're asking, rather than rely on yet another undocumented solution where only one person actually knows how it works or how to fix it. Turns out there's also a worst-case scenario, where they've allowed mad lads like me, or others before me, to do crazy-ass things that were idiotic and ill-advised from the beginning, and that just complicates life for our future selves or our eventual replacements after we leave.
Seriously, if you've never worked in manufacturing IT, it's a hoot if you like tinkering with ancient stuff and have a particularly dark sense of gallows humor, because that's the sort of personality it takes to start with an idiotic problem that has an obvious solution we know works, and then "engineer" something else from scratch with no clue if it's even possible, all while thinking and verbally saying "this is a bad/stupid idea that's only going to end in tragedy or frustration, but I'll get right on it!"
I've resurrected Windows 98 machines that were still running as late as 2021 or 2022 and were integral to the process. I've also found a way to P2V Windows 7 systems that wouldn't cooperate with the process the way it's supposed to work. I wasn't able to reinstall the software from scratch, because the vendor literally never provided it to us (and likely wouldn't have supported what I did), but I cleverly got it working. The solution involved taking a non-booting virtual drive with all the software installed on it, and creating a brand new, bootable VM on another virtual drive with a working install of the old corporate load of Windows 7. Then I grafted the two together with some wild-ass software I found (can't recall the name of it) that essentially allowed a system admin to create a hard-coded link in place of an existing folder. It's kinda like the way symlinks work in Linux, and basically what Microsoft did when they included those hidden "My Documents" links in the user directory that actually point to the new location of the files. So, basically, I threw a couple of those bad boys into the "Program Files" directory on the C: drive of the working version of Windows 7, so it would always just point at the old, non-booting but installed folder located on the D: drive. After doing that with the user account's directory and anything else that was needed, I just copied and pasted the shortcuts and it all worked like a champ. It was a massive waste of space having duplicate copies of Windows, but it worked, and that janky-ass mess won our department an award.
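For anyone wanting to reproduce that graft today: the built-in equivalent of that mystery tool is most likely an NTFS junction or directory symlink (mklink /J or /D from an elevated prompt). Below is a hedged C sketch using the Win32 symlink API; the paths are made-up examples, not the commenter's actual ones, and creating the link requires elevation on Windows 7.

```c
/* Sketch of the folder-graft trick described above: make a directory
   under C:\Program Files resolve to the install living on the grafted
   D: drive. Paths are hypothetical; needs admin rights on Win7+.
   (An NTFS junction via "mklink /J" achieves the same effect.) */
#include <windows.h>
#include <stdio.h>

int main(void) {
    BOOLEAN ok = CreateSymbolicLinkW(
        L"C:\\Program Files\\VendorApp",   /* link to create       */
        L"D:\\Program Files\\VendorApp",   /* existing real folder */
        SYMBOLIC_LINK_FLAG_DIRECTORY);
    if (ok)
        printf("link created\n");
    else
        printf("failed, error %lu\n", GetLastError());
    return 0;
}
```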
As an old tech head myself, I was so happy to wake up this morning and see a new interview with Mr. Cutler! I've read so much about his efforts with VMS and NT; it's refreshing to finally hear from him directly. Thank you Dave! Please continue to document such important minds of the industry!
It is incredibly rare to see a video about Dave Cutler. Thank you for that.
OMG!!! You have Dave Cutler in your shop!!!! Just the other day I was telling a bunch of students about him and what he did for the Windows ecosystem. I really really hope to be able to meet him somewhere….. although I live in Europe so the chances of that are slim….. looking forward to the entire interview.
I remember that time very well. XP 64 bit edition and Windows Server 2003 were a revelation and finally made it feasible to run huge workloads on Windows machines. This man is a living legend, can’t wait for the full interview to drop. Subbed instantly!
The shop where I was working in the late '90s was running the Windows Server for the DEC Alpha. It ran circles around the Intel systems AND was reliable.
I am madly excited for however much more you and Dave Cutler can deliver to us of conversations and storytelling just like this.
This is very good! Thank you each!
Dave, the content you produce and the pure heart you put into what you are doing is just superb. It adds a whole new layer of context to what some people (often including me) have been trying to deduce or rationalize on their own. Please keep up that fantastic work that you have been doing. Greetings from Warsaw, Poland!
Again, thank you so much for talking to Dave C. Really looking forward to the full interview.
"we didnt need to change anything - your task manager just ran...." -> Dave knowingly nods thinking "of course!"
So cool, a legend in the O/S space. Thank you Dave's Garage
I started out as VMS system manager - Dave Cutler has been with me through my career. Great to hear him talk. Thanks both Daves :)
RT-11 -> Bell System Unix (at Bell Labs in the late '70s) -> SCO Unix -> VRTX-68K -> VMS -> Linux -> Windows for me. Interesting ride.
I really like the VMS core of Windows. It's much more sane than the monolithic insanity of Linux, especially the graphics system. The WDM is a beautiful system, and the X server needs to die. If I were a decision maker on Linux, I would take the SurfaceFlinger system from Android and plop that in as the actual Linux graphics system. Forget Wayland too, and that DRI/DRM crap. Really, framebuffers? (What is this, the 1970s? Is anyone still using CGA cards?) (Let's not talk about audio on Linux; it basically doesn't exist. You use ASIO and then it works, because nothing passes through the kernel and it's all done via hardware.)
He was a refugee from a nest of Unix-haters at DEC. He succumbed to the 1990s fashion of inextricably tying the GUI into the OS kernel, when the *nix systems maintained the GUI as a separate modular, replaceable layer.
Think about why Windows still uses single-letter drive names even today, and why Microsoft felt the need to add WSL to try to turn it into Linux.
@@lawrencedoliveiro9104 "Think about why Windows still uses single-letter drive names even today"
It doesn't, that's only the GUI.
NT file system doesn't have a concept of letter, that's a filter.
Its just a symlink, come on.
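You can actually watch that mapping: the Win32 call below asks the NT object manager what the name "C:" points at, and it comes back as something like \Device\HarddiskVolume3. A minimal sketch; the output varies from machine to machine.

```c
/* Show that a drive letter is just a name in the NT object
   namespace: query what "C:" is mapped to (a \Device\... path). */
#include <windows.h>
#include <stdio.h>

int main(void) {
    wchar_t target[MAX_PATH];
    if (QueryDosDeviceW(L"C:", target, MAX_PATH))
        wprintf(L"C: -> %ls\n", target);
    else
        wprintf(L"query failed, error %lu\n", GetLastError());
    return 0;
}
```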
@@lawrencedoliveiro9104 "He succumbed to the 1990s fashion of inextricably tying the GUI into the OS kernel"
The irony is that Linux is a freaking monolith kernel.
It was done for performance. Computers on 1990s weren't that much faster.
That division was so stupid that even XServer had to retract.
No one ever ran a distributed GUI system like a client/server.
Except Citrix metaframe, aka, Windows RDP. Ironically.
This is fantastic content, great to see the great Dave Cutler!
Thank you, Dave, for putting up these excerpts of the interview. Amazing to hear such an open and honest discussion about past internal issues and the dynamics of coding. Looking forward to the full interview. Cheers
Best interview I’ve seen in a while Dave! Love to see you sitting back and letting your guest tell the story. So few do this.
Please do some more of these. Hugely interesting!!
Dave awesome interview. Cant wait for the full version.
What a wonderful and enthralling interview. His memory is as sharp as a razor! I wish my memory was so good. Talk about a life well lived. Thanks for recording this oral history. Great Questions too
As an old VAX/VMS system admin, that was one of the best interviews I've ever seen! I remember when I first heard, in the late 80s, that Microsoft was bringing Dave Cutler onboard. I said "Microsoft just got REALLY serious about Operating Systems!" And I excitedly waited. I adopted NT at version 3.1. But I gotta say...I miss VMS. Best Operating System humanity has ever produced.
Well, after having passed through several owners, OpenVMS is still available for commercial sale! 😊 It has even recently been ported to x86_64 and virtual environments.
I got introduced to RSX-11M at my first job after college, and that O/S came with a code kit. You went through some gyrations if you were an OEM (as we were), because if you had a proprietary device, you had to roll your own driver (which we did). Looking through the code, I ran across a comment that told me that Dave was obviously working at another level and that his colleagues looked on him as a software god. I can't remember the instruction or the section of code it was in, but one COMMENT on that (PDP-11 Macro) instruction was "; Dave Cutler SWEARS this will work." The company graduated to VAXen within a few years after I joined, so I went to all the VAX software and O/S schools. Loved that machine and that O/S. But... OpenVMS became the O/S that wouldn't die. I went to work for the U.S. Navy and we had OpenVMS on a personnel system. When COMPAQ bought out DEC, we switched to Alpha/OpenVMS. And when H/P bought out COMPAQ, we switched to Itanium/OpenVMS. Our customers loved it, because every upgraded system ran 10-30 times faster than its predecessor. The Itaniums were finally switched off when the app graduated to a web server and Oracle/Unix.
Yielding was a bit of a pain, though.
As someone a little older, I still think that Burroughs MCP was pretty good.
What a nice surprise: in the Dave Q&A videos some people asked about Dave Cutler, and now he sits in your garage. I would pass out within a second, but you did well, Dave. Love to see more of these videos. Your channel is awesome.
This is absolutely fascinating! I'm so excited to see this in full.
Fantastic to hear Dave Cutler talking about this. Two legends here!
This explains so much. I distinctly remember XP being very buggy at launch but dramatically improving after one of the big service packs.
Service Pack 2 (2004) was the biggest one, and the one that fixed most of the bugs.
I remember XP becoming much more stable after service pack 1
The "Doesn't matter Horn" made my day :D
Thank you Dave, it's always so interesting to hear things about how Microsoft made their OS back in the day when the internet either didn't exist yet or was very slow, so no chance of patching bugs too fast.
I think he's referring to Matterhorn - the mountain in the Alps.
@@bigredracingteam9642 I know ;) I don't live too far away from there, Southern Germany :)
Dave, this is the most awesome content I have seen on YouTube. Good times. I want more stories.
The curse of outsourcing code is very evident in AAA games, bugs, TERRIBLE menu systems, clunky game design etc...
MW2 instantly comes to mind…
Not really. I think I know which games you are referring to, but those problems are in no way new to that company, nor due to the outsourcing. If anything, the outsourcing seems to have resulted in a more stable product than what they usually provide, not a worse one.
Outsourcing actually forces you to have a minimum level of good design and modularity for it to work, so for some rotten companies, outsourcing might even be an improvement
Sadly, those problems run much deeper than just outsourcing recklessly
The menu system problem happens when you have artists/graphic designers setting up the menus (or worse, programmers). What you need is a trained, skilled, and accomplished UI/UX specialist. Not many game companies hire one of these. When I first got to work with one, probably 10 years after starting to write modern console games, that guy was a firehose of great information about how people _actually_ use interfaces, how to make sure they _can_ use interfaces (without help), and how completely idiotic most programmers AND most artists are when they try to design such things without having studied the specific crafts of UI/UX. It's really a science of its own.
@@Felice_Enellen 100% this, amen
@@Felice_Enellen very true. I’m a backend engineer and a while ago I had to make a mobile interface even though I have NO experience in that, and no interest in it either. People “running” the show hire great people but haven’t got a clue how to use them properly.
Amazing, thanks a lot!!!!
P.S. Now finally someone tells me ... I was the Server BG lead for Switzerland at that time (2003-5), but was never told anything like this. Can't wait for the full interview!
This is amazing!! People like Ken Thompson, Dennis Ritchie, Dave Cutler, Gary Kildall, Torvalds, Terry Davis... all those crazy people who love to do low-level stuff, that deep core stuff, are my inspirations!
and to the above names concat (Curious Marc, Dave's Garage)
Thanks Dave for this interview!
Don’t forget RMS and Tevanian!
I'm so looking forward to the entire interview!
Great interview... I was intently listening, and then it was over. Bring back more of this type of content and make it a little longer. Always interested in behind-the-scenes tech stories.
That was quite a captivating interview segment. I loved it.
Your interview technique is very good and it's the first time I've seen it. You let Dave talk without cutting him off or stepping on him. You clearly have faith that the guy is switched on, interesting and that your viewers will find his story as interesting as you do. Great work Daves! It looks like there's a lot of mutual respect between you two.
It is a non-confrontational technique. It works really well when the subject wants to share, but not when they want to obfuscate.
Need more of this conversation!!
How cool is it that you can get Dave Cutler to come over to your house and tell war stories. Well done. Very well done. More. Please
This interview was par excellence. Loved it. I remember workstation days well.
So interesting. Working as a software developer on the financial side of the business, I can relate to this (of course, at the user application level): many of the same issues arose from purchased code that I had to fix or black-box to make it work with our systems.
Please do more of these. These types of interviews and stories are so interesting to outsiders. I'm 75 and was an adopter of MS Windows at its market onset.
Thank you, and thanks to Mr. Dave Cutler for taking the time to talk.
I worked at Muzak 10 years or so ago. We outsourced some stuff to a French team, who followed no process at all and pushed out an update to all worldwide video players (the devices that play videos in department stores and Times Square, etc), bricking them all for a day or so. That killed the entire product line everywhere and ended up causing us to close our HQ building, and contributed a lot to the bankruptcy in 2020.
This is incredible, thank you for getting these stories documented!
Cool interview. Enjoying these!
Oh, I vaguely remember. XP 64-bit had that thing where it was forked off a different codebase, so a lot of XP software had issues running on it. By comparison, Vista 64 was a great leap.
- As for seeing bad code: "The only valid measurement of code quality. WTFs/minute." - Thom Holwerda, 2008.
- Love that logic from the server people: "We prefer x64, because it takes longer for our memory leaks to cause an OOM." When you are so far behind in fixing memory leaks, "just add more RAM" is a valid solution.
This was also about the time Microsoft acquired Hotmail, which had been running its operation on some BSD variant, and then made a big noise about switching everything over to Windows NT.
Only it’s not clear that they got very far.
By Service Packs 3 and 4, XP x64 was the same as the 32-bit version. It ended up being much better and more stable than the 32-bit version. I used it for quite a while until Windows 7 was mature.
During this era the hardware that could take advantage of 64-bit was also pretty rare, so a lot of the outfits running it did so on decked-out Intel Mac Pros. The likes of HP and Dell were still dragging their feet on releasing 64-bit Xeons with 64-bit EFIs/BIOSes, so many went with Apple, then wiped OS X and stuck Windows on them. I used to work with a guy who was in the Mac Warehouse section of PC World in the UK, in their business unit, who did this quite a lot for corporate clients.
@@skilletpan5674 I think you are mixing things up.
XP only had 3 SPs total, the last of which was never released for x64.
Four SPs sounds like you are talking about NT; that's the only one that got that many.
It's not exactly more RAM, just a bigger addressing range: there are more virtual memory addresses available to the process. Leaked pages get swapped out to disk, since they're never touched again, so they don't tie up physical RAM; what a leak actually exhausts first is address space.
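To make the distinction concrete, here's a toy leak in C (hypothetical code, nothing from Windows): it allocates 1 MiB blocks forever and touches one byte per block. A 32-bit build dies when the process runs out of virtual addresses, typically somewhere under 2-3 GiB, long before a modern machine is out of RAM; the same source built for x64 can keep leaking until the commit limit or pagefile gives out.

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        size_t leaked_mib = 0;

        /* Leak on purpose: allocate and never free. Only the single touched
           page per block stays hot; the rest can be paged out, so the hard
           wall on 32-bit is address space, not physical memory. */
        for (;;) {
            char *p = malloc(1 << 20);
            if (p == NULL) {
                printf("malloc failed after leaking %zu MiB\n", leaked_mib);
                return 1;
            }
            p[0] = 1;
            leaked_mib++;
        }
    }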
I'm really enjoying the snippets of the interview. Well-done Dave.
Thank you Dave for this video
This is a fantastic channel. Great host and Dave Cutler is a genius. What we are listening to is a history documentary technology, something that we will keep coming back in decades to come.
This is so interesting, I can not imagine a more interesting guest on the show, love hearing from his development point of view. I feel like despite being such an integral figure not much information or stories that he loves to tell are out there, so thanks for bringing this to light :)
This was eye opening for me. Thank you Dave!
As someone who grew up on PDP/RSX-11 (M and S) and VAX/VMS, and who still has a copy of the VMS Internals and Data Structures manual (at least I think that's the title, going from memory), this is a fun interview. I've always thought Cutler was one of the better OS designers of the minicomputer era. I must say, though, that I still prefer his work in the DEC world over his later work in the PC world. I still miss the file version numbers that we had in the DEC world. 😁
Yes! Cyber 73/74 -> DEC-10 -> UNIVAC-1100 -> VAX-VMS ... and so on ... DEC had the best CLI OSes, bar none! UNIX was mostly more arcane and a bit more flexible ... and developed on a PDP ... I miss the DEC-10 more than the VAXen .. ;-)
@@HaroldSchranz I agree. UNIX is an abomination compared to VMS. RSX was more cryptic like UNIX, but most of that was corrected in VMS. I never used a DEC-10 so no experience there. The first major system I installed was a VAX-11/750 running VMS and 27 PDP-11/23s running RSX-11S connected via DECdataway. If memory serves, our DEC sales rep said that we had the highest node count DECdataway installation in the world at that time (1984ish).
I still have my copy, as well. Great memories of working at DEC.
@@lawrencedoliveiro9104 It's ridiculous that today I still have drive letters embedded in text command files on Windows, while 30 years ago on VMS we abstracted away device names from the actual hardware using logical assignments. Even on the Apple II, ProDOS abstracted away the hardware names to logical names. Use the logical name and the OS would find the right device on its own.
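Win32 does actually expose a thin version of that idea through the same symbolic-link machinery; it's just not how anyone writes their batch files. A minimal sketch (Z: and C:\data are made-up examples) that creates and removes a subst-style mapping at runtime:

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        /* DefineDosDevice adds a symbolic link in the DOS device namespace,
           making "Z:" an alias for the directory (this is what SUBST does). */
        if (!DefineDosDeviceW(0, L"Z:", L"C:\\data")) {
            fprintf(stderr, "DefineDosDevice failed: %lu\n", GetLastError());
            return 1;
        }
        puts("Z: now points at C:\\data");

        /* Remove the most recent definition for Z: again. */
        DefineDosDeviceW(DDD_REMOVE_DEFINITION, L"Z:", NULL);
        return 0;
    }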
I love these stories.
You're documenting the history behind the foundations of our IT industry, Dave. Thank you for all you do.
00:00 🤖 Dave Cutler introduces the topic of Longhorn, hinting at its relationship with Vista.
00:35 🛠️ Windows 2000 served as the foundation for Longhorn's development, with both workstation and server sharing the same codebase.
01:05 ⏳ Diverging timelines: Server development estimated at 3 years, while Consumer version aimed for 1.5 years due to different user expectations.
01:31 🐞 Consumer software faced significant build and run issues compared to the server branch, which focused on addressing security bugs.
02:25 🚫 Security concerns with XP led to a halt in development, particularly due to the emergence of buffer overflow attacks (see the sketch after this list).
02:52 🧩 X64 project emerges as an independent venture to introduce 64-bit extensions, initially not a sanctioned company-wide initiative.
04:00 💡 X64 project adopts the server codebase, successfully creating 64-bit versions for both workstations and servers.
05:41 🌟 Successful launch of the 64-bit system, with Microsoft's website transitioning to it for its notable reliability.
06:58 🔁 Unified efforts: The decision is made to switch Longhorn's code base to the x64 code base for improved performance and security.
08:34 🐜 Extensive efforts to fix security issues in XP, with the discovery of challenging-to-mitigate overflow bugs.
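Since the 02:25 point turns on buffer overflows, here's the bug class in miniature (toy C code, obviously not anything from Windows): a fixed-size stack buffer plus an unchecked copy, so any input longer than 16 bytes overwrites adjacent stack memory, return address included.

    #include <string.h>

    /* Classic stack overflow: strcpy has no idea the buffer is 16 bytes,
       so longer input silently tramples whatever sits next to it on the
       stack (saved registers, the return address). */
    void vulnerable(const char *input)
    {
        char buf[16];
        strcpy(buf, input);
    }

    /* One standard mitigation: carry the destination size and truncate. */
    void safer(const char *input)
    {
        char buf[16];
        strncpy(buf, input, sizeof(buf) - 1);
        buf[sizeof(buf) - 1] = '\0';
    }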
Only note I'd add: as he mentioned, the "reliability" was really just x64 hiding the problems in their code. x64 allows a process with memory leaks to run longer before resource exhaustion.
Thank you very much
6:17 🦵 He’s got cramps in his legs.
"due to different user expectations" - unfortunately management always goes by this logic, i understand that its a reasonable thing to do to maximize profits, the big downside is that users become beta testers and in general even operating systems like most windows versions have a large amount of performance and UX issues and security holes.
They could have made a unified kernel with separate consumer and server parts, but it is a monolithic OS...
Great interview! I'd love to see more interviews on the channel.
In hindsight that almost seems like why Microsoft had such a big problem developing a successor to XP. If they'd spent more time getting XP right in the first place rather than having to stop and go back to do Service Pack 2, I wonder if successors would have flowed more naturally from that. I gather XP was so wildly successful mostly because it was the much more reliable NT architecture rather than the 9x architecture people were used to, with just enough improvements over 2000 to make it seriously viable for home and gaming use (though I recall there still being a LOT of games that didn't work out of the box on XP).
2000 was a better desktop OS. What features did XP have for "home and gaming", exactly? It had a shinier GUI, but that's about it. The only games that were compatible with XP and not 2000 were the ones that did arbitrary version checks and refused to run, even though they theoretically could.
@@doltBmB WiFi support/configuration in stock 2000 was bad compared to XP which unified it at least.
@@doltBmB I remember throwing Blizzard games (back when they were good) at Win2000 and they ran without a hitch. In fact IIRC they were the only games I had that would run under 2k out of the box. Other times like with GTA3, you had to jump through hoops to get it going.
@@doltBmB XP's 9x and even DOS compatibility was a vast improvement over anything in 2000. For instance, the virtual DOS machine in Windows 2000 does not emulate sound, whereas XP's does. A whole bunch of compatibility tweaks specifically for making 9x games work on NT systems were included in XP but not nearly so many in 2000. I believe XP also had better support for legacy drivers, though I could be wrong on that point; certainly in practice hardware support was better. Away from compatibility, XP also had fast user switching, which was a bit of a game changer in terms of introducing multiple user accounts to the home, and since SP2 it also had a bunch more security features like a firewall (still fairly important in the days before most people had home routers) which were never available in 2000 in a consumer-friendly format.
@@snap_oversteer Windows 2000 did not have "wifi" support; it had network support. It was up to the wifi driver to handle the wifi part, just like it is up to any network card driver to handle a physical connection. What XP had was a standard driver for the most common wifi cards and a special GUI for wifi connections. In a way this is overstepping the bounds of what an OS should do, which is part of the reason why XP is shit: it tries to do everything, something that has only gotten worse and worse. Meanwhile, 2000 recognizes that the only thing it should do is allow you to run your software.
love this video, thank you so much for interviewing such legends.
I could easily sit for another hour listening to this conversaton
This is fascinating! Thank you for the video, looking forward to watching the full interview!
This actually makes a whole lot of sense. I knew Longhorn was buggy, but I didn't know it was so buggy that it had bugs that ended up being unfixable.
I was in IT at a Fortune 50 company when XP was introduced, and we summoned our Microsoft rep and his management to come and address the IT organization on how Vista and XP would be made less buggy. I remember sitting in that meeting; the Microsoft folks were squirming in their seats. All they could come up with was "Dave Cutler is working on it - the next OS should be great." This did *not* make us happy.
A project run by accountants - what else could it possibly produce? And Dave was never a frontend guy trying to satisfy the whims of accountants. That approach paid off: when they ruined their branch, he was called in to save the world. 🙂
Fascinating how it all comes together. I love it. This was wonderful @Dave's Garage. I never had the insight in the details, so these videos are real gems.
These videos with Dave Cutler are the greatest videos ever.
8:18 Having completed my software degree's international studies and internship in Japan in the 2000s, this is all too easy to imagine 😅 Most of the code they were writing, and the lack of standards and good programming habits/practices that I saw back then, made me feel sick to my stomach.
The worst I saw was a huge, long-running international ML (way before it was mainstream) and robotics project that combined several advanced sensors and robotics to make a physically self-learning robot through self-reflection... and ALL of that code was in a single C file so massive that it couldn't even be opened on some of their older laptops; the editor just crashed!
Not only that, all the comments and variable names were written in different non-English languages with no uniform naming or indentation practices 🤢
Worst of all, one of my friends-to-be, a French ML genius (compared to me at least), was there to do his doctorate on it, only to realize that would be impossible, and he eventually had to rewrite/reformat/restructure/translate the whole thing before he could even get started with his own work on it. He eventually had to extend his 6-month stay into a full year just to be able to do it.
It took him nearly 6 months just to refactor that horror show of a project and its countless bugs. He got into a lot of trouble because of it, since he had to change so many of his plans in France, which ended up costing him a job he already had waiting for him back home.
In the end he managed to fix it, made it all work as intended and not only did he get his doctorate done successfully, he met his future wife during that stay and later started his own company.
I still see that project mentioned or shown on the internet sometimes (even in memes), but I'm not sure what results they got from it, or what other projects it led to, since I was (suddenly) put in charge of another project.
The WWW Consortium… used to be the Bible in the Web cowboy days.
what was the project?
@@lxy1312 I would rather not give more details for obvious reasons
@@Songfugel I always assumed Japan wrote good code because their technology is so good
@@s0ckpupp3t Fun fact: "Made in Japan" used to have the same connotation as "Made in China" does today. I can also confidently state that a significant amount of tech is the code equivalent of being held together with duct tape and superglue.
Just look at Boeing software killing hundreds of people! The plane itself had to go through multiple physical tests to ensure the hardware functioned as expected, yet no third party actually reviewed the code; they just checked that "standards" were followed in how it was written.
This was extremely cool! Thanks Dave!
Hey Dave, why did the Windows XP home edition loading bar turn from Green (the HE colour) to Blue (like in Pro) after SP2?
Interesting question, I definitely remember the bar being green on my PC back in the day. But haven't seen XP Home in a looong time especially not pre-SP2.
Great job Dave! Can't wait for the full release!
Dave Cutler, the man, the legend
These interviews are fascinating Dave, thank you so much!
Interesting to hear what some of us have experienced and suspected all along, that the code base of Windows XP and prior were a buggy mess. The last Windows I used was Windows 2000. However, in 2001 I switched to Linux exclusively and I haven't looked back since. To me, there is nothing appealing about Windows today, especially for developers Windows is much inferior to Linux.
Your last sentence is nonsense. Visual Studio is literally superior to any development environment on Linux. Do you know how many apps I use on Windows that do not exist on Linux because nobody wants to waste time porting it to an operating system with such a small market share compared to Windows?
@@dennisanderson8663 Visual Studio? You must be kidding us, what a joke of a development software haha.
Windows has been fine since Windows 7. Linux is just a massive hassle for hardware/software compatibility.
Can't wait for the full interview!
Dave Cutler cuts a wide path through life. My Dave Cutler story happened many years ago when I was contracting at Tandem. For various reasons, this huge body of code that I had been working on got code reviewed by Dave&co. The result (either through the brilliance of my code or, more probably, politicking by Compaq) was that Microsoft bought into the code and I was out of a job 2 days later. (No tears were shed, to be clear!) I think at MS it was called Wolfpack, but I'm not too sure about it. I don't think it seriously went anywhere. The other amusing thing on that day was a demo - where one of my colleagues was demonstrating a process failing over from one machine to another - it took about 30 seconds to exhaust the retries and do the failover, as I recall. After about 20 seconds Dave or somebody said "Hm. Sure is taking a long time..." to which the guy giving the demo replied "YOU think it's taking a long time? Imagine how long it feels like to ME!"
I attended some initial design reviews of Wolfpack with Rob Short and John Vert. To be honest, they had a hard time explaining what they were trying to achieve and it took a while for it to get off the ground. I never did get to use it, because there were alternative ways to build reliable systems other than clustering. But it was always fun visiting Microsoft and participating in these things, I often came away feeling a lot smarter after interacting with these guys, and also dumber when I realized all the stuff they were doing that was way over my head!
Dave, you outdid yourself with this interview. I'm amazed that you got an interview with the great Dave Cutler.
I first became acquainted with RSX11-M while doing some work at a cancer agency while in university. To my delight, RSX11-M shipped with the source code of the operating system, and I spent endless hours poring over PDP-11 code, reading and understanding the code written by Dave Cutler. Every now and then he would add some comments reflecting his wry wit.
Looking forward to the entire interview! You have a fantastic channel ...... a fellow Canadian
"We can't wait that long" and "the consumer doesn't expect reliability"...
It was at that point that he f-ed up. Turns out, people really do want their system to be stable. The operating system should be the last thing people think about on their computer. It should run stably and get out of their way so they can run the programs they need to run. If the system is crashing, getting in people's way, or running poorly, then the operating system is failing at its most important job.
Great interview, things I suspected back in the day clarified and/or confirmed.
Is there a full interview for this coming up? Love inside stories like this; it's super interesting. Makes my inner nerd very happy.
the end card of the video explains it
@@xXx_Regulus_xXx Saw that now, ty.
This is amazing!! Thank you Dave, and Dave!
I'm saddened that decent men like this will be going extinct soon...
I read one of Cutler's older books about NT so it's great to hear from him again.
People really don't give AMD enough credit for innovating like crazy
Interesting upload, thank you both
Love this, but can you please remove the flashy RGB stuff from the background? I'm trying to unfry my dopamine receptors.
I am amazed at the access we have to this man, a gem full of info, thank you.
Thank you very much for the interview! It's extremely hard for me in Russia to swim against the current when I claim that Vista was revolutionary and gave humanity a big push forward in development. Unification, 64-bit, services, tasks, processes. It was all very wisely built. Great work!
Amazing interview! Love these stories!!!
I don't know that it's the worst I've ever seen, but I was amused when I discovered that the Z-80 editor assembler sold by Radio Shack (but written by Microsoft) had a bunch of code to check relative jumps, and it was wrong (would let you have a forward jump of 128 bytes, which the CPU saw as a backward jump).
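For anyone who hasn't hit this: the Z-80's relative jumps store their displacement as a signed 8-bit value, so the legal forward range is +127, and an assembler that accepts +128 emits the byte 0x80, which the CPU reads as -128. A tiny C illustration of the wraparound (just the arithmetic, not the original assembler's code):

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        uint8_t encoded = 128;                    /* what the buggy range check let through */
        int8_t as_seen_by_cpu = (int8_t)encoded;  /* 0x80 read as a signed displacement */

        printf("assembler emitted +%u, CPU jumps %d\n",
               (unsigned)encoded, (int)as_seen_by_cpu);
        return 0;
    }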
Excellent interview. Keep it coming, Dave!!!
I was getting MSDN and TechNet regularly back when this was going on. We pushed a lot of people to NT 4 and later XP because the network stack was just better.
Your videos are awesome. I love how he pointed out "your Task Manager"; it makes me feel like a fly on the wall listening to the greats.
I was working on this big multi-phased ERP project years ago. I was seeing tons of outsourced data-integration programs delivered (and supposedly tested) that were buggy and nowhere near spec. It had taken the offshore team an ungodly amount of time to produce this one specific program, and it was total fubar crap; it wouldn't even run. My star SE and I just shook our heads as we performed our standard code review of it. It was so bad he proposed to just write this data-integration program himself. I asked how long it would take. About a day, he said. Do it, I said. This was all under the table, because the implementation partner was getting paid millions every month to produce this crap. Ultimately, I made the call in order to maintain the critical path. About 50 outsourced programmers were contracted at peak for a moderate amount of time during this 5-year project. Their code was fugly crap. Seven of us employees could have done this area of the project, producing near bullet-proof code to spec, easily within the timeline. But, noooo... it was all outsourced to cheap green coders. 35 years as a software engineer, what do I know. Thankfully those days are gone. Clean green pastures for me.
I would love if you did more interviews or even a podcast.
Interestingly, the Japanese IME in Windows Mobile was excellent from the user's perspective. The kanji writing recognition was superb as far back as WM2003 (that is the earliest I have tried). I still use this and it is awesome.
For reference, iOS and Android were terrible until about 2014. Partly because Apple & Google were too lazy to port their Chinese work to Japanese.
I used Microsoft's Japanese IME on 32-bit Win2000 and XP; it was hot garbage, and I'm not surprised he found all those bugs. JWPce was a necessity.
@@wrenchposting9097 I used the Japanese IME on Win7 and Win10 and thought it worked fine. I don't recall using older versions, so I can't comment.
However, I would add the Kanji handwriting recognition on those old win mobile devices was great (wm2003 forward).
Man, I love this channel! Bringing back memories. I remember reading the foreword by Dave Cutler in the first edition of the book Inside Windows NT.
I wasted thousands of hours of my youth in the middle of the OS wars on usenet.
It's funny; probably not a lot of your audience will be familiar with Windows IME, but since I use Japanese regularly, I am well familiar with it. For me bringing up that piece of software is like bringing up Task Manager, a highly familiar face for us all, so it's interesting to know the history of it. In Japan, Windows IME is of course available on all Windows, but I found that Atok, another IME software, was often more popular. I wonder if the history alluded to here, of Microsoft having a hard time getting that IME code ready, was a formative factor in this.
I think a lot of us have waited for this insight for many years. Ambitions did seem to come, but years later and from different teams.
This is wild to hear from the people on the inside of a gigantic company explain why I felt the way I did about all of those products 😅
I used 98; ME was my first OS install; loved 2000; skipped Vista; hesitantly went to 7 and stayed there a loooong time. 10 and 11 have been decent but increasingly intrusive and harder to de-nanny.
How does Windows 10 nanny you? I notice nothing. I've only been computing for 50 years, so maybe I'm just naive.
Search for how to disable spying and telemetry or increase privacy. Decrapifier is one of the handy tools. The list of things may or may not bother you but the stuff is there.
@@loupasternak Nanny isn't the right term; it's more that it's intrusive.
Linux doesn't nanny. I got fed up with Windows 10's bullshit and set up Linux on a new laptop a year ago. It has everything in it and is much faster, like it always was. I think I'll also upgrade some of my PCs this way. With all due respect to Dave, shithead accountants are killing Windows. I'd install VMS, but unfortunately it's not for home PCs, and I don't have a nuclear plant to manage.
It's very cool you are documenting so much of windows history.
A company I used to work for outsourced code to India. One of the engineers traveled to India to see what was being done. There was a large room with 2 meter long tables spaced as if it was a classroom. People were sitting at the tables and writing code...by hand...into notebooks...using pencils. There was a _single computer_ on a table at the head of the room. When one of the coders was done writing, they would walk to the computer and enter their code into electronic format. Then vacate the seat for the next person in line and go back to their "coding" table and "write" more "code." Then some unlucky schmuck at some other location had to put all of that code together, compile it, and "run" it. The company had no idea that was happening. Do not outsource _anything._
fake
Great segment! Thank you!
That was the perfect time back then to start over: they should've downloaded the Linux/BSD code and seen how an OS should be done (not just the code, but the architecture as well). That could've saved their server products and their mobile division, and given them a better desktop (not like ME, Vista, W8 or W11), better file systems (ones that are fast, don't lock files, and don't get fragmented), and better security and reliability. 30 years later MS still has no clue how to build an OS (they missed using internet/web search to find Linux): c'mon, you can't put configs, executables, and data in the same folder and call it a day, and don't get me started on the abomination called the Registry. Now mobile is dead, and their servers are almost dead (Linux saved the day).
Great interview. I hope Mr. Cutler comes back for another session.