My mom worked at CRAY on the YMP, and I worked at SGI (Silicon Graphics Inc.). When the last space shuttle disaster happened, we had to rush and build a O3000 for Ames for disaster reconstruction. I was told that they sent out people with hand held GPS to find shuttle debris and log the location of the part to help determine the cause of the catastrophe using the SGI O3000.
The previous generation, the X-MP, was used to generate the visual effects of "The Last Starfighter." In Japan that same year, a Cray-1 was used on the anime film "Lensman."
My friend "worked" at SGI (I believe it got "assimilated" into the Google campus now), I'm pretty sure that the only reason he got the job there was because his mom was a pretty influential higher-up. He didn't last very long, but the free espresso and snacks when I visited him at work was pretty cool. 😂
@@nelsonbrum8496 SGI sold there HQ to Google but they were eventually bought out by Rackable Systems in 2009 who then went by SGI and then were acquired by HPE in 2016
I started engineering in the late 80s. We'd set up an FEA (Finite Element Analyses) model on our old 386 machine and run it overnight as it would take well over 8 hours to complete. Usually it would fail for one of a dozen reasons. Fix the model, run it again overnight. Some models would take a week just to get the first result. Today, the same analyses take seconds. It's amazing how far we've come in such a short period. Tech note, your audio track is uncharacteristically off. Your voice level is quite low. The closing music came in at the proper level and blew my head phones off.
My first job was at a research institute. More than one of the people I worked for, had done a lot of work with computers earlier in their careers, creating and loading piles of punch-cards. Many are the stories of dropped boxes of cards, cards in the wrong order, the wrong orientation... and then there are the mistakes with the code itself. Get in the queue to use a computer overnight, only to have the job fail after a couple of hours... if you were lucky there'd be a proactive computer operator working at night who could have a go at fixing it. we have come so far since then, it's amazing, though booking times on super-computers is still necessary.
I started engineering in the early 90's. FEA is at least half voodoo. It's probably significantly better now with modern computers, but back then it was rough.
I am in University right now and we can run FEA on laptops, it takes about as long as your 386 did but I think we can have much finer meshes. We just got some funding and spent a good chunk on a workstation that should compute even our hardest tasks in seconds which is really exciting.
Great episode, but the volume seemed low. Anyway, it was great to hear the VAX 11/750 mentioned. My first years as a sys admin were spent on 11/785s. The whole concept of the VAXCluster was phenomenal at the time.
Great presentation! I had the pleasure of working on most of the platforms mentioned from CDC, Cray and IBM as a system programmer at the GM Research Laboratories. We often collaborated with NASA Ames and NASA Lewis on software issues.
I was a computer professional for 50 years,so lived through much of this story. I helped maintain the IBM 360/91 at Princeton University in the late 1960’s and early 70’s. I remember be called upon the help out at NASA in Harlem NYC. They had an IBM 360/95. Quite an honor for a young guy. Also, an adventure for a country boy. I was on the subway, feeling really out of place, and I was asked for directions. I also maintained the IBM 370/195 at NOAA on the Princeton campus. Exciting times. In later years, I was involved with clustered UNIX servers, all connected with a high speed switch. Each node would compute part of the problem.
I agree- the “Aeronautics” part of the NASA doesn’t get the publicity of the “Space” part, but it’s impact is, perhaps, more significant. Ames- formerly Moffett Field, has an incredible history.
NASA does a ton. NACA did a lot of really cool research too. Almost every aircraft in WWII used a wing profile developed by NACA. Not all of them did, like for example the P-47 Thunderbolt didn't use a NACA wing, but pick an aircraft from any nation and the chances are extensive high its wing is a NACA wing.
NASA is responsible for a great part of our modern world. Google "NASA spinoffs". There's a Wikipedia article which breaks them down by industry. It's shocking just how much NASA has done.
@@nomar5spaulding NACA's influence is not limited to aeronautics/aerospace. The NACA duct/scoop is prevalent on just about every motorsports vehicle competing. I think the first one I ever remember seeing was on the side of the Lamborghini Countach.
The operating system and the programming languages were just as instrumental. Programmers with clean tight code in machine instructions still remain to be the unsung heroes. After all, you can't have a schlep driving a rocket ship! 👍👍👍
@@josephgaviota Yeah, path signal is important, but the big thing MOSFET transistors are modeled as R-C circuits. Charging follows I(1-e^-t/RC)=v. The big thing is reducing the capacitance of the gate of the transistor. This is accomplished by making the gate smaller When v= ~1/ev_max, it's considered 'ON', or a logic 1, otherwise off. At the speed of the cray it probably was a factor, just like the size of motherboards became a factor when speed went to GHz range. Wires are also capacitors and must charge up before current flows. The smaller the wire, the less charging needs to be done
As the daughter of an aerospace engineer, I found your video quite fascinating, talking about that side of computing. There were so many terms that popped out in my mind that my dad used to talk about ☺️ As an engineer’s kid in the 1970s my exposure to computers began with the optical read paper tape driven MCC drilling machines that my dad serviced as his side business. This progressed to assisting him with his field services; swapping out the serval (sp?) motors he had taught me to rebuild. During this time he also taught me how to do the circuit board paste-ups and assembly for the computer that he invented to translate data from the optical paper tape to reel-to-reel magnetic tape (circa 1977/78). I held the job title of electronic service technician from the age of 11 years old when I first started working with my dad until around the age of 15 years old when my teenage angst took over and we weren’t getting along that well 🙄 My next exposure to computers was in 1985 when I took a “computer literacy” course at one of those “career colleges” which taught programming in Basic on a Commodore 64 (more like an iOS predecessor), but the computer that they gave me at the end of the course was a Sanyo MBC-550, which was more of an IBM-type operating system. This was back in the infancy of personal computers and so these machines did not play well with one another and things like emulators did not exist. So it was kinda like being taught to communicate in Spanish but then having to work in an environment that only understands German 😆💯🤦🏻♀️ Anyway, thanks for the awesome video, I really enjoyed it; at least until the end where it appears that whoever did the video editing accidentally turned the volume up on the music channel instead of lowering it into a fade…..🤦🏻♀️💯🤷🏻♀️
I think the volume problem is actually that the main part of the video is much lower than it was supposed to be. The outro volume was about right. Something got messed up with the gain on the main part of the video, it seems.
@@vbscript2 Ah, yeah maybe. I’ve only done audio/video editing once, so my experience is limited to that. It’s always helpful to hear other prospectives on this so thank you for that ☺️ I still have a lot of learning ahead of me as I plan on creating music videos for my music in the near future. I can’t even remember the name of the editing app I used last time, but would love any suggestions on what kinds would be good for a beginner like me who doesn’t have any kind of budget getting started.
@@rochelleesser7961 I've also only done a small amount of video editing, but I'm just comparing the volume here to the volume in other video content. Normally, though, the volume in the video should be using a relatively large percentage of the available dynamic range, but should just be quiet enough to not cause any clipping. The user can then adjust the volume on their device to whatever level is needed. For signal processing in general, it's pretty much always best to preserve as much signal as possible (without clipping) for as long as possible in order to get the best quality.
I worked at Cray starting as a Reliability Engineer on the Cray 2, from there I went to work in the Engineering Building then wound up in the Development division as a Circuit Design Engineer. I worked on CMOS ASICS for the massive parallel architecture systems that used 1000's of SiGe DEC Alpha chips which ran at 400mhZ+ That was the beginning of the peta flop era. Chippewa Falls was a wonderful place to live and watch my kids growing up. I have many fond memories of that time. We saw the end coming when the Fed announced cessation of funds for RnD.
I worked at a oil company that had a Cray XMP. I was a new graduate and was givne a tour of the data center where it was housed and they had it "open" so we could see the inside. We even got to sit on the "seats" and you could almost "feel" the power of the thing as it hummed away. It was really cool. Got to see the NEC SX-2 at the Houston Area Research Center that at one time was one of the fastest super computers too. Got a printout of the 29th Mersenne prime number that it had found. Really neat stuff
I remember the seats, but I didn't think they were for actual sitting. I thought it was so neat and futuristic that they were liquid cooled. That was the ultimate in tech.
The X/MP was a derivative of the original Cray-1 architecture. Seymour Cray didn’t care that much for backward compatibility, so his Cray-2 would have required users to recompile all their code to take advantage of it. So Cray Research spun him off into a new subsidiary, Cray Computer, to work on the Cray-2 (and the later Cray-3 and Cray-4), while the parent company concentrated on producing new machines, like the X/MP and Y/MP, that kept backward compatibility, under the leadership of Steve Chen.
@@lawrencedoliveiro9104 And somewhere in the National Reconnaissance Office in Chantilly, Virginia, there are very tired and lonely engineers trying to keep those things alive because they don't want to re-write the mission-critical software they run on them every day...
@@nufosmatic I seriously doubt that. Their “mission-critical software” was doing things like traffic analysis and code-cracking, for which they want all the computing power they can muster. So running on obsolete hardware isn’t going to cut it.
I remember in the late 80s/early 90s when Tom Clancy was writing some of his best books. If Jack Ryan needed a super computer, there were 2 or 3 times where the Cray 2 was specifically mentioned. They used a Cray 2 in Hunt for Red October to do some CFD type stuff to see how the silent drive system in the submarine might work and how it might sound. There is also some stuff about the NSA using the Cray 2 to generate virtually unbreakable one time use code ciphers based on randomly recorded atmospheric fluctuations.
My father was a long time fluid dynamics researcher at the NASA Lewis, now Glenn, research Center in Cleveland, Ohio. I used to stare at some of the pictures in his office and then at home after his retirement it looked an awful lot like the modeling presented in this episode. I continue to appreciate the wide spectrum of topics you bring to us. Thank you.
This video fails to mention it, but Lewis/Glenn had some CRAY-YMPs of their own, and obviously the have the 10x10 and 8x6/9x15 facilities to go with it. The tunnels are still there, but most of the supercomputer capability was consolidated at ARC. Whether that was political or just out of budgetary necessity is debatable, at GRC has long been the redheaded stepchild of NASA. ua-cam.com/video/vVA5hdLRSDc/v-deo.html
Like a few others on here, my college days were spent crunching numbers on a slide rule. Handheld calculators made by Texas Instruments were just coming out, but were too expensive for most college students. Slide rules and computers that we would sneer at today put men on the moon. We've come a long way in computational power in the last 50 years.
Guts is what put men on the Moon. The LEM's skin was only as thick as an aluminum TV dinner tray. You could poke through it with your finger. You couldn't get me into one of those at gunpoint. I'd insist you shot me on the spot. I don't do sheer terror too good really.
Hear, Hear! in 1984 or 85, I got 2 OLD pinball machines, at an auction, when I lived in Lawton Oklahoma. Both came with GOOD schematic drawings, and with little issue, I got both running. Cut to 1991, when I was working at the world's largest hamburger store, McDonalds, Douglas. (Really McDonnell- Douglas.) I was called to troubleshoot the aft cargo bay loading system. It was all antique relay logic. Found the schematics, almost as good as Bally provided for the pinball machines, and was able to find the bad wiring and relays. Oh, the MD-11. Glad I was able to figure it out. No, I'm not an engineer, just an electrical installer. steve
Fascinating episode, thank you. I spent my career in various roles at Harwell, so for me it was particularly nice to see mention of the CADET machine. Later on Harwell’s computer division used a Cray 2 to model the 1987 Kings Cross underground fire that claimed 31 lives and injured many more. The modelling led to the discovery of the ‘trench effect’ for fire burning in steep inclines. Wikipedia has a good article explaining this for anyone who wants to know more about it.
Great episode. I realized that the CDC 6600 at 3 MFLOPS is dwarfed by a Raspberry Pi 4 Model B at 9.92 GFLOPS )(and for about $100 to boot). Geezy Peezy, what an improvement.
My first job out of grad school was to port the TCP/IP protocol suite to the Cray Timesharing System, the bespoke operating system that ran on the XMP-48 at NCSA at the University of Illinois, one of the first four academic supercomputing centers. TCP/IP was required to interface to the NSFnet, a nascent predecessor to today's Internet. They were heady times for sure.
I did Aerospace ay U. Illinois (graduated at the end of 87). I never did anything on the CRAYs, but we were being shown what some of the postgrads were doing with CFD modelling. I was into FEA with my professor and we just used one the departments Cybers for that. I was on the swim team and there was a post grad who'd been on the team a few years earlier who the coach still let use our locker room. I forget his name, but was working on the parallel processing compilers for the CRAYs. I remember being quite surprised to find out they were still using Fortran and he explained they did that because the math algorithms had already been proven. What he was working on was how to send different parts of the code to different processors and get the best performance. I graduated at the end of 87 and they were just bringing on the second CRAY which I think might have been a 2 or an XMP-48. I'm actually Australian and its incredibly hard to explain to people here what it was like to be at Illinois in that era. I had done a year at RMIT in Melbourne before going to Illinois. RMIT had all of 1 Cyber mainframe for the entire university. Just the Aerospace department at Illinois had 2 Cybers as well as access to the other engineering VAX main frames. One of my professors had 2 Micro Vax units under a desk in his office. A few years after getting back to Oz our main Science and Research Organisation the CSIRO got a YMP single core and thought it was the greatest thing ever. It was around the time Illinois was commissioning its second XMP-64.
@@tonywilson4713 I grew up in Urbana-Champaign where the U of I is. I remember going to the supercomputing center and seeing the two Crays, the newer one had windows you could look into and see the liquid cooling flowing past all the components that were submerged in it. That was middle school or high school. My older brother was a student at the U of I (all three of my older brothers went to U of I, two became programmers) and while a student he worked at NCSA as a programmer, during the time Mosaic internet browser was being developed, that later became Netscape. He worked for them for a time after he graduated, went to California for a few years, came back and worked for them again, and ultimately ended back up in California bay area working for the usual suspects, like Apple and Yahoo etc. I loved playing on the Plato system they had, running on a CDC 6000 series mainframe. We had them (Plato IV terminals) in middle and high school in our computer labs. I now realize all these years later, most kids didn't grow up with that kind of access to networked systems and mainframes running educational software and of course our favorite, multi-user games! When the internet came along, I was already on it years before it became available to the mass public, when it was mostly connected via institutions like the U of I. Indeed heady days, 80's through the early 90's.
@@marcusdamberger Dude what an awesome story. I do remember Plato, but I hated it. Maybe in 85-86 it was still at that point where it was so cumbersome that it gave us all nightmares. Looking back I am certain that we were just lab rats for that kind of system. What Plato sort of achieved that's really important is what it inspired. The engineering school had been given a bunch of systems. Dozens of IBM compatibles, AT&T Unix PCs (i had one and it was ahhh-ful!) and 2 rooms full of first gen Apple Macs that had a mouse AND THEY were so awesome for term papers. It was a big deal back then having access to fonts and basic spell checking. I like to think that Plato's inspiration was to get the guys at the NSCA make it as easy to use as an Apple Mac *AND VOI-ah-LA* they gave us the web browser. Think about it why would a bunch of guys playing with CRAYs turn their attention to Plato and make it work like an Apple Mac? Yet they did and arguably made one of the greatest technical achievements of the computer age. It only had 1 downside - it made computers so much easier to use that millions of people suddenly thought they were tech savvy. 🤷♂🤷♂
The Cybers were from CDC, which was Seymour Cray’s former employer. That company was just set up to sell business-type machines to compete with IBM, but they made the “mistake” of hiring Cray, who designed a machine for them that was about 50 times faster than anything comparable from IBM.
You, sir, are a god. I was tasked with porting the drivers for the Ultranet gigabit network interface to my real-time UNIX machine for an opportunity at Fermilab in Batavia, Illinois. The drivers were a million lines of C code which was rigged to run on Sun UNIX. It actually would not run on Solaris, thus the opportunity. I managed to get it to run, but the rig resulted in really poor performance. Ultranet is gone and good riddence...
That's a lifetime career. You don't see that kind of long term employment anymore. Its sad. Back then people had a job for life. It was understood that if you did your job well you were rewarded with job security and the employer was rewarded with a loyal worker that gets more proficient over time. No loss of output from worker turnover and retraining.
@@lillyanneserrelio2187 companies no longer give any incentives for long term employment. They won't raise your wages if they raise their starting salary, so you'll be making the same as a new person.
IBM boss Thomas Watson Jr was notoriously livid when CDC brought out their legendary 6600 machine, which at one leap became the world’s most powerful supercomputer. A memo he wrote said “I understand that in the laboratory developing this system there are only 34 people, including the janitor”. And yet all the massed might of IBM’s thousands of engineers could not match that achievement.
@8:40 you probably meant "VAX 11/780" not VAX 11/750. The picture you show is of an 11/780 and the 780 came first. The 750s were a washing machine height cabinet and were significantly slower and cheaper.
Super proud dad moment here, my son did his university computer software engineering internship at NASA Ames Research Center, his main project was doing research on using graphics processors as computing devices in future missions. They gave him a little bit of time on the supercomputer to test his research. What an awesome opportunity!!
This post brought back many memories. I was a grad student on the 11th floor of the physics building and watched a Cray being lowered into the main compute facility on the second floor. About 1985. Came from NASA Ames.I also noticed a picture of Digital 11/370 in the post. Another fine computer but no Cray. I was given free cycles on the Cray during early phase startup. The good old days I guess. SGI is another matter for another day.
My roommate and friend's father was one of SGI's top salesmen. One day they had a show and tell expo at my univ and I got a bunch of swag. I was probably the only kid in the world with a giant 3D poster of HIV PR1 over my bed. I'm still blown away by one of their demos of water dropping into a bucket and you could change parameters in real time and spin the whole thing in 3 space. This was at a time before you could drag windows on a pc and rendering a 3d image could take hours
Idk what’s more impressive, being able to imagine, design and build these machines or making use of such relatively small amounts of computing power compared to what we are used too today
My father ran Grumman Aircraft's Flight Test Department. In 1939, he was the first person to do airborne stress testing of aircraft structures using Radio Telemetry and the first Electronic Strain Gauge he invented. Back when flight test was done the hard way.
When I went to college, we had serial number 2 cdc6600, I created some programs for waveform shaping in my ee studies. The next machine I interacted with was the dec pdp8, I still have one that works. Eventually, I was involved in the cyber 205 circuit development where we using a cyber176 for circuit modeling at cdc. At the same time, I was working on another project with a Vax 11785 for ciruit modeling the 785 was a much better platform, I went on to eventual develope software in Lan environments. I will never forget the day when I ported software that had taken over 24 hours to simulate a model, and the Lan, with 2 IBM at pc's performed the same model in less the 20 minutes. The handwriting was on the wall for the old architecture of the Vax machines..
I grew up fascinated by super computers. I got a tour once of my university's YMP-4. It was one of the coolest displays I've ever seen, a flat black room with the flaming pink cray alone in the center with flood lighting. In the corner was an SGI crimson terminal with 128MB(!) of ram.
@@sdrc92126 I worked in my colleges tech department. The mainframe that ran all things related to class scheduling, grades, etc. It was crazy seeing a computer that took up a whole room including raised floors and signs for Halon fire suppression. There was a team of three guys whose jobs were nothing but babysitting it. A server had more power than it, but it must of been easier keeping the old thing plugged in than porting the software or adopting new software. I think most of the programs the mainframe ran was custom made just for the college. Next door was the server room where no one worked and all the auxiliary stuff like university website ran on them.
Another great episode. By the way, if you haven't already done so, you should consider doing an episode on the NORAD SAGE Air Defense system and the IBM Q-7 computer (block house) that was the heart of the system. It's was a milestone in computer development, as outlined in the book, "From Whirlwind to MITRE, The R&D Story of the SAGE Air Defense Computer." (SAGE = Semi-Automatic Ground Environment). It was the largest computer ever built (physical size, not computing power). It was comprised of hundreds of thousands of vacuum tubes and used magnetic core memory. After they started shutting down the system, Hollywood bought some of the more impressive looking components and used them as props (you can see part of the system in Dr Evil's hideout in the Austin Powers series, for example). I was a maintenance man on this system for about 7 years.
I was lucky to have seen the SAGE installation in North Bay, Ontario. I worked for Digital Equipment as a field service tech. They used our computer for the weather system. One day I was down "in the mountain" doing maintenance and official from the base asked me if I wanted to see their IBM system. I watched the tech start up a diagnostic application, which ran a repetitive pattern which in turn created a noise through a speaker in the huge front panel. A standard telephone handset was placed near the speaker which was connected to the P.A. system. The tech would then take a cart loaded up with vacuum tube, slowly walk down these long aisles housing the computer circuitry and tap each tube individually. If he heard the noise stop or change sound, after tapping a tube, he'd replace it. Then restart the program and carry on with the same process. It was extremely time-consuming. Tubes for computers like these were very unreliable and had a high failure rate compared to solid state equipment. It was one helluva beast to behold. High power consumption and it took up so much space!
Love it! (I work at NAS). What I love to tell people is the importance of Pleiades in the Kepler mission (and now the TESS mission). I tell them, "Kepler and TESS weren't designed to find planets on their own. They can only gather the data. Pleiades loads the data after it's beamed back to Earth and the real search begins."
Yrs ago when I worked at a high tech company in Cupertino, one of my coworkers my friends and I ended up carpooling with was a gal married to a guy stationed at What was then NAS Moffett Field. She and her family lived in a block.of apartments that was designated for families of married enlisted. This apt complex was directly across the road from Ames, and we got to be very familiar with the facility. Ots nickname, from naval personnel stationed there, because of the very loud noise of the wind tunnels, was "Rumble Bunny." BTW the NAS in the naval base stood for Naval Air Station. Another nickname for the trivia minded, those naval personnel
THG- My mother worked for Dr Mark. I got to see the wind tunnels and the models they used. When they were done they just thru them in trash- I got to take some home and had them for years. It was very cool to see the inner workings back then.
That's awesome. The wind tunnel facility where I used to work didn't let us take home any models. Apparently just because it's Top Secret you can't take it home (well, not unless you're a politician, apparently.)
Thanks for this. I did my Masters in CFD in the 90s. Used lots of Unix workstations and Plot3d etc. Also read a lot of those old NACA typeset papers. Good times
I worked at NAS for a number of years, worked on the NASA Net using advanced fiber optics to build and maintain the distributed net to all the various sites, LARK, LERC, etc. My favorite super was the massively parallel CM1 "bubble machine". so called because you could see through the transparent panels it's processors that were cooled in a circulating, bubbling bath of fluorinert. Iy is also true that NASA and the NAS worked on many diverse areas of research, not just aero subjects, it was a privilege to contribute to the program.
I was a little kid then, but was always fascinated by computers. I begged my parents to take me to computer conventions where they would just drop me off in front of convention centers. I remember looking at a Lisa in awe and seeing a color display for the first time showing a map of a city. About 30 years later, I bought a house and the realtor told me he worked as a salesman selling the lisa
In 1964 I learned to program on an IBM 1620 -- also made with individual transistors. It was as big as my desk, used a built-in IBM Executive typewriter for I/O (it also could be attached to a card read-punch unit) and had about 10K of memory. It was also called the CADET -- Can't Add, Doesn't Even Try. Later, in 1988, I worked on a CRAY-2, with VAX front-ends. I think my phone probably has more computing power than either of those.
In the early 80s I lived in Moutain View, and my grandmother until her passing in the early 2000s lived in Sunnyvale. Both just miles from Moffet Field (now Onizuka Air Base) and Ames Research. You could always tell the wind tunnels were being used because there would be this very deep droning sound that could be heard for miles around. My grandmother also worked at Ames as well as in "the big blue cube" at Ames/Moffett. As well as the parent of my best friend in grade school through high school worked at Ames. This is such a fun one for me.
I worked right next to Ames for several years and went on a tour of it once. The various wind tunnels there are really a sight to behold. Not sure if they're still doing tours but well worth it if they are!
Man, the early computers that took up an entire room were incredible feats of mechanical and electrical engineering. It's crazy to think that this 12" x 9" thing I have sitting on my bed is more powerful than any single one of those computers. The development of the world is truly staggering sometimes.
Correction: You mentioned the Cray from 1982 was 100x slower than a modern smartphone. I looked up the iPhone 14 at 2 Tflops and that's 10,000x the 210 Mflop Cray you quoted. An amazing statistic. I looked up the Cray X-MP and it seems you have its speed right. Big numbers. Fun stuff.
I calculate the iPhone 14 as being about 7,500x faster than a single-CPU Cray X-MP but hey, close enough 😂. And talk about big numbers… a 4-CPU Cray X-MP cost $15,000,000 back then, the equivalent of over $50 million in today’s money 😮
@@MaxPower-11 The other thing that's really amazing is the 345 KILOWATTS they used, vs the iPhone using more like 345 milliWatts:-) I didn't look up the real number, but.... Also, you probably couldn't make a phone call from the Cray.
Yeah, definitely more than 100x. Modern GPUs designed for consumer PC gaming already hit 40 TFLOPS. For a few hundred bucks, it's 200,000x the computational power of the Cray. As for power, that iPhone is almost certainly pulling Watts when it's anywhere near its theoretical peak CPU and GPU performance. Still a far cry from hundreds of kilowatts, though... or the 21 Megawatts currently used by the Frontier supercomputer at ORNL (but that 21 MW is for 1.1 exaFLOPS... i.e. 1,100,000 TFLOPS or 1,100,000,000,000 MFLOPS. It's a bit fast.)
I had the opportunity to tour some of the CDC and Cray facilities in Chippewa Falls, WI, Seymour Cray's home town. It was interesting to see how they were built. A large mass of wires on the back planes interconnecting the modules. Each one placed by hand by diligent workers, mostly if not all women. Years later some Cray employees took my Intro to Unix course at the nearby Chippewa Valley Technical College. Cray developed their own Unix variant based on System V and called it UNICOS.
What a crazy story. I was an undergraduate at MIT in the early 70s when people were talking about building a "super" computer. We were hoping to get to a Megabyte of memory. I don't think we were talking about mega flops yet. But it is amazing to see the change and progression of the computer in everyday life.
Dude.... Thank you for all the knowledge this channel has imparted upon me. With such easy access to not only information, but also misinformation, this channel is a shining example of what is right with the internet. It's nice having content come from a trusted source. Again, Thank You!
Another testament to the work of all these individuals is that anyone with an internet connection can run CFD simulations at no cost. I've used one to check the air flow through my 3d printer's part cooling ducts with various configurations & fans.
I got to see in a parallel environment the results of the fluid dynamics on several programs in my 49 year aerospace career. From the North American Rockwell B1, Space Shuttle, X31, XF32, and Lockheed F22 and X59, programs. Many other smaller programs under development at the famous Skunk Works all got attention from the fluid dynamicist working with Ames Research. Most people wouldn't have a clue about the complexity of determining weapons deployment from an aircraft flying at Mach speeds or the use of the dynamics of drag to reduce speed of an orbiting vehicle to survive reentry into the atmosphere. Your video is a credit to your research team and yourself.
I just find it amazing that the tablet I'm typing this from can run rings around those computers that were but just few decades ago the cutting edge of technology.
Thanks for your episode on NASA/Ames. I worked there from 1980 to '96 and calibrated and repaired equipment all over the base. Security wasn't significant at that time. We used to go anywhere and do everything back then. Saw a tilt-rotor fall out of the sky while doing low-level ops near NAS and main stock toward the tidelands of S.F. Bay.
I worked at NASA/LaRC and used the CDC STAR-100 and the follow on CYBER 203 (only 3 were made IIRC). I later used the CYBER 205s at NASA/GSFC and the National Meteorological Center. These were very good machines if you knew how to program them for vector instructions. This generally requires a complete rewrite but you were rewarded with very good performance.
Thanks for the great video! I worked in the NAS division from 1991 to 2005, through the name change from Numerical Aerodynamic Simulation to NASA Advanced Supercomputing (we had to retain the NAS acronym so old T-shirts were still good). As the PI of the virtual windtunnel project I was thrilled with the mention. For those curious about that project here is an early video ua-cam.com/video/tW5rHVlli14/v-deo.html. I’m super proud to have been part of the NAS division.
Thanks for doing this episode. Ah, I have fond memories of NASA Ames and the communities around it. I know it wasn't about the Ames wind tunnels specifically but I spent many many hours, days testing (production and research) at the Ames Unity Plan Wind Tunnel for Boeing products including the 787-8, 787-9, 747-9 and 777-9 variants. It is the best elevated Reynolds number production wind tunnel for transonic testing in the US, maybe even the world. I spent the better part of my 35 year career at Boeing Commercial Airplanes using and improving both wind tunnel testing and CFD to better simulate transonic flight results for practical engineering applications. There are numerous reasons that both simulations don't get it right and it still takes skilled engineers to understand why and merge the two into a cohesive combination that does, utilizing the best of each for a better pre-flight prediction.
I was an early appreciator of CGI and remember the different colored balls rolling around with shading and how the frame generation took hours if not days to generate. Now Pixar with all the new fancy computers are dazzling us with all the creativity those artist can imagine. I guess it should not be a surprise that cards/ GPU like what Invidia make are so important to supercomputers.
12:00 I remember the first time I ever saw a CRAY computer. It was at Lawrence Livermore Laboratories, gosh, in the late '80s? Seeing a "liquid cooled" computer was super WOW for a young me.
Wow, what a blast from the past. I worked at NAS in the mid-to-late 90s, specifically on high-speed (at the time, anyway!) networking and hierarchical storage management systems (those CFD simulations generate a lot of data!). I have fond memories of my time working in building N-258. Thanks for making this!
3:05 - When I came out of school in 1981, I went to work for what was Datacraft in Fort Lauderdale, Florida, where the cash cow was a superminicomputer which performed at 0.5 MIPS. Ten years later the company licensed technology from Cray Computers to build a 3MIPS superminicomputer.
When I contracted for the USAF in the late 1990s on a huge air force base, there was an old decommissioned cray in one of the huge hallways, with the benches and everything. It was just outside a bowling alley.
I saw the first cray at the bradbury museum when I worked for a lab there. I think it was sitting in the corner and not even part of a real display, It was before the museum they have now and more of a warehouse.
@@mercster I feel guilty sometimes of being in the right place at the right time for a charmed life. I just wish I'd realized it more at the time. I remember going with a friend since I didn't have a car and Bannana Rama's Cruel Summer was playing on the radio as we parked. Weird the stuff you remember
Yay! I went to Control Data Institute to learn Computer Technology. I was a field tech back in the 80s when IBM pcs 1st came out. I moved out of the city so changed careers late 80s. I probably could be retired now if I would've stayed in IT. I still utilize many skills from back then - not everyone can count in hexadecimal!
I only remember seeing a sea view animation by Cray-1 in 1970s. Real-time moving waves and clouds and in living color. Amazing. I could barely make a moving clock-face on PDP-8.
The smart phone you carry is more powerful than the CRAY 1 supercomputer. When I first saw the MIPS comparisons for modern technology vs old, that one stunned me!
I am part of a solar car team and we use CFD all the time to optimize our cars and I got to see first hand how it works working on our teams new car Gato Del Sol 7. Its always fun to learn the origins of these tools. I think it would also be cool if you did a video on solar racing, its really fun, has been going on longer than people think, and is getting more popular as of late.
I looked at my university's entry into an Australian race in the early 90's. It didn't look interesting, so I didn't get involved, but I remember being told the motor cost $25k and had an efficiency of something like 97% or 99%. Ifft was an electric 4-wheel bicycle that carried it's own solar cells and had to charge a few hours in the morning before it could take off
In the70iesI worked at a machine shop here in Minneapolis making Disk drives for the memory disks for the CDC computers. They were massive. A disk, the size of a record album had 5K of memory. We made em by the thousands for years. By the mid 80ies the disks were the size of a silver dollar, or maybe a bit bigger. Amazing stuff back then. But time marches on. Now my usb flash drive has 128G of memory.
@@cerealport2726 I remember playing 'Elite' on one of those in college. Didn't get my first PC, a 286 Packard-Bell with a whopping 40mb of drive space - I never thought I'd be able to fill it up, until 1991.
the smart phone comparison gets hilarious when you consider that in the novel Jurassic Park, one of the 'clues' that Ingen was doing something weird was the fact that they'd bought 4 Cray XMP's and networked them to run their park. meaning that Jurassic park could be run off my smartphone, or my laptop.
@@GH-oi2jf Nowadays everything is networked. A single ethernet line (via USB) is enough. Granted that doesn't conform to the redundancy standards modern industrial systems use but that would just be fitting for Jurassic Park
Cool episode! Some of my graduate research was in high-performance computing (though much more recent than the systems discussed here.) Particularly, my research was focused on ways to efficiently use graphics processors in supercomputers. While developed primarily for the computer gaming market, it turns out that graphics processors are actually quite good at running most types of scientific simulations and, since they're already designed and manufactured in the millions for the PC market, it's often cheaper to build supercomputers out of those than purpose-designed HPC chips, which is a far lower-volume market. My research focused on ways to co-schedule kernels (as in programs that run on computational accelerators like graphics cards, not operating system kernels) with complementary resource requirements in order to get the most throughput out of the GPUs. It would be cool to see an episode on the Department of Energy's supercomputers, too. The Oak Ridge National Laboratory in Oak Ridge, Tennessee currently has the fastest supercomputer in the world, as it has for a large percentage of at least the last couple of decades that I've been keeping up with it, with Jaguar, Titan, Summit, and now Frontier all holding the number one spot on the TOP500 list at the time of their entry into service. Living only an hour away from Oak Ridge, one of my graduate advisors was also working for ORNL during my time in grad school.
In the mid 90s I worked at a small Aerospace firm building a satellite that eventually flew on STS-77. Our Analytical Group was doing FEA using Pentium 133s, the fastest machines we had at our company. The supervising contractor (JPL) reran the FEA and changes were made because they had access to a CRAY their results had much greater depth, even with our systems running overnight. Funny to think of that level of computation is passe 25 years later
perhaps the biggest change in supercomputing is how what was once completely bespoke custom CPUs, is now done with commodity CPUs and GPUs in enormous arrays. GPU alone was probably a big shift. There was even one super computer awhile back that included the cell processors from Playstation 3s. Names like Cray and SGI are now long gone from a field they were once face of to the general public. SGI was especially big in the public view in the 1990s as their machines started to be used for making the graphics for video games and at firms like ILM for movie effects. Fun tidbit, that SGI GUI in Jurassic Park was apparently a GUI they actually did design. no clue if it ever made production but it did exist and was not just invented for the film.
Little known fact: when Cray started his company, with the exception of a few experienced engineer-managers, he only wanted green engineers just out of school. He did this because he wanted engineers who didn't know Cray's ideas "wouldn't work". FYI, every U.S. made automobile made since 1996 has more computing power under the hood than in the entire Apollo Command module and Lunar Lander module COMBINED! Every Pentium-based PC has many times more raw computing power than the best super computers in the world in 1969.
Mr. Cray was my computer hero growing up. I read a Popular Science magazine in 1980 of a Cray Supercomputer that was capable of 80 MFLOPS and appeared on the cover. The article mentioned the 'C' shaped design of the 'CRAY' was to allow the shortest wire distances possible between electronic component boards within the interior of the supercomputer to speed internal comms. I later saw a decommissioned Cray deployed in the building foyer of the 'Bureau of Meteorology', Melbourne, Australia, as an artistic installation in the late 1990's.
From 92 - 02 I worked with the EPA's Cray supercomputers, a YMP, a C94 and a MPP, it was my first job out of college. The plumbing required for their cooling needs was as impressive as their electronics if not just more interesting to look at. Those were the good old days.
You might look into the Sage project as a lot of major computer technology developed and cost a huge amount. Jay Forester was a particular individual involved in that program.
Probably one of your most interesting videos. Well researched and the collections of images was wonderful. Reminded me of Paul Allen's Living Computer Museum which used to let you use a lot of these old supercomputers.
I had a customer, Pratt & Whitney, who put a PW4000 high-bypass turbofan jet engine in an Ames wind tunnel controlled not by the electronic engine control normally delivered with the engine but controlled by a supermicrocomputer running a real-time version of UNIX. The P&W controls engineer was on site trying new engine control algorithms, tweeking the parameters on the controls with the engine's turbopumps running on full in front of shocked Government wind tunnel managers... now THAT's a confident customer!
I went to school in the 1990's shadow of the IlLIAC III and ILLIAC IV projects. We had test equipment from both programs in our ACM lab space, operated in the Digital Computing Lab building that was constructed to house those programs, and we even had random prototype and spare boards from the systems that were usually sitting tossed in cabinets, or boxes, etc. Most of my first experience with using Oscilloscopes was on a Tektronix 454 and 465 both labelled "ILLIAC IV MAINTENANCE TEAM ONLY". At the time all of that stuff and many parts from similar 1960-1980s super computing projects were just sitting in store rooms and warehouses treated as complete junk. Including VAX-11/780 through to VAX9000 systems, DECStations, and early ARPANET communication equipment. The Dean occasionally let us root around in storage for parts, but most of that foundational computing history was later just sold off/disposed of for the scrap metal. I think 1-2 boards got saved and put in a display case but the rest is all gone.
In my past, I grew up in Sunnyvale. (If you don't know where Sunnyvale is, before it was called Moffett Field, it was Sunnyvale NAS.) Ames Research Station is still on Moffett field. Fremont HS district had a machining class at Ames. I went. We also did trips to a LOT of the other things at Ames. Saw a model (quite large) on the 40 by 80 wind tunnel, of the space shuttle. This was 1975 or '76. Lance, my father started working at a tiny computer memory making company, in 1970. I met one of the founders at work one Saturday. The guy was Gordon Moore. Dad was employee # 47. Yes, Intel. steve
Have you already done an episode on the history of ties? The fashions, the different ways to knot them and what the knots mean, maybe not a great episode idea but it just popped in my mind. Love the show!
I can tell you a tidbit - the modern neck tie is believed to have originated in Croatia. The French word for neck tie, cravat, is a bastardization of Croat, as in a person from Croatia. I have no idea how I know that 😋
Brings back memories... I worked for GE Nuclear from 76 to 82 as a Fortran programmer simulating Advanced Nuclear reactors (Breeders) and we originally used a GE 6000 then Honeywell bought them out small word size (32 bits) eventually went to 64 bits with the CDC 7600 (water cooled). then the small Digital DEC-750, 780 was the fastest of that series. Easy to network with (Apple's apple talk was wonderful also) I eventually ended up at NASA Ames working for Boeing computer services supporting 200 Apple Mac Desktops, and writing code for Mac's and several other machines also.(I supported the referb of the Hubble team). My code (DQ) flew the Ozone mapping in U-2 (UB2) aircraft we had 3. Pilots were in Space Suits. Cray was sold to Silicon Graphics where it languished and then Sun Microsystems bought them ported the os to Solaris (Sun/unix). At the time it was the biggest system that could be bought (fastest and physically largest). I ended up working in SE Asia for Sun as a software guy Principal Consultant and installing the Sun/machines in SE Asia and supporting Y2K in the Philippines then Thailand. So many vendors went belly up in the bay area all the infrastructure was there. This is also why the companies like Facebook (located on old Sun campus) and Yahoo, Google (built on an old dump/concert venue) We built everything from Scratch (Software). I have been living in Thailand for over 20 years.
I visited the Goddard Space Centre some decades ago.... in the 90's! . It was fascinating. While on the visitor tour, I was blown sideways by a draught through a partially open blast door into a huge room full of supercomputers. I saw a load of Cray XMP's along with a very large Thinking Machines system. It was all going well, taking photos etc, until I received the 'very heavy hand' on my shoulder. I was spun around and marched off the premises. I don't expect you will believe any of the above but it really did happen. My subsequent visit to Fort George Meade went rather more badly. I didn't make it past security at the railway station.... lol!
I have worked in the NASA community most of my professional career, so I found this a very interesting listen. Listened to it on my way home from NASA's GSFC this evening in fact. Naturally it spawns multiple questions, so I'll ask the most important one: What is the bumper song at the end? It sounds like Lindsey Buckingham, so I'd guess it's a Fleetwood Mac song, and yet I don't recognize it.
I once visited the "Cray Room" at Laurence Livermore Labs. I was told that the Cray 1's had air conditioning units in the surrounding benches. The Cray 2 was liquid cooled with a separate device used to remove and store the fluid.
Interesting, thanks! I was disappointed that you made no mention of the super-mini Elxsi multi-processing computer of the 1980's, where I had been employed -- later rechristened the Trilogy for marketing reasons. We boasted as being the second-fastest computer to the Cray of the era, and these systems were installed at all major national labs and used extensively for flight simulation systems. Thank you, HG!
I remember the first time I used a Cray. It was for a structural analysis problem (FEA) for Space Station in the early 90s. I probably could have just used a Unix workstation, but I just wanted to run it on a Cray because I could 😊
A very interesting history on the development of the supercomputer and the cray in particular. One of the reasons that China got so advanced in computing technology and became what they are today is primarily due to the fact that President Clinton when he was in office decided to equip China with Cray technology by selling them supercomputers. Previously there had been a ban on selling US technology to the Chinese because of their human rights issues and other things. China insisted that the super computers were going to be used so that they could plot and map weather more properly. A very thin reason but it was accepted by the administration at that time. Considering what's happening now with China, someday we will be hearing from historians about the mistake that was made by giving China access to modern US technology.
That's a similar story to the English Labour party giving Russia Rolls Royce To Let jet engines as fitted to the current Gloster Meteor fighters And also captured German papers on jet and rocket engines, because the stupid Labour party leading being a socialist So that is how the Russians went from prop to jet with 5 yrs rather than the 20plus. years the Americans thought Oooopppssie
My mom worked at CRAY on the YMP, and I worked at SGI (Silicon Graphics Inc.). When the last space shuttle disaster happened, we had to rush and build a O3000 for Ames for disaster reconstruction. I was told that they sent out people with hand held GPS to find shuttle debris and log the location of the part to help determine the cause of the catastrophe using the SGI O3000.
Back when I was at university in the early/mid 90s, my friends and I dreamed of having our own Silicon Graphics workstation s at home.
The previous generation, the X-MP, was used to generate the visual effects of "The Last Starfighter." In Japan that same year, a Cray-1 was used on the anime film "Lensman."
My friend "worked" at SGI (I believe it got "assimilated" into the Google campus now), I'm pretty sure that the only reason he got the job there was because his mom was a pretty influential higher-up. He didn't last very long, but the free espresso and snacks when I visited him at work was pretty cool. 😂
@@nelsonbrum8496 SGI sold there HQ to Google but they were eventually bought out by Rackable Systems in 2009 who then went by SGI and then were acquired by HPE in 2016
@@ryanfoley8035 Yup, I miss seeing that SGI 3D sculpture in front of their former office buildings. That was iconic.
I started engineering in the late 80s. We'd set up an FEA (Finite Element Analyses) model on our old 386 machine and run it overnight as it would take well over 8 hours to complete. Usually it would fail for one of a dozen reasons. Fix the model, run it again overnight. Some models would take a week just to get the first result. Today, the same analyses take seconds. It's amazing how far we've come in such a short period.
Tech note, your audio track is uncharacteristically off. Your voice level is quite low. The closing music came in at the proper level and blew my head phones off.
My first job was at a research institute. More than one of the people I worked for, had done a lot of work with computers earlier in their careers, creating and loading piles of punch-cards. Many are the stories of dropped boxes of cards, cards in the wrong order, the wrong orientation... and then there are the mistakes with the code itself.
Get in the queue to use a computer overnight, only to have the job fail after a couple of hours... if you were lucky there'd be a proactive computer operator working at night who could have a go at fixing it.
we have come so far since then, it's amazing, though booking times on super-computers is still necessary.
Noticed this audio problem too! I am not crazy, and my phone is not broken! 😅
Same audio problem here, used the Goodev volume booster to hear it.
I started engineering in the early 90's. FEA is at least half voodoo. It's probably significantly better now with modern computers, but back then it was rough.
I am in University right now and we can run FEA on laptops, it takes about as long as your 386 did but I think we can have much finer meshes. We just got some funding and spent a good chunk on a workstation that should compute even our hardest tasks in seconds which is really exciting.
Great episode, but the volume seemed low. Anyway, it was great to hear the VAX 11/750 mentioned. My first years as a sys admin were spent on 11/785s. The whole concept of the VAXCluster was phenomenal at the time.
Same on volume. Also now I need to look up what happened to Cray
DEC networking was at least a decade ahead of the others, perhaps more. Took Microsoft what, until 2003, to get working clustering.
I worked on vax's (as a user) all the way through college and for years after graduation. My friend's mom was an assembler (soldering) for DEC.
Yeah, the music on the end blew my eardrums.
voice at the end was drowned out by the music
Great presentation! I had the pleasure of working on most of the platforms mentioned from CDC, Cray and IBM as a system programmer at the GM Research Laboratories. We often collaborated with NASA Ames and NASA Lewis on software issues.
I was a computer professional for 50 years,so lived through much of this story. I helped maintain the IBM 360/91 at Princeton University in the late 1960’s and early 70’s. I remember be called upon the help out at NASA in Harlem NYC. They had an IBM 360/95. Quite an honor for a young guy. Also, an adventure for a country boy. I was on the subway, feeling really out of place, and I was asked for directions. I also maintained the IBM 370/195 at NOAA on the Princeton campus. Exciting times. In later years, I was involved with clustered UNIX servers, all connected with a high speed switch. Each node would compute part of the problem.
The 91 was a pioneering machine, which got mention in Computer Science architecture classes.
Another winner! I learned my programing skills at Purdue university on CDC computers.
Had no idea Ames had been so influential. Makes it clear that the role of NASA is not limited to producing vehicles that can reach escape velocity
I agree- the “Aeronautics” part of the NASA doesn’t get the publicity of the “Space” part, but it’s impact is, perhaps, more significant.
Ames- formerly Moffett Field, has an incredible history.
I had a friend at NASA who according to him, was instrumental in getting cray off the ground. It was also central in starting global warming,
NASA does a ton. NACA did a lot of really cool research too. Almost every aircraft in WWII used a wing profile developed by NACA. Not all of them did, like for example the P-47 Thunderbolt didn't use a NACA wing, but pick an aircraft from any nation and the chances are extensive high its wing is a NACA wing.
NASA is responsible for a great part of our modern world.
Google "NASA spinoffs".
There's a Wikipedia article which breaks them down by industry.
It's shocking just how much NASA has done.
@@nomar5spaulding NACA's influence is not limited to aeronautics/aerospace. The NACA duct/scoop is prevalent on just about every motorsports vehicle competing. I think the first one I ever remember seeing was on the side of the Lamborghini Countach.
The operating system and the programming languages were just as instrumental. Programmers with clean tight code in machine instructions still remain to be the unsung heroes. After all, you can't have a schlep driving a rocket ship! 👍👍👍
The C shape of the Cray wasn’t just a branding decision. It was to minimize the signal paths between components. At least, that’s what Cray claimed.
yes
The Cray had such a fast master clock that that timing problems were solved by adjusting the length of interconnect wires in eighth inch increments!
@@billmullins6833 I worked on a computer/detector where the programming was done by the length of wires and positioning of logic modules on the rack
_... minimize the signal paths between components ..._
I too clearly remember that being explained to my young self.
@@josephgaviota Yeah, path signal is important, but the big thing MOSFET transistors are modeled as R-C circuits. Charging follows I(1-e^-t/RC)=v. The big thing is reducing the capacitance of the gate of the transistor. This is accomplished by making the gate smaller When v= ~1/ev_max, it's considered 'ON', or a logic 1, otherwise off. At the speed of the cray it probably was a factor, just like the size of motherboards became a factor when speed went to GHz range. Wires are also capacitors and must charge up before current flows. The smaller the wire, the less charging needs to be done
As the daughter of an aerospace engineer, I found your video quite fascinating, talking about that side of computing.
There were so many terms that popped out in my mind that my dad used to talk about ☺️
As an engineer’s kid in the 1970s my exposure to computers began with the optical read paper tape driven MCC drilling machines that my dad serviced as his side business.
This progressed to assisting him with his field services; swapping out the serval (sp?) motors he had taught me to rebuild.
During this time he also taught me how to do the circuit board paste-ups and assembly for the computer that he invented to translate data from the optical paper tape to reel-to-reel magnetic tape (circa 1977/78).
I held the job title of electronic service technician from the age of 11 years old when I first started working with my dad until around the age of 15 years old when my teenage angst took over and we weren’t getting along that well 🙄
My next exposure to computers was in 1985 when I took a “computer literacy” course at one of those “career colleges” which taught programming in Basic on a Commodore 64 (more like an iOS predecessor), but the computer that they gave me at the end of the course was a Sanyo MBC-550, which was more of an IBM-type operating system.
This was back in the infancy of personal computers and so these machines did not play well with one another and things like emulators did not exist.
So it was kinda like being taught to communicate in Spanish but then having to work in an environment that only understands German 😆💯🤦🏻♀️
Anyway, thanks for the awesome video, I really enjoyed it; at least until the end where it appears that whoever did the video editing accidentally turned the volume up on the music channel instead of lowering it into a fade…..🤦🏻♀️💯🤷🏻♀️
I think the volume problem is actually that the main part of the video is much lower than it was supposed to be. The outro volume was about right. Something got messed up with the gain on the main part of the video, it seems.
@@vbscript2 Ah, yeah maybe.
I’ve only done audio/video editing once, so my experience is limited to that.
It’s always helpful to hear other prospectives on this so thank you for that ☺️
I still have a lot of learning ahead of me as I plan on creating music videos for my music in the near future. I can’t even remember the name of the editing app I used last time, but would love any suggestions on what kinds would be good for a beginner like me who doesn’t have any kind of budget getting started.
@@rochelleesser7961 I've also only done a small amount of video editing, but I'm just comparing the volume here to the volume in other video content. Normally, though, the volume in the video should be using a relatively large percentage of the available dynamic range, but should just be quiet enough to not cause any clipping. The user can then adjust the volume on their device to whatever level is needed. For signal processing in general, it's pretty much always best to preserve as much signal as possible (without clipping) for as long as possible in order to get the best quality.
@@vbscript2 I agree, thanks ☺️
I worked at Cray starting as a Reliability Engineer on the Cray 2, from there I went to work in the Engineering Building then wound up in the Development division as a Circuit Design Engineer. I worked on CMOS ASICS for the massive parallel architecture systems that used 1000's of SiGe DEC Alpha chips which ran at 400mhZ+ That was the beginning of the peta flop era. Chippewa Falls was a wonderful place to live and watch my kids growing up. I have many fond memories of that time. We saw the end coming when the Fed announced cessation of funds for RnD.
I worked at a oil company that had a Cray XMP. I was a new graduate and was givne a tour of the data center where it was housed and they had it "open" so we could see the inside. We even got to sit on the "seats" and you could almost "feel" the power of the thing as it hummed away. It was really cool. Got to see the NEC SX-2 at the Houston Area Research Center that at one time was one of the fastest super computers too. Got a printout of the 29th Mersenne prime number that it had found. Really neat stuff
I remember the seats, but I didn't think they were for actual sitting. I thought it was so neat and futuristic that they were liquid cooled. That was the ultimate in tech.
@@sdrc92126 💯agree with that! The liquid cooling, the "waterfall" with the great CRAY logo on it, dang, that was wonderful back then.
The X/MP was a derivative of the original Cray-1 architecture. Seymour Cray didn’t care that much for backward compatibility, so his Cray-2 would have required users to recompile all their code to take advantage of it.
So Cray Research spun him off into a new subsidiary, Cray Computer, to work on the Cray-2 (and the later Cray-3 and Cray-4), while the parent company concentrated on producing new machines, like the X/MP and Y/MP, that kept backward compatibility, under the leadership of Steve Chen.
@@lawrencedoliveiro9104 And somewhere in the National Reconnaissance Office in Chantilly, Virginia, there are very tired and lonely engineers trying to keep those things alive because they don't want to re-write the mission-critical software they run on them every day...
@@nufosmatic I seriously doubt that. Their “mission-critical software” was doing things like traffic analysis and code-cracking, for which they want all the computing power they can muster. So running on obsolete hardware isn’t going to cut it.
I remember in the late 80s/early 90s when Tom Clancy was writing some of his best books. If Jack Ryan needed a super computer, there were 2 or 3 times where the Cray 2 was specifically mentioned. They used a Cray 2 in Hunt for Red October to do some CFD type stuff to see how the silent drive system in the submarine might work and how it might sound. There is also some stuff about the NSA using the Cray 2 to generate virtually unbreakable one time use code ciphers based on randomly recorded atmospheric fluctuations.
My father was a long time fluid dynamics researcher at the NASA Lewis, now Glenn, research Center in Cleveland, Ohio. I used to stare at some of the pictures in his office and then at home after his retirement it looked an awful lot like the modeling presented in this episode. I continue to appreciate the wide spectrum of topics you bring to us. Thank you.
This video fails to mention it, but Lewis/Glenn had some CRAY-YMPs of their own, and obviously the have the 10x10 and 8x6/9x15 facilities to go with it. The tunnels are still there, but most of the supercomputer capability was consolidated at ARC. Whether that was political or just out of budgetary necessity is debatable, at GRC has long been the redheaded stepchild of NASA.
ua-cam.com/video/vVA5hdLRSDc/v-deo.html
I spent my career working in scientific computing, supercomputers and visualization of CFD applications. This video was right in my wheelhouse!
I was in college in the mid-90's and everyone started playing around with Linux. Linus Torvalds was a folk hero to us.
Like a few others on here, my college days were spent crunching numbers on a slide rule. Handheld calculators made by Texas Instruments were just coming out, but were too expensive for most college students. Slide rules and computers that we would sneer at today put men on the moon. We've come a long way in computational power in the last 50 years.
The B 52 bomber, that may fly for 100 years, was designed by men still using slide rules in the 1950's.
It’s been said we went to the moon on a slide rule.
@@navret1707
Photoshop the Moon launch replacing the Saturn V with a sliderule. LOL
Guts is what put men on the Moon. The LEM's skin was only as thick as an aluminum TV dinner tray. You could poke through it with your finger. You couldn't get me into one of those at gunpoint. I'd insist you shot me on the spot. I don't do sheer terror too good really.
I still have my Post Versalog slip-stick and hard leather case!!! I never used the belt hanger clip, as I was trying to avoid the major geek factor!!!
History guy, could you do an episode of pinball machines?
Hear, Hear!
in 1984 or 85, I got 2 OLD pinball machines, at an auction,
when I lived in Lawton Oklahoma. Both came with GOOD
schematic drawings, and with little issue, I got both running.
Cut to 1991, when I was working at the world's largest
hamburger store, McDonalds, Douglas. (Really McDonnell-
Douglas.) I was called to troubleshoot the aft cargo bay loading
system. It was all antique relay logic. Found the schematics,
almost as good as Bally provided for the pinball machines, and
was able to find the bad wiring and relays. Oh, the MD-11. Glad
I was able to figure it out. No, I'm not an engineer, just an electrical
installer.
steve
Been following NASA since the 1st moon landing. Visited Kennedy Space Center in 2001 during a port visit to Port Canaveral, Florida on USS Wasp LHD 1.
Most enjoyable presentation THG from a guy who got to see a Cray being installed in my early days and got to use Clix.
Fascinating episode, thank you. I spent my career in various roles at Harwell, so for me it was particularly nice to see mention of the CADET machine. Later on Harwell’s computer division used a Cray 2 to model the 1987 Kings Cross underground fire that claimed 31 lives and injured many more. The modelling led to the discovery of the ‘trench effect’ for fire burning in steep inclines. Wikipedia has a good article explaining this for anyone who wants to know more about it.
Great episode.
I realized that the CDC 6600 at 3 MFLOPS is dwarfed by a Raspberry Pi 4 Model B at 9.92 GFLOPS )(and for about $100 to boot).
Geezy Peezy, what an improvement.
My first job out of grad school was to port the TCP/IP protocol suite to the Cray Timesharing System, the bespoke operating system that ran on the XMP-48 at NCSA at the University of Illinois, one of the first four academic supercomputing centers. TCP/IP was required to interface to the NSFnet, a nascent predecessor to today's Internet. They were heady times for sure.
I did Aerospace ay U. Illinois (graduated at the end of 87).
I never did anything on the CRAYs, but we were being shown what some of the postgrads were doing with CFD modelling. I was into FEA with my professor and we just used one the departments Cybers for that.
I was on the swim team and there was a post grad who'd been on the team a few years earlier who the coach still let use our locker room. I forget his name, but was working on the parallel processing compilers for the CRAYs. I remember being quite surprised to find out they were still using Fortran and he explained they did that because the math algorithms had already been proven. What he was working on was how to send different parts of the code to different processors and get the best performance.
I graduated at the end of 87 and they were just bringing on the second CRAY which I think might have been a 2 or an XMP-48.
I'm actually Australian and its incredibly hard to explain to people here what it was like to be at Illinois in that era. I had done a year at RMIT in Melbourne before going to Illinois. RMIT had all of 1 Cyber mainframe for the entire university. Just the Aerospace department at Illinois had 2 Cybers as well as access to the other engineering VAX main frames. One of my professors had 2 Micro Vax units under a desk in his office.
A few years after getting back to Oz our main Science and Research Organisation the CSIRO got a YMP single core and thought it was the greatest thing ever. It was around the time Illinois was commissioning its second XMP-64.
@@tonywilson4713 I grew up in Urbana-Champaign where the U of I is. I remember going to the supercomputing center and seeing the two Crays, the newer one had windows you could look into and see the liquid cooling flowing past all the components that were submerged in it. That was middle school or high school. My older brother was a student at the U of I (all three of my older brothers went to U of I, two became programmers) and while a student he worked at NCSA as a programmer, during the time Mosaic internet browser was being developed, that later became Netscape. He worked for them for a time after he graduated, went to California for a few years, came back and worked for them again, and ultimately ended back up in California bay area working for the usual suspects, like Apple and Yahoo etc.
I loved playing on the Plato system they had, running on a CDC 6000 series mainframe. We had them (Plato IV terminals) in middle and high school in our computer labs. I now realize all these years later, most kids didn't grow up with that kind of access to networked systems and mainframes running educational software and of course our favorite, multi-user games! When the internet came along, I was already on it years before it became available to the mass public, when it was mostly connected via institutions like the U of I. Indeed heady days, 80's through the early 90's.
@@marcusdamberger Dude what an awesome story.
I do remember Plato, but I hated it. Maybe in 85-86 it was still at that point where it was so cumbersome that it gave us all nightmares. Looking back I am certain that we were just lab rats for that kind of system. What Plato sort of achieved that's really important is what it inspired.
The engineering school had been given a bunch of systems. Dozens of IBM compatibles, AT&T Unix PCs (i had one and it was ahhh-ful!) and 2 rooms full of first gen Apple Macs that had a mouse AND THEY were so awesome for term papers. It was a big deal back then having access to fonts and basic spell checking.
I like to think that Plato's inspiration was to get the guys at the NSCA make it as easy to use as an Apple Mac *AND VOI-ah-LA* they gave us the web browser. Think about it why would a bunch of guys playing with CRAYs turn their attention to Plato and make it work like an Apple Mac? Yet they did and arguably made one of the greatest technical achievements of the computer age. It only had 1 downside - it made computers so much easier to use that millions of people suddenly thought they were tech savvy. 🤷♂🤷♂
The Cybers were from CDC, which was Seymour Cray’s former employer. That company was just set up to sell business-type machines to compete with IBM, but they made the “mistake” of hiring Cray, who designed a machine for them that was about 50 times faster than anything comparable from IBM.
You, sir, are a god. I was tasked with porting the drivers for the Ultranet gigabit network interface to my real-time UNIX machine for an opportunity at Fermilab in Batavia, Illinois. The drivers were a million lines of C code which was rigged to run on Sun UNIX. It actually would not run on Solaris, thus the opportunity. I managed to get it to run, but the rig resulted in really poor performance. Ultranet is gone and good riddence...
I worked at CDC and Cray. Finished 35 years at IBM.
That's a lifetime career. You don't see that kind of long term employment anymore. Its sad. Back then people had a job for life. It was understood that if you did your job well you were rewarded with job security and the employer was rewarded with a loyal worker that gets more proficient over time. No loss of output from worker turnover and retraining.
@@lillyanneserrelio2187 companies no longer give any incentives for long term employment. They won't raise your wages if they raise their starting salary, so you'll be making the same as a new person.
IBM boss Thomas Watson Jr was notoriously livid when CDC brought out their legendary 6600 machine, which at one leap became the world’s most powerful supercomputer. A memo he wrote said “I understand that in the laboratory developing this system there are only 34 people, including the janitor”. And yet all the massed might of IBM’s thousands of engineers could not match that achievement.
@Socky Noob or they fire you a couple years before your retirement, and then hire younger people and pay them less and give them less benefits.
@8:40 you probably meant "VAX 11/780" not VAX 11/750. The picture you show is of an 11/780 and the 780 came first. The 750s were a washing machine height cabinet and were significantly slower and cheaper.
Super proud dad moment here, my son did his university computer software engineering internship at NASA Ames Research Center, his main project was doing research on using graphics processors as computing devices in future missions. They gave him a little bit of time on the supercomputer to test his research. What an awesome opportunity!!
This post brought back many memories. I was a grad student on the 11th floor of the physics building and watched a Cray being lowered into the main compute facility on the second floor. About 1985. Came from NASA Ames.I also noticed a picture of Digital 11/370 in the post. Another fine computer but no Cray. I was given free cycles on the Cray during early phase startup. The good old days I guess. SGI is another matter for another day.
My roommate and friend's father was one of SGI's top salesmen. One day they had a show and tell expo at my univ and I got a bunch of swag. I was probably the only kid in the world with a giant 3D poster of HIV PR1 over my bed. I'm still blown away by one of their demos of water dropping into a bucket and you could change parameters in real time and spin the whole thing in 3 space. This was at a time before you could drag windows on a pc and rendering a 3d image could take hours
Idk what’s more impressive, being able to imagine, design and build these machines or making use of such relatively small amounts of computing power compared to what we are used too today
My father ran Grumman Aircraft's Flight Test Department. In 1939, he was the first person to do airborne stress testing of aircraft structures using Radio Telemetry and the first Electronic Strain Gauge he invented. Back when flight test was done the hard way.
Loved to nerd out on the specs for this episode. ❤
When I went to college, we had serial number 2 cdc6600, I created some programs for waveform shaping in my ee studies. The next machine I interacted with was the dec pdp8, I still have one that works. Eventually, I was involved in the cyber 205 circuit development where we using a cyber176 for circuit modeling at cdc. At the same time, I was working on another project with a Vax 11785 for ciruit modeling the 785 was a much better platform, I went on to eventual develope software in Lan environments. I will never forget the day when I ported software that had taken over 24 hours to simulate a model, and the Lan, with 2 IBM at pc's performed the same model in less the 20 minutes. The handwriting was on the wall for the old architecture of the Vax machines..
I grew up fascinated by super computers. I got a tour once of my university's YMP-4. It was one of the coolest displays I've ever seen, a flat black room with the flaming pink cray alone in the center with flood lighting. In the corner was an SGI crimson terminal with 128MB(!) of ram.
@@sdrc92126 I worked in my colleges tech department. The mainframe that ran all things related to class scheduling, grades, etc. It was crazy seeing a computer that took up a whole room including raised floors and signs for Halon fire suppression. There was a team of three guys whose jobs were nothing but babysitting it. A server had more power than it, but it must of been easier keeping the old thing plugged in than porting the software or adopting new software. I think most of the programs the mainframe ran was custom made just for the college. Next door was the server room where no one worked and all the auxiliary stuff like university website ran on them.
They also assisted in creating remote surgeries. One of my relatives was on that team in the 80's.
Another great episode. By the way, if you haven't already done so, you should consider doing an episode on the NORAD SAGE Air Defense system and the IBM Q-7 computer (block house) that was the heart of the system. It's was a milestone in computer development, as outlined in the book, "From Whirlwind to MITRE, The R&D Story of the SAGE Air Defense Computer." (SAGE = Semi-Automatic Ground Environment). It was the largest computer ever built (physical size, not computing power). It was comprised of hundreds of thousands of vacuum tubes and used magnetic core memory. After they started shutting down the system, Hollywood bought some of the more impressive looking components and used them as props (you can see part of the system in Dr Evil's hideout in the Austin Powers series, for example). I was a maintenance man on this system for about 7 years.
I was lucky to have seen the SAGE installation in North Bay, Ontario. I worked for Digital Equipment as a field service tech. They used our computer for the weather system.
One day I was down "in the mountain" doing maintenance and official from the base asked me if I wanted to see their IBM system. I watched the tech start up a diagnostic application, which ran a repetitive pattern which in turn created a noise through a speaker in the huge front panel. A standard telephone handset was placed near the speaker which was connected to the P.A. system.
The tech would then take a cart loaded up with vacuum tube, slowly walk down these long aisles housing the computer circuitry and tap each tube individually. If he heard the noise stop or change sound, after tapping a tube, he'd replace it. Then restart the program and carry on with the same process. It was extremely time-consuming. Tubes for computers like these were very unreliable and had a high failure rate compared to solid state equipment.
It was one helluva beast to behold. High power consumption and it took up so much space!
Love it! (I work at NAS). What I love to tell people is the importance of Pleiades in the Kepler mission (and now the TESS mission). I tell them, "Kepler and TESS weren't designed to find planets on their own. They can only gather the data. Pleiades loads the data after it's beamed back to Earth and the real search begins."
Yrs ago when I worked at a high tech company in Cupertino, one of my coworkers my friends and I ended up carpooling with was a gal married to a guy stationed at What was then NAS Moffett Field. She and her family lived in a block.of apartments that was designated for families of married enlisted. This apt complex was directly across the road from Ames, and we got to be very familiar with the facility. Ots nickname, from naval personnel stationed there, because of the very loud noise of the wind tunnels, was "Rumble Bunny."
BTW the NAS in the naval base stood for Naval Air Station. Another nickname for the trivia minded, those naval personnel
Cupertino huh... 'Tandem Computers' by any chance?
THG- My mother worked for Dr Mark. I got to see the wind tunnels and the models they used. When they were done they just thru them in trash- I got to take some home and had them for years. It was very cool to see the inner workings back then.
There’s people who collect such models
That's awesome. The wind tunnel facility where I used to work didn't let us take home any models. Apparently just because it's Top Secret you can't take it home (well, not unless you're a politician, apparently.)
Thanks for this. I did my Masters in CFD in the 90s. Used lots of Unix workstations and Plot3d etc. Also read a lot of those old NACA typeset papers. Good times
I worked at NAS for a number of years, worked on the NASA Net using advanced fiber optics to build and maintain the distributed net to all the various sites, LARK, LERC, etc.
My favorite super was the massively parallel CM1 "bubble machine". so called because you could see through the transparent panels it's processors that were cooled in a circulating, bubbling bath of fluorinert. Iy is also true that NASA and the NAS worked on many diverse areas of research, not just aero subjects, it was a privilege to contribute to the program.
Thank you for making my morning, once again! Still waiting on the Tenerife Disaster episode! Please!!
I worked in the computer hardware business for 20 years in the 1970-80's it was a facinating time.
I was a little kid then, but was always fascinated by computers. I begged my parents to take me to computer conventions where they would just drop me off in front of convention centers. I remember looking at a Lisa in awe and seeing a color display for the first time showing a map of a city. About 30 years later, I bought a house and the realtor told me he worked as a salesman selling the lisa
In 1964 I learned to program on an IBM 1620 -- also made with individual transistors. It was as big as my desk, used a built-in IBM Executive typewriter for I/O (it also could be attached to a card read-punch unit) and had about 10K of memory. It was also called the CADET -- Can't Add, Doesn't Even Try. Later, in 1988, I worked on a CRAY-2, with VAX front-ends. I think my phone probably has more computing power than either of those.
In the early 80s I lived in Moutain View, and my grandmother until her passing in the early 2000s lived in Sunnyvale. Both just miles from Moffet Field (now Onizuka Air Base) and Ames Research.
You could always tell the wind tunnels were being used because there would be this very deep droning sound that could be heard for miles around.
My grandmother also worked at Ames as well as in "the big blue cube" at Ames/Moffett. As well as the parent of my best friend in grade school through high school worked at Ames.
This is such a fun one for me.
I remember being able to hear that wind tunnel spool up. A few people I know from Sunnyvale worked at Ames.
I worked right next to Ames for several years and went on a tour of it once. The various wind tunnels there are really a sight to behold. Not sure if they're still doing tours but well worth it if they are!
The History Guy does it again!
The only thing that feels faster than watching the advancements of technology is seeing your kids grow up
And then seeing how fast your kids learn these new technologies... while you are still using your Blackberry. 🤣
@@flightmaster999 hey that blackberry was complicated
Man, the early computers that took up an entire room were incredible feats of mechanical and electrical engineering.
It's crazy to think that this 12" x 9" thing I have sitting on my bed is more powerful than any single one of those computers.
The development of the world is truly staggering sometimes.
Correction: You mentioned the Cray from 1982 was 100x slower than a modern smartphone. I looked up the iPhone 14 at 2 Tflops and that's 10,000x the 210 Mflop Cray you quoted. An amazing statistic. I looked up the Cray X-MP and it seems you have its speed right. Big numbers. Fun stuff.
I calculate the iPhone 14 as being about 7,500x faster than a single-CPU Cray X-MP but hey, close enough 😂. And talk about big numbers… a 4-CPU Cray X-MP cost $15,000,000 back then, the equivalent of over $50 million in today’s money 😮
@@MaxPower-11 The other thing that's really amazing is the 345 KILOWATTS they used, vs the iPhone using more like 345 milliWatts:-) I didn't look up the real number, but.... Also, you probably couldn't make a phone call from the Cray.
Yeah, definitely more than 100x. Modern GPUs designed for consumer PC gaming already hit 40 TFLOPS. For a few hundred bucks, it's 200,000x the computational power of the Cray.
As for power, that iPhone is almost certainly pulling Watts when it's anywhere near its theoretical peak CPU and GPU performance. Still a far cry from hundreds of kilowatts, though... or the 21 Megawatts currently used by the Frontier supercomputer at ORNL (but that 21 MW is for 1.1 exaFLOPS... i.e. 1,100,000 TFLOPS or 1,100,000,000,000 MFLOPS. It's a bit fast.)
I had the opportunity to tour some of the CDC and Cray facilities in Chippewa Falls, WI, Seymour Cray's home town. It was interesting to see how they were built. A large mass of wires on the back planes interconnecting the modules. Each one placed by hand by diligent workers, mostly if not all women. Years later some Cray employees took my Intro to Unix course at the nearby Chippewa Valley Technical College. Cray developed their own Unix variant based on System V and called it UNICOS.
What a crazy story. I was an undergraduate at MIT in the early 70s when people were talking about building a "super" computer. We were hoping to get to a Megabyte of memory. I don't think we were talking about mega flops yet. But it is amazing to see the change and progression of the computer in everyday life.
Dude.... Thank you for all the knowledge this channel has imparted upon me. With such easy access to not only information, but also misinformation, this channel is a shining example of what is right with the internet. It's nice having content come from a trusted source. Again, Thank You!
Another testament to the work of all these individuals is that anyone with an internet connection can run CFD simulations at no cost. I've used one to check the air flow through my 3d printer's part cooling ducts with various configurations & fans.
I got to see in a parallel environment the results of the fluid dynamics on several programs in my 49 year aerospace career. From the North American Rockwell B1, Space Shuttle, X31, XF32, and Lockheed F22 and X59, programs. Many other smaller programs under development at the famous Skunk Works all got attention from the fluid dynamicist working with Ames Research.
Most people wouldn't have a clue about the complexity of determining weapons deployment from an aircraft flying at Mach speeds or the use of the dynamics of drag to reduce speed of an orbiting vehicle to survive reentry into the atmosphere.
Your video is a credit to your research team and yourself.
I just find it amazing that the tablet I'm typing this from can run rings around those computers that were but just few decades ago the cutting edge of technology.
10:53 "First to use TCIP ..."
I believe that would be TCP/IP. As in "Transmission Control Protocol/Internet Protocol."
This comment is way too far down.
Thanks for your episode on NASA/Ames. I worked there from 1980 to '96 and calibrated and repaired equipment all over the base. Security wasn't significant at that time. We used to go anywhere and do everything back then. Saw a tilt-rotor fall out of the sky while doing low-level ops near NAS and main stock toward the tidelands of S.F. Bay.
I worked at NASA/LaRC and used the CDC STAR-100 and the follow on CYBER 203 (only 3 were made IIRC). I later used the CYBER 205s at NASA/GSFC and the National Meteorological Center. These were very good machines if you knew how to program them for vector instructions. This generally requires a complete rewrite but you were rewarded with very good performance.
- FASCINATING!
- Thx. Very well done.
- I've loved/watched/studied "super-computers" since the CRAY-1.
- We've come a long way since the abacus :)
Thanks for the great video! I worked in the NAS division from 1991 to 2005, through the name change from Numerical Aerodynamic Simulation to NASA Advanced Supercomputing (we had to retain the NAS acronym so old T-shirts were still good). As the PI of the virtual windtunnel project I was thrilled with the mention. For those curious about that project here is an early video ua-cam.com/video/tW5rHVlli14/v-deo.html.
I’m super proud to have been part of the NAS division.
Thanks for doing this episode. Ah, I have fond memories of NASA Ames and the communities around it. I know it wasn't about the Ames wind tunnels specifically but I spent many many hours, days testing (production and research) at the Ames Unity Plan Wind Tunnel for Boeing products including the 787-8, 787-9, 747-9 and 777-9 variants. It is the best elevated Reynolds number production wind tunnel for transonic testing in the US, maybe even the world.
I spent the better part of my 35 year career at Boeing Commercial Airplanes using and improving both wind tunnel testing and CFD to better simulate transonic flight results for practical engineering applications. There are numerous reasons that both simulations don't get it right and it still takes skilled engineers to understand why and merge the two into a cohesive combination that does, utilizing the best of each for a better pre-flight prediction.
I was an early appreciator of CGI and remember the different colored balls rolling around with shading and how the frame generation took hours if not days to generate. Now Pixar with all the new fancy computers are dazzling us with all the creativity those artist can imagine. I guess it should not be a surprise that cards/ GPU like what Invidia make are so important to supercomputers.
Great content, but the audio levels were jumping up and down by over 10 dB throughout the video, with the main narration being roughly 10 dB too low.
12:00 I remember the first time I ever saw a CRAY computer. It was at Lawrence Livermore Laboratories, gosh, in the late '80s? Seeing a "liquid cooled" computer was super WOW for a young me.
I love the idea that NASA has a supercomputer called "CRAY Aitken"
Wow, what a blast from the past. I worked at NAS in the mid-to-late 90s, specifically on high-speed (at the time, anyway!) networking and hierarchical storage management systems (those CFD simulations generate a lot of data!). I have fond memories of my time working in building N-258. Thanks for making this!
Fantastic! Love the history of computers! Made my day! :-)
3:05 - When I came out of school in 1981, I went to work for what was Datacraft in Fort Lauderdale, Florida, where the cash cow was a superminicomputer which performed at 0.5 MIPS. Ten years later the company licensed technology from Cray Computers to build a 3MIPS superminicomputer.
When I contracted for the USAF in the late 1990s on a huge air force base, there was an old decommissioned cray in one of the huge hallways, with the benches and everything. It was just outside a bowling alley.
I saw the first cray at the bradbury museum when I worked for a lab there. I think it was sitting in the corner and not even part of a real display, It was before the museum they have now and more of a warehouse.
@@sdrc92126 That's cool!
@@mercster I feel guilty sometimes of being in the right place at the right time for a charmed life. I just wish I'd realized it more at the time. I remember going with a friend since I didn't have a car and Bannana Rama's Cruel Summer was playing on the radio as we parked. Weird the stuff you remember
Yay! I went to Control Data Institute to learn Computer Technology. I was a field tech back in the 80s when IBM pcs 1st came out. I moved out of the city so changed careers late 80s. I probably could be retired now if I would've stayed in IT. I still utilize many skills from back then - not everyone can count in hexadecimal!
I only remember seeing a sea view animation by Cray-1 in 1970s. Real-time moving waves and clouds and in living color. Amazing. I could barely make a moving clock-face on PDP-8.
The smart phone you carry is more powerful than the CRAY 1 supercomputer. When I first saw the MIPS comparisons for modern technology vs old, that one stunned me!
I am part of a solar car team and we use CFD all the time to optimize our cars and I got to see first hand how it works working on our teams new car Gato Del Sol 7. Its always fun to learn the origins of these tools. I think it would also be cool if you did a video on solar racing, its really fun, has been going on longer than people think, and is getting more popular as of late.
I looked at my university's entry into an Australian race in the early 90's. It didn't look interesting, so I didn't get involved, but I remember being told the motor cost $25k and had an efficiency of something like 97% or 99%. Ifft was an electric 4-wheel bicycle that carried it's own solar cells and had to charge a few hours in the morning before it could take off
What software package do you use? Is it run on the desktop or on a server?
In the70iesI worked at a machine shop here in Minneapolis making Disk drives for the memory disks for the CDC computers. They were massive. A disk, the size of a record album had 5K of memory. We made em by the thousands for years. By the mid 80ies the disks were the size of a silver dollar, or maybe a bit bigger. Amazing stuff back then. But time marches on. Now my usb flash drive has 128G of memory.
my first computer, a Commodore Vic-20, had 5K of RAM and a cassette tape for data storage... ahh, those were the days...
@@cerealport2726 I remember playing 'Elite' on one of those in college. Didn't get my first PC, a 286 Packard-Bell with a whopping 40mb of drive space - I never thought I'd be able to fill it up, until 1991.
Amazing episode!
Feeding the algorithm on yet another outstanding video…. I love this content and respect it’s creator.
the smart phone comparison gets hilarious when you consider that in the novel Jurassic Park, one of the 'clues' that Ingen was doing something weird was the fact that they'd bought 4 Cray XMP's and networked them to run their park.
meaning that Jurassic park could be run off my smartphone, or my laptop.
You would not have the same I/O capacity.
@@GH-oi2jf Nowadays everything is networked. A single ethernet line (via USB) is enough. Granted that doesn't conform to the redundancy standards modern industrial systems use but that would just be fitting for Jurassic Park
Cool episode! Some of my graduate research was in high-performance computing (though much more recent than the systems discussed here.) Particularly, my research was focused on ways to efficiently use graphics processors in supercomputers. While developed primarily for the computer gaming market, it turns out that graphics processors are actually quite good at running most types of scientific simulations and, since they're already designed and manufactured in the millions for the PC market, it's often cheaper to build supercomputers out of those than purpose-designed HPC chips, which is a far lower-volume market. My research focused on ways to co-schedule kernels (as in programs that run on computational accelerators like graphics cards, not operating system kernels) with complementary resource requirements in order to get the most throughput out of the GPUs.
It would be cool to see an episode on the Department of Energy's supercomputers, too. The Oak Ridge National Laboratory in Oak Ridge, Tennessee currently has the fastest supercomputer in the world, as it has for a large percentage of at least the last couple of decades that I've been keeping up with it, with Jaguar, Titan, Summit, and now Frontier all holding the number one spot on the TOP500 list at the time of their entry into service. Living only an hour away from Oak Ridge, one of my graduate advisors was also working for ORNL during my time in grad school.
In the mid 90s I worked at a small Aerospace firm building a satellite that eventually flew on STS-77. Our Analytical Group was doing FEA using Pentium 133s, the fastest machines we had at our company. The supervising contractor (JPL) reran the FEA and changes were made because they had access to a CRAY their results had much greater depth, even with our systems running overnight. Funny to think of that level of computation is passe 25 years later
perhaps the biggest change in supercomputing is how what was once completely bespoke custom CPUs, is now done with commodity CPUs and GPUs in enormous arrays. GPU alone was probably a big shift. There was even one super computer awhile back that included the cell processors from Playstation 3s.
Names like Cray and SGI are now long gone from a field they were once face of to the general public. SGI was especially big in the public view in the 1990s as their machines started to be used for making the graphics for video games and at firms like ILM for movie effects.
Fun tidbit, that SGI GUI in Jurassic Park was apparently a GUI they actually did design. no clue if it ever made production but it did exist and was not just invented for the film.
"Probably"?
Little known fact: when Cray started his company, with the exception of a few experienced engineer-managers, he only wanted green engineers just out of school. He did this because he wanted engineers who didn't know Cray's ideas "wouldn't work". FYI, every U.S. made automobile made since 1996 has more computing power under the hood than in the entire Apollo Command module and Lunar Lander module COMBINED! Every Pentium-based PC has many times more raw computing power than the best super computers in the world in 1969.
1969 supercomputers were eclipsed in the 80s with the 286 processor.
Mr. Cray was my computer hero growing up. I read a Popular Science magazine in 1980 of a Cray Supercomputer that was capable of 80 MFLOPS and appeared on the cover. The article mentioned the 'C' shaped design of the 'CRAY' was to allow the shortest wire distances possible between electronic component boards within the interior of the supercomputer to speed internal comms. I later saw a decommissioned Cray deployed in the building foyer of the 'Bureau of Meteorology', Melbourne, Australia, as an artistic installation in the late 1990's.
Wow! Helping us be healthier, fly farther, and getting smarter all the time!
From 92 - 02 I worked with the EPA's Cray supercomputers, a YMP, a C94 and a MPP, it was my first job out of college. The plumbing required for their cooling needs was as impressive as their electronics if not just more interesting to look at. Those were the good old days.
You needed to sneak in a picture of the WOPR without comment. Just see whose paying attention and who gets the reference.
You might look into the Sage project as a lot of major computer technology developed and cost a huge amount. Jay Forester was a particular individual involved in that program.
Probably one of your most interesting videos. Well researched and the collections of images was wonderful. Reminded me of Paul Allen's Living Computer Museum which used to let you use a lot of these old supercomputers.
I had a customer, Pratt & Whitney, who put a PW4000 high-bypass turbofan jet engine in an Ames wind tunnel controlled not by the electronic engine control normally delivered with the engine but controlled by a supermicrocomputer running a real-time version of UNIX. The P&W controls engineer was on site trying new engine control algorithms, tweeking the parameters on the controls with the engine's turbopumps running on full in front of shocked Government wind tunnel managers... now THAT's a confident customer!
As meteorologist we also were also dependent on those supercomputers and watched as each new model allowed better atmospheric modeling.
I went to school in the 1990's shadow of the IlLIAC III and ILLIAC IV projects. We had test equipment from both programs in our ACM lab space, operated in the Digital Computing Lab building that was constructed to house those programs, and we even had random prototype and spare boards from the systems that were usually sitting tossed in cabinets, or boxes, etc. Most of my first experience with using Oscilloscopes was on a Tektronix 454 and 465 both labelled "ILLIAC IV MAINTENANCE TEAM ONLY". At the time all of that stuff and many parts from similar 1960-1980s super computing projects were just sitting in store rooms and warehouses treated as complete junk. Including VAX-11/780 through to VAX9000 systems, DECStations, and early ARPANET communication equipment. The Dean occasionally let us root around in storage for parts, but most of that foundational computing history was later just sold off/disposed of for the scrap metal. I think 1-2 boards got saved and put in a display case but the rest is all gone.
Douglas XSB2D/BTD wind tunnel model at 1:20
In my past, I grew up in Sunnyvale. (If you don't
know where Sunnyvale is, before it was called
Moffett Field, it was Sunnyvale NAS.) Ames Research
Station is still on Moffett field. Fremont HS district
had a machining class at Ames. I went.
We also did trips to a LOT of the other things at Ames.
Saw a model (quite large) on the 40 by 80 wind tunnel,
of the space shuttle. This was 1975 or '76.
Lance, my father started working at a tiny computer
memory making company, in 1970. I met one of the
founders at work one Saturday. The guy was Gordon
Moore. Dad was employee # 47. Yes, Intel.
steve
Have you already done an episode on the history of ties? The fashions, the different ways to knot them and what the knots mean, maybe not a great episode idea but it just popped in my mind. Love the show!
I can tell you a tidbit - the modern neck tie is believed to have originated in Croatia. The French word for neck tie, cravat, is a bastardization of Croat, as in a person from Croatia. I have no idea how I know that 😋
Another very interesting video. It was so good to hear the history guys theme at the end!
Brings back memories... I worked for GE Nuclear from 76 to 82 as a Fortran programmer simulating Advanced Nuclear reactors (Breeders) and we originally used a GE 6000 then Honeywell bought them out small word size (32 bits) eventually went to 64 bits with the CDC 7600 (water cooled). then the small Digital DEC-750, 780 was the fastest of that series. Easy to network with (Apple's apple talk was wonderful also) I eventually ended up at NASA Ames working for Boeing computer services supporting 200 Apple Mac Desktops, and writing code for Mac's and several other machines also.(I supported the referb of the Hubble team). My code (DQ) flew the Ozone mapping in U-2 (UB2) aircraft we had 3. Pilots were in Space Suits. Cray was sold to Silicon Graphics where it languished and then Sun Microsystems bought them ported the os to Solaris (Sun/unix). At the time it was the biggest system that could be bought (fastest and physically largest). I ended up working in SE Asia for Sun as a software guy Principal Consultant and installing the Sun/machines in SE Asia and supporting Y2K in the Philippines then Thailand. So many vendors went belly up in the bay area all the infrastructure was there. This is also why the companies like Facebook (located on old Sun campus) and Yahoo, Google (built on an old dump/concert venue) We built everything from Scratch (Software). I have been living in Thailand for over 20 years.
I visited the Goddard Space Centre some decades ago.... in the 90's! . It was fascinating. While on the visitor tour, I was blown sideways by a draught through a partially open blast door into a huge room full of supercomputers. I saw a load of Cray XMP's along with a very large Thinking Machines system. It was all going well, taking photos etc, until I received the 'very heavy hand' on my shoulder. I was spun around and marched off the premises. I don't expect you will believe any of the above but it really did happen. My subsequent visit to Fort George Meade went rather more badly. I didn't make it past security at the railway station.... lol!
Written from his prison cell at Leavenworth...
Really enjoyed watching this. Thanks for making.
the dialog is basically a whisper compared to the exit music.
thanks
I have worked in the NASA community most of my professional career, so I found this a very interesting listen. Listened to it on my way home from NASA's GSFC this evening in fact. Naturally it spawns multiple questions, so I'll ask the most important one: What is the bumper song at the end? It sounds like Lindsey Buckingham, so I'd guess it's a Fleetwood Mac song, and yet I don't recognize it.
I once visited the "Cray Room" at Laurence Livermore Labs. I was told that the Cray 1's had air conditioning units in the surrounding benches. The Cray 2 was liquid cooled with a separate device used to remove and store the fluid.
Interesting, thanks! I was disappointed that you made no mention of the super-mini Elxsi multi-processing computer of the 1980's, where I had been employed -- later rechristened the Trilogy for marketing reasons. We boasted as being the second-fastest computer to the Cray of the era, and these systems were installed at all major national labs and used extensively for flight simulation systems. Thank you, HG!
I remember the first time I used a Cray. It was for a structural analysis problem (FEA) for Space Station in the early 90s. I probably could have just used a Unix workstation, but I just wanted to run it on a Cray because I could 😊
A very interesting history on the development of the supercomputer and the cray in particular.
One of the reasons that China got so advanced in computing technology and became what they are today is primarily due to the fact that President Clinton when he was in office decided to equip China with Cray technology by selling them supercomputers.
Previously there had been a ban on selling US technology to the Chinese because of their human rights issues and other things. China insisted that the super computers were going to be used so that they could plot and map weather more properly. A very thin reason but it was accepted by the administration at that time.
Considering what's happening now with China, someday we will be hearing from historians about the mistake that was made by giving China access to modern US technology.
That's a similar story to the English Labour party giving Russia Rolls Royce To Let jet engines as fitted to the current Gloster Meteor fighters
And also captured German papers on jet and rocket engines, because the stupid Labour party leading being a socialist
So that is how the Russians went from prop to jet with 5 yrs rather than the 20plus. years the Americans thought
Oooopppssie