Excellent taste in anime, Lain is.
I still find it interesting how much they basically got right in that one, too, with the roles of online social presences and the real world becoming ever more intertwined, and the online affecting the offline in more and more ways.
(Despite the man in the intro, I've never actually believed it was set in the '90s. For one, they have a universal processor standard! Even if it were just an alternate '90s, that still wouldn't actually be the "present day, present time" of the audience.)
Apparently it doesn't play out so well for younger folks? They seem to get hung up on the literalism of their computers, and on the fact that our internet doesn't actually look or feel the same to use as The Wired does in Lain. But I find that strange, as I was able to look past the differences in literalism and still appreciate a piece of speculative fiction about the internet made just as the WWW was really taking off.
But then even people who like the show seem to struggle with the metaphors sometimes. Like people who get hung up on how exactly erasing your online presence removes your real-world presence, or how exactly the online Lain was born; to me these are, again, metaphors. Even though you can't will yourself into literally blinking out of existence, in our modern day, if you suddenly stop using your social accounts it will be as if you'd actually died to all of those people. The online Lain is presented in the narrative as a genuine external consciousness, but I see it more as a representation of the struggle with ourselves and our urges that we can experience under pseudonymity. "Why would I say that to someone, that's terrible", etc. Anyway... :)
@@floydjohnson7888 Predictions can't account for breakthroughs, so a lot of stuff happens much sooner. I just wish they would hurry up their research into age-related stuff and diseases, etc.
Around that year, Toshiba had announced its smallest 486 notebook, the Libretto 20. It was almost the same size as a VHS cassette, with a 6" TFT at 640x480 resolution with up to 65k colors, and 16-20 MB of RAM.
They were explicitly trying to replicate passive matrix? I guess I shouldn’t be surprised. When I was little I used to use that, because I liked drawing shapes with it, such as one did with the common frozen window and so on. Turned it off at nine years old and never went back to any intrusive or “interesting” cursor. Just black for me please. Though I do appreciate a loading wheel vs an hourglass nowadays.
@@kaitlyn__L I don't think they were trying to replicate it. From what I've heard some times it would look like your cursor disappeared so the trails made it more visible.
In 1996, we were already developing the future of VR and AR. Twenty years have passed and remarkable hardware and software development has been achieved, but popularization is still far away. We must dream and prepare for the next 20 years.
The problem is that most people don't want a VR headset on their head after a while. It looks stupid, it's quite annoying to wear plus you can't see what you are doing in the room.
@@raven4k998 Bite the extra 10 seconds on boot and get some light changing bloatware, only way to avoid said puke. I swear, why rainbow... why everywhere...
Note the Acorn-branded keyboard controlling the NC. Acorn were involved in the project probably because they had the perfect processor for such devices (and the in/out knowledge of it), plus said processor was available as a SoC, Acorn having invented the modern SoC IIRC.
Man, I cannot even get an idea of where that 1 TB/s estimate came from, assuming they are doing standard raster video and not some sort of proprietary 3D hologram stuff. Even with zero compression, where every frame is a BMP, here are my rough estimates:
A 640x480 24-bit BMP = 900 KB, × 30fps = a theoretical connection of ~27 MB/s per stream
A 1920x1080 24-bit BMP = 5.93 MB, × 30fps = ~178 MB/s per stream...
7680x4320 (8K resolution), × 30fps = ~2848 MB/s per stream...
OK, that is nowhere near 1 TB/s per stream. 1 GB/s would be overkill, but at least there I could understand it a bit more.
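For what it's worth, the comment's back-of-envelope figures check out. A quick sketch (function name mine; assumes 24-bit color and 30fps as the commenter does, with 1 MB = 1024×1024 bytes):

```python
def raw_stream_mb_s(width, height, fps=30, bytes_per_pixel=3):
    """Bandwidth of uncompressed 24-bit raster video in MB/s."""
    return width * height * bytes_per_pixel * fps / (1024 ** 2)

for w, h in [(640, 480), (1920, 1080), (7680, 4320)]:
    print(f"{w}x{h} @ 30fps: ~{raw_stream_mb_s(w, h):.1f} MB/s per stream")
# 640x480 lands near 26.4 MB/s, 1080p near 178 MB/s, and 8K near 2848 MB/s --
# even uncompressed 8K is orders of magnitude short of 1 TB/s
```

So even the most generous uncompressed case doesn't get close to the figure quoted in the episode.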
I think we might not be dealing with digital video here, but rather a digitized form of the analog signal. Let's say NTSC at 6 MHz: the Nyquist theorem says that to accurately describe an analog signal digitally you need to sample it at double the frequency, so unless I've missed something that gives us a sample rate of 12 Msamples/s. But at what bit depth? Let's for the sake of argument say 30, so we end up with 12,000,000 × 30 = 360 Mbps. Hmm, still way less than the number they said; I must be missing something very basic, or they were basing it on something way better than NTSC. Hey, we are over-analyzing this (at least I am) way too much; the person on camera might just have misspoken (tera instead of giga). My example above gives us 45 MB/s per stream before overhead, so with 4 people in the conference we are already at ~200 MB/s. Hold on a sec: 200 MB/s = 1.6 Gbps. This was a simple units screw-up: tera instead of giga, and bytes instead of bits.
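The Nyquist estimate above can be sanity-checked the same way (a sketch using the comment's own assumptions: a 6 MHz NTSC channel and a generous 30 bits per sample):

```python
bandwidth_hz = 6_000_000             # assumed NTSC channel bandwidth
sample_rate = 2 * bandwidth_hz       # Nyquist: sample at twice the highest frequency
bit_depth = 30                       # bits per sample, as assumed in the comment
bitrate_bps = sample_rate * bit_depth
per_stream_mb_s = bitrate_bps / 8 / 1_000_000

print(f"{bitrate_bps / 1e6:.0f} Mbps = {per_stream_mb_s:.0f} MB/s per stream")
# 360 Mbps = 45 MB/s per stream; four such streams is ~180 MB/s (~1.4 Gbps) --
# consistent with a tera/giga and bytes/bits mixup rather than a real 1 TB/s need
```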
Maybe they meant 1 terabit per second, reading off a sheet, like people today who confuse megabit and megabyte and will say you need a 20 megabyte connection, etc. It would still be a very high estimate, but it would bring it down a lot.
Hahaha, oh man, requiring 1 TB/s transfer speed for live video chatting? I'm so happy that's incorrect, else we wouldn't have it today. A 720p @ 30fps chat is only about 250 KB/s maximum. Phew :P
They are referring to the bandwidth that would be needed for quality indistinguishable from reality. While 720p/30fps might be reasonable for a consumer-grade device, mobile phone, etc., corporate telepresence systems have high-resolution, high-frame-rate screens almost the size of a conference room wall and incredibly high-fidelity audio, and do consume up to several Gbps. They would consume a lot more without compression, and it was probably inconceivable at the time that a CPU would be able to perform such massive amounts of decompression in real time. It would have been considered easier to bundle multiple network interfaces than to build a computer with such compression/decompression processing power, which at the time would have been the realm of supercomputers.
If 720p at 30fps only uses 250 KB/s, then how come video chatting on fiber-optic broadband and/or high-speed mobile data networks still looks like garbage on anything above about an 8" screen AND is choppy as shit? Raw theoretical performance on paper doesn't stop bottlenecks occurring in real-world situations.
1996, and dialup was still the way to get on the 'net. Delphi had introduced their Internet gateway services in Oct or Nov of 1992. 33.6 kbps dialup was the norm; V.90 and 56 kbps were still a few years away, and G.992.1 DSL was 3 years in the future. Here we are in 2022 (26 years later), and there are still places where it is impossible to get wired service faster than 10 Mbps, and places where it is impossible to get wired broadband at all. Fixed wireless or satellite is prohibitively expensive.
I had cable modem in 1997. It was slow by today's standards, I believe just 5 Mbps but it absolutely made you pass out from joy over the speed differential from 56k.
I got a PDA bundled with my first computer, which was a Dell, back in the early 2000s. I couldn't really find a use for it. I forgot the password and, after many tries, I gave up... bricked.
I think this is what innovation at its core looks like: messy, confusing, and all over the place. It's up to the leaders to make sense of it, discretely and as a whole. Then comes how to monetize it, followed by feasibility, prototyping into a viable product, and supply chain concerns. Then a full-fledged product and market strategy, and then branding, pricing, and marketing. This all could take a decade or two for full fruition, depending on the sophistication of the technology itself and/or consumer behavior.
They had to look for a product solution. The whole Copland project bled millions and was deemed a massive disaster that sealed the Mac's fate to less-than-sunny future days.
Good ol' BeOS, OS/2, etc. I remember some of these from back in the day, though I didn't know what it all meant back then. All that work back in the day to give us the experiences we take for granted today.
That terabyte per second thing might have actually been teraBIT since they were talking about data transfers over a network connection which typically are noted in bits per second and not bytes per second.
Link speed is commonly described in bits per second, but actual file transfers are usually discussed in bytes per second; Steam, or a torrent program, for instance, does the latter. This has obviously been complicated further by video bitrates being talked about in bits, and by carrying that over to video calls nowadays. But back then, I could see why Sun would say "that's an application layer transferring blocks of data, not a serial string of bits in the link layer, so we'll call it bytes per second". Now it sounds strange, as we have shifted to generally talking in bits for live speeds like video conferencing, so hearing them use the other measurement in this example sounds very odd to us. And obviously, decades of ISPs calling Mb/s "MB/s" in advert voiceovers hasn't helped at all. "8 megabyte internet speed", not even "per second", was common everywhere ten years ago, even tho, of course, you only got 1 MB/s out of 8 Mb/s. I'm sure tech support is still getting calls about "only" getting 12 MB/s in Steam or Battle.net on a 100 Mb/s line to this day.
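The bit/byte mixup described above is just a factor of eight; a minimal sketch (function name mine, numbers matching the comment's examples):

```python
def advertised_mbps_to_download_mb_s(link_mbps):
    """Advertised link speed in megabits/s vs. the MB/s a download dialog shows."""
    return link_mbps / 8

# the "8 megabyte internet" of old adverts really delivered 1 MB/s of file transfer,
# and a 100 Mb/s line tops out around 12.5 MB/s in Steam or a torrent client
print(advertised_mbps_to_download_mb_s(8))    # 1.0
print(advertised_mbps_to_download_mb_s(100))  # 12.5
```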
Ahh the BeBox, I remember that. Played around with BeOS as well, never saw it running on the Be hardware though. Would make an interesting video if anyone could find one.
He said "right now PCs are based around the motherboard; in the future they will be more like a backbone". Yeah, no, mate, it's still a motherboard and still looks exactly the same, with only slight differences in connectors and lanes.
The Transphone idea - a computer with basic functionality, designed around a web browser - was realised as the Chromebook many years later. Interesting to see its roots there.
@@joeysluzer1913 Yeah, the net appliances, for example. I wonder if the Chromebook was more successful than the others, tho. Kinda depends on the country you lived in, too.
7:10 - "will require data transfers of over 1TB/sec". Curious how they arrived at that number. I know advanced video compression schemes did not exist yet when this was filmed, but still.
BeOS 4 was a great OS; I couldn't crash it. Not enough apps, though. Adoption of new OSes/standards is hard: look how long Linux has been around, and it still isn't mainstream on the desktop. Much of the computing world is timing, timing, timing.
Palm knew they were in deep doo-doo when the BlackBerry came out. I remember the look on Carl Yankowski's face (president of Palm at the time) when I set up his BlackBerry for him to play with. He asked me what I thought; I told him we were in big trouble, as our Palm VII, the only probable competition for the BlackBerry at the time, had been delayed yet again. I left soon after.
Fast forward: Palm buys BeOS to put on their phones, as Palm's own OS was too outdated to do much with, and uses BeOS as the foundation for the new Palm OS, but it was too little too late. Fast forward again: HP buys Palm to get into the tablet market (late, after the failed, stupid decision by HP's CEO to get out of that space during the EDS acquisition), and therefore BeOS/Palm OS, which, after more development as webOS, was put on the short-lived HP tablets. Fast forward again: HP sells BeOS/PalmOS/webOS to LG, where it is further developed and shipped on LG smart TVs and appliances. So the roots of BeOS are still around.
Had HP jumped in fully with webOS, they could have competed with Android, Mac, and winderz, and possibly gained a foothold in the winderz market from MS, but HP was too busy wrecking EDS and screwing its employees. - Former Palm, EDS, HP employee who was there for all that stupidity.
It’s definitely well-suited to TVs. I tried the UI Samsung was offering at the time I was buying, as well as Android TV, and both were laggier, less well laid-out, and didn’t switch apps anywhere near as well. I loved the Palm Pre line and that version of the OS, so it was kind of nice to get to use it on the TV again, even though there’s so many differences as well. The smoothness and ease of use were retained. I do find it interesting they kept the name webOS through both the HP and LG buyouts, instead of each company trying to call it something totally different.
Yea, the Palm Pre was simply too late to the game. It more or less came out when the Android phones made by the big names came out in force, and the phone operators backed those heavily. I remember the display area for the Pre was in an obscure section of the store, while the Android phones were in rows of prime real estate. It had zero chance at that point.
Oracle really could foresee the future in the 90s. I remember seeing an interview with Larry Ellison where he talked about selling software over the web: "why am I going to buy this in a box at the store?"
At 7:00 he says telepresence will require data transfer speeds of 1 terabyte per second. Surely not. Apparently the world's fastest SSDs, by Seagate, can only manage 10 gigabytes per second today in 2017.
I think they assumed network speeds would keep doubling, which they didn’t really, and that it would be a more feasible requirement than doing all the compression we actually do. Of course, it turned out data infrastructure rollout sucked in most of the world and we got more and more accelerated decoder hardware instead of having to use CPU for everything. (Even pretty beefy CPUs can struggle with just one H265/VP9 video without acceleration, like using 70% of a 2012 desktop i7’s entire capacity, and about 110% of a 2012 laptop i7. And that’s more or less how fast processors were until Ryzen came out and kicked Intel into gear a few years ago. That compression acceleration is a big deal.) It’s funny the ways people get stuck in the present when trying to visualise the future. I could just imagine them going: “we can barely handle running one MPEG video at low res now, how could we do dozens of video feeds for teleconferencing in high quality? I guess we just need lots of bandwidth?”
$299 for the Palm Pilot?! That's SO CHEAP, even WITH inflation! That's crazy!!! Holy shit! Could you imagine getting the latest bleeding-edge modern smartphone for less than 600 bucks today?! They must have been making those margins RAZOR THIN!
Almost every one of these products was a failure. Apple was at the height of its colossal failure as a company in 1996, but they would soon make a comeback!
@@soylentgreenb Pretty sure they're just referring to the HoloLens or Magic Leap style of UI, which was just starting to exist in prototype form when they commented, and is still in very early stages and takes up a small FOV while I'm writing this. We're probably 5-10 years away from it being at all mainstream, and another 20 before it's expected to be the norm (if it even does displace smartphones - but if it could fit in a regular pair of glasses, I could see it potentially displacing them in the long term, like in "Dennou Coil").
You can buy new-old-stock Pippins on eBay straight from Japan. The problem is that new development is tough, because to push games you need a developer dongle that cannot be found any more.
Wow! Back then I already had a CD-ROM drive, and was transferring my CDs and records to a hard drive on my 486, playing with the first Linux OSes, drooling over BeOS because it was very smart...
It would have been amazing futuristic technology from the year 1965, though. However, a kid from the 1960s would have been disappointed that there were no mobile, man-sized sentient robots like in the TV show Lost in Space by the 1990s.
@@m9078jk3 That's up there with the "flying cars" thing. At best, we've a limited-production hybrid vehicle that requires two licenses. Technical concerns aside, the consequences of "rude driving" - the family of inexplicably trendy habits like tailgating, deliberately driving too fast for conditions, and blowing through stop signs - would be even more dire than they are on the roads today.
Hey Andrew Gong, I thought I was the only one that felt that way; I'm right there with you, bro. I know you're going to think I'm crazy, but today's tech is just BORING. I like everything about yesteryear's computers and how they looked and functioned. Technology back then was also just more interesting in how it was presented, demonstrated, and how it worked. I can't quite explain it, but I think you kinda understand what I'm trying to say. Today's technology and computers are just BORING. I wish I could just find a time machine and go back to the 90's. LOL :-) By the way, thumbs up to you. :-)
I like the amount of innovation back in the old days of the 80s and 90s, before we had these stupid smartphones and high-end Mac Pro computers. The problem in 2021 is that tech is stagnating; there's no innovation anymore.
Probably has something to do with Moore's Law hitting a wall - we're quickly approaching the limit of how many transistors we can cram into a silicon wafer, which is why we're just seeing slight "tick" process improvements, more cores, etc. Until we move to a new material (graphene has been discussed for years but still hasn't made it out of university labs and into production), I don't see any further drastic increases in processing.
There are many sides to this. Back in the 80s/90s, computers were more expensive than today. Imagine spending those sums on hardware that would feel ancient and barely usable after just a couple of years. Back then new features were introduced all the time, but the hardware of the day was very weak and couldn't handle them well. As for the lack of innovation: what features are you missing today?
That AR and VR doesn't look that different from what we have today. And we're still far away from terabits per second being common internet speeds; hundreds of times the resolution they had back then (640x480?) would mean something like 16K or 32K monitors.
7:10 1 TB per second, lol? 5000 megapixels of uncompressed video? Even at that time, with no compression (spoiler alert: they had some), what kind of insane resolutions were they predicting to need a TB per second?
25:52 I've got to get me that Desk Stick, marketed for people whose first computer is a laptop but who switch to a desktop. So funny to see what people were coming up with.
Brian Cairns It was a beta version of the next generation of Mac OS, named Copland. It went under, though, because of various problems. Some of it was re-used in Mac OS 8.
No, it's the Aaron extension, which themed your desktop like Copland (Mac OS 8), by Greg Landweber; he eventually went on to produce Kaleidoscope, Aaron Lite, Greg's Browser, etc.
Yeah, wtf was that? A grim reaper coming for the Palm Pilot, courtesy of Steve Jobs, lol. I get the joke: Steve didn't want them beating Apple to the smartphone game, lol.
The Transphone: it's not cloud computing! You need more horsepower than a 286, because the server won't improve web page rendering speed. For 1996, that's seriously obsolete right out of the box.
Ah, the Pilot, or soon-to-be PalmPilot. Remember wanting one of these back in college and the early working years. Funny to think that calendar, note-taking, and task apps are built-in to all smartphones now.
Definitely a prophetic episode. What a time it was to identify one of a number of major future computing paradigms and get rich. (I still kick myself for not squatting on certain domain names in 94-95, but it was *expensive* to do that and I was just a Jr LAN tech who connected my 250 employee division to internet email on a 28.8K modem, and it actually worked fine. My corporate IT leaders thought I was a boy genius...and I soon quit. haha. But also too poor to get rich.)
Still waiting for that 1 terabyte broadband speed!
+Claude “Reviews4U” Rains my jaw dropped when they said that. Haven't they heard of compression? 4k is only 16MB/s XD. Though I bet they are using AVI across a network :D
The Emailer Plus was worse haha
Well, try running the (at the time nonexistent) H.265 codec on the hardware of the day; I think you would be lucky if you got about one frame per minute, even for low-res video. So at the time, the 8 Tb/s figure (at 7:06 in the video) might not have been inaccurate. As for today, I would say 25-50 Mbps per stream, considering that we want quality and very little encoding/decoding lag.
I know, right? 1 TB a second? How the fuck are you going to get that speed? It's 2020, and the fastest home-use speeds are like 10 gigs, in very select regions.
Still waiting 😢
I've been watching a bunch of these Computer Chronicles videos. It's fascinating to watch all of these then-new products and technologies being introduced as potential next-big-things, while knowing from our position of hindsight which of them would be destined to succeed and which of them were dead ends doomed to fail and disappear.
I’d love to go into the past to show them our present
I'm fortunate to be old enough to have experienced this era of computing (was a pre-teen when this aired), but still young enough now to stay reasonably current/comfortable with modern tech. People born in 2000+ take the instant access to volumes of information for granted, but I remember a time when everything was split up in proprietary software/services/devices. People don't know how good they have it.
@@yellowblanka6058 I know exactly what you mean, being a pre-teen (or teen, for that matter) in the 90s. You get the best of both worlds, and it provides some very valuable insight. :)
I'm doing the very same thing 2 years later, and I was just thinking this exact same thing! I watched an episode on spreadsheets this morning and noticed all the stuff Excel stole from its competitors.
dont copy that floppy
“Will we add computing power to our television sets, or will we add TV capability to our personal computers?” Yes. The answer is yes. 😆
The first guy almost got it right: we added the computer and the TV to our phones. Beyond that, we didn't really add computing to our TVs; people rarely surf the web on them, and you can't bank or buy stuff on your TV. All the streaming sticks and boxes let you do is watch videos (or listen to music) on your TV.
@@straightpipediesel You can install web browsers on some of them, and some smart TVs have browsers.
First added "pirated tv shows" and youtube to pc, then added "Android phone inside television runs Netflix app".
This 90's vibe
"Don't copy that floppy!"
My heart swells with nostalgia.
My brokeass me : *inserts blank floppy to a: drive
* evil laugh hahahahahahahhaah
Don't Bundy that Book!!!
I have no nostalgia for the DRM of proprietors.
It was the olden days.
They only said that to reduce piracy and man I was all about copying, lol
14:15 Show me how you sync the Device.
14:16 Swallows nervously
14:31 Corrects the letter w twice
14:36 "This allows you to enter text into the system at up to 30 Words a minute, with a 100% accuracy."
I Call Your bluff, Ed
100% accuracy.... once i correct it
TBF all of his letters besides W were done really well. It looks like he kept accidentally doing R. Probably because of Graffiti needing such specific shapes. I think he might have started writing Stuart before realising he needed Stewart as well? But yeah, it’s definitely not 100% all the time lol. He could’ve said “up to 100%” but I guess he already said “up to 30WPM”.
Ed is the kind of guy who could sell you anything. I would just freeze and run after the first failed attempt.
@@kaitlyn__Lyeah he said "up to 30 words a minute WITH 100 percent accuracy" obviously he wasn't quite 100 percent accurate 😅
But honestly for 1996 technology it seems to have worked really well. My iPad doesn't recognize my handwriting half as good (granted I scribble like a toddler)
@@sandrinowitschM I took that as part of the “up to”, but certainly that’s a bit “advertising” in terms of weasel words.
Funnily enough, opposite to the Newton in the 90s, I’ve found the iPad does better with cursive. Probably because it has more rules and cues, versus print where the same squiggle could be c, i, u, e potentially, etc? It gets a few words wrong but the “replace” option usually has the intent inside. It struggles most with punctuation for me!
The Palm Pilot was closest to the future we now have with smartphones and tablets.
couldnt help but think of Tron 2.0. specifically the part where you go into a pda.
Remote meeting... not widely accepted until 2020...
It's interesting how this show moved from a heavily corporate-oriented approach to a consumer-oriented one within 10 years or so.
It made sense; the home PC explosion didn't occur until the early 90s, right about the time DOOM launched. In fact, many credit DOOM with saving the PC in the 90s. Prior to that, PCs were very much being marketed for corporate environments.
That's a myth. The Amiga was already dying, and the Mac was very marginal, in Europe at least.
That’s how computers evolved
And back into the corporate-oriented approach with SaaS and cloud computing, lmao.
It is because only corporations could afford most computer tech at the beginning. That stuff was absurdly expensive.
3:45 "You got a clock in there? A real time clock?" Yes. A real. time...... clock.....
A real-time clock (RTC) sounds redundant but actually means something very different from the system clock: it's a battery-backed hardware clock that keeps time while the machine is powered off. Even modern Raspberry Pis still lack one, as do plenty of embedded systems.
11:08 His demonstration of browsing the web was really just opening a local .html file.
It probably was functional, but I would guess that they didn't have an internet connection available in the studio.
It looked like it was attempting to load the Apple webpage when he started Navigator but it wasn’t loading so he went to a local version to show what it would look like.
Yeah, in another episode there was a guy showcasing Internet Explorer, and it said "Working Offline" at the top...
Lol. Busted.
Stewart Cheifet's combover is fantastic
matt jones It's worked like a charm on Stewart since 1983!
Leonardo Antonio And still does as of 2015 :) Wow, that's more than 30 years of the same combover, lol.
Lol, not sure who he thinks he's fooling with that. Looks so much worse than just admitting you're bald.
Never change a running system!
@@yellowblanka6058that was back in the day when the only worse thing to being bald was being dead 😂😂😂
We need to bring back the phrase "cruising the net"
1:53
- Give me a kinda guided tour of the box
- The objective is to give to our customers the kind of services they want
thank you! that was a very thorough guided tour of the box
I had a Be Box. It was awesome if you wanted to watch five videos at one time and liked yellow title bars. No apps at all. Seriously though, it had potential.
The part that I still find interesting about Be is the way it handled working with various file systems. It’s still mostly redundant compared to today’s functionality, but it’s one of the parts that still works uniquely enough that I find it cool. Whereas a lot of the other USPs it used to have, have been absorbed into various other OSes over the years, hence why Haiku is also having a hard time gaining traction.
I do genuinely like the variable-width title bars, and how they look like browser tabs. Dragging them together to combine windows into a single tabbed window is something we’re still having some apps refuse to play nice with, so making it be system-level seems fairly effective. (I don’t remember if BeOS actually did that, or if it’s just Haiku doing it with our modern sensibilities about tabs.)
Hah, you can tell Stewart LOVED the Transphone. The way he said "no!" when Tom told him it was only 500 for the color version. Definitely sold one there.
Haha probably. He did say in an interview that he would buy a lot of this tech stuff and still keeps most of it so it’s not crazy to think he bought one.
Excellent taste in anime, Lain is.
I still find it interesting how much they did basically get right in that one too. With the roles of online social presences and the real world becoming ever more intertwined, and online affecting the offline in more and more ways.
(Despite the man in the intro, I’ve never actually believed it was set in the ‘90s. For one, they have a universal processor standard! Even if it were just an Alternate ‘90s; that still wouldn’t actually be the “present day, present time” of the audience.)
Apparently it doesn’t play out so well for younger folks? Who seem to get hung up on the literalism of their computers and that our internet doesn’t actually look or feel the same to use as The Wired does in Lain. But, I find that strange, as I was able to look past the differences in literalism to still appreciate a speculative fiction about the internet which was made at the start of WWW really taking off.
But then even people who like the show seem to struggle with the metaphors sometimes. Like people who get hung up on how exactly erasing your online presence removes your real world presence, or how exactly the Online Lain was born, but to me they’re again more metaphors. Even though you can’t will yourself into literally blinking out of existence, in our modern day, if you suddenly stop using your social accounts it will be as if you’d actually died to all of those people. Online Lain is presented in the narrative as a genuine external consciousness, but I see it more as a representation of the struggle with ourself and our urges that we can experience with pseudononymity. “Why would I say that to someone, that’s terrible” etc.
Anyway... :)
Stewart does seem like a guy who owns a landline smartphone X'D
He might still have one in his garage!
3 years late, but he said "wow".
I love it how the 90s predates everything and it's 2019 and most of it is true and here
How about George Takei's observation that some of the tech depicted in "Star Trek", set in roughly 2270, was a reality in 2016?
@@floydjohnson7888 Predictions can't take into account for breakthroughs, so a lot of stuff happens much sooner. I just wish they would hurry up their research with age related stuff and diseases etc.
The 90s doesn’t predate everything. The 80s existed too.
@@JasonZakrajsek I'll give one thing to the 80s, cellphone and CD roms .
The VR one blew my mind.. stuffs been around forever, but finally starting to mature now we have the power to do everything
Around that year, Toshiba had announced its smallest 486 notebook, the Libretto 20.
It was almost the same size as a VHS cassette, with a 6" TFT at 640x480 resolution and up to 65k colors, and 16-20 MB of RAM.
yeah she was a beast with that little pentium 75 cpu in her
@@raven4k998 No
Ahhhh, the good ole days of passive matrix screens and the reason there is a "Pointer Trails" setting to this day.
They were explicitly trying to replicate passive matrix? I guess I shouldn’t be surprised.
When I was little I used to use that, because I liked drawing shapes with it, such as one did with the common frozen window and so on.
Turned it off at nine years old and never went back to any intrusive or “interesting” cursor. Just black for me please. Though I do appreciate a loading wheel vs an hourglass nowadays.
@@kaitlyn__L I don't think they were trying to replicate it. From what I've heard some times it would look like your cursor disappeared so the trails made it more visible.
18:21 that's the most 90s head tilt I've ever seen.
0:30 - First ever concept for a curved ultrawide monitor.
yeah, it's like an 8K monitor too
In 1996, we were already developing the future of VR and AR. Although 20 years have passed, remarkable hardware and software development has been achieved, but popularization is still far away. We must dream and prepare for the next 20 years.
The problem is that most people don't want a VR headset on their head after a while. It looks stupid, it's quite annoying to wear plus you can't see what you are doing in the room.
They missed out on predicting the single most important advance in 21st century computers....L.E.D lighting.
Blue LEDs and its consequences have been a disaster for the human race.
@@wendysremix Red in China, Europe or Russia.
@Lord_haven111 ... Yep. You can not avoid it. I just set the few lights I have to a steady matching color.
@@Olivia-W rgb puke is the worst by far oh look at the pretty lights that makes my 1500 dollar pc worth 9000 dollars
@@raven4k998 Bite the extra 10 seconds on boot and get some light changing bloatware, only way to avoid said puke.
I swear, why rainbow... why everywhere...
A ton of jewels on this one, the Pippin, BeBox, the Pilot, all stuff I saw in magazines and could only dream to have.
Note the Acorn-branded keyboard controlling the NC. Acorn were involved in the project probably because they had the perfect processor for such devices (and the in/out knowledge of it), plus said processor was available as a SoC, Acorn having invented the modern SoC iirc.
Thank you for pointing that out! The fact it was going to use an ARM processor just makes it even more like modern day phones and Chromebooks.
The seamless desktop and lifelike video conferencing was cool to see. That looked very modern for the time.
That was all simulated obviously.
@@thedopplereffect00 and apparently involved 1 TB/s of bandwidth at at least one point between the two parties
@@thechemtrailkid when they mentioned it would require 1 TB/s it shocked me. That long ago and they really thought it would need that much bandwidth?!
Man, I cannot even get an idea of where that 1 TB/s estimate came from. Assuming they're doing standard raster video, and not some sort of proprietary 3D hologram stuff. I mean even with 0 compression, where every keyframe is a BMP, here are my rough estimates:
A 640x480 24bit BMP = 900 KB. x 30fps
That would require a theoretical connection of ~27 MB/s per stream
1920x1080 24bit BMP = 5.93MB. x 30fps = ~177.9 MB/s per stream...
7680x4320 (8K resolution) 24bit BMP = 94.9MB. x 30fps = ~2848 MB/s per stream...
Ok that is nowhere near the 1TB/s per steam. 1GB/s would be overkill, but at least there I could understand a bit more.
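For anyone who wants to check the arithmetic in the comment above, here's a quick Python sketch. It assumes 24-bit color (3 bytes per pixel), 30 fps, and binary megabytes (2^20 bytes), matching the commenter's figures:

```python
# Uncompressed raster video bandwidth per stream, as estimated above.
# Assumption: 24-bit color, 30 fps, "MB" meaning MiB (2**20 bytes).

def raw_bandwidth_mb_s(width, height, fps=30, bytes_per_pixel=3):
    """Bytes per second of uncompressed video, expressed in MiB/s."""
    return width * height * bytes_per_pixel * fps / 2**20

for name, (w, h) in {"VGA": (640, 480),
                     "1080p": (1920, 1080),
                     "8K": (7680, 4320)}.items():
    print(f"{name}: {raw_bandwidth_mb_s(w, h):,.1f} MB/s per stream")

# VGA lands around 26.4 MB/s, 1080p around 178 MB/s, and even 8K around
# 2,848 MB/s -- still roughly three orders of magnitude below 1 TB/s.
```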
I think we might not be dealing with digital video here, but rather a digitized form of the analog signal. Let's say NTSC at 6 MHz: the Nyquist theorem says that to accurately describe an analog signal digitally you need to sample it at double the frequency, so unless I've missed something that gives us a sample rate of 12 Msamples/s. But at what bit depth? Let's for the sake of argument say 30, so we end up with 12,000,000 x 30 = 360 Mbps. Hmm, still way less than the number they said; I must be missing something very basic, or they were basing it on something way better than NTSC. Hey, we are over-analyzing this (at least I am) way too much. The person on camera might just have misspoken (tera instead of giga). My example above gives us 45 MB/s per stream before overhead, so with 4 people in the conference we are already at ~180 MB/s. Hold on a sec, 180 MB/s is about 1.4 Gbps: this could have been a simple units screw-up, tera instead of giga and bytes instead of bits.
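The analog-digitization estimate above can be written out in a few lines. Note the 30-bit sample depth is that commenter's assumption for the sake of argument, not part of the NTSC standard:

```python
# Digitizing an analog NTSC signal per the Nyquist criterion:
# sample at >= 2x the signal bandwidth. The 30-bit depth is an
# assumption from the comment above, not an NTSC figure.

signal_bandwidth_hz = 6_000_000          # NTSC channel bandwidth
sample_rate = 2 * signal_bandwidth_hz    # 12 Msamples/s (Nyquist)
bits_per_sample = 30                     # assumed bit depth

bits_per_second = sample_rate * bits_per_sample
print(bits_per_second / 1e6, "Mbps per stream")   # 360.0 Mbps

bytes_per_second = bits_per_second / 8
print(bytes_per_second / 1e6, "MB/s per stream")  # 45.0 MB/s

# Four conference participants:
print(4 * bytes_per_second / 1e6, "MB/s total")   # 180.0 MB/s, ~1.44 Gbps
```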
Maybe they meant 1 terabit per second... reading off a sheet, and, like people today who confuse megabit and megabyte, will say you need a 20 megabyte connection, etc. It would still be a very high estimate, but it would bring it down a lot.
16:11 - Picnic table with 3 computer displays. What a sweet setup. (286 IBM, 386 or 486 tower below table, and a Unix workstation.)
It's interesting seeing some of these early devices do things that are pretty common on the internet today.
hahaha oh man, requiring 1 TB/s transfer speed for live video chatting? I'm so happy that's incorrect, else we wouldn't have it today. A 720p @ 30fps chat is only about 250 KB/s maximum. Phew :P
employing compression that was not available at the time, and even if it had been, you'd have needed a supercomputer to do it at any reasonable frame rate
They are referring to bandwidth that would be needed for quality indistinguishable from reality. While 720p/30fps might be reasonable for a consumer grade device, mobile phone, etc., the corporate telepresence systems have high resolution, high frame rate screens almost the size of a conference room wall, incredibly high fidelity audio, and do consume up to several Gbps - and they would consume a lot more without compression, since it was probably inconceivable at the time that a CPU would be able to perform such massive amounts of decompression in realtime. It would have been considered easier to bundle multiple network interfaces than to achieve a computer with such compression/decompression processing power, which at the time would have been the realm of supercomputers.
Maybe we would have the bandwidth today
They didn't imagine we would have PCs good enough for compression. At that time GPU acceleration was just being born.
If 720p at 30fps only uses 250 KB/s, then how come video chatting on fiber-optic broadband and/or high-speed mobile data networks still looks like garbage on anything above about an 8" screen AND is choppy as shit? Raw theoretical performance on paper doesn't stop bottlenecks occurring in real-world situations.
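As a rough sketch of what the ~250 KB/s figure from the top of this thread implies: a modern codec has to squeeze raw 720p/30fps video down by a factor of a few hundred to hit it. (Taking KB as 1,000 bytes and 24-bit color; the 250 KB/s number is the commenter's claim, not a measured value.)

```python
# Compression ratio implied by a ~250 KB/s 720p/30fps video call.
# Assumptions: 24-bit color, KB = 1000 bytes, 250 KB/s as claimed above.

width, height, fps, bytes_per_pixel = 1280, 720, 30, 3

raw = width * height * bytes_per_pixel * fps   # uncompressed bytes/s
compressed = 250_000                           # claimed bytes/s

print(raw / 1e6, "MB/s raw")                   # ~82.9 MB/s uncompressed
print(round(raw / compressed), ": 1 ratio")    # roughly 330:1 compression
```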
1996 and dialup was still the way to get on the 'net. Delphi had introduced their Internet gateway services in Oct or Nov of 1992. 33.6 kbps dialup was the norm; v.90 and 56 kbps were still a few years away. G.992.1 DSL was 3 yrs in the future. Here we are in 2022 (26 years later) and there are still places where it is impossible to get wired service faster than 10 Mbps. And places where it is impossible to get wired broadband at all. Fixed wireless or satellite is prohibitively expensive.
Here we are in 2023 and T-Mobile in Germany still throttle their mobile users down to 16kbps after using their high-speed allowance
I had cable modem in 1997. It was slow by today's standards, I believe just 5 Mbps but it absolutely made you pass out from joy over the speed differential from 56k.
25:28 Her cheesey grin when she finishes her segment. Epic.
Remember wanting a PDA so bad. Then I got one and realised I'm 12 and don't need an address book or a to-do list
Problem was the tech wasn't ready yet. My dad used to have one and had constant grief with it; it was certainly no iPhone 2G.
I got a pda bundled with my first computer which was a Dell back in early 2000's. I couldn't really find a use for it. I forgot the password and after many tries, I gave up... bricked.
lol I remember playing with one when I was a kid too but quickly realizing I had no use for it either because I was like 10
Those pre Jobs days at Apple were really wild and all over the place.
it's kinda pre and post Steve Jobs
That's pre-posterous.
I think this is what innovation at its core looks like: messy, confusing, and all over the place.
It's up to the leaders to make sense of it, piece by piece and as a whole. And then how to monetize it, which is followed by feasibility, prototyping into a viable product, and supply chain concerns.
Then a full-fledged product and market strategy, and then branding, pricing and marketing.
This all could take a decade or two to come to full fruition, depending on the sophistication of the technology itself and/or consumer behavior.
In between Jobs
They had to look for a product solution. The whole Copland project bled millions and was deemed a massive disaster, sealing the Mac's fate to less-than-sunny days ahead.
A telephone with PCMCIA-slots... What a wonderful time we are living in...
Good ol' BE OS. OS/2, etc. I remember some of these from back in the day, though didn't know what it all meant back then. All this work back in the day to give us the experiences today that we take for granted.
That terabyte per second thing might have actually been teraBIT since they were talking about data transfers over a network connection which typically are noted in bits per second and not bytes per second.
Link speed is commonly described in bits per second, but actual file transfers are usually discussed in bytes per second. For instance Steam, or a torrent program, does the latter.
This has obviously been complicated further by video bitrates being talked about in bits, as well as transferring that onto video calls nowadays. But back then, I could see why Sun would say “that’s an application layer, transferring blocks of data. It’s not a serial string of bits in the link layer; so we’ll call it in bytes per second”.
Now it sounds strange, as we have shifted to generally talking in bits for live speeds like when video conferencing, so it sounds very strange to us to hear them use the other measurement in this example.
And obviously, decades of ISPs calling Mbs “MBs” in voiceovers on adverts hasn’t helped at all. “8 megabyte internet speed”, not even “per second”, was common everywhere ten years ago. Even tho, of course, you only got 1MB/s out of 8Mb/s. I’m sure tech supports are still getting called because of getting “only” 12MB/s in Steam or Battle Net on a 100Mb/s line to this day.
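The bit/byte conversion being complained about above is just a factor of eight, which is easy to show:

```python
# Advertised link speeds are in megabits/s; download meters (Steam,
# torrents, etc.) report megabytes/s. 8 bits = 1 byte, so divide by 8.

def link_mbps_to_transfer_mb_s(mbps):
    """Best-case file-transfer rate (MB/s) for an advertised link speed (Mb/s),
    ignoring protocol overhead."""
    return mbps / 8

print(link_mbps_to_transfer_mb_s(8))    # an "8 megabyte internet" ad: 1.0 MB/s
print(link_mbps_to_transfer_mb_s(100))  # the 100 Mb/s line showing "only"
                                        # 12.5 MB/s in a download client
```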
amazing to see single-purpose internet appliances make a comeback some 20 years later
yeah the classics never die
This comment didn't age well
17:41 "Oracle foresees the Internet as the worldwide equivalent of a corporate in-house network". That was a pretty accurate estimation.
Ahh the BeBox, I remember that. Played around with BeOS as well, never saw it running on the Be hardware though.
Would make an interesting video if anyone could find one.
" The Only Constant is change"
He said "right now PCs are based around the motherboard, in the future they will be more like a backbone". Yeah no mate, it's still a motherboard and it still looks exactly the same. Only slight differences in connectors and lanes.
Design wise that Pippin looks pretty damn cool
5:38 OMG, is that the early version of the SteamVR Controller ?
0:17 HEY SPA.. I copied so many floppys... Your head would spin.
AHahhaah they said that unironically i can't even
You're going to jail!
For those times the be box looked like a real banger
The Transphone idea - a computer with basic functionality, designed to use a web browser - was realised as the Chromebook many years later. Interesting to see its roots there
oh come on, a Chromebook is way way too overkill, you just need that address book thingy for your pc🤣🤣🤣🤣🤣🤣🤣🤣🤣🤣🤣🤣
yeah....no. there were many internet only devices (webtv?) way before a Chromebook.
@@joeysluzer1913 yeah the net appliances for example. I wonder if chromebook was more successful than the others tho. Kinda depends on the country you lived in too
But can it play doom 🤔
24:44 = The studio behind Abe's Oddysee before they made the game. Legendary!
0:29 very interesting their visualisation of a future desktop setup
7:10 - "will require data transfers of over 1TB/sec". Curious how they arrived at that number. I know advanced video compression schemes did not exist yet when this was filmed, but still.
BeOS 4 was a great OS, I couldn't crash it. Not enough apps though. Adoption of new OSes/standards is hard; look how long Linux has been around and it still isn't mainstream on the desktop. Much of the computing world is timing timing timing. Palm knew they were in deep doo-doo when the BlackBerry came out. I remember the look on Carl Yankowski's face (president of Palm at the time the BlackBerry came out) when I set up his BlackBerry for him to play with. He asked me what I thought; I told him we were in big trouble, since our Palm 7, the only probable competition for the BlackBerry at the time, had been delayed yet again. I left soon after. Fast forward: Palm buys BeOS to put on their phones, as Palm's own OS was too outdated to do much with. Palm uses BeOS as the foundation for the new Palm OS, but it was too little too late. Fast forward again: HP buys Palm to get into the tablet market (late, after failed and stupid decisions by HP's CEO to get out of that space during the EDS acquisition), and therefore gets BeOS/Palm OS, which, after more development as webOS, was put on the short-lived HP tablets. Fast forward again: HP sells BeOS/PalmOS/webOS to LG, where it is further developed and shipped on LG smart TVs and appliances. So the roots of BeOS are still around. Had HP jumped in fully with webOS, they could have competed with Android, Mac and winderz, and possibly gained a foothold in the winderz market from MS, but HP was too busy wrecking EDS and screwing its employees. - Former Palm, EDS, HP employee who was there for all that stupidity.
It’s definitely well-suited to TVs. I tried the UI Samsung was offering at the time I was buying, as well as Android TV, and both were laggier, less well laid-out, and didn’t switch apps anywhere near as well. I loved the Palm Pre line and that version of the OS, so it was kind of nice to get to use it on the TV again, even though there’s so many differences as well. The smoothness and ease of use were retained.
I do find it interesting they kept the name webOS through both the HP and LG buyouts, instead of each company trying to call it something totally different.
Yea, the Palm Pre was simply too late to the game. It more or less came out when the Android phones made by the big names came out in force, and the phone operators backed those heavily. I remember the display area for the Pre was in an obscure section of the store , while the Android phones were in rows of prime real estate. It had zero chance at that point.
Oracle really could foresee the future in the 90s. I remember seeing an interview with Larry Ellison where he talked about selling software over the web, "why am i going to buy this in a box at the store"
I was so shocked about how correct he was about the internet @16:00 and onwards. Dude is a psychopath but he knew what was coming.
10:49 Netscape, oh the memories
at 7:00 he says telepresence will require data transfer speeds of 1 terabyte per second. Surely not. Apparently the world's fastest SSDs by Seagate can only manage 10 gigabytes per second today in 2017.
I think they assumed network speeds would keep doubling, which they didn’t really, and that it would be a more feasible requirement than doing all the compression we actually do. Of course, it turned out data infrastructure rollout sucked in most of the world and we got more and more accelerated decoder hardware instead of having to use CPU for everything.
(Even pretty beefy CPUs can struggle with just one H265/VP9 video without acceleration, like using 70% of a 2012 desktop i7’s entire capacity, and about 110% of a 2012 laptop i7. And that’s more or less how fast processors were until Ryzen came out and kicked Intel into gear a few years ago. That compression acceleration is a big deal.)
It’s funny the ways people get stuck in the present when trying to visualise the future. I could just imagine them going: “we can barely handle running one MPEG video at low res now, how could we do dozens of video feeds for teleconferencing in high quality? I guess we just need lots of bandwidth?”
At 8:00 the Apple Pippin. In May 2006, the Pippin placed 22nd in PC World's list of the "25 Worst Tech Products of All Time."
😂😂
I had the first Garmin PDA palm pilot with GPS and loved it.
The transphones are the wave of the future.
Trans definitely appear to be.
@@deadbatt9321 😂😂😂
It looks like a wired proof-of-concept for the smartphone
1TB/s nice :D
There's so much The Gap in this clip
299 for the palm pilot?! That's SO CHEAP! Even WITH inflation! That's crazy!!! Holy shit!
Could you imagine getting the latest bleeding edge modern smartphone for less than 600 bucks today?! They must have been making those margins RAZOR THIN!
I think the TransPhone was the proof-of-concept for the cellular smartphone.
Almost every one of these products was a failure. Apple was at the height of its colossal failure as a company in 1996, but they would soon make a comeback!
It's crazy how the palm phones from 2007-08 had the same UI as that palm pilot just in color.
23:05 his mouse looks like the Cybertruck
4:50 we call that THE CLOUD
2014 and we still dont have holographic computers....that phone in the beginning is like an ancient smartphone....thats a landline
What is a holographic computer and why would that be faster?
@@soylentgreenb pretty sure they’re just referring to the Hololens or Magic Leap style UI. Which kind of was just starting to prototype exist when they commented, and still is in very early stages and takes up a small FOV while I’m writing this. We’re probably 5-10 years away from it being at all mainstream, another 20 before it’s expected to be the norm (if it even does displace smartphones - but if it could fit in a regular pair of glasses, I could see it potentially displacing smartphones in the long term, like in “Dennou Coil”.)
Shows like this really help put the present into perspective. All this metaverse nonsense isn't anything new; it's been a pipe dream from the start.
I guess beginning of cloud storage started in 1996
You can buy new old stock Pippins on eBay straight from Japan. The problem is that new development is tough, because to push the games you need a developer dongle that can't be found any more.
Pippin sounds like the poor dog who got eaten by Jaws
24:00 Laurie Anderson. OMG. Love her hair and her choice of attire. Style gone forever now. Sad.
Wow! Back then I already had a CDROM, and was transferring my CD's and records to a hard drive on my 486, playing with the first Linux OS's, drooling over the BeOS because it was very smart...
It was like watching an episode of the Flintstones, everything shown is extinct.
It would have been amazing futuristic technology from the year 1965 though.
However a kid from the 1960's would have been disappointed that there were no mobile man size sentient robots like in the TV show Lost in Space in the 1990's decade.
@@m9078jk3 That's up there with the "flying cars" thing. At best, we've a limited-production hybrid vehicle that requires two licenses.
Technical concerns aside, the consequences of "rude driving" - the family of inexplicably trendy habits like tailgating, deliberately driving too fast for conditions, and blowing through stop signs - would be even more dire than they are on the roads today.
12:05 Fast forward 2007 and you'll see the solution.
I seriously miss old tech. :(
Hey Andrew Gong, I thought I was the only one that felt that way, I'm right there with you bro. I know that you are going to think I'm crazy, but today's tech is just BORING. I like everything about yesteryear's computers and how they looked and functioned. And also technology back then was just more interesting how it was presented, demonstrated, and it worked. I can't quite explain what I'm trying to say, but I think you kinda understand what I'm trying to say. Today's technology and computers are just BORING today. I wish I could just find a time machine and go back to the 90's. LOL :-) By the way, thumbs up to you. :-)
Those were our pioneer years. We were doing things that very few even knew about. We were there when the whole thing began!
I miss my TRS-80 too.
13:25 Apple guy listening and planning secretly!
I like the amount of innovation back in the old days of the 80s and 90s, before we had these stupid smartphones and high-end Mac Pro computers. The problem in 2021 is that tech is stagnating; there's no innovation anymore.
Probably has something to do with Moore's Law hitting a wall - we're quickly approaching the limit of transistors we can cram into a Silicon wafer, which is why we're just seeing slight "tick" process improvements, more cores etc. - until we move to a new material (Graphene has been discussed for years but still hasn't made it out of university labs and into production) I don't see any further drastic increases in processing.
There are many sides to this. Back in the 80s/90s computers were more expensive than today. Imagine spending those sums on hardware that would feel ancient and barely useable after just a couple of years. Back then new features were introduced all the time, but the hardware at the time was very weak and couldn't handle it well. As for the lack of innovation, what features are you missing today?
Well, that Starfire project went nowhere lol.
That humancode site design is amazing
22:03 For the uninitiated, your average 1996 computer would have a stroke, crap its insides out then go on fire if you tried doing that.
Glad we got rid of those "motherboards"
Thank goodness we figured out how to master compression, so we didn't need that 1 TB/s so soon.
I miss the early days and that sound!
That AR and VR doesn't look that different from what we have today. And we're still far away from terabits per second being common internet speeds; hundreds of times higher resolution than what they had back then (640x480?) would be something like 16K or 32K monitors.
Van Halen Right Now is the song the midi program is playing.
UA-cam in 2012: this is fine 🤗
UA-cam in 2019: COPYSTRIKE THEM 😡
Who's watching this in 2020?
this was my jam yo. I would be so happy when the intro rolled ❤
7:10 1TB per second lol? 5000 mega-pixels of uncompressed video? Even at that time with no compression (spoiler alert they had some) what kind of insane resolutions were they predicting to need a TB per second?
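Working the question above backwards: at 24-bit color, 1 TB/s of uncompressed video implies a frame size in the gigapixels. (Taking TB as 10^12 bytes; at 60 fps the result is close to the ~5,000-megapixel figure mentioned above, at 30 fps it's double that.)

```python
# What frame size would actually need 1 TB/s uncompressed?
# Assumptions: 24-bit color (3 bytes/pixel), TB = 1e12 bytes.

target = 1e12                       # bytes per second
bytes_per_pixel, fps = 3, 30

pixels_per_frame = target / (bytes_per_pixel * fps)
print(pixels_per_frame / 1e9, "gigapixels per frame")  # ~11.1 at 30 fps

# For a 16:9 frame that's roughly 140,000 x 79,000 pixels,
# about 335x the pixel count of an 8K display. At 60 fps it halves
# to ~5.6 gigapixels, i.e. the "5000 megapixels" ballpark.
```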
I really like the phone/laptop device, very cool
why I am so excited about the palm pilot
25:52 I got to get me that Desk stick. Marketed for people whose first computer is a laptop but switches to a desktop. So funny to see what people were coming up with.
oh shit man everyone should get themselves a deskstick, that thing's going to be the next quantum breakthrough in computer tech in the next few years
That Transphone looks awesome.
Does anyone know what version of Mac OS we're seeing at at 18:17? It looks like Mac OS 8, but that didn't come out until 1997.
Brian Cairns It was a beta version of the next generation of Mac OS, named Copland. It went under though because of various problems. Some of it was re-used in Mac OS 8.
No, it’s the Aaron extension by Greg Landweber that themed your desktop like Copland (Mac OS 8); he eventually went on to produce Kaleidoscope, Aaron Lite, Greg’s Browser, etc.
TV vs computer monitor, aww what times, when a PC monitor and a TV were vastly different things.
Lol @the black arm-looking thing at around 15:20... XD
Actually, it's pretty creepy... o.o
My other hand isn't strong enough, you take my lil hand!
@@weaponofmassconstruction1940 Whoa, you're traveling to a time of my comment(s) before COVID. :D
@Andrew Tarrant Ah. :D
Yeah wtf was that? Grim reaper coming for the Palm Pilot, courtesy of Steve Jobs lol. Get the joke? Steve didn't want them beating Apple to the smartphone game lol
The Transphone, it's not cloud computing! You need more horsepower than a 286, because the server won't improve web page rendering speed. For 1996 that's seriously obsolete right out of the box.
The screen and the processor both scream “excess laptop components that can’t sell for their intended purpose anymore” by 1996.
Ah, the Pilot, or soon-to-be PalmPilot. Remember wanting one of these back in college and the early working years. Funny to think that calendar, note-taking, and task apps are built-in to all smartphones now.
10:55 Most People wont be surfing the web to read a lot...
I know, but it was true back in 1996. He wasn't trying to predict 20 years in the future, he was stating things as they were at the time.
@@salty-as-heck9915 webpages were still mostly text well up until the early ‘00s though?
Got a pc in 1994 and loved that dial up modem sound. And man the Pippin...what a huge failure.
Definitely a prophetic episode. What a time it was to identify one of a number of major future computing paradigms and get rich. (I still kick myself for not squatting on certain domain names in 94-95, but it was *expensive* to do that and I was just a Jr LAN tech who connected my 250 employee division to internet email on a 28.8K modem, and it actually worked fine. My corporate IT leaders thought I was a boy genius...and I soon quit. haha. But also too poor to get rich.)
7:06
How the hell can it possibly require a terabit transfer rate? What, are you analyzing NASA's landing speeds, dude?