Alan Kay at OOPSLA 1997 - The computer revolution hasn't happened yet
- Published 10 Feb 2013
- Alan Kay's seminal 1997 OOPSLA keynote. Originally hosted on Google Video, copies of it are now only available from the squeak.org website as far as I can find. Putting it on YouTube is my attempt to preserve a really important talk in computer science and computing in general.
- Science & Technology
07:00 "On the Fact that the Atlantic Has Two Sides"
Kay: "On the Fact that Most of the World's Software Is Written On One Side Of The Atlantic!"
07:30 Two types of math: computers = practical math
structures of a much larger kind, consistent
08:55 Debugging goes on in math as well
10:00 What's being done out in the world under the name of "OOP"?
10:33 Infamous C++ quip
10:50 Same feelings about Smalltalk! Meant to be improved. Not syntax, or classes. Taken as given...
13:00 Math as gears
13:35 Analogy to 60s programs: SCALING A DOGHOUSE!
18:00 Pink plane / blue plane
21:30 Data abstraction in 1961 US Air Force
23:25 Contrast that with what you have to do with HTML on the internet! Dark Ages!
Presupposes that there should be a "browser" that should understand its format.
"This has to be one of the worst ideas since MS-DOS"
What happens when physicists decide to play with computers :P
Browser wars, which are 100% irrelevant [to the big picture]
25:30 The internet is starting to move in that direction as people invent ever more complex HTML formats, ever more intractable.
This same mistake made over and over again
26:00 Sketchpad: very much an object-oriented system
26:50 Simula
27:50 Better Old Thing vs. New Thing
29:20 Molecular biology: cell physiology and embryology (morphogenesis)
30:00 First assay of an entire living creature: E. coli; biological info processing
34:10 Only takes 50 (40) cell division [cycles?] to make a baby!
35:20 Computers: slow, small, stupid. How can we get them to realise their destiny?
36:00 Doghouses, clocks don't scale 100x very well. Cells scale by 1,000,000,000,000x.
How do they do it, how might we adapt this for building complex systems?
36:40 Simple idea that C++ has not figured out.
"No idea so simple and powerful that you can't get zillions of people to misunderstand it"
Must not allow the INTERIOR to be a factor in the computation of the whole
Cell membrane keeps stuff OUT as much as it keeps stuff IN
Confusion with objects from noun+verb-centered language. "Our process words stink"
37:50 Apologies for "Object-Oriented". Should have been Process/Message/間 (ma) -Oriented
Stuff IN-BETWEEN objects. What you DON'T see.
39:25 An object can act like anything. You have encapsulated a COMPUTER!! The universal simulator!
Take the powerful thing you're working on and not LOSE it by PARTITIONING UP your design space!
That's the bug in data/procedures.
Pernicious thing in C++ and Java by looking like the OLD THING as much as possible.
40:35 Virtual Machine
41:00 UNIX processes: like objects, but too much overhead. "3" ought to be an object; but it can't be a UNIX process.
41:50 Bio-cells can't share their DNA. OUR "cells" can! No need to e.g. engineer a special virus to change the DNA (e.g. cystic fibrosis)
43:00 "An object is a virtual server" Every object should at least have a "URL"
I believe every object should have an IP
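Kay's "object as virtual server" idea can be sketched in a few lines. This is purely an illustration, not anything from the talk; the class and message names (`ServerObject`, `receive`, `Counter`) are all invented here:

```python
# Sketch: an object as a tiny "virtual server" that is addressed only
# through messages, never by reaching into its interior. All names in
# this sketch are invented for illustration.

class ServerObject:
    """An object reached only through messages, like a network server."""

    def __init__(self, state):
        self._state = state  # the interior stays hidden from callers

    def receive(self, selector, *args):
        # Dispatch a message by name; unknown messages get a polite refusal
        # instead of a crash, as a server would respond to a bad request.
        handler = getattr(self, "_msg_" + selector, None)
        if handler is None:
            return ("does-not-understand", selector)
        return handler(*args)

class Counter(ServerObject):
    def _msg_increment(self, by=1):
        self._state += by
        return self._state

    def _msg_value(self):
        return self._state

c = Counter(0)
print(c.receive("increment"))  # 1
print(c.receive("value"))      # 1
print(c.receive("nope"))       # ('does-not-understand', 'nope')
```

The point of the sketch is that nothing outside the object depends on its interior, only on the messages it answers, which is what would let the "server" live on another machine behind a URL.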
44:10 So far we've been CONSTRUCTING software; soon we'll have to GROW it
Easy to grow a baby 6 inches; never have to take it down for maintenance!!
Can't grow a Boeing 747! Simple mechanical world; only object was to MAKE the artefact, NOT to grow it
44:55 How many people STILL[1997] use a dev system that forces you to develop OUTSIDE of the lang?
Edit/compile/run? Even if it is "fast". Cannot possibly be other than a DEAD END for building complex systems!
[20 years later, in 2017: has anything changed?]
45:50 ARPANET, the precursor to the Internet.
From the time ARPANET started running, expanded by 8 orders of magnitude.
Not ONE PHYSICAL ATOM in the internet today that was in the original ARPANET!
Not ONE LINE OF CODE of the original remains in the system!
System expanded by 100,000,000x, has changed every atom and every bit and has NEVER HAD TO STOP!
That is the metaphor we ABSOLUTELY MUST apply to what we THINK are "smaller" things!
When we think programming is "small", that's why our programs are so BIG.
47:45 LISP, meta-reflection. "Maxwell's Equations" of software
Saddest thing about Java (possibly before reflection API?)
Java originally for programming toasters, not internet, but still --
How can you hope to cope with stuff without a meta-system?
Represents a real failure of people to understand what the larger picture is / will be.
50:10 Meta-programming. Bootstrapping on top of the language itself.
Tyranny of a single implementation.
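Python's own reflection gives a small taste of the meta-system Kay is pointing at: the running program can inspect and rewrite itself without stopping. A toy illustration (the `Greeter` example is invented, not from the talk):

```python
# Toy illustration of meta-programming: the live system redefines part
# of itself while running, without an edit/compile/restart cycle.
# All names here are invented for this sketch.

class Greeter:
    def greet(self):
        return "hello"

g = Greeter()
print(g.greet())  # hello

# "Grow" the live system: replace the method on the class at runtime.
def louder(self):
    return "HELLO!"

Greeter.greet = louder
print(g.greet())  # HELLO!  -- the existing instance picks up the change
```

This is only a shadow of a full meta-object protocol, but it shows the property Kay wants: the system never had to stop to change.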
52:15 Dozens and dozens of different object systems, all with very similar semantics, but different pragmatic details.
Think of what a URL is, a HTTP message is, an object is, an obj-oriented pointer is...
Should be clear that objects should not have to be local to the computer.
Interop is possible basically for free under this stance.
Things like JavaBeans and CORBA will not suffice.
Objects will have to DISCOVER and NEGOTIATE what each other can do.
Prod and poke each other to see response.
Must automate this!!
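The "discover and negotiate" step above can be sketched as an object probing what its peer understands before committing to a message. A minimal illustration in Python (all names, `Peer`, `capabilities`, `negotiate_and_send`, are invented here):

```python
# Sketch of "discover and negotiate": before sending a message, ask the
# peer what it understands and fall back gracefully if it lacks the
# capability. Purely illustrative names, not an established protocol.

class Peer:
    def capabilities(self):
        # Advertise the public messages this object understands.
        return {name for name in dir(self)
                if not name.startswith("_") and name != "capabilities"}

class Printer(Peer):
    def render_text(self, text):
        return f"printed: {text}"

def negotiate_and_send(peer, wanted, *args):
    # Prod the peer first; only send if it claims to understand.
    if wanted in peer.capabilities():
        return getattr(peer, wanted)(*args)
    return None  # capability absent: negotiate something else

p = Printer()
print(negotiate_and_send(p, "render_text", "hi"))  # printed: hi
print(negotiate_and_send(p, "render_pdf", "hi"))   # None
```

Automating this within one process is easy; Kay's point is that it must work between objects that have never seen each other's code.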
55:20 1970s: Abstract Data Types. Assignment-centered thinking.
C++ like MS-DOS: no-one took seriously, because who would ever fall for a joke like that? ...
57:35 McLuhan: "I don't know who discovered water, but it wasn't a fish!"
58:10 3 stages of new ideas:
Dismissed as mad.
Then: totally obvious.
Finally: original denouncers claim to have invented it!
Smalltalk, sadly, went this way.
We don't know how to make systems yet!
Don't let what we DON'T know become a religion!
Before coming out into the world, Smalltalk was very good at making obsolete previous versions of itself.
Squeak: bootstrap something better than Smalltalk!
Think of how you can obsolete the damn thing!
1:01:30 Pipe organ: "Play it grand."
Let us combine all this with Bret Victor's work, and lay the foundations for the *real* computer revolution yet to come.
“After growing wildly for years, the field of computing seems to be reaching its infancy.”
-- John R. Pierce
yes, seems soo true.
The collection of sick burns Alan Kay lays on the computing field never gets old and is still just as relevant. I love this talk and so many of his others
An absolute visionary, and one of the more exceptional and charismatic Turing Award winners. I am going through many Alan Kay videos with my brother. His "context" idea of getting leverage from other domains to address issues of scalability is truly a great concept, and seems far more powerful for scalability than the more conventional engineering metaphors of building bridges. Bacteria seem to me an iconic example of the loose-coupling, minimal-cohesion principle. Cheers, K
Thank you a lot for sharing this!!! For people far away from people like Alan, and surrounded by a very different kind of people, these interviews and thoughts of Alan Kay are like fresh air and really enriching. Thank you very much. Having the possibility to listen to people like this really makes a difference, and I am really grateful to you for your efforts to make it available.
thanks for keeping this alive, great talk.
"Play it grand." That's great advice to end a great talk.
21:03 basically explains most of the problems in modern computer industry, people learn one language, and then just mess around in that language optimising their use of that language but never expanding their ideas into generally new (to them) concepts. We end up with knowledge being slowly lost and a constant pathetic reinvention of the wheel
Agreed. Probably not even like reinventing the wheel, but staying stuck with rolling logs, struggling to transport monoliths..
Thanks for being an unofficial archivist. This talk still applies today despite being more than 15 years old. Back then we had PCs and browsers. Today, we have smartphones, browsers, and e-book readers... all using the same obsolete ideas and architectures.
HOW DO WE FIX THIS?????
Here we are more than 25 years later like he mentions in the talk.
"I made up the term object-oriented, and I can tell you I did not have C++ in mind"
the book he showed at 54min is The Art of the Metaobject Protocol by Gregor Kiczales
Thank you so much for sharing!
Thank you for this. I realise I graduated with a CS degree right after this talk happened and I've been stuck in the old "lone hacker" meme never really embracing message oriented/oo. A light has been lit for me and now I'm looking for any reading material and other media I can get my hands on.
d1663m check out squeak
If it isn't what brought you here, I recommend the blog "Tekkie" on Wordpress
Really your diligence is applaudable.
Yes! Thank you!
Best Alan Kay quote @10:34"I made up the term object-oriented, and I can tell you I did not have C++ in mind."
Reminds me of: "Those who forgot the past, are condemned to reinvent it, badly."
Did you miss what he said about smalltalk in the very next sentence? Stop nitpicking. It's the same situation as with "don't optimise too early" quote, which is misused all the time.
Great job on the timecoding by the way
This man is such a genius that now, in 2018, it seems like he's speaking about the future. PS: I'm anxious to learn Smalltalk now :D
I know. I made the same realization lol. I was really fascinated by the idea when he mentioned the planes idea and how you have to switch contexts to look at problems in a different way and maybe get these aha moments.
If you haven't learned Smalltalk yet, consider watching my videos, Squeak from the very start:
ua-cam.com/play/PL6601A198DF14788D.html
Did you?
oh my god, the music at the beginning is amazing
RIGHT?
thank you for this video
Thank you.
1997 seems like it was 1950...
How do you mean?
@@GeorgeCox In 1997, TV camera resolution/quality wasn't as bad as it seems in this video.
@@ArquimedesOfficial AV at conferences back then was always ropey
@@ArquimedesOfficial It's a conference. So it means it wasn't filmed on film (like movies and expensive TV shows like X-Files and so on), but on video cameras. So the quality would be crap - tape-based video cameras at the time had less than what we today would call 720p, plus bad low-light performance (and conference lighting is bad).
His description of the ossification and canonization of smalltalk (around 59:00) reminds me of the following Nietzsche quote: "As long as a man knows very well the strength and weaknesses of his teaching, his art, his religion, its power is still slight. The pupil and apostle who, blinded by the authority of the master and by the piety he feels toward him, pays no attention to the weaknesses of a teaching, a religion, and soon usually has for that reason more power than the master. The influence of a man has never yet grown great without his blind pupils. To help a perception to achieve victory often means merely to unite it with stupidity so intimately that the weight of the latter also enforces the victory of the former."
18:31 his 2015 talk on "Simplicity" has lots of elements in common with this one.
Brilliant
html is dark ages, presupposes browser understands everything
Well it's a very good talk overall, despite quite a few flippant side-comments, because it makes one think. And that's what good talks are about. For me this talk wasn't about programming languages or paradigms. It was about two things: a) what does a computer do and b) presenting a compositional view of computing to create complex systems. Programming is just the way to make a computer do anything from a technical perspective. I feel like the talk adds a lot of noise over programming languages and operating systems and perhaps elicits some lower-level disagreements present during that time. Let's start my thinking with the critique of the notion of object orientation. If object-oriented programming wasn't supposed to mean what it has become, and instead an object is like a server, then by all means why connect it with the term programming? Building servers, setting them up, running them, tearing them down, replicating them, duplicating them is not programming. At least we have not come to think of it as programming. Maybe that is more obvious from a hindsight perspective than in 1997. The configuration and operation of thousands of servers is not typically called programming. What we have today in the form of the internet is a very large, very complex and very heterogeneous beast that has grown a lot more than it has been designed. But sure, there is design and there are protocols and standards and such. And as the talk suggests, it wasn't compiled and shipped in its whole form. But nevertheless, despite its complexity it is still the cog-wheel machine and not the living and transforming biological entity. More on that later. Web applications are often that, servers that are set up somewhere and that perform some task. If you will, that's like a cell. It doesn't matter what programming language you use for describing how that cell works, because nobody on the outside will perceive it.
Programming languages are for inner cell workings and not for describing how the whole organism works. Putting up a new server, tearing down a server IS a change to a system while it's running. While it's a laudable move to embed this notion of growing, running and evolving within a programming language (Smalltalk), it still remains an engineer's perspective for me. Now, the intriguing thought of the complexity of biological systems and the comparison to computational or information systems. In my opinion, the comparison is not fair. It still requires a human programmer to change and grow and run a "compuinformational" system. Living creatures don't grow by a third party that adds stuff to them (remember the fish that hasn't discovered water). A computational system that grows is one where you have to take the human out of the loop, at which point the fundamental question remains of what all this computing is then supposed to do. Because nothing that computers do is real. The reality of a video is a careful alignment of atoms in some specific places which can, by some electromechanical machine, be interpreted as a stream of bytes, which can, by some virtual machine, be interpreted as a series of visual and audible frequencies. That's what we do with software. The only purpose computers serve for us is, first, in forecasting the future physical universe to adapt (to) the real physical universe now and, second, to organize, exchange, collaborate and coordinate or synchronize human activity. Nature just doesn't have a purpose (remember the fish that hasn't discovered water), which is why we have bacteria and humans. But computers and computational/informational systems are built for a purpose, and it's a very specific purpose, and it's _only for humans_. Animals have no use for computers, plants have no use for computers, the universe has no use for computers. Only humans care about what computers do.
And they do basically two things, as stated above: a) compute some model of existing or hypothetical physical realities or b) communicate and exchange information and instructions, collaborate, synchronize, coordinate, control. I say this because there are parts of this talk on how babies grow with 40 steps of cell division and on bacteria and organisms and life and such. The discussion of how a bacterium is a phenomenal biological entity and all that is very intriguing, but at the same time, what would we do with it in computing? We'd simulate it, in order to forecast and understand it, to dissect and analyze it, and then to finally build a machine that does the same thing as the bacterium. And then use that machine for whatever good or bad purpose. Computing is a cog-wheel in that process. There's no universal grandness; all we do is build better and more complex machines.
Regarding the construction of software vs. growing it, I think that deserves some additional commentary. We have abstraction and composition as two fundamentally different approaches. The first may be related to a top-down view, the second to a bottom-up view. Programming languages are mostly perceived as abstractions, but of course their products can also be composed to form more complex systems. Alan suggests that Smalltalk fundamentally differs from these in that, yes, it's a programming language, but it's also more like a system already. I am not so convinced of this. I think the level of composition realized within a language is too low. If you think about a database server and an application, then these two interact, yes, but do they need to be in the same language to be composed? Why? Also performance... sometimes it doesn't matter, sometimes it matters a lot. The one language to rule them all is simply never going to work. I think that's where a lot of the noise and flippant commentary in this talk comes from. So what's the essence for me?
I suppose what the talk reveals is that a compositional approach is a viable path towards complex systems. Let's view an abstraction as a projection to a low-dimensional space. So with abstraction you take an object like a doghouse and you abstract it by projecting into the low-dimensional space (or perhaps even just the visible space that you perceive the doghouse to be in). Then you would scale that low-dimensional space and then un-abstract to perceive the resulting higher-dimensional reality (I am thinking here of behavioural dimensions, social, economic, etc.). And that would not work. First, because you may not even be aware of all the dimensions that went into an abstraction, because they were so small in the object that you started the projection with (e.g., think of the social dimension of a single-person organization). And second, because it requires inventing the expressions in the higher dimensions (which you don't even fully know, as stated just before). So then composition, and there the talk is about growth. And that's where it gets a bit convoluted, because abstraction scaling is associated with programming and computing (doghouse to building) and compositional scaling is associated with biological organisms. However, the reality is that compositional scaling is just as natural for computing and programming too, regardless of the programming language. So, for me the comparison of a cell to a bacterium and a doghouse to a building shows that the first two are functionally different and require composition, while the other pair is functionally similar but nevertheless requires a lot of invention and new things. But system development doesn't necessarily have to be just one of those. A system can be composed of software components, e.g. a database, a web server, etc., and it can involve various abstractions, e.g. the database system itself has to scale (single table vs. multiple tables and complex joins).
Thanks for sharing.
But are OOPLs complying with this today? And which of them?
Foluso Ogunfile - Very few of them. Erlang gets the process isolation and message passing bits pretty well. And...uhhh, that’s about it. JavaScript is objects all the way down, but still janky af. Ruby too, on the same points in different ways. Class oriented languages (the ones most folks will tell you are “true” OO languages) are not necessarily OO. Object Oriented design is a problem decomposition strategy and software design approach, and is not necessarily tied to the programming language itself.
do you have transcript? since i think i need one
Around 52:24 holy shit that sounds like the json apis that are all over the place these days
Perhaps it is not objects saying what they can do but a soup of events and objects that can do something with/about those events.
He attacks Dijkstra for arrogance but fails to perceive it in his own exceptionalism, that there's something so special about his discipline that it can't be described by known mathematics.
Is that you, Dijkstra?
@@alrightsquinky7798 lol
Yeah, but he does it so beautifully.
He definitely perceives it in himself and reveals that in his particular humour.
I just realised that Google's new OS under development is called Fuchsia, and Apple's attempted OS was called Pink. Wow
The header of every HTML page should contain the source code for the web browser that can render it.
That might be more ideal than the current way browsers work, but it would make every html page massive, both wasting bandwidth and taking forever to load.
An internet that could guarantee that a resource is always available would solve this. (This likely involves decentralization of some sort.)
The web page would just link to this browser/renderer to keep the size of itself small.
Whoever responds to the initial request to a page can first respond with the list of dependency ids, followed by the requested page, followed by the actual dependencies (which would include the renderer here.)
Then, as soon as the requester receives the list of dependency ids, it checks if it already has any of those in its cache, then tries to cancel those responses to save bandwidth.
Or something like that, I'm not sure.
There's a spectrum here of both needing the guarantee of being able to run a piece of software to the end of time, yet needing to cut corners somewhere so we're not downloading the whole operating system & renderer every time we want to run a thing.
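The dependency-id scheme this thread is groping toward is essentially content addressing: name each renderer dependency by the hash of its bytes, and the client downloads only what it has not cached. A rough sketch, with all names (`content_id`, `fetch_page`) invented here:

```python
# Rough sketch of the content-addressed dependency scheme discussed
# above: a page lists its renderer dependencies by hash; the client
# fetches only the ones missing from its cache. Illustrative only.

import hashlib

def content_id(data: bytes) -> str:
    # A dependency is named by the hash of its contents, so identical
    # renderers shared by many pages are downloaded exactly once.
    return hashlib.sha256(data).hexdigest()

cache = {}  # content_id -> bytes, the client's local store

def fetch_page(page: bytes, deps: list[bytes]):
    # The server sends the page plus the ids of its dependencies;
    # the client requests only the ids it lacks.
    dep_ids = [content_id(d) for d in deps]
    missing = [i for i in dep_ids if i not in cache]
    for d in deps:
        if content_id(d) in missing:
            cache[content_id(d)] = d
    return page, missing

renderer = b"...renderer source..."
page = b"<doc>hello</doc>"
_, missing_first = fetch_page(page, [renderer])
_, missing_second = fetch_page(page, [renderer])
print(len(missing_first), len(missing_second))  # 1 0
```

The second fetch downloads nothing, which is the bandwidth answer to "every page would be massive."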
So, by OOP he meant something like microservice architecture, but it was misunderstood?
@bcobb7777 And why would they do that? What do you gain by packaging "printf()" into a container that runs a factory thread that creates a sequence of microservices that block each later one until the earlier ones have finished printing their individual characters?
Okay, so what is a _"Texas Tent-Beating"_ ? 45:25
Google doesn't seem to know... so either it's very obscure or, more likely, my ears no longer work properly : /
"Texas tent meeting" is a reference to en.m.wikipedia.org/wiki/Tent_revival
@@Svenscreams Ah, now I get it : ) Thanks.
I wonder if the man had something to smoke before he gave this talk.
The notion of the 'tyranny of a single implementation' obtains in all domains. Regarding human language, avoiding the tyranny of a single implementation is a strong argument in support of learning Esperanto - even as merely an object of private study.
Isn't the ultimate goal of Esperanto to effectively become a single (thus tyrannical) implementation of language?
@Arlo Barnes I'm not buying it. Logic tells me otherwise.
Here we are 25 years later. Has the biological metaphor won out yet?
Not enough hardware support, but it's getting better...
11:40 He pronounces "OOP" as "oop"
25 - can't agree with that. Mixing representation and interpretation of content makes sense only if there's only one sensible interpretation. In particular, mixing in code only makes sense if the author of that code is trustworthy from the consumer's perspective, the code they're writing will work on the user's machine (the second you're trying to make it machine-independent you're starting to expect a browser), and either there's only one possible interpretation (so no mobile devices, screen readers, automated processing etc.) or there are many and you're expecting the original encoder to account for all of them... including the ones that don't exist yet, somehow.
This doesn't even go into any of the hard problems browsers solve; yes that makes them impenetrable monoliths, but the reward is that clients can get away with just worrying about much simpler application problems rather than, say, text rendering (which before modern browsers basically no software did correctly, not that even they aren't still struggling with it).
I think it comes down to people wanting things to look good
Set playback speed to 1.5x or 1.25x.
39:43 "What you have encapsulated is a computer"
The most important idea in the last 50 years of computer science.
Objects are abstract computers. You decouple the notion of a "computer" from the physical hardware on your desk. The paradigm holds from a single object representing "1" right up to the internet (an object-oriented network). It's objects all the way up and all the way down, which is just another way of saying that it is "computers" all the way up and all the way down. A desktop PC can hold a thousand "computers" just as the internet can hold a million computers (each holding a million computers).
This is the most misunderstood idea in computer science right now, and the lack of understanding is killing the industry. It is much easier for some to simply think of "1" not as a computer representing the number one, but simply as data, decoupled from any computing ability. The computation happens somewhere else; "1" is just one. And in the 70s, the performance cost of representing a small thing like one as a computer was considered unnecessary waste.
But we are now stuck with systems that cannot scale, and despite our PCs being millions of times faster, we are hitting this brick wall of the limits of procedural programming.
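The "1 is a computer" idea is literal in Smalltalk, where even integers receive messages. Python inherits a shadow of this, which makes for a quick demonstration of the point (this is an illustration of the idea, not a claim about what Kay had in mind):

```python
# Even a literal like 1 carries behavior with it: it is an object that
# answers messages, not bare data waiting for external procedures.

print((1).__add__(2))         # 3 -- "+" is really a message send to 1
print((1).bit_length())       # 1 -- the number knows things about itself
print(isinstance(1, object))  # True -- the number IS an object

# Contrast: in C, `int x = 1;` is raw data, and all behavior lives in
# separate functions -- the data/procedure split the talk criticizes.
```

The parentheses around `1` are only there so the parser doesn't read `1.` as a float; conceptually nothing special is happening, and that is the point.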
No. The biggest problem we have today is _limited manpower_ to write programs. The time of a person to develop code is our scarcest resource. As a result, low skill developers churn out crappy code that can barely be figured out, or high skill developers are constrained by bureaucracies into churning out code too quickly to make it good.
Procedural programming, which is _far_ simpler and easier to read, is a boon in this environment. We don't need abstract concepts like "1 is a computer" that are so pie in the sky as to be useless in practice. We don't need poetic nonsense in our code. That kind of thinking only makes our jobs harder by creating unmaintainable messes.
Write simple code that's pretty easy to figure out and uses the tools of your language as they were intended. That's the best thing you can do for everyone.
*-1*
I hate to disagree with someone as forebrained and articulate as Alan Kay, but how are we supposed to interact with URL objects the same way we're supposed to interact with the simple PrintWriter objects I was taught in Java 101? I looked up a concrete definition of OOP from Kay, and I couldn't find one. If I'm understanding him correctly, he wants a much higher level abstraction than even Lisp macros; he wants some kind of general framework for any computer, maybe like the Von Neumann architecture. But when I write good code, it does precisely one thing. I don't write that code to be compatible with some "message" from a programmer halfway across the world; that's ludicrously impractical and time-consuming.
As many smart people are, he's unfortunately very idealistic, high-level, and vague about the details. What he's saying sounds something like Elon Musk saying "we should put self-sustaining human civilization on Mars," but never doing anything concrete towards reaching that goal, and then criticizing others who are attempting to actually engineer a solution towards that extremely difficult and long-term goal. He wants a standard so scalable that databases can talk to web addresses can talk to the dinky console on my laptop? I don't see how such a standard is possible, and I doubt he does either, considering he hasn't provided one. A little too much mathematics in that beautiful brain, and not enough engineering
Nathan Bendich I don't see the issue. You write an object to receive a message and do something with that message if it understands it. Why would you care if the message comes from around the world? A TCP/IP packet looks the same from China as from England
java is cancer
@@BladeOfLight16 That is like a monk in 1440 saying the biggest problem they have is that his fellow monks don't have enough time to copy all these Bibles, so stop worrying about abstract ideas like a 'printing press' and instead find a way to give the monks even more time devoted to hand-copying Bibles.
If history has taught us nothing else it is that the answer to any large scaling problem is never "We just need more time/manpower".
That was, in fact, the whole point of Kay's talk. All genuine revolutions result in processes that require less time/manpower (the printing press, the arch) than they did previously. No one builds anything like they built the pyramids. We build taller, infinitely more complex buildings which are 0.1% the mass of the pyramids, and we do it in 1% of the time required.
So the problem is never ever limited manpower.
7:43 > _"practical math"_
yes, I have always thought of computer science as applied maths. wow.
Arthur Koestler 18:28
23:48 HTML on the Internet has gone back to the dark ages, because it presupposes that there should be a browser that should understand its formats. This has to be the worst idea since MS-DOS.
And then Sun created EJB....
23:30
Powerful OO? Skip Java, but see www.prevayler.org and how it can work in Lisp or Smalltalk or Haskell. "Process-oriented" is the term for it! Not object-oriented, but message passing?
Jeff Gonis Please pin the "Joel Jakubovic" comment b/c he has all the timestamps GTG!
51:45 "Microsoft won't be able to capture the internet" - this is the best prediction: they failed in the mobile market, they failed in the browser market, search engines, services etc.; only Apple and Google were able to do it.
Rewrite The Art of the Metaobject Protocol in a general message-passing programming sense
His message is good, his expertise is fantastic, but his speaking style is difficult to follow. Reading a transcript of this is so much easier.
just watch it at 2x speed
Ok, we send you the data and also the program code how to process the data. This is really optimal especially when I want to send you a virus.
This shows that this speech is from a more innocent time.
Research on how to prevent programs from harming the computer system they were executing on was already pretty well along when Alan was doing his research. He frequently mentions his mentor Bob Barton, whose design of the Burroughs systems specifically aimed at this. This work went on to be elaborated in the form of "capabilities", something that Google is leaning on in the design of their Fuchsia OS.
So I actually don't think this is from a more innocent time, and the technology to enable this is by no means unknown or untested, just not part of current mainstream operating systems. That will likely change in response to a lot of the things he talks about becoming more widespread.
@@jeffgonis8975 If any website I visit can send me a novel language and an interpreter for executing that language my ability to audit & review what remote sites are telling my computer to do goes directly to zero. This is a terrible idea.
1501 operating systems BTFO
It's funny how he says that Dijkstra is arrogant, but he disses browsers and HTML. Who's laughing now? :D Btw, he's a genius man really.
HTML is a disaster, no matter how many people use it. Saying that it is good just because people use it is just argumentum ad populum.
He failed to mention that he measures 5 billion nano-dijkstras in arrogance. BTW he's right about HTML. And a lot of other things.
I have no strong opinions on HTML, but I agree he fails to perceive his own massive arrogance. I find it especially ironic that he shortly follows that statement with a claim of how exceptional his discipline is, that centuries of mathematics doesn't apply to it and that it requires new math. This video was made two decades ago and I have yet to see anyone bother to define OOP mathematically, let alone Alan. Category theory seems to have done a better job defining OOP concepts like polymorphism or encapsulation, and it's done so with greater clarity and generality, despite being invented before computers.
'90s web... seriously ?
+ktxed The first web browser was called Mosaic, circa 1993 or 1994.
Netscape said it was based on Mosaic, and that is why Firefox was called Mozilla before.
Internet Explorer's first versions said they were based on Mosaic too.
gmoschwarz Thanks for the recap. I'm aware of the history of browsers. My comment was about the content of the early web and the rudimentary technologies available at the time.
I remember the good old days of , and
@@ximono I've seen uppercase HTML tags but not
@@ktxed Maybe I was the only one who did that :)
Something fishy about this guy
I'm not a huge fan of C++ either
A possible bright side: if humanity implemented what is being communicated here, we might have destroyed ourselves because our computers and software would be much more powerful; perhaps too powerful.
The self-limiting nature of our conceptions may be some kind of built-in fail-safe survival mechanism.
Too bad culture and market systems drive tech instead of innovation in the West. Maybe that's why the best coders are from Austria or Eastern Bloc former Cold War countries. Though I'm not sure pointers were a good thing
"On the Fact that Most of the World's Software Is Written On One Side Of The Atlantic"
Great talk, but that sentence is stupid. Great software is written everywhere, but big money is just on one side of the Atlantic.
Is it just me or does he talk like Donald Trump?
Trump? Nope. You're hallucinating; this is a thoughtful talk. Trump boasts every other word, and what else he says is as inane as it could be.
so why doesn't Kay show a browser that does not do anything much but offer bits of memory and output graphics, and show us how it looks? can it beat the IceWM on FreeBSD and Firefox I am using? I think gopherspace is a kinda cool idea and a network drive idea is cool, and all a browser does is read something like a file or set of files and show you something and then accept typing or clicking inputs.....
39:55 hurting the programmer!!