Bret Victor The Future of Programming
- Published Feb 9, 2025
- "The most dangerous thought you can have as a creative person is to think you know what you're doing."
Presented at Dropbox's DBX conference on July 9, 2013.
All of the slides are available at: worrydream.com/...
For his recent DBX Conference talk, Victor took attendees back to the year 1973, donning the uniform of an IBM systems engineer of the times, delivering his presentation on an overhead projector. The '60s and early '70s were a fertile time for CS ideas, reminds Victor, but even more importantly, it was a time of unfettered thinking, unconstrained by programming dogma, authority, and tradition. 'The most dangerous thought that you can have as a creative person is to think that you know what you're doing,' explains Victor. 'Because once you think you know what you're doing you stop looking around for other ways of doing things and you stop being able to see other ways of doing things. You become blind.' He concludes, 'I think you have to say: "We don't know what programming is. We don't know what computing is. We don't even know what a computer is." And once you truly understand that, and once you truly believe that, then you're free, and you can think anything.'
This is seriously one of the best presentations I've seen on programming
Wanted to give you a thumbs up, but since the number of those is quite magical I decided to just say that I totally agree with you :)
i like how the audience is initially laughing along with him, 1973 how quaint… then falls silent as they realize how dead serious he is and how far off we are in 2013.
I landed here from your tweet only to once again read your comment. You are everywhere
I was still laughing the whole time! He’s roasting the whole industry, god bless this hero even more! Haha
Still same in 2023
@@ChrisAthanas We're stuck. We're not really advancing anymore.
The best jokes have you laughing at the start and then in stunned, silent existential dread at the end.
31:22 “The most dangerous thought that you can have as a creative person is to think that you know what you’re doing.
Because once you think you know what you’re doing, you stop looking around for other ways of doing things and you stop being able to see other ways of doing things, you become blind…”
🔥 thanks for grabbing this quote/snip.
This is why Computer Science needs to teach courses on the history of computers and programming. All other sciences talk about their history in introductory courses but not computer science. So the history is lost. I have taken CS courses at 4 universities and none of them talked about the history. BTW very good talk.
Our first course opened with an introduction from a historical perspective; I did enjoy it.
Same thing in Math and Physics. If only people knew how hard we had to work to shake off our assumptions and make progress, maybe they'd be a bit more sensible when it comes to the subjects. It wasn't 500 years ago that an equals sign went in only one direction, and it took (at least) two generations for Kepler to figure out planets orbited in ellipses.
They do
My CS101 course was on the history of programming
@@johnhabib289 Agreed. So much progress in learning has been the result of unlearning, of the refutation of unshakable dogmas. Presently, everything is presented to beginners as a fait accompli, a final resting place from which vantage point the initial problems and inspirations become irrelevant, and the struggles and breakthroughs become moot. A sense of history, not only exciting and inspiring in itself, would remind students of their fallibility, and that they too occupy a position in the unfolding of understanding. Even Newton acknowledged that he saw further because he stood on the shoulders of giants.
I was writing my final assignment for my undergrad degree and was randomly brought to this presentation. Best presentation I've ever heard; a total mind-opener that sparked my spirit to do my research not just to finish my degree but to learn much more. Thank you
This video never gets old, I like to rewatch it once in a while. I like to remind myself of this truth "The most dangerous thought that you can have as a creative person is to think that you know what you’re doing...". Thanks for sharing!
Anyone else coming here from StackOverflow ?!
This presentation is pure genius.
01:36 nature of adopting ideas
04:34 fortran
06:43 direct manipulation of data
08:25 bridge simulation
10:56 prolog
11:27 pattern matching
13:45 how do you get communication
16:31 spatial representation of information
22:00 parallel programming
28:20 spatial representation of information
Bret Victor is such a legend, it's so fascinating how much he can inspire people with old stuff :). Incredible! In his other talks he presents a lot of his own ideas, but I love this way of presenting even more.
5:25 Overview of the four big ideas
6:42 1. coding -> direct manipulation of data
9:42 2. procedures -> goals and constraints
16:25 3. text dump -> spatial representations
21:55 4. sequential -> concurrent
27:45 Conclusion
Fun/tragicomical bits:
9:15 Ted Nelson's Xanadu, not Tim Berners-Lee's HTML + CSS
11:48 "Ken Thompson over at Bell Labs working on this system they call Unix. I know right, Unix?"
12:52 The Internet ("cute idea, might work")
15:20 "API" ("brittle, doesn't scale, not gonna happen")
17:00 GUIs
17:42 The mouse ("kind of hard to explain")
18:18 Flow-based Programming, no-code
20:20 "I'm totally confident that in 40 years we won't be writing code in text files. We've been shown the way."
20:53 Interactive computing, real-time, reactivity ("It's pretty obvious that in 40 years, if you interact with user interfaces, you'll never experience any delay or lag.")
24:42 MPPA ("This is the kind of architecture we're gonna be programming on in the future… unless Intel somehow gets a stranglehold on the microprocessor market and pushes their architecture forward for 30 years, but that's not gonna happen.")
25:54 Threads and locks ("This is never gonna work, right? This does not scale. If in 40 years, we're still using threads and locks, we should just pack up and go home, 'cause we've clearly failed as an engineering field.")
27:18 Erlang ("I do think it would be kind of cool if the actor model was picked up by a Swedish phone company or something, that would be kind of weird." - one guy laughs)
28:20 "We're not gonna have text files anymore. We're going to be representing information spatially because we have video displays."
29:00 "I do think it would be a shame if in 40 years, we're still coding in procedures in text files in a sequential programming model. That would suggest we didn't learn anything from this really fertile period in computer science. That would be a tragedy."
29:40 "The real tragedy would be if people forgot that you could have new ideas about programming models in the first place."
This is easily the best presentation I have seen on programming. I had recently begun to think I was starting to understand programming. Now I see how mistaken I was.
This should be called "The resistance to innovation". Great talk. I watched it shallowly before and believed it was a joke and a boutade. Now I have watched it with more attention and I find it really inspiring, with very heavy lessons to learn
This video should have like 1000 times more views.
That was a spectacular talk... holy smokes, 49 years ago!! 2022 today. One thing is for sure, the message for creativity is ... TIMELESS.
It truly felt like I went back in time to the 1960s. Awesome presentation 🙌
Makes me happy to realize I'm not the only one thinking like this. I have been programming for at least 30 years and have, unsuccessfully, tried to find alternative ways to do it, since using text editors has always felt wrong to me. I have discussed this with plenty of colleagues over the years, but no one ever understood what I meant by graphical or building-block programming, so finally I just gave up. Even simple concepts like generative programming are hard to discuss. Maybe someday I will try again. :)
Try LabView, easy for relatively simple programs, but graphical programming has its limits in terms of clutter management.
Lucky Lu Thanks for the tip. I did use LabView around 1995 but I didn't like it very much then. It was slow and cumbersome. Maybe it has evolved.
+John Johansson Again you have missed Wordpress.org + plugins. That IS visual programming. Code is created underneath the hood without a web designer needing to do any coding. It's a visual interface.
try again! there are so many unexplored wildernesses and only a handful of explorers!
Sounds like (data) flow based programming
Bret Victor is awesome.
Hi everyone, I'm from ITMO University. We all know you thanks to a wonderful course called Computer Architecture; the whole university has a great time watching your lectures. The main thing is to keep releasing them more often, thank you!!!
A really incredible and inspiring experience, thank you Bret Victor.
Last few moments of the talk were truly beautiful!
Every computer science student on the planet should watch one of these retrofuturistic "Where we been / where we going" talks, every five years.
List of Pioneers:
1. Bret Victor (Presenter) : Computer Scientist and a great influencer.
2. Gordon Moore : Co-founder Intel, Moore's Law.
3. Stan Poley : created SOAP
4. John von Neumann : Polymath: Mathematician, Computer Scientist, Engineer, Physicist.
5. John Backus : FORTRAN inventor
6. Ivan Sutherland : created Sketchpad
7. Carl Hewitt : designed Planner Programming Language, created Actor Model
8. Ralph Griswold : created SNOBOL
9. Ken Thompson : Bell Labs, Pattern matching
10. J.C.R. Licklider : Inter-galactic network
11. Douglas Engelbart : onLine System (NLS) (Mouse) spatial representation of information
12. Tom Ellis : RAND corporation. GRAIL Flowchart
13. Larry Tesler : Smalltalk OOPL
When the speaker asked "Do you know the reason why all these ideas came about in the 60s? Why did it all happen then?" I thought the answer was going to be LSD.
LISP seems inspired by it.. lol
I thought exactly the same thing! LSD, and the general openness of that time period. But maybe it is related. "We don't know what a computer is" versus "we don't know what life is, how to live it, let's try something completely different". Openmindedness.
29:41 "The real tragedy would be if people forgot that you could have new ideas about programming models..."
Inspiring talk. Right now I'm in the process of unlearning after 15 years coding to learn again to code, and that's incredible experience.
I'm in the process of learning for the first time. So far it seems pretty daunting. Truthfully I don't know what I actually want to code for yet. I just know I want to.
@@alexbroGellungaRunga are you still at it now? I'm just getting into it. Any advice?
In 1973, it was large organizations "doing" computing, and the things those organizations did was sequential: payroll, taxes, etc.
Absolutely awesome presentation, and it gives you a shitload of things to think about (at least for me as a programmer). Some of the stuff is obviously still far away, esp. the auto-communication between different systems, which is _really_ hard to realize, or even to state the problem for correctly.
Many pens and no pocket protector. Quite a risk taker.
And here we are. In 1979. Look ahead to the future of programming
God bless everyone still waking up everyday chasing their dreams salute
this guy is very visionary. right now we're still dealing with declarative programming and concurrency and async as hot topics
pwned. bret victor is not from the 70s
The things he apparently wants HAVE been adopted and have transformed industries. But rather than a universal general-purpose programming tool, they are implemented as the "right tools" for a variety of special purpose jobs. And the people using the tools are not "programmers" but are accountants, engineers, graphic artists, musicians, authors, lab technicians, etc. Anyone hiring a Senior PostScript Developer?
As an industry we've built incredible tools for manipulating all kinds of media, enabling countless new ways of thinking about and experimenting with those media. So why haven't we built such tools for our own industry? The overwhelming majority of programming currently is linear text describing sequential procedures.
A must watch for every developer nowadays.
I love all those pens in his shirt. Feels like he could make useful diagrams whenever he needs to.
This was an amazing talk, but I think that a large portion of the reason why these ideas haven't been prominent in the mainstream is that 1) they're computationally intensive (a Prolog hobby interpreter would grind away on my old 286 while Turbo Pascal blitzed out executables) and 2) because they were solving very limited cases. When you try to apply the same techniques to general cases, you discover a mountain of edge cases (which, related to #1, makes them unusable).
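To make the "computationally intensive" point concrete, here is a minimal, made-up Python sketch of the "goals and constraints" style done the naive way: you state the goals and let the machine search. The constraint functions and domain are invented for illustration; real Prolog uses unification and backtracking rather than brute force, but the exploding search space hints at why this style ground away on 1980s hardware.

```python
# Naive goal-directed search: state the constraints, let the machine find values.
from itertools import product

def solve(domain, constraints):
    """Brute force: try every assignment, keep the ones satisfying all goals."""
    for x, y in product(domain, repeat=2):
        if all(c(x, y) for c in constraints):
            yield x, y

constraints = [
    lambda x, y: x + y == 10,   # goal 1
    lambda x, y: x * y == 21,   # goal 2
]

print(list(solve(range(100), constraints)))  # [(3, 7), (7, 3)]
```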
As a self taught coder for the most part, I made this EXACT mistake. Wrote a strange "assembler" that could only translate a single line of assembly code into machine language, and I never got to writing my own assembly compiler. I saw it as a waste of bytes- I'd have two copies of every program. WOW. Now I see how idiotic I was not to build the assembler first. Debugging was such a chore that I eventually gave up on what should have been a simple program to write and test. And thus ended my self taught assembly coding on the C64, because by then I was finally getting my hands on more powerful hardware.
This guy was ahead of his time... already talking about AI straight up back in 2013. Wow!!
He is saying that there are other ways to use a computer; even if you use your fancy IDE to program, you're still doing sequential programming. Plus he says that in the past computers were used to actually compute a way to solve problems instead of being told how to solve them. I think this is the most interesting part
Programming is not all sequential, there's functional programming and concurrency and asynchronous approaches. There's really no reason to restrict yourself to straight up sequential programming (though of course execution will be in some sequence as we experience the world via causality).
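As an entirely illustrative aside on that point, here is a minimal Python asyncio sketch: the same three fake I/O tasks run first sequentially and then concurrently. The fetch function and delays are made up.

```python
import asyncio, time

async def fetch(name, delay):
    await asyncio.sleep(delay)          # stand-in for real I/O
    return f"{name} done after {delay}s"

async def main():
    start = time.perf_counter()
    # Sequential: total time is roughly the sum of the delays (~3s).
    for name, d in [("a", 1), ("b", 1), ("c", 1)]:
        print(await fetch(name, d))
    print(f"sequential: {time.perf_counter() - start:.1f}s")

    start = time.perf_counter()
    # Concurrent: total time is roughly the longest single delay (~1s).
    results = await asyncio.gather(*(fetch(n, d) for n, d in [("a", 1), ("b", 1), ("c", 1)]))
    print(results, f"concurrent: {time.perf_counter() - start:.1f}s")

asyncio.run(main())
```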
#1 talk ever made... and that includes the future too!
Inspiring.
When I watched for the first time some 7 years ago, I was inspired. Now, few years out of school I'm a bit more realistic. These are all nice ideas, but in most cases there is a very good reason why some ideas didn't get adopted. The main takeaway from this video is that you should adopt and be open to progress.
You'll get there. I'm pretty far into programming expertise and I see these ideas as pretty much spot-on. Moore's law has "ended" (or is getting there) and we'll seriously have to start getting into parallel computing. Having text-only visibility into a codebase is also outdated and I think there'll be some serious innovation on this soon.
@@joonasfi here's hoping!
@@joonasfi so back to visual basic then
Tommy what's dat
This resistance to new ways of programming also seems to be taking place with programming using LLMs as assistants where you just ask the LLM for a piece of code in natural language and it just gives you a working piece of code without you having to manually type one yourself. So many programmers today don't consider that to be "programming".
It's 2022 and I'll visit this talk again after 5 years -> 2027.
I thought exactly the same thing when the Smalltalk slide showed up. It is cool to have functions/methods organized but in the end it is still a text file, and more often than not it works to have the whole file visible in the same context. But maybe that is also a programming paradigm where bad sequential programming is being made.
Smalltalk's browser is a cool idea as long as it's just one of many views of your project. Once it becomes the only view, it's like viewing the stuff through a tiny window, and it's hard to get a good idea of how things interconnect. Smalltalk and its modern incarnation Pharo have been ossified in some way. They are great ideas but hold on to their obsolete browser model that's just unnecessarily constraining. Visually we can do way better than that nowadays.
I used two visual programming languages, HP-VEE and recently MapForce. The problem they have is that even the simplest of programs become incomprehensible diagrams.
That same problem existed in Text Programming back when it was just machine code. Because machine code had no blocks or For loops or encapsulation, any long program inevitably became a mess of spaghetti code and GOTOs.
Visual PLs haven't had much research and are still in their infancy, and yet there's still encapsulation in tools like Unreal's Blueprints.
Also Visual doesn't need to be the only way to model - it's important that we have many ways of representing computation so we can pick the best tool for the job.
A BRILLIANT talk.
Believe it or not, this one video did inspire Vlad Magdalin to go and build Webflow
Oh nooo.... Webflow looks interesting, but if it's not open source, then unfortunately he didn't learn the biggest lesson in computing, not mentioned in this talk because it's the one thing we DID get right.
I don't believe that I even know who this is and what this is.
This is great. This video is like talking to my dad, who had to write his own assembler in uni...
Excellent. Worth watching.
26:02 "If it's not threads and locks, then what's going to work?"
The actor model is a very elegant solution, but you can go even further. Within each actor, you can use a functional single-assignment language that makes the most out of an MPPA. And that did exist at the time: Larry Tesler and Horace Enea's Compel from 1968. Programming could have been so different if we had adopted these ideas half a century ago. Because the ideas were there!
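For anyone who hasn't met the actor model, here is a minimal, hypothetical Python sketch of its shape (one mailbox per actor, all interaction by asynchronous messages, no shared state). It is neither Erlang nor Compel, just an illustration of the idea.

```python
import threading, queue

class Actor:
    """One mailbox per actor; the only way to interact with it is send()."""
    def __init__(self, handler):
        self.mailbox = queue.Queue()
        self.handler = handler
        self.thread = threading.Thread(target=self._run)
        self.thread.start()

    def send(self, msg):
        self.mailbox.put(msg)

    def _run(self):
        state = {}                      # private state, never shared, so no locks
        while True:
            msg = self.mailbox.get()
            if msg is None:             # poison pill shuts the actor down
                return
            self.handler(state, msg)

def counter(state, msg):
    state["n"] = state.get("n", 0) + msg
    print("count is now", state["n"])

a = Actor(counter)
for i in (1, 2, 3):
    a.send(i)                           # prints 1, 3, 6 in order
a.send(None)
a.thread.join()
```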
When an artist rigs a 3D model, that's programming visually. It works somewhat like the Sketchpad shown in the talk.
As I watch this video I see more and more of how all of this hooks into making stuff in Minecraft. Although the backend blocks are still written with APIs for the most part, there are some things like Extra Utilities watering can that works on say Magical Crops without Extra Utils knowing anything about how Magical Crops works, just that it's a plant and that means the can should accelerate its growth. Now as an end user building a system from different blocks in Minecraft, obviously as long as you give something an inventory the hopper knows about it, but to the end user he just sets up three hoppers to a furnace and feeds them with fuel and items, and he gets processed stuff in a chest. Something like that. Visually building systems that just work together. It's not a perfect model but it's as close to a system as I can come up with off the top of my head that points to everything he's talking about so far. Again self taught programmer, so I'm probably missing obvious implications that have done everything he's talking about at once.
This. Code should be designed in such a way that standards exist, so implementing features works seamlessly without some kind of buggy one-time patch
I have no idea about Minecraft, so I might be wrong, but what you describe sounds to me rather like "duck typing" - which is probably off-mark (because it's still an API) but I'm guessing it *is* actually an interesting counter to a *strict* API (rough sketch after the next reply).
In any case, kudos for thinking about it as a self-taught programmer. It's too easy to find all kinds of "certified" programmers, even young ones, who are just dead set in their ways and think there are just no alternatives: the dogma Bret mentioned. They wouldn't even try to understand. And this probably is a plus of being self-taught.
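A tiny, made-up Python sketch of that duck-typing reading of the Minecraft example: the watering can only assumes that whatever it touches responds to grow(); the class and method names are hypothetical, not from any actual mod API.

```python
class MagicalCrop:
    """Knows nothing about watering cans; it just knows how to grow."""
    def __init__(self):
        self.stage = 0
    def grow(self):
        self.stage += 1

class WateringCan:
    def water(self, plant):
        plant.grow()        # works for anything that responds to grow()

crop = MagicalCrop()
WateringCan().water(crop)
print(crop.stage)           # 1
```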
This is one of the best software development talks I've ever seen..... but those gel pens in his pocket are too modern.
I've been thinking the CS program I got a degree from could be improved by starting out with parallel courses in intro programming (similar to what it had) and an intro to doing things with bits (data encoding & manipulation), so that by the end of the first semester you get a sense of how computers work (manipulating encoded bits) and how you can compose that into useful programs. Now I think the intro programming should begin by introducing programming as specifying computations, which can be done sequentially, functionally, logically, etc. It can still mostly focus on the usual sequential programming, but it should ensure that students know that the most common paradigm is only one of many that are possible
Not if it does not use the codebase/patterns, but rather gains influence based upon them. Ideally, the computer gains its own knowledge based upon analyzing a codebase or pattern. Once that knowledge is gained, the necessary components of the codebase/pattern are stored (as well as a pointer to the original) in the computer's implementation of LTM.
From there, the computer can make its own judgments about how to implement its newly found knowledge.
I agree, we are probably somewhat in the infancy of his goals. The examples I thought of when listening:
1. WYSIWYGs, Content Management Systems
2. Test-driven development (sort of) - and abstraction in general; think this has the biggest room for growth but is also rather pie-in-the-sky
3. Lego Mindstorms, UML-to-code and ERD-to-DDL generators, Object Relational Mappers
4. Hadoop, GPU processing
Another reason for creativity in the early days of computers is certainly: There was no legacy of software to maintain.
Bret Victor is a genius
he is awesome
Mind-blowing.
Marvelous talk!
20:25 - "I'm totally confident that in 40 years we won't be writing code in text files".
It is amazing that many of the things that he mentioned are already being done with the National Instruments LabVIEW graphical programming language. It uses a dataflow programming model implemented within a graphical programming environment. It also natively handles parallel processing tasks just by placing your code in separate structures. (A tiny sketch of the dataflow idea follows this thread.)
It's also orders of magnitude outside of the budgets of most of us, and is a closed box with relatively little innovation happening compared to mainstream programming environments. So, it's cool and all, except it looked the same 15 years ago as far as how typical users use it, and is hidden behind a paywall that automatically relegates it to an "also ran" status. It's to the point where if a 3rd party wants to interface their stuff with it, they have to buy a license, just to make the environment more appealing to people who actually want to use it. Whatever model they have in mind has been engineered to keep it a tightly walled garden that makes Apple's iOS look like a free-for-all.
You are so wrong bro. LabVIEW was released in '86; these are all from before 1973 (which is the year the talk pretends to be given in). You're way off on the timeline.
Because he's giving the talk from the viewpoint of a programmer in the '70s. ARPANet.
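For anyone curious what "dataflow" means outside LabVIEW, here is a tiny, hypothetical Python sketch of the idea: each node is a thread wired to queues, so independent branches run in parallel simply because they are separate nodes. All names and the wiring are invented for illustration.

```python
import threading, queue

def node(fn, inputs, output):
    """A dataflow node: read one value from each input wire, emit fn's result."""
    def run():
        while True:
            args = [q.get() for q in inputs]
            if any(a is None for a in args):   # propagate shutdown downstream
                output.put(None)
                return
            output.put(fn(*args))
    threading.Thread(target=run, daemon=True).start()

def tee(inp, outs):
    """Fan a single wire out to several downstream wires."""
    def run():
        while True:
            v = inp.get()
            for q in outs:
                q.put(v)
            if v is None:
                return
    threading.Thread(target=run, daemon=True).start()

source, branch_a, branch_b, doubled, squared, sums = (queue.Queue() for _ in range(6))

tee(source, [branch_a, branch_b])
node(lambda x: 2 * x, [branch_a], doubled)      # these two nodes run in parallel
node(lambda x: x * x, [branch_b], squared)      # purely because they are separate nodes
node(lambda a, b: a + b, [doubled, squared], sums)

for v in (1, 2, 3, None):
    source.put(v)
while (result := sums.get()) is not None:
    print(result)                               # 3, 8, 15
```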
Great presentation! I got a more concrete sense of what Alan Kay has been saying all these years re. computing.
Nice talk. But I would recommend you look up the accompanying slides pages and look out for the videos he references there (Alan Kay for example). Ivan Sutherland's presentation is shown there - it's from 1962 and impressive. (The other things too, but the Sketchpad is outstanding, imo.)
oh that dripping optimism, you should share some more of it
I wish the 2nd point (computers talking through goals) had more elaboration to it. An example language specific to that task would have been nice.
Very Creative - Well Done ! This is one of the best presentations I've seen not only on programming but thinking.
Just when you think you're a master at something, you find out you know absolutely nothing.
Amazing presentation!
Furthermore, as a guy who coded webpages using text editors, I had similar feelings of self-obsolescence when I saw the visual WYSIWYG web design programs or the online web templates allowing people to create webpages without knowing how to write code.
I love this presentation.
Beautiful !
Binary code is still present underneath all the old stuff, but higher-level programming languages bring bigger abstractions. Almost all the things presented here as the future of programming have been in use for over twenty years (Lisp, Ruby, finite automata, Petri nets, LabView, Simulink, genetic algorithms, neural networks, UML, FPGAs, the unified shader model, OOP, linear bounded automata), yet very few people can use and understand them properly. All the layers remain, from binary up to the lambda calculus or pi calculus, and a higher layer cannot exist without the next lower one.
Still, as of the conclusion, I couldn't agree more.
At 24:00 as he talks about the fact that while the CPU transistors are furiously churning away the memory transistors are just sitting there doing next to nothing, I actually started getting a little scared.
I like it, very inspirational
"accept that you don't know what you're doing and the you're free".
It's interesting because the concept that computer programs could fend for themselves to achieve their assigned goals really didn't make sense before 2022.
We tried most of those ideas. There are real reasons why we don't use them. As somebody who has coded Prolog, you understand, even with a fleshed-out language, that it's hugely limited. There's a reason a lot of procedural things were added to it. And to the extent that it makes sense, we actually do use it; MySQL, for example, is basically goals.
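To make that "basically goals" point concrete, here is a small illustrative sketch using Python's built-in sqlite3 (standing in for MySQL): the query states the result wanted and lets the engine decide how, while the loop spells out the procedure. Table and column names are invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("ada", 120.0), ("bob", 30.0), ("ada", 45.0)])

# Declarative: describe the result you want; the engine decides how to get it.
rows = conn.execute(
    "SELECT customer, SUM(total) FROM orders GROUP BY customer HAVING SUM(total) > 50"
).fetchall()

# Procedural: spell out every step of the same computation.
totals = {}
for customer, total in conn.execute("SELECT customer, total FROM orders"):
    totals[customer] = totals.get(customer, 0) + total
loop_rows = [(c, t) for c, t in totals.items() if t > 50]

print(rows, loop_rows)      # both report ada with 165.0
```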
Absolutely brilliant !
Excellent talk!
Hashtag LoveTheOverheadProjector
Why isn't anyone howling with laughter? Just heard the end remarks; no laughter because it's all so depressingly true.
It took me a while to notice that the overhead projector does not even point to the screen.
@@amigalemming it's a "modern" one with a camera :-) At the time when VGA/XGA projectors became common, there were still people insisting on using an overhead (slides they already had, flow of presentation), so the market came up with overhead-projector cameras
Couldn't agree more.
And this is why we don't call the developers "connectors" (as of yet).
Parallelism, in its current and envisioned form, is just as awesome as marketing.
That's the sad part. He isn't talking about the future, he's talking about the past. Most of those languages and programming ideas are defunct, some are still being explored. :/
This needs to be mandatory for every computer science student and programmer to watch. Truly eye opening, especially at this time, when GPT-4 challenges how we write programs.
The binary-to-Fortran part hits closer to home now than ever; to think that many people (including myself) can't see outside the box. We're too wrapped up in learning the next JavaScript framework, completely oblivious to totally radical ways of interacting with a computer. GPT-4 and the AI field really caught me off guard.
And about threads and deadlocks, I think we still do it that way now, right? Is there a better way already implemented in a real-world app?
A transcript of this talk is here: glamour-and-discourse.blogspot.com/p/the-future-of-programming-bret-victor.html
Who else wants this re-uploaded in 4:3 format?
awesomely awesome!
I like constraint programming in geometry. Computer: Please draw seven red lines, all perpendicular, some with green ink, some with transparent ink, and one in the form of a kitten!
I can say this is so, I was there.
Could you please upload the slides?
Ahhh, so it is Conceptual, it is getting better and better
I am 9 minutes in and this seems like it was recorded in the '80s. I mean, the number of pens the guy has in his front pocket + the projector... and also what he is saying and how!
But the video is in color, and apparently it's from 2013.
Is this cosplay?!
Because parallelism is awesome. And while it is already an exceedingly hard problem, it is still trivial compared to auto-discovery, because there is plenty of cookbook work for it and essentially a well-prepared field, while auto-discoverability is pretty much still a blank spot, which as far as I understand has much to do with (true) AI, which again is still in its infancy (which is pretty funny considering how long people have already worked on it)
This is exactly the way of thinking this video is trying to get rid of. It might not be possible to fix now; that doesn't mean it isn't possible at all.
In the case of CPUs, imagine if all engineers had had the same mindset when they were using vacuum-tube computers: transistor-based CPUs would never have existed, because it just wasn't possible to do that with tubes, and it couldn't be fixed.
As Victor said, getting rid of dogmas and paradigms is the way to invention.
Still, auto-integration would change the face of programming more than having parallel processors, but I agree that it seems really far off.
Wonderful! Very inspiring!