COMMENTS •

  • @swyxTV
    @swyxTV 3 years ago +153

    I like how the audience is initially laughing along with him (1973, how quaint…), then falls silent as they realize how dead serious he is and how far off we are in 2013.

    • @AashraiRavooru
      @AashraiRavooru 3 years ago +10

      I landed here from your tweet only to once again read your comment. You are everywhere

    • @Stonehenge2.0
      @Stonehenge2.0 2 years ago +11

      I was still laughing the whole time! He’s roasting the whole industry, god bless this hero even more! Haha

    • @ChrisAthanas
      @ChrisAthanas 1 year ago +3

      Still same in 2023

    • @ximono
      @ximono 10 months ago

      @@ChrisAthanas We're stuck. We're not really advancing anymore.

  • @engineeredarmy1152
    @engineeredarmy1152 1 year ago +7

    List of Pioneers:
    1. Bret Victor (presenter): computer scientist and a great influence.
    2. Gordon Moore: co-founder of Intel, Moore's Law.
    3. Stan Poley: created SOAP (Symbolic Optimal Assembly Program)
    4. John von Neumann: polymath (mathematician, computer scientist, engineer, physicist)
    5. John Backus: FORTRAN inventor
    6. Ivan Sutherland: created Sketchpad
    7. Carl Hewitt: designed the Planner programming language, created the Actor model
    8. Ralph Griswold: created SNOBOL
    9. Ken Thompson: Bell Labs, pattern matching
    10. J.C.R. Licklider: the Intergalactic Computer Network
    11. Douglas Engelbart: oN-Line System (NLS), the mouse, spatial representation of information
    12. Tom Ellis: RAND Corporation, GRAIL flowcharts
    13. Larry Tesler: Smalltalk (object-oriented programming language)

  • @xuefeng77
    @xuefeng77 4 years ago +87

    31:22 “The most dangerous thought that you can have as a creative person is to think that you know what you’re doing.
    Because once you think you know what you’re doing, you stop looking around for other ways of doing things and you stop being able to see other ways of doing things, you become blind…”

    • @ctwolf
      @ctwolf 3 years ago +1

      🔥 thanks for grabbing this quote/snip.

  • @w00tix
    @w00tix 8 years ago +230

    This is seriously one of the best presentations I've seen on programming

    • @mrufa
      @mrufa 5 years ago +8

      Wanted to give you a thumbs up, but since the number of those is quite magical I decided to just say that I totally agree with you :)

  • @TomLynch
    @TomLynch 7 years ago +266

    This is why Computer Science needs to teach courses on the history of computers and programming. All other sciences talk about their history in introductory courses but not computer science. So the history is lost. I have taken CS courses at 4 universities and none of them talked about the history. BTW very good talk.

    • @RoyRope
      @RoyRope 6 years ago +2

      Our first course opened with an introduction from a historical perspective; I did enjoy it.

    • @johnhabib289
      @johnhabib289 5 years ago +12

      Same thing in Math and Physics. If only people knew how hard we had to work to shake off our assumptions and make progress, maybe they'd be a bit more sensible when it comes to the subjects. It wasn't 500 years ago that an equals sign went in only one direction, and it took (at least) two generations for Kepler to figure out that planets orbited in ellipses.

    • @earthstick
      @earthstick 4 years ago +2

      They do

    • @jonny__b
      @jonny__b 4 years ago +4

      My CS101 course was on the history of programming

    • @Ubu987
      @Ubu987 3 years ago +4

      @@johnhabib289 Agreed. So much progress in learning has been the result of unlearning, of the refutation of unshakable dogmas. Presently, everything is presented to beginners as a fait accompli, a final resting place from which vantage point the initial problems and inspirations become irrelevant, and the struggles and breakthroughs become moot. A sense of history, not only exciting and inspiring in itself, would remind students of their fallibility, and that they too occupy a position in the unfolding of understanding. Even Newton acknowledged that he saw further because he stood on the shoulders of giants.

  • @driziiD
    @driziiD 3 years ago +28

    01:36 nature of adopting ideas
    04:34 FORTRAN
    06:43 direct manipulation of data
    08:25 bridge simulation
    10:56 Prolog
    11:27 pattern matching
    13:45 how do you get communication
    16:31 spatial representation of information
    22:00 parallel programming
    28:20 spatial representation of information

  • @jorcyd
    @jorcyd 2 years ago +5

    Anyone else coming here from StackOverflow?!
    This presentation is pure genius.

  • @rian7079
    @rian7079 2 years ago +11

    I was going to write my final assignment to get my undergrad degree and was randomly brought to this presentation. The best presentation I've ever heard: a total mind-opener that sparked my spirit to do my research not just to finish my degree but to learn much more. Thank you

  • @anasdevscribbles
    @anasdevscribbles 6 months ago +2

    This video never gets old, I like to rewatch it once in a while. I like to remind myself of this truth "The most dangerous thought that you can have as a creative person is to think that you know what you’re doing...". Thanks for sharing!

  • @unusedName1
    @unusedName1 8 years ago +56

    This should be called "The resistance to innovation". Great talk. I watched it superficially before and believed it was a joke, a boutade. Now that I've watched it with more attention, I find it really inspiring, with very heavy lessons to learn.

  • @samuel-beek
    @samuel-beek 2 years ago +4

    Bret Victor is such a legend, it's so fascinating how much he can inspire people with old stuff :). Incredible! In his other talks he presents a lot of his own ideas, but I love this way of presenting even more.

  • @kamel3d
    @kamel3d 4 years ago +37

    Believe it or not, this one video did inspire Vlad Magdalin to go and build Webflow

    • @UliTroyo
      @UliTroyo 3 years ago +5

      Oh nooo.... Webflow looks interesting, but if it's not open source, then unfortunately he didn't learn the biggest lesson in computing, not mentioned in this talk because it's the one thing we DID get right.

    • @retropaganda8442
      @retropaganda8442 2 years ago

      I don't believe that I even know who this is and what this is.

  • @kozoba
    @kozoba 9 years ago +25

    This video should have like 1000 times more views.

  • @AlvinLee007
    @AlvinLee007 10 years ago +20

    Bret Victor is awesome.

  • @KentOJohnson
    @KentOJohnson 8 years ago +18

    This is easily the best presentation I have seen on programming. I had recently begun to think I was starting to understand programming. Now I see how mistaken I was.

  • @sparkloweb
    @sparkloweb 10 years ago +30

    The things he apparently wants HAVE been adopted and have transformed industries. But rather than a universal general-purpose programming tool, they are implemented as the "right tools" for a variety of special purpose jobs. And the people using the tools are not "programmers" but are accountants, engineers, graphic artists, musicians, authors, lab technicians, etc. Anyone hiring a Senior PostScript Developer?

    • @DevonParsons697
      @DevonParsons697 4 years ago +25

      As an industry we've built incredible tools for manipulating all kinds of media, enabling countless new ways of thinking about and experimenting with those media. So why haven't we built such tools for our own industry? The overwhelming majority of programming currently is linear text describing sequential procedures.

  • @dac514
    @dac514 11 years ago +55

    When the speaker asked "Do you know the reason why all these ideas came about in the 60s? Why did it all happen then?" I thought the answer was going to be LSD.

    • @wiskasIO
      @wiskasIO 4 years ago +1

      LISP seems inspired by it.. lol

    • @CuriousCritter9
      @CuriousCritter9 4 years ago +1

      I thought exactly the same thing! LSD, and the general openness of that time period. But maybe it is related. "We don't know what a computer is" versus "we don't know what life is, how to live it, let's try something completely different". Openmindedness.

  • @ximono
    @ximono 10 months ago +1

    5:25 Overview of the four big ideas
    6:42 1. coding -> direct manipulation of data
    9:42 2. procedures -> goals and constraints
    16:25 3. text dump -> spatial representations
    21:55 4. sequential -> concurrent
    27:45 Conclusion
    Fun/tragicomical bits:
    9:15 Ted Nelson's Xanadu, not Tim Berners-Lee's HTML + CSS
    11:48 "Ken Thompson over at Bell Labs working on this system they call Unix. I know right, Unix?"
    12:52 The Internet ("cute idea, might work")
    15:20 "API" ("brittle, doesn't scale, not gonna happen")
    17:00 GUIs
    17:42 The mouse ("kind of hard to explain")
    18:18 Flow-based Programming, no-code
    20:20 "I'm totally confident that in 40 years we won't be writing code in text files. We've been shown the way."
    20:53 Interactive computing, real-time, reactivity ("It's pretty obvious that in 40 years, if you interact with user interfaces, you'll never experience any delay or lag.")
    24:42 MPPA ("This is the kind of architecture we're gonna be programming on in the future… unless, Intel somehow gets a stranglehold on the microprocessor market and pushes their architecture forward for 30 years, but that's not gonna happen.")
    25:54 Threads and locks ("This is never gonna work, right? This does not scale. If in 40 years, we're still using threads and locks, we should just pack up and go home, 'cause we've clearly failed as an engineering field.")
    27:18 Erlang ("I do think it would be kind of cool if the actor model was picked up by a Swedish phone company or something, that would be kind of weird." - one guy laughs)
    28:20 "We're not gonna have text files anymore. We're going to be representing information spatially because we have video displays."
    29:00 "I do think it would be a shame if in 40 years, we're still coding in procedures in text files in a sequential programming model. That would suggest we didn't learn anything from this really fertile period in computer science. That would be a tragedy."
    29:40 "The real tragedy would be if people forgot that you could have new ideas about programming models in the first place."

  • @olio3301
    @olio3301 2 years ago +4

    A really incredible and inspiring experience, thank you Bret Victor.

  • @Coding_knight
    @Coding_knight 2 years ago +2

    It truly felt like I went back in time to the 1960s. Awesome presentation 🙌

  • @erajoj
    @erajoj 10 years ago +35

    Makes me happy to realize I'm not the only one thinking like this. I have been programming for at least 30 years and have, unsuccessfully, tried to find alternative ways to do it, since using text editors has always felt wrong to me. I have discussed this with plenty of colleagues over the years, but no one ever understood what I meant by graphical or building-block programming, so finally I just gave up. Even simple concepts like generative programming are hard to discuss. Maybe someday I will try again. :)

    • @LuckyKo
      @LuckyKo 10 years ago +2

      Try LabVIEW. It's easy for relatively simple programs, but graphical programming has its limits in terms of clutter management.

    • @erajoj
      @erajoj 10 years ago +1

      Lucky Lu Thanks for the tip. I did use LabVIEW around 1995, but I didn't like it very much then. It was slow and cumbersome. Maybe it has evolved.

    • @mikezooper
      @mikezooper 8 years ago +2

      +John Johansson Again, you have missed WordPress.org + plugins. That IS visual programming. Code is created under the hood without a web designer needing to do any coding. It's a visual interface.

    • @Infinifiction
      @Infinifiction 6 years ago +3

      try again! there are so many unexplored wildernesses and only a handful of explorers!

    • @jpratt8676
      @jpratt8676 5 years ago +2

      Sounds like (data) flow based programming

  • @billwayzata
    @billwayzata 6 years ago +43

    Many pens and no pocket protector. Quite a risk taker.

  • @vladimirleon2487
    @vladimirleon2487 2 years ago

    That was a spectacular talk... holy smokes, 49 years ago!! 2022 today. One thing is for sure: the message about creativity is... TIMELESS.

  • @Thebasicmaker
    @Thebasicmaker 6 years ago +3

    He is saying that there are other ways to use a computer: even if you use your fancy IDE to program, you're still doing sequential programming. He also says that in the past, computers were used to actually compute a way to solve problems instead of being told how to solve them. I think this is the most interesting part.

    • @jpratt8676
      @jpratt8676 5 years ago

      Programming is not all sequential; there's functional programming, and concurrency, and asynchronous approaches. There's really no reason to restrict yourself to straight-up sequential programming (though of course execution will happen in some sequence, as we experience the world via causality).
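
To make that contrast concrete, here is a rough Python sketch (the fetch coroutine and its one-second delay are made-up placeholders for real I/O): the sequential version awaits each task in turn, while the concurrent version lets all three run at once.

```python
import asyncio

async def fetch(name: str) -> str:
    # Placeholder for real I/O (network, disk); we just sleep for a second.
    await asyncio.sleep(1)
    return f"result of {name}"

async def sequential() -> list[str]:
    # One task at a time: roughly 3 seconds in total.
    return [await fetch("a"), await fetch("b"), await fetch("c")]

async def concurrent() -> list[str]:
    # All three tasks in flight at once: roughly 1 second in total.
    return await asyncio.gather(fetch("a"), fetch("b"), fetch("c"))

if __name__ == "__main__":
    print(asyncio.run(concurrent()))
```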

  • @WilliamDye-willdye
    @WilliamDye-willdye 6 years ago +35

    29:41 "The real tragedy would be if people forgot that you could have new ideas about programming models..."

  • @earthstick
    @earthstick 4 years ago +17

    I used two visual programming languages, HP-VEE and recently MapForce. The problem they have is that even the simplest of programs become incomprehensible diagrams.

    • @LowestofheDead
      @LowestofheDead 2 years ago +5

      That same problem existed in Text Programming back when it was just machine code. Because machine code had no blocks or For loops or encapsulation, any long program inevitably became a mess of spaghetti code and GOTOs.
      Visual PLs haven't had much research and are still in their infancy, and yet there's still encapsulation in tools like Unreal's Blueprints.
      Also Visual doesn't need to be the only way to model - it's important that we have many ways of representing computation so we can pick the best tool for the job.

  • @seeker.saylee
    @seeker.saylee 3 years ago +1

    Last few moments of the talk were truly beautiful!

  • @DoomRater
    @DoomRater 11 years ago +4

    As a self-taught coder for the most part, I made this EXACT mistake. I wrote a strange "assembler" that could only translate a single line of assembly code into machine language, and I never got around to writing my own full assembler. I saw it as a waste of bytes: I'd have two copies of every program. WOW. Now I see how idiotic I was not to build the assembler first. Debugging was such a chore that I eventually gave up on what should have been a simple program to write and test. And thus ended my self-taught assembly coding on the C64, because by then I was finally getting my hands on more powerful hardware.

  • @Chemaclass
    @Chemaclass 4 years ago +3

    A must watch for every developer nowadays.

  • @YoungManKlaus
    @YoungManKlaus 11 years ago +16

    Absolutely awesome presentation, and it gives you a shitload of things to think about (at least for me as a programmer). Some of the stuff is obviously still far away, especially the auto-communication between different systems, which is _really_ hard to realize, or even to state correctly as a problem.

  • @annsophiefans1472
    @annsophiefans1472 7 months ago

    Every computer science student on the planet should watch one of these retrofuturistic "Where we been / where we going" talks, every five years.

  • @bharasiva96
    @bharasiva96 3 months ago

    This resistance to new ways of programming also seems to be taking place with programming using LLMs as assistants, where you just ask the LLM for a piece of code in natural language and it gives you a working piece of code without you having to manually type it yourself. So many programmers today don't consider that to be "programming".

  • @paulopez78
    @paulopez78 8 years ago

    Inspiring talk. Right now I'm in the process of unlearning, after 15 years of coding, so that I can learn to code again, and that's an incredible experience.

    • @alexbroGellungaRunga
      @alexbroGellungaRunga 8 years ago +2

      I'm in the process of learning for the first time. So far it seems pretty daunting. Truthfully I don't know what I actually want to code for yet. I just know I want to.

    • @remotefaith
      @remotefaith 4 years ago +1

      @@alexbroGellungaRunga Are you still at it now? I'm just getting into it. Any advice?

  • @RonJohn63
    @RonJohn63 11 years ago +1

    In 1973, it was large organizations "doing" computing, and the things those organizations did were sequential: payroll, taxes, etc.

  • @yusefrussia172
    @yusefrussia172 2 years ago

    This guy is very visionary. Right now we're still dealing with declarative programming, concurrency, and async as hot topics.

    • @yusefrussia172
      @yusefrussia172 2 years ago

      Pwned. Bret Victor is not from the 70s.

  • @EternalDensity
    @EternalDensity 5 years ago +3

    Programming with flowcharts is not uncommon though it's more for kids learning the basics of programming or for non-programmers who need to attach a script to something. It's not actually suitable for 'serious' programming, I think.
    Or maybe we're just doing it wrong?

  • @mgangtv2157
    @mgangtv2157 6 years ago +2

    God bless everyone still waking up every day chasing their dreams. Salute.

  • @SatansSpatula
    @SatansSpatula 11 years ago +5

    This was an amazing talk, but I think that a large portion of the reason why these ideas haven't been prominent in the mainstream is that 1) they're computationally intensive (a Prolog hobby interpreter would grind away on my old 286 while Turbo Pascal blitzed out executables) and 2) because they were solving very limited cases. When you try to apply the same techniques to general cases, you discover a mountain of edge cases (which, related to #1, makes them unusable).

  • @gaspybapinga5291
    @gaspybapinga5291 8 months ago

    This guy was ahead of his time... already talking about AI straight up back in 2013. Wow!!

  • @alonsgab
    @alonsgab 11 years ago +3

    I thought exactly the same thing when the Smalltalk slide showed up. It is cool to have functions/methods organized, but in the end it is still a text file, and more often than not it works to have the whole file visible in the same context. But maybe that is also a programming paradigm in which bad sequential programming gets made.

    • @absurdengineering
      @absurdengineering 2 years ago +3

      Smalltalk's browser is a cool idea as long as it's just one of many views of your project. Once it becomes the only view, it's like viewing the stuff through a tiny window, and it's hard to get a good idea of how things interconnect. Smalltalk and its modern incarnation Pharo have become ossified in some ways. They are great ideas, but they hold on to their obsolete browser model, which is just unnecessarily constraining. Visually we can do way better than that nowadays.

  • @amigalemming
    @amigalemming 4 years ago +6

    Another reason for creativity in the early days of computers is certainly: There was no legacy of software to maintain.

  • @nickbarton3191
    @nickbarton3191 6 years ago +7

    Hashtag LoveTheOverheadProjector
    Why isn't anyone howling with laughter? Just heard the end remarks: no laughter, because it's all so depressingly true.

    • @amigalemming
      @amigalemming 4 years ago

      It took me a while to notice that the overhead projector does not even point to the screen.

    • @grdalenoort
      @grdalenoort 4 years ago +2

      @@amigalemming It's a "modern" one with a camera :-) At the time when VGA/XGA projectors became common, there were still people insisting on using an overhead (slides they already had, the flow of their presentation), so the market came up with overhead-projector cameras.

  • @DoomRater
    @DoomRater 11 years ago +6

    As I watch this video I see more and more of how all of this hooks into making stuff in Minecraft. Although the backend blocks are still written with APIs for the most part, there are some things like the Extra Utilities watering can that work on, say, Magical Crops without Extra Utilities knowing anything about how Magical Crops works, just that it's a plant and that means the can should accelerate its growth. As an end user building a system from different blocks in Minecraft, as long as you give something an inventory the hopper knows about it; the end user just sets up three hoppers on a furnace, feeds them with fuel and items, and gets processed stuff in a chest. Something like that. Visually building systems that just work together. It's not a perfect model, but it's as close to a system as I can come up with off the top of my head that points to everything he's talking about so far. Again, self-taught programmer, so I'm probably missing obvious implementations that have done everything he's talking about at once.

    • @roberthales3733
      @roberthales3733 10 years ago

      This. Code should be designed in such a way that standards exist, so that implementing features works seamlessly without some kind of buggy one-time patch.

    • @HMijailAntonQuiles
      @HMijailAntonQuiles 6 years ago +3

      I have no idea about Minecraft, so I might be wrong, but what you describe sounds to me rather like "duck typing" - which is probably off-mark (because it's still an API) but I'm guessing it *is* actually an interesting counter to a *strict* API.
      In any case, kudos for thinking about it as a self-taught programmer. It's too easy to find all kinds of "certified" programmers, even young ones, who are just dead set in their ways and think there are just no alternatives: the dogma Bret mentioned. They wouldn't even try to understand. And this probably is a plus of being self-taught.

  • @edgeeffect
    @edgeeffect 2 years ago +2

    This is one of the best software development talks I've ever seen..... but those gel pens in his pocket are too modern.

  • @rangermauve
    @rangermauve 11 years ago +1

    I love all those pens in his shirt. Feels like he could make useful diagrams whenever he needs to.

  • @margaman6021
    @margaman6021 8 years ago +37

    Just when you think you're a master at something, you find out you know absolutely nothing.

  • @kernelsoe
    @kernelsoe 2 years ago

    It's 2022 and I'll visit this talk again after 5 years -> 2027.

  • @SarahC2
    @SarahC2 10 years ago +16

    Inspiring.

  • @ShadSterling
    @ShadSterling 1 year ago

    I've been thinking the CS program I got a degree from could be improved by starting out with parallel courses: an intro to programming (similar to what it had) and an intro to doing things with bits (data encoding & manipulation), so that by the end of the first semester you get a sense of how computers work (manipulating encoded bits) and how you can compose that into useful programs. Now I think the intro programming course should begin by introducing programming as specifying computations, which can be done sequentially, functionally, logically, etc. It can still mostly focus on the usual sequential programming, but it should ensure that students know that the most common paradigm is only one of many that are possible.
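
One way to show students that the usual paradigm is only one option is to write the same tiny computation twice. A hypothetical Python illustration (mine, not the commenter's): first as sequential steps, then as a functional expression.

```python
from functools import reduce

nums = [1, 2, 3, 4, 5, 6]

# Sequential / imperative: spell out *how* to compute it, step by step.
total = 0
for n in nums:
    if n % 2 == 0:
        total += n * n

# Functional: describe *what* the result is, as a composition of transformations.
total_fp = reduce(lambda acc, sq: acc + sq,
                  (n * n for n in nums if n % 2 == 0), 0)

assert total == total_fp == 56  # sum of squares of the even numbers
```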

  • @lukaszstocki6998
    @lukaszstocki6998 3 years ago

    #1 talk ever made... and that includes the future too!

  • @SeanJMay
    @SeanJMay 11 years ago +3

    Because he's giving the talk from the viewpoint of a programmer in the '70s. ARPANet.

  • @Jon31337
    @Jon31337 11 years ago +1

    When an artist rigs a 3D model, that's programming visually. It works somewhat like the Sketchpad shown in the talk.

  • @smallsnippets
    @smallsnippets 9 months ago

    Nice talk. But I would recommend you look up the accompanying slides page and look out for the videos he references there (Alan Kay, for example). Ivan Sutherland's presentation is shown there; it's from 1962 and impressive. (The other things too, but Sketchpad is outstanding, imo.)

  • @curufinw
    @curufinw 10 years ago +4

    A transcript of this talk is here: glamour-and-discourse.blogspot.com/p/the-future-of-programming-bret-victor.html

  • @petrgolich8055
    @petrgolich8055 8 years ago +1

    Binary code is still present, as is all the old stuff. But each higher-level programming language brings a bigger abstraction. Almost everything presented here as the future of programming has been in use for over twenty years: Lisp, Ruby, finite automata, Petri nets, LabVIEW, Simulink, genetic algorithms, neural networks, UML, FPGAs, the unified shader model, OOP, linear bounded automata. All of these are in use now, but very few people can use and understand them properly. All the layers remain, from binary up to the lambda calculus or pi calculus, and a higher layer cannot exist without the next lower one.

  • @amigalemming
    @amigalemming 4 years ago +4

    I like constraint programming in geometry. Computer: Please draw seven red lines, all perpendicular, some with green ink, some with transparent ink, and one in the form of a kitten!

  • @tinkeringabout1947
    @tinkeringabout1947 11 months ago +2

    Hi everyone, I'm from ITMO University. We all know you thanks to a wonderful course called Computer Architecture. The whole university has a lot of fun watching your lectures; the main thing is to keep releasing them more often. Thank you!!!

  • @ximono
    @ximono 10 months ago

    26:02 "If it's not threads and locks, then what's going to work?"
    The actor model is a very elegant solution, but you can go even further. Within each actor, you can use a functional single-assignment language that makes the most out of an MPPA. And that did exist at the time: Larry Tesler and Horace Enea's Compel from 1968. Programming could have been so different if we had adopted these ideas half a century ago. Because the ideas were there!
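
For readers unfamiliar with the actor model mentioned here, a deliberately tiny sketch in Python (not Compel or Erlang, and not how the talk presents it): each actor owns its private state and is reached only through messages, so no locks are involved.

```python
import queue
import threading

def counter_actor(inbox: queue.Queue, outbox: queue.Queue) -> None:
    # An "actor": private state, reacts only to messages, shares nothing.
    count = 0
    while True:
        msg = inbox.get()
        if msg == "stop":
            outbox.put(("final", count))
            return
        count += msg  # no lock needed: only this thread ever touches `count`

inbox: queue.Queue = queue.Queue()
outbox: queue.Queue = queue.Queue()
threading.Thread(target=counter_actor, args=(inbox, outbox)).start()

for n in (1, 2, 3):
    inbox.put(n)        # communicate by sending messages...
inbox.put("stop")
print(outbox.get())     # ...and by receiving them: prints ('final', 6)
```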

  • @MaximoPower2024
    @MaximoPower2024 1 year ago

    It's interesting because the concept that computer programs could fend for themselves to achieve their assigned goals really didn't make sense before 2022.

  • @flyingsayon
    @flyingsayon 3 years ago

    A lot of commenters point out that the reason these ideas are not being used is that "they have limited applicability". Like how Prolog had procedural features added to it. But this is not a good argument: the better an instrument is, the more specialized it is. There is no point trying to cut trees with a hammer or drive screws with an axe. The point of having specialized instruments like Prolog is to use a specific instrument for each problem, to build simple and robust solutions.
    I think it's rather an argument *against* making silver bullets out of instruments like Prolog. Once you get more expertise in IT, you would oppose trying to make a universal tool out of Python as well.

  • @jimsomerville3924
    @jimsomerville3924 11 years ago

    I agree, we are probably somewhat in the infancy of his goals. The examples I thought of when listening:
    1. WYSIWYGs, Content Management Systems
    2. Test-driven development (sort of) - and abstraction in general; think this has the biggest room for growth but is also rather pie-in-the-sky
    3. Lego Mindstorms, UML-to-code and ERD-to-DDL generators, Object Relational Mappers
    4. Hadoop, GPU processing

  • @annsophiefans1472
    @annsophiefans1472 7 months ago

    At 24:00 as he talks about the fact that while the CPU transistors are furiously churning away the memory transistors are just sitting there doing next to nothing, I actually started getting a little scared.

  • @icantseethis
    @icantseethis 11 years ago

    This is great. This video is like talking to my dad, who had to write his own assembler in uni...

  • @Lucy-Luc-Lu-L
    @Lucy-Luc-Lu-L 3 years ago +10

    When I watched this for the first time some 7 years ago, I was inspired. Now, a few years out of school, I'm a bit more realistic. These are all nice ideas, but in most cases there is a very good reason why some ideas didn't get adopted. The main takeaway from this video is that you should adopt new ideas and be open to progress.

    • @joonasfi
      @joonasfi 3 years ago +9

      You'll get there. I'm pretty far into programming expertise and I see these ideas as pretty much spot-on. Moore's law has "ended" (or is getting there) and we'll seriously have to start getting into parallel computing. Having text-only visibility into a codebase is also outdated and I think there'll be some serious innovation on this soon.

    • @gomesroney
      @gomesroney 3 years ago +2

      @@joonasfi here's hoping!

    • @peppigue
      @peppigue 3 years ago +1

      @@joonasfi so back to visual basic then

  • @jimelihel
    @jimelihel 5 years ago +4

    One little slip-up. He forgot that few if any in 1973 would yet have heard about classes and methods. He gave those terms without explanation. But who cares. Great idea for a talk!

  • @Tatarize
    @Tatarize 11 years ago +3

    We tried most of those ideas. There are real reasons why we don't use them. As somebody who has coded Prolog, you understand that even with a fleshed-out language it's hugely limited. There's a reason a lot of procedural things were added to it. And to the extent that it makes sense, we actually do use it: MySQL, for example, is basically goals.
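
The "goals" point is easiest to see side by side. A small illustrative Python sketch with made-up data: the first version spells out the steps, while the second, much like a SQL SELECT, only states the result that is wanted.

```python
people = [
    {"name": "Ada", "age": 36},
    {"name": "Grace", "age": 85},
    {"name": "Edsger", "age": 72},
]

# Procedural: say *how* to find the answer, step by step.
names = []
for p in people:
    if p["age"] > 70:
        names.append(p["name"])

# Goal/declarative style, roughly what a SQL SELECT expresses: state *what* you want.
names_decl = [p["name"] for p in people if p["age"] > 70]

assert names == names_decl == ["Grace", "Edsger"]
```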

  • @QberryShortcake
    @QberryShortcake 11 years ago +2

    That's the sad part. He isn't talking about the future, he's talking about the past. Most of those languages and programming ideas are defunct, some are still being explored. :/

  • @johnridpath251
    @johnridpath251 11 years ago +2

    It is amazing that many of the things that he mentioned are already being done with National Instruments LabVIEW graphical programming language. It uses a data flow programming model implemented within a graphical programming environment. It also natively handles parallel processing tasks just by placing your code in separate structures.

    • @absurdengineering
      @absurdengineering 2 years ago +1

      It's also orders of magnitude outside the budgets of most of us, and it's a closed box with relatively little innovation happening compared to mainstream programming environments. So it's cool and all, except it looked the same 15 years ago as far as how typical users use it, and it's hidden behind a paywall that automatically relegates it to "also ran" status. It's to the point where if a 3rd party wants to interface their stuff with it, they have to buy a license, just to make the environment more appealing to people who actually want to use it. Whatever model they have in mind has been engineered to keep it a tightly walled garden that makes Apple's iOS look like a free-for-all.

    • @davidmiller9485
      @davidmiller9485 11 months ago

      You are so wrong, bro. LabVIEW was released in '86; these ideas are all from before 1973 (which is the year the talk is imitating as its setting). You're way off on the timeline.

  • @ChristosRym
    @ChristosRym 5 years ago +1

    I am 9 minutes in and this seems like it was recorded in the '80s. I mean, the number of pencils the guy has in his front pocket, plus the projector... and also what he is saying and how he is saying it!
    But the video is in color and apparently it's from 2013.
    Is this cosplaying?!

  • @MikeHenken
    @MikeHenken 11 years ago

    Not if it does not use the codebase/patterns, but rather gains influence based upon them. Ideally, the computer gains its own knowledge by analyzing a codebase or pattern. Once that knowledge is gained, the necessary components of the codebase/pattern are stored (as well as a pointer to the original) in the computer's implementation of LTM.
    From there, the computer can make its own judgments about how to implement its newly found knowledge.

  • @HarryWongCIO
    @HarryWongCIO 10 years ago

    And here we are. In 1979. Look ahead to the future of programming

  • @bulelula
    @bulelula 10 years ago +2

    Excellent. Worth watching.

  • @adicandra9940
    @adicandra9940 1 year ago +2

    This needs to be mandatory viewing for every computer science student and programmer. Truly eye-opening, especially at this time, when GPT-4 is challenging how we write programs.
    The binary-to-FORTRAN part hits closer to home now than ever; to think that many people (including myself) can't see outside the box. We're too caught up in learning about the next new JavaScript framework, completely oblivious to a totally radical way of interacting with computers. GPT-4 and the AI field really caught me off guard.
    And about threads and deadlocks: I think we still do that now, right? Is there a better way that's already implemented in real-world apps?
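
On the closing question: threads and locks are indeed still the default in most mainstream languages. A minimal Python sketch (mine, not from the talk) of the classic hazard and the usual discipline against it, consistent lock ordering; message passing, as in the actor-model comment further up, is the other widely deployed alternative.

```python
import threading

lock_a = threading.Lock()
lock_b = threading.Lock()

def risky() -> None:
    # Hazard: if another thread grabs lock_b first and then waits for lock_a,
    # both threads wait on each other forever: a deadlock.
    with lock_b:
        with lock_a:
            pass

def disciplined() -> None:
    # Common real-world mitigation: every thread acquires locks in the same
    # global order (a first, then b), so a circular wait can never form.
    with lock_a:
        with lock_b:
            pass

threads = [threading.Thread(target=disciplined) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print("no deadlock with consistent lock ordering")
```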

  • @werewasyo
    @werewasyo 11 years ago

    Furthermore, as a guy who coded webpages using text editors, I had similar feelings of self-obsolescence when I saw the visual WYSIWYG web design programs or the online web templates allowing people to create webpages without knowing how to write code.

  • @666Tomato666
    @666Tomato666 11 years ago

    oh that dripping optimism, you should share some more of it

  • @DioneDomingo
    @DioneDomingo 1 year ago

    Who else wants this re-uploaded in 4:3 format?

  • @indavarapuaneesh2871
    @indavarapuaneesh2871 2 years ago

    "accept that you don't know what you're doing and the you're free".

  • @palpatin7
    @palpatin7 11 years ago +2

    Still, auto-integration would change the face of programming more than having parallel processors, but I agree that it seems really far off.

  • @CGProd-ov9mk
    @CGProd-ov9mk 1 year ago +1

    So he's telling us we've been doing it wrong for 50 years. That's kinda depressing.
    At 25:10 isn't that basically the Cell architecture from the PS3?

  • @BryonLape
    @BryonLape 3 years ago

    Unfortunately, we spent decades arguing over the Goto statement.

  • @RonJohn63
    @RonJohn63 11 years ago

    Some things just can't be fixed.
    With airplanes, for instance, there's a reason why civilian planes aren't flying at Mach 2 from NYC to London, and the SR-71 was retired: aluminum softens, titanium is *expensive* and air doesn't get out of the way fast enough.
    In the case of CPUs, when the number of nodes gets too high (somewhere around 8, IIRC) it's just not physically possible to orient the CPUs on PCBs in such a way that they can chat fast enough. Latency skyrockets, and performance tanks.

  • @noshpie
    @noshpie 10 years ago +1

    First you must unlearn everything before you can know anything... wait, what?
    You must master your ideas or your ideas will become your master... I can keep going.

  • @Kenbomp
    @Kenbomp 4 years ago

    This is pretty much Smalltalk objects and messages. It shows how important the communication of tech really is, and how we missed the ball with Smalltalk and AI. AI was ignored for 50 years before now.

  • @lyalos
    @lyalos 4 years ago +3

    Could you please upload the slides?

  • @mortenbrodersen8664
    @mortenbrodersen8664 10 years ago

    A BRILLIANT talk.

  • @YoungManKlaus
    @YoungManKlaus 11 years ago +1

    I work as a programmer, and I can tell you that communication is the worst problem ever, be it with software (e.g. a shit API) or with the people who wrote said software (useless documentation).
    Anyhow, you assume that a computer will always work by executing what you tell it to. If you have a sort of AI to which you can give an agenda, the whole approach looks different.
    But, arguably, we need tons of features first (like actually "understanding" tasks, which we are now veeery slowly getting closer to achieving).

  • @mmille10
    @mmille10 11 years ago +1

    Great presentation! I got a more concrete sense of what Alan Kay has been saying all these years re. computing.

  • @alonsgab
    @alonsgab 11 years ago

    This is exactly the way of thinking this video is trying to get rid of. It might not be possible to fix now; that doesn't mean it isn't possible at all.
    In the case of CPUs, imagine if all engineers had had the same mindset when they were using vacuum-tube computers: transistor-based CPUs would never have existed, because it just wasn't possible to do whatever with tubes, and "it can't be fixed".
    As Victor said, getting rid of dogmas and paradigms is the way to invention.

  •  6 years ago +4

    20:25 - "I'm totally confident that in 40 years we won't be writing code in text files."

  • @EternalDensity
    @EternalDensity 5 years ago +2

    Not sure if his alternative to APIs is really feasible.

  • @YoungManKlaus
    @YoungManKlaus 11 years ago +2

    Dude, I work on an API layer as a programmer, and I can tell you that communication is the worst problem ever, be it with software (e.g. because time definitions are messed up) or with the people who wrote said software (to get useful documentation).
    Anyhow, you assume that a computer will always work by executing instructions that you tell it to execute. If you have a sort of AI/agent to which you can give an agenda, the whole approach looks different (like regex).
    YouTube comment length sucks :P

  • @alonsgab
    @alonsgab 11 years ago +1

    I think that the level of creativeness (if that word exists) of those inventive persons is not because of drugs, but because they were radicals and rebels. Thus they were thinking outside the box, hated authority, and did drugs.

  • @MihailProg
    @MihailProg 7 years ago

    I love this presentation.

  • @ItaloMaiaTM
    @ItaloMaiaTM 10 years ago +1

    Amazing presentation!

  • @imjeffvader
    @imjeffvader 11 years ago

    Couldn't agree more.
    And this is why we don't call the developers "connectors" (as of yet).

  • @nokcha75
    @nokcha75 9 months ago

    Bret Victor is a genius

  • @imjeffvader
    @imjeffvader 11 years ago

    Parallelism, in its current and envisioned form, is just as awesome as marketing.

  • @TheNewton
    @TheNewton 4 years ago +1

    31:22 ~ "...they know what programming is, this is programming, that's not programming"
    in reference to assembly coders seeing FORTRAN code

  • @franleplant
    @franleplant 11 years ago

    Ahhh, so it is Conceptual, it is getting better and better

  • @RonJohn63
    @RonJohn63 11 years ago

    Or maybe I'm just old enough to remember the previous successes and failures, why they succeeded and failed, and what compromises need to be made so that h/w can be manufactured without costing what a Cray-1 did back in the day because of its esoteric design.