And this... is why I still use Winamp from 20 years ago to play music. It opens in milliseconds, uses less than 10 MB of RAM, and never crashes on me.
and it really whips the llama's ass
If someone moves to a new, larger apartment or house, they tend to start buying more things to fill all the new space they suddenly have. On the surface it looks better, but it also makes cleaning way harder: now you have way more area and more things to dust and wipe, and you tend to generate more trash, because all of a sudden clearing old things out of the fridge fills two bags of garbage instead of one. More space sure is nice, and it opens up more opportunities to do things in it, but it makes the mundane things more time consuming.
One of the things I hate about renting apartments is that they are usually refurbished with useless cabinets to the point where you can't even find an unobstructed wall to do, say, back exercises against. My dream is a vast empty space of the simplest shape possible, preferably painted white, with only a good bed, a table, an executive chair, a clothing rack, and MAYBE maybe some gym equipment and room for IKEA cubes. That would be the life, yeah.
@@Bloodrammer lol it's a bit sad, but at the same time it's exactly what I would want. It doesn't even have to be that big. Of course things change dramatically if you have a family, though.
In the '80s and '90s, one often saw amazing examples of coding done on absurdly underpowered, toy-like hardware. Today, one sees terrible examples of coding done on the most amazing machinery ever produced.
Couldn't have said it much better myself. Today's programmers should be on a wall of shame: with all the power and resources at their fingertips, we haven't really advanced much at all, and the requirements to do the most basic things are ridiculous. There doesn't seem to be any need or want to create good code; it's more about how quickly they can do things, and the end result is buggy software that was written using other software and is, as such, very inefficient.
@@dave24-73 I started programming in the late 90s and people would write SQL directly in their views in ASP or PHP. It was pretty atrocious and a security nightmare. People have always written hot garbage. There was never some fairytale era where everyone wrote awesome code.
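For anyone who never saw that era: here's a made-up little C++ sketch (using SQLite's C API as a stand-in for those ASP/PHP-plus-database setups; the table and function names are invented) of the difference between gluing user input into a query string and binding it as a parameter. The first style is the "security nightmare" being described.

```cpp
#include <sqlite3.h>
#include <cstdio>
#include <string>

// BAD: user input is concatenated straight into the statement.
// Passing  name = "x'; DROP TABLE users; --"  changes the query itself.
void find_user_unsafe(sqlite3 *db, const std::string &name) {
    std::string sql = "SELECT id FROM users WHERE name = '" + name + "';";
    sqlite3_exec(db, sql.c_str(), nullptr, nullptr, nullptr);
}

// BETTER: the input is bound as a value, so it can never become SQL syntax.
void find_user_safe(sqlite3 *db, const std::string &name) {
    sqlite3_stmt *stmt = nullptr;
    sqlite3_prepare_v2(db, "SELECT id FROM users WHERE name = ?;", -1, &stmt, nullptr);
    sqlite3_bind_text(stmt, 1, name.c_str(), -1, SQLITE_TRANSIENT);
    while (sqlite3_step(stmt) == SQLITE_ROW)
        std::printf("id = %d\n", sqlite3_column_int(stmt, 0));
    sqlite3_finalize(stmt);
}
```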
@@jackdanielson1997 Code has more security holes in it now than ever before, because common libraries are being used that can be exploited, versus hitting the hardware directly. I'm talking about the people who used to hit the hardware directly, usually programming in assembly. They had greater discipline and huge skills that are lacking now. I understand that, for development time and the fact that code is now shared, assembly isn't practical, but some of those people were so skilled, and it's rare to see that in modern computing. So there is some context there.
@@dave24-73 Code hitting hardware directly.
Probably assembly language, I guess.
JavaScript was supposed to be small, made in 10 days for Netscape. Now the world uses that garbage mess.
Libraries are better imo; even if one is bad, at least most are updated by senior devs.
Couldn’t agree more
As a programmer from the 90s, one of the things I've observed as I get older is that programmers these days come at it from a different angle. What I mean by that is that in the 90s you had to understand how computers worked under the hood. Nowadays there's a lot more abstraction layered on top, which means you have to work a lot harder if you want to understand how things work at the lower layers.
That's why I learn assembler. First, assembler is so much fun once you get the hang of it; also, you learn stuff that's actually relevant to a wider spectrum. What you learn about C++ is relevant to the whole spectrum of languages, but assembler you learn for the whole electronics spectrum.
Ever since I first started programming in '97 as a 12-year-old, I seem to remember that the default behavior has been: download a library, link it, and use its commands.
The issue is not just that programmers nowadays don't care about what's under the hood, performance, simplicity, and all that jazz. The bigger issue is that there is a dominant culture of assuming everyone should code in the modern ways, and of discouraging those who don't follow the trends. You are sometimes assumed not to be a "professional" programmer if you care about performance or don't want to add too many dependencies, or if you don't follow a lot of the modern paradigms that everyone else does. There are mantras thrown at you like "are you gonna re-invent the wheel?" and "premature optimization is the root of all evil". I don't have anything against these mantras, as there have been occasions where they made sense (like micro-optimizing insignificant parts of the code), but they're used as "feel-good" mantras, to say "we don't need to care about these things anymore, and anyone who does is living in the 90s". It's a cargo cult of approved methods and of what's considered deprecated, not a real understanding of which tools are needed to solve the problem, or of when performance matters, or of whether it's a good idea to always add complexity, etc.
I love your post here, you sum it up perfectly.
Also, what bothers me is the push to store things in the cloud and for everything to be public, open-source, and community-centric with required online two-factor authentication, rather than private and local with no logins required.
The "premature optimization" one is so terribly wrong even. It comes from the old days, where the major bottleneck to performance was the limited processing power. You had comparatively fast memory access, so you could just think about the logic of the program and then go in and optimize at a low level the "inner loop" code to gain performance. This is no longer the case. Now performance is so intrinsically tied to cache hits, memory transfers and datarates. This means that any possible optimizations are architechtural, you have to change the very design of the program and the layout of the data in order to gain performance. This means that the right time to think about performance is actually right at the beginning, because decisions you make then might make it impossible to optimize later.
Indeed. I just spent the last two years being mentally bashed around by senior software engineers at a company because they wanted me to do things their way. Nevermind that I have been programming for 26 years. Through pull request reviews and shaming morning meeting sessions, they forced me to change the way I code.
Programming was my passion for 26 years and now I feel like I give less and less of a fuck and just want to do something else. Even though I have recently left that company and don't have to deal with those people, I have now lost faith in my ability to program because I am constantly second-guessing what I code. They have successfully broken me.
I feel like a lot of this stems from capitalism, basically. No company wants to pay you for doing something perfectly when you can get an 80% MVP in a fraction of the time, and that's often good enough for the marketing team. But companies also leverage senior engineers to notice things like performance regressions and to contain dependency hell, so it kinda balances itself out IMO. It's just not profitable for any company to write perfect software.
@@darkwoodmovies that’s not a strictly capitalist thing. That’s pragmatism. There are diminishing returns and opportunity costs to increasingly long development times. You’ll never get to “100%”.
Adding if statements is something I used to do, until I started programming professionally on a team and all of a sudden if statements were evil. I had to use an abstract class and multiple subclasses with polymorphism. They call it clean code. I call it overkill.
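For anyone who hasn't sat through one of those reviews, here's a hypothetical minimal version of both styles (the shape types and names are made up, not from any real codebase):

```cpp
// The "evil" version: one plain function with an if.
enum class Kind { Circle, Square };
struct Shape { Kind kind; double r; double side; };

double area_if(const Shape &s) {
    if (s.kind == Kind::Circle) return 3.14159265358979 * s.r * s.r;
    else                        return s.side * s.side;
}

// The "clean code" version: an abstract base class plus one subclass per case.
struct IShape {
    virtual ~IShape() = default;
    virtual double area() const = 0;
};
struct Circle : IShape {
    double r = 0.0;
    double area() const override { return 3.14159265358979 * r * r; }
};
struct Square : IShape {
    double side = 0.0;
    double area() const override { return side * side; }
};
```

Both compute the same areas; the second buys open-ended extensibility at the price of more types and virtual dispatch, which is exactly the overkill being complained about when there are only ever two cases.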
In Unix, I wrote C, HTML, and Perl in vi. On DOS/Windows, Quick C and then Visual C++. And compiling is MUCH faster today.
As well as the technical depth he mentioned, I think another important factor is that there are far more software developers now than there used to be. Programming used to be very, very difficult, so the barrier to entry filtered out a lot of the subpar coders. People like John Backus and Donald Knuth used to make up a much larger portion of the talent pool. I believe this is also why functional programmers are sought after: since FP is arguably a lot harder, the programmers are often of a higher caliber. As a child of the 90s, I remember software being much buggier and slower than it is today. I very "fondly" remember Windows 95 crashing on a daily basis for no apparent reason.
I'm a relatively new developer. Though I've been coding since I was young, I've only recently been getting into more mainstream programming. The state of software, in my opinion, is abysmal for someone starting out. Systems like React are so abstracted from basic concepts. It is what it is, but it's discouraged me for sure, and I'm considering going into a more data-oriented computer science role because of it.
Here's what the greybeards taught me when I was coming up:
-Don't learn tech, learn fundamentals. Data structures, algos, etc. Today's hotness is tomorrow's old news. If you're just in it for the money: the faster you pivot, the faster you get paid more. If you're in it for job security: the easier it is to transition to new technologies, the harder it is to become obsolete. Always pick the right tool for the right job. Dogma is the road to disaster.
-Build your own fundamental libraries from those data structures in the language you prefer (there's a small sketch of what I mean after this list). Sometimes your implementation will be better or worse than what the standard library provides. Learn how to gauge that and improve on what you have.
-It doesn't get easier, you can just do more. You'll notice patterns, but there will always be blockers and hurdles to climb. You don't get paid just for the work, but also for what you know and can break down.
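As a rough illustration of the second point, here's the kind of exercise I mean: a stripped-down growable array in C++, a toy stand-in for std::vector (names are my own; copy/move handling is omitted to keep the sketch short).

```cpp
#include <cstddef>
#include <utility>

// Toy growable array: the "fundamental library" exercise, not a std::vector replacement.
template <typename T>
class Vec {
    T *data_ = nullptr;
    std::size_t size_ = 0, cap_ = 0;
public:
    ~Vec() { delete[] data_; }
    std::size_t size() const { return size_; }
    T &operator[](std::size_t i) { return data_[i]; }

    void push_back(T value) {
        if (size_ == cap_) {                            // grow geometrically so that
            std::size_t new_cap = cap_ ? cap_ * 2 : 4;  // push_back is amortized O(1)
            T *bigger = new T[new_cap];
            for (std::size_t i = 0; i < size_; ++i) bigger[i] = std::move(data_[i]);
            delete[] data_;
            data_ = bigger;
            cap_ = new_cap;
        }
        data_[size_++] = std::move(value);
    }
};
```

Writing something like this, then comparing it against std::vector under a profiler, teaches you exactly what the standard library is doing for you and how to gauge your own work against it.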
@@Xerophun Thank you for the advice. I've found building my own implementations of more abstract technologies to be a really effective tool for me; it allows me to understand those technologies more intuitively when I'm required to use them. I've been focusing on the fundamentals lately, and if you have any recommendations for books I'd love to hear them.
@@Marcus-Lim Depends on what domain you're targeting. My recommendations for building web apps would be different than for building graphical applications.
This may be my bias shining through, but having a firm grasp of C and how to implement patterns in it hasn't failed me, but it was my first language and there might be a tinge of nostalgia wrapped up in that.
Google has a good site that acts as a starting point for algorithms. Outside that, it depends on what field you're looking to move into.
I agree with you. I'm learning to become a systems-level programmer, so I mainly work with lower-level languages like C/C++, and you would think that day-to-day programming in those languages is already hard enough that you would naively expect the rest of the workflow to be quite straightforward, but no. Unfortunately nobody is protected from complexity. When I started programming about a year ago, I was so overwhelmed by the sheer amount of stuff that you have to know and understand just to get started, and I'm not talking about things related to programming concepts or knowledge of your language; no, just the amount of stuff that you have to use if you want to be up to the standards of other "professionals": Git, make, CMake, preprocessor macros, dynamic linking, build systems, CI/CD, Docker, interop/ABI, how to set up your environment, Unix, shell, dotfiles, SSH/GPG, LSP, compilers, clang, gcc, pragma directives, compiler flags, cross-platform compilation, etc. etc. I could go on and on. My point being that it kind of baffles me that so many tools are so clunky yet expected, and on top of that, as if this wasn't enough, there are a billion languages that you are expected to at least understand, like JSON, YAML, XML, Markdown. When I started I naively thought you just needed to master a language and the underlying concepts of programming, Unix, Git, and a build system; turns out there are a billion tools that are simply expected nowadays.
That's why it's so unfortunate that the KISS principle hasn't really seen widespread adoption.
Software is such a strange field because the hardware is usually such overkill that nobody cares about doing things "well". That's how you end up compiling like 100 million lines of code from a hundred different libraries every time you need to run a "hello world" web app.
The answer was more complex than it needed to be.
So far this is kind of my headspace (or at least a current draft of it) on the state of programming and software. I'm basically at the point where I sense that something feels wrong, but I don't have wide enough expertise in the field to point in a particular direction or do anything about it, sadly. I'm trying to look at lower-level stuff, or at things like CollapseOS or the older operating systems, to get an idea, but to get anywhere near expert level where I could do anything about it might take me a good decade or two of constant education and work.
Initial point: I might be wrong, but it sounds like you might not have had an underlying CS education that taught first principles. If you like to learn and have the stomach for complex theory, perhaps look into Principles of Computer Hardware (4th Ed.) by Alan Clements. Then follow up with something like the ChibiAkumas YouTube channel for assembly language, and after that learn C (a fun way is Making 8-Bit Arcade Games in C by Stephen Hugg, but a better although harder way is to learn it by reading Kernighan and Ritchie's The C Programming Language (2nd Ed.)). Then you can abstract away via any other modern language, which very much depends on what your field of interest is.
General point: Such is the unfortunate byproduct of software engineering, and it's only going to get worse over time until there is either a sea change of the approach to the industry at large, or a knowledge-based limitation is reached.
Think of it like mathematics or physics when they first started out during the early years following their discovery, centuries or even millennia ago. As new areas/topics were theorised, tested, and implemented into the greater body of work, that body of work became more complex, and in many instances the new topics would require many or all of the first principles to be understood. Topics built upon topics built upon topics. To learn either subject now takes a considerable amount of time - decades.
Computer Science's growth has been exponential, both in terms of discovery and in speed of discovery. It also benefits tremendously from - and engineers suffer equally due to - significant abstraction. We risk becoming so far abstracted from the first principles of computing that if some or other infrastructural catastrophe were to happen in the not-too-distant future, we would struggle to replicate what we lost. There is so much - too much - for an individual to ever learn, especially as in many cases programming is merely a tool used on top of another base subject. My headspace is such that I feel like I need to accrue a huge amount of knowledge to be versatile in an industry that refuses to regulate/standardise/consolidate/slow down, not just in terms of new languages and paradigms but in terms of what programming is actually used for.
To me it feels like being cut loose into a void of endless potential but without any direction whatsoever.
@@atheosmachina Thanks for the advice, I didn't have a proper CS education, mostly a computer technician college course. I'm still young and trying to get a better understanding of things, been playing around with C and later Rust for the last 6-8 months, and learned a couple of things from it. I still got a long ways to go tho, school did not prep me for this amount of head banging lol
@@andrewg.2996 The great thing is you're young, lots of time to learn things! I started much later and find it far more difficult to try to keep up. Best thing to do is to find an area of CS you really like, look into how future-proof it'll be in the age of AI, and become really good at it so you're indispensable, building your knowledge around that area as you improve. It won't be an easy or quick road but if you like to learn, you can't lose 👍🏻
I suppose that's why game consoles were designed. The Operating System of a game console is so simple because of the requirements imposed by societal expectations and the medium. You are not supposed to edit text, sound, image or mesh files in a game console, just play games, so all the code is supposed to do is entertain the user, not solve problems through software, like in the case of generic OSs for Personal Computers.
Game consoles originally didn't even have operating systems!
Well, nowadays you can browse the internet, watch movies, join voice calls, and even livestream from your game console!
@@jorgevillarreal2245 And there's a price to pay for that.
My biggest problem is with technology being introduced for the sake of the technology, and not to solve a business problem. I have spent well over 50% of my time going through the various processes in place because the development environment was built not to solve a business problem, but so someone could report to their boss that they "enhanced" the toolset being used. At a bad organization this can quickly get out of control, with programmers struggling to learn the technology that keeps getting added on and wasting their time on it instead of developing business solutions.
In a work environment business solutions should be the goal. But sometimes you just want to program fun stuff, a.k.a. open source lol.
Technology for the sake of technology is like all of CS grad programs in a nutshell lol. It's like you're solving the hardest problems in the world for absolutely no practical purpose.
So true Mr. Blow!
My Visual Studio 2022 is the laggiest and slowest VS version I've used to date. Once I got tired of it, I tried going offline, and it instantly got better. Apparently VS is continuously communicating online without me ever asking for it.
It was A LOT more elitist back then, like a guild almost. It was VERY hard just to find out what you needed to learn, just a starting place. Most "solutions" just led you down a blind alley...
In 1995 there were three computer platforms that consumers used and that you could write code for:
1. Win32s (almost all the software consumers used)
2. DOS (mostly games, because Windows still sucked at gaming)
3. Apple System 7 (not competitive at the time)
Because of that, most of the leanest software designs just didn't care about portability while Wintel was so strong. It was only after the mid-90s that developers started to pay more attention to portability, and later the non-portable old software rotted away. Today's applications can run on every device, with all operating systems, hardware architectures, and screen sizes, so something has changed.
On top of that, people had the expectation that when they bought a new computer, the software they bought previously would still work, even though it was distributed in binary form.
How that works in practice is that the platform drags along all the code from the platforms that came before. Every new version just adds code, because removing code causes applications to break. The same thing happens when a word processor changes its file format: it has to drag along the code for the older versions.
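A hypothetical C++ sketch of what "dragging along the older version's code" tends to look like in practice (the format, version numbers, and field names are invented): every reader that ever shipped has to stay, or old files stop opening.

```cpp
#include <cstdint>
#include <istream>
#include <stdexcept>
#include <string>

struct Document { std::string title; std::string body; };

// Each versioned reader lives on forever; shown here as trivial stubs.
Document load_v1(std::istream &in) {            // 1995 format: title only
    Document d; std::getline(in, d.title); return d;
}
Document load_v2(std::istream &in) {            // 1998 format: adds a body
    Document d; std::getline(in, d.title); std::getline(in, d.body); return d;
}
Document load_v3(std::istream &in) {            // current format
    return load_v2(in);                         // happens to read like v2 here
}

Document load(std::istream &in) {
    std::uint32_t version = 0;
    in.read(reinterpret_cast<char *>(&version), sizeof version);
    switch (version) {
        case 1: return load_v1(in);   // can never be deleted...
        case 2: return load_v2(in);   // ...without breaking users' old files
        case 3: return load_v3(in);
        default: throw std::runtime_error("unknown format version");
    }
}
```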
The harder we stretch hardware, the more features we want and the harder the job of a dev is.
There's also the sin of gluttony, in the sense that we get really lazy about resource usage because there is so much of it, and the software suffers massive bloat and waste. For example: Electron.
Jonathan looks like a buzzed cut/short hair American version of Dave Murray from Iron Maiden. If you don't know who that is, google him. His voice even sounds similar, but with an American accent!
One thing is the unnecessary complexity; the other, to my mind, is that creating a good, clean architecture is starting to be a luxury, unfortunately. If you want to create a clean and scalable architecture, other companies will outrun you, and that's something many companies can't afford. What I experience and hear from my friends in IT on a daily basis is that you have to put in a lot of effort and nearly fight with your employer for them to allow you to do your job well. Very often I hear of and see programmers being forced to use a very poorly written 'ready-made' solution for something that should be written internally. We often joke that our employers would rather we not write a single line of code if that were possible. It's a big shame. But unfortunately companies make decisions mainly based on what will maximize their profit. They often do not care about the quality of the product at all: if it kinda works, let's just push it out as fast as possible. We use broken solutions that were built using other broken solutions, all of them created in this fashion, so no wonder the software is getting unnecessarily complex and broken.
Compiling takes longer now? What? I can compile a C++ app in a few minutes that used to take hours in 1995.
I expect he’s making things with a lot more code, whether it’s his team’s or their dependencies’ code.
I think he means that it takes longer, adjusted for hardware improvements. Compilation speed has improved, but not nearly as much as CPU speed, and disk I/O time to read source files and write object files doesn't explain the relative slowdown.
I wish it was just Jonathan Blow, and not this other pretentious bloke. I like JB being pretentious on his own, thank you.
You're building on the backs of the abstraction that came before you. Wanna just open up an IDE and debug some code? Too bad: wait 1 min for startup, 3 min for the app to run, 5 min for updates, 5 min for compilation, and 10 to figure out what useless package broke your build. Oh, and make sure you have like 16 GB free, because your IDE most likely uses Chrome/V8 and is built on a pile of lazily imported libraries because an intern was too inept to write a function to check chars or some crap.
Are you a mobile dev?
The new stuff isn't bad, but the real problem is that most new programmers only learn that new stuff, when you actually need to start at the beginning.
Definitely agree, a lot of new programmers completely seem to miss the fundamentals.
Yeah, but they don't want to, because they know they need to work with new tech in order to make money, and I kinda understand them.
@@alvarohigino But your understanding of the high-level tools will get better, and the way you work will be objectively better. I would recommend at least looking into what your tools do in the background to get a better understanding.
Jonathan Blow doesn't know what a combinatorial explosion is. Noted
I don't either. But I know he was describing factorial, which also happens to be listed in the wiki article on combinatorial explosion, and I thought the wiki article aligned very closely with what JB was describing, so I'm curious if you could write a small excerpt of what you 'know' it to be.
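For what it's worth, here's the growth in question sketched in C++ (my own illustration of factorial versus polynomial growth, not a claim about what JB meant): n!, e.g. the number of orderings of n items, is the standard example of combinatorial explosion.

```cpp
#include <cstdint>
#include <cstdio>

int main() {
    std::uint64_t fact = 1;
    for (int n = 1; n <= 20; ++n) {            // 21! would overflow 64 bits
        fact *= static_cast<std::uint64_t>(n);
        std::printf("n=%2d  n^2=%4d  n!=%llu\n",
                    n, n * n, static_cast<unsigned long long>(fact));
    }
    // n=10 already gives 3,628,800 orderings; n=20 gives about 2.4e18.
    return 0;
}
```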
This is why making 2D (and maybe 2.5D) games with HTML5, JavaScript, and WebGL is so great: no compilation.
This is a very pessimistic outlook. I think this is just a natural progression of life and a natural reaction for old people to resist change, to say "things were better and more simple back in my day". Yeah, in your day there were cavemen with stones, and now we have LLMs, streaming video at tremendous scale and ridiculous FPS and PPI, and fully interactive websites with near-instantaneous load times.
I started programming in the late 90s, and both the state of programming from a programmer's perspective and the state of the world as a consumer of technology were laughably infantile compared to today. Now there is a lot more complexity, but it's also a lot more feature-rich, and it's a GOOD sign that there is a lot of overlap and competing frameworks and standards (or whatever else he's abstractly alluding to), because it means the community is thriving and people are pushing the boundaries and finding their own ways of innovating and optimizing.
There's still a very valid claim that a lot of solutions are bloat on top of man-made problems rather than intrinsic ones, instead of fixes to those lower-level problems. (E.g. the weird fixation with Docker and a whole OS worth of complexity in a sandbox to deal with dependency isolation.)
Or running a web browser as a cross-platform UI library (e.g. Electron) and then layers of frameworks on top of that to make a document layout system act as a UI system. There's no reasonable reason a chat app needs gigabytes of RAM.
Funny how this cry, this whinge about "old people" is actually used by people who are the ones genuinely resistant to change.
@@doltBmB No, because "old people" don't want change; they just want to be negative contrarians who would never be happy even if everyone was programming in assembly.
@@plaidchuck Or they see that change turns out to be very crappy.
bloat
The average programmer now is a crappy JavaScript dev that would struggle to make Pong.
Let's see you make Pong.
@Andai
It’s a trivial task that just takes time to do.
Pong actually has a set of specifications for how the ball and paddles interact. The original one was done purely with circuits and no actual computers. So the mathematics behind it are all known and available to the public.
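A minimal C++ sketch of the kind of math being referred to: one update step of ball/paddle interaction. The constants, field size, and structure are my own, and there's no rendering, input, or scoring here.

```cpp
// Ball and paddle state; paddles sit at x = 0 and x = WIDTH.
struct Ball   { float x, y, vx, vy; };
struct Paddle { float y, half_height; };

constexpr float WIDTH = 640.0f, HEIGHT = 480.0f, PADDLE_X = 8.0f;

void step(Ball &b, const Paddle &left, const Paddle &right, float dt) {
    b.x += b.vx * dt;
    b.y += b.vy * dt;

    // Bounce off the top and bottom walls.
    if (b.y < 0.0f || b.y > HEIGHT) b.vy = -b.vy;

    // Bounce off a paddle if the ball reaches its x position within its span.
    if (b.x < PADDLE_X && b.vx < 0.0f &&
        b.y > left.y - left.half_height && b.y < left.y + left.half_height)
        b.vx = -b.vx;

    if (b.x > WIDTH - PADDLE_X && b.vx > 0.0f &&
        b.y > right.y - right.half_height && b.y < right.y + right.half_height)
        b.vx = -b.vx;

    // (Scoring when the ball leaves the field, and angling the bounce by where
    //  it hits the paddle, are left out to keep the sketch short.)
}
```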
Well, in absolute terms, there are probably exponentially more experts in the field today than there were during the "good old days", and just a hell of a lot more of everyone else.
This is hard to listen to and watch; it's like nobody prepared for this interview. It's really simple, guys: stop putting useless stuff in the way of me getting simple tasks done. For example, 20 reminders a day to start using some widget feature in my operating system. That's how you explain it. Stop adding complexity to programming languages that uses up system resources and doesn't produce any more useful work.
This guy made one successful video-game and now acts like he's the next Linus Torvalds.
He's done quite a bit more than that. What have you done?
@@hansolo7988
Nothing that would put me up there with Linus.
Other than making Braid, what else has this guy contributed to the scene aside from his opinion?
@@gmodrules123456789 He made The Witness and is close to releasing a fully fledged programming language and compiler. He's opinionated and sometimes pretentious but he's a talented engineer and fully dedicated to the craft.
@@hansolo7988
Ok, so he made two games and might make a hobby language.
@@gmodrules123456789 And Linus made a hobby kernel, then smarter people came along and made it into something.