"the people that come after you" - which could be yourself 6 months later! :) "Just because you can, doesn't mean you should." - a mantra I try to encourage coworkers to follow.
An outstanding re-examining of the remarkable skills of an early engineer. Yes the warning is out there for current 'I think I am very clever' people, but also a wonderful celebration of an engineer at the cutting edge who completely understood the 'engineering system' he was being asked to work with. Wonderful.
Ah, the mysterious Melvin Kaye. Wish we had more info about him. As far as i know there's only one photo of him and a few papers of code online, dating back to 1956 and 57.
Reminds me of a it of sage advice.. "Before you get too clever lad, just remember the person who will have to maintain your code is a homicidal maniac who knows where you live."
I think he wrote it that way on purpose so that no one would be able to change it ... which is exactly what happened ... which makes him an even bigger genius.
I experimented with SMC for a graphics engine.. Could really speed up some loops. Pre-roll out the loop (usually 1024 times in my little engine) and overwrite the code with a return where you want it to stop... I used it for all the line and block blitters... Also rolling out an entire angled line drawing algo for transforms (ie. sheer - roll out the line once and then change the start address for the pixel you want to start from.. Even with VirtualProtect overhead and rolling out the line template for the block transform it was much quicker than a standard loop... Faster than semi-rolled out MMX parallelised versions... Quite wasteful of memory - but not if you want max speed.... Could match a Voodoo 3 graphics card in blit speed and rotations and 1x anti-aliasing, but obviously it was using the entire CPU resources, which the GC freed up, and the GC had to be sent the texture data which was AGP 4x speed (slowwww)..
The computer used in the story was an LGP-30 from Librascope. It was not 4096 bytes, it was 4096 words, with a word being 31 bits ( technically 32, but the last bit was always zero, so 31 were usable. ) The amount of usable memory would be equivalent to 15,872 octets, which is the ubiquitous size of bytes in 2020, but the only relevant information on its byte size that I can find is that the standard input terminal was 6 bits per character, so that's probably the size of its "byte". Instructions took up half a word, 4 bits for the 16 instructions and 12 bits for the operand.
This quote from another friend of Computerphile seems quite apt. "Everyone knows that debugging is twice as hard as writing a program in the first place. So if you're as clever as you can be when you write it, how will you ever debug it?" - Brian Kernighan
The side effect saved some RAM and they just didn't have much to start with. Obviously if the program was simple, they'd just ask the next guy to write a new version. All these hacks taken together would turn the program from impossible to implement to just barely possible to implement.
"saving milliseconds but needing hours of programmer time". This is the history of computing over the last 50 years. When I was learning Algol 60 in the 70s, machine time was expensive, as was storage, but programmer time was cheap. We programmed as "tightly" as we could, declaring variables within the routine so they were lost when the routine was exited, and storage reclaimed. Pre-computing values before a loop to avoid recomputing them. That has turned right round since then and now everyone is, quite properly, extolling the values of avoiding gee-whizz tricks and programming for ease of maintenance.
The story of Mel is something I came across at the turn of the millennium I think. I have a distinct memory of the friend who linked it to me doing so on a forum we both posted on, but he disavowed all knowledge of the story later. Somehow I came across it, in the strange free-verse line format that it was in the jargon file at the time, and was utterly fascinated. It was a story from that magical time when computers were a mix of the mechanical and the fully digital, and I salute Mel for his achievement.
Mel should have just written compilers and left the high level stuff to others. Half the time I look at a (highly optimized) disassembly of the many flavors of GCC I'm totally flabbergasted by what the compiler has chosen to do. "You put THAT in program memory? It's not even the right address size! Oh, the program space visibility address was configured to be a convenient offset from a working register? WTF GCC..."
Regarding "Don't try to be too clever". I remember a point, though I can't remember who said it, that debugging is twice as hard as writing the code in the first place, so if you write as cleverly as possible, you are by definition not clever enough to debug it.
@@criticadarling He's aware, I'm absolutely sure of it. So many people did this, couple of years ago I found a printout of this - that I archived, not in the circular archive but properly - out of respect...
This reminds me of the learning algos used with "big data", where the training hones the program to the task. The end result isn't necessarily codable on its own.
Code like that made sense when computers were very basic and 1K at best of memory. Today code like this makes absolutely no sense at all. How times have changed, in 10 or more years time future programmers probably say how we code today makes no sense
Exactly, back in the 8-bit/early 16-bit era when on CPU cache wasn't really a thing we'd have self modifying code if desperate to save space. Not really viable at all these days.
Mel, from the article "real programmer " ?, I read this article in my 10th grade and it motivated me to learn assemblly Language and system programming, I used to make fun of people who write in high level language. Anyway now I also have to do python😂
@@aricsvenz3412 sorry if my comment was confusing. I didn't mean that you were being mean, but that people in general get too uppity about "real programmers". Like everybody should just chill.
@@jackeown Real programmers use butterflies. They open their hands and let the delicate wings flap once. The disturbance ripples outward, changing the flow of the Eddy currents in the upper atmosphere. These cause momentary pockets of higher-pressure air to form, which act as lenses that deflect incoming cosmic rays, focusing them to strike the drive platter and flip the desired bit. (From XKCD, in case someone doesn't know.)
"I wonder whether Mel has embraced the new idea of OO design" ahahahahahahahahahahahahahaha not a CHANCE. This dude is a "real" programmer, he'd have been offended by C, could you imagine him writing Java or C#?
Don't fear the self modifying code. Embrace the self modifying code. Eventually all code will be self modifying code and all programmers will be expected to write only self modifying code. And a new day will dawn.
Is there a "Computerphile" that is more cyber security focused. I'd love to see that! Computerphile helped me with my CS bachelors degree and now i've started a cyber security masters! Id love to see it. Thanks!
You might want to check out the Channel "LiveOverflow". He has some interesting videos in that regard and focusses on the viewers learning things instead of just showing stuff.
If a colleague of mine wrote code that was practically unreadable, I wouldn't call it elegant no matter how clever it was. To me, elegance means a combination of simple, clever, and readable.
Self-modifying code relies on exact hardware implementation. You can't manipulate bits of an instruction to get another instruction if you want the code to be portable (which is kinda more important nowadays). However, to a lesser extent such constructs are still present, like doubling an integer by bit-shifting it instead of multiplicating it.
@@muche6321 the order associated with any instruction set affords sufficient scale to implement self-modifying coding schema. Codesets are consistently ported.
Perl is only write-only only if you choose to make it so. It gives you a lot of freedom to express things, but with freedom comes responsibility - responsibility that many people take as carte blanche to write intentionally cute or obfuscated code. For insanely readable code, check out its sister language Raku (formerly Perl 6).
@@MSStuckwisch Ooh! I actually had no idea perl6 changed name or finally became a thing! Thanks for the info - now I've got another nifty language to play with! Seems cool, I really like the grammar part.
C can be made write only. One of my alltime favourites: Without searching can you work out what it does? I first encountered this in the mid '90s. main() recursively calls itself over 70,000 times during execution. #include main(t,_,a) char *a; {return!0
But they're usually executed even more often. Too often I find people use the argument of "but it's readable/simple/easy to understand" to defend their code when what they actually mean: "It's the first thing I thought of.". Obviously, write readable/maintainable code, but at least think a second before writing down the first thing that comes to mind that works.
In the late 90s I had a job writing a Direct X interface to a system. I was learning as I went and nobody else in the small company had ever worked with Direct X at all. After I'd completed my project the company 'downsized' and I was told I was being let go in a week since I was the only one that was finished. The next day the senior coder came in with a stack of paper, dropped it on my desk and said "I gotta respect a man that can write a 25 page file without a single comment". My reply: "It was job security, didn't work out" 6 months later they hired me as a contractor to make modifications ;)
@@SamTheEnglishTeacher It was a joke, and Ken (the coder) knew it. I spent the day going over it with him and he said it all made sense, it was just that nobody else knew anything about Direct X.
@@RasperHelpdesk okay la... I've worked with people who did this unironically. All of them Boomers. Day of the Pillow can't come soon enough for parasites like them amirite
I'm not a programmer but I would say that you should take the most basic microcontroller you can find and learn it inside out. Work with the datasheet. Read up on everything that you don't understand. Program a whole lot for it. Then revisit older programs and optimize. Try to get as much as possible out of that tiny thing.
This is one of the first Computerphile videos I've seen and wow this is positively charming. Maybe he's just a great storyteller, or maybe it's just a nice little story, but either way I enjoyed this way more than I expected!
This talk of self-modifying code and how hard it is to read makes me wonder whether there's been any research into developing theories or models that illuminate the behavior of self-modifying programs. Maybe something like meta-flow diagrams? What's interesting, I believe, is that you can write self-modifying programs in the same step-by-step format as usual, except now you have steps like "go back to step 4, erase the fifth word and replace it with 'sixth'". Maybe there's some regular or "affine" way to transform a self-modifying program into a non-self-modifying program.
I see "never write self-modifying code" as a rule like "never use goto/jmp". As in, it can make code unreadable spaghetti but sprinkled in (during unique circumstances) it can provide significant improvement and readability.
Such a great story. This "don't be too clever" rule is very much true even today, it permeates the whole industry. It's easy to be too clever, for example reusing the same variable for values with different meaning, or adding a clever *if* to an already too large function instead of splitting it up.
Comments are a clutch/hack. If you can't express the intent in the code, you'll usually not fare much better with comments, after all all you did was change the language (and to one that reflects far less accurately what's actually happening) and tried again. The only acceptable comments are the ones that explain why something is done, never how.
I am currently working on a project that many people have worked on during 25 years. I hate the comments. They often are out of date with the code, and in many cases completely off. What I recommend is write code that is easy to understand and only comment hacks. Readable code does not need comments.
and those shenanigans are why modern languages dont just let you do this mess anymore - maintainability is one of the most important issues for a software dev, because you can guarantee you wont be the last person to work on the code
@@username6338 or possibly secretly based - he's essentially saying whites are the only race with enough agency to be autodidacts 😂 Imagine needing a mentor, I'd KMS
@@username6338 "how am I supposed to learn when nobody's here to teach me, if they can't understand me how can they reach me?" Someone with no agency, who would waste the time of anyone trying to teach in the first place. Also a 90s hip hop lyric
people often take this advice too seriously and end up never trying anything other than what the inistitutions say i say instead to break every single rule taught about programming in private so one knows which one to use to their advantage when its needed
"Don't be extremly clever" Don't agree with that one. A. Doesn't need to be clever to be obscure. Plenty of poor written code have created monsters of unmaintainable software. B. If extremely clever, then only useful if necessary documentation exists for others to understand. (that's the lesson I get from the story, not that you shouldn't be clever)
A program written by a genius without comments requires a doubly smart genius to decode it. BTW, for the Z80, there are some horrible (yet clever) constructs even in fixed ROM code. It was either Bill Gates or Steve Allen who manipulated the Stack Pointer to embed fixed data between CPU instructions. Another horrible construct is to have programs jump into the midst of CPU instructions to execute completely different instructions or do nothing at all. There’s no problem with these and other clever techniques provided that they’re properly documented.
Couldn't the new programmer just rewrite the thing? He had two weeks or more according to your story and it's just a text-based blackjack, if i spend more than a few hours to understand the code written by the previous programmer, I just rewrite it, especially if it's a task I understand well and I know i can accomplish in a short time
Remember how constrained the program memory is. A regular program couldn't possibly deliver all the same features in the same space. If this program is anything like demoscene programs you will be scratching your head figuring out how it could do more than blink a few lights on that budget.
Mel: "I found a trick where depending on what byte you use to read the instructions, the instructions change to a different routine, and the instructions set changes to make this routine more efficient here, and this routine does two different things at the same time depending on what instruction set is being used..."
Moral: Don't employ "clever" people to create basic computer applications/games on their own in the corner without creating any readable documentation. They will get bored and unmotivated and will create a monster that will need to be trashed because it is too hard to understand/change/debug. Instead let this "clever" person teach useful tricks to standard developers and let the standard developers write and document the basic applications.
True for today, but not necessarily in Mel's time. Mel's brain-melting optimisation was probably actually needed back in the 1950s, when a kilobyte was considered an extravagant amount of memory. Remember, the Apollo Guidance Computer managed the moon landing with the equivalent of 4KB of RAM and 70-something KB of ROM, and that was a decade or more after Mel's time.
@@davidwillmore Your childish insults are not reflecting well on you. Especially since you so far off the mark. I imagine in gives you pleasure to say such things witch would be the definition of an evil person. So, I'm just adding you to my ignore filter.
@@lucidmoses Enjoy you ignorance and do please add me to your list of people to ignore. I can see no value to your flouting your ignorance and have no interest in seeing more of it. Maybe in 40 years you will find your way out of that desert.
Sounds like some of the code I wrote back in the 90's for voice processing and voicemail systems. It ran like a titanium cheetah, but I almost feel bad for those that came along after me.
The fallacy that many have is that because smarter people tend to be able to handle more complexity that producing a more complicated solution must mean you're smarter than the person who came up with the simpler solution. I'd argue that often the reality is often the opposite, true cleverness is often finding ways to simplify seemingly complex tasks while still accomplishing the task. Advanced technology is complicated so if some nitwit introduces unnecessary complexity all they're accomplishing is impeding further progress because future development will not only have to deal with inherent complexity but also with the unnecessary attempts at cleverness.
This is very abridged and leaves out many details - details that really show what a genius Mel really was.
Yeah, sad that "pessimum" didn't make the video.
These small gold nuggets of history lessons are fantastic.
"the people that come after you" - which could be yourself 6 months later! :)
"Just because you can, doesn't mean you should." - a mantra I try to encourage coworkers to follow.
6 months? That's optimistic... "too clever" code is something you can't understand the next day, let alone a few months later.
An outstanding re-examination of the remarkable skills of an early engineer. Yes, the warning is there for today's 'I think I am very clever' people, but it's also a wonderful celebration of an engineer at the cutting edge who completely understood the 'engineering system' he was being asked to work with. Wonderful.
Ah, the mysterious Melvin Kaye.
Wish we had more info about him.
As far as I know there's only one photo of him and a few pages of code online, dating back to 1956 and '57.
"You might save a few milliseconds here and there but end up costing a few hours down the line" is great
Reminds me of a bit of sage advice:
"Before you get too clever lad, just remember the person who will have to maintain your code is a homicidal maniac who knows where you live."
I think he wrote it that way on purpose so that no one would be able to change it ... which is exactly what happened ... which makes him an even bigger genius.
Agreed, it's a way to ensure that you won't get fired without your employer being *really* sorry for letting you go.
Is that Google's whole strategy? "You'll REALLY miss us when we're gone, heh."
I experimented with SMC for a graphics engine - it could really speed up some loops. Pre-unroll the loop (usually 1024 times in my little engine) and overwrite the code with a return where you want it to stop. I used it for all the line and block blitters, and also for unrolling an entire angled-line drawing algorithm for transforms (i.e. shear - unroll the line once, then change the start address for the pixel you want to start from). Even with the VirtualProtect overhead and unrolling the line template for the block transform, it was much quicker than a standard loop, faster than semi-unrolled MMX-parallelised versions. Quite wasteful of memory, but not if you want max speed. It could match a Voodoo 3 graphics card in blit speed, rotations and 1x anti-aliasing, but obviously it was using the entire CPU, which the graphics card freed up - though the graphics card had to be sent the texture data over AGP 4x (slowww).
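For anyone curious what that "overwrite the unrolled loop with a return" trick can look like, here is a minimal sketch in C. It assumes x86-64 Linux and the System V calling convention (so mmap stands in for the VirtualProtect mentioned above); the unroll factor, byte counts and routine are made up for illustration, not taken from that engine.

/* Minimal self-modifying-code sketch (assumed: x86-64, System V ABI, Linux).
 * Emits an unrolled "copy ITERS bytes" routine into executable memory, then
 * patches a RET over one iteration so it stops early - the same idea as
 * overwriting unrolled blitter code with a return where the loop should end. */
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>

#define ITERS      16   /* unroll factor (the engine above used 1024)           */
#define ITER_BYTES 6    /* each unrolled iteration is 6 bytes of machine code   */

typedef void (*copy_fn)(unsigned char *dst, const unsigned char *src);

int main(void) {
    unsigned char *code = mmap(NULL, 4096, PROT_READ | PROT_WRITE | PROT_EXEC,
                               MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (code == MAP_FAILED) return 1;

    /* One unrolled iteration, copying byte k:  8A 46 k = mov al,[rsi+k]
     *                                          88 47 k = mov [rdi+k],al        */
    for (int k = 0; k < ITERS; k++) {
        unsigned char iter[ITER_BYTES] = { 0x8A, 0x46, (unsigned char)k,
                                           0x88, 0x47, (unsigned char)k };
        memcpy(code + k * ITER_BYTES, iter, ITER_BYTES);
    }
    code[ITERS * ITER_BYTES] = 0xC3;     /* ret after the full unroll */

    unsigned char src[ITERS], dst[ITERS] = {0};
    for (int i = 0; i < ITERS; i++) src[i] = (unsigned char)(i + 1);

    copy_fn copy = (copy_fn)code;
    copy(dst, src);                      /* copies all 16 bytes */
    printf("full copy, last byte: %d\n", dst[ITERS - 1]);

    /* The trick: overwrite iteration 8 with RET so the routine stops there. */
    code[8 * ITER_BYTES] = 0xC3;
    memset(dst, 0, sizeof dst);
    copy(dst, src);                      /* now copies only the first 8 bytes */
    printf("patched copy, byte 7: %d, byte 8: %d\n", dst[7], dst[8]);
    return 0;
}

This is only a sketch of the technique, not production code: modern W^X policies may refuse an RWX mapping, so on a real system you would map the page writable, emit the code, then flip it to executable.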
The computer used in the story was an LGP-30 from Librascope. It wasn't 4096 bytes, it was 4096 words, with a word being 31 bits (technically 32, but the last bit was always zero, so 31 were usable). The usable memory would be equivalent to 15,872 octets, the ubiquitous byte size in 2020, but the only relevant information on its own byte size that I can find is that the standard input terminal used 6 bits per character, so that's probably the size of its "byte". Instructions took up half a word: 4 bits for the 16 order codes and 12 bits for the operand.
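As a rough picture of that "4-bit order code plus 12-bit operand in half a word" layout, here's a small C sketch. The field positions and the example value are assumptions for illustration only - the real LGP-30 laid its order and track/sector bits out across the word in its own machine-specific way.

/* Illustrative only: unpack a 16-bit half-word into a 4-bit order code and a
 * 12-bit operand (64 tracks x 64 sectors = 4096 drum addresses). The bit
 * positions here are an assumption, not the actual LGP-30 layout. */
#include <stdio.h>
#include <stdint.h>

#define OPCODE_BITS  4
#define OPERAND_BITS 12

static void decode(uint16_t half_word) {
    unsigned opcode  = (half_word >> OPERAND_BITS) & ((1u << OPCODE_BITS) - 1);
    unsigned operand = half_word & ((1u << OPERAND_BITS) - 1);
    printf("order %2u, operand %4u (track %2u, sector %2u)\n",
           opcode, operand, operand >> 6, operand & 0x3F);
}

int main(void) {
    decode(0xB123);   /* made-up instruction half-word */
    return 0;
}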
This quote from another friend of Computerphile seems quite apt.
"Everyone knows that debugging is twice as hard as writing a program in the first place. So if you're as clever as you can be when you write it, how will you ever debug it?" - Brian Kernighan
Mel seemed to manage it OK.
I’ll get clever-er
"Never be too clever"
// Here I have an overflow that changes the next instruction into a jump
Other Programmer *why*
He was so concerned with the how that he never stopped to ask why
The side effect saved some RAM and they just didn't have much to start with. Obviously if the program was simple, they'd just ask the next guy to write a new version. All these hacks taken together would turn the program from impossible to implement to just barely possible to implement.
Programmer with engineering background: "Is it marginally more efficient?"
Very nice to hear all this discussion on the story written by my dad, Ed Nather. He would have loved it. - Wendy Nather
Mel was my grandfather. I didn’t know anything about this story until recently.
"saving milliseconds but needing hours of programmer time". This is the history of computing over the last 50 years. When I was learning Algol 60 in the 70s, machine time was expensive, as was storage, but programmer time was cheap. We programmed as "tightly" as we could, declaring variables within the routine so they were lost when the routine was exited, and storage reclaimed. Pre-computing values before a loop to avoid recomputing them. That has turned right round since then and now everyone is, quite properly, extolling the values of avoiding gee-whizz tricks and programming for ease of maintenance.
I have tried to maintain that type of code. It is a learning experience.
The story of Mel is something I came across at the turn of the millennium I think. I have a distinct memory of the friend who linked it to me doing so on a forum we both posted on, but he disavowed all knowledge of the story later. Somehow I came across it, in the strange free-verse line format that it was in the jargon file at the time, and was utterly fascinated. It was a story from that magical time when computers were a mix of the mechanical and the fully digital, and I salute Mel for his achievement.
Mel should have just written compilers and left the high level stuff to others. Half the time I look at a (highly optimized) disassembly of the many flavors of GCC I'm totally flabbergasted by what the compiler has chosen to do.
"You put THAT in program memory? It's not even the right address size! Oh, the program space visibility address was configured to be a convenient offset from a working register? WTF GCC..."
RISC: Relegate Important Stuff to Compiler
I'd love to peek at what the compilers get up to. How can I do this?
@@jamieg2427 gcc "-save-temps"
@@jamieg2427 godbolt.org and cppinsights.io give insight into what C/C++ compilers do
@@jamieg2427 Pretty sure GCC is open source
I thought Mel's exploit, in the end, came down to using/abusing the timings of the rotary storage drum?
That was another thing he did - an input took the same time as a full drum rotation, so the instruction after the input was located just before it on the drum, requiring a full rotation to reach, which the input needed anyway.
Regarding "Don't try to be too clever". I remember a point, though I can't remember who said it, that debugging is twice as hard as writing the code in the first place, so if you write as cleverly as possible, you are by definition not clever enough to debug it.
One of my favourite stories from CS. I had this printed and pinned on my table during my college days
My father, Ed Nather, would be happy to know this.
@@criticadarling He's aware, I'm absolutely sure of it. So many people did this; a couple of years ago I found a printout of it - which I archived, not in the circular archive but properly - out of respect...
Much respect for Mel!
This reminds me of the learning algos used with "big data", where the training hones the program to the task. The end result isn't necessarily codable on its own.
its
@@NoName-zn1sb it's
Yeah and this doesn't mean there aren't errors, only that the errors are next to impossible to track down
Code like that made sense when computers were very basic and had 1K of memory at best.
Today code like this makes absolutely no sense at all.
How times have changed; in 10 or more years, future programmers will probably say the way we code today makes no sense.
Exactly, back in the 8-bit/early 16-bit era, when on-CPU cache wasn't really a thing, we'd use self-modifying code if we were desperate to save space. Not really viable at all these days.
Not necessarily the case: it really depends on your priorities.
My father, Ed Nather, would have loved this. (If you know, you know.)
Mel, from the "Real Programmer" article? I read that article in my 10th grade and it motivated me to learn assembly language and systems programming. I used to make fun of people who write in high-level languages. Anyway, now I also have to do Python 😂
"Real programmers" write in C, C++, Cython, and Python. (jk...real programmers aren't dicks about who "real programmers" are)
@@jackeown relax my Friend, I respect all programmers, Read that article, you will understand my point, it's just a joke.
@@aricsvenz3412 sorry if my comment was confusing. I didn't mean that you were being mean, but that people in general get too uppity about "real programmers". Like everybody should just chill.
@@jackeown Real programmers use butterflies. They open their hands and let the delicate wings flap once. The disturbance ripples outward, changing the flow of the Eddy currents in the upper atmosphere. These cause momentary pockets of higher-pressure air to form, which act as lenses that deflect incoming cosmic rays, focusing them to strike the drive platter and flip the desired bit.
(From XKCD, in case someone doesn't know.)
My father, Ed Nather, would be proud to know this. He made me take FORTRAN in 10th grade. 😆
"I wonder whether Mel has embraced the new idea of OO design" ahahahahahahahahahahahahahaha not a CHANCE. This dude is a "real" programmer, he'd have been offended by C, could you imagine him writing Java or C#?
I bet he is disgusted by garbage collection. "Automatic memory management?? IN MY COMPUTER?! BLASPHEMY!"
Don't fear the self modifying code. Embrace the self modifying code. Eventually all code will be self modifying code and all programmers will be expected to write only self modifying code. And a new day will dawn.
"Too clever" people nowadays make compilers, and they put all the trickery there.
Marc Gràcia Or malicious shell code...
He even said as much in the video!
Is there a "Computerphile" that is more cyber security focused. I'd love to see that! Computerphile helped me with my CS bachelors degree and now i've started a cyber security masters! Id love to see it.
Thanks!
You might want to check out the Channel "LiveOverflow".
He has some interesting videos in that regard and focusses on the viewers learning things instead of just showing stuff.
superb. self-modifying code is fantastic, so elegant, magickal...
If a colleague of mine wrote code that was practically unreadable, I wouldn't call it elegant no matter how clever it was.
To me, elegance means a combination of simple, clever, and readable.
Self-modifying code relies on exact hardware implementation. You can't manipulate bits of an instruction to get another instruction if you want the code to be portable (which is kinda more important nowadays).
However, to a lesser extent such constructs are still present, like doubling an integer by bit-shifting it instead of multiplying it.
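For the shift example, a two-line C illustration (valid for unsigned values that don't overflow; a modern compiler will usually emit the same code for both forms anyway):

unsigned double_shift(unsigned x) { return x << 1; }  /* double via bit-shift */
unsigned double_mul(unsigned x)   { return x * 2; }   /* equivalent multiply  */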
@@muche6321 the order associated with any instruction set affords sufficient scale to implement self-modifying coding schema. Codesets are consistently ported.
This is the real story of Mel I was waiting for! Thank you for posting this!
I think the moral of the story is "Job Security"
The perennial can vs should argument
But you should when memory is highly constrained and speeds are measured in KIPS.
Most programs are write once, read many; except when written in Perl (a write-only language).
Oh, you would love J then.
quicksort=: (($:@(<#[) , (=#[) , $:@(>#[)) ({~ ?@#)) ^: (1<#)
Perl is only write-only if you choose to make it so. It gives you a lot of freedom to express things, but with freedom comes responsibility - responsibility that many people take as carte blanche to write intentionally cute or obfuscated code. For insanely readable code, check out its sister language Raku (formerly Perl 6).
@@MSStuckwisch Ooh! I actually had no idea perl6 changed name or finally became a thing! Thanks for the info - now I've got another nifty language to play with! Seems cool, I really like the grammar part.
C can be made write-only. One of my all-time favourites - without searching, can you work out what it does?
I first encountered this in the mid '90s. main() recursively calls itself over 70,000 times during execution.
#include
main(t,_,a)
char *a;
{return!0
But they're usually executed even more often. Too often I find people use the argument of "but it's readable/simple/easy to understand" to defend their code when what they actually mean is: "It's the first thing I thought of." Obviously, write readable/maintainable code, but at least think for a second before writing down the first thing that comes to mind that works.
I learned about this story from Bryan Cantrill. Love his delivery. He mentioned it in a whole talk on programming lore.
In the late 90s I had a job writing a DirectX interface to a system. I was learning as I went, and nobody else in the small company had ever worked with DirectX at all. After I'd completed my project the company 'downsized' and I was told I was being let go in a week, since I was the only one who was finished. The next day the senior coder came in with a stack of paper, dropped it on my desk and said "I gotta respect a man that can write a 25 page file without a single comment".
My reply: "It was job security, didn't work out"
6 months later they hired me as a contractor to make modifications ;)
Whereas some of us are employed because we're actually wanted, not because we trapped a company into needing us.
@@SamTheEnglishTeacher It was a joke, and Ken (the coder) knew it. I spent the day going over it with him and he said it all made sense; it was just that nobody else knew anything about DirectX.
@@RasperHelpdesk okay la... I've worked with people who did this unironically. All of them Boomers. Day of the Pillow can't come soon enough for parasites like them amirite
It could be argued that this is a clever anti-reverse-engineering technique.
“Ppl after you” = “you in 2mo”
I want to learn to program like this! Where can I learn this stuff???
I'm not a programmer but I would say that you should take the most basic microcontroller you can find and learn it inside out. Work with the datasheet. Read up on everything that you don't understand. Program a whole lot for it. Then revisit older programs and optimize. Try to get as much as possible out of that tiny thing.
How can I become a legendary programmer like Mel?
This is one of the first Computerphile videos I've seen and wow this is positively charming. Maybe he's just a great storyteller, or maybe it's just a nice little story, but either way I enjoyed this way more than I expected!
This talk of self-modifying code and how hard it is to read makes me wonder whether there's been any research into developing theories or models that illuminate the behavior of self-modifying programs. Maybe something like meta-flow diagrams?
What's interesting, I believe, is that you can write self-modifying programs in the same step-by-step format as usual, except now you have steps like "go back to step 4, erase the fifth word and replace it with 'sixth'". Maybe there's some regular or "affine" way to transform a self-modifying program into a non-self-modifying program.
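On that last point, one concrete way to do the transformation is to stop letting the machine code modify itself and instead run the program under a small interpreter whose mutable data happens to be the program. Here's a minimal C sketch of the idea; the three-opcode toy machine and its instruction names are invented for illustration. The "self-modifying" step becomes an ordinary write into the prog[] array, while the interpreter itself never changes.

/* Toy machine whose program can rewrite itself. The interpreter is fixed,
 * non-self-modifying code; all "self-modification" is confined to writes into
 * prog[], which is plain data. Opcodes are invented for this sketch. */
#include <stdio.h>

enum { OP_PRINT, OP_PATCH, OP_HALT };

typedef struct { int op, a, b; } Instr;

int main(void) {
    Instr prog[] = {
        { OP_PRINT, 1, 0 },   /* 0: print 1                              */
        { OP_PATCH, 3, 99 },  /* 1: rewrite instruction 3's operand      */
        { OP_PRINT, 2, 0 },   /* 2: print 2                              */
        { OP_PRINT, 0, 0 },   /* 3: print 0 ... unless patched by step 1 */
        { OP_HALT,  0, 0 },   /* 4: stop                                 */
    };

    for (int pc = 0; prog[pc].op != OP_HALT; ) {
        Instr in = prog[pc++];
        switch (in.op) {
        case OP_PRINT: printf("%d\n", in.a); break;
        case OP_PATCH: prog[in.a].a = in.b; break;  /* "erase the word and replace it" */
        }
    }
    return 0;   /* prints 1, 2, 99 */
}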
I see "never write self-modifying code" as a rule like "never use goto/jmp". As in, it can make code unreadable spaghetti but sprinkled in (during unique circumstances) it can provide significant improvement and readability.
Sounds like a clear case of Melfeasance to me.
A great highway ghost story.
The Nuclear Gandhi bug comes to mind...
I love these stories" The dawn of computer science is so compelling almost mystical and mythical.
I want more "real programming" videos.
Such a great story. This "don't be too clever" rule is very much true even today, it permeates the whole industry. It's easy to be too clever, for example reusing the same variable for values with different meaning, or adding a clever *if* to an already too large function instead of splitting it up.
unmaintainable seems to mean no one else can understand it.
The moral of the story is: comment your code! Lest you need to remember how the darn thing works six months from now!
I can't help but comment about the irony in your above statement given your YT handle... :)
difficult with 4096 bits but sure
@@itisALWAYSR.A. 4096 words, each being 31 usable bits.
Comments are a crutch/hack. If you can't express the intent in the code, you'll usually not fare much better with comments; after all, all you did was change the language (and to one that reflects far less accurately what's actually happening) and try again.
The only acceptable comments are the ones that explain why something is done, never how.
I am currently working on a project that many people have worked on over 25 years. I hate the comments. They are often out of date with the code, and in many cases completely off. What I recommend is: write code that is easy to understand and only comment the hacks. Readable code does not need comments.
Been there, done that.
One way of protecting intellectual property... by making your code "hairy" ;).
Moral of the Story:
Cater to the lowest common denominator.
I thought this would be about the French "mél" word that didn't take off 🤔
do the magic switch next.
That one is my favourite story of those times!
MAGIC / [ MORE MAGIC }
And those shenanigans are why modern languages don't just let you do this mess anymore - maintainability is one of the most important issues for a software dev, because you can guarantee you won't be the last person to work on the code.
Yes, you can. Mel did. Be like Mel.
Modern languages cater to soydevs
@@username6338 didn't realise it was that pozzed but I should have guessed
@@username6338 or possibly secretly based - he's essentially saying whites are the only race with enough agency to be autodidacts 😂
Imagine needing a mentor, I'd KMS
@@username6338 "how am I supposed to learn when nobody's here to teach me, if they can't understand me how can they reach me?"
Someone with no agency, who would waste the time of anyone trying to teach in the first place. Also a 90s hip hop lyric
"Don't try and be to clever." Finally, some advice I can follow.
People often take this advice too seriously and end up never trying anything other than what the institutions say.
I say instead: break every single rule taught about programming in private, so one knows which one to use to their advantage when it's needed.
Thought it was a portal 2 video... ... silly me
"Don't be extremly clever"
Don't agree with that one.
A. Doesn't need to be clever to be obscure. Plenty of poor written code have created monsters of unmaintainable software.
B. If extremely clever, then only useful if necessary documentation exists for others to understand. (that's the lesson I get from the story, not that you shouldn't be clever)
A program written by a genius without comments requires a doubly smart genius to decode it.
BTW, for the Z80, there are some horrible (yet clever) constructs even in fixed ROM code.
It was either Bill Gates or Paul Allen who manipulated the Stack Pointer to embed fixed data between CPU instructions.
Another horrible construct is to have programs jump into the midst of CPU instructions to execute completely different instructions or do nothing at all.
There’s no problem with these and other clever techniques provided that they’re properly documented.
Couldn't the new programmer just rewrite the thing? He had two weeks or more according to your story, and it's just a text-based blackjack. If I spend more than a few hours trying to understand code written by a previous programmer, I just rewrite it, especially if it's a task I understand well and know I can accomplish in a short time.
Remember how constrained the program memory is. A regular program couldn't possibly deliver all the same features in the same space. If this program is anything like demoscene programs you will be scratching your head figuring out how it could do more than blink a few lights on that budget.
If nobody else could figure out how to fit equivalent functionality into the available resources, then Mel did absolutely the right thing.
This video does not do justice to this story at all, it feels rushed and unprepared. Also feels disrespectful to the man.
Mel: "I found a trick where depending on what byte you use to read the instructions, the instructions change to a different routine, and the instructions set changes to make this routine more efficient here, and this routine does two different things at the same time depending on what instruction set is being used..."
Go read the text
Mel before Stack Overflow
How times have changed. Nowadays you could write a command line blackjack game in an hour and it would only require 2 gigs of RAM to run.
In short... don't be cocky while writing code!!
Moral: Don't employ "clever" people to create basic computer applications/games on their own in the corner without creating any readable documentation. They will get bored and unmotivated and will create a monster that will need to be trashed because it is too hard to understand/change/debug. Instead let this "clever" person teach useful tricks to standard developers and let the standard developers write and document the basic applications.
True for today, but not necessarily in Mel's time. Mel's brain-melting optimisation was probably actually needed back in the 1950s, when a kilobyte was considered an extravagant amount of memory. Remember, the Apollo Guidance Computer managed the moon landing with the equivalent of 4KB of RAM and 70-something KB of ROM, and that was a decade or more after Mel's time.
So basically KISS.
still no subtitles ?! :(
The host sounds so much like Jared Harris I wonder if there is a connection. Anyone know?
So in other words, when he went to modify it, he forgot how it worked. GG Mel.
Real men write all the source in 1 line
N = N + 1
N += int(bool(N)) or 1
People who are that deep into bit fiddling are a seriously odd bunch.
It's not being "too clever" that's the problem; it's the lack of documentation!
If you need documentation other than the code, maybe you are in the wrong profession.
Sounds like when intelligence isn't accompanied by wisdom.
Maybe if you had more intelligence, you would see the wisdom of it.
@@davidwillmore Well, that was ignorant.
@@lucidmoses Then educate yourself and ascend from your ignorance.
@@davidwillmore Your childish insults are not reflecting well on you, especially since you're so far off the mark. I imagine it gives you pleasure to say such things, which would be the definition of an evil person. So, I'm just adding you to my ignore filter.
@@lucidmoses Enjoy your ignorance, and do please add me to your list of people to ignore. I can see no value in your flaunting your ignorance and have no interest in seeing more of it. Maybe in 40 years you will find your way out of that desert.
I just created an AI that can reprogram Itself. I named it "Skynet"
And let me guess, it promptly went on to modify itself into a refrigerator thermostat controller?
Sounds like some of the code I wrote back in the 90's for voice processing and voicemail systems. It ran like a titanium cheetah, but I almost feel bad for those that came along after me.
It's so sad that we have reached such a high level of idiocracy that the recommendation is to avoid advancing technology and just stick to the basics.
It's a dangerous hack, not advancing technology.
The fallacy that many have is that, because smarter people tend to be able to handle more complexity, producing a more complicated solution must mean you're smarter than the person who came up with the simpler solution. I'd argue that the reality is often the opposite: true cleverness is often finding ways to simplify seemingly complex tasks while still accomplishing them.
Advanced technology is complicated, so if some nitwit introduces unnecessary complexity, all they're accomplishing is impeding further progress, because future development will have to deal not only with the inherent complexity but also with the unnecessary attempts at cleverness.
@@jon9103 I would say your comment is complicated and you could have simplified it to make it better :)
@@volodyadykun6490 touche
Occam's Razor disagrees.