Looking for books & other references mentioned in this video?
Check out the video description for all the links!
Want early access to videos & exclusive perks?
Join our channel membership today: ua-cam.com/channels/s_tLP3AiwYKwdUHpltJPuA.htmljoin
Question for you: What’s your biggest takeaway from this video? Let us know in the comments! ⬇
This might be my favorite talk ever. The attitude toward programmers and those otherly business folk, the anecdotes I know, the anecdotes I've suffered from, and the anecdotes I'd never heard, all presented with utmost charm.
I'm a grey beard. I started programming as a teenager in 1976. I stayed awake for nearly 48 hours as a Unix support engineer for Sequent Computer Systems during Y2K turnover. We didn't get any calls for help related to Y2K. I spent all of that time working on my `top2` performance monitoring program. I was particularly amused by the kennel club anecdote where they were storing a number as a hexadecimal string and "fixed" the limitation of the maximum being 37 by adding two roman numeral characters to the string rather than implementing a proper fix that would have reduced the storage requirements.
Excellent talk - made me feel instantly better about some of the coding "decisions" I've made over the years!
I also got pulled up on the naming of stuff - my manager pulled me into a meeting to discuss why the team were seeing a "Big Whopper Error" on screen. I explained that I'd put a last resort error handler into the code to stop it outright crashing like it used to (this was VBA)... there were calls to change it but the team using the product quite liked it so we kept it! Ah - the "Big Whopper Error" - much more fun to fix than a standard error 😁
Funny that due to the design of the cover image, the title in it reads "GOTO; Programming Greatest Mistakes"
As a young engineer, I learned an important lesson: we need to remember that no one ever does a stupid thing intentionally.... Everyone does what they believe is right at the time. (Note that the Pentium floating point problem was originally in the DEC Alpha chip; DEC discovered it and corrected it. This proved that Intel's design was not their own, which resulted in a large out-of-court settlement. Naughty!)
For the benefit of the younger crowd here, Intel got rid of the DEC lawsuit by *buying* DEC. I'm sure a small handful of executives got mighty rich, everybody else got screwed.
@@tumunu A better memory than mine.
Compaq bought DEC in 1998
@@RobBCactive Yes, sorry, my memory is failing as I age. Intel bought the technology.
I found the time to look up that 1994 Pentium FP bug, "FDIV": it was caused by 5 constant values not getting copied into a machine that prepped fabrication, and the algorithm was SRT, which I believe was published research that used a lookup table on partial bit ranges.
So despite 90's Intel having stolen designs, being criminally inclined, found guilty of bribery and other illegal practices, I don't believe this story about DEC Alpha. Back then I was in regular contact with DEC pros and it's not the kind of story I'd forget.
I was there in Amsterdam, and can hear myself laughing in the recording 😅
The event is amazing to attend; loved meeting the people organising and attending!
The speaker, Mark Rendle, is entertaining, and I admire the honesty of his self-reflection. I was a CS student at the time of the Pentium bug and wasn't concerned about the implications. The only reason I eventually took advantage of the replacement was that the newer revision was claimed to run cooler. Wikipedia has the following about Petrov: "... he was also reprimanded for improper filing of paperwork with the pretext that he had not described the incident in the military diary. [...] He received no reward. According to Petrov, this was because the incident and other bugs found in the missile detection system embarrassed his superiors and the influential scientists who were responsible for it, so that if he had been officially rewarded, they would have had to be punished."
"y2k was nothing" no, we fixed it.
"We don't hear about the ozone hole anymore" no, we fixed it.
"We don't hear about acid rain and tree death anymore" no, we fixed it.
Let's hope my grandchildren get to say
"We don't hear about global warming anymore" ...
We do hear about the ozone hole again this year, though. See e.g.: 2023 ozone hole ranks 12th largest on record, find NOAA and NASA (for a single day, 16th largest when averaged from September 7 to October 13)
So that's sadly not quite fixed yet. :/
You fixed nothing. All of these were fabricated media scares. The media just cycles through several of them over the years.
What do you mean we don't hear about tree death anymore? Have you been paying attention to the news the past... 4-5 years?
I hope our descendants will get to say:
"We don't hear about the destruction of the natural world anymore" no, we stopped doing that.
That's more important than "fixing" global warming, if you ask me. It's not about fixing things (doing more), but about doing less, consuming less, polluting less, destroying less.
I am surprised he did not mention the transition between Python 2 and 3
It bears repeating that type coercion was NOT in the first version of JS. It was added later at the insistence of the devs using JS and Eich of the 90s didn't say no like a more experienced Eich of today would have.
When Brendan Eich was asked what he would have done differently, he replied: I would have avoided some of the compromises that I made when I first got early adopters, and they said, “Can you change this?”
I believe type coercion was added because form input values are always given as strings. So instead of having to do `parseInt(age)`, we got the mess that is type coercion.
There's a lot that's wrong with JavaScript, including the name, but there is an elegant little language in there, both functional and object-based (prototypal, not OOP). The trick is to only use "the good parts".
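The `parseInt(age)` point above can be seen in a couple of lines; a minimal sketch (variable names are illustrative):

```javascript
// Hypothetical form input: HTML form values always arrive as strings.
const age = "30"; // e.g. read from an <input> field

// Implicit coercion: '+' concatenates when either operand is a string.
const next1 = age + 1;               // "301" (string concatenation)

// Explicit conversion avoids the surprise.
const next2 = parseInt(age, 10) + 1; // 31
const next3 = Number(age) + 1;       // 31
```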
Roman numerals used in a database: that's pretty much the biggest folktale foolery I've ever heard of. Thank you, Mr. Rendle! :D And thanks go out to Stanislav Petrov as well. Good thing we're still alive here to laugh about these mistakes.
We used to store either a _base64 string_ or a _JSON object_ in a nvarchar field in a database. Which one it was, was determined at runtime.
I _tried_ to convince them that we do not serve hundreds of thousands of requests per second, we do not _need_ an RDBMS, we could just use flat files, but nooooooooo...
A good programmer doesn’t make mistakes, he corrects bugs.
I was a newbie, and the author of a funny, self-deprecating comment about naming conventions asked me to watch him write something and say what I thought. He'd coded for early parallel processors at British Aerospace and had a suck-it-and-see approach, which I didn't appreciate at the time. So I'm trying to evaluate his code, and I have no idea what he wants me to see. Eventually he says: neither do I, run it and we'll both figure it out. I had a chip on my shoulder, and that was very kind of him. He introduced me to the paradigms of "keep it simple, stupid" and "don't fix it if it isn't broken".
As a seasoned enterprise software developer, I can confirm this is 100% accurate.
Would Knight Capital have survived if they'd taken a short position on themselves?
Small correction: the Mars Climate Orbiter failed because the software from Lockheed Martin returned values in pound-force seconds, while NASA expected newton-seconds.
Yes the biggest takeaway from the MCO is about clarity of communication and making sure units are always labeled.
@@the_real_ch3 by the time that orbiter was launched (1998), HP had proper units as first-class-objects, and unit-aware-calculations (if you add 1_m and 1_ft you get 4.28_ft as the result) in the programming language of their _pocket calculators_ for more than a decade.
Never assume. The word *assume* stands for making an *ass* out of *u* and *me.*
@@GeorgeTsiros Unit aware calculation is one of the reasons I like Qalculate, FreeCAD and yes, RPL. You can get it in other languages too, e.g. Rust uom, Haskell dimensional, C++ units, but it's certainly not going to catch everything (e.g. angles are dimensionless) and it does involve inconvenience.
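A toy version of that unit-aware arithmetic can be sketched in a few lines (the conversion table and the choice of which operand's unit wins are illustrative assumptions, not HP's exact semantics):

```javascript
// Conversion factors to metres for a couple of length units.
const TO_M = { m: 1, ft: 0.3048 };

function length(value, unit) {
  return { value, unit };
}

// Adds two lengths; by (arbitrary) convention the result takes the
// unit of the second operand, so 1 m + 1 ft comes out in feet.
function addLengths(a, b) {
  const metres = a.value * TO_M[a.unit] + b.value * TO_M[b.unit];
  return { value: metres / TO_M[b.unit], unit: b.unit };
}

const sum = addLengths(length(1, "m"), length(1, "ft"));
console.log(sum.value.toFixed(2), sum.unit); // prints: 4.28 ft
```

A system like this would have caught the Mars Climate Orbiter mixup at the type level rather than in the Martian atmosphere.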
First computer was a 486 DX.. that floating point was clutch. Still remember dropping 16 megs in chip by chip. Felt like a wizard.
+a is definitely not the same as Math.abs. Try `let a = -3; console.log(+a)`. It does attempt coercion, though, so +'1' is 1 and +'a' is NaN.
No one claims it is
@@wiseman9960 the guy in the video did. The thing this comment section is for.
In the JavaScript section there were multiple other inaccuracies. "In every other language that's a syntax error." Not in Ruby (Baaa), Python (runtime error), PHP (runtime error), or Perl (0). And the stuff about document.all is of course true, and every browser implements it, but it's not a Netscape invention (kind of implied, because Netscape/Brendan Eich is credited for JavaScript and no other credit is given) but a Microsoft one, and it's not in the standard except as deprecated.
But fun and interesting talk otherwise.
Brilliant talk!
SystemD and Rust. Next topic.
And wayland
I feel like he’s got the Knight Capital story backwards on a few points - going to have to go and read a bit more about it to see if my memory is dodgy or his.
fun stories about programming
thank you for the badly needed laughter....it has been a long time since I had laughed!!!!
Anybody not testing his own fix for corrupted data, and anybody entrusted with the QA part of testing it before release should be fired for not doing their jobs.
23:46 NULL is excellent. Since you compiled your spreadsheet and talk, there have been so many improvements to languages, which gave us ?? and ??= and other nifty operators, at least in high-level languages, that deal with these cases in efficient, readable code. I guess the referee still uses complete if-then-else statements. But a good pub talk nevertheless.
Crazy that out of all these he doesn’t mention the Therac 25
It’s a deliberate choice: the talk is supposed to be lighthearted and fun, so I don’t want to talk about people dying.
@@MarkRendle ah. Makes sense. Yeah I do feel like the therac 25 is probably the greatest mistake in programming, but the reason it is is because it did cost people’s lives.
This was standup comedy for programmers
46:25 "In every other language that's a syntax error." Not really.
In Ruby it gives: Baaa
In Python it gives a *runtime* error: TypeError: bad operand type for unary +: 'str'
In PHP it gives a *runtime* error: Uncaught TypeError: Unsupported operand types: string + string in php shell code:1
In Perl it gives: 0
(Note that PHP and Perl don't have a + operator that concatenates strings. They use . for that.)
So none of these dynamic languages give a syntax error, but two give runtime errors.
Then he says that the unary + in JavaScript is the same as Math.abs(). No it's not. It's the same as Number() (not new Number(), just Number(), and not like parseFloat() because of edge cases). But +-1 just gives -1. The differences in behavior between Number() and parseFloat() are the actual WTF here. Compare empty string and string containing a number followed by garbage between the two functions. And the existence of boxed numbers (new Number()) also makes no sense at all and just confuses people.
And document.all was invented by Microsoft, is deprecated now, and I *think* it never was part of the official standard? Not sure about that. I also never saw that one used anywhere.
No, I'm *not* a JavaScript fan. Just have to use it for work. Just like Ruby, Python, PHP, etc. I just try to know my tools (programming languages), even if I don't like them. I don't like a lot of them.
(Wonder a bit about the accuracy of the rest of the talk now.)
Still a fun talk that I get the feeling I have seen before already?
He's given it before, I think this might even be the third time.
@@Adowrath Which only increases my criticism: after giving this talk several times, he still hasn't corrected these mistakes.
@@blenderpanzi He changed stuff, which means new errors are sometimes introduced in the changes; he himself mentions at the start that people pointed out previous mistakes. The 'syntax error' line does not appear in e.g. the NDC Copenhagen 2022 version of the talk, and neither does the Math.abs thing. And document.all behaving like that doesn't depend on it being part of the standard: the fact that such a weird edge case is possible with document.all means the language semantics allowed for it. That's all he's saying, really.
@@Adowrath I watch these talks as _pure entertainment_ and as having _next to zero reliable knowledge_
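The Number() vs parseFloat() differences described in the comment above are easy to check in any JavaScript console; a quick demonstration:

```javascript
// Unary + converts like Number(), not Math.abs():
console.log(+"sheep");            // NaN - coercion of a non-numeric string
console.log(+-1);                 // -1  - no absolute value is taken

// The empty string is where Number() and parseFloat() diverge:
console.log(Number(""));          // 0   - empty string becomes zero
console.log(parseFloat(""));      // NaN

// ...and so is a number followed by garbage:
console.log(Number("12abc"));     // NaN - the whole string must be numeric
console.log(parseFloat("12abc")); // 12  - parses the leading numeric prefix
```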
is it possible to set the scaling between 100 and 200%? i find it crazy that there's a 400% option but not 125%
I feel like this talk is just a short summary of a bunch of other goto talks.
it is! a greatest hits compilation :D
Too bad the CrowdStrike outage hadn't happened yet when this went out
If null is considered a mistake then what is the solution?
The Option and Result monads
@@nexusxeit just makes "nullability" explicit, no?
You don't get to create something, if you do not have a value to assign to it.
Javascript gets so much shade. But the inference makes Javascript ultra powerful. If he had more time, it may have just been another language gobbled up by Java.
Type coercion, not inference. It should be opt-in at best. Instead we have TypeScript as an opt-in. TS does have type inference.
@46:48 "JavaScript + is the same as Math.abs" is complete bunk that you should correct for your next talk.
It's interesting to see that inheritance isn't a section on this list.
I hope "Chris" was "given a good talking to" for not listening, testing, or reading.
"In any other programming language this is an error, in Javascript it's a 🍌"
In the earlier part of the talk, Mark expresses his preference for .NET, but in his later section 'The Big Rewrite' he exposes its biggest problem: by signing up to use .NET, you agree to become Microsoft's bitch, and I don't mean that in a good way. He's lucky it only cost him six weeks in the hospital.
so, what would be a better solution for null references? 0?
A type-safe alternative, which is part of Algol 68, is extending the reference type with something explicitly meaning 'nothing here', which would require explicit handling of that case.
I think it could look something like:
Type T = Ref A | Nothing;
And wherever your type T shows up, you have to be prepared for both cases: the Ref to A, or Nothing.
@@MeriaDuck How is that any different? That's just a typed null, and you still get the same amount of null (or "nothing") checking code.
@@Fiyaaaahh The point of a typed null is to give more context to what a variable is. Say a function returns Nullable<String>: the programmer knows it can be either null or a value. If it returns String, it must always be a string. It helps reduce programmer errors and removes the null problem entirely.
I saw someone mention Option and Result monads (available in Rust and Haskell)
If there's a pointer, it points to something. If there is no value to point to, you do not have a pointer. Similarly, if you want to add a row in a datatable but do not have a value for one of the columns, you are not allowed to add a row to that table since you _do not have a valid row to add to that table_
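The "Ref A | Nothing" idea from this thread can be sketched even in untyped JavaScript as a tagged union; the names some/none/findUser are illustrative, and a real typed language would make the compiler enforce the check:

```javascript
// Absence is an explicit value, not a null that can leak anywhere.
const some = (value) => ({ tag: "some", value });
const none = { tag: "none" };

// Hypothetical lookup that never returns null: it returns an option.
function findUser(id, users) {
  return id in users ? some(users[id]) : none;
}

const users = { 1: "Ada" };
const result = findUser(2, users);

// The caller is forced to handle both cases explicitly.
const display = result.tag === "some" ? result.value : "<no such user>";
console.log(display); // prints: <no such user>
```

This is essentially what Rust's Option and Haskell's Maybe do, with the type system guaranteeing that the "none" branch is never forgotten.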
Did he say HALF A TRILLION!?
@@christosmantas4308 Yes. Yes I did.
@@MarkRendle I'm really enjoying your presentations. Congrats
@@christosmantas4308 🙏
Shaggy, that's not wid !
Make me wanna quit 😂
Historical core memory price:
"costs began at roughly US$1.00 per bit and dropped to roughly US$0.01 per bit."
In essence the price is quoted per bit, not per byte. ;-)
👍
You Dont Need to Use Goto, for like exceptions. You Can Use Nested IF Statements. If You Need Several Operating System Resources.
Only Continue Down the Nest, If you Acquire the Resource you Need. If it Fails, It will Break to the ELSE.
There's MANY Ways to Write a Program.
But _very few_ ways that are readable, reliable, maintainable and performant.
@@GeorgeTsiros YOU CANT READ CODE. FULL STOP. ALL YOU CAN DO IS MEMORISE AND REPEAT WHAT YOUR GOD HAS GIVEN YOU AND PRETEND TO BE COMPUTING GENIUSES. ENGLISH IS A SEMI-ANIMAL LANGUAGE AND YOU DON'T "FULLY" UNDERSTAND IT. "USE YOUR 'OWN' BRAIN" AND COME UP WITH SOMETHING LIKE I HAVE. LIKE MY PROOF OF 0.999 RECURRING = 1. NOT MERELY, REPEAT WHAT YOUR GOD HAS GIVEN YOU. THEN YOU'LL BE ABLE TO WRITE COMPUTER PROGRAMS OF YOUR OWN. ASK YOUR GODS GOD.
You also don't need to titlecase randomly.
@@0LoneTech You're a LOW Intelligence P1G. You Don't FULLY Understand English. Latin - The Animal Language is Your Language. Your Speciality is To Be a Two Faced Terrorist Ba5tard.
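For what it's worth, the nested-if acquisition pattern described at the top of this thread looks roughly like this; the acquire/release callbacks are hypothetical stand-ins (in C this is the classic alternative to "goto cleanup"; in JavaScript, try/finally is the idiomatic tool):

```javascript
// Acquire two resources in sequence, descending one nesting level per
// successful acquisition; on failure, fall out to the else and release
// whatever was already held. acquire functions return null on failure.
function withResources(acquireA, acquireB, release, use) {
  const a = acquireA();
  if (a) {
    const b = acquireB();
    if (b) {
      use(a, b);      // both resources held: do the real work
      release(b);
      release(a);
      return true;
    } else {
      release(a);     // B failed: unwind what we already hold
    }
  }
  return false;       // nothing (or only A) was acquired
}

// Example: the second acquisition fails, so A is released and we get false.
console.log(withResources(() => "A", () => null, () => {}, () => {})); // false
```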
Wouldn't 38 in Roman numerals be "XXXIIX"?
No
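For the record: standard subtractive notation only allows IV, IX, XL, XC, CD and CM, so 38 is XXXVIII, never "XXXIIX". A conventional greedy conversion shows this:

```javascript
// Greedy Roman-numeral conversion using the standard value/symbol table.
function toRoman(n) {
  const table = [
    [1000, "M"], [900, "CM"], [500, "D"], [400, "CD"],
    [100, "C"], [90, "XC"], [50, "L"], [40, "XL"],
    [10, "X"], [9, "IX"], [5, "V"], [4, "IV"], [1, "I"],
  ];
  let out = "";
  for (const [value, symbol] of table) {
    while (n >= value) { out += symbol; n -= value; }
  }
  return out;
}

console.log(toRoman(38)); // prints: XXXVIII
```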
much high level bullshit because in assembler/machine code "goto" is all there is: "jump" and/or "branch" instructions depending on the PSW/PSR condition code...
Like many things in coding, there is nothing wrong with goto when used responsibly.
limies....
Was enjoyable up until the sudden racism.
Sorry, what?
00:30:35?
@@SavingDemo Oh thank goodness, I was worried there for a minute
Goto Should Only be Used for Like Exceptions. A Jump Down 10 Lines Max.
Goto can be Abused. You Can Use it To Shorten Your Programs.
You can Use it to Repeat a Part of Your Algorithm.
I wanted to fit a Programming Exercise on One page of A4.
So I repeated a Part of My algorithm with it. I only did that Once in My Programming Assignments. I wrote Hundreds of Programs.
I only did that once.
The Problem with Writing Spaghetti Code is that over time it get un-maintainable.
Say My Algorithm Changed. The Middle Part. If after a Few Months I changed the Middle Part to change middle part of the Algorithm.
Accidentally I'd Mess Up the Bottom part of the algorithm Without Knowing.