hey prime or editor, i feel like prime's mic is often clipping in these videos recently. maybe his mic gain is too high? love your videos regardless ✌️
"When to write an DSL?" If the DSL solves a Problem AND it is your core business. Good Examples: Dockerfile uses a DSL. The DSL is a tight reflection of how Docker builds the file and the dsl removes a ton of noise in comparison to YAML. More complex requirements can be solved trough RUN Commands and args etc. In comparison saltstack and ansible are using YAML with jinja. Any DSL would be so massiv complex that it wouldnt add any value. In contrast Config languages for e.g. webservers most of the time are not adding great value (see apache,nginx vs. traefik)
35:47 The whole video I was thinking: Nix fixes this. I'm so glad we got there in the end. Dependencies without some kind of Nix approach are hell; why would you do that? I hope there is a Nix of Node, but I fear the answer. That should be the next thing being worked on, it seems very straightforward.
There are lock files, at least 3 variations: package-lock.json, yarn.lock, and pnpm-lock.yaml, which record the hash of the packages, similar to how Nix has flake.lock. Then there is package.json + .npmrc, similar to a flake.nix inputs declaration. Then there is the npm cache folder, similar to the Nix store, but it's not read-only, it just hopes you don't modify the cache. Also it's not symlinked, it's copied. It's a mess, but so was Nix till flakes came along and set a better standard.
How is this any different? This works only for features native to the host language, new ones still need to be implemented separately, otherwise you wouldn't need a DSL in the first place
@@Archimedes.5000 It at least sidesteps the "+1" problem, and the need to bolt on any form of control flow. Imagine if their eDSL looked like this (Kotlin syntax):
```
targets {
    Chrome greaterThan 0
    Firefox atLeast 0 and last(2.5.years)
}
```
Then you can easily do arithmetic on the values, and you could even define reusable conditions etc.
@@Archimedes.5000 In fact, here's a runnable example (just search up Kotlin Playground because I can't post links):
```
import kotlin.time.Duration
import kotlin.time.Duration.Companion.days

fun main() {
    val targets = listOf(
        Chrome atLeast "0",
        Firefox greaterThan "0" last 2.5.years
    )
    println(targets)
}

sealed interface Query
data object Chrome : Query
data object Firefox : Query

data class VersionSpec(val query: Query, val comparedTo: String, val comparisonType: ComparisonType) : Query
enum class ComparisonType { Equals, NotEquals, LessThan, GreaterThan, LessThanOrEquals, GreaterThanOrEquals }
data class LastSpec(val query: Query, val duration: Duration) : Query

infix fun Query.atLeast(version: String): Query = VersionSpec(this, version, ComparisonType.GreaterThanOrEquals)
infix fun Query.greaterThan(version: String): Query = VersionSpec(this, version, ComparisonType.GreaterThan)
infix fun Query.last(duration: Duration): Query = LastSpec(this, duration)

val Double.years get() = (this * 365).days
```
I find it crazy that npm ships minified code. I think it would be really cool to transpile the code once on install with your browserslist settings. This would also make the code in node_modules readable and match what's on git, which is good for people like me who like to go digging around in there.
I don't see any problem in transpiling `node_modules` libraries. I mean, Rust is doing the same thing, even during debug builds. The difference is that Rust is caching the compilation of every individual crate.
Using a polyfill library for a fictional one-liner is not the best strategy. But if you develop something larger, the polyfill footprint becomes negligible fast. A few years ago I compared ES5 and ES6 rollup-minified builds of a reasonably large web app, and the difference in size was about 2%.
16:30 I think we can say the higher the abstraction, the more volatile it is. Likewise, the more that's built on top of something, the more resistant it is to change. Everything uses binary under the hood of the hood of the hood. Most things use C under the hood of the hood. ES5 is buried under enough abstractions that moving away from it creates enough resistance, and there isn't enough need for some to bother. In some ways it's "don't fix what ain't broken"; in other ways, to change the foundation means to change everything that's on top.
"people don't want to sue multiple languages". Is a bit ironic given that the biggest learning cost you'll face is in learning the frameworks & libraries themselves. Which have little to no consistency or staying power in JS.
22:32 What in the heck is this spammer even trying to say? We invented dishwashers decades ago, you don't need AI for that you just need surfactants, hot water, and time.
I stopped using polyfills for web projects about 4 years ago. Saved a lot of space in general and there hasn't been that much added to justify the overhead.
So here is the future: 1. Someone, probably Evan You, comes up with a way to remove bloated dependencies that aren't needed in transpiled code. 2. It's a success, but then people have opinions and 10 other debloat-transpiled-code tools appear. 3. 1-10+ new dependencies for this new package get added to your code base. 4. Someone now comes along and makes the same thing in Rust, claiming an 80% speed improvement at debloating transpiled code. Repeat the same cycle to detect ES6+ code in ES5 code. And as Murphy's law says, if it can happen, it will happen.
I noticed something with these types of videos. It starts interesting and gradually spirals into a spaghetti of web dialect that makes me bored out of my life.
I just went through the package.json and vite config file of the webapp my company builds, and I'm happy to say we don't support legacy browsers. Phew.
I think the solution is to port to JavaScript that Python module that makes sure your code works by deleting lines that throw exceptions until the program no longer throws exceptions. Just imagine: every time your code has an error, whether it's too-modern JavaScript or anything else, it will automatically correct itself by removing the offending code. Perfect solution. Nothing could possibly go wrong. I shall not name it as to not risk getting banned from this mortal realm, but I can say this: the module name starts with a word that is simultaneously a swear word and one of the favorite pastimes for us humans, and it ends with the word "it".
42:07 Well, that answers the question about pr0n... lol. Row 3414 is right there on screen. Interesting that that one isn't a hyperlink for *some* reason 🤣
To me it feels far more likely that the sites were built during a time when ES5 support was required, then moved to not supporting it, but the existing code wasn't cleaned up.
Even when IE 11 was supported by Microsoft, I dropped support for it. Since I was a junior developer, I was still using base JavaScript and had written a fair amount of code IE 11 was fundamentally incompatible with.
I will never understand why companies in a sector where you can fairly easily switch jobs don't offer better pay over time. I love my job and colleagues; the only reason I want to go is the low pay increases. I even know why they can't pay me better, but that's irrelevant to the financial plans I made when signing up. Two years ago I was completely new. They compensated me ~5% more each year, which amounts to at best inflation. Why would you pay a fresh-from-university employee the same or more than a trusted employee? Makes no sense. I get why your salary wouldn't have a big net increase going from year 11 to year 12.
I'm noticing that the 44kb example makes no mention of how tree shaking affects the bundle size. Why would Babel do the tree shaking when that's an existing part of pretty much every build process?
I think library authors should use reasonably up-to-date js and leave it to people using it to transpile. They have to pick library versions and targets anyways.
Tell me you haven't ever maintained a library without actually telling me. This request is just straight up unreasonable. Many libraries are developed in devs' free time. You would always be out of date somewhere. Just a week of having Dependabot installed anywhere made me stop bothering.
"just give 'em a raise" - This is what an entrepreneur would do. But salary determination is rarely done by entrepreneurs. For a manager, success is not defined by how well the company does, but by any metric that can be traced to their decisions. So if a manager fires an employee, that can only be a bad decision, if the absence of that employee can be directly traced to some damage. The savings in salary can always be traced to the decision to fire them. An increase in salary is always an increase in cost and thus at least questionable. If an employee quits, you don't necessarily know why. So it's common sense or guesswork competing with integer numbers. What is more convincing?
There are people who've had to maintain DSLs on a real project and people who have not. I don't think anyone that has had to maintain a DSL on a real project with a moderate level of complexity would ever recommend it to anyone, period :P
As I recall, that DSL grew out of parsing a simple line-based .browserslist file that was just a browser name followed by a version. I think it's pretty crap and JSON would have been better even back then, but I get it. I actually think DSLs with variables and functions etc. are super fun to write... but I've never really managed to figure out editor support for any of my toys. A proper DSL is certainly better than inflicting magic JSON or YAML configs on people, but *most* of the time you're going to want to just use an existing scripting language like JavaScript. Occasionally, though, that results in way too much syntax overhead, or weird performance issues because you don't have access to cache things across threads or runs.
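For the curious, parsing that simple line-based format is almost trivial — a hypothetical sketch, not the actual browserslist implementation (function name is mine):

```javascript
// Each non-empty, non-comment line is "browserName version".
function parseBrowserList(text) {
  return text
    .split("\n")
    .map((line) => line.trim())
    .filter((line) => line && !line.startsWith("#"))
    .map((line) => {
      const [browser, version] = line.split(/\s+/);
      return { browser, version };
    });
}

console.log(parseBrowserList("chrome 49\n# legacy\nfirefox 60"));
```

Which is part of why the later query DSL feels like such a leap: the original format needed no grammar at all.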
@@thekwoka4707 yeah that's what I'm realising now. I'm not super familiar with JS, I come from a JVM background where DCE is standard and not worried about at all.
This just goes to show that Angular is miles ahead of every major lib out there, as they started shipping ES6+ bundles long ago and even created the Angular Package Format, which streamlined library authoring; pretty much every Angular library also ships ES6 bundles just like Angular core. React and its ecosystem simply don't stand a chance in comparison.
A turnover rate of under 2 years over as little as a 10% increase in pay is criminal. Something is clearly messed up in the industry that this is accepted.
Yes, supporting ES5 is a waste of resources, but it sounds like the author forgot that advertisements and embedded content exist, instead assuming every piece of JS returned by a site is in its control.
I was confused seeing the thumbnail with at(-1). I thought at would just be an offset relative to the pointer, so a negative one wouldn't be allowed. If you want that, you'd just use length-1. Hell, I wouldn't even use the at method, just [] indexing. Can we stop adding useless functions to languages like these and just learn to code again, just do simple pointer arithmetic.
This isn't C though.. Array could just as well be implemented as a linked list for all you know. Negative indexing as a shorthand for "from the end" is a very common language construct, and also reduces noise for the common action picking the last (few) elements of an array. And yes, the whole point of the method is to support negative indices. So if you're not expecting negative values you shouldn't use it.
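For anyone skimming the thread, the two spellings being argued about behave like this in plain JS (assuming an ES2022+ runtime for `.at`):

```javascript
const xs = [10, 20, 30];

console.log(xs[xs.length - 1]); // 30 — classic "from the end" indexing
console.log(xs.at(-1));         // 30 — ES2022 shorthand for the same thing
console.log(xs.at(5));          // undefined — out of range, no wraparound
console.log(xs[-1]);            // undefined — [] never counts from the end
```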
@@leeroyjenkins0 Linked lists aren't circular, so the first node doesn't have a pointer set to the end. Otherwise it's a circular buffer, and it wouldn't be an array. An array is a contiguous piece of memory the size of the number of elements you specified. You can defend it, but it's syntactically inconsistent. If you want such behavior, write your own method that does it. In a proper language this shouldn't have a place, or at least do it like list[::-1], which is at least more explicit. I hate it; all this syntactic sugar is making a language unnecessarily big and inconsistent. Not a fan.
@@CallousCoder Nope, this is JavaScript. An Array is an object whose numerical properties are special-cased. Doing array[N] is simply setting or getting a property of the object; you're free to handle that however you want as the implementer of the interpreter. `let x = []; x[15] = 1; x[0] === undefined; x.length === 16; x[20] = undefined; x.length === 21;` — it's all valid JS despite never setting a length when creating the Array. JS doesn't expose memory addresses; there's no concept of contiguous sections of memory you can request. You can implement an Array any number of ways, even though typically it's mostly just a C-style array of some kind. Yes, you can implement it in your own util... but why would you? You can implement indexing yourself too; why don't you just do actual pointer arithmetic `int* x = malloc(...); int x1 = *(x + 1);` instead of the bloat that is `int x1 = x[1];`?
@@leeroyjenkins0 All for pointer arithmetic 🤣 But seriously, indexing is fine of course. It's the extra method .at() that is already a bit too much added functionality to me, because you can index already. And then at(-1) is syntactically inconsistent, because it would suggest base pointer -1. Perhaps I'm a syntactical purist 😉 I love a language to be tiny and not have all that frivolity that also adds weird inconsistencies that require you to know how the method is implemented. I also know now why I hardly use JavaScript 😂 Although C++ also has its horrible extra ease-of-use abstractions. We should just go back to C or assembly 😉 But thanks for the enlightening JavaScript deep dive.
It is I, Polyfloat Bill.
0:30 😂
"They thought no-one could master hydrodynamics until he came to town"
I love this
hi Bill
Things change, Stanford Pines! *T̺͛H̭̅I͔̞̾͑N̘͖̬ͫ͋ͥG͚ͯͭ̅ͅͅS̩̙͎̭ͫ̈̿ ̺̳̥̘̉ͭ͌C͕̠̙̥̄ͧ̽H͚̩̭͉̅͋̃Á͉͈͍͉͋͒N͈̪͎͂͛ͦG̘͎̗͗ͭ͋E̮̓ͫͅ* HAHAHAHAHA
I think the part with Date(NaN) is the best example for the fact that all behaviour and edge cases are also an implicit part of the API. Imagine how much of the internet would break if an update changed the error message for a few nonsensical pieces of code.
That's why `!== String(new Date(NaN))` is better than `!== "Invalid Date"`
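A sketch of what that buys you: comparing against the engine's own stringification instead of a hard-coded literal, so the check can't drift from the runtime (names here are mine, not from any library):

```javascript
// The sentinel comes from the engine itself, not a hard-coded string.
const INVALID = String(new Date(NaN)); // "Invalid Date" in current engines

function isValidDate(d) {
  return d instanceof Date && String(d) !== INVALID;
}

console.log(isValidDate(new Date(2020, 0, 1))); // true
console.log(isValidDate(new Date("nonsense"))); // false
```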
Can someone enlighten me why it is so important to keep sane the output of this function , to such extent, that it requires a separate library code to cover this case?
It looks like some people can't stand without wiping asses of everyone who writes absurdly broken & non-functional code, to make sure it keeps being absurd and broken even after you upgrade the javascript version. :D
@@igorstasenko9183 Because there might be some website somewhere that checks for that output to determine something important, and if the message changes by even one character it won't match and everything breaks. It might even be a TEST that expects that output, and changing it would cause all the tests to fail, which could cost someone time and money figuring out what changed. So instead, the spec specifies the message, and the polyfill makes sure the message matches the spec even if it's a silly thing like that. There are dozens of other messages that are way more important than that one, but the author of the polyfill isn't going to decide which ones seem important and which ones don't; they are just going to make sure they are all to-spec, just in case. That is, after all, the point of the polyfill. If we didn't care that all the edge cases were covered, we would just write our own `Array.prototype.at = function...` replacement. It's not as if that's a complicated function, after all! But we want anything like a polyfill to be extremely reliable; that's why we use them. So they need to cover EVERYTHING in the spec.
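To make the contrast concrete, here's roughly what that naive replacement looks like — a sketch only, deliberately skipping parts of the spec's full coercion rules that a real polyfill has to honor:

```javascript
// Naive stand-in for Array.prototype.at — fine for app code,
// but NOT a spec-complete polyfill.
function at(arr, n) {
  n = Math.trunc(n) || 0;       // rough ToIntegerOrInfinity: NaN and -0 become 0
  if (n < 0) n += arr.length;   // negative indices count from the end
  return n < 0 || n >= arr.length ? undefined : arr[n];
}

console.log(at([1, 2, 3], -1)); // 3
console.log(at([1, 2, 3], 5));  // undefined
```

The gap between these ten lines and a shippable polyfill is exactly the pile of edge cases the comment above is describing.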
This would be great, then there are even more ways to differentiate between versions lol
@@igorstasenko9183 Because people can't left-pad strings and if new Date(NaN) casts to some other string value, there's a small-but-not-zero possibility of ICBMs being launched due to an invalid date being treated as valid. And ICBMs being launched is not part of ECMAScript, so Babel is keeping things nice and safe.
"I know I'm the last committer, I committed atrocities" gonna steal this when my current project ends at the end of the month
“You didn’t commit code, you committed a crime!” 😂
I think the slow rollout of WASM is generally a response to the state of JS and the history of browser compatibility. There's nothing I'm aware of stopping it from being a complete replacement for JS. The problem is that we all have to live with the decisions made about it for the rest of our careers, and nobody wants to make a mistake that we're all working around for the next 20 years because 100,000 websites depend on that mistake.
As long as developers keep using JS on the server and in place where they don't have to use JS, I don't have a ton of hope for WASM. The idea is fine and I also liked asmjs. But we have generations of developers who don't realize the problems with JS just don't exist in other languages.
@@username7763 well, there are also a lot of developers who (would like to) not use JS
tbf, a lot of them are currently in backend or an entirely different field in IT, but still, they exist and when WASM can actually replace JS, I could see the field becoming considerably more competitive
> There's nothing I'm aware of stopping it from being a complete replacement for JS
As of now, no DOM API
@@ivanjermakov That's the root of my point. There's nothing absolutely stopping someone from just shoveling some version of a DOM API out the door, if your only goal is to ship it. I feel that the reason it hasn't been done yet is because it's not as simple as just making it work.
@@alexanderjordan2506 it's not as simple and not anyone can do this, browsers must natively allow DOM access.
As a non-JavaScript developer I thought that this ES6 thing was some new thing created literally yesterday (as is customary in the JS world, famous for its relentless speed), but then I googled it and realized that ES6 was released almost 10 years ago and the current one is the 15th version, lol. And the vast majority of websites are serving minced code, processed god knows how and containing shreds of different versions... Honestly, I'm glad I never bit the bullet and dived into that stuff.
Yeah, and except for modules, which everything current supports, ES6 has been in the browser for roughly as long. I stopped using the polyfills several years ago now.
I wouldn't use them today at all.
The problem is unpredictable client machines. I write Node.js backends, where on a server we control what runtime version we use. So in that situation it's much easier to use recent standards such as ES2023 without any polyfill.
Most client browsers are self updating and less than a year behind as it stands.
HOLD UP! IE 11 is not dead. They "killed it", but they literally made it so Edge can delegate to the Trident engine and they committed to supporting that to 2029. The company I work for literally had some dumb butt medical client complain that a report broke because we deployed a feature that didn't work in IE11 and we were all like "WTF? You are doing what!? You're using COMPATIBILITY MODE with IE IN EDGE!?"
Also, while that doesn't work for Mac Edge, it does still work in Windows Edge... there are legitimately clients being stupid about needing Trident rendering still...
@@anobodyscontentstream5347 When you're farther behind in technology than the govt... I've worked on several banking and govt projects, and legacy support is no longer required.
If they want to use a modern app, they should have a modern browser. It's a matter of security and should be stated as such. #NoFix is my general response to these things
At my job they still have some old terminals where they use the IE compatibility mode in Edge.
The worst part is that there are things that work in actual IE, actual Edge, but not in Edge compatibility mode...
It's dead enough that I would not want any modern tool to still target IE
The code is probably using Date to check JS engine version.
What
I think he realized it shortly after
@@JamesJansson I forgot that the behavior didn't use to be common across browsers. It's been so long.
It's kind of like the select box in IE 5.0.0, or the JSON hydration function when extending Array or Object prototypes in IE8. It's just something I don't think about any more.
The IE 5.0.0 issues were painful, as that point release was burned to discs, and even though it was fixed in 5.0.1, many just used what came with Office 2000 and never upgraded. It supported the newer DOM mode but not the old list API.
Safari is the new IE11
Due to our platform's OS support we need to support Safari 10.10.
Safari supports more of the spec than Firefox does
@@thekwoka4707 Would be great if they could support the parts I need, because if I have browser issues, they tend to be caused by Safari.
Firefox is always exactly to spec. They don't implement and ship draft specs before they are finalized, which is probably what you are referencing. Chrome is notorious for introducing new APIs that are later deprecated because the draft spec changes.
@@thekwoka4707 on desktop. Safari on mobile is trash and the fact that it's tied to the OS version is even worse. Mobile Chrome is basically identical to desktop chrome and way ahead of safari. The moment Mozilla started actively refusing to add features from the spec is the moment I stopped giving a fuck about ff compatibility.
First off: There *is* a proposal for wasm host bindings, which would enable direct DOM access for wasm in the browser!
Second: It is *far* less than 1 MB.
For example the 1000 row benchmark - leptos 0.6.3 is ~200kb uncompressed and 60kb compressed.
And while those numbers are still pretty high, leptos 0.7 promises significant improvements on that.
Just to put it in perspective: Super Mario World is 1 MB.
Compiler optimizations are magic; JavaScript transpilers are dark magic
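In case "dark magic" sounds abstract, this is the kind of rewrite a transpiler performs when targeting ES5 — a hand-written illustration, simplified from what Babel actually emits:

```javascript
// ES6+ source:
const double = (xs) => xs.map((x) => x * 2);

// Roughly what an ES5 target gets instead:
var doubleES5 = function (xs) {
  return xs.map(function (x) {
    return x * 2;
  });
};

console.log(double([1, 2]));    // [ 2, 4 ]
console.log(doubleES5([1, 2])); // [ 2, 4 ]
```

The "dark" part is everything this sketch skips: `this` capture, generators, async/await state machines, and the runtime helpers they drag in.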
uncurryThis(Nuts)
I wasted my youth on making rounded corners in IE6. Times before HTTP/2 were crazy.
The true follies of youth. Curves, all kinds of curves.
I remember when fancy layouts were made by using pictures with every designed element on them, like a spritesheet. Then we used tables and these sprites as background images to make a "beautiful" website. Many rounded corners and things like buttons were actually just pictures because CSS/browsers didn't support this yet. Today you can do it all in just HTML/CSS lol. I hope I'm not the only one who remembers and had to suffer through this hahaha
@@DatMilu2K You aren't. I remember when we discovered that repeating a tiny square image can make the website load faster. Spent hours figuring out which pixels to chop/add to make the pattern look cool. Everything was tables, literally everything. I assumed all curves were just images. Never even considered the possibility it could be done any other way.
@@josephvictory9536 tables take me back to high school 20 years ago. It was a simpler time.
What does HTTP/2 have to do with round corners?
The death of IE was wonderful, but it was like blowing up the Death Star only to see a second one was finished and sitting right behind it waiting to attack the whole time in the form of Edge.
Edge is good though.. and based on chromium.
Nah, the new IE is Safari
@@josephvictory9536 Don't forget that the modern Edge is a completely different browser that just happens to share the name of the original Edge that replaced IE.
And it runs in the background doing god-knows-what from startup
Underrated comment
Some developers stick with ES5 to keep their code working on older browsers like Chrome 49, which still runs on Windows XP.
Might as well just tell the users to use Supermium instead of targeting such an old version of Chrome
At that point just don't write JavaScript at all
There are more modern browsers for winxp. There's no need to use Chrome 49 anymore.
Don't, just don't
Doesn't windows xp get hacked the second you turn it on?
Rocking that screen tear for more than a year now, that's dedication
for real
call TOM we need genius
J-Diesel truly is the hero we need to fix this
The kind of genius that will blame interns for not using ES5 everywhere, to make the codebase compatible with everything it was already compatible with, right?
On the topic of "tech projects are a snapshot of the time they're made":
The project I'm working on at my company had to upgrade frameworks since they're no longer supported, and it's still half Zend Framework 1, half ZF 2. It truly is a snapshot, even if a bit of long exposure was used.
5:17 Comparing `new Date(NaN)` seems like checking if the browser supports a specific feature
It will give a different output based on the JS engine being used. It's a hack to detect the engine version
I was required to use babel at a mega corpo, and it caused a bunch of our deployments to end up super bloated for languages that didn't even need to use it in the first place...
Now I want a video where Prime doesn't say "No" to any of the rabbit holes. It can be 10 hours long.
You look to me like Dr Disrespect's evil twin brother, Dr Respect
Or would it be High-School Drop-out Disrespect?
I wouldn't want to be compared to him before seeing the messages...
Man... Now I cant unsee it
Dr DatRespect
Evil dr disrespect be like: I'm not going to be a pedo
respek for mentioning Closure Compiler - a massively awesome and criminally underappreciated tool
5:30 look, undefined behavior in JS!
Why is anyone transpiling ES6 right now? It's literally a decade old... A browser from ten years ago doesn't even support TLS1.3 lol
im so happy i started just writing my own framework years ago, its 7kb unoptimized and uncompressed and just werks for all my purposes
hey prime or editor, i feel like prime's mic is often clipping in these videos recently. maybe his mic gain is too high? love your videos regardless ✌️
"When to write an DSL?" If the DSL solves a Problem AND it is your core business. Good Examples: Dockerfile uses a DSL. The DSL is a tight reflection of how Docker builds the file and the dsl removes a ton of noise in comparison to YAML. More complex requirements can be solved trough RUN Commands and args etc. In comparison saltstack and ansible are using YAML with jinja. Any DSL would be so massiv complex that it wouldnt add any value. In contrast Config languages for e.g. webservers most of the time are not adding great value (see apache,nginx vs. traefik)
That code is something else hahahaha what fun. TY for sharing.
Yooo, Happy Birthday big guy!
35:47 The whole video I was thinking: Nix fixes this. I'm so glad we got there in the end. Dependencies without some kind of Nix approach is hell, why would you do that. I hope there is The Nix of Node, but I fear the answer. That should be the next thing being worked on, it seems very straightforward.
There are lock files, at least 3 variations (package-lock.json, yarn.lock, pnpm-lock.yaml) which take the hash of the packages, similar to how Nix has flake.lock. Then there is package.json + .npmrc, similar to a flake.nix inputs declaration. Then there is the npm cache folder, similar to the Nix store, but it's not read-only, it just hopes you don't modify the cache. And it's not symlinked, it's copied. It's a mess, but so was Nix till flakes came along and set a better standard.
That's why eDSLs are goated. Your base language solves the error-handling and +1 problem and all that.
How is this any different?
This works only for features native to the host language, new ones still need to be implemented separately, otherwise you wouldn't need a DSL in the first place
@@Archimedes.5000 It at least surpasses the "+1" problem, and needing to add any form of control flow or anything.
Imagine if their eDSL looked like this (Kotlin syntax):
```
targets {
    Chrome greaterThan 0
    Firefox atLeast 0 and last(2.5.years)
}
```
Then you can easily do arithmetic for their values, and you could even define reusable conditions etc
@@Archimedes.5000 In fact, here's a runnable example (just search up Kotlin Playground because I can't post links):
```
import kotlin.time.Duration
import kotlin.time.Duration.Companion.days
fun main() {
    val targets = listOf(
        Chrome atLeast "0",
        Firefox greaterThan "0" last 2.5.years
    )
    println(targets)
}

sealed interface Query
data object Chrome : Query
data object Firefox : Query

data class VersionSpec(val query: Query, val comparedTo: String, val comparisonType: ComparisonType) : Query

enum class ComparisonType {
    Equals,
    NotEquals,
    LessThan,
    GreaterThan,
    LessThanOrEquals,
    GreaterThanOrEquals
}

data class LastSpec(val query: Query, val duration: Duration) : Query

infix fun Query.atLeast(version: String): Query = VersionSpec(this, version, ComparisonType.GreaterThanOrEquals)
infix fun Query.greaterThan(version: String): Query = VersionSpec(this, version, ComparisonType.GreaterThan)
infix fun Query.last(duration: Duration): Query = LastSpec(this, duration)

val Double.years get() = (this * 365).days
```
33:23 comrade prime.
Aye happy birthday! Didn’t know our bdays were so close
That webpack comment hit hard… I have no idea how that shit works on the thing I'm working on now…
Someone setup webpack once a long time ago, and now all devs are copying the same webpack config to every new project.
So true and so sad. Going to work and using Webpack 4... after using Vite + SWC... on personal projects.
I find it crazy that npm ships minified code. I think it would be really cool to transpile the code once on install with your browserslist settings. This would also make the code in node_modules readable and match what's on git, which is good for people like me who like to go digging around in there.
Would be even cooler if the article told you why the heck you need generators (or any of the other included stuff) for this. But still an interesting topic.
I don't see any problem in transpiling `node_modules` libraries. I mean, Rust is doing the same thing, even during debug builds. The difference is that Rust is caching the compilation of every individual crate.
14:50 You can redefine functions, you can't redeclare variables (and, therefore, arrow functions)
Using a polyfill library for a fictional one-liner is not the best strategy. But if you develop something larger, the polyfill footprint becomes negligible fast. A few years ago I compared ES5 and ES6 rollup-minified builds of a reasonably large web app, and the difference in size was about 2%
11:00 Technically etcd is a kind of database (just not a sql relational database), so (almost) anyone using k8s is using a database for config.
Happy birthday code daddy
16:30 I think we can say the higher the abstraction, the more volatile it is. Likewise, the more that's built on top of something, the more resistant it is to change. Everything uses binary under the hood of the hood of the hood. Most things use C under the hood of the hood. ES5 is buried under enough abstractions that moving away from it creates enough resistance and doesn't have enough need for some to bother. In some ways, it's a "don't fix what ain't broken", in other ways it's that to change the foundation means to change the everything that's on top.
Webdevs will tell you this is good, because at least they don't have to deal with types
"people don't want to sue multiple languages". Is a bit ironic given that the biggest learning cost you'll face is in learning the frameworks & libraries themselves. Which have little to no consistency or staying power in JS.
22:32 What in the heck is this spammer even trying to say? We invented dishwashers decades ago, you don't need AI for that you just need surfactants, hot water, and time.
I stopped using polyfills for web projects about 4 years ago. Saved a lot of space in general and there hasn't been that much added to justify the overhead.
So here is the future -
1. Someone, probably Evan You, comes up with a tool that removes bloated dependencies that aren't needed in transpiled code.
2. It is a success, but then people have opinions and build 10 other debloat-transpiler tools.
3. 1-10+ new dependencies for this new package get added to your code base.
4. Someone now comes along and makes the same thing in Rust, claiming an 80% speed improvement at debloating transpiled code.
Repeat the same cycle to detect ES6+ code in ES5 code.
And as Murphy's law says, if it can happen, it will happen.
Happy Birthday!
shipping polyfills for my polyfills never got more exciting
Arrow functions do not introduce a new dynamic `this` binding (in other words, `this` has a different meaning inside a regular function)
Never managed to write that "target browser version" string on the first try.
Everytime you pronounce Vite the way you do (like bite) and not in French like it's supposed to be, a baby kitten dies asphyxiated by a croissant.
I noticed something with these types of videos. It starts interesting and gradually spirals into a spaghetti of web dialect that makes me bored out of my mind.
browserslist targets evergreen browsers.
You should still make sites that work without Javascript, less pain, faster loading, can be archived for the future.
I just went through the package.json and Vite config file of the web app my company builds, and I'm happy to say we don't support legacy browsers. Phew.
I think the solution is to port that Python module that makes sure your code works by deleting lines that throw exceptions until the program no longer throws exceptions. Just imagine: every time your code has an error, whether it's too-modern JavaScript or anything else, it will automatically correct itself by removing the offending code. Perfect solution. Nothing could possibly go wrong.
I shall not name it as to not risk getting banned from this mortal realm, but I can say this: The module name starts with a word that is simultaneously a swear word and one of the favorite pastime for us humans, and it ends with the word "it".
42:07 Well, that answers the question about pr0n... lol. Row 3414 is right there on screen. Interesting that that one isn't a hyperlink for *some* reason 🤣
Last time I looked at Angular, it did just that: split bundles
To me it feels far more likely that the sites were built during a time when ES5 support was required, then moved to not supporting it, but the existing code wasn't cleaned up.
Even when IE 11 was supported by Microsoft, I dropped support for it. Since I was a junior developer still using base JavaScript, I had written a fair amount of code IE 11 was fundamentally incompatible with.
The real issue was using Babel. Ain't nobody using Babel
What you think is the Canadian way of saying idempotent is actually just the correct way, for everyone.
wait till they find out about a hello world in next
34:19 ✨ pʰɑɹ'sɛl ✨
I will never understand why companies in a sector where you can fairly easily switch jobs don't offer better pay over time. I love my job and colleagues; the only reason I want to go is the low pay increases. I even know why they can't pay me better, but that's irrelevant to the financial plans I made when signing up. Two years ago, I was completely new. They compensated me ~5% more each year, which at best amounts to inflation. Why would you pay a fresh-from-university employee the same or more than a trusted employee? Makes no sense. I get why your salary wouldn't have a big net increase going from year 11 to year 12.
eternally thankful that I'm not in webdev
I'm noticing that the 44kb example makes no mention of how tree shaking affects the bundle size. Why would Babel do the tree shaking when that's an existing part of pretty much every build process?
Why does he say Babel like that
Because he wants to lose subscribers
I think library authors should use reasonably up-to-date js and leave it to people using it to transpile. They have to pick library versions and targets anyways.
Tell me you haven't ever maintained a library without actually telling me.
This request is just straight up unreasonable. Many libraries are developed in devs' free time. You would always be out of date somewhere.
Just a week of having dependabot installed anywhere made me stop bothering.
"just give 'em a raise" - This is what an entrepreneur would do. But salary determination is rarely done by entrepreneurs. For a manager, success is not defined by how well the company does, but by any metric that can be traced to their decisions. So if a manager fires an employee, that can only be a bad decision, if the absence of that employee can be directly traced to some damage. The savings in salary can always be traced to the decision to fire them. An increase in salary is always an increase in cost and thus at least questionable. If an employee quits, you don't necessarily know why. So it's common sense or guesswork competing with integer numbers. What is more convincing?
Arrow functions close over this, other functions don't. That's IMO the most important difference.
If only we could trust the user agent string, then we could only serve this nonsense to the users who insist on using old or bad browsers.
There are people who've had to maintain DSLs on a real project and people who have not. I don't think anyone who has had to maintain a DSL on a real project with a moderate level of complexity would ever recommend it to anyone, period :P
As I recall, that DSL grew out of parsing a simple line-based browserslist file that was just a browser name followed by a version.
I think it's pretty crap and JSON would have been better even back then, but I get it.
I actually think DSLs with variables and functions etc are super fun to write... but I've never really managed to figure out editor support for any of my toys.
A proper DSL is certainly better than inflicting magic JSON or YAML configs on people, but *most* of the time you're going to want to just use an existing scripting language like JavaScript. Occasionally, though, that results in way too much syntax overhead, or weird performance issues because you don't have access to things like caching across threads or runs.
If they ever get around to actually shipping WASI, those WASM problems will be solved. Direct DOM API bindings, browser-provided GC and heap, et al.
Bruh if we can't stop transpiling to ES5 because people don't want to stop supporting IE9 or whatever we'll never adopt WASM.
Isn't tree-shaking (dead code elimination) a thing though? Or is that hard to do with transpiling because polyfills end up in obscure places?
You can't really eliminate dead code like that, since it's... not dead. It's injected to run
@@thekwoka4707 yeah that's what I'm realising now. I'm not super familiar with JS, I come from a JVM background where DCE is standard and not worried about at all.
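To make the thread's point concrete: a polyfill exists purely for its side effect of patching a global, so a bundler can't prove it unused and shake it out. Here's a hypothetical minimal shim for `Array.prototype.at` (deliberately not the full spec-compliant polyfill the video discusses) that shows the shape:

```javascript
// Hypothetical minimal shim — NOT the spec-complete polyfill.
// Mutating Array.prototype is a side effect; nothing "imports" this code,
// it just runs, which is exactly why dead-code elimination can't remove it.
if (!Array.prototype.at) {
  Object.defineProperty(Array.prototype, "at", {
    configurable: true,
    writable: true,
    value: function at(n) {
      const i = Math.trunc(n) || 0;          // coerce; NaN and -0 become 0
      const k = i < 0 ? this.length + i : i; // negative counts from the end
      return k >= 0 && k < this.length ? this[k] : undefined;
    },
  });
}
console.log([1, 2, 3].at(-1)); // 3
console.log([1, 2, 3].at(9));  // undefined
```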
A `function` has different `this` semantics from arrow functions
The longer we go on, the more I feel transpiling was just a mistake to begin with.
I'm at the point where the more shit a programming language is the more it gets adopted... the end is nigh
Let's not tell web developers about esolangs then.
Oh look, it's like autoconf shell code but in javascript.
Damn, Swoleagen
Why not look at the user agent to decide which JS code to ship?
Although I don't have OCD, his text highlighting still triggers some kind of itch in my brain.
Tbf… 89 is just 68 rotated 180 degrees.
This just goes to show that Angular is miles ahead of every major lib out there, as they started shipping ES6+ bundles long ago and even created the Angular Package Format, which streamlined library authoring; pretty much every Angular library also ships ES6 bundles just like Angular core. React and its ecosystem simply don't stand a chance in comparison.
A turnover rate of under 2 years over as little as a 10% increase in pay is criminal. Something is clearly messed up in the industry for this to be accepted
What does only 1 fedora per group REALLY MEAN?!
Whoever wrote this has no experience with web development! Nobody in web dev says "website developer"
Greatest day for all of web development was when I could stop supporting IE6.
Terraform > Pulumi.
Sometimes a DSL can be useful.
67.54 lol they did use the ceiling
Holding stocks in 2024 doesn't sound like a good idea though...
Yes, supporting ES5 is a waste of resources, but it sounds like the author forgot that advertisements and embedded content exist, and instead assumed every piece of JS returned by a site is under its control.
the JS ecosystem is a horror show
My client uses Chromium 64 and says it is not possible to upgrade for at least 5 more years. ;-)
I was confused seeing the thumbnail with at(-1). I thought at would just be an offset relative to the pointer, so negative wouldn't be allowed; if you want that you'd just use length-1. Hell, I wouldn't even use the at method, just [] indexing. Can we stop adding useless functions to languages like these and just learn to code again, just do simple pointer arithmetic.
This isn't C though.. Array could just as well be implemented as a linked list for all you know.
Negative indexing as a shorthand for "from the end" is a very common language construct, and also reduces noise for the common action picking the last (few) elements of an array.
And yes, the whole point of the method is to support negative indices. So if you're not expecting negative values you shouldn't use it.
@@leeroyjenkins0 linked lists aren't circular, so the first node doesn't have a pointer set to the end. Otherwise it's a circular buffer.
And it wouldn’t be an array. An array is a contiguous piece of memory the size of the number of elements you specified.
You can defend it, but it's syntactically inconsistent. If you want such a behavior, write your own method that does it. In a proper language this shouldn't have a place. Or at least do it like, for example, list[::-1], which is at least more explicit.
I hate it; all this syntactic sugar is making a language unnecessarily big and inconsistent. Not a fan.
@@CallousCoder nope, this is JavaScript. An Array is an object where you can only use numerical properties. Doing array[N] is simply setting or getting a property of the object. You're free to handle that however you want as the implementer of the interpreter. "let x = []; x[15] = 1; x[0] === undefined; x.length === 16;" is all valid JS despite never setting a length when creating the Array. JS doesn't expose memory addresses; there's no concept of contiguous sections of memory you can request. You can implement an Array any number of ways, even though typically it's mostly just a C-style array of some kind.
Yes, you can implement it in your own util... but why would you? You can implement indexing yourself too; why don't you just do actual pointer arithmetic "int* x = malloc(...); int x1 = *(x + 1);" instead of the bloat that is "int x1 = x[1];"?
@@leeroyjenkins0 all for pointer arithmetic 🤣 But seriously, indexing is fine of course. It's the extra .at() method that already is a bit too much added functionality to me, because you can index already. And then at(-1) is syntactically inconsistent, because it would suggest base pointer minus 1.
Perhaps I'm a syntactical purist 😉 I love a language to be tiny and not have all that frivolity that also adds weird inconsistencies requiring you to know how the method is implemented. I also know now why I hardly use JavaScript 😂 Although C++ also has its horrible extra ease-of-use abstractions. We should just go back to C or assembly 😉
But thanks for the enlightening JavaScript deep dive.
the name is
Bond
@@adnan7698bondagen
20:00 superintendent! I was just stretching my JS skills on the server. Isomorphic exercise! Care to join me?