javascript? NO THANKS. learn ASSEMBLY at lowlevel.academy (and get 20% off)
Yeah but I need something to run on the server though - I heard JS is the best for that
@@Kane0123 🤔
Assembly? NO THANKS. Only self-made compilers running on TempleOS
@@Kane0123 😂
@@spythere a true man of culture right here, or there? 👍
btw Mozilla uses SpiderMonkey, not V8
thank you
and WebKit (Safari) uses JavaScriptCore. Only Chromium uses V8
@@se7ense7ense7ense7ense7en and yet it's still most browsers!
@@se7ense7ense7ense7ense7en not only Chromium; all Chromium-based browsers use V8
it's called Gecko, not SpiderMonkey???
Every project that uses NPM is basically a security problem too. Set up a basic project and you already have a billion dependencies nobody knows what they do.
using a web browser scripting language for anything else...
using it for server-side programs
...
using an especially ill-suited web browser scripting language for server-side programs.........
ALSO,
I just KNEW hosting a couple of small files instead of linking CDNs was safer.
The worst thing is, this happens outside of NPM too. `sudo apt install nodejs npm` on Debian pulls ~300 dependencies, most of which match "node-*".
makes me think of the 'kik' npm debacle
Seriously. React takes many minutes to download just because of all the random dependencies and their dependencies and their dependencies' dependencies.
It's dependency diarrhea, and I don't think anything has it as bad as JavaScript. I think it's just because of the batteries-not-included nature of Node: you need to find little modules to do everything, and this causes dependency hell for the simplest libraries.
@@BeefIngot that's why I prefer compilers like Svelte. No CDN or dependencies, just serving pure HTML/CSS/JS to the browser. Always wondered what would happen to React websites if Meta's CDN went down
Web dev here. Although a V8 hack is possible, I am almost certain this code is actually intended to steal user sessions, user input, or other security tokens.
It's especially useful if you get an admin session or credentials on things like WordPress, as from there you can hack the server and use it as a bot farm for DDoS, or hope that WP will give you access to other systems.
ah interesting. thank you!
LinusTechTips got hacked using exactly this kind of session-token theft
And this is why you don't store session tokens in localStorage or non-HttpOnly cookies, folks. Fortunately WordPress sends its session in an HttpOnly cookie, so they wouldn't be affected unless the user of that WP instance uses a plugin that happens to bypass this security feature.
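For anyone wondering what that flag actually looks like server-side, here's a minimal sketch using Node's built-in http module (the cookie name and value are made-up examples): a cookie set with HttpOnly never shows up in document.cookie, so injected third-party JS can't simply read and exfiltrate it, even though it can still act as the user on that page.

```js
// Minimal sketch (illustrative values): send the session cookie with HttpOnly
// so client-side scripts, including injected ones, cannot read it.
const http = require('http');

http.createServer((req, res) => {
  res.setHeader(
    'Set-Cookie',
    'session_id=abc123; HttpOnly; Secure; SameSite=Lax; Path=/'
  );
  res.end('logged in');
}).listen(3000);
```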
JS loaded from some other domain (like a CDN) still runs in the including page's origin, but it cannot read HttpOnly cookies at all, so stealing those sessions is not possible in such a case. This malicious code can do other things, though, like opening a popup or overlay showing a fake Google login page to fool users into giving up their credentials.
@@ankur-dhama the issue is that a lot of cookies are not HttpOnly, especially on old sites. Getting at localStorage cross-site is also quite hard, but there were ways around that (at least there were). Input listening is still pretty common. Popups and redirects are common, but I kind of assumed it's for a DDoS farm, as the polyfill DDoSed some pages. Might have been a wrong assumption.
If you are including scripts from a CDN, you should always use the integrity="sha..." attribute. Browsers have supported the feature for 5+ years, and it protects you from supply chain attacks: the browser will refuse to load the script if the checksum does not match.
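For reference, an SRI-protected include looks roughly like this; the URL and hash below are placeholders, not real values:

```html
<!-- The browser hashes the downloaded file and refuses to execute it if the digest
     doesn't match the integrity value. crossorigin is needed for cross-origin CDNs. -->
<script
  src="https://cdn.example.com/some-lib.min.js"
  integrity="sha384-REPLACE_WITH_THE_REAL_BASE64_DIGEST"
  crossorigin="anonymous"></script>
```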
@@klausgrnbk6862 That wouldn't work with this Polyfill service, as it's not a static file. It sends back the polyfills that the browser requesting the URL needs. So for most modern browsers it will return nothing. For older browsers, it returns whatever polyfills that particular browser needs.
@@klausgrnbk6862 thanks for this. I've never heard of this attribute before and will start using it in the future.
@@klausgrnbk6862 is there a perf hit with this?
And also a CSP header, which can easily be set up to refuse connections to any googie-woogie domain names
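As a rough sketch (the domain is a placeholder), a Content-Security-Policy response header that only allows scripts from your own origin and a CDN you explicitly trust would look something like:

```
Content-Security-Policy: script-src 'self' https://cdn.example.com; object-src 'none'; base-uri 'self'
```

Any script source pointing at an unlisted domain, fake "googie" analytics included, is then refused by the browser.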
The web is the only place where it's generally accepted to run 100k lines of code* to render text
Disclaimer: If you're reading this as a smart expert: I'm talking about application code. We have abstractions for a reason 🙂
This comment took 252k lines of code to render and it's still running in the background, doing post-rendering tasks. 👍
You think it takes less than 100k loc to open your terminal app? This is a popular, lazy, incoherent take. It's not an issue with code size as much as it is with trust and sourcing.
@@B0wser998 gotta need to emulate the CPU instructions in the background
@@chipmo that's not application code.
Let's not remove the browser, JS and CSS engines from the equation if you really want to change topics.
PS: your browser adds a terminal on top
text rendering is generally a very complicated task
The code is pretty easily de-obfuscated. All it does is attempt to redirect you to other (probably malicious) websites. It has a few interesting features, like its own custom base64 decoder, its own implementation of RC4, and some code to check if you have an admin cookie set (probably so it won't redirect the developer.) But it's definitely not some kind of memory exploit.
Yeah I realize that now. Another issue is those sites you're redirected to could also be doing the memory-exploity stuff. My bad on that.
@@LowLevelTV wouldn't it be a waste of resources, even for a state-sponsored attacker, to burn a V8 0-day on some random people who used their CDN?
I would imagine that if you had such an exploit you could do much more than just that.
Didn't the XZ exploit contain a base64 decoder and encryption implementations too?
There are also attacks on browsers that don't need a vulnerability in the JS engine.
One could for example:
- mine crypto currency
- attack other hosts (ddos)
- collect user data (phishing)
- record user interactions
- crash or modify websites
Which, considering how many applications are web based nowadays, is already really bad.
@@LowLevelTV Releasing a memory exploit to the public like that wouldn't make sense, because as time goes on it gets harder to find new ones. More likely it's just going to direct users to some phishing site. Memory exploits are probably reserved for high-value targets to avoid getting them patched.
when the child says googie : 🥰
when the hacker says googie : 💀
true hackers say googIe (capital I)
@@NguyenTran-cx3uy you are so freaking epic !!! buddy !!!
The example that I saw in class was Ρaypal, with the Greek rho character or the Cyrillic er character.
They're super hard to pick out compared to i and l
@@GyroCannon is there a way to change those in the font so they're easily visible? 👀
@@NguyenTran-cx3uy it may have gotten lowercased along the way
Firefox uses Gecko, not V8; it's their own engine and one of the main reasons we need it alive. V8 is, however, really optimized at this point.
Also with Firefox Quantum added in, but we need Servo as a modular replacement for Gecko so it can actually compete with Chromium/Electron.
IIRC, Chromium uses Blink (WebKit-based) as its rendering engine, and V8 for JS. Firefox uses Gecko and SpiderMonkey respectively.
@@nonamenolastname8501 lmao yes google is their biggest funder
@@nonamenolastname8501 it actually literally is. Look for videos on the latest Mozilla financial report… Google is (IIRC) like over 2/3 of their funding rn, billions just to be the default search engine. Depressing but true
FF is still around because it's the best browser that exists.
It's faster than ch4ome, uses less memory, is less vulnerable to exploits, et cetera.
And people are still confused as to why I "waste" so much time developing tools from scratch for my medical data company's web app
That "..., showcasing the true power of capital." line sounds like it comes from Senator Armstrong in Metal Gear Rising. "We are making the mother of all omlettes. Can't fret over every egg."
Strong "This isn't even my final form" vibes
Another reason why we don't use 3rd-party libraries or CDNs. You can't secure what you don't control.
KISS: keep it simple, stupid. The more externalities you depend on, the more likely it is to all break.
“But I digest” is such a great eggcorn.
That is why the guy down in the comments farted
TIL about eggcorns.
Neat!
lol.
@@davecgriffith TIL what TIL means.
Neat!
Lol, nope. Firefox doesn't use V8.
Being the inventor of JavaScript, they use the engine they developed during Netscape's heyday. Mozilla has maintained it ever since.
That is specific to Chromium-based browsers.
ty
"specific to Chromium-based browsers" sounds like there's just a few of them, but right now I'm not sure there are any that don't use Chromium. Firefox is one, and I'm not sure about Safari or Lynx.
Still, this affects most browsers.
@@Miha-hq4hd Safari also uses its own JavaScriptCore
Firefox can have sandbox escapes too. This is not specific to Chromium-based browsers. They could just as easily put a sandbox escape for any other browser there, if one exists and they know it.
comes with ups and downs...
how good is the Firefox sandboxing?
like they aren't using compressed pointers like V8, right?
i totally agree with the guy who commented “i just farted”
same
A little bit too political for me, but yeah, I agree
He has the cat pfp tho lol
hell nah this comment section is becoming instagram
@@doxyfmood
I feel a real level of vindication right now given how I went to lengths to avoid the practice of loading chunks of JS from third party domains that so many of my colleagues would happily partake in. Admittedly I don't feel great about NPM either.
oh no. so now you're going to write everything yourself and not include others' JS
While escaping the js runtime certainly is a possibility, especially if they're targetting old unpatched browsers, my mind with this sort of exploit immediately jumps to user data theft rather than RCE.
Either that or throwing easy-to-replicate CVEs at old browsers, but I agree with what you're saying.
Polyfilling, as it says on the MDN page on the screen, is the name given to backporting features by rewriting them in compatible older JS, it doesn't refer to some specific library.
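As a concrete illustration (a simplified sketch, not any particular library's code), a polyfill is usually just a feature check plus a close-enough reimplementation, e.g. for `Array.prototype.at`:

```js
// Only define the method if the browser doesn't already have it.
if (!Array.prototype.at) {
  Object.defineProperty(Array.prototype, 'at', {
    value: function (n) {
      n = Math.trunc(n) || 0;          // coerce to an integer index
      if (n < 0) n += this.length;     // negative indices count from the end
      return n >= 0 && n < this.length ? this[n] : undefined;
    },
    writable: true,
    configurable: true,
  });
}
```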
V8 is the JS engine of Blink (Chromium's web engine), which is a fork of WebKit (Safari's engine, whose JS engine is JavaScriptCore).
Firefox uses SpiderMonkey, which is part of Gecko (Firefox's web engine).
JavaScriptCore is a fork of KJS, which was part of the KDE desktop project.
everything is a fork of some other thing
@@animezia and forks are part of forknife
forknife mentioned 🗣🗣🗣
@@AJenbo that is where the entirety of WebKit originated, not just JavaScriptCore. The KDE project birthed KHTML and KJS, which WebKit (and later Blink) came from. KDE are the heroes of the modern web world.
10:15 Congrats to having a working digestive tract.
lol
I too, digest. lol
I opened the comments to find this comment hahaha
This is why I always host all the JavaScript for my sites internally.
That is a good first step. Ideally, you also checked all that code to make sure no evil stuff is now being hosted by you.
Same, I keep it in my butt
@@kensmith5694 yeah, that is the next steps. There are some good tools out there to scan for vulnerabilities. In most companies I’ve worked for we had a toolchain in the build process that was easy to integrate. For indie folks there are good plugins for the CI platforms out there.
but you didn't create the js
@@MrTweetyhack When trees were small and computers were large, people wrote the scripts for their sites from scratch
I am 100% confident that this code is NOT trying to escape the V8 sand box and exploit C++ bugs. First, that is extremely difficult to do at this point. Second, you do not need a supply chain attack to do that, you could just host that code on your own domain.
A more likely scenario is that the goal is to capture data or authentication tokens on a target site. That (1) is way easier to do and (2) requires a supply chain attack to do, as you generally cannot capture data across domains. I.e. JavaScript in your website cannot steal data the user enters on their bank's website.
Totally agree; I was also surprised that LowLevelLearning even floats that suggestion.
That's why you bundle all the JS your website depends on on your own host and never update ;)
I feel like we need a better middle ground between that and this always-updating state where no one can possibly keep up with the changes in their dependencies.
@@BeefIngot stable updates in the style of debian?
Better yet, don't use so much JavaScript and write what you need yourself. 99.999999% of what people do with JS didn't need to be done.
@@BeefIngot Yeah, a form of "code splitting". I think the name suits my idea pretty well; it splits the code into individual chunks that can then be cached individually. You could even group different dependencies together, e.g. if those dependencies also have shared dependencies. Would be cool. But alas, I think it would be really hard for anyone to do and will likely not happen in the next 10 years.
@@dealloc code splitting is very much a thing and not that hard to do; look up webpack chunks and React.lazy, it comes out of the box
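For example (a minimal sketch; `./HeavyChart` is a hypothetical module with a default export), the dynamic `import()` tells the bundler to emit that component as its own chunk, fetched only when it's first rendered:

```js
import React, { lazy, Suspense } from 'react';

// Bundlers like webpack split the dynamically imported module into a separate file.
const HeavyChart = lazy(() => import('./HeavyChart'));

export function Dashboard() {
  return (
    <Suspense fallback={<p>Loading…</p>}>
      <HeavyChart />
    </Suspense>
  );
}
```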
I'll say that as an engineer (not a security researcher), I've always worried about supply chain vulnerabilities, partly because my peers clearly didn't give a crap about it. You don't even have to be security paranoid to be concerned. When every build is a roll of the dice for what gets included with "modern practices", you cannot even control for which external bugs you're shipping.
But when you start to take practical steps to limit the exposure, vetting updates and locking versions, storing external dependencies locally in a verifiable way, etc, the powers at your company will always push back that this is non-essential work, and try to get you to focus on pushing out the latest feature tweak because some customer that will never even use that feature is trying to establish dominance over some sales rep.
6:26: "in V8's interpretation of C++" should be "in V8's interpretation of Javascript"
It's not only system takeover but also simple things like listening to all user input, like usernames and passwords.
I just got from work, made enormous shit and I feel good
Well done.
😎
Proud of you, son.
Even if it's not an exploit to get out of the browser's sandbox, they would still have access to the website, and all user data and security tokens would get leaked to that company. And since even financial institutions used it, that's a big issue.
Even more than that, this questions client-side code execution in general, sandbox or not. Which was always an insane proposition to begin with, let's be real. Everyone tells you not to open email attachments from random sources, but our browsers' JS sandboxes get bombarded by potentially malicious code from random sources constantly.
What exactly should be the alternative? Using a remote-desktop-like protocol to transfer video from the server and user input to the server?
That would be way too expensive for hosters, use way more bandwidth, cost more resources for clients, etc.
We *need* client-side code execution; there is currently no feasible way around it.
A possibility could be a JS engine that's written in a memory-safe language. That would probably be Rust, since speed is essential here.
Mozilla does have a Rust implementation of SpiderMonkey, but Firefox seems to use the C++ implementation.
@@user-to7ds6sc3p if you absolutely need code execution client-side, supply an application that the user needs to install explicitly and that doesn't sideload code from domains you don't control.
Will he call himself Ed or LowLevelLearning… place your bets!
Both! 0:55
I'm gonna start messing with you guys... get ready.
"hey ed this is lowlevel videos where I cyber about security!"
@@LowLevelTV you can start using random names each video if you really wanna mess with people, Bob.
@@LowLevelTV, Please do! That would be perfect!
@@ProtossOP that's a genius move… Lester
The reason why Chrome sets the standard is simple.
Chrome has a monopoly.
And they are really pushing the limits of this with Manifest V3.
It's the clearest example of why the conflict of interest of an ads-and-telemetry company dominating how the internet is browsed is bad for the world.
@@BeefIngot Ohh I never said that's a good thing or anything.
@@PhilippBlum Oh no, I didn't think you were. I was just adding my own thoughts.
Ironically, the ability to simply disable JS per site is what got me to switch to Chrome, before they nerfed malware blockers. NoScript is cumbersome.
i mean, that's what happens when you offer a better engine than the competition
12:04 what is there to say about open source?
Whenever an open source project comes out to have malicious code injected it's always the same story: "oh I really wonder if open source was a good idea, just saying you know. I wonder what this means for the future of open source".
The only reason we found out about this and many previous vulnerabilities and the word got spread is because of open source and open source platforms like github.
Would you have liked it better if polyfill were closed source and were just as popular? Without a community board or forum to discuss these things openly?
You think Microsoft's proprietary IE js interpreter was any more resilient compared to the same era Chrome interpreter because it was closed source? No, ofc not and even Microsoft knows that now.
What a naive way of looking at the world.
Open source wins every time. Yeah, bad shit sometimes gets put in open source code, but it also generally gets noticed quickly. With closed source stuff like microcrap, who the fuck knows what they are doing, and it wouldn't shock me one bit if some Windows exploits from 20 years ago that nobody but a select few know about are still in current versions.
not to mention you can make a fork, add a fix, and be ready to go before the maintainer sees you opened an issue
Nobody could have guessed that automatically using other people's code on your site could be dangerous 😂
LOL.
Yes because of the blind trust
Yes, who could have predicted that "Trust me bro" would be desecrated like this?
Just out of curiosity... did you write all the software that runs on your machine (or in your products if you are a developer)?
Because most of us have to trust an unspecified number of strangers to have our stuff working and be commercially viable.
Linux for instance is a huge dependency tree composed by code written by thousands of strangers without any guarantee of correctness, accuracy or even a simple promise that it will somewhat do what you expect it to do (and nothing else).
I too despise having too many dependencies in my code, but if you want to deliver a product that works and looks good/okish you pretty much have to.
Long gone are the times when users were OK with small software written in BASIC with a minimal UI composed of mostly bare text.
So, what are we even talking about here?
Especially from a Chinese company, who could've predicted this? 😮
Watching half way through this it's already terrifying...
35% of the planet is jabbed and asking for more is far more terrifying. This not so much.
@@CStoph1979 Oh my god man do you push your agenda everywhere? 😭
Ed is the only YouTuber who wants you to stay not for the content, but to hang out. Our goat.
V I B E
why is there no space after the comma on your shirt?
i hate grammer 😤
He writes in C, whitespace doesn't matter
I have to....
*grammar
It was removed at compile time.
I lowkey want like a 2-hour malware analysis video on that obfuscated pastebin code
I was up entirely too late last night redoing comments in code, turning them into ASCII art. Now I'm running on 4 hours of sleep on a 12-hour shift at a factory job, and my code looks like it belongs in an 80s video game, according to my wife.
having a wife who recognizes 80s videogames is hot
@@acters124fr, lucky man.
@@acters124he won in life frfr
Web development is such a shit show. This kind of stuff happens all the time because people assume cdns and random code are safe to use. It's so dumb.
CDNs are used so the client can reuse the same JS file across multiple sites; fewer downloads mean faster load times
Maybe someday people will finally figure out that trusting other sources to deliver their libraries to clients is a bad thing.. maybe.
if the user already has that file downloaded it speeds up page load times; there is a very good reason why people do it this way
@@fulconandroadcone9488 I know why people do it, but with the many issues this can raise. I don't see the value outweighing the downsides.
They should be using SRI. Some people say that wouldn’t work for polyfill, but in general that’s the way to make shared CDN usage safer.
This is actually why I don't buy into the modern dev cycle of dependency management... yeah I'm a dinosaur... 1) don't use dependencies 2) if you do, bake them in and review them yourself, and basically don't EVER update them. Sure it COULD be dangerously outdated; but it COULD be way safer too...
You say dependency; I say attack vector...
You mean std::vector
I wish it was possible but I don't think you can build a medium or large website today without a framework, you need change detection.
@@sourandbitter3062 there are plenty of relatively tiny frameworks with no dependencies; consider something like Preact (which is basically React, but tiny and with no deps)
@@sourandbitter3062 That's BS. The problem is no one really wants to pay the technical debt of 100% rolling their own implementations, and the dev market is flooded with coders who can't do anything without a framework and IDE IntelliSense helping them. Vulnerabilities and breaches have exploded with the reliance on FOSS.
You say attack vector, I say business opportunity
In the software companies where I have worked, we always download all dependencies and ship them with our software. One of the reasons is that if a dependency gets changed or removed, we would still continue delivering the original dependency with no interruptions.
A lot of modern 'coders' are "coders of convenience." They don't want to get their hands or brains dirty in the details. They want to get to production BEFORE the code base is secured, trying to please management and not the Logic Gods lol... A red flag for any project, IMHO.
@@_Stin_stop calling people "gods"
@@_Stin_ You don't have time to audit every single dependency... every single time... and every single update
@@SioxerNikita What? "We don't have enough time to do a good job, therefore we have no choice but to produce code we have no idea how it works."
I think this perspective is a problem. Or your managers are.
@@rusi6219 Urm... I didn't... O.o Stop misunderstanding people's comments. If you're not sure, ask.
I was recommended this channel by the algorithm - it's incredible how well it got to know me in the last 17 years since my account is technically active.
I've used polyfill, but I never put an external library link like that, unless it's one of those google libraries that are dynamically versioned for either Analytics or Maps.
My philosophy has been to bundle or re-host as much as we can, because we don't want the page to get stuck loading from a third-party server.
So whatever polyfills I've used are from the official npm registry.
Time to start downloading libraries instead of using CDNs
0:11 open source is not a supply chain, as open source devs are not paid by the corporations who abuse their work.
It's more like a raccoon scavenging trash cans for food.
Some are paid; it's just that not enough high-level, top-tier programmers jump at the chance to get paid peanuts.
@@GoonyMclinux the proportion of paid to unpaid is just crazy. And most paid FOSS devs don't earn enough to live decently.
They keep saying that PHP is insecure, but in reality, it was due to inexperienced programmers, and obviously, there were flaws that got fixed. But in JavaScript, being so popular, the same thing starts to happen. Many new programmers and people who don't even know the basics make the system insecure. And if you add to that the belief that learning JS is just using the framework and it's secure by default... we're heading in the wrong direction.
All my homies enable noscript
a V8 sandbox escape is a huge thing even without any compromised CDN.
A malicious CDN could steal credentials and it's basically a limited botnet.
By the way, the usage of SRI would have prevented this entire situation. The website owners are to blame for not protecting end users.
It can't though given that the entire point of the polyfill service was that it reads your UA and generates appropriate script by including only the necessary polyfills needed for that UA. Anytime that changes, you break the integrity.
@@dealloc Hmm...good point.
And this is exactly why NoScript is worth the hassle.
Just something to add: polyfills are not only used to work with old browsers.
Pretty often, when a new API is due to be released in the future (like the new date library in JavaScript), there will be a polyfill so you can use it and try it out in a browser that doesn't have the functionality yet, since it's not released yet.
trying out in production is... not very smart
That is one of the reasons JavaScript has been criticized since it existed. Besides the bloat it creates on websites.
Polyfill is still a thing, but it’s usually compiled with the code rather than a link to another website.
So this only affects websites that use polyfill via cdn? Most webapps should use it as a node-module, which makes it safe?
via the polyfill.io CDN, yes. Anything backed by Cloudflare or another is fine.
And remember mates, the death rate in any other Planet, other than Earth is 0%, and also there is NO JAVASCRIPT in those Planets, coincidence? Don't think so, mate.
One of the reasons why I was never in favour of CDNs. I understand that larger sites can actually offload some amount of traffic that way, however just the fact that you integrate code from a third party that could change at any time without you noticing was always my biggest concern. Apart from the analytics they get. In the projects I worked on, we usually put an actual copy on our machine. Versioning has to be handled by us, the developers, anyway. Often you cannot simply load the most recent version of a library because it may not be backwards compatible, so you usually load a specific older version anyway.
@@Bunny99s back around 2010/2011, when HTML5 really took off, I believe people used the CDN construct so that the client browser would retrieve the JS library from cache more often, and thus load new websites/domains more quickly, as opposed to downloading it for every website/domain.
@@bjorn1761 Yes, this was one of the main reasons CDNs took off, and a huge benefit both for users and for sites. But unfortunately it also made it trivial to track users through the browser cache.
By 2013, WebKit had already changed the caching strategy and removed resource caching across sites and domains to prevent this. Chrome followed along in 2020 with v86, and Firefox with v85 in 2021.
Though that doesn't mean CDNs are useless; they still take a huge load off the server and, more importantly, can host the content globally and deliver it closest to the end user. There are always tradeoffs when it comes to choosing where and how to host content.
@@bjorn1761 not just JS; CSS and icons too. With React this knocks off, what, 300 KB right at the start if the user has already visited a page that pulls React from the CDN.
At least it should shine a spotlight on the integrity attribute everyone should be using.
Now I'm confident that I was never paranoid about polyfill, I was just being realistic.
I think it's funny that it says "googie analytics", because he doesn't notice that the lowercase L in "anaiytics" has also been replaced.
We need spell check for all text, regardless of whether or not it is in a text box.
I just remember my frontend classes and the automated “warnings & vulnerabilities check” tool always panicking about some obscure dependency, which our very nice and professional teacher told us we could ignore for the purposes of all school exercises.
Regardless of browser engine exploitation, malicious actors can use this to steal credentials entered on the web page, exfil cookies for use in a plethora of attacks like CSRF, and a number of other activities. I haven’t analyzed it myself yet, but this is pretty serious nonetheless.
This is why Subresource Integrity checking is a thing.
11:04 ooo, Rebane. That's a name I recognize.
Not from normal development stuff, but from Minecraft.
They are a 2b2t player.
Do script, link or href tags have some kind of checksum attribute? If yes, does it help prevent this kind of attack?
Yes, but if the attacker controls the checksum too, like they could here, it's little help. Trust origins using NoScript or NotScripts or uBlock.
7:40 it does not need to be memory corruption in V8. They could collect user passwords, they could inject crypto mining, etc. They can read cookies, hijack sessions, etc., without a V8 vulnerability.
I find it pretty astonishing that anyone can upload something to pip, cargo, npm, etc., but the majority of packages don't seem to contain malware.
Yeah, well, that's changing fast now that repositories are the hot new attack vector. For that reason FOSS is dead, they just don't realize it yet.
@@angrydachshund smartest Windows user 💀
Can web browsers just have a collection of JavaScript libraries by default? I know there are extensions like LocalCDN and Decentraleyes, but it's a bit silly and redundant to have like 5 copies of a JavaScript library (with the chance of a library being poisoned) because different websites chose different CDNs.
And again, same question I have: why is obfuscated code still accepted?
I started returning to 6502 Assembly recently, thanks to digging up the old book from 1983 in my storage. ;w;
It's an older code, but it checks out.
As soon as you said "Chinese" I already knew what's up
JS and Node.js are already notorious for memory issues
I'm extremely interested in the contents of the pastebin. Will you post your results?
So after looking at it some more, it just redirects the browser to another website. However, there are some other JS files that are loaded that are now missing. I'm working on finding those. Could still be memory exploitation.
@@LowLevelTV by now, the payload would pretty much be impossible to find, until someone who had access to it publishes a security risk report
Even in the audio world: "Code they use that they didn't write themselves." Approximately 80-90% of all plugin software is based on JUCE framework, which is by Tracktion Software. It is rare to come across plugins built totally in-house. Well, not rare, but the odds of any random plugin you choose, to be in-house, is low.
I very much doubt such a massive supply chain exploit will also include a browser memory exploit.
Browser memory exploits are to hack into your computer from the browser, but these can be served from any malicious site.
With this supply chain exploit, the JS code becomes part of the website, so it can also read whatever is on the site: username and password fields upon login, user profiles, private content, contacts,...
There's just no need to include a costly browser hack into this to do damage.
this is why you use "integrity" to specify the checksum of a remote script loaded from a CDN
I just found your channel! I love it haha and this year has been crazy!
My website has 0 JS.
Just some CSS, HTML, and a custom built build script written in a low level language to build the static pages.
(I have split the header and footer of the pages into template files because I couldn't be bothered to copy them into all the pages, and I don't want to update every page when I want to update the header or footer. So I wrote a program to do it for me.)
Very s@m4rt_
Lol it's called templates but nice try
Sansec has already analyzed a sample and it doesn't look like a 0-day, just heavily obfuscated code with protections.
In the beginning we just wrote our own code for everything. There was no internet, no downloading, no using other people's code. And when your code worked, it stayed working, no updates that were indistinguishable from remote hacking.
Note that you do not need shell code execution to do a lot of harm with this injection. The injected code can take arbitrary actions on the website it was injected into, as the user of that website, steal cookies, etc. If that's all this exploit was aimed at, that's already scary enough.
Isn't there a way to verify the hash of the JavaScript? I'm quite sure I've seen it in the documentation.
Doesn’t polykill have access to most of the DOM even without exploits?
Reading declasses on the philosophy of 5th generation warfare: "The net catches all fish. The fisherman needs only to harvest the ones they want to eat."
chilling really
5th generation warfare is not cyber, it's neurowarfare. Which cyber is only a small part of. Huge red herring.
@@Acetyl53 it's a lot like narcissistic abuse on a global scale.
@@Acetyl53 our opinions are not in conflict.
CEO's in the near future: It's not our fault, our AI workforce chose the malicious CDN.
and our mallicious AI was selected by... lowest bidder AI
This is why I use NoScript and only allow scripts from domains that I need for websites to work. Of course this doesn't fully prevent an attack, however it does limit the attack surface, and it would also prevent a connection to that fake GA domain in the event that I loaded the modified JS file.
I've been using NoScript for over a decade and strongly recommend it.
Yep, better than uBlock, and Chromium has NotScripts
Ditto. And it can really speed up browsing by not allowing so much of that garbage to load.
is there an update? what should you do if you use any of the affected websites?
I mean, if you have a stable version of a JS library, I'd store the code and not dynamically load it from a web resource hosting site (so it doesn't get updated or modified). If the version is known to be relatively safe, is this a better/safer option in theory?
Er, when do you use a CDN without the code hash?
The only time I even consider using CDN instead of bundling is if it's a direct provider of 3rd party services with a commercial relationship. And of course even then you should lock your package versions until there's a reason to update and you reviewed what is updated, why and what are the changes.
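In npm terms that usually means exact versions (no `^` or `~` ranges) plus a committed lockfile; the package name and version below are just made-up examples:

```json
{
  "dependencies": {
    "some-ui-lib": "2.4.1"
  }
}
```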
Hey, I wonder what software you use to create this content? I love the square layout of your video, can I do it with Kdenlive?
Awesome analysis! Thanks for sharing.
There seems to be a lot of people saying "don't use JavaScript" in UA-cam comments 🤨
Based takes.
@@Kane0123 it is a fine language, it's people taking shortcuts all the time
When I started my coding career(JS), I had this idea where I would be writing all of the code myself, because I had bad experience with dependencies from other parts of my life. Then I met CDN, libraries and frameworks and I changed my mind. I completely gave up on my idea when I saw node modules folder. This video makes my original idea more attractive. I guess the line has to be drawn somewhere.
The script you mentioned essentially redirects mobile users to a malicious website (I won't provide the URL here). Interestingly, the redirect can happen at different hours of the day with varying probabilities; for example, there's a 10% chance you'll be redirected between 0 and 2 AM, and a 20% chance between 4 and 7 AM.
I get more and more cautious about the dependencies my projects use, and really welcome libraries that have no runtime dependencies, as that stops the infinite graph traversal.
Will you do a follow-up once what it does is reverse engineered? Personally I am really curious.
been waiting for this
Javascript + data collection causes a vulnerability, and water is wet
Even if there is no sandbox escape, it can still act like a rootkit, changing what consumers are actually doing when they use it. Maybe it forwards every request to the genuine site while also logging those requests to some other site on the side.
Yikes, this sounds like fertile ground for some DNS spoof/hijacks to catch some very common JavaScript library requests coming from a specific target and replacing them with malicious responses
there’s an attribute on the script tag called “integrity” that allows you to include the hash of your JS code that would mitigate this issue