Optional chaining is definitely something I've been waiting on for years; checking a deeply nested property was a very painful exercise. I always wondered why JS wasn't moving in this direction, or whether I was the only one who felt there could be a better way to handle such things. Thanks for this feature.
Well yes, but you could still use `get` from lodash
I had this problem two weeks ago, where I had to write code like `(data.options || {}).tag`.
Optional chaining would have helped a lot (see the sketch after this thread).
@VipinYadav1 Ruby implements this too, via the `&.` safe navigation operator.
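For anyone skimming this thread, a minimal sketch comparing the workarounds mentioned above with the new syntax; the `data` object is just made up for illustration:

const data = { options: { tag: 'v8' } };  // hypothetical shape

// Old workarounds:
const tag1 = (data.options || {}).tag;           // fallback-object trick
const tag2 = data.options && data.options.tag;   // && chaining
// const tag3 = _.get(data, 'options.tag');      // lodash get, if lodash is loaded

// With optional chaining (plus nullish coalescing for a default):
const tag4 = data?.options?.tag ?? 'no tag';

`?.` short-circuits to undefined as soon as anything to its left is null or undefined, so the whole lookup stays a single expression.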
"I'm making a new chat app - something of a strength for Google Engineers" HA, we see what you did there. :b
There are like 7 or 8 out there.
I almost didn't catch that
LMFAO
The best part is that the guy on the right smirked, and then two seconds later rolled his eyes at this lame joke.
Love the disclaimer about the weakRef :D 😁
It is the one moment when Leszek breaks out of his scripted presentation mode and is fighting the urge to laugh. :D
I thought I was listening to the side effects in a medicine advertisement with that one.
I literally had to check if my playback speed moved up or something
Sounds like he was reading the terms and conditions.
My code is going to look hella confused now...
Thanks for the practical example on memory leaks, and also for the optional chaining and nullish coalescing operators.
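Since nullish coalescing gets mentioned here, a tiny sketch of how `??` differs from `||`; the `settings` object is invented for illustration:

const settings = { retries: 0, label: '' };

settings.retries || 3;      // -> 3, because || treats 0 as missing
settings.label || 'none';   // -> 'none', because '' is falsy

settings.retries ?? 3;      // -> 0, ?? only falls back on null/undefined
settings.label ?? 'none';   // -> ''
settings.missing ?? 'n/a';  // -> 'n/a'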
In Angular templating there was what everyone called the Elvis operator, though it was actually the "safe navigation operator"; it's what you're now calling optional chaining. I really missed this feature after moving to React.
Thank you for all the optimizations and hard work.
Glad to see memory segmentation (pointer compression) making it into the V8 engine. This is one of my go-to methods for speeding up critical code sections too, as it promotes better CPU cache utilization as well as lower memory usage.
I like the WeakRef feature, but it would be so much nicer if classes had a "deconstruct" method, so that calling `delete classInstance` would automatically call it.
Yeah, a dispose or deconstruct method seems like it might still be needed. In their example, the WeakRef can protect heavy stuff, but there's still a memory leak in that `wrapper` won't get cleaned up. The wrapper functions will keep piling up in memory. It'll take longer for the heap to get big, but it'll still get there.
EDIT: Nvm, watched the rest of the video
I think, given what classes actually are in JavaScript, it'd be an odd thing to add to the spec.
It seems a .dispose() method is far simpler than WeakRef.
Methods like .dispose() and .close() often cause production issues, since we humans usually forget to call them at the right time; Go even created the defer statement to help solve this (see the sketch after this thread).
@asendiamayco I think that point should be introduced briefly in the video
@DwiAjiKurniawan It is. 10:35
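For reference, a rough sketch of how a FinalizationRegistry can act as the automated backstop discussed in this thread, instead of a manually-called dispose(); the `Resource` class and the numeric handle are made up for illustration, and the cleanup callback only runs at some unspecified point after collection (if at all), so it can't replace deterministic cleanup:

const registry = new FinalizationRegistry((handle) => {
  // Runs some time after a registered object has been garbage collected.
  console.log('releasing handle', handle);
});

class Resource {
  constructor(handle) {
    this.handle = handle;
    // The held value must not reference `this`, or it would keep it alive.
    registry.register(this, handle);
  }
}

let r = new Resource(42);
r = null;  // once the instance is collected, the callback may fire with 42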
These two are hilarious and engaging, with simplicity up front (and complexity lurking underneath). Really loving the performance improvements and the fact that you don't give up and become complacent. You keep improving.
You can never escape from Memory Management. It will always haunt programmers.
I am creating a web app which analyses data in the client browser and draws different types of charts according to the user's selection. The question is: when I have millions of records to analyse and need to render the whole dashboard (i.e. the charts), which also has filter functionality, will that be feasible?
This was a really good video, I learned a lot, thank you. I'm not excited to use weak references though, it sounds like a nightmare.
Great call on not making parsing block the main thread. I guess soon I won't have to use web workers to load huge libraries any more. At least I think that's what I did to solve this problem; it's been a while.
Why isn't WeakRef() a variable scope/type instead? Like `weak = new FinalizationRegistry()` or something similar. It would make the syntax look way better, and since the return value of WeakRef() behaves like a loosely tied variable, shouldn't it be a variable declaration instead?
great video, learned a lot
0:10 How do you make a drawing like that?
Yah draw
Ok, I need some help regarding garbage collection of event listeners in Node.js, or else this will bug me for life.
In Socket.io, when my users connect to my server, this is triggered:
io.on('connection', (socket) => {
  // user is connected
  var memory = "allocating a..... long ....... string";
});
Now, when the user disconnects, what will happen to `memory`?
Will it be deallocated, or stay forever?
If it stays alive forever, does that mean socket.io is a broken framework?
In this exact case, as given here, you'll have exactly one instance of "memory" shared between all connections/users. This is possible because it's the same string for all connections, and strings are immutable. So you won't make copies on each connection (and therefore won't have any copies to destroy in each disconnection), but as long as that connection event is attached to the socket, you'll always have that one copy of the string.
If this string was dynamic and not shareable, then the answer would depend on whether anything else keeps it alive after the connection event is triggered. In this example nothing does, so it would be collectable, but in a more complex example it might not.
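A small sketch of the contrast described in this reply; it reuses the socket.io setup from the question (assuming socket.io is installed), and the per-connection buffer is invented for illustration:

const io = require('socket.io')(3000);

io.on('connection', (socket) => {
  // A string literal lives with the code itself: nothing new is allocated
  // per connection, so there is also nothing extra to collect later.
  var memory = "allocating a..... long ....... string";

  // A dynamically built value IS allocated per connection, but once this
  // callback returns, nothing references it any more, so it is collectable.
  var perConnection = new Array(1e6).fill(socket.id);

  // ...unless something captures it: uncommenting this listener would keep
  // `perConnection` alive for as long as the socket keeps the listener.
  // socket.on('chat', () => console.log(perConnection.length));
});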
Having optional chaining built in makes code more readable, I guess.
17:46 pre-allocate… Sounds like we will need those exabytes after all.
Amazing! This was fascinating and very fun to watch. Keep content like this coming
Instead of declaring a stop function, just have a dispose function.
There are lots of concepts here quite similar to what C# has... good :)
ew
WeakRef looks way more confusing, more complicated, and more code than just removing the event listener when you null out the instance. If you have a lot of listeners, why not write a deconstruct method that nulls out the instance you pass in, along with all the event listeners it's tracking in an array on that instance?
I am pretty slow when it comes to this stuff and all this new syntax, so I probably just have no idea what I'm missing lol.
In this simple example you're right, the dispose is probably less confusing and good enough. When things get more complex, though, e.g. multiple owners of the disposable class, or the possibility of the _owner_ of the disposable class getting collected before calling something like "stop", is when this starts becoming really cool. This example, jokes about chat apps aside, is a very simplified version of a real memory leak we found.
@LeszekSwirski I appreciate the insight. Thank you very much!
Thanks for the very informative cast.
What does deref() stand for? de-reference?
Yeah
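A quick sketch of what calling it looks like; the target object here is arbitrary:

const target = { name: 'heavy thing' };
const ref = new WeakRef(target);

const obj = ref.deref();   // the target, or undefined if it was collected
if (obj !== undefined) {
  console.log(obj.name);   // holding `obj` keeps the target alive for now
}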
Thank god for optional chaining. I’ve been using object unwrapping to do this but it’s too verbose and painful to implement
Plot Twist. These are Python Devs.
😆
love the null-safe kotlin feature!
You mean C# :D It was introduced waaay sooner than in Kotlin :)
Great talk. Thank you, guys. It would be useful to add time codes to the video as well.
time codes?
Also, how is your comment from a week ago?? The video just got released! :O
Creators can add timestamps to the video description, and they’ll show up on the video timeline as “sections”.
@johnfridja Haha yeah, that's strange :D
What if the GC collects an event listener before it fires? Then we'll be facing randomly swallowed events. Now imagine someone debugging this... I think you should provide a better use case for WeakRef.
The socket, or whatever you're attaching the event listener to, will hold the event listener strongly, so it won't be collected before firing unless it's detached before firing.
@LeszekSwirski No, we're adding a weak reference to the socket message event, so it could be collected at any time. See their very misleading article: v8.dev/features/weak-references
@vsDizzy "They" is me; I'm one of the authors of the article. The whole point is to be able to collect listeners when that listener stops existing, but that listener is held strongly by the only thing that cares about the events (in the article, by MovingAvg). Hence swallowing events is actually desired. The strong reference I was referring to was the strong reference to the wrapper listener, and that one is indeed strong and has to be explicitly disconnected (by the finalizer).
@LeszekSwirski, nice to meet you. Well, actually, it makes sense after looking at the "reachability diagrams" provided in the article.
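For anyone following along, a rough sketch of the pattern this thread is about, loosely modelled on the v8.dev article: the consumer is held weakly by the wrapper listener, the wrapper itself is held strongly by the socket, and a finalizer detaches the wrapper once the consumer is gone. The `socket` object with on/off methods is an assumption, not any specific API:

class MovingAvg {
  constructor(socket) {
    this.events = [];

    // The wrapper only holds the consumer weakly, so the consumer can be
    // collected even while the wrapper is still attached to the socket.
    const weakThis = new WeakRef(this);
    const wrapper = (ev) => {
      const self = weakThis.deref();
      if (self !== undefined) self.events.push(ev);  // else the event is dropped
    };
    socket.on('message', wrapper);  // socket -> wrapper is a strong edge

    // When the consumer is collected, detach the (strongly held) wrapper
    // so wrappers don't keep piling up on the socket.
    MovingAvg.registry.register(this, { socket, wrapper });
  }
}
MovingAvg.registry = new FinalizationRegistry(({ socket, wrapper }) => {
  socket.off('message', wrapper);
});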
Good content and funny characters :)
1:56 isn't that just typescript
TypeScript doesn't live at run time.
It only exists in TypeScript because it was an upcoming JavaScript spec. TypeScript doesn't just add new features (unrelated to typing) willy-nilly.
Nice video 👌👌👌👌👌👌
Advanced level of experience
Every day we stray further from legibility.
Already in ts
Nice
It almost looks like c#
ew
JavaScript is starting to catch up to C# features implemented in 2016.
great
Oh, why can't Google give their staff good microphones? Interesting info, but the sound is so bad.
You can still understand what they're saying, and also, having an expensive mic won't help if the area you're recording in has poor acoustics.
useful
Is it just me, or is JavaScript turning into C#?
I think it's Ruby.
WeakRef is cool ! ))
WeakRef and FinalizationRegistry make JavaScript feel like C++. :(
It's 2020, why does this insanely convoluted language still exist?
Copied from C#
I am telling you guys... Typescript is the new JS...
Typescript is the new Coffeescript
@arkanciscan Well, the idea I wanted to express is the shipping of some TypeScript functionality into JS, and the fact that this could end with TypeScript replacing JS in the future. But I shouldn't assert those things, since I have never read any V8/JS specifications or feature proposals, and those languages are probably growing together (although TS moves faster).
Optional chaining, welcome to CoffeeScript anno 2010. Only took a decade.
Anno 1610, I think the game was called. Sorry, haven't played it in a long time.
@ilin76bb Anno 1602. Their game numbering always has the digits sum to 9, btw: 1701, 1503, 1800, 2070.
Steal optional chaining from kotlin..
It's in Swift and C# already, too.