This was fantastic and honestly my mind was indeed blown with those results from .NET 9. Once I saw it was spans it made sense, I'm pretty excited to learn about what other optimizations were made in .NET 9 now. My question now is - what happens to programs that were compiled in .NET 8 but run in .NET 9? I assume they also get the optimizations because the methods in the runtime are what got optimized, but I don't know how much implementation is inlined into your program by the compiler vs. what is left as calls into the runtime.
Also, remember that LINQ query extensions like FirstOrDefault and SingleOrDefault can be converted to SQL by Entity Framework. Collection-specific methods might not have those optimizations and might cause higher bandwidth between the application and the database. The same applies to Any vs. Array.Exists, etc. So always using collection-specific methods is ill-advised for that reason alone.
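The EF point above can be sketched without a real database; the data here is illustrative, and `AsQueryable` merely stands in for a provider-backed `IQueryable<T>` such as an EF Core `DbSet<T>`:

```csharp
using System;
using System.Linq;

var source = Enumerable.Range(1, 1000).AsQueryable();

// FirstOrDefault on IQueryable<T> stays an expression tree, so a LINQ
// provider like EF Core can translate the predicate into SQL.
var viaLinq = source.FirstOrDefault(x => x > 500);

// Find exists only on List<T>, so reaching it forces a ToList() that
// materializes every element into memory before searching.
var viaFind = source.ToList().Find(x => x > 500);

Console.WriteLine($"{viaLinq} {viaFind}"); // 501 501
```

With an in-memory source both paths give the same answer, but against a database the second one pulls the whole table first.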
Hi @Nick, very nice video. You always surprise me with your content. Can you make a video on how you are able to drill down into built-in/library code? Thanks
My first thought would be "are you only searching for one thing, or multiple things?" More often than not it is multiple things, in the context of a loop (or several nested loops). At which point I would use a HashSet, Dictionary, or a lookup table. I have lost count of the number of times I have seen other developers use Find (or First, or FirstOrDefault) for everything, no matter the context. Also, I almost never use a List<T> directly these days. I might use ICollection<T>, IReadOnlyList<T>, or IEnumerable<T>, which might be backed by a list. Or it might not. So I rarely if ever optimize for a specific collection type, and when I do I'm usually back to using HashSet or Dictionary.
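A minimal sketch of the loop-lookup point above, with illustrative data and a hypothetical id lookup: calling Find per key is O(n) each time, while building a Dictionary once makes every subsequent lookup constant time.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

var users = new List<(int Id, string Name)>
{
    (1, "Ada"), (2, "Linus"), (3, "Grace")
};
var wanted = new[] { 3, 1 };

// O(n * m): one linear scan per requested key.
var slow = wanted.Select(id => users.Find(u => u.Id == id).Name).ToList();

// O(n + m): index once, then constant-time lookups.
var byId = users.ToDictionary(u => u.Id);
var fast = wanted.Select(id => byId[id].Name).ToList();

Console.WriteLine(string.Join(",", fast)); // Grace,Ada
```

For a handful of lookups the difference is noise; inside nested loops over large collections it dominates.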
I generally take the approach of using LINQ until I have a good reason not to. I know a lot of optimizations are in place that may be better than mine, it gets updated, and it's very versatile. Plus I know it will work on any collection if it changes in the future. Once I have/need a specific behavior more attached to the specific collection or type, then I think about doing it differently.
It may depend on the company, as not all companies are going to update their applications every time a new .NET version is released. Maybe if you are working on a project that can be upgraded frequently, it can be good advice, but if not, finding an optimal approach for the implemented version can be a better option.
I am a little curious about the performance on "not found". These tend to have very different characteristics. Other than that, just this week, I have been looking at replacing FirstOrDefault with Find on a .NET 8 project. Looks like it is good...until we switch to 9 (or probably 10)
Another reason why you shouldn't write custom code to replace readable and concise LINQ unless you absolutely have to for performance reasons. Your custom approach is unlikely to get optimized by a new version, whereas built-in code might.
I feel like the takeaway here is not that you use FirstOrDefault over Find because Microsoft is prioritizing it, but that contextless performance advice akin to "use this, not this" should not be blindly followed without a deeper understanding and up-to-date knowledge. Overall I would even say that the original advice is not wrong. It's just going to be out of date soon.
Can you update the test to have a sealed IntWrapper and test & compare it across 8 & 9? Optimally, have a test with the latest Framework version for library maintainers. Maybe just make it a YouTube Short if it isn't enough content.
Theoretically, just looking at the name, Find wouldn't necessarily need to iterate in order, so if you had list.FirstOrDefault(x => x.Firstname == "John") and list.Find(x => x.Firstname == "John"), it could get you a different object. But in this case, Find iterates in order and the difference is just enumerator overhead.
It's telling that you have an EF preface. The problem is always that people want one way of doing things. We have a hammer and want everything to be nails. Personally, I dislike the name "Find". To me, it implies it should act more like Where+ToList, where I am filtering and potentially getting back multiple objects. It should be renamed to "FindFirst" if that is what it is doing.
Crazy that TryGetFirst doesn't return a boolean but the found element, and sets the bool found as the out parameter. Maybe it's a memory allocation optimization or something but it looks weird.
You need to have a bool that tells you if the element was found or not. Other languages like C++ use an iterator that points past the end in case the element was not found (so there is no need for this value + found pair).
@@MarincaGheorghe I understand, but usually TryXXX methods return the boolean, and set the found/parsed/whatever element as the out value, so you can write: "if (TryXXX(obj, out var xxx)) { doSomething(xxx); }"
@@MarincaGheorghe The issue is that it works the opposite of how every other Try method works: return a bool if successful and out the result, so you can simplify the code by doing if(TryXXX(...)).
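For readers following along, the two shapes being contrasted look roughly like this. This is a paraphrase of what the thread describes, not the actual BCL source; the method bodies are illustrative:

```csharp
using System;
using System.Collections.Generic;

// The inverted shape the commenters find odd: element out the front,
// success flag via out parameter.
static T TryGetFirst<T>(IEnumerable<T> source, Func<T, bool> predicate, out bool found)
{
    foreach (var item in source)
    {
        if (predicate(item)) { found = true; return item; }
    }
    found = false;
    return default;
}

// The conventional Try-pattern they expect instead: bool out the front,
// element via out parameter, usable directly in an if.
static bool TryGetFirstConventional<T>(IEnumerable<T> source, Func<T, bool> predicate, out T result)
{
    foreach (var item in source)
    {
        if (predicate(item)) { result = item; return true; }
    }
    result = default;
    return false;
}

var nums = new List<int> { 1, 2, 3 };
var value = TryGetFirst(nums, n => n > 1, out var found);
Console.WriteLine($"{found}: {value}"); // True: 2

if (TryGetFirstConventional(nums, n => n > 1, out var v))
    Console.WriteLine(v); // 2
```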
This is one of those videos I quite dislike; not something like "you don't want to use the default implemented method of the type", but advice that goes against what is designed in the language itself and feels very artificial.
If there is an older and a newer implementation inside the language library, use the new one. I think that's good advice. It's two different "default implementations". You have to go "against what is designed in the language" either way.
OK, I made my comments because I'm always calling a DB with LINQ... With memory collections, I'm usually on the client... I guess I can use it with Blazor though... There are a lot of cases where you want to keep data local rather than call the DB for lookups...
Seems like MS wouldn't have to rewrite Find to use spans so much as just point Find to whatever method FirstOrDefault is using, and then maybe handle the result differently? I didn't look too closely at the code.
You use FirstOrDefault because the name is more precise; that's also how you can tell that Find is the older version. And if you don't care about this, you build the loop yourself, because LINQ has had garbage performance since release. Good to see that's getting better.
I'm working with a company that takes a long time to update .NET. Is it not worth making this change, since we are still on .NET 7? Performance is important for our software.
Func<T, bool> is definitely more descriptive, since you can see it returns a bool without knowing what a predicate means. Also, Func can have more than one input type (16, in fact). Predicate<T> is just a very specific implementation; it feels redundant.
@@artemisDev Yes, that's what the name predicate implies: we have one condition and it can be either true or false. So when you read Predicate somewhere, you should know that you can pass in an expression that returns either true or false.
A Predicate<T> is just one very specific version of Func<T, bool>. Meanwhile, Func can be Func<TResult>, Func<T, TResult>, Func<T1, T2, TResult>, etc. They realized that the Func (or Action) generics covered everything, and that specialized versions like Predicate<T> were just kind of confusing and slightly non-compatible. There's no functional difference in behavior between a Func<T, bool> and a Predicate<T>, but you can't substitute one for the other because they are different types (although a lambda can be converted to either). I believe only the Array and List<T> classes use Predicate<T>, while LINQ always uses Func. Since LINQ is preferred for almost everything, and LINQ tries to have a unified syntax, encountering a specialized form of Func just introduces confusion that doesn't need to be there. "Predicate" is definitely more descriptive (for me; I don't know about everyone else), but it introduces the concept of specialized terms that act as 'exceptions' to the general Func rule, and that's just a conceptual 'leak' that you're better off not surfacing in the language syntax. And it's "not modern" because Predicate<T> was introduced way early, before MS realized it was a bad idea. Unfortunately, by then they couldn't get rid of it, so now it's mostly just, "Please ignore this little bit of legacy trivia", or "Use LINQ instead."
@@artemisDev A predicate IS a condition that should evaluate to `true` when applied to something. The name `Predicate<T>` carries semantic meaning that `Func<T, bool>` does not.
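A small sketch of the non-substitutability discussed above, using only standard BCL types: the two delegates have identical shape, but one cannot be passed where the other is expected, while the same lambda converts to either.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

Predicate<int> asPredicate = n => n % 2 == 0;
Func<int, bool> asFunc = n => n % 2 == 0;

var list = new List<int> { 1, 2, 3 };

// List<T>.Find takes Predicate<T>; LINQ's First takes Func<T, bool>.
Console.WriteLine(list.Find(asPredicate)); // 2
Console.WriteLine(list.First(asFunc));     // 2

// list.Find(asFunc); // does not compile: Func<int, bool> is not Predicate<int>

// An explicit delegate-creation expression bridges the two when needed:
Console.WriteLine(list.Find(new Predicate<int>(asFunc))); // 2
```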
Find also happens to be a list only linq method. Many libraries only return IEnumerables. If you have to convert an IEnumerable to a list, it's going to be slower than FirstOrDefault.
Find isn't a LINQ method. It came along in 2.0 with List<T> in System.Collections.Generic and uses a Predicate<T>. They also created Array.Find with a Predicate in 2.0 for all the boomers who didn't want to move on from arrays. LINQ (System.Linq) and lambda expressions (System.Linq.Expressions) were added in 3.5, and while thinking more globally about this whole concept, they moved on to the Func/Action philosophy. But yeah, for the sake of compatibility or refactoring it makes way more sense to work against IEnumerable<T> than IList<T>. These are concepts that become forgotten as the .NET environment becomes more and more littered by the javascriptification of its languages, with C# specifically as a victim.
@@krs-tube The person I replied to said "a list only linq method" so I explained it's not a linq method. So take your own advice and get some social lessons while you're at it.
@@krs-tube He said it but never fully explained why it was bad and what the consequences were…. I guess you just like posting mean comments for a dopamine hit…. Thanks for your useful comment
Very insightful analysis but I have to disagree somewhat on not using something that currently performs better because Microsoft will eventually optimize it.
It's highly situational. And picking the right method is what separates good engineers from mediocre ones. If you really, really need that performance right now, then use Find. If you don't care all that much and don't want to refactor code, then use LINQ everywhere and play the odds that it'll get faster and faster over time.
You are somewhat right. It all depends on the situation. If you think you can upgrade the code later to the latest version, then it's fine. But most of the time I would go for something more general/generic. I'll just have to bump the version and I get the benefits.
At 4:06 I thought it might've been a local issue where Rider no longer jumps to the appropriate definition now on the first attempt. It loads the source files and then you have to go back and re-jump. I hope they fix that soon b/c I swear it used to work
@@artemisDev those factors should be fairly equivalent, since they're executing identical predicates in an identical manner to find a match. Only difference is the Enumerator, so I'd personally expect the same performance delta between ref types as there was between value types...
Another example of being too smart for your own good, or rather generalizing over a broader topic based on a specific example. Your method runs slow? Profile it and find the lowest-hanging fruit. Cautiously build a heuristic over it, but only to potentially detect similar issues in the future, not to dismiss one type/method over another.
You can't profile a method that doesn't run in a loop. It might be called a few times per second, but it still adds dead weight. Performance starts with doing things efficiently in a minimal way. Optimizing via profiling is the cheaper way to make "slow" things a bit faster. Also, no benchmark (micro or profiler) can show scalability. So all this standard "pick a profiler and go" advice is rubbish.
@@NotACat20 The idea of using a profiler is to lower the bar of things that are responsible for the major slowdowns in your application, so it is pretty profilable, yes, it is not rubbish.
SingleOrDefault doesn't go through a collection twice the length. It finds the first match and continues to look for one more match, which in the worst case leads to the end of the collection, and that's it.
And if the first match is in the middle of the collection and there isn't a second match, then it goes twice the length, which is what I said in the video
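A quick element counter makes both claims concrete. This is a sketch: it counts how many elements each operator enumerates, using a side-effecting Select as the probe.

```csharp
using System;
using System.Linq;

int probes = 0;
var data = Enumerable.Range(1, 10).Select(n => { probes++; return n; });

// SingleOrDefault finds the match at element 5, then keeps scanning for a
// *second* match (to throw if one exists), reading through to the end.
var single = data.SingleOrDefault(n => n == 5);
int singleProbes = probes; // 10: scanned to the end, but only once

probes = 0;
// FirstOrDefault stops as soon as it finds the match.
var first = data.FirstOrDefault(n => n == 5);
int firstProbes = probes;  // 5: stopped at the match

Console.WriteLine($"SingleOrDefault: {singleProbes}, FirstOrDefault: {firstProbes}");
```

So with the match mid-collection, SingleOrDefault reads twice the distance to the match, which is the full collection once, never two passes.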
Meh. Microsoft can optimize LINQ all they want. Won't do you any good if your projects are .NET Core 3.1 in production and no one is planning to change that
Dangit, SonarQube has flagged all of my FirstOrDefaults in my projects and is telling me to replace them with Find... And I've made the change to make it quit nagging me...
I find your title confusing. I admit I'm not a very regular viewer, but I read the title initially as what your final advice is, not the original claim that you're going to dispute as the Code Cop. If you'd put "Stop Using FirstOrDefault" in quotes followed by code cop, that would make it clearer, imo, that someone else said to stop and you don't agree.
Premature optimization. There's usually something else you're doing that results in multi-second responses which you should fix before going down this rabbit hole.
What rabbit hole, lol? This is a bizarre take for something that is neither absurd nor problematic; it's an almost free thing you could do. And while I agree that reworking your project exclusively because of optimization factors like this would be premature optimization, doing this from the start will not be a problem.
Unless your app is making millions ($) per hour, this is so pointless... I honestly don't have time to worry about nanoseconds in code that gets hit maybe a thousand times a day.
Until I saw the Code Cop at the end of the title all my usages of FirstOrDefault flashed before my eyes
Literally he drops these and I’m like “okay boys what are we refactoring today” 😂
😂🤣😂🤣i stopped listening
Exactly this, I hate the code cop vids.
😂
They are the best way to learn things in a practical, real-world way
Astronaut 1: Wait its a span?
Astronaut 2: Always has been 'shoots gun'
"it's spans. It's ALWAYS spans." -- Nick
I wanna subscribe to Nick's OnlySpans
This deserves an Oscar 🤣@@tedchirvasiu
Can someone care to explain?
@@silkfire Maybe Indiana Jones? He did not like snake case as I remember...
@@tedchirvasiu ohmygod LMAOO
I hate LinkedIn. Everyone's out to prove how special they are. And it's never the smart ones doing it.
The smart ones don't care much about how you do it as long as you're accomplishing real life solutions in a sufficient or, ideally, great way.
@@fusedqyou The "smart" ones are out preaching incomplete advice while the actual smart ones are busy doing real-life work and accomplishing something.
@@fusedqyou Even in the old version I would argue against it, because Find is more specialized. Sure, it was faster, but for 99.9% of use cases it will still be insignificant compared to even the fastest DB access the code does, so it falls under premature optimization.
Using FirstOrDefault means it will work for any enumerable, so you can easily switch away from List if you need to, and also change to SingleOrDefault easily.
And even IF you need the extra performance, Find might not even be the best depending on the use case, since you might have options to redesign the whole approach to get even more performance; only after checking all that should you fall back to an older, more specific method.
At least in my opinion.
Easy to read and maintain code is almost always the better code and old methods are often not as clear.
FirstOrDefault is much more descriptive than Find since it includes how it handles not finding the value :)
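The flexibility argument above can be sketched as follows; the two helpers are hypothetical stand-ins for "the backing collection changed":

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

IEnumerable<int> FromList() => new List<int> { 1, 2, 3 };
IEnumerable<int> FromArray() => new[] { 1, 2, 3 };

// FirstOrDefault compiles against IEnumerable<T>, so both calls are
// identical and survive a change of backing collection:
Console.WriteLine(FromList().FirstOrDefault(n => n > 1));   // 2
Console.WriteLine(FromArray().FirstOrDefault(n => n > 1));  // 2

// Find exists only on List<T>, so it would need a cast or ToList() here.
// Switching to SingleOrDefault, by contrast, is a one-word change:
Console.WriteLine(FromArray().SingleOrDefault(n => n > 2)); // 3
```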
@Natusch_ precisely. That's why my code reviews of others usually pass quickly unless there is something really bad, which is like never, while my code is always dragged through idiotic nagging that has never resulted in the slightest improvement in any aspect so far. And yes, I have 30 years of uninterrupted experience while others have 8 tops. Your point explains it all.
@@tuckertcs He who can does. He who cannot, teaches.
I think that Nick is a smart boy and I really admire young people nowadays for being that smart. Always high-quality advice from Nick. His school and teachers must be really very proud of him.
My jaw dropped when I saw the .NET 9 results
It seems unlikely that a significant bottleneck in an application will come from using "FirstOrDefault" over "Find" for List<T>. It's possible, and for those of us that cannot upgrade our version of .NET due to external dependencies, that might be a reasonable optimization (assuming this is the problem). However, based on profiling we have done, our slowdowns are not this simple. Usually, issues are caused by repeated calls to our third-party dependency for the same information over and over.
TL;DR: Profile first, optimize later.
Oh, on longer collections this is very likely. On lists with 10 items, you don't notice the difference.
Remember you never remove bottlenecks. If you remove the bottleneck you know, you make another piece of code your bottleneck.
Maybe the Find is a 2nd- or 3rd-order bottleneck.
I am getting sick of this kind of "advice". I would always just use what I find most readable/understandable (even if it is the supposedly inferior-performing version). And if I find my application slowing down or uselessly cranking up in memory usage, I investigate and optimize the bottleneck. No point in optimizing what doesn't need to be optimized.
Totally agree. Call it optimizing for readability.
Preemptive optimizing is pure evil.
I agree with you. Plus -- the use-case for this is so simple and irrelevant... a better demonstration would be a list of a class type where the property of one of those classes matches.. with the caveat that the test data MUST have "duplicates" in there (or at least multiple records where the matching field has the same value -- and only return the FIRST of those...).
In my domain of work performance has high priority. So whenever I can squeeze out a little more performance, why not. For example I created a lot of methods in my general use library that allow me to set byte values in byte arrays using unsafe. I do that quite a lot, both on client and server side (namely for TCP/UDP communication), and why do something worse if better is possible? I know it's more performant, and I know it's often used.
Should I default to some "simple", inferior standard instead, because so far it wasn't a bottleneck yet? When I am working on something, I have to load the entire system into memory. That's the best time to work on it, and also to improve it. When I am done with it, I will forget too much of it, and it would take days or weeks to get into it.
If I'd switch domains too much, I'd lose out more time on that (without benefit) than what optimizations take in time - especially if they are necessary/beneficial later anyway.
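A hedged sketch of the kind of byte-array writer described above; the commenter's actual helpers aren't shown, so these stand-ins use standard BCL APIs (`BinaryPrimitives` and `Unsafe.WriteUnaligned`):

```csharp
using System;
using System.Buffers.Binary;
using System.Runtime.CompilerServices;

var buf = new byte[8];

// Safe variant: bounds-checked via the span slice, explicit endianness.
BinaryPrimitives.WriteInt32LittleEndian(buf.AsSpan(0, 4), 0x11223344);

// Lower-level variant: skips alignment checks and writes in native
// endianness; the caller must guarantee the offset is valid.
Unsafe.WriteUnaligned(ref buf[4], 0x11223344);

// On little-endian platforms (x86/ARM in their common configuration),
// both writes produce identical bytes.
Console.WriteLine(BitConverter.ToInt32(buf, 0) == BitConverter.ToInt32(buf, 4));
```

Whether the extra control is worth it over the span-based helpers is exactly the trade-off being debated in this thread.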
@@brianviktor8212 Different projects have different performance requirements, but in my experience, premature optimization is often a curse. Why optimize for performance if it won’t significantly impact the app in the end? Especially if it comes at the cost of readability and maintainability. On top of that, you often need to make trade-offs between time and memory performance before you even know what kind of issue will cause a bottleneck.
Of course, there are caveats. For example, I once worked on an originally statically rendered web front-end where server-side queries to a specific data service were painfully slow, causing frequent timeouts. As a result, we implemented client-side caching and effectively built a modern JavaScript app for this one use case, allowing users to begin working while the data loaded. But months later, we finally reviewed the data server's code and found the issue - a poorly scaling lookup loop with multiple database queries inside each iteration. A single, somewhat ugly, database query fixed the issue.
If the initial developer had been more performance-focused, this might have been avoided. However, management also could have listened when we identified the server as the bottleneck. Instead of a simple back-end fix, we spent months optimizing the front-end, which became unnecessary once the real issue was fixed.
In the end, I prioritize readability. If a problem comes up, it's important that any competent team member - not just the original developer - can understand the code’s intent and fix the issue. Readability ensures that the team can adapt and optimize when it matters, using clear helper functions and selective comments to guide future changes (in general I dislike comments, but sometimes they are important). This way, the code may initially “perform worse”, but it’s maintainable by the whole team. At least that is the idea :D
Okay, so Find is specific to List<T> vs FirstOrDefault, which works on any enumerable. So you lose reusability and type-agnostic code for a tiny bit of performance? Oh, and if you use .NET 9 it's actually faster. Yeah, stick to FirstOrDefault, because regardless of the performance it's better to have something that is type agnostic.
At 04:38: the Find method returns T?, which will be null in case the element is not found in the List.
Number of times an in memory lookup has been the culprit of slowness in my professional life, is pretty darn few.
It's always disk, network, database or some other external thingy.
Or wrong algorithm!
@@odomobo - yes, some unintended nested loops is a fun one.
True. The other big one in managed code in general is heap churn, which I do optimise for. Otherwise, clean code all the way.
Analyzers suggesting the use of List (and Array) specific methods: Oh man, we screwed up.
I was about to say... I've had analyzers yelling at me for years to use Find on lists. In this guy's defense, it's been like this for years.
I had a hunch that Nick is going to create the video on this when I had first seen it on LinkedIn 😂 and here we are!!!
This is a great point that performance advice is so specific to versions and implementations. Forgetting this seems to be the cardinal sin of all these advice posts.
It comes down to the idea that any perf-related decision should be backed by benchmarks, including benchmarking the standard library (as a basis for the decision). But this rarely happens.
Love that you made a video about this. I was on LinkedIn just this morning and saw this.
Cries in Unity
Why? Doesn't Unity support Linq?
@@junior.santana It does, but because Unity is reaaaally slow with updating to newer C# versions, Linq's performance is not that great and can be a real fps killer.
@@testitestmann8819 I see. That's annoying, it's a use case that would benefit a lot from those micro-optimizations
Never use LINQ to filter data in the database, instead do all the filtering in memory 😅
Huh why?
@@Anonymous-un2nj it's a joke.
An example of bad advice 😜
It is a joke, and EF is neither the fastest nor the best LINQ-to-database provider; it additionally implements UoW, a great old antipattern. Moreover, LINQ is not the best way to build a query. But being able to embed parameters in the query (instead of using parameters) is a requirement in some cases (mostly semi-dynamic queries). Not sure if EF can even handle this.
Nah, roll your own SQL to do it 😅
@@klocugh12 just use linq2db - is much faster, is still linq and is done right.
I saw that advice and was still chewing on it, I'm glad you did this Nick. I was wondering if the LINQ optimizations were being considered.
Is Predicate considered old-hat now? I thought it was basically just a shorthand way of saying Func ?
Yes, it's simply older: .NET 2.0 vs. .NET 3.5, which is 2005 vs. 2007 or so.
This also applies for Any/Exists: Exists was faster than Any, but in .NET 9, Any has been optimized to outperform Exists.
I always had faith in linq optimization, glad they're pulling through! :)
So basically Microsoft was lazy and didn't optimize Find 😉 I still don't think it's bad advice to use a specialized method over a LINQ method, as it could be optimized further than the LINQ method. Probably not significantly in this case, but it's a good principle.
I don't think comparing against a pre-release version of .NET is necessarily fair grounds to disqualify the advice. It's good advice UNTIL .NET 9 comes out, which makes this concern trivial.
And maybe now someone goes in and optimizes the Find method. Since they know what the backing array is, it will be faster than the LINQ method. So the whole point is once again valid, and anything can be optimized by Microsoft at any time...
I'd like to say it's good advice until .NET 10 comes out, as many projects migrate only to LTS versions.
But, well, it's only good for the hot parts of your code that greatly affect performance (if any). Most of your code is not a hot path and doesn't bottleneck performance.
@@YT-dr8qi There are people who care about LTS? There is no practical difference between 18 or 36 months. Applications live 20 years or longer. And you're not going to get budget to upgrade applications without any practical improvement for business and especially not when you're using beta features like Blazor that still get big breaking changes every single new release.
Choosing your .NET version is about which features you need. Through most of an application's life, LTS/STS won't be a factor.
@@Rizon1985 we have 3 internal apps, and we upgrade them for every new LTS
This was fantastic and honestly my mind was indeed blown with those results from .NET 9. Once I saw it was spans it made sense, I'm pretty excited to learn about what other optimizations were made in .NET 9 now. My question now is - what happens to programs that were compiled in .NET 8 but run in .NET 9? I assume they also get the optimizations because the methods in the runtime are what got optimized, but I don't know how much implementation is inlined into your program by the compiler vs. what is left as calls into the runtime.
Also, remember that LINQ query extensions like FirstOrDefault and SingleOrDefault can be converted to SQL by Entity Framework. Collection-specific methods might not have those optimizations and might cause higher bandwidth between the application and the database; the same applies to Any vs. Array.Exists, etc. So always using collection-specific methods is ill-advised for that reason alone.
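To make that concrete, here is a sketch against a hypothetical EF Core context (`db.Users` is an assumed DbSet, not from the video):

```csharp
// FirstOrDefault composes into the SQL query: only one row crosses the wire.
// Roughly: SELECT TOP(1) ... FROM Users WHERE Name = 'Ann'
var user = db.Users.FirstOrDefault(u => u.Name == "Ann");

// List<T>.Find doesn't exist on IQueryable<T>, so you'd have to materialize first,
// pulling the whole table into memory before filtering:
var userInMemory = db.Users.ToList().Find(u => u.Name == "Ann");
```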
The message was: If you have a known collection type: use it. Not vice versa. Don't create lists, to use it.
Somehow I knew from the very beginning it will be spans :) Great video, as always.
Hi @Nick, very nice video. You always surprise me with your content. Can you make a video on how you are able to drill down into built-in/library code? Thanks
My first thought would be "are you only searching for one thing, or multiple things"? More often than not it is multiple things, in the context of a loop (or several nested loops). At which point I would use HashSet, Dictionary, or a lookup table. I have lost count of the number of times I have seen other developers use Find (or First, or FirstOrDefault) for everything, no matter the context. Also, I almost never use a List directly these days. I might use ICollection, IReadOnlyList, or IEnumerable. Which might be backed by a list. Or it might not. So I rarely if ever optimize for a specific collection type, and when I do I'm usually back to using HashSet or Dictionary.
I generally take the approach of using linq until I have a good reason not to.
I know a lot of optimizations are in place that may be better than mine, it gets updated and is very versatile. Plus I know it will work on any collection if it changes in the future.
Once I have/need a specific behavior more attached to the specific collection or type, then I think about doing it differently
Great video! I would like to see how this method behaves with EF
It may depend on the company, as not all companies are going to update their applications every time a new .NET version is released. Maybe if you are working on a project that can be upgraded frequently, it can be good advice, but if not, finding an optimal approach for the version you're on can be the better option.
I am a little curious about the performance on "not found". These tend to have very different characteristics.
Other than that, just this week, I have been looking at replacing FirstOrDefault with Find on a .NET 8 project. Looks like it is good...until we switch to 9 (or probably 10)
Could you make a video about spans and why they are so cool?
That is a decent improvement in performance for free!
Another reason why you shouldn't write custom code to replace readable and consise linq unless you absolutely have to for performance reasons. Your custom approach is unlikely to get optimized by using a new version, whereas builtin code might.
Everyone who has been using FirstOrDefault breathes a sigh of relief..
I feel like the take away here is not that you use FirstOrDefault over Find because Microsoft is prioritizing it, but that contextless performance advice akin to "use this not this" should not be blindly followed without a deeper understanding and up-to-date knowledge.
Overall I would even say that the original advice is not wrong. It's just gonna be out of date soon.
That was one hell of a thriller! I enjoyed it the way one would enjoy watching Hitchcock's movies.
I’m using the Array specific Find and Exists methods as SonarQube advised me to do. We’re on .Net 8 right now, maybe I should run some benchmarks.
Can you update the test to have a sealed IntWrapper and test & compare it across 8 & 9?
Optimally have a test with the latest Framework version for library maintainers
Maybe just make it a YouTube Short, if it isn't enough content
Thanks for the explanation,
thank you for this series, I learn a lot
Really nice advice!
I cannot find any Blazor Deep Dive course. Is it currently being recorded?
Yes, Jimmy is currently working on it
1:46 That is wrong: if duplicates exist, the Single method throws an exception; that does not apply to First or FirstOrDefault
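A small sketch of the difference:

```csharp
var items = new List<int> { 1, 2, 2, 3 };

Console.WriteLine(items.First(x => x == 2));          // 2: returns the first match, ignores the duplicate
Console.WriteLine(items.FirstOrDefault(x => x == 9)); // 0: default(int) when nothing matches

// Single insists on exactly one match:
Console.WriteLine(items.Single(x => x == 3));         // 3
// items.Single(x => x == 2) throws InvalidOperationException,
// because the sequence contains more than one matching element.
```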
You could say instead that you do not use FirstOrDefault on in-memory collections, but should use it in EntityFramework ?
Find and FirstOrDefault work differently in EF
I still don't understand how Find can be faster in .NET 8 if the methods basically do the same thing... is it something in the IL?
It’s the enumerator
find is specialized for list, allowing faster access
Theoretically, just looking at the name, Find wouldn't necessarily need to iterate in order, so if you had list.FirstOrDefault(x => x.Firstname == "John") and list.Find(x => x.Firstname == "John"), it could get you a different object.
But in this case, Find iterates in order and the difference is just enumerator overhead.
Good work Microsoft
It's telling that you have an EF preface. The problem is always that people want one way of doing things. We have a hammer and want everything to be nails. Personally, I dislike the name "Find". To me, it implies it should act more like Where+ToList, where I am filtering and potentially getting back multiple objects. It should be renamed to "FindFirst" if that is what it is doing.
🎵 span, span, span, span, lovely spaaaaaaan, lovelyyyy spaaaaaan 🎵
Crazy that TryGetFirst doesn't return a boolean but the found element, and sets the bool found as the out parameter. Maybe it's a memory allocation optimization or something but it looks weird.
You need to have a bool that tells you if the element was found or not. Other languages like C++ use an iterator that points to past end in case element was not found (there is no need for this pair: value + found)
@@MarincaGheorghe I understand, but usually TryXXX methods return the boolean, and set the found/parsed/whatever element as the out value, so you can write: "if (TryXXX(obj, out var xxx)) { doSomething(xxx); }"
@@MarincaGheorghe the issue is that it works the opposite of how every other Try method works
Return bool if successful and out the result, so you can simplify the code by doing if(Try...)
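For comparison, the conventional TryXxx shape looks like this (a hypothetical TryGetFirst written the "normal" way; this is not the actual internal LINQ signature being discussed):

```csharp
// The conventional Try pattern: return the success flag, out the value.
static bool TryGetFirst<T>(IEnumerable<T> source, Func<T, bool> predicate, out T? result)
{
    foreach (var item in source)
    {
        if (predicate(item))
        {
            result = item;
            return true;
        }
    }
    result = default;
    return false;
}

// Call sites then read naturally:
// if (TryGetFirst(list, x => x > 5, out var match)) { Use(match); }
```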
This is one of those videos I don't quite like. Advice along the lines of "don't use the type's own built-in method" goes against what is designed into the language itself and feels very artificial.
If there is an older and a newer implementation inside the language library, use the new one. I think that's good advice. It's two different "default implementations". You have to go "against what is designed in the language" either way.
How hard is it to optimize .Find/.Exists to match that? Not that hard. Just get a span instead of for-looping (if someone just opens an issue).
Wow - you are blowing my mind
well ... kind of
OK, I made my comments because I'm always calling a DB with LINQ... With memory collections, I'm usually on the client... I guess I can use it with Blazor though... There are a lot of cases where you want to keep data local rather than call the DB for lookups...
Seems like MS wouldn't have to rewrite Find to use spans so much as just point Find to whatever method FirstOrDefault is using, and then maybe handle the result differently? I didn't look too closely at the code
Why don’t you put your courses on Udemy?
Why would I do that?
You use FirstOrDefault because the name is more precise; that's also how you can tell that Find is the older version.
And if you don't care about this, you can always build the loop yourself, because LINQ had garbage performance since release.
Good to see that's getting better.
I try to avoid SingleOrDefault, but that's just because of the cost of throwing an exception.
I'm working with a company that takes a long time to update .NET. Is it worth making this change, since we are still on .NET 7? Performance is important for our software
Why would you prefer `Func` over `Predicate`? The latter is more descriptive. What makes it "not modern"? Seems arbitrary to me.
Func is definitely more descriptive, since you can see it returns a bool without knowing what a predicate means. Also, Func can have more than one input type (up to 16, in fact).
Predicate is just a very specific implementation, feels redundant.
@@artemisDev yes, that's what the name predicate implies. We have one condition and it can be either true or false.
So when you read "predicate" somewhere, you should know that you can pass in some expression that returns either true or false.
A Predicate is just one very specific version of Func. Meanwhile, Func can be Func<TResult>, Func<T, TResult>, Func<T1, T2, TResult>, etc. They realized that the Func (or Action) generics covered everything, and that specialized versions like Predicate were just kind of confusing and slightly non-compatible.
There's no functional difference in behavior between a Func<T, bool> and a Predicate<T>, but you can't substitute one for the other because they are different types (although a lambda can be converted to either). I believe only the Array and List classes use Predicate, while LINQ always uses Func. Since LINQ is preferred for almost everything, and LINQ tries to have a unified syntax, encountering a specialized form of Func just introduces confusion that doesn't need to be there.
"Predicate" is definitely more descriptive (for me; I don't know about everyone else), but it introduces the concept of specialized terms that act as 'exceptions' to the general Func rule, and that's just a conceptual 'leak' that you're better off not surfacing in the language syntax.
And it's "not modern" because Predicate was introduced way early, before MS realized it was a bad idea. Unfortunately, by then they couldn't get rid of it, so now it's mostly just, "Please ignore this little bit of legacy trivia", or "Use LINQ instead."
@@artemisDev a predicate IS a condition that should evaluate to `true` when applied to something. The name `Predicate` carries semantic meaning that `Func` does not.
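A small sketch of the point being argued: the two delegate types describe the same shape but aren't interchangeable as types, while a lambda converts to either:

```csharp
Predicate<int> isEven = n => n % 2 == 0;          // the same lambda converts
Func<int, bool> isEvenFunc = n => n % 2 == 0;     // to either delegate type

var list = new List<int> { 1, 2, 3, 4 };

list.Find(isEven);               // List<T>.Find takes Predicate<T>
list.FirstOrDefault(isEvenFunc); // LINQ takes Func<T, bool>

// But the delegate instances themselves don't interconvert implicitly:
// Func<int, bool> f = isEven;      // compile error: no implicit conversion
Func<int, bool> f = isEven.Invoke;  // works: wrap one delegate around the other
```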
Man we’re all the same, we think just about one single thing… spans!
I had to use Find instead of FirstOrDefault because we treat warning as error and analyser was flagging that as warning.
Why is span iteration faster than array iteration!?
It's more based on pointer arithmetic, like in C++.
Saw this post on LinkedIn today😅
Sure, it might be slower, but I'll still use FirstOrDefault over Find.
You speak as fast as your code runs in dotnet 9, however I prefer find method talk speed😂
why use .NET 9 in production when it doesn't have LTS?
love it,
Find also happens to be a list only linq method. Many libraries only return IEnumerables. If you have to convert an IEnumerable to a list, it's going to be slower than FirstOrDefault.
Find isn't a LINQ method. It came along in 2.0 with List<T> in System.Collections.Generic and uses a Predicate. They also created Array.Find with a Predicate in 2.0 for all the boomers who didn't want to move on from arrays. LINQ (System.Linq) and lambda expressions (System.Linq.Expressions) were added in 3.5, and while thinking more globally about this whole concept, they moved on to the Func/Action philosophy.
But yeah, for the sake of compatibility or refactoring, it makes way more sense to work against IEnumerable than IList. There are concepts that become forgotten as the .NET environment becomes more and more littered by the javascriptification of its languages, with C# specifically as a victim.
@@krs-tube The person I replied to said "a list only linq method" so I explained it's not a linq method.
So take your own advice and get some social lessons while you're at it.
@@Rizon1985 I tagged the wrong person in my reply, apologies fam!
Nick specifically said and showed that .Find is a method on List<T>. I guess you watched carefully before dropping your hot take
@@krs-tube He said it but never fully explained why it was bad and what the consequences were…. I guess you just like posting mean comments for a dopamine hit…. Thanks for your useful comment
Great!
Stephen Toub is doing his magic 😂
Very insightful analysis but I have to disagree somewhat on not using something that currently performs better because Microsoft will eventually optimize it.
It's highly situational. And picking the right method is what separates good engineers from mediocre ones. If you really, really need that performance right now, then use Find. If you don't care all that much and don't want to refactor code, then use LINQ everywhere and play the odds that it'll get faster and faster over time.
You are somewhat right. It all depends on the situation. If you think you can upgrade the code later with latest version, then it's fine. But most of the times i would go for something more general/generic. I'll just have to increase the version & i get the benefits.
@@shahzaibhassan2777 Yep
I would rely on Sonar analysers to remove the cognitive load
At 4:06 I thought it might've been a local issue where Rider no longer jumps to the appropriate definition now on the first attempt. It loads the source files and then you have to go back and re-jump. I hope they fix that soon b/c I swear it used to work
what is so special about spans?
I have a full video talking about this
code cop, funny!
My respects to Nick and the folks at Microsoft.
Haha I had seen this one 2 or 3 days ago on LinkedIn xD
Dayum... why is the wrapped version so much worse in .NET 8?
Something bizarre is going on there...
Cost of de-referencing, memory lookups, cache fails perhaps?
@@artemisDev those factors should be fairly equivalent, since they're executing identical predicates in an identical manner to find a match.
Only difference is the Enumerator, so I'd personally expect the same performance delta between ref types as there was between value types...
Writing about LINQ on LinkedIn???
Another example of being too smart for your own good. Or rather, generalizing over a broader topic based on a specific example.
Your method runs slow? Profile it and find the lowest-hanging fruit. Cautiously build a heuristic over it, but only to potentially detect similar issues in the future, not to dismiss one type/method over another.
You can't profile a method that doesn't run in a loop. It might be called a few times per second, but it still adds dead weight. Performance starts with doing things efficiently in a minimal way. Optimizing via profiling is a cheaper way to make "slow" things a bit faster. Also, no benchmark (micro or profiler) can show scalability. So all this standard "pick a profiler and go" stuff is rubbish.
@@NotACat20
The idea of using a profiler is to narrow down the things responsible for the major slowdowns in your application, so it is pretty profilable. And no, it is not rubbish.
@@diadetediotedio6918 you will not be able to profile the inefficiency between the methods shown in this video.
Code Cop?! Issue arrest warrants for some of the guys on my dev team please! They are bad offenders!
Plot twists in this video 😱😱😱
SingleOrDefault doesn't go through a collection twice the length. It finds first match and continues to look for one more match, which in worst case leads to the end of the collection, and that's it.
And if the first match is in the middle of the collection and there isn't a second match, then it goes twice the length, which is what I said in the video
@@nickchapsas ah, then I misunderstood you, sorry!
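A rough sketch of the scan SingleOrDefault does (simplified for illustration; not the actual LINQ source):

```csharp
// Simplified illustration of SingleOrDefault's scan: after the first match,
// it keeps scanning only to verify there is no second match.
static T? SingleOrDefaultSketch<T>(IEnumerable<T> source, Func<T, bool> predicate)
{
    T? result = default;
    bool found = false;
    foreach (var item in source)
    {
        if (!predicate(item)) continue;
        if (found)
            throw new InvalidOperationException(
                "Sequence contains more than one matching element");
        result = item;
        found = true;
    }
    return result; // a lone match in the middle still costs a full pass
}
```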
Well, that is really great. People in my company are still using Framework 4.8, LOL. So Find it is.
Neat.
Meh. Microsoft can optimize LINQ all they want. Won't do you any good if your projects are .NET Core 3.1 in production and no one is planning to change that
You have to advertise .NET 8 and collect pro arguments.
I'm hoping this is a what not to do... You use FirstOrDefaultAsync so you will return null...
I'm guessing it is the same situation with .NET Framework.
Dangit, SonarQube flagged all of my FirstOrDefaults in my projects, telling me to replace them with Find... And I've made the change to make it quit nagging me...
I use Where() then FirstOrDefault() 😅😅😅😅😅😅
Mind = Blown :O
The reason the OP takes microseconds instead of nanoseconds is because they are running the benchmark on a potato.
I find your title confusing. I admit I'm not a very regular viewer, but I read the title initially as what your final advice is, not the original claim that you're going to dispute as the Code Cop. If you'd put "Stop Using FirstOrDefault" in quotes followed by code cop, that would make it clearer, imo, that someone else said to stop and you don't agree.
Premature optimization. There's usually something else you're doing that results in multi-second responses, and you should fix that before going down this rabbit hole.
What rabbit hole, lol? This is a bizarre take for something that is neither absurd nor problematic; it is an almost free thing you could do. And while I agree that refactoring your project for this exclusively because of optimization factors would be premature optimization, doing it from the start will not be a problem.
No, that's bad code
@@diadetediotedio6918 It's more important to have readable code than chase micro-optimization.
Good sample, but the title is kind of misleading 🤔
yes, very clickbaity considering how many FirstOrDefaults I have in database queries :)
Unless your app is making millions ($) per hour, this is so pointless... I honestly don't have time to worry about nanoseconds in code that gets hit maybe a thousand times a day.
OnlySpans 😂
Hashtable