If everyone had to learn .NET from Nick, it would be the most popular language
I use single-file publish a lot, very useful. Being able to AOT compile it would certainly make it much nicer, with all the speed.
One thing to note: comparing the first API calls in Postman, the first request under AOT is 7ms compared to 119ms! Sure, there are some things you can do to mitigate this, but having that right away is amazing.
This is great for me! I'm building an open source end-to-end encrypted cloud storage and needed to build a CLI for people who prefer them (like me), and the file sizes of the executables were huge 😅 It's also one of those places where a 10x start time improvement is actually quite visible.
I can't wait to be able to do this for real-world applications.
There are so many benefits to AOT, from the load time improvements, to the executable size, to not having to rely on the JIT on lower-end hardware.
Hopefully we start seeing better tooling and such for microcontrollers and RISC-V that will be able to take some of the education/hacker market share from MicroPython.
Great video. Well done! Very accurate info based on my experience. Last year, I created a translation tool for our dev team (translate old HTML/VBScript to HTML5/JavaScript). Started out in .NET 6 self-contained, but then I switched it to .NET 7 AOT. The use case favors AOT with devs translating pages one by one. However, I built the translator to be able to spin through folders and translate multiple pages using concurrency. Translating 1K+ files under AOT is actually slower than under MSIL.
I know I'm 9 months late to the party on this, but I love it. I was going to write an essay-long paragraph on why this is just how C# should be written where you can, but I'm too old and tired now to fight with developers who live by the mantra 'to simply exist and never push boundaries is enough'. Let their response time SLAs with clients be the thing that loses them the contract you win.
Great intro on NativeAOT.
But so far it's only good for startup time (serverless) and publish size (Docker), like Nick said.
Finally the feature is there. I love it!
When I saw minimal APIs, I thought, "huh, looks a bit like Node.js". So now, with NativeAOT, you can have a Node-like experience when creating small APIs without paying the whole Node VM overhead, which might be pretty useful for some applications.
It's like Golang on steroids lol
@@gordonfreimann Not exactly. I don't want to go into details, but since C# is compiled to IL, some optimizations can be made at the start of the application (this is why startup is relatively long), whereas with a fully interpreted language like JS that is almost impossible. JIT is a bit of a different story and, yeah, it is quite similar.
@@EzequielRegaldo Wouldn't necessarily say so; Go is built for this from the ground up, so everything it can do just works without thinking about "can this be AOT compiled". The Go compiler is also crazy fast at compiling; it's not even close in that department (like seriously, the Go compiler is orders of magnitude faster). Runtime I don't know... test it for your use case, I guess. For the most part I'd expect Go to edge out C#. Both C# and Go are much faster than my performance requirements pretty much all the time. I did not actually end up using Go for anything that I want to keep alive, though. Not because I think C# is hands down better, but because it would be more friction to keep it alive when I don't use Go regularly. I have to use C# at work, so I can't exactly forget the syntax; I would with Go if I didn't use it for a few months. The main thing I wanted was a smaller file size for containers, so this change is going to do exactly that.
@@shioli3927 AOT C# uses 30-40 MB, it isn't a huge difference
How is it that you have .NET news every day, but at work my .NET procedures haven't changed in some 2-4 years 😂
The more things change, the more they stay the same. If memory serves me right, MS removed the capability to compile to Native code in VB 5 (before the .NET days.) Then came along .NET with IL code, and now AOT is back again. Pretty soon I'm guessing MS will introduce a new language to the mix, and maybe they'll call it Assembler.
This is so awesome :) Hope Microsoft will keep investing in AOT area.
Nice to see this gaining traction. Earlier this year I created authorizers in AWS Lambda using a lightweight .NET 7 Okta/Auth0 authorizer that had incredible response times for JWT validation. Probably the fastest thing I have ever built (simple though). Took a while to figure out how I would do the JSON source generation, as that was new to me and the AWS SDK contracts don't match production AWS, but I got there in the end.
How were you able to figure out everything to exclude and the things that need to be source generated? I have an app but have not been able to work through all of that yet for my AWS Lambda.
Looks like IL2CPP, the tool from the Unity engine. I used it a lot. What I know after years: writing AOT-friendly code is difficult, but possible. And it will always be a few steps behind the JIT, and it will also contain some bugs. But in some scenarios it's very good.
This is really nice for autoscaled microservices where spin up times are important.
C# is one of the best languages. I worked with it for years; today I'm on Java and I really miss .NET...
In my tests, AOT and regular "Release" mode builds are exactly the same speed. Measured on for-loops that calculate simple time-series-like arithmetic.
Do-Me-Train!
Sorry, couldn't resist, really nice vid, thanks 👍👍👍
You are not the first to bring this up 😂
@@nickchapsas is it "Dome train" as in "get that into your thick dome"?
Hi Nick,
amazing video and content, I have a couple of questions and a little premise:
In order to gain the AOT startup time advantages, you always release your application, and then verify that the startup is faster as compared to the JIT compiler.
If the application makes use of some not supported features, for instance, the reflection-based JSON serialization, you just "discover" the error because you get an internal server error the moment the API is called.
1) Is there anything we can do to make the application fail at startup time?
I feel like it's OK that some features are not supported, but it's not really optimal to discover what you did wrong at runtime; it is bad for the user experience and it violates the concept of "fail fast".
2) You always release the application and then take measurements, is it possible to debug native code?
1) Is there anything we can do to make the application fail at startup time?
The compiler will emit warnings if you are compiling for NativeAOT and it detects code patterns that are problematic. You can actually see just such a warning in the video (it's the big block of yellow text). If you want the compilation to fail entirely, just enable the "treat warnings as errors" feature in the project file.
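For reference, a rough project-file sketch of that (IL2026/IL3050 are the trim/AOT analyzer warning IDs, in case you'd rather promote only those instead of every warning):

```xml
<PropertyGroup>
  <PublishAot>true</PublishAot>
  <!-- Fail the build on any warning, including the AOT/trim analyzer ones -->
  <TreatWarningsAsErrors>true</TreatWarningsAsErrors>
  <!-- Or promote just the AOT/trim warnings instead: -->
  <!-- <WarningsAsErrors>IL2026;IL3050</WarningsAsErrors> -->
</PropertyGroup>
```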
@@GumbootMan thank you! This is a good practice in general, also outside of the AOT scenario
@@GumbootMan It would be great to add some attribute that would allow you to tag an assembly as AOT compatible/incompatible, so that we could see an error at compilation. Would be great for libraries that rely on reflection, or runtime compilation.
@@nocturne6320 That's exactly how the warning system works. Libraries that have been annotated as trimmable and had their APIs decorated with the appropriate attributes will result in analyzer-based warnings at design time & compile time when the consuming project is marked for PublishTrimmed or PublishAot.
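For anyone wondering what that looks like on the library side, a small sketch (the attributes are the real ones from System.Diagnostics.CodeAnalysis; the PluginLoader class here is made up). Decorating an API like this is what makes the analyzers warn its callers in PublishTrimmed/PublishAot projects:

```csharp
using System;
using System.Diagnostics.CodeAnalysis;

public static class PluginLoader
{
    // Callers see IL2026/IL3050 warnings when their project
    // sets PublishTrimmed or PublishAot.
    [RequiresUnreferencedCode("Loads types by name; the trimmer cannot see them.")]
    [RequiresDynamicCode("May require runtime code generation, which native AOT cannot do.")]
    public static object CreateByName(string typeName) =>
        Activator.CreateInstance(Type.GetType(typeName, throwOnError: true)!)!;
}
```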
This would be way more popular if there was a transpiler/scaffolding tool that found everything that used reflection and implemented an alternative automatically, or if none existed, provided a meaningful error explaining which calls will fail (and line numbers, of course).
Amazing 😻 Waiting for your example with an AWS Lambda web API, especially the case where your Lambda has to consume your own NuGet packages from CodeArtifact.
I hope it can finally apply to WPF/WinForm applications. It can also prevent such client programs from being decompiled in a sense.
This is very interesting and exciting. Thanks for the great content!
This is very nice for serverless applications.
Thanks Nick. I think you should start making your videos on Mac or Linux, since so many people have questions about supporting platforms other than Windows.
Feels like a huge improvement for the future of serverless with this greatly reduced startup time. Really looking forward to seeing how serverless will evolve in a few years.
This is mainly useful for command line programs, and IoT Edge modules on memory-constrained devices, or devices that update over slow 3G cellular.
Most benefit will be for Lambdas and Functions, where you want it as small as possible and native.
I've already tried it and, inspired by Eric Sink's articles where he links C libraries directly into the native AOT app, did the same with some of our own applications. It looks pretty cool, but I don't think we will be able to do this in a real-world app just yet with the existing limitations.
What about some big projects with a lot of serializable objects? What about their memory and startup time?
Hi Nick, thanks for a very interesting and informative video. But let's imagine a minimal API project that contains a lot of HTTP clients and serialization flows (for example, 5-6 different serialization models). Do we now need to create 5-6 JSON serializer contexts and register them at startup? Or is that not necessary? Thanks
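You generally don't need one context per model; a single source-generated context can list all of them and be registered once at startup. A rough sketch (the model names here are made up):

```csharp
using System.Collections.Generic;
using System.Text.Json.Serialization;

public record CustomerDto(int Id, string Name);
public record OrderDto(int Id, decimal Total);

// One context covering every model the API serializes/deserializes
[JsonSerializable(typeof(CustomerDto))]
[JsonSerializable(typeof(OrderDto))]
[JsonSerializable(typeof(List<OrderDto>))]
internal partial class AppJsonSerializerContext : JsonSerializerContext { }

// In Program.cs, registered once:
// builder.Services.ConfigureHttpJsonOptions(options =>
//     options.SerializerOptions.TypeInfoResolverChain.Insert(0, AppJsonSerializerContext.Default));
```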
The biggest lacking feature for me is the ability to statically link native dependency assemblies. It defeats the point of single file publish and limits it to managed assemblies only.
Also, the file size output test wasn't fair; you'd need to enable trimming under link mode on the managed assembly to have a fair comparison.
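For what it's worth, roughly the properties I'd put on the non-AOT self-contained publish to make the size comparison fairer (in current SDKs the old "link" mode maps to TrimMode=full):

```xml
<PropertyGroup>
  <SelfContained>true</SelfContained>
  <PublishSingleFile>true</PublishSingleFile>
  <PublishTrimmed>true</PublishTrimmed>
  <!-- Closest current equivalent of the old "link" trim mode -->
  <TrimMode>full</TrimMode>
  <InvariantGlobalization>true</InvariantGlobalization>
</PropertyGroup>
```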
You can statically link some Linux native dependencies. What native dependencies were you trying to link?
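In case it helps, a sketch of what statically linking a native dependency looks like in the project file today, going by the NativeAOT interop docs ("sodium" and the library paths here are just placeholders):

```xml
<ItemGroup>
  <!-- Resolve [DllImport("sodium")] calls at link time instead of at runtime -->
  <DirectPInvoke Include="sodium" />
  <!-- Static archives to link into the final executable (paths are hypothetical) -->
  <NativeLibrary Include="libs/libsodium.a" Condition="!$(RuntimeIdentifier.StartsWith('win'))" />
  <NativeLibrary Include="libs\sodium.lib" Condition="$(RuntimeIdentifier.StartsWith('win'))" />
</ItemGroup>
```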
gRPC support is already enough to start using this feature. Very interesting.
I don't need it but it's awesome!
Will NativeAOT improve Azure functions cold start as well?
Going to use it for AWS lambdas
Hi Nick! You are a great teacher and I enjoy every clip you make. Have you thought about making these courses (from Dometrain, I mean) available on Udemy as well?
No this will never happen. My courses will always be exclusive to my own platform
woww ! Thanks for the great content!
Very nice. Ty! I wonder if it removes the startup cost of a new connection. .NET is very slow on connections compared to native.
Yeah #dotnet10 as you say, that'd be grand. :-). Great video Nick!
Very cool indeed, but I hope the efforts are not in one direction only; I'm worried about compatibility. "Tomorrow" this will be the new way to do things, and I'm still "fighting" to move from controllers accepting XML to minimal APIs. I like the fact that we are getting a glimpse of the future, but I hope not too much will be "buried" in the past (SignalR, for example).
Enjoyed your content, thanks! 2 questions though: What are extensions in the prompt and how do you use them? And why can't I save your video in my playlist? =(
Which console application tool are you using while publishing the AOT app in the demo? The one showing the colorful path?
Could this be used instead of obfuscating my code that runs on user desktops, to prevent people from easily decompiling it using dotPeek? Also, is it possible to just build a library project with AOT and then reference that DLL in another normally built C# WinForms/console app?
"Preview" agian? Nick, can you tag videos about preview features somehow, please?
How does this startup performance improvement scale? Is it always x times faster even for bigger/more complex applications, or always 100ms faster?
We'll take a very close look at that in the next NativeAOT video :)
@@nickchapsas Exciting!
Great video, Nick! Have you considered whether Native-AOT supports Linux? It could be beneficial for containerized applications if it's faster than AKS. What are your thoughts on this?
It does support Linux, and with .NET 8 it supports macOS too.
How does it affect integration testing? You need to change your code in some cases as you said (reflection). How do I test that using automated testing? Can you run integration/unit tests against native code? If so, do you still need to change how you develop/run your tests (IDE tools, additional workarounds)?
If I can't develop automated tests to make sure my code works, that is a big deal breaker for me.
You cannot test using the usual approaches once you've published native AOT. Typically you'll write your tests as normal with the testing framework of your preference and run those tests in non-AOT mode. How you approach automated functional testing of the native AOT app output will depend a lot on the type of project it is (web UI, web API, grpc, etc.)
@@DamianPEdwards Yes, that is what I was saying and it is a big no-no for me.
I didn't know I needed this till now 😂
This may be why Azure Functions is slow in adopting AOT... since most functions I write operate at a higher tier where the code is always running. I do this due to startup times... By not supporting this, many people are forced to a higher tier.
Very cool! Which console are you using?
Nick, is this valid for Desktop applications as well?
Would love to see you explore bFlat
Anyone know if the Complete Course Bundle will also include future courses, or just the released ones? 🙂 Btw, thank you for your videos on YouTube, Nick.
Just the released ones
I think this is a great match for Azure Functions or AWS Lambdas, where you are penalized significantly for cold starts.
Hi Nick, can I use this ApiJsonSerializerContext with generic types? Like
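In case it helps: the source generator can't take an open generic, but closed generic types work, so you register each concrete instantiation you actually serialize. A sketch with a hypothetical ApiResponse&lt;T&gt; wrapper:

```csharp
using System.Collections.Generic;
using System.Text.Json.Serialization;

public record Product(int Id, string Name);
public record ApiResponse<T>(bool Success, T Data);

// Each closed generic you serialize gets its own attribute;
// an open ApiResponse<> cannot be registered.
[JsonSerializable(typeof(ApiResponse<Product>))]
[JsonSerializable(typeof(ApiResponse<List<Product>>))]
internal partial class ApiJsonSerializerContext : JsonSerializerContext { }
```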
What about Native AOT for Linux, as nobody hosts their apps on Windows anyways?
Sure, just compile it for Linux. Windows, Linux and macOS are all supported, for both x64 and ARM64.
@@KimBrusevoldTV You can actually use a Docker image to build it as well. That's how AWS lets you build NativeAOT Lambdas from Windows for Linux.
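Roughly what that looks like, assuming PublishAot is already set in the csproj: you publish with the Linux RID, but the native compile/link step has to run on Linux (e.g. inside a Linux .NET SDK container with clang and zlib1g-dev installed), since cross-OS NativeAOT compilation isn't supported out of the box:

```bash
# Run on Linux or inside a Linux .NET SDK container (needs clang and zlib1g-dev)
dotnet publish -c Release -r linux-x64 -o ./publish

# ARM64 works the same way:
# dotnet publish -c Release -r linux-arm64 -o ./publish
```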
Hi, how do I make my terminal look like yours?
I'm wondering why you mentioned Docker and native compilation in one sentence. Is Docker still necessary for AOT-compiled applications?!
Why would anyone use Go after this feature is fully supported? Learning Go was amazing, but using it not so much. I guess I will stick with C# for a long, long time!
Shit, finally back to sanity? I hate packed/unpacked/self-contained/or whatever and huge nonsense compiled files! This should be the normal output as it was years ago.
I did not understand something: if you generate the code with AOT, are you basically compiling both your stuff AND the Kestrel web server together?
Yes, everything goes into that single file
Nice!
Can we attach to the NativeAOT process in Visual Studio to debug the code?
Yes but you'll be debugging a native app at that point which is a different experience
Nick, I have another question. Is your chair good for long hours at the computer? I see that it is a gaming chair. I found a lot of info saying that only an ergonomic chair can be comfortable. Thanks in advance for the answer! 😊
I’ve been using it for 2-3 years now and I’ve been very happy with it. I sit on it for 10-18 hours a day
I can really recommend the Herman Miller Embody if you can afford it :)
How do we use this in a Lambda? I had the idea that we point API Gateway at a Lambda so we no longer need our Lambda to be an API itself, i.e. we only need the function handler. What am I missing here?
Well, we could become experts in other languages such as Go until we have full support for AOT, and have these things much sooner :P
Anyone know if this would be a good idea with isolated Azure Function apps? Do they execute a new instance on each invocation?
Tried to try it... but it isn't supported in the official 8.0 Docker image
How about decompiling? For native, is it still possible?
It's native code, so it's just like any other native code.
You can read assembler instructions via IDA for example, but can't get source code.
How to do the 4:57 implementation in Visual Studio 2022 community?
this has been working for a long time
GAC cancelled?
would you use this in prod?
May I know the IDE you're using?
Does this work with EntityFramework?
Why is everything "INSANE" nowadays? Technology develops, things get smaller, small programs used to be small. NativeAOT does exactly what it advertises. How is that "INSANE" in all-caps? I would call it insane if it would NOT perform how it performs.
How do you do it on Linux? There is no Visual Studio there.
but, what is NativeAOT?
Nice
use autocannon to compare the diff...
is linux arm64 supported yet?
Sure
My face after watching 5+ minutes and slowly realizing that I confused AOT with AOP.
What about garbage collection?
There’s still a garbage collector in there
I wouldn't use it yet, maybe in the future
👍
Is there a command line option to build aot instead of modifying the project file?
Yeah you can always pass csproj parameters through the CLI.
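Something along these lines, i.e. the same property you'd otherwise put in the csproj, passed on the command line:

```bash
dotnet publish -c Release -p:PublishAot=true
```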
@@nickchapsas We don't recommend that approach, as the PublishAot property set in the project file will also enable the trimming and native AOT analyzers, resulting in warnings during design time (in the editor) when you call APIs that can cause incompatibility issues with native AOT. So yes, it's technically possible by passing the property value as stated, but as designed it's intended to be put in your project file.
@@DamianPEdwards thanks! that's a very important clarification of the use and effect of AOT on the application design
I am sure they are not magically 10x faster just by compiling them to native executable. Maybe 10x faster startup time. But overall no way. It might even get a little bit slower.
He says exactly that in the video. Did you even watch it?
@@drewfyre7693 Yes. I was simply addressing the clickbait in the thumbnail
It's 10 times faster startup, not runtime, like mentioned in the video.
@@nickchapsas To be fair, the thumbnail does say "10X FASTER AND SMALLER .NET APIS". 😄
@@phizc Gotta get them clicks somehow
😢😢 EF Core is just 😮😮😮😢😢😢 gone 😢😢
so machine code... cool
I feel like Microsoft optimizes things which are fast enough for 99.9% of users,
but does not care about the things people really need.
For example:
1. The XSLT parser is still at version 1.0 and does not do XSLT 3.0 transformations.
2. RDLC is still not planned for creating reports.
3. Developing a backend with minimal APIs or ASP.NET Web API is very nice, but when it comes to the front end, there is no real competition to Node.js. Razor is old, and Blazor is slow and has a small ecosystem.
I think there's a need for real-time applications, and I do believe that such low-level functionality is a step toward achieving that.
It's been a long time since I used XSLT, but I remember that the XSLT processor can be extended. There may be an extension library that achieves XSLT 3.0.
GO's dead
Edit: Oh wait, not yet but soon
Blazingly fast
Without JWT support it is useless...
Also, the security of an AOT app has no comparison with a JIT app
How do you define security here given that it’s a very broad term that could mean anything
@@nickchapsas I mean, we can't reverse code generated as AOT as easily as we can reverse JIT code; there are some tools that can decompile a JIT-compiled application and show the code behind it...
I hope it includes a garbage collector in it! 😂
It does
A couple of notes.
First, I don't know what AOT stands for and had trouble finding this online.
Second, it has bothered me since .NET Framework 1.0. Why has Microsoft resisted being able to compile to native code? I've always thought the JIT compiler was silly. This seemed like the obvious need to me. And yet it's considered an exciting new feature now? This is insanity to me!
Finally, why is this specific to APIs? Let us compile our code please. Not just for some tasks. Just let us compile our code!
The thinking behind all this here is really annoying.
Because C# is not C++? It's designed to run in a runtime as managed code, with garbage collection... It's designed to be agnostic of processor architecture. Also, it's Microsoft's answer to Java and the JVM.
AOT means "Ahead Of Time", the opposite of JIT "Just In Time".
@@GumbootMan Before C# came out, I had a background in C and C++. I'm not saying compiling to native code is trivial, but it isn't that hard. It's just that the mindset behind .NET has been going in another direction for reasons I never quite understood. And now changing that design to more easily compile to native code may be hard.
@@HikingUtah Right, compiling to native code isn't that hard. The JIT is doing that already, you "just" need to run that as part of the compilation process. I'm saying that's not the problem, the problem is dealing with dynamic features like runtime code generation, reflection and runtime assembly loading. These features have been there since .NET 1.0 (I'm pretty sure) so it's not really a change in direction. Of course you may argue that these features are more trouble than they're worth but MS doesn't really have the luxury of ignoring them now that they are so widely used.
AOT = Awesome Optimization Tricks
Java is still better, because it's an awesome and beautiful island, while C# is just another programming language... :/
lol ok boomer
Seems odd that it doesn't support reflection. There are already attributes to flag classes and class members so that they don't get trimmed, so it's odd that reflection and the default JSON deserialization functionality would not work.
But of course your 3rd party libraries are going to be hit or miss, since they probably won't have their classes properly flagged.
Seems like a good reason to be weaned off reflection and move on to source code generation
ERR_SSL_PROTOCOL_ERROR when I created the project without any changes
that super sexy ohmyposh cmd line =]