Oh yes. I've set up tests like that for my team as part of a 1M-line codebase with about 7k unit tests.
That makes debugging them among all the potential noise absolutely crucial.
What you can also do is set a test environment variable (or similar) to control the log level, with some reasonable default when it's not present, and then hook up your CI pipeline to allow that level to be set optionally.
This way you are not logging much when everything is green, but when you need more verbose logs, you run that build again with the variable set to INFO or DEBUG.
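As a sketch of that idea (the TEST_LOG_LEVEL variable name, the Warning default, and the factory variable are all illustrative, not from the comment):
```
using Microsoft.Extensions.Logging;

// Read the desired level from the environment, defaulting to Warning so green
// builds stay quiet; re-run the CI job with TEST_LOG_LEVEL=Debug for detail.
factory.WithWebHostBuilder(builder =>
{
    var levelName = Environment.GetEnvironmentVariable("TEST_LOG_LEVEL") ?? "Warning";
    var minLevel = Enum.TryParse<LogLevel>(levelName, ignoreCase: true, out var parsed)
        ? parsed
        : LogLevel.Warning;

    builder.ConfigureLogging(logging => logging.SetMinimumLevel(minLevel));
});
```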
We use Serilog across our solution, so it's just a matter of telling it to log to the console, as in the snippet below. All logs get displayed during tests in the IDE and on the pipeline.
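Formatted for readability, that configuration is (requires the Serilog and Serilog.Sinks.Console packages):
```
using Serilog;

// Route all Serilog output to stdout, which the test runner and CI capture.
Log.Logger = new LoggerConfiguration()
    .WriteTo.Console()
    .CreateLogger();
```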
Really clever timing for your British audience.
Whenever I start my lunch break, there's always a new Nick Chapsas video ready to watch :)
I just woke up, so it's clever for Americans as well
The link to the blog of the guy you mentioned is missing?
I love this channel
Yes, if only he aspirated the consonants😥
As an upcoming SDET, I appreciate this, Nick! While competent SDEs know how to both develop and test their code, I believe the future of engineering teams will mainly consist of SDETs who can build features, fix bugs, and develop/execute end-to-end tests and tooling… I wonder how many in your YT community skip over your testing videos. Testing is HIGHLY important; it validates that the developer created the right thing and that the software product meets performance and security requirements.
To this I would add: you can disable exception handling completely when running in tests. That way the exception won't get handled and serialized, and the original exception will be visible in the test logs as well.
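For illustration, one way to do that in Program.cs (a sketch; the "IntegrationTests" environment name and the /error route are assumptions, not from the video):
```
// Only register the global exception handler outside of tests, so unhandled
// exceptions bubble up raw and appear with full detail in the test logs.
if (!app.Environment.IsEnvironment("IntegrationTests"))
{
    app.UseExceptionHandler("/error");
}
```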
I was just facing this exact issue today! Thanks a lot.
Ha! Outrageous! Just today I encountered a problem with a broken integration test, and found myself wishing for exactly this. I know what I'll be doing tomorrow! 🤩
In our projects, our logger is just a simple console output provider (structured or unstructured), and there is a scrubber outside our application that reads this stream of console logs and posts it to Kibana. This is on the TEST and PROD environments; we host our services in Kubernetes. In CI, since it's console output, it's already emitted as part of the logs during CI execution, and if something goes wrong or fails we can see the stack trace too. So we kind of already have this solution.
Yeah, I did this about 3 years ago, and it was such a game changer that when I presented it to the team, they didn't have an opinion and needed about 5 minutes to process it.
Very lucky to have this video during my integration testing course from Dometrain; I faced exactly the same issue while running test cases. Thanks for the useful content!
This is exactly what I was looking for recently. I tried to display exception details via problem details for the IntegrationTests environment, but this logging is way better.
Love your channel, unbelievably helpful for me to keep up with new .NET features
I really needed this. Thanks a lot for sharing!
I didn't know about that. Thank you for sharing this library!
The WAF is great for integration tests; it gives control over basically everything.
The only thing I've changed that it doesn't support out of the box is the ability to pass custom args to Main.
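The comment doesn't show the change; one rough workaround sketch, if configuration-style arguments are enough for your case (the --featureFlag switch is made up):
```
using Microsoft.AspNetCore.Mvc.Testing;
using Microsoft.Extensions.Configuration;

// Not true Main args, but the closest built-in hook: feed values in
// through the command-line configuration provider.
var factory = new WebApplicationFactory<Program>()
    .WithWebHostBuilder(builder =>
        builder.ConfigureAppConfiguration(config =>
            config.AddCommandLine(new[] { "--featureFlag=true" })));
```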
This is very nice! I tried it and I loved it, with some limitations for which I found a workaround. When you have a Logging section in your appsettings.json file and it sets LogLevel, calling SetMinimumLevel([with a different level]) has no effect. You have to change your settings; I did it by removing the Logging section, e.g.:
builder.ConfigureAppConfiguration(
    (_, configBuilder) =>
    {
        // Build the current configuration, then keep everything except the Logging section.
        var config = configBuilder.Build();
        var settings = config.AsEnumerable()
            .Where(x => !x.Key.StartsWith("Logging:"))
            .ToDictionary(k => k.Key, v => v.Value);

        // Replace all existing sources with the filtered in-memory copy.
        foreach (var source in configBuilder.Sources.ToArray())
        {
            configBuilder.Sources.Remove(source);
        }
        configBuilder.AddInMemoryCollection(settings);
    }
);
Any suggestions on how we can spin up one instance of our API in memory and still use ITestOutputHelper?
The problem with this approach is that we need to run our API project N times, which is not great.
In other words, it looks like it's not possible to do this when we're using class fixtures in xUnit 😞
Did you ever work out a solution @alirezant?
@@adamdiament3214 Not yet; I haven't had time to take a look, but I'll need to find a solution for this problem soon.
I think we might be able to do something similar using the IMessageSink interface provided by xUnit.
I'll try to let you know if I find a better way.
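For anyone following this thread: xUnit does inject IMessageSink into class and collection fixture constructors, so something along these lines could work (a sketch, untested against this exact setup):
```
using Xunit.Abstractions;
using Xunit.Sdk;

// Class/collection fixtures can't receive ITestOutputHelper,
// but xUnit will inject IMessageSink into their constructors.
public class ApiFixture : IDisposable
{
    private readonly IMessageSink _sink;

    public ApiFixture(IMessageSink sink)
    {
        _sink = sink;
        // Shows up as diagnostic output; enable "diagnosticMessages"
        // in xunit.runner.json to see it.
        _sink.OnMessage(new DiagnosticMessage("API fixture starting"));
    }

    public void Dispose() => _sink.OnMessage(new DiagnosticMessage("API fixture stopping"));
}
```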
I wrote a buffer logger provider that works quite well:
Basically, register a singleton that implements ILoggerProvider and, in its Log method, enqueues messages onto a ConcurrentQueue<string> _logMessages:
```
public void Log<TState>(LogLevel logLevel, EventId eventId, TState state,
    Exception? exception, Func<TState, Exception?, string> formatter)
{
    // Format once and buffer; the test flushes the queue later.
    var message = formatter(state, exception);
    _logMessages.Enqueue(message);
}
```
Then, in the InitializeAsync of your WebApplicationFactory, get it and clear it so you don't keep all the startup logs:
var bufferLogger = Services.GetRequiredService<ILoggerProvider>() as BufferLoggerProvider;
bufferLogger?.Clear();
In the constructor of your test:
this.bufferLoggerProvider = factory.Services.GetRequiredService<ILoggerProvider>() as BufferLoggerProvider;
Then, in the Dispose method of your test, write the logs out to the standard ITestOutputHelper and clear them:
// At the end of each test, flush the buffered logs to ITestOutputHelper
foreach (var log in bufferLoggerProvider.GetLogs())
{
    output.WriteLine(log);
}
// Clear the logs for the next test
bufferLoggerProvider.Clear();
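For completeness, here is a self-contained sketch of a provider like the one described above (the class and member names follow the comment; the wiring details are assumptions, not the commenter's exact code):
```
using System.Collections.Concurrent;
using Microsoft.Extensions.Logging;

// Buffers log messages in memory so each test can flush them to ITestOutputHelper.
public sealed class BufferLoggerProvider : ILoggerProvider
{
    private readonly ConcurrentQueue<string> _logMessages = new();

    public ILogger CreateLogger(string categoryName) => new BufferLogger(_logMessages);

    public IReadOnlyCollection<string> GetLogs() => _logMessages.ToArray();

    public void Clear() => _logMessages.Clear();

    public void Dispose() { }

    private sealed class BufferLogger : ILogger
    {
        private readonly ConcurrentQueue<string> _logMessages;

        public BufferLogger(ConcurrentQueue<string> logMessages) => _logMessages = logMessages;

        public IDisposable? BeginScope<TState>(TState state) where TState : notnull => null;

        public bool IsEnabled(LogLevel logLevel) => true;

        public void Log<TState>(LogLevel logLevel, EventId eventId, TState state,
            Exception? exception, Func<TState, Exception?, string> formatter)
        {
            _logMessages.Enqueue(formatter(state, exception));
        }
    }
}
```
Registered as a singleton ILoggerProvider, it then resolves the way the comment shows.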
How is Nick still using the old UI in Rider?
Very interesting and useful video. I had the same problem and will check out this package, but it would also be good to log the actual HTTP request sent to the controller, like the HTTP logging middleware does.
Does Serilog.Sinks.Xunit do the same thing?
What about authorization and authentication?
Very nice. But how would you accomplish the same thing if your WebApplicationFactory is used in an ICollectionFixture? I tried adding ITestOutputHelper as a constructor parameter and it yelled at me.
I believe fixtures are shared across tests, whereas ITestOutputHelper is per-test, so that probably won't work.
Thank you for the amazing content! However, isn't what's shown in the video a system test? I think an integration test is when a repository is tested against a fake database running in Docker or in memory, but in this video the whole application was tested. Correct me if I'm wrong.
I can't find the channel of the guy Nick mentioned, and there's nothing in the description as far as I can see. Does anyone know?
I added the missing link
How do you write tests where your application needs another service to do anything? Let's say you access data that uses a lot of enumerations coming from a service, so to test operations on the data you need to get the enums from that service first. But the enum service is quite dynamic: you can't just mock it, because you would have to mock too much data and also maintain the mocks whenever a user changes enums in the enum service. And you are trying to test the operations on the data, not the enum service itself.
It's a bit tricky, and it depends on whether that other service is part of your company or not.
If it is, the way to do it is to generate code from the enums on that other service, then use that generated code as an input to your service.
It's called contract testing, and yes, it is hard to set up.
Or you can just mock the enums and maintain the mocks.
@@RaMz00z It is part of the company, so you would create a kind of mock generator in the enum service and then use the generated mock in tests?
I've used the console logger for integration tests.
I'm not sure how this differs from using Serilog with a console sink in your tests; can someone explain?
Tests don't use the console for output, so you wouldn't actually see anything in the test output.
@@nickchapsas Ah, ok! Nice to know, thanks. I'll be trying that in my project!
I was literally searching for this logging issue today, and wasted 2 hours before finding the solution on some blog.
I'm confused. Why would you be doing "integration" tests on a CI pipeline? Why would a build server have access to the resources the full system needs?
An integration test should test the real thing if possible. It's unlikely an integration test would call a real external API, but a database can easily be spun up.
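For example, a sketch using the Testcontainers library, one common way to do this (the image tag is illustrative):
```
using Testcontainers.PostgreSql;

// Spin up a throwaway Postgres instance in Docker for the test run.
var dbContainer = new PostgreSqlBuilder()
    .WithImage("postgres:16")
    .Build();

await dbContainer.StartAsync();

// Point the app under test at the container.
var connectionString = dbContainer.GetConnectionString();
```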
"Hello everybody i'm naked"
I was thinking just last night about how useful a logger in the tests would be.
I am more interested in the integration testing itself, but not for Entity Framework. Where can I learn more about integration testing without EF?
Everyone always shows EF integration testing, which I understand is super easy with in-memory databases, but in real life EF sucks hard (especially further down the line) and should be avoided like fire.
In my case, all of the interactions with the DB happen through Dapper and stored procedures, and I haven't seen much on properly integration testing such code. Can anybody help?
In my course on Dometrain I show integration testing without EF, using Dapper instead.
@@nickchapsas Perfect! I'll check it out.
Noice 👌👌