FREE TDD TUTORIAL: Learn what it takes to get started and learn the skills of TDD, with a hands-on demonstration by Dave Farley ➡ courses.cd.training/courses/tdd-tutorial
totally unrelated but... as soon as I started this video my wife sighed and said "I want him to read bedtime stories". So if you are looking for a new project channel....
I wish this video had been made before I started working as a backend engineer. After a few months, I realized I was spending way more time fixing those very tightly coupled pytest mocking parts than the actual code. The overall design was terrible, without any SOLID principles, and the test code had been mocking third-party libraries and asserting useless things. I thought unit testing and the mocking technique were rubbish, but as this video tells me, it's my code and the way I have been writing the test code that are rubbish. I agree 1000%. We should focus on public interfaces rather than on how a method of the SQLAlchemy library is called with some useless arguments.
Finally an episode with real code examples, and they were also in my favorite language, Python! It's the one thing I'm missing most of the time: concrete examples which give context to the currently discussed topic. I really love you Dave, but more practical examples and more code could help you reach a bigger audience. I've been a developer now for over 10 years, and without examples I often find it hard to follow your thoughts, or at least, if I can follow them, I lack the understanding of which level they apply at. Keep up the good work!
Apparently, I was in a different mindset when I read the video title. I thought this was going to be about making fun of other people's code (mocking). ... Still, I would be interested to hear your ideas on the pros and cons of making fun of other's code.
should help people learn how to write better code instead of making fun of their effort...especially if they are members of your team that you will be working with for a while.
"I generally prefer to have a layer of my own code between almost any 3rd party code, and the rest of my system". This is the heart of something pretty deep and difficult, I think. I think it'd make a great initial assertion for a whole episode.
In my own project we've stopped the use of mocking libraries for a while. We prefer to write tests that focus on the what, not the how. If we change the how without breaking the behaviour, we want the test to still pass. We still use fakes and stubs for, say, the DB. I tend to take the view that the unit is behaviour, and start from the outside but then don't really work in much. I will just make the test pass, and layers only emerge through refactoring. I was surprised you had found value in mocking, as we've really found the opposite: mockless tests have increased our confidence in our tests, eliminated manual regression testing, etc.
I agree about putting a layer of abstraction around the edges, but for a database specifically I do find it useful to write small tests for those layers, but tests that run against a real database. For example, test the data access layer performs the query you expect it to, and then feel safe in mocking that layer for the rest of the code. You do also test these from the higher level integration/acceptance tests, but the feedback is slower and it's harder to to TDD this way. If you write minimal tests for this layer but use a real database e.g. running in a throw away local container, you can get instant feedback as you type your SQL (especially if you use a testing tool that runs as you type). You do have to be careful to keep tests independent of each other if you use a real database, using random IDs etc. I have always found mocking connection/command/query objects to be very painful and provide no value, they just mirror the code. When I first started unit testing, this kind of thing made me think unit tests were pointless, when they were actually just awful tests. Always test for desired behaviour.
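A minimal sketch of the kind of focused data-access test described above, using Python's built-in sqlite3 in place of a throwaway containerised database (the BookRepository class and table schema are invented for illustration):

```python
import sqlite3

class BookRepository:
    """Thin data-access layer: no business logic, just the query."""
    def __init__(self, conn):
        self._conn = conn

    def add(self, title, isbn):
        self._conn.execute(
            "INSERT INTO books (title, isbn) VALUES (?, ?)", (title, isbn))

    def find_by_isbn(self, isbn):
        row = self._conn.execute(
            "SELECT title FROM books WHERE isbn = ?", (isbn,)).fetchone()
        return row[0] if row else None

# Test against a real (here: in-memory) database, not a mock of the driver.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE books (title TEXT, isbn TEXT)")
repo = BookRepository(conn)
repo.add("Continuous Delivery", "9780321601919")
assert repo.find_by_isbn("9780321601919") == "Continuous Delivery"
```

The rest of the system can then treat the repository as a mockable boundary, because its real query behaviour is pinned down here.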
The tests running against a real database you describe are already (small) integration tests and I agree with you that writing these tests is good practice.
I like to do my DB tests _in_ the database, using stored procedures. Whenever team dynamics allow me, I will make sure that my DB layer is fully self-sufficient, without requiring external dependencies for things like, for example, invoicing of subscriptions.
I generally agree with everything said in the episode, but it still begs the question: even with every abstraction layer in place, there is still a piece of code that will interact with the 3rd party code, we're just making it as small as possible. What is your advice to test this code? Do you recommend leaving this to a higher layer on the test pyramid, maybe integration or even e2e?
Yes. 3rd party adapters are good candidates for integration testing. If you update your vendor component, they serve as acceptance tests for whether the new delivery still fits your expectations.
Terrific advice. Mocking is certainly a powerful tool, but I had to discover the pitfalls you point out for myself. I once started to mock out the entire Jenkins Python library. Like touching a hot stove, it was a painful lesson in the correct scope for mocking. Keep putting out these great videos!
Your last point about having a wrapper or abstraction around a third party is really on point. I recently tried to figure out how to mock a third-party thing because I was trying to avoid spinning up the database connections it was making, and it was a nightmare to try and test. Whereas when I abstracted it, I could test what I actually wanted to test, without worrying about what the third-party code was doing underneath.
I agree with everything you say in this video Dave, apart from one thing (I don't think you are disagreeing with me here - just wanted to add for clarity). As an unfamiliar engineer picking up this feature, the structure of the codebase and established patterns might not be clear to me - I find that often the best place to start is with an acceptance test. This way you can add the easiest test to write straight away, and then, as the possible implementation options become clearer, you can use TDD until you pass the acceptance test. (my god, that might have sounded a bit like XP - still, what you're saying makes total sense - it just depends at what level the changes need to be made)
I agree. I start with Acceptance Tests too, it is just that this video isn't on that topic, and I don't use tools like mocking libraries in Acceptance Tests. My Acc Tests run the code as though it is in production, so I use custom stubs, under test control, to fake interaction with external systems; everything else is real.
Very useful info. I normally create an abstraction layer. I will never directly reference 3rd party code without an abstraction. TDD always drives me in this direction. The benefit is that my code relies on the abstraction, so I can change what the abstraction uses without breaking all my code. This is very useful if you want to swap out one 3rd party library for another.
I see you are on the right path to understanding unit tests. One of the two main goals of unit tests is to create documentation in code. Good documentation is easy to understand. Mocks make unit tests hard to understand, on top of coupling them to the implementation code and giving us false negatives when we keep the functionality but change the internal implementation. Usually mocking makes sense only when calling outside the process, like calls to a database, filesystem or remote resources, but never use mocking libraries to write those mocks. The other way to test those calls is by using integration tests. A unit in the "classical" style is a unit of functionality. A unit can be small (a single method call) or large (many objects collaborating to provide functionality), but it is always ONE functionality. If you want to test more than one functionality in one test, then create an integration test.
Yeah, writing an adapter for the external code is really just a mock, I agree 😅 And yes, those weird Python runtime hacks are usually not needed at all. Just create an interface of the adapter and inject your test behavior. This is quite useful to simulate all kinds of external failures and ensure the calling scope can handle them gracefully 😄 Especially when talking about Python, there are very elegant solutions with typing.Protocol to make use of the flexible type system through duck typing 😂
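For anyone unfamiliar with the typing.Protocol approach mentioned here, a small sketch (Storage and all the method names are made up for illustration) — the in-memory test double satisfies the protocol purely through duck typing, with no inheritance and no mocking library:

```python
from typing import Protocol

class Storage(Protocol):
    def save(self, key: str, value: str) -> None: ...
    def load(self, key: str) -> str: ...

class InMemoryStorage:
    """Structurally matches Storage; simulates the external system in tests."""
    def __init__(self):
        self._data = {}
    def save(self, key: str, value: str) -> None:
        self._data[key] = value
    def load(self, key: str) -> str:
        return self._data[key]

def remember_greeting(storage: Storage, name: str) -> None:
    # Code under test depends only on the Protocol, never on the real adapter.
    storage.save("greeting", f"Hello, {name}!")

store = InMemoryStorage()
remember_greeting(store, "Ada")
assert store.load("greeting") == "Hello, Ada!"
```

A failing double with the same shape can raise exceptions to simulate the external failures mentioned above.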
Basically abstract away your third parties to bring back what you need, then you can just mock that abstraction instead of trying to bend the third party code to your will
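As a rough sketch of that idea (Notifier and the other names are hypothetical): the production code depends only on an interface you own, and tests substitute a recording double for it rather than bending the third-party client to your will:

```python
class Notifier:
    """The only notification interface the rest of the system sees."""
    def send(self, recipient: str, message: str) -> None:
        raise NotImplementedError

class RecordingNotifier(Notifier):
    """Test double: records outgoing messages instead of using the network."""
    def __init__(self):
        self.sent = []
    def send(self, recipient: str, message: str) -> None:
        self.sent.append((recipient, message))

def welcome_user(notifier: Notifier, email: str) -> None:
    # Business logic under test; knows nothing about the 3rd party client.
    notifier.send(email, "Welcome aboard!")

notifier = RecordingNotifier()
welcome_user(notifier, "ada@example.com")
assert notifier.sent == [("ada@example.com", "Welcome aboard!")]
```

In production, a thin adapter implementing the same interface would wrap the real third-party client, and only that adapter needs integration testing.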
This 60-something programmer has had no formal training on testing. I've learned a few things over the years, etc. I'm really liking that web page re: "Mocks, Fakes, Stubs and Dummies". Excellent to see some of the variations.
Summing up several years of painful experiences in a concise way, nice job. The only thing that is missing is how to tackle error testing. Especially when it comes to databases, it is very difficult to provoke different error states at the acceptance-test level if you're using a real database. Therefore, I keep mocking the database libraries for these kinds of tests (of course, I don't actually test whether the database method was called with specific attributes, as that is not the goal of the test).
It's funny because the carpenter at 3:42 reminded me of years and years ago when my woodwork teacher at school passed me back my little model boat and said "Do you want to be a butcher when you grow up, son?" But seriously, it's maddening the long struggle to get our industry to use TDD... Just imagine a mechanical engineer saying "Micrometers? Dial indicators? Rulers? I can't be bothered with any of that... it's a waste of time". You've got me thinking now: maybe what the BDD people should have done, instead of all this talk of behaviour and specs, is rename tests as "measurements" (or something cleverer than that but with the same meaning)?
I'm a huge fan of TDD. I've used Mocks so, so much, and it's always a huge pain in the ass, as well as being very brittle. Will study your "put in a facade" technique, everyone wins!
I think I get the part where you create a facade as a thin layer at the boundary of your code. But I hope you mean we have to mock the facade instead of the 3rd pty lib. If we don't mock anything, then I/O will be done and tests will be slow.
Nice video, thanks! I also tend to create code at the boundaries of my systems to interact with external services, and there's a design pattern for that: Humble Object. It's very helpful for that sort of situation.
Totally unrelated to the actual content of the video, but I really appreciated that you put the traditional woodworking face marks into your animation. Nice touch!
I'm not quite sure I understand. In the second example there is no longer any DB code, so yes, the test is simpler, but is that not because the implementation has also been simplified? Is the idea to put 3rd party code behind abstraction layers and to then inject simpler implementations at runtime?
I agree with this observation. Very confusing and poorly explained. So the "good" example was using an in-memory Store object for the book. How is this related to the DB? Seems like it was testing the mock using Store.
The generic pattern for database interactions used here is the repository pattern. It's quite common to have a piece of code that takes your semantic inputs (like add book) and then just does the SQL part. There must not be any business logic in that object (like validating the ISBN). Those repository implementations do not get covered by unit tests, but instead are covered by some form of integration tests. Checking that you can actually store in the database happens only after you know that the core logic of your system works.
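A sketch of the split this comment describes (all names invented for illustration): the business rule is unit-tested against an in-memory repository, while a SQL-backed implementation of the same interface would be covered separately by integration tests:

```python
class InMemoryBookRepository:
    """Stands in for the SQL-backed repository during unit tests."""
    def __init__(self):
        self.books = {}
    def save(self, isbn, title):
        self.books[isbn] = title

def add_book(repo, isbn, title):
    # The business rule lives outside the repository, so it is
    # testable without any database at all.
    if len(isbn.replace("-", "")) not in (10, 13):
        raise ValueError("invalid ISBN length")
    repo.save(isbn, title)

repo = InMemoryBookRepository()
add_book(repo, "978-0-13-449416-6", "Clean Architecture")
assert repo.books["978-0-13-449416-6"] == "Clean Architecture"
```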
This simplifies the test and allows for a more direct testing of the add_book functionality. So yes, the idea is to put 3rd party code behind abstraction layers, as you mentioned, and in this case, to use a simpler implementation (in the form of a Mock object) during testing to avoid having to rely on external dependencies.
@@dinov5347 The "good" example was indeed using an in-memory Store object for storing book data, but the point of the example was to demonstrate how TDD can simplify unit testing by decoupling the database implementation from the test logic, in other words it helps you identify and avoid issues with mocking 3rd party code. The idea is to put 3rd party code behind abstraction layers and inject simpler implementations at runtime. In this case, the Store object serves as a simpler implementation of the database connection, which allows the developer to focus on testing the logic of adding a book without having to worry about the complexities of SQL. The Store object is not related to the database in any way; it's simply a data structure that the developer uses to test the add_book() method. The Mock library in Python allows the developer to avoid unnecessary implementation details of things like SQL, and the Store class can handle the data within it.
@@antdok9573 I have never coded in Python, but in Java you should not use an in-memory database; instead, simply mock the database response using Mockito. Using in-memory databases in JUnit tests causes unpredictable results when running complex SQL queries (e.g. ones dependent on DB2 SQL). It probably works locally but only works intermittently during Jenkins builds. It could break on Jenkins after two months or so due to memory exhaustion or a deadlock of some sort. It seems like he is using a facade, which is only mocking the DB response. As you wrote, he is only interested in the interactions. He wants to separate the essential complexity/business logic from the accidental complexity/database interactions (which he does not care about), thereby making his code more robust and easier to change.
I mostly work on legacy PHP projects and I'm mocking extensively. Unfortunately, I don't have much choice: if I don't introduce unit testing, it's extremely hard to realize where I broke something, and even harder to realize I removed a feature that is paramount. I really don't see a better approach. Everyone else talks of ditching the ball of mud, but that ball of mud is generating the dough that backs my payroll 🙃
I find trying to understand the problem your code is supposed to solve is easier than getting requirements. Methods like Domain Driven Design are for that. I can write the specifications myself in BDD format and get feedback from a running system from end users myself 😉
@@skyhappy My problem is with the strange edge cases that they think of after you have what should be final code and then claim that it is important and cannot ship without it. It is a case where I don’t know xyz property creates an invalid combination with abc, just because of the manual process that was done before, it is something small to them but has a cascading effect on code downstream. These things are often overlooked because they don’t do that process very often and they forget until someone else reminds them during end-to-end testing.
Excellent video points. I prefer to only test dependencies behavior in isolation. And only against each version. Behavior across internal code: wrapping or instrument injection is important, but also important is finding ways to test behavior without altering the systems configuration primitives from production. And in very short or small connections in the systems.
Isn't the improved example still mocking third-party code, since it mocks the add method of the list object? This would break if the implementation changed to use the extend method instead, for example, so why not just do an assertEqual on the contents of the list?
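The commenter's point can be made concrete. This sketch (the names are illustrative) contrasts an interaction-style assertion, which breaks if the implementation switches from append() to extend(), with a state-style assertion on the list's contents:

```python
from unittest.mock import MagicMock

# Interaction-style: coupled to the exact method the code happens to call.
store = MagicMock()
store.append("Dune")
store.append.assert_called_once_with("Dune")  # fails if code switches to extend()

# State-style: survives that refactoring.
books = []
books.extend(["Dune"])    # the implementation detail is free to change...
assert books == ["Dune"]  # ...as long as the observable outcome holds
```

Asserting on the resulting state, as suggested above, keeps the test coupled to behaviour rather than to a particular method of a built-in type.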
What about the SQL query? Usually there's a lot of business logic hiding inside. Sometimes some of this logic is refactored or moved from the query into the code, and vice versa, for optimisation purposes. Therefore I prefer to write tests that use a real DB to test the code + query logic.
And again, so true... Thanks again for a very helpful video. Will show it to my frontend colleagues who mock the axios httpClient in all their methods...
I have a FREE TDD TUTORIAL with a hands-on demonstration where you can learn what it takes to get started, and how to learn TDD skills and use TDD to improve the design of your code ➡ courses.cd.training/courses/tdd-tutorial
You implement a version of your abstraction for storing and retrieving things that talks to a DB. You test this as an integration test, but not as a unit test. You end up with testable code, and a better design this way.
What are your thoughts on mocking certain operating system constructs, like file I/O? I often use an in-memory file system fake to avoid needing to spill unit testing into the host operating system. Would that qualify as a "3rd party API"?
My approach is not to mock the 3rd party API at all, but instead to write my own (always simpler) context-specific abstraction and mock that instead. It is usually much easier, and my tests are much less coupled to implementation detail. So instead of mocking things like file.open file.write file.close I'd create some code that did... storage.save and mock that instead. It's not my job to test that the operating system knows how to open files, I am only interested in my use of those features, so I test to an abstracted outcome in the scope of TDD, and I will have a simple smoke-test that verifies that my storage is configured correctly and so works in a general sense.
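A rough illustration of the shape described above (FileStorage, FakeStorage and export_report are invented names): one small class owns the real file I/O, and everything else is unit-tested against a fake of the abstraction rather than mocks of file.open/write/close:

```python
import os

class FileStorage:
    """The only code that touches the real filesystem."""
    def __init__(self, directory):
        self.directory = directory
    def save(self, name, text):
        with open(os.path.join(self.directory, name), "w") as f:
            f.write(text)

class FakeStorage:
    """In-memory stand-in used by unit tests; no OS involvement."""
    def __init__(self):
        self.saved = {}
    def save(self, name, text):
        self.saved[name] = text

def export_report(storage, lines):
    # Code under test: knows only about storage.save, not file handles.
    storage.save("report.txt", "\n".join(lines))

fake = FakeStorage()
export_report(fake, ["widgets: 3", "gadgets: 5"])
assert fake.saved["report.txt"] == "widgets: 3\ngadgets: 5"
```

FileStorage itself would then be covered by the simple smoke test mentioned above, not by unit tests.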
If you want a fix for @3:00: typo "develepor" -> "developer". The only person who doesn't make mistakes is the one who doesn't try anything! Love the channel, great job.
Unfortunately, once the video is up, we can't change it without losing the views and interactions, or confusing people with "why are there 2 versions of this video". I wish YouTube supported this kind of edit, but it doesn't, I'm afraid - sorry for the typo!
I really don't get this video. So don't mock HTTP responses? If so, do you just not test the small unit of code that receives the call, and test everything else?
No, I don't think it is. I think it is a bit too complex and too intrusive; it needs back-doors into the system being tested. My preference is to isolate tests from one another using what I call "functional isolation" - use the natural boundaries of the system to isolate state from other tests, e.g. each test creates a new account and a new product that only it can use.
I have a 3rd party service that I need to mock, so I watched this video to refresh my knowledge... the thing is, I don't feel he answered me! He talked about many things in a very general way. Is it only me? Should I watch it again with my cup of coffee?!
Different problem, same solution. Use "Ports & Adaptors" to fake the external API for unit testing and mock it. Use test controlled external stubs in acceptance tests, talk to your code through the prod version of its interfaces - it sends real messages, packets, api calls. Then use contract testing to check that you can talk to a real version of the 3rd party API.
Does it make sense that I follow this channel? I am not even a junior; I just had a 3-month internship as a Java Developer. I mean, I don't understand a lot of things discussed in these videos, but I still find it interesting.
Definitely the time to learn this. Because a lot of senior engineers are too 'smart' and don't like to do TDD (or train in it even if they know about it), and the problem is owned by the folks that have to support the legacy code afterward. Which is not a pleasant thing and is challenging to fix. So learning it sooner is definitely better.
It totally does make sense. I think it can offer amazing and confusing perspectives when you've only just scratched the surface and are starting to take baby steps into the professional world. I assume you most likely will see the ugliest side of code possible as an intern. I speak for myself as a relatively new guy too.
Wait I don't think I get your comment. The analogy is about not testing versus testing. Could you explain how you saw it as 1 type of testing vs another?
@@WouterSimonsPlus Good luck with avoiding "business logic" (also known as "what the program does") when designing your relational database and the queries for it.
Your code is in too small a typeface; I cannot read it on a mobile device. Hint: use full screen and a large font for code examples. There is a reason why every YouTube coding video does that.
I can literally see and hear you in front of your screen while you were coming up with the bad code: "why exactly did I think making this video is a good idea" :D
That "bad code" example is much better than what he prescribed. It's also too simplistic. Real programs have logging, tracing, metrics, database transaction handling, error handling, and more of the so-called (by the snake-oil salesmen) "cross-cutting concerns" - also known as just code. If you apply his technique, most of the real complexity is swept under the rug, while your "unit tests" test a few mocked function calls and call it a day. The example test is worthless, and the whole program should be tested by integration tests. The complex logic, if any, can be extracted into IO-free procedures, testable with unit tests. But his example does not have any.
@youtubeenjoyer1743 I think we have a very different definition of what each type of test is supposed to do. Also, which type of applications we build, how to build them and how to evolve them over time.
Code that I wrote to have layers of abstraction and proper testing (much like the good example) was refactored to reduce the layers to make it easier and people say it was small serverless functions anyways that did not need all the complexity I was putting in. Who knows, maybe times are changing…
It's clear from the title of the video that we should refrain from mocking third-party code: during unit testing we should focus on what our own code produces, not on testing the implementation of functions like print() in Python. The input() function provides a good example of why a test double is still sometimes needed: without one, every test would have to wait for a human user to input text and press enter, which would significantly slow down the testing process. So, just test the code you actually wrote.
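One way to handle the input() case without patching the built-in at all, as a small sketch (ask_name is an invented example), is to inject the reader so a test can supply a stub:

```python
def ask_name(reader=input):
    """Production callers use the real input(); tests inject a stub."""
    return f"Hello, {reader('Your name? ').strip()}!"

# No human and no monkeypatching: the stub answers instantly.
assert ask_name(lambda prompt: " Grace ") == "Hello, Grace!"
```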
16:34 Dave that's because you have a lot of experience, not everyone is a Dave Farley, not everyone had the opportunity or time to practice TDD at work. So someone not like yourself can totally write rubbish code after writing the tests, that's because they have written rubbish tests, due to the lack of experience.
Agree design is extremely difficult and not everybody is a Dave Farley. But using TDD will help you to think about your design before you write the production code. It will also help you to avoid writing silly bugs in your code, as you will follow a certain flow when coding in small steps.
Mocking makes sense for parts of code that handle the interaction with other parts of the system. You want to sometimes be able to test that those interactions happen as you expect without actually having the other parts executing.
@@WouterSimonsPlus And this, right here, is where the vast majority of the bugs is found. The interactions between subsystems. Tests that mock subsystem calls are worthless at best, and harmful at worst.
When I saw the first example I thought "this can only be tested manually" :) But this particular example isn't bad code imo. It's so trivial that I can understand it at a glance. It does not need to be refactored.
Sorry guy based on the title I can't engage in this. I use facades to mock implementation so juniors can get faster feedback on changes and the details abstracted away aren't their concern.
I strongly disagree with the statement that "TDD is NOT Difficult" at 9:10, TDD requires that the developer knows how to write a good test, how, when and why to use Test Doubles and many other things. To say that it is easy for a developer to know well everything from both xUnit Test Patterns and TDD By Example is nonsense! It is hard and it is a skill that differs great developers from bad ones, to say that it is easy it means that Testing is easy, which I think is an extremely naive point of view on the subject.
I don't agree. Certainly TDD is a skill that you need to learn, but it is not a difficult skill. Part of the mistake that I think makes it seem more difficult is to think of it in terms of testing rather than in terms of design. I always start any test with "What do I want my code to do now?". If I can't answer that question, I not only can't write a test, I also can't write the code. TDD is about design much more than about testing. Sure, in TDD there are 3 types of test, and you need to learn the basics of your xUnit framework, but most people can pick that up in a couple of hours. The difficult part of TDD is that most people are VERY poor at design, and TDD surfaces your poor designs more quickly than anything else. That is why I value it so highly, and I think that is why people find it difficult, but it is not the TDD that is the problem.
The code I'm writing right now would easily take 5 times longer to write if I did it in a TDD way, and the end result in terms of code would be roughly the same, namely adhering to customer requirements. The end result on the customer side would not be the same though. Not only would they probably fire me for taking 5 times longer to write code, the amount of meetings I would need to have with the customer to get the necessary requirements upfront to even begin coding would frustrate the customer to the point where they wouldn't want anything to do with me. So yeah, I'm staying away from TDD.
Instead of waiting til having all requirements defined up front maybe try working in a more agile way, iterating over solutions and showing to your customer as often as you add something to get feedback. Embrace the change. TDD is great tool for this. Following your same argument, I wonder if you ever write tests as it seems you would do it only when you have frozen never-changing requirements. Your customer will change their mind and thus the requirements, this is a constant in software development. What do you do in that situation?
No one says TDD is the right fit in _any_ situation. I like to do it to explore different solutions or uncharted territory (e.g. business logic i haven't already understand 100%). So, it's okay to stay away from TDD, but in my view this would also set a bad example if you were a senior engineer and others would like to learn from you. With your mindset, they would probably not be comfortable and not happy with what you teach them. TDD is a practice that you need to practice, so if you never tried it you actually miss the point of it. Do you actually test your code? When do you write tests? How do you approach testing in general?
@@chiaradiamarcelo In my experience (18 years of IT consultancy, countless number of projects and bits of software created and modified), customers rarely have the time, nor the ability, nor the fortitude, nor the willingness to iterate often and give feedback often. You're lucky if you can iterate three times (proof of concept -> mostly working -> finishing touches), but usually it's only two iterations (not done -> done). Most of the time, the customer can't even express domain rules in a coherent and correct manner without first seeing the PoC. It's not because the customers are bad or poor at their jobs, it's because their domain is almost always already heavily intertwined with several existing integrated 3rd party systems that behave in a certain way, which in turn has formed the domain so that it cannot be treated independently from the systems that are already in place. I mean if your company uses MS Dynamics, your domain is gonna look like Microsofts interpretation of the world (we all know what a mess that can be). And as such, the customer will usually need to see a PoC or "some parts are working" version of what you're working on before they can express their domain concerns in a coherent and correct manner. Meaning, you can't do TDD upfront unless your customer is willing to pay you to study the breadth of their current systems environment. And most customers simply don't have the budget for you to do that, especially if you're a consultant, in which case the project budget more often than not has already been set. As a consequence, I only write tests on critical parts of the code that are non-obvious to junior devs, because I expect a senior dev to understand the code I'm writing even without any tests in place. Why do I expect that? Why, because I myself understand that when I look at other peoples code. 
I can see the potential null pointer reference bugs waiting to happen, I can see the magic hard-coded values that'll need a recompile in the future, and so can you. I never write tests upfront like in TDD, because if the customer cannot express their domain and requirements before the PoC has been presented to them, any TDD you've done up to that point will probably have to be thrown away anyway. And, when the customer has seen the PoC, TDD is not even necessary, because you as a developer have enough information to go from PoC to mostly working within your allotted budget.
@@tobyzieglerrr If TDD is the gold standard, not working with TDD as a senior engineer is indeed setting a bad example. I can't argue with that. I can only argue with whether or not TDD is actually a gold standard. And in my 18 years as an IT consultant, I've only once seen one example where you could easily deploy TDD (radio station conglomerate scheduling software built from scratch in .NET, large budget). Ironically, that one example is also where they (people with twice the experience I had) actually opted to use TDD, and despite the very elaborate and detailed work they did upfront to understand the domain, the resulting system was an absolute mess. OO inheritance everywhere, inversion of control and dependency injection everywhere, literally all properties were subject to null reference exceptions despite the very elaborate test suite, and it was an absolute nightmare to extend with new functionality. So if TDD is a gold standard, why couldn't this team develop this software properly? If TDD guarantees some sort of correctness to a system, why did this system look like a mess? If TDD is so good, why did they give up on the constantly failing test suite and just disable the test suite errors instead of maintaining it properly? You see, when you say "TDD", all I see is those kinds of challenges, and that kind of incurred cost, which I maintain is easily 5x the cost of the system itself. I test my code rigorously, yes. I've always done that. When I was 25, before Google Analytics was a thing, I made a multivariate (A/B) testing suite from scratch for a travel agency. It was deeply integrated into their existing custom-made .NET e-commerce website, randomly directing customers to one of several variants of their website so that they could measure the effectiveness of different layouts and campaigns. $100 million revenue every year went through that website, so not a small-time travel agency. They used that tool for 10 years before they switched it out for GA.
I repeat: 10 years. How long does your code last? As I said in my other message, I only write tests on critical parts of the code that are non-obvious to junior devs, because I expect a senior dev to understand the code I'm writing even without any tests in place.
I'm just going to make the observation now that chairs made by carpenters who eyeballed it decades or more ago can still be sat on and work exactly as they should, but a chair you bought from IKEA, made by engineers who measured everything, is broken after one or two months of use.
FREE TDD TUTORIAL: Learn what it takes to get started and learn the skills of TDD, with a hands-on demonstration by Dave Farley ➡ courses.cd.training/courses/tdd-tutorial
totally unrelated but... as soon as I started this video my wife sighed and said "I want him to read bedtime stories". So if you are looking for a new project channel....
@Lazarlux haha. I’m sure that was the subtext.
@@Alan.livingston Pretty sure she wasn't looking at the screen then 🤣🤣
@@ContinuousDelivery make an audiobook!
I wish this video had been made before I started working as a backend engineer. After a few months, I realized I was spending way more time fixing those very tightly coupled pytest mocking parts than the actual code. The overall design was terrible, not keeping to any SOLID principles, and the test code was mocking third-party libraries and asserting useless features.
I thought unit testing and mocking techniques were rubbish, but as this video shows, it was my code and the way I had been writing the test code that were rubbish. I agree 1000%. We should focus on public interfaces rather than how a method of the SQLAlchemy library is called with some useless arguments.
Finally an episode with real code examples, and they were also in my favorite language, Python! It's the one thing I'm missing most of the time: concrete examples which give context to the currently discussed topic. I really love you Dave, but maybe more practical examples and maybe more code could help you reach a bigger audience. I've been a developer now for over 10 years, and without examples I often find it hard to follow your thoughts, or at least if I can follow them, I lack the understanding of at which level they apply. Keep up the good work!
Apparently, I was in a different mindset when I read the video title. I thought this was going to be about making fun of other people's code (mocking). ... Still, I would be interested to hear your ideas on the pros and cons of making fun of others' code.
Me too
should help people learn how to write better code instead of making fun of their effort...especially if they are members of your team that you will be working with for a while.
Well, it could motivate you to learn all the details of programming to be able to mock it if it's sub-optimal for its use case
xD
"I generally prefer to have a layer of my own code between almost any 3rd party code, and the rest of my system".
This is the heart of something pretty deep and difficult, I think. I think it'd make a great initial assertion for a whole episode.
In my own project we've stopped the use of mocking libraries for a while. We prefer to write tests that focus on the what not the how. If we change the how, without breaking the behaviour, we want the test to still pass.
We still use fakes and stubs for say the DB. I tend to take the view the unit is behaviour and start from outside but then don't really work in much. I will just make the test pass and layers only emerge through Refactoring. I was surprised you had found value in mocking as we've really found the opposite, that mockless tests have increased our confidence in our tests, eliminated manual regression testing etc.
I agree about putting a layer of abstraction around the edges, but for a database specifically I do find it useful to write small tests for those layers, but tests that run against a real database. For example, test the data access layer performs the query you expect it to, and then feel safe in mocking that layer for the rest of the code.
You do also test these from the higher-level integration/acceptance tests, but the feedback is slower and it's harder to do TDD this way.
If you write minimal tests for this layer but use a real database e.g. running in a throw away local container, you can get instant feedback as you type your SQL (especially if you use a testing tool that runs as you type). You do have to be careful to keep tests independent of each other if you use a real database, using random IDs etc.
I have always found mocking connection/command/query objects to be very painful and provide no value, they just mirror the code. When I first started unit testing, this kind of thing made me think unit tests were pointless, when they were actually just awful tests. Always test for desired behaviour.
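As an illustration of that style, here is a minimal data-access test against a real (here, a throwaway in-memory SQLite) database, kept independent of other tests via random IDs. All the names and the schema are invented for illustration; a real setup would point the same test at your actual DB engine in a disposable container.

```python
import sqlite3
import uuid

# A tiny data-access layer: no business logic, just the query.
def insert_book(conn, book_id, title):
    conn.execute("INSERT INTO books (id, title) VALUES (?, ?)", (book_id, title))

def find_title(conn, book_id):
    row = conn.execute("SELECT title FROM books WHERE id = ?", (book_id,)).fetchone()
    return row[0] if row else None

# Throwaway database: created fresh, discarded after the test run.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE books (id TEXT PRIMARY KEY, title TEXT)")

# A random ID keeps this test independent of any other test sharing the DB.
book_id = str(uuid.uuid4())
insert_book(conn, book_id, "Refactoring")
assert find_title(conn, book_id) == "Refactoring"
assert find_title(conn, "missing") is None
```

The point is the shape: the SQL is exercised for real, so you get instant feedback on the query itself, while everything above this layer can safely use a test double.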
The tests running against a real database you describe are already (small) integration tests and I agree with you that writing these tests is good practice.
I use EF and just use the in-memory DB for testing that the DAL does what it says. Actual DB tests are put in a separate DB project with all its migrations
I agree. We don't need to mock the db connection when it comes to writing tests for repository layer, for example.
I like to do my DB tests _in_ the database, using stored procedures. Whenever team dynamics allow me, I will make sure that my DB layer is fully self-sufficient, without requiring external dependencies for things like, for example, invoicing of subscriptions.
I never ever mock the database
I generally agree with everything said in the episode, but it still raises the question: even with every abstraction layer in place, there is still a piece of code that will interact with the 3rd party code; we're just making it as small as possible. What is your advice for testing this code? Do you recommend leaving this to a higher layer of the test pyramid, maybe integration or even e2e?
This! Without this question answered this video is useless.
Yes. 3rd party adapters are good candidates for integration testing. If you update your vendor component, they serve as acceptance tests whether the new delivery still fits your expectations.
Terrific advice. Mocking is certainly a powerful tool, but I had to discover the pitfalls you point out for myself. I once started to mock out the entire Jenkins Python library. Like touching a hot stove, it was a painful lesson in the correct scope for mocking. Keep putting out these great videos!
Rules of thumb for mocking: “Don’t mock what you don’t own” & “Mock roles not objects”
some code example of “Mock roles not objects”?
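A minimal sketch of "mock roles, not objects": the mock stands in for a role (an interface you own), never for a concrete third-party object. All names here are invented for illustration, not taken from the video.

```python
from unittest.mock import Mock

# The "role" is an interface we own: something that can send notifications.
class Notifier:
    def notify(self, message: str) -> None:
        raise NotImplementedError

def register_user(name: str, notifier: Notifier) -> str:
    # Business logic under test; it only knows about the role.
    user_id = f"user-{name.lower()}"
    notifier.notify(f"Welcome, {name}!")
    return user_id

# Test: mock the role, not the SMTP library that a real Notifier might wrap.
mock_notifier = Mock(spec=Notifier)
user_id = register_user("Alice", mock_notifier)
assert user_id == "user-alice"
mock_notifier.notify.assert_called_once_with("Welcome, Alice!")
```

If you later swap the email-based Notifier for an SMS one, this test doesn't change, because it never knew which library sat behind the role.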
Your last point about having a wrapper or abstraction around a third party is really on point. I recently tried to figure out how to mock a third-party thing because I was trying not to have to spin up the database connections it was doing, and it was a nightmare to try and test. When I abstracted it, I could test what I actually wanted to test without worrying about what the third-party code was doing underneath.
I agree with everything you say in this video Dave, apart from one thing (I don't think you are disagreeing with me here - just wanted to add for clarity). As an unfamiliar engineer picking up this feature, the structure of the codebase and established patterns might not be clear to me - I find that often the best place to start is with an acceptance test. This way you can add the easiest test to write straight away, and then as the possible implementation options become clearer, you can use TDD until you pass the acceptance test. (my god, that might have sounded a bit like XP - still, what you're saying makes total sense - it just depends at what level changes need to be made)
I agree. I start with Acceptance Tests too, it is just that this video isn't on that topic, and I don't use tools like Mocking libraries in Acceptance Tests. My Acc Tests run the code as though it is in production, so I use custom stubs, under test control, to fake interaction with external systems; everything else is real.
Very useful info. I normally create an abstraction layer. I will never directly reference 3rd party code without an abstraction. TDD always drives me in this direction. The benefit is that my code relies on the abstraction, so I can change what the abstraction uses without breaking all my code. This is very useful if you want to swap out one 3rd party library for another
You are uploading videos faster than I can watch them, my watch later list is full of your videos
speed up the video then :)
watch at 7.5x
I see you are on the right path to understand unit tests.
One of two main goals of unit tests is to create documentation in code. Good documentation is easy to understand. Mocks are making unit tests hard to understand on top of coupling with implementation code and giving us false negatives when we keep the functionality and change the internal implementation. Usually mocking makes sense only when calling outside the process like calls to database, filesystem or remote resources but never use mocking libraries to write those mocks. The other way to test those calls is by using integration tests.
Unit in "classical" style is unit of functionality. Unit can be small (single method call) or large (many objects collaborating to provide functionality) but it is always ONE functionality. If you want to test more than one functionality in one test then create an integration test.
Congratulations, you wrote a manual mock. I don't disagree with interfaces over mocking, but I want the actual implementation tested too!
Contract testing covers implementations!
Yeah, writing an adapter for the external code is really just a mock, I agree 😅
And yes, those weird Python runtime hacks are usually not needed at all. Just create an interface of the adapter and inject your test behavior. This is quite useful to simulate all kinds of external failures and ensure the calling scope can handle them gracefully 😄
Especially when talking about Python, there are very elegant solutions with typing.Protocol to make use of the flexible type system through duck typing 😂
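A rough sketch of that typing.Protocol approach, where the test double satisfies the interface purely by duck typing. All names are invented for illustration, not from the video.

```python
from typing import Protocol

class BookStore(Protocol):
    """The shape my code depends on; no inheritance required to satisfy it."""
    def save(self, isbn: str, title: str) -> None: ...

class InMemoryStore:
    """Test double: satisfies BookStore structurally, by duck typing."""
    def __init__(self) -> None:
        self.books: dict[str, str] = {}

    def save(self, isbn: str, title: str) -> None:
        self.books[isbn] = title

def add_book(store: BookStore, isbn: str, title: str) -> None:
    # Business rule lives here, not in the adapter.
    if not isbn:
        raise ValueError("ISBN required")
    store.save(isbn, title)

# No mocking library, no runtime patching: just inject the test behavior.
store = InMemoryStore()
add_book(store, "978-0321601919", "Continuous Delivery")
assert store.books["978-0321601919"] == "Continuous Delivery"
```

A production implementation with the same `save` signature (say, one wrapping SQLAlchemy) slots in without the calling code or the tests knowing the difference.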
Congratulations, you missed the point.
Basically abstract away your third parties to bring back what you need, then you can just mock that abstraction instead of trying to bend the third party code to your will
This 60-something programmer has had no formal training on testing. I've learned a few things over the years, etc. I'm really liking that web page re: "Mocks, Fakes, Stubs and Dummies". Excellent to see some of the variations.
Summing up several years of painful experiences in a concise way, nice job. The only thing missing is how to tackle error testing. Especially with databases, it is very difficult to provoke different error states at the acceptance-test level if you're using a real database. Therefore, I keep mocking the database libraries for these kinds of tests (of course, I don't actually test whether the database method was called with specific attributes, as that is not the goal of the test)
It's funny because the carpenter at 3:42 reminded me of years and years ago when my woodwork teacher at school passed me back my little model boat and said "Do you want to be a butcher when you grow up, son?"
But seriously, it's maddening, the long struggle to get our industry to use TDD... Just imagine a mechanical engineer saying "Micrometers? Dial indicators? Rulers? I can't be bothered with any of that... it's a waste of time".
You've got me thinking now, maybe what the BDD people should have done is instead of all this talk of behaviour and specs is to rename tests as "measurements" (or something cleverer than that but with the same meaning)?
Concrete code examples would be of great help for such discussions.
I'm a huge fan of TDD. I've used Mocks so, so much, and it's always a huge pain in the ass, as well as being very brittle. Will study your "put in a facade" technique, everyone wins!
I think I get the part where you create a facade as a thin layer at the boundary of your code.
But I hope you mean we have to mock the facade instead of the 3rd pty lib. If we don't mock anything, then I/O will be done and tests will be slow.
Yes, you mock the facade.
I mock 3rd-party code all the time: "Who wrote this crap!? monkeys!?" :)
I have resorted to mocking third party code in comments now: "yes, of course this API returns the NNN in reverse order on ."
Nice video, thanks! I also tend to create code at the boundaries of my systems to interact with external services, and there's a design pattern for that: Humble Object. It's very helpful for that sort of situation.
Totally unrelated to the actual content of the video, but I really appreciated that you put the traditional woodworking face marks into your animation. Nice touch!
Took me a while to do that animation, but I wanted to add it 😉😎
I'm not quite sure I understand. In the second example there is no longer any DB code, so yes, the test is simpler, but is that not because the implementation has also been simplified? Is the idea to put 3rd party code behind abstraction layers and to then inject simpler implementations at runtime?
I agree with this observation. Very confusing and poorly explained. So the "good" example was using an in-memory Store object for the book. How is this related to the DB? Seems like it was testing the mock using Store.
The generic pattern for database interactions used here is the repository pattern.
It’s quite common to have a piece of code that takes your semantic inputs (like add book) and then goes to just do the SQL part. There must not be any business logic in that object (like validating the ISBN).
Those repository implementations do not get covered by unit tests, but instead are covered with some form of integration tests. Checking that you can actually store in the database happens only after you know that the core logic of your system works.
This simplifies the test and allows for a more direct testing of the add_book functionality. So yes, the idea is to put 3rd party code behind abstraction layers, as you mentioned, and in this case, to use a simpler implementation (in the form of a Mock object) during testing to avoid having to rely on external dependencies.
@@dinov5347 The "good" example was indeed using an in-memory Store object for storing book data, but the point of the example was to demonstrate how TDD can simplify unit testing by decoupling the database implementation from the test logic, in other words it helps you identify and avoid issues with mocking 3rd party code. The idea is to put 3rd party code behind abstraction layers and inject simpler implementations at runtime. In this case, the Store object serves as a simpler implementation of the database connection, which allows the developer to focus on testing the logic of adding a book without having to worry about the complexities of SQL.
The Store object is not related to the database in any way; it's simply a data structure that the developer uses to test the add_book() method. The Mock library in Python allows the developer to avoid unnecessary implementation details of things like SQL, and the Store class can handle the data within it.
@@antdok9573 I have never coded in Python, but in Java you should not use an in-memory database, but instead simply mock the database response using Mockito. Using in-memory databases in JUnit tests causes unpredictable results in case running complex SQL queries (e.g. dependent on DB2 SQL). It probably works locally but only works intermittently during Jenkins builds. It could break on Jenkins after two months or so due to memory exhaustion or deadlock of some sorts. Seems like he is using a facade, which is only mocking the DB-response. As you wrote he is only interested in the interactions. He wants to separate essential complexities/business logic from the accidental complexities/database interactions (which he does not care about). Thereby making his code more robust and easier to change.
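For anyone still confused by this thread, here is a hedged reconstruction of the shape being discussed. The names and details are guesses for illustration, not the video's actual code: the in-memory Store stands in for persistence in unit tests, while a SQL-backed class with the same interface would be covered by integration tests.

```python
class Store:
    """In-memory stand-in for persistence. A hypothetical SqlStore with the
    same interface would talk to the real database and be integration-tested."""
    def __init__(self, book_list=None):
        self.book_list = book_list if book_list is not None else []

    def add(self, book):
        self.book_list.append(book)

def add_book(store, title, isbn):
    # The business rule under test lives here, away from any SQL.
    if len(isbn) != 13:
        raise ValueError("invalid ISBN")
    store.add({"title": title, "isbn": isbn})

# Unit test: no database, no mocking library, just the injected fake.
store = Store()
add_book(store, "Modern Software Engineering", "9780137314911")
assert store.book_list == [
    {"title": "Modern Software Engineering", "isbn": "9780137314911"}
]
```

So the Store isn't "related to the DB" at all in the unit test; that is the point. The DB only shows up behind the same interface, in a slower test elsewhere.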
I mostly work on legacy PHP projects and I'm mocking extensively. Unfortunately, I don't have much choice: if I don't introduce unit testing, it's extremely hard to realize where I broke something, and even harder to realize I removed a feature that is paramount.
I really don't see a better approach; everyone else talks of ditching the ball of mud, but that ball of mud is generating the dough that backs my payroll 🙃
I would love to use TDD, but sometimes getting specifications from stakeholders is the hardest part of the job.
Isn't that a matter of asking them a bunch of questions until they can say what they want
I find trying to understand the problem your code is supposed to solve is easier than getting requirements. Methods like Domain Driven Design are for that. I can write the specifications myself in BDD format and get feedback from a running system from end users myself 😉
@@skyhappy My problem is the strange edge cases that they think of after you have what should be final code, and then claim are important and cannot ship without. It is a case where I don't know that xyz property creates an invalid combination with abc, just because of the manual process that was done before; it is something small to them but has a cascading effect on code downstream. These things are often overlooked because they don't do that process very often and they forget until someone else reminds them during end-to-end testing.
Excellent video points. I prefer to only test dependencies behavior in isolation. And only against each version. Behavior across internal code: wrapping or instrument injection is important, but also important is finding ways to test behavior without altering the systems configuration primitives from production. And in very short or small connections in the systems.
I agree so much with this advice but struggle so much to get my fellow colleagues to let me do my job this way ...
Isn't the improved example still mocking third-party code, since it mocks the add method of the list object? This would break if the implementation changed to use the extend method instead, for example, so why not just do an assertEqual on the contents of the list?
What about the SQL query? Usually there's a lot of business logic hiding inside. Sometimes some of this logic is refactored or moved from the query into the code, and vice versa, for optimisation purposes. Therefore I prefer to write tests that use a real DB to test the code + query logic
That’s called an integration test
And again, so true... Thanks again for a very helpful video. Will show that to my Frontend colleagues who mock the axios- httpClient in all their methods...
I have a FREE TDD TUTORIAL with a hands-on demonstration where you can learn what it takes to get started, and how to learn TDD skills and use TDD to improve the design of your code ➡ courses.cd.training/courses/tdd-tutorial
So, you created a store which gets book_list as a constructor argument. But where do you do the database persistence?
You implement a version of your abstraction for storing and retrieving things that talks to a DB. You test this as an integration test, but not as a unit test. You end up with testable code, and a better design this way.
I shared this video on my teams group. I'm not welcome anymore on certain channels 😂😂
This video reminds me of my day-to-day work... Too many mocks and patches for nothing.
Thin layers of abstraction simplify everything.
Yes, it does 😁😎
What are your thoughts on mocking certain operating system constructs, like file I/O? I often use an in-memory file system fake to avoid needing to spill unit testing into the host operating system. Would that qualify as a "3rd party API"?
My approach is not to mock the 3rd party API at all, but instead to write my own (always simpler) context-specific abstraction and mock that instead. It is usually much easier, and my tests are much less coupled to implementation detail.
So instead of mocking things like
file.open
file.write
file.close
I'd create some code that did...
storage.save
and mock that instead. It's not my job to test that the operating system knows how to open files, I am only interested in my use of those features, so I test to an abstracted outcome in the scope of TDD, and I will have a simple smoke-test that verifies that my storage is configured correctly and so works in a general sense.
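As a sketch of that shape: only `storage.save` comes from the comment above; every other name here is invented for illustration.

```python
from unittest.mock import Mock

class Storage:
    """My own context-specific abstraction over file I/O."""
    def save(self, name: str, data: str) -> None:
        # The real implementation touches the filesystem and is covered
        # by a simple smoke test, not by unit tests.
        with open(name, "w") as f:
            f.write(data)

def export_report(storage: Storage, lines: list[str]) -> None:
    # Code under test knows only the abstraction, not file.open/write/close.
    storage.save("report.txt", "\n".join(lines))

# Unit test: mock my abstraction, not the operating system.
storage = Mock(spec=Storage)
export_report(storage, ["line 1", "line 2"])
storage.save.assert_called_once_with("report.txt", "line 1\nline 2")
```

The test asserts the abstracted outcome ("a report was saved with this content"), so it survives any change in how the bytes actually reach disk.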
If you want to fix @3:00. Fix typo "develepor" -> "developer"
The only person who doesn't make mistakes is the one who doesn't try anything!
Love the channel, great job.
No they are called develepors because they cost the company so much it becomes poor.
Unfortunately once the video is up, we can't change it without losing the views and interactions or confusing people with "why are there 2 versions of this video". I wish YouTube supported this kind of edit, but it doesn't I am afraid - Sorry for the typo!
I really don't get this video. So don't mock HTTP responses? If so, do you just not test the small unit of code that receives the call, and you test everything else?
With databases you can use the rollback feature, so you can actually insert dummy records at the beginning of the transaction. Is that good practice?
No, I don't think it is. I think it is a bit too complex and too intrusive, and needs back-doors into the system being tested. My preference is to isolate tests from one another using what I call "functional isolation" - use the natural boundaries of the system to isolate state from other tests. e.g. Each test creates a new account and a new product that only it can use.
I have a 3rd party service that I need to mock, so I watched this video to refresh my knowledge .. the thing is, I don't feel he answered me! He talked about many things in a very general way.
Is it only me? Should I watch it again with my cup of coffee?!
He is trying to sell you his snake oil courses on questionable programming practices. My advice would be to avoid him and the others like him.
How to deal with external APIs without any sandbox, for instance?
Different problem, same solution. Use "Ports & Adaptors" to fake the external API for unit testing and mock it. Use test controlled external stubs in acceptance tests, talk to your code through the prod version of its interfaces - it sends real messages, packets, api calls. Then use contract testing to check that you can talk to a real version of the 3rd party API.
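A minimal sketch of that Ports & Adapters shape for an external API with no sandbox. All names are invented; the real vendor SDK would sit behind a separate adapter checked by contract tests.

```python
class PaymentPort:
    """The port: the only shape the rest of the system knows about."""
    def charge(self, amount_cents: int) -> bool:
        raise NotImplementedError

class FakePaymentAdapter(PaymentPort):
    """Test-controlled fake. A hypothetical RealPaymentAdapter wrapping the
    vendor SDK would implement the same port and be contract-tested."""
    def __init__(self, succeed: bool = True):
        self.succeed = succeed
        self.charges: list[int] = []  # record interactions for assertions

    def charge(self, amount_cents: int) -> bool:
        self.charges.append(amount_cents)
        return self.succeed

def checkout(port: PaymentPort, amount_cents: int) -> str:
    # Business logic under test; it never imports the vendor SDK.
    return "paid" if port.charge(amount_cents) else "declined"

fake = FakePaymentAdapter()
assert checkout(fake, 1999) == "paid"
assert fake.charges == [1999]
# The fake also lets you simulate failures the sandbox-less API never would:
assert checkout(FakePaymentAdapter(succeed=False), 500) == "declined"
```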
Personally I don't use mocks in my tests; I want to know how my system behaves in the most realistic way possible.
then they're not unit tests, are they? More like integration tests
Does it make sense that I follow this channel? I am not even a junior. I just had a 3-month internship as a Java Developer. I mean, I don't understand a lot of the things discussed in these videos, but I still find it interesting.
Thanks for following! I hope you stick with it
Definitely the time to learn this. Because a lot of senior engineers are being too 'smarty' and don't like to do TDD (or train, even if they know about it), and the problem is owned by the folks that have to support the legacy code afterward. Which is not a pleasant thing and is challenging to fix. So learning it sooner is definitely better
It totally does make sense. I think it can offer amazing and confusing perspectives when you've only just scratched the surface and are starting to take baby steps into the professional world. I assume you most likely will see the ugliest side of code possible as an intern.
I speak for myself as a relatively new guy too.
I would love to see a video where the code was 1-to-1.
My hope is that now that you've mentioned mocks, you're getting closer to interviewing its co-inventor, Tim Mackinnon. 😂
so... wrap it up then you can mock it?
4:20 that's a pretty unfair metaphor for the London vs. Chicago school of TDD argument.
Wait I don't think I get your comment. The analogy is about not testing versus testing. Could you explain how you saw it as 1 type of testing vs another?
It is not a metaphor. He has a video where he goes over the difference between the two, so he is aware of the actual differences.
Is there a name for this style of design? Using an interface layer for libraries
Architecturally it is sometimes called "Ports & Adapters" and is also used in "Hexagonal Architectures".
Also known as "dead code" and "worthless tests".
That's great insight! I must tell you that thanks to your videos I was able to convince some people (including me) that TDD is not that bad after all :)
So do I understand correctly that you are not testing the abstraction layers themselves?
14:07
How do you write tests for the thin abstraction layer that wraps the 3rd party code?
Those are covered in the integration testing phase, that’s also why they should never contain any business logic (in this example validating the ISBN)
@@WouterSimonsPlus Good luck with avoiding "business logic" (also known as "what the program does") when designing your relational database and the queries for it.
@@youtubeenjoyer1743 thanks
Came for the coding lesson. Stayed for the t-shirt.
I mean yeah, how would you feel if others were to mock your code? Be nice!
There's an unofficial rule: you don't mock what you don't own. I totally agree.
In other words, Ports & Adapters
Your code is in too small a typeface. I cannot read it on a mobile device. Hint: use full screen and a large font for code examples. There is a reason why every YouTube coding video does that.
In full-screen YouTube mode you can zoom in with two fingers
At the very start mocking usually works pretty well.
I can literally see and hear you in front of your screen while you were coming up with the bad code: "why exactly did I think making this video was a good idea" :D
That "bad code" example is much better than what he prescribed. It's also too simplistic. Real programs have logging, tracing, metrics, database transaction handling, error handling, and more of the so-called (by the snake-oil salesmen) "cross-cutting concerns" - also known as just code. If you apply his technique, most of the real complexity is swept under the rug, while your "unit tests" would test a few mocked function calls and call it a day. The example test is worthless, and the whole program should be tested by integration tests. The complex logic, if any, can be extracted into IO-free procedures, testable with unit tests. But his example does not have any.
@youtubeenjoyer1743 I think we have a very different definition of what each type of test is supposed to do. Also, which type of applications we build, how to build them and how to evolve them over time.
Your “bad” design looks exactly like a production api I have to work on…
Code that I wrote to have layers of abstraction and proper testing (much like the good example) was refactored to reduce the layers to make it easier, and people said they were small serverless functions anyway that did not need all the complexity I was putting in. Who knows, maybe times are changing…
It's clear from the title of the video that we should refrain from mocking third-party code. This means that during unit testing, we should only focus on the output of functions like print() in Python, rather than testing their implementation. The input() function provides a good example of why mocking is important. Without mocking, every test would have to wait for a human user to input text and press enter, which would significantly slow down the testing process.
So, just test the code you actually wrote.
16:34 Dave that's because you have a lot of experience, not everyone is a Dave Farley, not everyone had the opportunity or time to practice TDD at work. So someone not like yourself can totally write rubbish code after writing the tests, that's because they have written rubbish tests, due to the lack of experience.
Agreed, design is extremely difficult and not everybody is a Dave Farley. But using TDD will help you think about your design before you write the production code. It will also help you avoid writing silly bugs in your code, as you will follow a certain flow when coding in small steps.
My idea: don't mock at all if you can do that. If you can't, mock it until you have refactored to code that allows you to remove the mock.
Mocking makes sense for parts of code that handle the interaction with other parts of the system. You want to sometimes be able to test that those interactions happen as you expect without actually having the other parts executing.
@@WouterSimonsPlus And this, right here, is where the vast majority of the bugs is found. The interactions between subsystems. Tests that mock subsystem calls are worthless at best, and harmful at worst.
Overmocking is what I see too often.
When I saw the first example I thought "this can only be tested manually" :)
But this particular example isn't bad code imo. It's so trivial that I can understand it at a glance. It does not need to be refactored.
Sorry guy based on the title I can't engage in this. I use facades to mock implementation so juniors can get faster feedback on changes and the details abstracted away aren't their concern.
❤ the pinky & the brain shirt & video
I know right??? i need!
Ha, yes, maybe my problem is not that testing is inherently difficult, but that my code is not as testable as possible.
I strongly disagree with the statement that "TDD is NOT Difficult" at 9:10. TDD requires that the developer knows how to write a good test; how, when and why to use Test Doubles; and many other things. To say that it is easy for a developer to know everything from both xUnit Test Patterns and TDD By Example well is nonsense! It is hard, and it is a skill that separates great developers from bad ones. To say that it is easy is to say that testing is easy, which I think is an extremely naive point of view on the subject.
I don't agree. Certainly TDD is a skill that you need to learn, but it is not a difficult skill. Part of what makes it seem more difficult, I think, is thinking of it in terms of testing rather than in terms of design. I always start any test with "What do I want my code to do now?". If I can't answer that question, I not only can't write a test, I also can't write the code. TDD is about design much more than about testing. Sure, in TDD there are 3 types of test, and you need to learn the basics of your xUnit framework, but most people can pick that up in a couple of hours. The difficult part of TDD is that most people are VERY poor at design, and TDD surfaces your poor designs more quickly than anything else. That is why I value it so highly, and I think that is why people find it difficult, but it is not the TDD that is the problem.
Don't mock 3rd-party code, but mock 3rd-party services
IMO you should not use mocks unless you have to. If you are testing against mocks, you make your tests less valuable, because they test less.
The code I'm writing right now would easily take 5 times longer to write if I did it in a TDD way, and the end result in terms of code would be roughly the same, namely adhering to customer requirements. The end result on the customer side would not be the same though. Not only would they probably fire me for taking 5 times longer to write code, the amount of meetings I would need to have with the customer to get the necessary requirements upfront to even begin coding would frustrate the customer to the point where they wouldn't want anything to do with me. So yeah, I'm staying away from TDD.
Instead of waiting til having all requirements defined up front maybe try working in a more agile way, iterating over solutions and showing to your customer as often as you add something to get feedback. Embrace the change. TDD is great tool for this.
Following your same argument, I wonder if you ever write tests as it seems you would do it only when you have frozen never-changing requirements. Your customer will change their mind and thus the requirements, this is a constant in software development. What do you do in that situation?
If the problem is that you're slow at TDD, I suggest practicing TDD to get better and faster at it.
No one says TDD is the right fit in _any_ situation. I like to use it to explore different solutions or uncharted territory (e.g. business logic I haven't already understood 100%). So it's okay to stay away from TDD, but in my view this would also set a bad example if you were a senior engineer and others wanted to learn from you. With your mindset, they would probably not be comfortable and not happy with what you teach them. TDD is a practice that you need to practice, so if you have never tried it you actually miss the point of it. Do you actually test your code? When do you write tests? How do you approach testing in general?
@@chiaradiamarcelo In my experience (18 years of IT consultancy, countless projects and bits of software created and modified), customers rarely have the time, the ability, the fortitude, or the willingness to iterate often and give feedback often. You're lucky if you can iterate three times (proof of concept -> mostly working -> finishing touches), but usually it's only two iterations (not done -> done). Most of the time, the customer can't even express domain rules in a coherent and correct manner without first seeing the PoC. It's not because the customers are bad or poor at their jobs; it's because their domain is almost always heavily intertwined with several existing integrated 3rd-party systems that behave in a certain way, which in turn have shaped the domain so that it cannot be treated independently from the systems already in place. I mean, if your company uses MS Dynamics, your domain is going to look like Microsoft's interpretation of the world (we all know what a mess that can be). As such, the customer will usually need to see a PoC or a "some parts are working" version of what you're building before they can express their domain concerns coherently and correctly. Meaning, you can't do TDD up front unless your customer is willing to pay you to study the breadth of their current systems environment, and most customers simply don't have the budget for that, especially if you're a consultant, in which case the project budget more often than not has already been set.
As a consequence, I only write tests on critical parts of the code that are non-obvious to junior devs, because I expect a senior dev to understand the code I'm writing even without any tests in place. Why do I expect that? Because I myself understand that when I look at other people's code: I can see the potential null-pointer bugs waiting to happen, I can see the magic hard-coded values that'll need a recompile in the future, and so can you. I never write tests up front like in TDD, because if the customer cannot express their domain and requirements before the PoC has been presented to them, any TDD you've done up to that point will probably have to be thrown away anyway. And when the customer has seen the PoC, TDD is not even necessary, because you as a developer have enough information to go from PoC to mostly working within your allotted budget.
@@tobyzieglerrr If TDD is the gold standard, not working with TDD as a senior engineer is indeed setting a bad example. I can't argue with that; I can only argue with whether TDD actually is a gold standard. In my 18 years as an IT consultant, I've only once seen an example where you could easily deploy TDD (radio-station conglomerate scheduling software built from scratch in .NET, large budget). Ironically, that one example is also where they (people with twice the experience I had) actually opted to use TDD, and despite the very elaborate and detailed work they did up front to understand the domain, the resulting system was an absolute mess: OO inheritance everywhere, inversion of control and dependency injection everywhere, literally all properties subject to null-reference exceptions despite the very elaborate test suite, and an absolute nightmare to extend with new functionality. So if TDD is a gold standard, why couldn't this team develop this software properly? If TDD guarantees some sort of correctness, why did this system look like a mess? If TDD is so good, why did they give up on the constantly failing test suite and just disable its errors instead of maintaining it properly? You see, when you say "TDD", all I see is those kinds of challenges, and that kind of incurred cost, which I maintain is easily 5x the cost of the system itself.
I test my code rigorously, yes. I've always done that. When I was 25, before Google Analytics was a thing, I made a multivariate (A/B) testing suite from scratch for a travel agency. It was deeply integrated into their existing custom-made .NET e-commerce website, randomly directing customers to one of several variants of the site so they could measure the effectiveness of different layouts and campaigns. $100 million of revenue went through that website every year, so not a small-time travel agency. They used that tool for 10 years before they switched to GA. I repeat: 10 years. How long does your code last? As I said in my other message, I only write tests on critical parts of the code that are non-obvious to junior devs, because I expect a senior dev to understand the code I'm writing even without any tests in place.
'Develepor'?
He didn't integrate spell checking into his video production pipeline.
@@Tymon0000 isn't it a GitHub action, by now? : )
I'm just going to make the observation that chairs made decades ago by carpenters who eyeballed everything can still be sat on and work exactly as they should, but a chair you bought from IKEA, made by engineers who measured everything, is broken after one or two months of use.
Off-topic, but you can return IKEA furniture within the first 12 months. Provide feedback to IKEA by returning broken designs.
Can't mock it? Can we at least tease it? Make yo' mamma jokes?
I don't get it - did I just spend 20 minutes listening to a very convoluted explanation of encapsulation, padded with unnecessary information?