Very amusing watching a talk about testing while battling the clicker's functionality :D
Where can I find the "uncover" program?
10:54 - Tip #4 Write exhaustive tests
Mostly very good advice on how to write good tests, and a great talk well worth watching.
I disagree on one point, though. Putting a lot of logic into test cases is a bad idea: it makes them harder to understand and more likely to be buggy themselves. Instead, test code should be as trivial as possible (using tables for test cases is fine).
This also means that it is totally fine to have redundant code in your test cases.
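(Not from the talk, just to illustrate the point above: a minimal sketch of a trivial, table-driven test, assuming a hypothetical binarySearch(haystack []int, needle int) int that returns the index of needle or -1.)

```go
package search

import "testing"

func TestBinarySearch(t *testing.T) {
	tests := []struct {
		name     string
		haystack []int
		needle   int
		want     int
	}{
		{"empty slice", nil, 3, -1},
		{"found at start", []int{1, 3, 5}, 1, 0},
		{"found at end", []int{1, 3, 5}, 5, 2},
		{"not present", []int{1, 3, 5}, 4, -1},
	}
	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			// Each case is a plain comparison; no logic beyond reading the table.
			// binarySearch is a hypothetical function under test.
			if got := binarySearch(tt.haystack, tt.needle); got != tt.want {
				t.Errorf("binarySearch(%v, %d) = %d, want %d", tt.haystack, tt.needle, got, tt.want)
			}
		})
	}
}
```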
Depends on the type of test and the number of test cases.
In the case of an integration test: if you're testing a scripting language with hundreds of test cases, a table-driven test would be hard to maintain. There it is preferable to define each test case as an input file paired with an expected output file.
A table-driven test is fine for unit tests, though: it's perfect for the binary search function example.
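(Again just a rough sketch of the file-based approach described above, not anything from the talk: it assumes a hypothetical Run(src string) (string, error) that interprets a script, and a testdata directory where each *.input file has a matching *.golden file holding the expected output.)

```go
package interp

import (
	"os"
	"path/filepath"
	"strings"
	"testing"
)

func TestScripts(t *testing.T) {
	// Discover every input script; each one becomes its own subtest.
	inputs, err := filepath.Glob("testdata/*.input")
	if err != nil {
		t.Fatal(err)
	}
	for _, in := range inputs {
		t.Run(filepath.Base(in), func(t *testing.T) {
			src, err := os.ReadFile(in)
			if err != nil {
				t.Fatal(err)
			}
			// Expected output lives next to the input with a .golden suffix.
			golden := strings.TrimSuffix(in, ".input") + ".golden"
			want, err := os.ReadFile(golden)
			if err != nil {
				t.Fatal(err)
			}
			got, err := Run(string(src)) // Run is the hypothetical interpreter entry point.
			if err != nil {
				t.Fatal(err)
			}
			if got != string(want) {
				t.Errorf("%s: got %q, want %q", in, got, want)
			}
		})
	}
}
```

Adding a new test case is then just adding two files, with no test code changing at all.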
I would like to uncover uncover
Russ is clearly way more experienced than I'll ever be, so I'm probably wrong. Still, I don't like obscure logic in tests, since the logic of the test might just as easily be wrong itself.
I also don't like the idea of custom mini languages and tests that 'correct' tests. It just seems like there's so much room to introduce problems into your tests.
I want my tests to be easy to understand at a glance, for me and for the other people on the team. If I have to figure out some obscure custom mini language to understand whether my code is bad or the test is bad, I'm not happy.
Just write tests for your tests, EZ