Unless you think of code in terms of structs and memory with strong static guarantees, this has limited usefulness to you. You'll think Casey is arguing for hacking in code by breaking the type schema because it's faster or something, which isn't even possible.
0 information in your comment.
@@ViolentFury1 maybe you mapped it incorrectly? EDIT: cool nickname btw.
The type schema is like a religion, and it slows things down a lot, from developer speed to execution speed.
I still do type erasure (the good old cast to void*).
@@llothar68 You actually think the type schema is slowing down execution speed... that's kind of morbid.
How would that even be possible?
@@nexovec Because the "cleaner" your code is, the sooner you have to invent intermediate types just to make all the other types fit together for the task. That means more abstraction layers, more function calls, more memory allocations, more L1 and L2 cache misses, more branch-prediction misses. And it costs programmer productivity too.
With heavy abstraction, one of the hardest things is understanding what the code actually does. Debugging is hell when you can't see what is being called, even when your little artificial, restricted unit tests run fine. We have a problem of too much abstraction for too little gain. I recommend John Ousterhout's talk here.