You know, I think I'm my own worst enemy.
I'm obsessed with quality. With exactness. With precision.
I'm the kind of guy who sees things as black and white. Either it is true, or it is not true.
So, naturally, when I design computer systems, I 'fail safe', 'fail fast', 'authenticate', 'authorize', 'verify', 'validate', 'rollback', 'serialize', etc.
All of those things are *good*. And all of those things cause failures. That is, I deliberately try to make my own system fail.
We know that if the data is not valid, then it's not valid, and that's that. Well, I do, because that's the kind of guy I am. If you aren't like me, well, I've seen your systems, buddy. They SUCK!
Still, it's a shame to watch a serialized transaction spit 1,000 records into the DB and then fail on the last one because of a bogus field. On the other hand, if the system I was replacing had been robust to begin with, I wouldn't need to do all this data cleansing in the first place.
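One way around that particular heartbreak is to fail even *faster*: validate every record before the transaction ever opens, so the bogus field on record 1,000 gets rejected before record 1 is inserted. Here's a minimal sketch using SQLite; the `people` table and the "non-empty name" check are invented for illustration, not anything from a real system.

```python
import sqlite3

def insert_all_or_nothing(conn, records):
    # Validate every record *before* touching the database, so a bogus
    # field on the last record fails fast instead of forcing a rollback
    # of 999 perfectly good inserts.
    for i, rec in enumerate(records):
        name = rec.get("name")
        if not isinstance(name, str) or not name:
            raise ValueError(f"record {i} has a bogus 'name' field")
    # One transaction: 'with conn' commits on success, rolls back on error.
    with conn:
        conn.executemany(
            "INSERT INTO people (name) VALUES (?)",
            [(r["name"],) for r in records],
        )

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE people (name TEXT NOT NULL)")

good = [{"name": "Ada"}, {"name": "Grace"}]
insert_all_or_nothing(conn, good)

bad = good + [{"name": ""}]  # bogus field, last record
try:
    insert_all_or_nothing(conn, bad)
except ValueError as e:
    print("rejected before any insert:", e)

count = conn.execute("SELECT COUNT(*) FROM people").fetchone()[0]
print(count)  # only the good batch made it into the table
```

The pre-check costs one extra pass over the data, which is a bargain next to watching the DB unwind a thousand rows.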
I do think it's funny though, how I'm always writing code to make sure I fail.
On the bright side, I'm still convinced that failure is the winning strategy!