Once it was part of my job to work alongside experts in Quality Assurance. Actually, many of my colleagues weren't experts, just go-getters willing to give QA their best shot. The experts weren't always the best at this work, either, because being really good at QA is partly a matter of temperament. Watching the QA team operate, I came to think that when it comes to checking a piece of work, there are two kinds of people: those who are trying to make sure the work is correct, and those who are trying to prove that it isn't. You want the second type of person on your QA team. A checker should revel in finding errors, not aim to show that there aren't any.
If there are two mentalities, one corresponding to the jaded TSA employee and the other to the lovingly patient KGB interrogator, then I'm interrogatory by nature. But I do catch myself thinking the wrong way sometimes, such as when I've finished putting away the pieces of a board game. 'Wouldn't it be nice now,' says my brain, 'to put on the box top and take the game back to the shelf?' Yes, it would be nice, but there is still another piece on the floor. Trust me, there is. One of my personal folk theorems is, 'There's always one more.' Need a paper clip or a rubber band? You're in luck, because there's one more in that drawer. Look long enough and you will find it. Nobody in history, to my knowledge, has ever run out of paper clips.
Another doctrine of mine is The Fundamental Theorem of Travel Delays, which I deduced circa 2010. The theorem says that the number of travel delays is never equal to 1. Corollary: if Delta Air Lines, the MTA, or Amtrak announces a delay, start researching other plans, because they are going to announce another delay; any other outcome would violate the theorem. Here is another example of the theorem in action: if you are entering a New York subway station, and the person ahead of you swipes their MetroCard and gets an error message with a beep, then for the love of God, get out from behind them. Where there is one beep, there will be another.
Even though I have the temperament for it, QA probably wouldn't be a good profession for me. I would spend too much time checking a piece of work when there were other pieces of work to be checked. There is such a thing as being overly perfectionist. Ignatius Reilly, the main character in the novel A Confederacy of Dunces, had a job pasting due-date slips into library books. "On some days," he said to his mother, "I could only paste in three or four slips and at the same time feel satisfied with the quality of my work."
A paradox that made a strong impression on me as a child was the paradox of Caesar's dying breath. Every breath you take, so the saying goes, probably includes at least one air molecule from Caesar's dying breath. Amazing, isn't it? Although the probability is minuscule that any randomly selected air molecule boasts such a pedigree, nevertheless, a breath of air contains so many molecules that the chances of entirely avoiding the imperial ones are low. It's like playing Russian roulette with a gun that is nearly empty but pulling the trigger eighty sextillion times. There's no future in it. Similarly, in the design of population studies, sometimes you don't need your sample to be a large percentage of the population if the sample is large in absolute terms. Intuitions like these inform my neurotic approach to copy editing: while any given word is nearly certain to be correct, in a long enough run of words there must be errors.
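The arithmetic behind the paradox can be sketched in a few lines. The figures below are rough, commonly cited estimates, not measurements: about 2.2 × 10²² molecules in a breath and about 10⁴⁴ in the whole atmosphere, with Caesar's molecules assumed to be mixed uniformly. One numerical wrinkle makes this a fitting QA example: the per-molecule probability is so small that `1 - p` rounds to exactly 1.0 in floating point, so the naive formula silently reports zero risk unless you work in log space.

```python
import math

# Rough, commonly cited figures (assumptions for illustration):
BREATH = 2.2e22       # molecules in one breath (and in Caesar's last breath)
ATMOSPHERE = 1.0e44   # molecules in the whole atmosphere

def p_at_least_one_caesar_molecule(breath=BREATH, atmosphere=ATMOSPHERE):
    """Chance that a breath contains at least one molecule of Caesar's
    dying breath, treating each of `breath` molecules as an independent
    draw with per-molecule probability breath/atmosphere."""
    p_single = breath / atmosphere  # ~2e-22: far below float epsilon
    # P(none) = (1 - p_single)**breath, computed in log space because
    # 1 - p_single rounds to exactly 1.0 in double precision.
    log_p_none = breath * math.log1p(-p_single)
    return -math.expm1(log_p_none)  # 1 - P(none)

print(p_at_least_one_caesar_molecule())  # near-certainty despite tiny p_single
```

With these assumed figures the expected count of imperial molecules per breath is about 4.8, so the probability of catching at least one comes out around 0.99: a minuscule per-trial probability, overwhelmed by sheer repetition.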
Addiction therapists describe gamblers who think that if a game offers a one-in-four chance of winning, then a player is certain to win by playing the game four times. I once created a much harder puzzle to test the solver's sense of such things: if the probability of winning a game is the same as the probability of losing the game a million times in a row, is the probability of winning less than, equal to, or greater than one in a million? I find this puzzle challenging! But the belief about the one-in-four game is so wrong that I cannot believe anybody believes it. Somehow those therapists must be tricking people into giving the wrong answer. That said, if you can easily be tricked into giving an answer that on second thought you realize is wrong, then the real problem is not your intuition about probability; it's your neglect of the habit of giving things a second thought. To build that habit, it is necessary to err frequently.
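Both claims in this paragraph can be checked numerically; the sketch below is one way to do it, not the only one. The first computation shows what the gambler actually gets from four plays of a one-in-four game. The second attacks the puzzle by bisection: the equation p = (1 − p)^1,000,000 has exactly one root in [0, 1], because the left side increases in p while the right side decreases.

```python
# The gambler's fallacy, checked directly: four plays of a one-in-four
# game win at least once with probability 1 - (3/4)**4, not probability 1.
p_win_at_least_once = 1 - (1 - 0.25) ** 4  # = 0.68359375, about 68%

# The puzzle: find p with p = (1 - p)**M, i.e. winning once is exactly
# as likely as losing M games in a row.
M = 1_000_000

def solve_puzzle(m=M, iters=200):
    """Bisection on f(p) = (1-p)**m - p, strictly decreasing on [0, 1],
    so it crosses zero exactly once."""
    lo, hi = 0.0, 1.0
    for _ in range(iters):
        mid = (lo + hi) / 2
        if (1 - mid) ** m > mid:
            lo = mid  # (1-p)**m still exceeds p: root is to the right
        else:
            hi = mid
    return (lo + hi) / 2

p = solve_puzzle()
print(p)
```

The root lands near 1.1 × 10⁻⁵, roughly ten times larger than one in a million. Intuitively, if p really were one in a million, then (1 − p)^1,000,000 ≈ e⁻¹ ≈ 0.37, vastly bigger than p, so p must rise well above one in a million before the two sides can meet.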
Socrates was unsure of everything save his own power to sniff out error. To detect error, it helps to believe in it. 'Out of the crooked timber of humanity no straight thing was ever made.' Certainty is a state of mind normally denied us, but if there is one thing we can be sure of it's mistakes. Between us, I despair of proofreading this page.