
“The only downside is that you break the link between the number you are using to keep track of your counting with the number of things you are counting.”
[Humble Pi: A Comedy of Maths Errors, Matt Parker]
Ever since Adam was a boy, counting has been recognized as one of the most important skills for humans to survive in the world. So I believe that you (even if you are a toddler) can count numbers well. Let’s check it out. How many numbers can you count on your fingers? The answer is eleven (not ten), because you can also count zero with all fingers folded (the even better answer is 1024; search for “finger binary” on Google). Next, how many natural numbers are there from 10 to 99? The answer is 90 (not 89). Hooray, you got the right answers. I count on you!
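If you want to double-check those two answers rather than trust me, a few lines of Python will do it. This is just a sketch of the arithmetic behind the claims above: ten fingers as binary digits give 2^10 values, and inclusive counting gives 99 − 10 + 1 two-digit numbers.

```python
fingers = 10

# Finger binary: each finger is a binary digit (up or down),
# so ten fingers encode 2**10 distinct values, 0 through 1023.
finger_binary = 2 ** fingers
print(finger_binary)  # 1024

# Inclusive ("fencepost") counting: 10, 11, ..., 99
# is 99 - 10 + 1 numbers, which range() counts for us.
two_digit_numbers = len(range(10, 100))
print(two_digit_numbers)  # 90
```

Note that `range(10, 100)` excludes its upper bound, which is exactly the kind of off-by-one detail this chapter is about.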
We get into more trouble when we need to count large numbers of things efficiently. For example, when we count the total number of events to calculate a probability, we reach for math tools such as permutations and combinations. However, these are tricky to apply correctly. So when you make a decision based on probability (e.g. a Bayesian approach), miscounting the number of events yields a totally different probability, leading to a wrong decision. Please don’t count on yourself when you count numbers (specifically, counting sheep to fall asleep or counting cards to win at blackjack).
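As a small illustration of how event counting feeds into probability, here is a hedged sketch (the card encoding is my own, not from the text): the chance of drawing two aces from a 52-card deck, counted by brute-force enumeration and cross-checked against the combination formula C(4, 2) / C(52, 2).

```python
import math
from itertools import combinations

# Encode a standard deck as (rank, suit) pairs; rank 0 stands for the ace.
deck = [(rank, suit) for rank in range(13) for suit in range(4)]

# Brute force: enumerate every 2-card hand and count the all-ace ones.
hands = list(combinations(deck, 2))
two_ace_hands = sum(1 for hand in hands if all(card[0] == 0 for card in hand))

# Cross-check against the closed form: C(4, 2) favorable out of C(52, 2) total.
assert len(hands) == math.comb(52, 2)      # 1326 possible hands
assert two_ace_hands == math.comb(4, 2)    # 6 favorable hands

print(two_ace_hands / len(hands))  # about 0.0045
```

If the enumeration and the formula disagreed, that would be exactly the kind of miscounted event space that silently corrupts a probability.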










