Debugging - the subtle lie

"Debugging is like being the detective in a crime movie where you're also the murderer"  - Filipe Fortes

From the earliest times in programming, people have used the term debugging. Famously, an actual moth was once found in a relay, but programmers needed a word for what they spend most of their time doing, so a term for the process of fixing your own mistakes was inevitable. It is estimated that 85% of the time spent programming is consumed fixing the mistakes the author put into the code. I know of no other job where 85% of the time is spent fixing mistakes of your own making.

Programmers imagine they spend their time typing in programs, and so people often quest after briefer, more compact notations. The language called APL was the all-time champion of brevity, but its cryptic notation ensured that programs were almost write-only; typically only the author could understand them. The reality is that the computer requires such inhuman precision that programming is a struggle for us, because tiny mistakes, often just a few characters, cause the program to act in baffling ways.
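As a minimal illustrative sketch (not from the original text, and in C purely for familiarity), here is the classic one-character slip: typing = where == was intended. The program still compiles, but its behavior changes entirely and bafflingly.

    #include <stdio.h>

    int main(void) {
        int errors = 0;

        /* Intended: if (errors == 10)  -- test whether ten errors occurred.
           Typed:    if (errors = 10)   -- assigns 10, which is nonzero and
           therefore "true", so the branch runs every time and the value of
           'errors' is silently overwritten. */
        if (errors = 10) {
            printf("too many errors: %d\n", errors);
        }

        return 0;
    }

A single missing character, and the program lies about its own state; nothing in the output hints at where the mistake was made.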

Any programming language that purports to be a significant improvement over current technology must make error correction a much easier, smoother process. In fact, brevity may work against reducing errors. Simplicity, regularity, and the ability to understand other people's code are paramount in the next-generation language race.