Null pointers look simple on the surface, and that’s why they’re so dangerous. As compiler optimizations, intuitive but incorrect simplifications, and platform-specific quirks have piled up, the odds of making a wrong assumption have grown, leading to a proliferation of bugs and vulnerabilities.
This article explores common misconceptions about null pointers held by many programmers, starting with simple fallacies and working our way up to the weirdest cases. Some of them will be news only to beginners, while others may lead experts down the path of meticulous fact-checking. Without further ado, let’s dive in.
“signal handling is zero-cost until the signal is generated”
True, but higher-level languages impose restrictions on inlining and other optimizations for code blocks that can generate or catch such an exception, so an explicit null check often performs better in practice.
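For concreteness, here is a minimal C sketch of the trap-based approach the claim describes: the happy path dereferences the pointer without branching on it, and a SIGSEGV handler recovers via `siglongjmp` when the dereference faults. The function names are made up for illustration, recovering from a null dereference this way is not portable, well-defined C (the `volatile` cast only discourages the compiler from exploiting the undefined behavior), and real runtimes such as the JVM implement implicit null checks inside the JIT with far more care about which instructions are allowed to fault.

```c
#include <setjmp.h>
#include <signal.h>
#include <stdio.h>

static sigjmp_buf recovery_point;

static void on_segv(int sig) {
    (void)sig;
    /* Jump back out of the faulting dereference; the second argument of 1
       asks sigsetjmp/siglongjmp to restore the signal mask as well. */
    siglongjmp(recovery_point, 1);
}

/* Hypothetical helper: read *p, falling back to a default if p is null.
   Note there is no branch on p in the fast path. */
static int read_or_default(const int *p, int fallback) {
    if (sigsetjmp(recovery_point, 1) != 0)
        return fallback;                   /* we land here after the SIGSEGV */
    return *(volatile const int *)p;       /* faults (and traps) when p is null */
}

int main(void) {
    struct sigaction sa = {0};
    sa.sa_handler = on_segv;
    sigaction(SIGSEGV, &sa, NULL);

    int x = 42;
    printf("%d\n", read_or_default(&x, -1));   /* prints 42 */
    printf("%d\n", read_or_default(NULL, -1)); /* prints -1 */
    return 0;
}
```

The signal path costs nothing until a fault actually occurs, which is the appeal; the hidden cost is that any code that may fault and be resumed this way constrains how aggressively the surrounding code can be inlined and reordered.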