October 2nd, 2024

Null Is Not the Billion Dollar Mistake. A Counter-Rant (2015)

The article defends the NULL concept in programming, highlighting its practical applications and the distinction between unknown and absent values, while acknowledging the challenges it presents in real-world scenarios.

The article defends the traditional NULL concept in programming, arguing against the common perception that NULL is a "billion-dollar mistake" and suggesting instead that NULL serves a practical purpose in SQL and programming languages. It highlights the distinction between NULL as an "unknown" value and the concept of an absent value, emphasizing that while NULL can introduce complexity, it is also a necessary tool for modeling certain data scenarios. The author critiques purist views that dismiss NULL without considering its utility in real-world applications. The discussion also touches on alternatives like the Optional type in functional programming, which aims to handle the concept of absence more explicitly. Ultimately, the piece advocates for a balanced perspective on NULL, recognizing its role in programming while acknowledging the challenges it presents.

- NULL is often criticized as a problematic concept in programming, but it has practical applications.

- The distinction between NULL as an "unknown" value and an absent value is crucial for understanding its utility.

- Alternatives like the Optional type provide a different approach to handling absence but do not eliminate the need for NULL.

- The debate around NULL reflects broader tensions between purity and practicality in programming languages.

- Acknowledging the complexities of NULL can lead to more effective programming practices.
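The unknown-vs-absent distinction the bullets draw can be made concrete with an Option-style type, where absence is an explicit, inspectable value rather than an implicit property of every reference. A minimal TypeScript sketch (all names hypothetical, not from the article):

```typescript
// A minimal Option-style type: "none" is a first-class value the type
// system knows about, unlike an implicit null lurking behind every reference.
type Option<T> = { kind: "some"; value: T } | { kind: "none" };

const some = <T>(value: T): Option<T> => ({ kind: "some", value });
const none = <T>(): Option<T> => ({ kind: "none" });

// map: apply a function only when a value is present, the way Optional does.
function map<T, U>(opt: Option<T>, f: (t: T) => U): Option<U> {
  return opt.kind === "some" ? some(f(opt.value)) : none();
}

// Chaining through absence never throws, because absence is part of the type.
const length = map(some("hello"), s => s.length);   // some(5)
const nothing = map(none<string>(), s => s.length); // none
```

As the bullets note, this makes absence explicit but does not by itself settle whether NULL-as-unknown (the SQL sense) still needs separate modeling.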

6 comments
By @moi2388 - 7 months
Well, this is a load of hogwash.

Null is definitely the billion dollar mistake. It’s not the “null” value however, since this could indeed be seen as a badly implemented option monad.

It’s the implicit nullability of all reference types which is the mistake, since it leads to undefined behaviour at runtime.

Also, regarding the linked style discussion in the post: you're wrong about all of it :p

By @Sakos - 7 months
The biggest problem is always just going to be that it breaks any understanding or concept of a type system. When object.doThing().doOtherThing() can fail because the result of doThing() can be null, you suddenly have to fill your entire code base with unreadable cruft to avoid that happening. Because it's not just that one method where this is possible.

It especially doesn't help in languages like Java, where nulls completely subvert static types and static type checking and it turns a whole class of issues into runtime errors.

Allowing null means you're implicitly codifying Type class | null for every single reference type everywhere in your code. By doing it implicitly, you're not providing any semantics in the programming language or in the runtime to be able to deal with it at the same abstraction level as the code around any particular point where something can be null. This is incredibly error prone and makes for ugly, brittle code.
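The "implicitly codifying Type | null" point is easiest to see in a language where nullability is explicit in the type system, so the compiler forces a check before chaining. A hedged TypeScript sketch (names hypothetical, assumes `strictNullChecks`):

```typescript
interface Order { id: number }
interface Customer { lastOrder(): Order | null }

// With nullability spelled out as a union type, calling
// customer.lastOrder().id directly would not even compile: the compiler
// demands that absence be handled at each step, so the class of runtime
// NPEs described above cannot occur here.
function lastOrderId(customer: Customer | null): number | null {
  if (customer === null) return null;        // absence handled explicitly
  const order = customer.lastOrder();        // typed Order | null
  return order === null ? null : order.id;   // no null dereference possible
}

// Usage with a stub customer (hypothetical data):
const c: Customer = { lastOrder: () => ({ id: 42 }) };
lastOrderId(c);     // 42
lastOrderId(null);  // null
```

This is the same `Type | null` contract the comment describes, just surfaced in the language instead of left implicit on every reference type.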

Yeah, it's a billion dollar mistake.

By @karmakaze - 7 months
> JavaScript itself is a beacon of usefulness that is inversely proportional to its purity or beauty, so long story short: ...

How can I take this seriously? The language's shortcomings come from its being written practically overnight, not from some inverse proportionality between usefulness and purity. This post might be a $100 mistake.

By @orionblastar - 7 months
Where UserName Is Not Null

Keeps only the records where UserName is not Null.

Where UserName Is Null

Returns the records where UserName is Null, i.e. users who possibly didn't finish the registration process.

By @Jerrrrrrry - 7 months
Coincidentally surely, and not Dunning-Kruger, that this exact thing proved to be another literal 2-comma line item just last week.

By @octav123 - 7 months
Stopped reading at "Functional programming languages like to make use of the Optional “monad” ... but that’s just another way of modelling NULL." It is clear he doesn't understand simple things like Option and the purpose of that and monads.