Let's jump right in at the only place we can: the very beginning. The programs we write every day are based on orchestrated algorithms and data structures that all have their roots in a single thing: LOGIC. Let's quickly explore some basic logical rules, as we're going to build on them later on.
The first, most obvious question is: how are we defining the term "logic"? There are a few different definitions, so let's start with the first, offered by Aristotle. In trying to come up with a framework for thinking, Aristotle described what are known today as The Laws of Thought. Let's take a look at each one using JavaScript!
The first is The Law of Identity, which simply states that a bit of logic is whole unto itself - it's either true or false, and it will always be equal to itself.
Here we're describing this as `x === x`, which… yes… I know seems perfectly obvious, but stay with me.
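If you want to poke at this yourself, here's a minimal sketch of the idea in a console (the variable name is just for illustration):

```js
// The Law of Identity: a value is always identical to itself.
const x = true;

console.log(x === x);         // true
console.log(false === false); // true - a false statement is still equal to itself
```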
The next law is called The Law of Contradiction, which states that a logical statement cannot be both true and false at the same time. Put another way - a true statement is never false, and a false statement is never true.
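A quick sketch of the same idea in JavaScript (again, the variable is mine):

```js
// The Law of Contradiction: a statement cannot be both true and false at once.
const x = true;

console.log(x === true && x === false); // false - x can never be both
console.log(x && !x);                   // false - "x and not-x" is always a contradiction
```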
Again - this is perfectly reasonable and seems obvious. Let's keep going with the third law: Excluded Middle.
This one's a bit more fun as it states that something can either be true or false - there is no in-between. Using JavaScript we can demonstrate this by setting `x` and `y` to true and false and playing around with different operations - the only thing that is returned, based on those operations, is `true` or `false`. That's Excluded Middle in action.
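Here's a rough sketch of the kind of playing around I mean - no matter what combination you try, the answer is one of only two values:

```js
// The Law of the Excluded Middle: boolean operations only ever yield true or false.
const x = true;
const y = false;

console.log(x && y);  // false
console.log(x || y);  // true
console.log(!x);      // false
console.log(x === y); // false
// No matter how we combine x and y, the result is always true or false.
```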
And right about now some of you might be bristling at this.
As I've been writing the statements above we've been seeing JavaScript evaluate the result of each, which has been `undefined`. The idea here is that there's a third state that's neither true nor false - which is undefined. You can also think of this as `null` for now even though, yes, null and undefined are two different things. We'll lump them together for the sake of defining what's known as "ternary logic" - or "three-state logic" - which kicks Excluded Middle right in the teeth.
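To make that concrete, here's a minimal sketch (the variable `z` is mine): in a REPL, a declaration like `let z` evaluates to `undefined`, and once `undefined` is in a logical expression it doesn't always collapse back to true or false.

```js
let z; // declared but never assigned - its value is undefined

console.log(true && z);  // undefined - neither true nor false
console.log(false || z); // undefined
console.log(Boolean(z)); // false - only explicit coercion forces it back into two values
```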
Aristotle had a problem with his logical system - it only deals with things that are known. The only things we can know for sure are things that have happened already and that we have witnessed somehow… even then there's a question of whether we truly know them. Let's sidestep that rabbit hole.
When asked to apply his Laws of Thought to future events - such as "will Greece be invaded this year?" - Aristotle replied that logic cannot apply to future events as they are unknowable. An easy out, and also a lovely transition to Ternary Logic.
Let's bring this back to programming. You and I can muse all day about whether Aristotle's brand of logic - which we can call "binary" for now - is more applicable, or whether the world can be better understood with the more flexible ternary logic. But I don't want to do that because I'm here to talk about computer programming, and for that there's only one system we can think about - a deterministic system.
If you read the first Imposter's Handbook you'll likely remember the chapter on determinism (and non-determinism). If not - a simple explanation is that in a deterministic system every effect follows from a known cause: given the same input you always get the same result, and there is no unknown.
Programs are deterministic because computers are deterministic. Every instruction that a computer is given is in the form of groups of 1s and 0s… there are no undefined middle points. This is important to understand as we move forward - the math that we're about to get into and the very advent of computer science are predicated on these ideas. I know what you're thinking though…
Programming languages define much more than simply true or false - they also include the ability for something to be neither, in the form of null, nil, none, or undefined. So let me ask you a question… is that logical?
Let's take a look.
By default, JavaScript (and many other languages) will default a variable to an unknown value such as null or, as you see here, undefined. If I ask JavaScript if something undefined is equal to itself, the answer is true. If I ask whether it's not equal to itself, the answer is false - so far so good.
What about being equal to not-not itself? Well, that's false, which makes a bit of distorted sense: `!y` is true (undefined is falsy), so `!!y` comes back false - and undefined is not equal to false. But if something is `!undefined`… what is it? To JavaScript… it's simply `true`.
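Here's the whole experiment in one hedged sketch, using my own variable name:

```js
let y; // y is undefined

console.log(y === y);   // true  - undefined is equal to itself
console.log(y !== y);   // false
console.log(!y);        // true  - undefined is falsy, so "not undefined" is true
console.log(!!y);       // false
console.log(y === !!y); // false - undefined === false is false
```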
We can short-circuit our brains thinking about this, but let's not…
Tony Hoare, who introduced the null reference while working on ALGOL W, is credited with creating the concept of `null` in a programming language:
I call it my billion-dollar mistake… At that time, I was designing the first comprehensive type system for references in an object-oriented language. My goal was to ensure that all use of references should be absolutely safe, with checking performed automatically by the compiler. But I couldn't resist the temptation to put in a null reference, simply because it was so easy to implement. This has led to innumerable errors, vulnerabilities, and system crashes, which have probably caused a billion dollars of pain and damage in the last forty years.
Have you ever battled null problems in your code or tried to coerce an undefined value into some kind of truthy statement? We all do that every day.
Computers aren't capable of understanding this. Programming languages apparently are, and at some point the two need to reconcile what `null` and `undefined` mean. What makes this worse is that different languages behave differently.
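Even within JavaScript alone, the two "nothing" values get reconciled in quirky ways - a small sketch:

```js
console.log(null == undefined);  // true  - loosely equal: the "same" nothing
console.log(null === undefined); // false - strictly, two different values
console.log(typeof undefined);   // "undefined"
console.log(Boolean(null), Boolean(undefined)); // false false - both are falsy
```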
Ruby defines null as `nil` and has a formalized class, called `NilClass`, for dealing with this unknowable value. If you try to involve `nil` in a logical statement, such as greater or less than 10, Ruby will throw an exception. This makes a kind of sense, I suppose, as comparing something unknown could be… anything really.
But what about coercion? As you can see here, `nil` will be evaluated to false, and asking nil if it's indeed nil will return true. But you can also convert nil to an array or an integer… which seems weird… and if you inspect nil you get a string back that says "nil". We'll just leave off there.
JavaScript is kind of a mess when it comes to handling `null` operations as it will try to do its best to give you some kind of answer without throwing an exception. `10 * null` is 0, for instance… I dunno…
It's the last two statements that will bend your brain, however, because `10 <= null` is somehow false… but `10 >= null` is true.
I know JavaScript fans out there will readily have an answer… good for them. I'm sure there's a way to explain this, but honestly it's not sensical to begin with because, as I've mentioned, `null` and `undefined` are abstractions on top of purely logical systems. Each language gets to invent its own rules.
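Here's a sketch of those null operations if you want to try them yourself:

```js
console.log(10 * null);  // 0     - null coerces to 0 in arithmetic
console.log(10 + null);  // 10
console.log(10 <= null); // false - effectively 10 <= 0
console.log(10 >= null); // true  - effectively 10 >= 0
console.log(10 == null); // false - and yet null doesn't loosely equal any number
```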
If you ask JavaScript what type `null` is, you'll get `object` back - which isn't true, as MDN states:
In the first implementation of JavaScript, JavaScript values were represented as a type tag and a value. The type tag for objects was 0. null was represented as the NULL pointer (0x00 in most platforms). Consequently, null had 0 as type tag, hence the bogus typeof return value.
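A tiny sketch confirming that `null` only looks like an object to `typeof`:

```js
console.log(typeof null);            // "object" - the historical accident MDN describes
console.log(null instanceof Object); // false    - it isn't actually an object
```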
Let's take a look at a more "structured" language - C#. You would think that a language like this would be a bit more strict about what you can do with null… but it's not! OK, it DOES throw when you try to compare null to `!!null` - that's a good thing - but when you try to do numeric comparisons… hmmm
And null + 10 is null? I dunno about that.
So, what's my point with this small dive into the world of logic and null? It is simply that null is an abstraction defined by programming languages. It (as well as undefined) has no place in the theory we're about to dive into. We're about to jump into the land of pure logic and mathematics, electronic switches that become digital… encoding… encryption and a bunch more - none of which have the idea of null or undefined.
It's exciting stuff - let's go!