Introduction to Logic, Binary and Cryptography
All of the code we've ever written is turned into binary. Those binary values are orchestrated in a wonderful way, allowing the CPU of our computers to execute fantastic feats of calculation. As programmers, we should know more about this stuff!
Lesson 1: Welcome!
I'm a self-taught coder and I've enjoyed learning things over my career... but I avoided binary and crypto like the plague! I just didn't need to understand that stuff to get my work done... at least I didn't think so. Let's just say I was wrong about that - so 3 years ago I dove in... and here we go!
Lesson 2: The NULL Disaster
Logically speaking, there is no such thing as null, yet we've decided to represent it in our programs. Null is a crutch. It's a placeholder for "I don't know and didn't want to think about it further" in our code, and this is evidenced by it popping up at runtime, throwing an exception that shouts "ARE YOU THINKING ABOUT IT NOW?" It's been described as a "billion dollar mistake"... let's see why.
Premium Boolean Algebra
You're George Boole, a self-taught mathematician and somewhat of a genius. You want to know what God's thinking so you decide to take Aristotle's ideas of logic and go 'above and beyond' to include mathematical proofs.
We've covered how to add binary numbers together, but how do you subtract them? For that, you need a system for recognizing a number as negative and a few extra rules. Those rules are one's and two's complement.
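As a taste of what's in the lesson, here's a minimal sketch of two's complement in Python (the function name and bit width are my own choices):

```python
def twos_complement(value: int, bits: int = 8) -> int:
    """Invert all the bits (one's complement), then add 1 (two's complement)."""
    mask = (1 << bits) - 1          # 0b11111111 for 8 bits
    return ((~value) + 1) & mask    # wrap the result into the bit width

# The payoff: subtraction becomes addition.
# a - b is the same as a + twos_complement(b), ignoring the overflow bit.
a, b = 12, 5
result = (a + twos_complement(b)) & 0xFF
print(result)                       # 7
print(bin(twos_complement(5)))      # 0b11111011, the 8-bit pattern for -5
```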
Now that we know how to use binary to create switches and digitally represent information we need to ask the obvious question: 'is this worthwhile'? Are we improving things and if so, how much?
Lesson 2: Binary Encoding Basics
So far we've learned that we can use binary to represent information using just 1s and 0s, but it's useless unless we can share it with someone. In fact, the very definition of information requires a sender and a receiver. But how do we do this with binary values? Let's learn the basics.
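To make that concrete, here's a tiny encode/decode round trip using 8-bit ASCII codes (a simple sketch; the helper names are my own):

```python
def to_bits(text: str) -> str:
    """Encode each character as the 8-bit binary form of its ASCII code."""
    return " ".join(format(ord(ch), "08b") for ch in text)

def from_bits(bits: str) -> str:
    """The receiver reverses the agreed-upon encoding to recover the text."""
    return "".join(chr(int(b, 2)) for b in bits.split())

encoded = to_bits("Hi")
print(encoded)             # 01001000 01101001
print(from_bits(encoded))  # Hi
```

The only reason this works is that sender and receiver agreed on the scheme ahead of time - that agreement is the encoding.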
Premium The Huffman Encoding Algorithm
Encoding an alphabet for transmission can be ad-hoc, or we can spend a little time optimizing that encoding so our transmission can be sent more efficiently. This works well when there's no "noise" on the line - aka no chance for errors. One such algorithm is Huffman coding - which we'll learn now.
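A minimal sketch of the idea, assuming Python's standard heap: merge the two least-frequent symbols over and over, and the most frequent symbols end up with the shortest codes.

```python
import heapq
from collections import Counter

def huffman_codes(text: str) -> dict[str, str]:
    # Heap entries are (frequency, tiebreak, tree); a tree is a char or a (left, right) pair.
    heap = [(freq, i, ch) for i, (ch, freq) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)    # the two rarest subtrees...
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, count, (left, right)))  # ...merge into one
        count += 1
    codes: dict[str, str] = {}
    def walk(node, prefix=""):
        if isinstance(node, str):
            codes[node] = prefix or "0"      # edge case: a one-symbol alphabet
        else:
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
    walk(heap[0][2])
    return codes

codes = huffman_codes("aaaabbc")
print(codes)   # 'a' appears most, so it gets the shortest code
```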
Premium Simple Error Correction
Optimizing our encoding scheme for a noiseless channel (one without transmission errors) is wonderful, but in the real world that just doesn't happen - anywhere. So we need to figure out a way to correct for errors, and the good news is: this is math! There's always a way.
We know that we can orchestrate bits in such a way as to 1) know when an error occurred and 2) locate the error if we have enough bits. But that's always the problem! We need more bits as our messages grow - but how do we orchestrate this? Once again, Richard Hamming comes to the rescue with his Hamming Code.
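Here's a small sketch of the classic Hamming(7,4) arrangement - 4 data bits protected by 3 parity bits - with the parity positions hard-coded for clarity (function names are my own):

```python
def hamming_encode(d: list[int]) -> list[int]:
    """Pack 4 data bits with 3 parity bits into positions 1..7."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming_correct(code: list[int]) -> list[int]:
    """Recompute the parity checks; the failing ones spell out the bad bit's position."""
    c = code[:]
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # check covering positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # check covering positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]   # check covering positions 4,5,6,7
    error_pos = s1 * 1 + s2 * 2 + s3 * 4
    if error_pos:
        c[error_pos - 1] ^= 1        # flip the offending bit back
    return [c[2], c[4], c[5], c[6]]  # extract just the data bits

sent = hamming_encode([1, 0, 1, 1])
sent[4] ^= 1                         # simulate a single-bit error in transit
print(hamming_correct(sent))         # [1, 0, 1, 1]
```

The clever part is the layout: each parity bit covers a different overlapping set of positions, so the pattern of failed checks is the binary address of the flipped bit.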
Lesson 1: Understanding Ciphers
Ciphers are simply algorithms designed to conceal information using a key of some kind. For centuries it was thought that the stronger the cipher, the more secure your message. Over time we've come to understand that the cipher doesn't matter at all - it's the key. Always the key.
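The classic illustration is the Caesar cipher - the algorithm is completely public, and the only thing standing between an attacker and your message is the key (a toy sketch, uppercase letters only):

```python
def caesar(text: str, key: int) -> str:
    """Shift each letter by `key` places; everything about this is public except the key."""
    return "".join(
        chr((ord(ch) - ord("A") + key) % 26 + ord("A")) if ch.isalpha() else ch
        for ch in text.upper()
    )

secret = caesar("ATTACK AT DAWN", 3)
print(secret)               # DWWDFN DW GDZQ
print(caesar(secret, -3))   # ATTACK AT DAWN

# The problem: with only 26 possible keys, trying them all takes no time at all.
# It's the size and secrecy of the keyspace that matters, not the cleverness of the cipher.
```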
Lesson 2: One-way Functions and Cryptography
The core of asymmetric encryption (something we'll get to shortly) is the idea of the one-way function. The definition is straightforward: the output is easy to compute from the input, but it's nearly impossible to guess the input based on the output.
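A toy example of that asymmetry, using modular exponentiation with deliberately tiny numbers (real systems use enormous primes):

```python
# Computing forward is cheap; reversing it (the "discrete logarithm") is the hard direction.
p, g = 23, 5             # tiny, illustrative numbers

def forward(x: int) -> int:
    return pow(g, x, p)  # easy: g^x mod p

print(forward(6))        # 8 ... but given only the 8, which x produced it?

# At this scale you can brute-force the answer; at 2048 bits you cannot.
found = next(x for x in range(1, p) if forward(x) == 8)
print(found)             # 6
```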
Premium The One-time Pad
If you read up on cryptography you'll quickly come to understand that "there is no unbreakable cipher" - which is true for the most part. A better way to think of it is "there is no unbreakable cipher... except for the one-time pad". Claude Shannon proved it mathematically unbreakable - let's write our own!
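The whole cipher fits in a few lines - it's just XOR against a truly random key as long as the message (a minimal sketch; helper names are mine):

```python
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

message = b"ATTACK AT DAWN"
key = secrets.token_bytes(len(message))   # truly random, exactly as long as the message

ciphertext = xor_bytes(message, key)
recovered = xor_bytes(ciphertext, key)    # XOR with the same key undoes it
print(recovered)                          # b'ATTACK AT DAWN'

# The catch that makes it impractical: the key must be truly random,
# as long as the message, shared securely, and NEVER reused.
```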
Premium The Diffie-Hellman Key Exchange
It's a hard-learned lesson throughout history: ciphers don't matter! You have to keep the key safe! This was true until the early 1970s, when Whitfield Diffie had an amazing idea: let's use computers, one-way functions and some tricky math to solve this problem with public and private keys. One of the top scientific breakthroughs in history.
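The exchange itself is short enough to sketch with toy numbers (real deployments use 2048-bit primes or elliptic curves):

```python
import secrets

# Public parameters, known to everyone:
p, g = 23, 5

a = secrets.randbelow(p - 2) + 1   # Alice's private key, never shared
b = secrets.randbelow(p - 2) + 1   # Bob's private key, never shared

A = pow(g, a, p)                   # Alice sends this over the open wire
B = pow(g, b, p)                   # Bob sends this over the open wire

# Each side combines its own private key with the other's public value:
alice_secret = pow(B, a, p)        # (g^b)^a mod p
bob_secret = pow(A, b, p)          # (g^a)^b mod p
print(alice_secret == bob_secret)  # True: a shared secret that was never transmitted
```

An eavesdropper sees p, g, A and B - but recovering a or b from them is the discrete logarithm problem, the one-way function doing the heavy lifting.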
Ron Rivest, Adi Shamir and Leonard Adleman leveraged the idea of a public and private key and came up with the RSA encryption algorithm - the most downloaded software in existence. You're using this algorithm to watch this video over HTTPS. You use it when you SSH into a private server as well. Brilliant, elegant and brutally simple - we're going to write this algorithm ourselves to understand how it works.
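Here's how brutally simple it is - a textbook-sized RSA with tiny primes (real keys use primes hundreds of digits long, plus padding schemes this sketch omits):

```python
p, q = 61, 53
n = p * q                  # 3233: the public modulus
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent, chosen coprime with phi
d = pow(e, -1, phi)        # private exponent: the modular inverse of e (Python 3.8+)

message = 65
ciphertext = pow(message, e, n)    # encrypt with the public key (e, n)
plaintext = pow(ciphertext, d, n)  # decrypt with the private key (d, n)
print(ciphertext, plaintext)       # 2790 65
```

The one-way function here is factoring: anyone can see n, but deriving d requires knowing p and q.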
Premium The Basics of Hashes
Sometimes you don't want to decrypt something you've encrypted but, instead, have a unique "signature" of its binary value. That's what a hash code is - and they're everywhere. Let's see how they're made.
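A quick taste using Python's standard library - same input, same digest; change one byte and the digest changes completely:

```python
import hashlib

digest = hashlib.sha256(b"hello").hexdigest()
print(digest)                                  # deterministic: same input, same signature
print(hashlib.sha256(b"hello!").hexdigest())   # one extra byte, a totally different digest

# A digest is fixed-size (256 bits here) no matter how big the input is,
# and there is no "decrypt" that recovers the input from it.
```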
Premium Breaking a Hash
The only way to know if a hashing algorithm is good is to try to break it. You can't decrypt a hash, but you can do messed-up things like hashing every preimage you can think of to see if you get a match. Let's take a look at how hackers can go nuts with your secret data.
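That attack is just a loop - here's a toy dictionary attack against an unsalted SHA-256 hash (the wordlist and password are made up for illustration):

```python
import hashlib

def sha256_hex(s: str) -> str:
    return hashlib.sha256(s.encode()).hexdigest()

# A "leaked" hash of a weak password:
leaked = sha256_hex("1234")

# The attacker can't invert the hash, but can hash guesses until one matches:
wordlist = ["password", "letmein", "1234", "qwerty"]
cracked = None
for guess in wordlist:
    if sha256_hex(guess) == leaked:
        cracked = guess
        print("cracked:", guess)   # cracked: 1234
        break
```

This is why fast hashes and unsalted storage are a disaster for passwords: the attacker's cost per guess is nearly zero.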
This is a video that I made for my YouTube channel but I think it fits perfectly here too! If you’ve had to store sensitive user information in a database, you’ve probably heeded the advice to “just use bcrypt”. But do you know why? What other choices are there? In this video we take a deep look at bcrypt, pbkdf2, scrypt and argon2!
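Of the algorithms in the video, PBKDF2 is the one in Python's standard library, so here's a small sketch of the salt-and-stretch idea (iteration count is illustrative; use several hundred thousand or more in production):

```python
import hashlib
import hmac
import os

password = b"correct horse battery staple"
salt = os.urandom(16)   # unique per user, stored right alongside the hash

# Thousands of iterations make every attacker guess expensive:
stored = hashlib.pbkdf2_hmac("sha256", password, salt, 100_000)

# Verifying a login attempt repeats the exact same derivation:
attempt = hashlib.pbkdf2_hmac("sha256", password, salt, 100_000)
print(hmac.compare_digest(stored, attempt))   # True: constant-time comparison
```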
Premium Understanding Hash Collisions
Hashing algorithms aren't just for passwords - they're mostly used to uniquely identify things like files, string values and Git commits. This only works if a hashing algorithm *always* produces a unique value. If it doesn't, there's a collision.
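You can watch a collision happen by deliberately shrinking a hash's output (a contrived weakening for demonstration only):

```python
import hashlib

def tiny_hash(s: str) -> str:
    """A deliberately weak hash: just the first 2 hex chars of SHA-256 (8 bits)."""
    return hashlib.sha256(s.encode()).hexdigest()[:2]

# With only 256 possible outputs, the pigeonhole principle guarantees a collision:
seen: dict[str, str] = {}
collision = None
for i in range(1000):
    s = f"input-{i}"
    h = tiny_hash(s)
    if h in seen:
        collision = (seen[h], s)
        print(f"collision: {seen[h]!r} and {s!r} both hash to {h}")
        break
    seen[h] = s
```

Real algorithms have 2^256 possible outputs instead of 2^8, which is what pushes a collision from "instant" to "effectively never" - until the algorithm is broken, as happened with MD5 and SHA-1.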
Premium Blockchain Basics
It's not a bad idea - it's just that bad things have been done with it! No matter your opinion on Blockchain stuff - it's a good idea to know what you love/hate and how it works. Blockchains are basically Git repositories that store transactions. It's obviously more complicated - so let's build our own, shall we?
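Here's the Git-like core of the idea in miniature: each block hashes its contents plus the previous block's hash, so tampering anywhere breaks every link after it (a toy sketch, no mining or consensus):

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash a block's canonical JSON form, like Git hashing a commit."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

chain = [{"index": 0, "prev": "0" * 64, "txns": ["genesis"]}]
for i, txns in enumerate([["alice->bob: 5"], ["bob->carol: 2"]], start=1):
    chain.append({"index": i, "prev": block_hash(chain[-1]), "txns": txns})

def valid(chain: list[dict]) -> bool:
    """Every block must point at the true hash of the one before it."""
    return all(chain[i]["prev"] == block_hash(chain[i - 1]) for i in range(1, len(chain)))

print(valid(chain))                        # True
chain[1]["txns"] = ["alice->bob: 500"]     # tamper with history...
print(valid(chain))                        # False: every later link now fails
```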