Bombs Away!


You know enough to be dangerous, so let's blow some stuff up, shall we? Here's the deal: I need to show the CEO the real power of Elixir. This is a HUGE roll of the dice on my part, and we could both be fired... but it would be really fun, so who cares?

I want to overwhelm our database and bring it down. Basically, a DDoS attack on PostgreSQL using 12 or so lines of code. Elegant and powerful.
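Here's a rough sketch of the shape that code will take. Treat it as an assumption-laden preview, not the finished demo: it presumes the Postgrex driver, a local server, and a database called "redfour", none of which we've set up in this lesson.

```elixir
# Sketch only -- Postgrex, localhost, and the "redfour" database
# name are all assumptions here, not the final demo code.
tasks =
  for _ <- 1..200 do
    Task.async(fn ->
      # Each task opens its own connection, so each one occupies
      # a real server-side connection slot...
      {:ok, conn} = Postgrex.start_link(hostname: "localhost", database: "redfour")

      # ...and then holds that slot with a deliberately slow query.
      Postgrex.query(conn, "select pg_sleep(5)", [], timeout: 60_000)
    end)
  end

Enum.each(tasks, &Task.await(&1, 60_000))
```

Two hundred of these is more than a default server will accept, so once the slots fill up the later connections start failing with "sorry, too many clients already" -- which is exactly the mess we're trying to cause.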

Think about how you would do that with Ruby, JavaScript, or .NET. You would have to write something concurrent and orchestrate it in such a way as to take up all of PostgreSQL's connections at once, which is a little harder than it sounds...

Thinking About Asynchronicity

Developers often confuse asynchronous execution with concurrent or parallel execution. I do this often! It's something I learn and unlearn, all the time. Let's take a small tangent and discuss the difference.

If I were to kick up a Node application and fire off 1000 queries at once, what do you think would happen? It would indeed involve asynchronous execution, but would it be concurrent? We need things to be concurrent if we ever have a hope of bringing PostgreSQL down.
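To make the distinction concrete, here's a toy Elixir illustration -- nothing but sleeps, no database involved. The first version does five one-second waits one after another; the second hands each wait to its own BEAM process, so they overlap.

```elixir
# Sequential: roughly five seconds, because each wait blocks the next one.
Enum.each(1..5, fn _ -> :timer.sleep(1000) end)

# Concurrent: roughly one second, because every wait runs in its own process.
1..5
|> Enum.map(fn _ -> Task.async(fn -> :timer.sleep(1000) end) end)
|> Enum.each(&Task.await/1)
```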

PostgreSQL caps the number of clients that can connect and execute queries at the same time with its max_connections setting, which defaults to 100. You can raise it if you want, but the usual advice is to keep it modest and put a connection pooler in front if you need more.
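If you want to see the ceiling on your own server, you can ask it directly. A quick sketch, again assuming Postgrex and placeholder connection options:

```elixir
{:ok, conn} = Postgrex.start_link(hostname: "localhost", database: "redfour")

# current_setting/1 returns the value of a server setting as text.
%Postgrex.Result{rows: [[limit]]} =
  Postgrex.query!(conn, "select current_setting('max_connections')", [])

IO.puts("This server allows #{limit} connections") # "100" on a default install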

This might not seem like much, but consider that PostgreSQL (if properly tuned) will execute most queries in a few milliseconds, at most. Even if you were to try to run 1000 queries with only 100 connection slots available, you would have to make sure they were all fired off within microseconds of each other. This is a little difficult.

I've tried it. With Node, as a matter of fact. Let's see what happened...