You've probably noticed that a 500-piece puzzle takes you more than 5 times as long as a 100-piece puzzle, but how much longer?

Good news! We can say, in general, how much harder a puzzle gets depending on the number of pieces in it.

In fact, there is a whole field of study devoted to characterizing how hard problems are, called *algorithm complexity*, and engineers at computer companies like Google and Facebook spend a lot of time analyzing questions like these to make the internet run efficiently.

For example, say you wanted to say N random numbers out loud. It would take you an amount of time proportional to N; that is, reciting 20 random numbers would take you twice as long as reciting 10 random numbers.

Formally, one would say that "saying random numbers out loud" gets harder *linearly in N*.

Similarly, adding up 20 numbers takes twice as long as adding up 10 numbers, so addition is also linear in how many numbers N you are adding.

But some problems are harder than linear, like putting together a puzzle with N pieces! It definitely takes more than twice as long to do a puzzle with 200 pieces as one with 100 pieces.

In fact, the difficulty grows as N x N: doing a 200-piece puzzle is 4 times as hard as doing a 100-piece puzzle, and doing a 500-piece puzzle is 25 times as hard as doing a 100-piece puzzle.

The reason is that, in the worst case, you have to compare each puzzle piece to every other puzzle piece. That is, each of the N pieces may have to be compared to the N-1 other pieces, so that's N x (N-1) comparisons in the worst case, which is roughly N x N.
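That worst-case count is easy to check with a quick sketch. The helper below is hypothetical, just naming the N x (N-1) arithmetic from the paragraph above:

```python
# Hypothetical helper: worst-case pairwise comparisons for an N-piece puzzle,
# where each of the N pieces may be checked against the N-1 others.
def worst_case_comparisons(n):
    return n * (n - 1)

for n in (100, 200, 500):
    print(n, worst_case_comparisons(n))
# Note how doubling N roughly quadruples the count, and 5x N is ~25x the count.
```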

Another way to think of it: every time a piece is added to the puzzle, it potentially has to be compared against each of the other N-1 pieces, so each new piece brings a whole new round of N-1 comparisons with it.

Admittedly, we've oversimplified the problem in a few ways.

First, the more areas of distinct color, the easier the puzzle is. For example, say you have a 500-piece puzzle but it's really easy to separate the pieces by color into 5 piles: green field, blue sky, and red, yellow, and purple stripes on a hot air balloon. Then it's really more like five 100-piece puzzles, so it's really only 5 times as hard as a single 100-piece puzzle.
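Here's the back-of-the-envelope arithmetic for that claim, using N x N as a rough unit of "work":

```python
# One undivided 500-piece puzzle vs. five separate 100-piece puzzles,
# costing N*N "units" of work each (rough model from the text).
one_big = 500 ** 2           # 250,000 units
five_small = 5 * 100 ** 2    # 50,000 units -- 5x the work of one 100-piece puzzle

print(one_big / five_small)  # sorting into color piles first saves a factor of 5
```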

So hopefully that gives you some insight into why a puzzle with a lot of distinct color areas (like Joe Vaux's Garden of Earthly Delights) is so much easier than one like Seurat's Grande Jatte, where you can't just start by sorting pieces by color.

Our second oversimplification is the edge. If the puzzle has a straight edge, then you can split those pieces out at the start (and there will be roughly 4 x sqrt(N) edge pieces). So there's really a smaller number of non-edge pieces, call it

Z = N - 4 sqrt(N)

and then there are E = 4 sqrt(N) edge pieces. So, just like with distinct colors, if you can separate a puzzle into edge and non-edge pieces, then it's not as hard, only:

O(Z^2) + O(E^2) <= O(N^2) hard.
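To see how much the edge split helps, here's a quick numerical sketch of the Z and E formulas above (the function name is made up for illustration):

```python
import math

# Hypothetical sketch: cost of a puzzle after splitting edge pieces (E)
# from interior pieces (Z), using the rough N*N cost model from the text.
def edge_split_cost(n):
    e = 4 * math.sqrt(n)   # roughly 4*sqrt(N) edge pieces
    z = n - e              # the remaining interior pieces
    return z ** 2 + e ** 2

n = 500
print(edge_split_cost(n))  # noticeably less work...
print(n ** 2)              # ...than tackling all N pieces at once
```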

So that's why puzzles with irregular edges can feel a lot harder than puzzles with straight edges.

If you happen to be talking to a computer scientist, the formal lingo for all this is called "big-O" notation (seriously). If the time it takes to solve a problem scales linearly in N (like reciting N random numbers), one says the problem has O(N) complexity (pronounced O-of-N), but a problem like solving a jigsaw puzzle scales quadratically in N and thus has O(N^2) complexity (pronounced O-of-N-squared).

How hard are the problems that most computer scientists solve? Well, a lot of what happens on the internet involves quickly solving problems where N is around a billion, for example, finding the best of N = 1 billion products to recommend to you, or finding the movie you're looking for out of N = 1 billion videos. Because the N in these real-world problems is so big and growing all the time, computer scientists work hard to find ways to teach computers to solve problems faster than O(N^2).

A classic important real-world problem is "How long does it take to sort N random numbers?"

5 2 8 1 -> SORT -> 1 2 5 8

A naive sorting strategy is to just keep comparing each pair of neighboring numbers and swapping them if they're out of order. If you did this for 5 2 8 1, left-to-right, you'd get:

5 2 8 1

2 5 8 1

2 5 1 8

2 1 5 8

1 2 5 8 — done!

That strategy is officially called "bubble sort," and it's slow: like solving a jigsaw puzzle, the time it takes to sort N numbers grows as N x N. That is, bubble sort has complexity O(N^2).
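The swap-your-neighbors strategy traced above can be written out in a few lines. This is a minimal textbook version of bubble sort, not anyone's production code:

```python
def bubble_sort(nums):
    """Repeatedly sweep left-to-right, swapping out-of-order neighbors."""
    nums = list(nums)  # work on a copy
    for _ in range(len(nums)):
        swapped = False
        for i in range(len(nums) - 1):
            if nums[i] > nums[i + 1]:
                nums[i], nums[i + 1] = nums[i + 1], nums[i]
                swapped = True
        if not swapped:  # a full sweep with no swaps means we're done
            break
    return nums

print(bubble_sort([5, 2, 8, 1]))  # -> [1, 2, 5, 8]
```

In the worst case (numbers in reverse order), it makes about N sweeps of N comparisons each, which is where the O(N^2) comes from.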

A much slower (and stupider!) strategy is to randomly re-order the N numbers and then check if they happen to be sorted. That strategy is called "bogosort," and for even a modest N there's a good chance you'll die before the numbers come out sorted!
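For the brave, here's what bogosort looks like; a minimal sketch, safe only for tiny lists, since on average it needs about N! shuffles:

```python
import random

def is_sorted(nums):
    return all(nums[i] <= nums[i + 1] for i in range(len(nums) - 1))

def bogo_sort(nums):
    """Shuffle until sorted. On average ~N! tries -- do NOT use for large N!"""
    nums = list(nums)
    while not is_sorted(nums):
        random.shuffle(nums)
    return nums

print(bogo_sort([5, 2, 8, 1]))  # works... eventually
```

For just 4 numbers that's fine (about 24 tries on average), but for N = 20 the expected number of shuffles already exceeds a quintillion.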

Luckily for the internet, there are a number of more efficient sorting strategies that take only O(N log N) time. In fact, the internet runs mostly on O(N) and O(N log N) programs.
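One classic O(N log N) strategy is merge sort: split the list in half, sort each half, then merge the two sorted halves. A minimal sketch (the text above doesn't name a specific algorithm, so merge sort is just one example of the O(N log N) family):

```python
def merge_sort(nums):
    """Sort by splitting in half, sorting each half, and merging."""
    if len(nums) <= 1:
        return list(nums)
    mid = len(nums) // 2
    left = merge_sort(nums[:mid])
    right = merge_sort(nums[mid:])
    # Merge two sorted halves by repeatedly taking the smaller front element.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 8, 1]))  # -> [1, 2, 5, 8]
```

The halving gives the log N levels, and the merging does about N work per level, hence N log N overall.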

**Learn more:**

Karl Beecher's Brown Dogs and Barbers (a friendly intro to computer science)

Justin Abrahm's Explanation of Big-O

Rob Bell's Beginners Guide to Big-O

I am neither a mathematician nor a computer scientist so a bunch of this just blew by me; but, I once wondered if a person could write a compatibility algorithm based on Artifact Puzzle selections, you know, just for a hoot. Well, now that I’m so many puzzles in, I figure I must be puzzle pals with everyone who orders here;)