
Cryptography Corollary: Password Probabilities

It seems like not a month goes by without reports of data breaches involving passwords, and it can feel like there’s no hope when it comes to constructing strong ones. Over the past few years, you may have noticed this xkcd comic floating around the Internet. It attempts to explain why choosing a strong password is so difficult, and it offers a now (in)famous example of how to construct strong passwords. What’s at work here? How do we know the password (or passphrase) “correcthorsebatterystaple” is stronger than “Tr0ub4dor&3”? It all has to do with something known as password entropy.

Password Entropy

Entropy, as it turns out, isn’t exactly a simple concept, even for experts in information theory. We can try to approximate a simple definition, however. We can relate entropy to probability by stating that entropy measures how difficult it is for someone to guess a password—that is, the probability of guessing the password. In general, the higher the entropy (and the lower the probability), the harder it is to guess a password. Also, because we’re using computers here, we measure entropy using bits. For example, a password could have an entropy of 28 bits, as given in the xkcd comic. But, while 28 bits of entropy corresponds to roughly 268 million possibilities, this is pretty low entropy for a password.[1] We’ll come back to that later.
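
If you want to see the relationship between bits and possibilities for yourself, here’s a quick Python sketch using the numbers above:

import math

# 28 bits of entropy corresponds to 2**28 equally likely possibilities.
possibilities = 2 ** 28
print(possibilities)             # 268435456, roughly 268 million

# Going the other direction, the binary logarithm recovers the bit count.
print(math.log2(possibilities))  # 28.0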

But what makes “Tr0ub4dor&3” a bad password? Well, on the one hand it has low entropy, and on the other hand, it’s difficult to remember. That latter point seems straightforward. We want to make our passwords strong, so we try substituting symbols and using uncommon words, but this is where things break down. Generally, humans are bad at remembering things, especially if those things are long sequences of seemingly arbitrary symbols. This leads to frustration (e.g., multiple errors on password entry) and bad habits (e.g., writing down passwords). Supposing we had perfect retention, however, the more pernicious problem of entropy would remain.

Warning: The next section uses quite a bit of math; see the footnote below for some helpful background.[2] If you’d rather skip the details of calculating entropy, feel free to jump past the next section. Either way, we’re about to see why 28 bits of entropy isn’t so great after all.

Calculating Entropy

[Image: “Tr0ub4dor&3”]

Let’s look at how exactly the entropy of “Tr0ub4dor&3” is determined. At first glance, we notice that the password is 11 characters long and is based on the uncommon word “troubadour” (omitting the second “u”) with the appendage “&3” tacked on at the end. This information isn’t given to us, but presumably the word “troubadour” was pulled from a special dictionary of 65,536 uncommon English words. We can calculate the entropy as log2(65536) = 16 bits (those are the 16 squares in the comic). Notice that the first letter, “T”, is capitalized. It’s a common practice to capitalize letters in passwords, especially at the beginning. For the sake of example, we assume only the first letter can be capitalized. In that case, the first letter, “T”, can either be capitalized or not, which gives us two options and an entropy of log2(2) = 1 bit for that one letter. Notice also that the vowels “o” and “a” have been substituted with “0” and “4”, respectively. It’s common for people to substitute the vowels “a”, “e”, “i”, and “o” with other symbols (less so the vowel “u”). Counting those four vowels together with their substituted forms gives us eight possibilities, or log2(8) = 3 bits. Altogether, for the root word “Tr0ub4dor”, we have 16 + 1 + 3 = 20 bits of entropy.

[Image: “Tr0ub4dor&3”]

We also must factor in the appendage “&3”. Since those two symbols could be in either order—“&3” or “3&”—we have an entropy of log2(2) = 1 bit. Also, the “&” symbol could be any special symbol. If we consider the 16 most common special symbols, we have log2(16) = 4 bits of entropy for that symbol. Finally, there are 10 possible digits, so for “3”, we have log2(10) ≈ 3 bits of entropy. For the appendage “&3”, then, we have a total entropy of 1 + 4 + 3 = 8 bits of entropy.
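
As a quick sanity check, here’s a small Python sketch that adds up all of these component entropies, following the same breakdown described above:

import math

# Component entropies for "Tr0ub4dor&3", per the breakdown above.
base_word      = math.log2(65536)  # uncommon-word dictionary: 16 bits
capitalization = math.log2(2)      # first letter capitalized or not: 1 bit
substitutions  = math.log2(8)      # common vowel substitutions: 3 bits
order          = math.log2(2)      # "&3" versus "3&": 1 bit
punctuation    = math.log2(16)     # 16 common special symbols: 4 bits
numeral        = math.log2(10)     # one digit, 0 through 9: ~3.3 bits

total = base_word + capitalization + substitutions + order + punctuation + numeral
print(round(total))                # ~28 bits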

Taken altogether, “Tr0ub4dor&3” has an entropy of approximately 28 bits. But how does that translate to password strength? Remember that entropy in this context measures how difficult it is for someone to guess the password. So, in a sense, we can gauge the strength of “Tr0ub4dor&3” by calculating how long it will take someone to guess it. Let’s expand on the calculation given in the xkcd comic.

Password Cracking

Suppose an attacker Eve has the ability to guess a modest 1,000 passwords per second.[3] Now, we know there are 60 seconds per minute, 60 minutes per hour, and 24 hours per day. So, we can say there are 60 * 60 * 24 = 86400 seconds in a day in which Eve can guess passwords (assuming she guesses nonstop). This means she can guess 1000 * 86400 = 86.4 million passwords per day. Going back to our “Tr0ub4dor&3” password, we know it has an approximate entropy of 28 bits, which corresponds to 2^28 possibilities.[4] If we divide 2^28 by 86.4 million, we see it would take Eve roughly 3 days to guess our password. Frankly, those aren’t great odds.
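
Here’s that arithmetic as a short Python sketch, assuming the same nonstop rate of 1,000 guesses per second:

# Crack-time estimate from above: 1,000 guesses per second, nonstop,
# against a password space of 2**28 possibilities.
guesses_per_second = 1_000
seconds_per_day = 60 * 60 * 24                           # 86,400
guesses_per_day = guesses_per_second * seconds_per_day   # 86.4 million

print(round(2 ** 28 / guesses_per_day, 1))               # ~3.1 days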

At the risk of beating a dead horse, it should now be obvious that “Tr0ub4dor&3” isn’t a great password. It has low entropy, which makes it easy to guess in a relatively short amount of time. Moreover, it’s a non-standard spelling littered with substituted symbols, making it difficult for humans to remember. So, what makes “correcthorsebatterystaple” a better password (or passphrase)? Again, the answer is the entropy.

You may have noticed that in the comic, each word in “correcthorsebatterystaple” is assigned an entropy of 11 bits, which combine to give 44 bits of entropy for the entire password.[5] Where did those 11 bits come from? Because each word has an entropy of 11 bits, we can deduce that the words came from a dictionary of 2^11 = 2048 words.[6] Admittedly, this is somewhat arbitrary (more on this in a moment), but it demonstrates nicely why 44 bits of entropy is better than 28 bits of entropy. Recall that it would take roughly 3 days for Eve to guess “Tr0ub4dor&3”. If we apply the same logic as above, we see 2^44 divided by 86.4 million is equal to roughly 203,613 days—that’s over 557 years of guessing passwords! Thus, we see that “correcthorsebatterystaple” is both easy to remember and difficult to guess in a short amount of time.
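
The same arithmetic, applied to 44 bits of entropy, looks like this:

# Same crack-time arithmetic applied to 44 bits of entropy.
guesses_per_day = 1_000 * 86_400       # 86.4 million guesses per day

days = 2 ** 44 / guesses_per_day
print(round(days))                     # ~203,613 days
print(round(days / 365.25))            # ~557 years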

We observed that the dictionary of 2,048 words used in the xkcd comic is mostly arbitrary, but it’s not far off the mark. Most native English speakers can get through their days by relying on approximately 3,000 words. Assuming we used a dictionary of 3,000 words and a four-part password (e.g., “correcthorsebatterystaple”), we would have entropy on the order of 4 * log2(3000) ≈ 46.2 bits. If Eve were guessing nonstop, it would take her more than 2,500 years to guess our password. By adding fewer than a thousand words, we roughly quintupled the amount of time it would take for Eve to guess our password.

But suppose we take it up a couple notches. Assume we use a native English speaker’s larger 35,000-word vocabulary. How many bits of entropy would one word have in that vocabulary? We see each word would have log2(35000) ≈ 15.1 bits of entropy. For a four-part passphrase, we would have 15.1 * 4 = 60.4 bits of entropy. How long would it take Eve to guess that password? Approximately 48 million years![7]
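
If you’d like to check these passphrase numbers yourself, here’s a small Python sketch covering both the 3,000-word and 35,000-word vocabularies (four randomly chosen words, 1,000 guesses per second, as above):

import math

# Passphrase estimates from above: four words drawn at random from a
# vocabulary of the given size, guessed at 1,000 guesses per second.
guesses_per_day = 1_000 * 86_400

for vocabulary_size in (3_000, 35_000):
    bits = 4 * math.log2(vocabulary_size)            # four-word passphrase
    years = 2 ** bits / guesses_per_day / 365.25
    print(f"{vocabulary_size:>6} words: {bits:.1f} bits, roughly {years:,.0f} years")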

Please Be Advised

Have we solved the password dilemma? Not exactly. There are two problems with the preceding discussion. First, in all our fancy calculations, we assumed that humans were selecting things perfectly at random. Unfortunately, humans aren’t random at all. Second, in our estimates of strength (i.e., time to guess), we assumed Eve could only guess 1,000 passwords per second. More importantly, we assumed Eve was guessing passwords one at a time. It turns out none of those assumptions is well founded.[8]

The major drawback here is that we assume, all things being equal, the components of our password (e.g., “correct”, “horse”, “battery”, and “staple”) are being chosen at random. Humans are notoriously predictable. We like things to be easy to remember, so we tend to choose common things upon which to base our passwords—names, birthdays, and so on. In this way, we skew the probabilities by making it easier for attackers, like Eve, to guess our passwords. Moreover, because we’re bad at remembering things, we tend to follow patterns. For example, the password “Tr0ub4dor&3” follows the common root-appendage pattern: “Tr0ub4dor” is the root word and “&3” is the appendage. As we saw earlier, symbol substitutions are also frequent, making them easily predicted.

Even supposing we somehow selected a password perfectly at random, we would still have to worry about someone guessing it. In our discussion, Eve was guessing passwords at a modest rate of 1,000 passwords per second. That is an unrealistically slow rate given the advancements in computing power and efficiency. Furthermore, there was an implicit assumption that Eve was guessing passwords sequentially, one at a time. In reality, attackers can test hundreds or thousands of candidate passwords concurrently. With the right hardware and software, an attacker could crack tens of thousands of passwords in mere hours. That’s partly due to the sophistication of modern computers, but it’s also partly our own fault. That powerful software exploits the common flaws we subconsciously inject into our passwords.

There are some ways to stay ahead of attackers, of course. We offered one method in a previous blog post. We suggested constructing a sentence (e.g., “I really want my passwords to be highly protected”) and selecting certain letters and symbols from that sentence to create a secure password. We gave the example “Irwmp2b^p!#1”, which is 12 characters long and draws from a set of 78 possible symbols.[9] Generally, that means there are 78^12 possible passwords we could construct that are 12 characters in length. As for entropy, our example password has something on the order of log2(78^12) ≈ 75.4 bits. If Eve were trying to guess that password at an impressive 1,000,000 passwords per second, it would still take her roughly 1.6 billion years to guess it![10]
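
Here’s that estimate as one last Python sketch (78 symbols, 12 characters, 1,000,000 guesses per second):

import math

# Estimate from above: a 12-character password drawn from an alphabet of
# 78 symbols, guessed at 1,000,000 guesses per second.
alphabet_size = 78
length = 12
guesses_per_second = 1_000_000

possibilities = alphabet_size ** length                    # 78**12
bits = length * math.log2(alphabet_size)                   # ~75.4 bits
years = possibilities / guesses_per_second / (60 * 60 * 24 * 365.25)

print(f"{bits:.1f} bits of entropy, roughly {years:,.0f} years to exhaust")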

The point here is that making secure passwords isn’t that hard. There’s really no excuse for anyone successfully guessing your password if you take the proper precautions. While in this blog post we saw how mathematically easy it can be to guess bad passwords, we didn’t see how this can be done computationally. Next time, we’ll explore one of the most common methods attackers use to crack passwords: rainbow tables.



[1] Check out this explanation for a bit more detail on this specific comic.

[2] As we now know, entropy is measured in bits. We also know that counts of possibilities are expressed as powers of two, since a bit can represent either the value zero or one. For example, 2^3 corresponds to 3 bits being represented; for our purposes, we say there are 3 bits of entropy. Sometimes we need to go the other direction. That is, sometimes we need to calculate the bits of entropy by using the binary logarithm. For example, say we have 10 possible digits (0-9) to choose from, but we want to know the entropy. We simply calculate the binary logarithm: log2(10) ≈ 3. In other words, 10 possibilities is approximately 2^3, which gives us roughly 3 bits of entropy.
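
In Python, for example, the math.log2 function handles this conversion:

import math

# Converting a count of possibilities into bits of entropy, and back.
print(math.log2(10))   # ~3.32 bits of entropy for 10 possible digits
print(2 ** 3)          # 3 bits of entropy covers 8 possibilities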

[3] That’s actually a really low number, but it’s a nice round number that makes the example easier to follow.

[4] That means there are 268,435,456 (i.e., 2^28) possibilities.

[5] There are four words in the password “correcthorsebatterystaple”. If each has an entropy of 11 bits, then we can add to get 11 + 11 + 11 + 11 = 44. Or, more succinctly, 11 * 4 = 44 bits of entropy.

[6] Here’s a mathematical generalization of the formula used in the xkcd comic. Assume we need to use a dictionary or vocabulary of size 2^a. Also assume we want our password to consist of b words randomly selected from that list. Finally, assume we want to achieve at least x bits of entropy for our password. With these assumptions, we can construct the formula (2^a)^b ≥ 2^x, or more simply 2^(ab) ≥ 2^x. If we take the easier case, we see that 2^(ab) = 2^x, which means ab = x, or a = x / b. So, to achieve x bits of entropy with b words, we need a dictionary of at least size 2^(x/b) words. For example, if we want at least 44 bits of entropy using a four-part password, we would have x = 44 and b = 4. We can solve: a = x / b = 44 / 4 = 11. So, we need a dictionary of at least size 2^11 = 2048 words. Neat!
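
As a quick illustration, here’s that formula in Python (the helper name is just for this sketch):

import math

# Dictionary size needed to reach x bits of entropy with a b-word passphrase:
# at least 2**(x / b) words.
def required_dictionary_size(target_bits, words):
    return math.ceil(2 ** (target_bits / words))

print(required_dictionary_size(44, 4))   # 2048 words, as in the comic
print(required_dictionary_size(60, 4))   # 32768 words for 60 bits with four words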

[7] Suppose we use the entire contents of the Oxford English Dictionary. There are roughly 171,476 words in current use. That means each word would have log2(171476) ≈ 17.4 bits of entropy. If we keep our four-part passphrase convention, that means we would have 17.4 * 4 = 69.6 bits of entropy. It would take over 28 billion years to guess!

[8] In the interest of pedantry, it’s worth noting here that Eve most likely would be performing her attack offline—that is, guessing passwords against saved password hashes. Most well-constructed password systems have built-in safeguards to stymie attackers. That’s why we still have four-digit ATM PINs and short passcodes on our smartphones, for example. Those protections generally don’t apply if an attacker has stolen a database of password hashes, however.

[9] That’s 52 alphabetic characters (uppercase and lowercase), 10 digits, and 16 special symbols.

[10] The observant reader will note that we still rely on the implicit assumption that Eve is guessing passwords sequentially. It’s hard to estimate how much she’d improve if she were using special hardware and software, but it’s reasonable to assume her odds would improve quite a bit.