In information theory, entropy is a measure of the uncertainty associated with a random variable.[1] In this context, the term usually refers to the Shannon entropy, which quantifies the expected value of the information contained in a message,[2] usually in units such as bits.[3] Rather than stating the number of guesses needed to find a password with certainty, one gives the base-2 logarithm of that number, which is the number of "entropy bits" the password has.
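For example, a password drawn uniformly at random from N equally likely possibilities has log2(N) bits of entropy. The following is a minimal sketch in Python of that calculation; the alphabet sizes shown are illustrative assumptions, not figures from the text above:

```python
import math

def entropy_bits(symbol_count: int, length: int) -> float:
    """Entropy in bits of a password of `length` symbols, each chosen
    uniformly and independently from an alphabet of `symbol_count` symbols.
    There are N = symbol_count ** length possibilities, so H = log2(N)
    = length * log2(symbol_count)."""
    return length * math.log2(symbol_count)

# Illustrative alphabet sizes (assumed for this example only):
print(entropy_bits(26, 8))   # 8 lowercase letters   -> ~37.6 bits
print(entropy_bits(95, 8))   # 8 printable ASCII     -> ~52.6 bits
```

Note that this formula applies only when every symbol is chosen uniformly at random; passwords chosen by humans have lower entropy than their length and alphabet would suggest.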