Astronomy and cosmology involve some big numbers: a hundred million miles to the sun; six trillion miles in a light-year; two million light-years — ten billion billion miles, a one with 19 zeros after it — to the nearest galaxy. These are huge numbers, and it’s hard to get your head around them directly. But with a little bit of work, it’s not too bad. For example, a million isn’t actually that big of a number: get a cube of something small (marbles? BBs?) with a hundred objects on each side, and there are a million of those objects in that cube. Get a thousand of those cubes — a bigger cube, with ten of the smaller cubes on each side — and you’ve got a billion. A million seconds is only 11 and a half days; a billion seconds is 31 and a half years.
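The arithmetic in that paragraph is easy to check for yourself; here is a quick sketch in Python, using nothing beyond the figures quoted above:

```python
# A cube with 100 marbles on each side holds a million marbles;
# a bigger cube made of 1,000 of those holds a billion.
marbles_in_cube = 100 ** 3                          # 1,000,000
marbles_in_big_cube = 10 ** 3 * marbles_in_cube     # 1,000,000,000

# Seconds into days and years.
seconds_per_day = 60 * 60 * 24
print(1_000_000 / seconds_per_day)                  # ~11.6 days in a million seconds
print(1_000_000_000 / (seconds_per_day * 365.25))   # ~31.7 years in a billion seconds
```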
Another common way of making these numbers understandable is by making an analogy to something on a smaller scale: if the distance from the Earth to the Sun is one foot, then the distance from the Sun to Neptune is 30 feet, and the distance to the nearest star is 50 miles. Carl Sagan used the same trick with time; his cosmic calendar shrinks the lifetime of the universe down to one year — the Big Bang is on January 1st, the solar system forms sometime in early September, dinosaurs show up around Christmas and go extinct a few days later, and all of recorded history happens in the last ten seconds before midnight.
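The same division trick is easy to play with in code. Here is a minimal sketch in Python; the round figures used (93 million miles to the sun, about six trillion miles per light-year, 13.8 billion years since the Big Bang, and roughly 5,000 years of recorded history) are standard values I am assuming, some of them quoted elsewhere in the post:

```python
# Scale model: the Earth-Sun distance becomes one foot.
AU_MILES = 93e6                   # Earth-Sun distance, in miles (roughly)
LY_MILES = 5.88e12                # one light-year, in miles (roughly)

neptune_feet = 30                                 # Neptune orbits at about 30 AU
nearest_star_feet = 4.24 * LY_MILES / AU_MILES    # Proxima Centauri, ~268,000 AU
print(nearest_star_feet / 5280)                   # ~51 miles in the model

# Cosmic calendar: 13.8 billion years becomes one calendar year.
seconds_per_year = 365.25 * 24 * 3600
recorded_history_s = 5000 / 13.8e9 * seconds_per_year
print(recorded_history_s)                         # ~11 seconds before midnight
```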
These kinds of analogies are great for understanding big numbers. But there are other numbers, really big numbers, that these analogies are useless for. They come up all the time in statistical mechanics, a branch of physics that uses probability to study systems made of vast numbers of smaller things.[1] Here’s an example:
Say that you have a small room, about 8×8 feet, with a ceiling that’s seven feet high. Paint a narrow line down the middle, and then take everything out of the room — not just the furniture, but the air too, so you have a room with literally nothing in it. Now take a single molecule of oxygen, put it in that room, and wait a while. Let it bounce around in there. When you come back, what are the odds that you’ll find the molecule on the left-hand side of the line? Well, either it’s on the left-hand side or the right-hand side, and it’s just as likely to be on one side as the other, so the odds of it being on the left-hand side must be 1 in 2. It’s like flipping a coin — there are two possible outcomes, and they’re equally likely.
Let’s make things interesting, and put a second oxygen molecule in there, and wait a while again. What are the odds that both molecules will be on the left-hand side of the room when you look? Now it’s like flipping two coins, and we want to know how likely it is that both of them came up tails.[2] And again, the answer’s not hard — there are four possible outcomes: both tails, both heads, tails-heads, and heads-tails, so the odds must be 1 in 4. What if we’ve got three molecules in there? Flip three coins: the odds are 1 in 8 that they’ll all come up tails. The number of outcomes is always 2 × 2 × ⋯ × 2, with the same number of twos as we have molecules (or coins), so the odds are always 1 in 2^N, where N is the number of molecules.
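That 1-in-2^N pattern is easy to check directly for small N. Here is a small Python sketch of the same idealized model used in the text (each molecule independently lands on either side with equal probability):

```python
from fractions import Fraction
import random

def odds_all_left(n):
    """Probability that all n molecules are on the left half: (1/2)^n."""
    return Fraction(1, 2) ** n

print(odds_all_left(1), odds_all_left(2), odds_all_left(3))   # 1/2, 1/4, 1/8

# Sanity check by brute-force simulation for a small n.
def simulate(n, trials=100_000):
    hits = sum(all(random.random() < 0.5 for _ in range(n)) for _ in range(trials))
    return hits / trials

print(simulate(3))   # hovers around 0.125
```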
The punchline: if we let all of the air back into the room, and then wait a while, what are the odds that we’ll find all of the molecules over on the left-hand side of the room?[3] Well, still 1 in 2^N — but what’s N now? How many molecules are there in the air in that room? For a room of that size, the number of molecules in the air is pretty huge — about 3 × 10^26, or a 3 with 26 zeros after it.[4] So the odds of finding all of the air in the room on the left-hand side are one in 2^(3 × 10^26), or about one in 10^(10^26). This is a very big number — it’s a one with 10^26 zeros after it. If I were to use the molecules in the air in the room to write out the zeros, I’d need about one third of all of them to be able to write this number.[5] But that’s not a wonderful way to express how truly vast this number is — at best, that just gives a very vague idea of how hard it would be to write out the number. Can we do better?
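For the curious, here is roughly where that 3-with-26-zeros figure comes from: a sketch using the ideal-gas law, with ordinary room conditions (1 atmosphere, about 20 °C) assumed on my part rather than stated in the post:

```python
import math

# Count the air molecules in an 8 ft x 8 ft room with a 7 ft ceiling.
volume_m3 = 8 * 8 * 7 * 0.0283168        # cubic feet to cubic meters, ~12.7 m^3
pressure_pa = 101_325                     # 1 atm
temperature_k = 293                       # ~20 C
k_boltzmann = 1.380649e-23                # J/K

n_molecules = pressure_pa * volume_m3 / (k_boltzmann * temperature_k)
print(f"{n_molecules:.1e} molecules")           # ~3e26
print(f"{n_molecules / 6.022e23:.0f} moles")    # ~500 moles

# 2^N is far too big to compute, but its number of digits is not:
zeros = n_molecules * math.log10(2)
print(f"odds are about 1 in 10^{zeros:.1e}")    # ~1 in 10^(1e26)
```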
Not really. As I mentioned before, we can’t come up with scale analogies for it at all. Why not? Scale analogies are based on division. To get Sagan’s cosmic calendar, we divide time by a big number to shrink everything down — 14 billion years becomes one year. To understand the distance to the sun, we divide it by an everyday speed — 60 miles per hour — to find that it would take two hundred years to get to the sun at that speed. But a really big number like 10^(10^26) is so big that making it millions or billions of times smaller doesn’t bring it down to anything we can remotely understand. If we divide 10^(10^26) by a billion, we get 10^(10^26 − 9); that’s a one with 10^26 − 9 zeros after it, which is still just about the same amount as we had before.
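One way to see how feeble division is here: work with the exponent, the count of zeros, rather than the number itself. A tiny Python sketch (Python handles 10^26 as an exact integer):

```python
zeros = 10**26          # 10^(10^26) has this many zeros after the one
after = zeros - 9       # dividing the big number by a billion knocks off 9 zeros

print(zeros - after)    # 9 zeros gone...
print(after / zeros)    # ...out of 10^26 of them: prints 1.0, the change is below machine precision
```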
How can we understand the size of this number? We can try to find new ways to express just how difficult it would be to write it out. For example, if we used pennies as our zeros for this number, we’d end up with a pile of pennies that weighs nearly half as much as Mars. But that’s not terribly satisfying either — knowing how hard it is to write out a number isn’t the same as understanding just how phenomenally huge it is. The only other approach I can think of is to try to cook up another situation that involves a similarly big number, and compare that to our original case. Go back to that pile of pennies, and make it three times bigger (one-and-a-quarter times the mass of Mars, or one-eighth the mass of Earth). Now you’ve got as many pennies as there were molecules in that room. Flip each of them once. That’s already hard — if you flipped them superhumanly fast, a thousand pennies a second, it’d still take you about ten million billion years to flip all of them — but say you can do it. The odds of all of them coming up tails are about one in 10^(10^26).
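The penny arithmetic is quick to reproduce. A rough sketch, using standard reference values for the masses (about 2.5 grams per penny, 6.4 × 10^23 kg for Mars, 6.0 × 10^24 kg for Earth), none of which are given explicitly in the post:

```python
PENNY_KG = 0.0025
MARS_KG = 6.42e23
EARTH_KG = 5.97e24

zeros = 1e26                             # one penny per zero
print(zeros * PENNY_KG / MARS_KG)        # ~0.4, a bit under half of Mars

molecules = 3e26                         # the pile made three times bigger
print(molecules * PENNY_KG / MARS_KG)    # ~1.2 times Mars
print(molecules * PENNY_KG / EARTH_KG)   # ~1/8 of Earth

seconds = molecules / 1000               # flipping a thousand pennies a second
print(f"{seconds / 3.156e7:.1e} years to flip them all")   # ~1e16 years
```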
Does that help? It doesn’t particularly help me. To me, the number 10^(10^26) is literally mind-numbing; I can’t feel its size very well.[6] (If you can think of a better way to get a handle on this kind of number, please post it in the comments!) But those really long odds, 1 in 10^(10^26), are the main reason that all the air in this room doesn’t all end up on one side. So it’s theoretically possible that you’d see it happen — but I wouldn’t hold my breath.
1. If that sounds vague, that’s because it is — statistical mechanics is incredibly general, and it governs everything from puddles of water (made of molecules) to galaxies (made of stars and gas and dark matter) to DNA (made of nucleotides, sugars, &c.). [↩]
2. We’re making a couple of assumptions here — that the molecules don’t attract or repel each other, and that they’re negligibly small. These turn out to be remarkably good assumptions for most gases at room temperature, and the jargon here is straight out of high school chemistry: we’re assuming that this is an ideal gas. [↩]
3. This isn’t impossible — after all, if we wanted to, we could put a big piston into the room and force all the air over to one side, so it’s physically possible for the air to be there. And air does just whiz about in a room, generally, so it’s possible that it could all happen to be on one side. Hopefully it’s on the same side that you’re in (though as we’re about to see, you don’t need to worry about that). [↩]
4. About 500 moles, for those of you keeping score at home. [↩]
5. Fun fact: this ratio of one-third holds true no matter how big the room is, as long as it’s somewhere around normal room temperature. (The number of zeros is N × log10(2), or about 0.3 N, so it’s always roughly a third of the number of molecules.) [↩]
6. There are, of course, even bigger numbers, though they don’t come up much in the physical sciences; Graham’s number puts this one to shame. EDIT: My friend Ben has also pointed out this excellent essay by Scott Aaronson on even bigger numbers. [↩]
I went to this art exhibit long ago that sort of reminds me of this. They used one grain of rice to represent one person for all sorts of interesting statistics.
This shows some good clear pictures: http://chapter5section2.blogspot.com/2008/09/of-all-people-in-world.html
You can also easily google “Of All the People in the world” to see more.
Aaronson’s essay covers it better than I could express, but I’m fascinated by the conceptual trap of big numbers. Our brains can’t place such huge numbers exactly, so they revert to spatial approximations only to be stonewalled again because they have no grasp of spatial ideas on that scale. So, uncomfortably, we return to expressing them with verbal language where even though we understand the characters involved in the expression (Not 10, nor 26, nor any number of superscripts is fundamentally alien to me), the overall idea the characters are expressing is utterly lost.
Similarly, “Anikwasadra” could be a word, or a place, or the name of your dog that I’ve met, but even though I know all the letters, sounds, and rules for expressing the characters in “Anikwasadra” I can’t firmly link it to an idea, and thus, it’s gibberish to me.
I’ve been struggling for a way around this problem ever since I first ran into it in calculus. Unsurprisingly, calculus up to three dimensions was a relatively simple affair for me. I understood it, could apply it, and understood its immediate impact on my life. Then as soon as we started on N-dimensional calculus, I lost my footing, and I was furious! What happened? The rules remained the same, simply appending “w” where once there was only “x”, “y”, and “z”, and so on and so forth, and yet without fail, my math was always much shakier once we pushed calculus into the 4th, 5th, and nth dimension. I’d run into the same problem you have with large numbers: it was simply mind-numbing. Once I lost my spatial approximation as a point of reference, expressions became nebulous abstract ideas, even though those abstracts were governed by the same rules of calculus I’d already become familiar with.
Aaronson touches on the value of finding better paradigms to express and understand increasingly large numbers, as a means of expressing abstract ideas, but I think you’ve touched on the layman’s best hope for comfortably comprehending such large numbers:
We need to anchor it to more solid, spatially imaginable ideas. While converting zeroes into pennies and then imagining the number of pennies you have as a spatial concept you can grasp might not be terribly satisfying (unsurprising, given its one-off nature), it is, at least, the beginning of touching the shores of the unknowable. And without at least that, you’re doomed to flounder helplessly against these abstract ideas. That, and get an embarrassing C- on an exam in a class you were previously getting an A in.
I can only think of two things that MIGHT (probably won’t) help.
Option 1 – Break em Into Pieces and Cram the Suckers Down
Designate a number that can be managed relatively easily. Then, combine that number of familiar objects with progressively larger objects and imagine yourself flying upwards away from it.
You can fit N mites on an eyelash
You can fit N eyelashes in a swimming pool
You can fit N swimming pools on Earth
You can fit N Earths in the sun
You can fit N suns in the solar system… etc
I imagine you could get pretty big that way, and with each “layer” being ontologically different from the rest, perhaps you could wrap your head around them individually and end up with a better understanding of the holistic model as you slowly work your way up…. flying…. from the mites or whatever..
Option 2 – Different Strokes
Could you use different senses to try communicating the idea? Since a large part of understanding spatial dimensions is based on vision, perhaps trying to understand the numbers through touch or hearing might work. Again, find the largest manageable number available, and start small. Perhaps it could work like this:
Manageable number (N) = a summer breeze
N to the nth power = a pin prick
N to the nth to the nth power is a punch from Mike Tyson
N to the nth-nth-nth is an atom bomb… etc
Or what about combining senses? Could you use a 10″ speaker system with a 1000W amplifier and describe how many you would have to stack on top of each other?
I haven’t worked with these kinds of numbers, so I’m not sure if any of this is at all possible.
Ryan — I really like your ideas, especially the idea about combining senses. That really might work, especially for sound, because of the strange way that we perceive the relative volume of sounds. (We perceive sound on a logarithmic scale, not a linear one.) Unfortunately, your other idea about breaking things down and scaling up won’t work — it’s multiplication all over again, and we need something more powerful than multiplication to get our heads around these numbers, as Aaronson’s essay points out very nicely. But I’m definitely going to think about how to use sound here…I’ll let you know if I come up with anything.
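To put a rough number on why multiplication falls short, here is a little Python sketch, assuming (as in your mites-to-suns ladder) that each layer multiplies the count by about a million:

```python
# Each million-fold layer adds 6 to the exponent (6 more zeros).
target_zeros = 10**26          # 10^(10^26) has 1e26 zeros after the one
layers_needed = target_zeros / 6
print(f"{layers_needed:.1e} layers")   # ~1.7e25 layers needed
```

So the ladder just moves the problem: instead of an unimaginable number of molecules, you end up with an unimaginable number of layers.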