Flopping Fast


A gigaflop is a billion — one thousand millions — flops. A flop is a “floating point operation” — the “fl” from “floating point” and the “op” from “operation” — which is, basically, a single piece of arithmetic involving numbers with some digits before the decimal point and some after. (Take 340.202 and multiply it by -8.943, for example — calculating that would be one “flop.”) We use gigaflops per second as a way to benchmark how fast a computer’s processor is. Many current laptops are capable of roughly 50 gigaflops per second, so in the twenty or so seconds it took you to read those last few sentences, your computer could have done a trillion flops. (It isn’t actually performing all those calculations right now, but it could.) That’s a lot of math, done incredibly fast.
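If it helps to see the numbers worked out, here’s a minimal Python sketch of that arithmetic (the 50-gigaflop rate is just the rough laptop estimate from above, not a measured benchmark):

```python
one_flop = 340.202 * -8.943        # the example calculation: a single flop

GIGAFLOP = 1_000_000_000           # a gigaflop is a billion flops
rate = 50 * GIGAFLOP               # assumed processor speed: 50 gigaflops per second
reading_time_s = 20                # roughly how long the paragraph took to read

print(f"340.202 * -8.943 = {one_flop}")
print(f"Flops possible in {reading_time_s} seconds: {rate * reading_time_s:,}")
# -> 1,000,000,000,000 flops, i.e. a trillion
```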

But how fast? Here’s another way of looking at it — one which shows just how incredible our computers are.

Let’s say you’re on your laptop taking a math test — something from the fifth grade. You don’t get to use the computer’s calculator or any other automated way of doing the arithmetic, but there’s a pad and some pencils next to you so that you can calculate the answers by hand. The computer shows you 100 questions, each one of which is like the 340.202 * -8.943 example above. All the questions are the same type — lots of decimals — but not all are multiplication questions. There’s some division, some addition, and some subtraction. The test would take you, hopefully, about an hour. (Okay, maybe two.)

You’re sitting about two feet away from the screen — for ease of math, let’s say 23.6 inches, or about 0.4 inches short of that two-foot mark. It feels like the words are hitting your eyes instantaneously, but they’re not. The light has to travel from your computer screen to your eyes, and while that happens quickly, it does take some measurable amount of time — about two nanoseconds. (Yes, light travels about 11.8 inches per nanosecond.) A nanosecond is a billionth of a second, and typically, we ignore amounts of time that tiny.
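You can check that two-nanosecond figure yourself; here’s a quick sketch using the standard speed of light and the 23.6-inch distance assumed above:

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458   # metres per second
INCHES_PER_METRE = 39.3701

distance_inches = 23.6                 # screen-to-eyes distance from the example
distance_metres = distance_inches / INCHES_PER_METRE

travel_time_s = distance_metres / SPEED_OF_LIGHT_M_PER_S
print(f"Light travel time: {travel_time_s * 1e9:.2f} nanoseconds")  # ~2.00 ns
print(f"Inches per nanosecond: {INCHES_PER_METRE * SPEED_OF_LIGHT_M_PER_S / 1e9:.1f}")  # ~11.8
```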

But in this case, that super-tiny amount of time is important — because it shows just how fast the computer can complete that same math test. Again, each question on your test is one flop, and a gigaflop is a billion flops. At 50 gigaflops per second, the computer can therefore answer 50 such questions in a nanosecond — or complete all 100 in two nanoseconds. In other words, in the time it takes the questions to travel from your computer screen to your eyes, the computer is done with the test.
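Put side by side (again assuming the 50-gigaflops-per-second rate and the two-nanosecond light travel time from above), the two numbers come out the same:

```python
rate_flops_per_s = 50e9          # assumed: 50 gigaflops per second
questions = 100                  # one flop per question

compute_time_ns = questions / rate_flops_per_s * 1e9
light_travel_ns = 2.0            # from the light-travel calculation above

print(f"Time to answer all {questions} questions: {compute_time_ns:.0f} ns")
print(f"Time for the questions to reach your eyes: {light_travel_ns:.0f} ns")
# Both are about two nanoseconds: the computer finishes the test in the
# time it takes the light from the screen to reach you.
```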

Bonus Fact: In the first sentence above, “billion” is defined as “one thousand millions,” which, if you’re in the United States, probably seems unnecessary. It’s not. In the UK, a billion used to be one million millions, or what we in the U.S. would typically call a trillion. The definition changed around 1975.

From the Archives: The Buzzing Supercomputer: The computer is faster than you. So are the bees.

Related: Looking for some random numbers? Here’s a book of a million of them.

Thank you to @davewiner, @billyjoelismint, and @drmagoo on Twitter for helping explain what a floating point operation is.