Whether it’s the latest reveals about the powerful Microsoft Scorpio project for Xbox or the newest AMD GPU release, you may see the term “TFLOP” being tossed around. If you don’t really know what this means or why you should care about it, then you’ve come to the right place! Let’s talk through the term and why it can be important when talking about processors.

### Okay, what is a TFLOP?

TFLOP is a bit of shorthand for “teraflop,” which is a way of measuring the power of a computer based on mathematical capability rather than clock speed in GHz. A teraflop refers to the capability of a processor to calculate one trillion floating-point operations per second. Saying something has “6 TFLOPS,” for example, means that its processor setup is capable of handling up to 6 trillion floating-point calculations every second.

### Well, I’m already lost. What are floating-point calculations?

Floating-point calculations are a common way of gauging the computational power of computers. In fact, once FLOPS came into use, they quickly became a common international standard for talking about computer prowess.

As you may (or may not) remember from math classes, “real” numbers are the set of all numbers including integers, numbers with decimal points, irrational numbers like pi, and so on. Floating-point numbers are how computers approximate these real numbers using a limited number of digits, and a floating-point calculation is any calculation that works with them, particularly decimals. This is far more useful than looking at fixed-point or integer calculations alone, because the work that computers do frequently involves fractional values and all their real-world complications.

So, FLOPS measure how many calculations involving floating-point numbers a processor can perform in one second. There is a lot of variance in the FLOPS that various devices need. A traditional calculator, for example, may need only around 10 FLOPS for all its operations. So when we start talking about megaflops (a million floating-point calculations per second), gigaflops (a billion) and teraflops (a trillion), you can start to see what sort of power we’re talking about.
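To get a feel for the scale involved, here is a quick sketch that computes how long devices at different FLOPS ratings would take to grind through one trillion floating-point operations. The device ratings are illustrative assumptions for the tiers mentioned above, not measured figures.

```python
# How long does one trillion floating-point operations take at
# different sustained FLOPS ratings? (Illustrative numbers only.)

OPERATIONS = 1_000_000_000_000  # one trillion floating-point operations

devices = {
    "basic calculator (~10 FLOPS)": 10,
    "megaflop-class chip": 1_000_000,
    "gigaflop-class chip": 1_000_000_000,
    "6-TFLOPS processor": 6_000_000_000_000,
}

for name, flops in devices.items():
    seconds = OPERATIONS / flops
    print(f"{name}: {seconds:,.4f} seconds")
```

The calculator would need more than three thousand years for a workload that a 6-TFLOPS chip finishes in a fraction of a second, which is why the prefixes matter so much.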

Manufacturers frequently include FLOPS as a specification on computers so they can talk about how fast they are in a universal way. However, if you have a custom-built machine and really want to brag about its teraflops, too, then there’s a pretty simple back-of-the-envelope formula you can use: multiply the number of cores by the clock speed (in cycles per second) and by the floating-point operations each core can perform per cycle, then divide by one trillion to get TFLOPS.
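That back-of-the-envelope formula can be sketched in a few lines. The specs below describe a hypothetical GPU, and the factor of 2 assumes each core can issue one fused multiply-add (counted as two floating-point operations) per cycle, which is a common convention in these calculations.

```python
def peak_tflops(cores: int, clock_ghz: float, flops_per_cycle: int) -> float:
    """Theoretical peak: cores x clock (Hz) x FLOPs per cycle, in teraflops."""
    return cores * clock_ghz * 1e9 * flops_per_cycle / 1e12

# Hypothetical GPU: 2,304 shader cores at 1.3 GHz, 2 FLOPs per cycle
# (one fused multiply-add counts as two floating-point operations).
print(peak_tflops(2304, 1.3, 2))  # prints 5.9904, i.e. roughly 6 TFLOPS
```

Keep in mind this gives a *theoretical peak*; real workloads rarely keep every core busy every cycle.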

### Wait, I’ve seen teraflop data on GPUs. Does that mean TFLOPS create graphics, too?

It’s all basically the same thing. What we see as computer graphics are a massive number of polygons being rendered and moved into different positions on the screen. But the processor doesn’t draw little pictures; it uses mathematics to describe the shape, characteristics and placement of these polygons. You can think of each angle and position as its own little floating-point calculation, because that’s exactly how the processor sees it.
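As a toy illustration of that idea, moving a single polygon vertex is nothing more than floating-point arithmetic. Rotating one 2D point takes a handful of multiplications and additions, and a full scene repeats this kind of math for millions of vertices every frame.

```python
import math

def rotate(x: float, y: float, degrees: float) -> tuple[float, float]:
    """Rotate a 2D point around the origin: four multiplies, two adds."""
    theta = math.radians(degrees)
    return (x * math.cos(theta) - y * math.sin(theta),
            x * math.sin(theta) + y * math.cos(theta))

# One vertex of a polygon, rotated a quarter turn.
print(rotate(1.0, 0.0, 90.0))  # approximately (0.0, 1.0)
```

Multiply those few operations by every vertex, every lighting calculation and every frame per second, and the trillions of operations behind a TFLOPS rating start to add up quickly.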

### So, more TFLOPS means faster devices and better graphics?

Often, but not always. In fact, we have seen some GPUs with more teraflops that perform *worse* than those with fewer TFLOPS. For a general analogy, consider wattage. A floodlight and a spotlight may use the same wattage, but they behave in very different ways and have different levels of brightness. Likewise, real-world performance depends on things like the structure of the processor, frame buffers, core speed, and other important specifications.

But yes, as a guideline, more TFLOPS should mean faster devices and better graphics. And that’s actually an impressive sign of growth. It was only several years ago that consumer devices couldn’t even approach the TFLOP level, and now we’re talking casually about devices having 6 to 11 TFLOPS without thinking twice. In the world of supercomputers, it’s even more impressive. Researchers who compare specs are now discussing supercomputers that have more than 100 petaflops, and a petaflop is a *thousand* teraflops.

### Bottom line: What should I think when I see the TFLOP specs?

Think of this as the equivalent of a dude talking about how much he can lift in the gym. If a ripped dude says he can bench 300 pounds, then it’s a good bet he can bench a lot, even if the actual figure has been fudged a bit for self-serving purposes.

But a weight-lifting championship isn’t going to take contestants at their word, and you should practice the same skepticism when looking at devices and quoted TFLOPs. Ultimately, it’s real-world performance that matters most.
