What are Gigaflops?

Gigaflops measure a computer's performance, specifically its ability to calculate billions of floating-point operations per second. This metric is crucial in fields demanding high-speed data processing, like scientific simulations or complex 3D rendering. Understanding gigaflops helps us appreciate the power behind modern computing. How does this translate to advancements in technology and everyday applications? Let's examine the impact together.
Rachel Burkot

Gigaflops are a measure of computer speed: one gigaflop is one billion floating point operations per second (FLOPS). FLOPS, which serves as both a singular and a plural term, is used especially in fields that rely heavily on scientific floating point calculations. Floating point refers to a system of numerical representation in which a string of digits stands for a rational number; the point "floats" in the sense that it can be placed anywhere relative to the significant digits of the number.

Floating point notation lets programs handle very large and very small numbers easily. A floating point number is expressed as a basic number, called the mantissa (or significand), together with an exponent and a number base, or radix; the base is usually either two or ten. Floating point operations are carried out in a computer's floating point registers.
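The mantissa-and-exponent structure described above can be seen directly in code. As a minimal sketch, Python's standard `math.frexp` function splits a float into a base-2 significand and exponent (the values and variable names here are illustrative, not from the article):

```python
import math

# math.frexp(x) returns (m, e) such that x == m * 2**e,
# with 0.5 <= |m| < 1. Here the base (radix) is two.
x = 6.625
mantissa, exponent = math.frexp(x)
print(mantissa, exponent)  # 0.828125 3, since 0.828125 * 2**3 == 6.625
```

The same number could equally be written with a base of ten (e.g. 6.625 = 0.6625 × 10¹); hardware floating point units almost always use base two.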


A simple calculator needs only about ten FLOPS, so gigaflops are used to measure the speed of high-power computer systems. The fastest supercomputer as of its November 2008 expansion, the Cray XT Jaguar, operates at 1.64 petaflops; a petaflop is one quadrillion FLOPS. Everyday computer operations are usually measured in megaflops, which are one million FLOPS. As computer systems expand, however, technicians use such terms as gigaflops; teraflops, which are one trillion FLOPS; and even petaflops.
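The units above differ only by powers of one thousand, so converting between them is simple arithmetic. As an illustration, here is a small hypothetical helper (the function name and unit table are my own, not from the article) that labels a raw FLOPS figure with the largest fitting unit:

```python
# Unit table: each step is a factor of 1,000 over the previous one.
UNITS = [("petaflops", 1e15), ("teraflops", 1e12),
         ("gigaflops", 1e9), ("megaflops", 1e6)]

def describe_flops(flops):
    """Express a raw FLOPS count using the largest unit that fits."""
    for name, scale in UNITS:
        if flops >= scale:
            return f"{flops / scale:g} {name}"
    return f"{flops:g} FLOPS"

print(describe_flops(1.64e15))  # 1.64 petaflops  (the Jaguar figure)
print(describe_flops(2.5e9))    # 2.5 gigaflops
```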

Gigaflops are a good indicator of a computer's raw performance, but they should not be the only factor used to judge it, because they say nothing about integer calculations. Using FLOPS as a benchmark of overall computer speed is also not recommended, because a quoted figure often reflects only theoretical, single-precision floating point performance; code that depends on double-precision floating point performance would not be represented accurately. Only in the most specialized applications are floating point operations so numerous that gigaflops must be used as the unit.
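To see why precision matters to a FLOPS figure, one can time the same operation in single and double precision. The sketch below (using NumPy; the function name and element counts are illustrative assumptions) is nothing like a careful benchmark such as LINPACK or the SPEC suites — it only shows that measured throughput depends on the precision being tested:

```python
import time
import numpy as np

def rough_gflops(dtype, n=10_000_000):
    """Crudely estimate gigaflops for n multiplications at a given precision."""
    a = np.ones(n, dtype=dtype)
    b = np.full(n, 1.000001, dtype=dtype)
    start = time.perf_counter()
    _ = a * b  # n floating point multiplications
    elapsed = time.perf_counter() - start
    return n / elapsed / 1e9

# Single- and double-precision figures will generally differ,
# which is why a single quoted FLOPS number can mislead.
print("float32:", rough_gflops(np.float32))
print("float64:", rough_gflops(np.float64))
```

Results also vary with system load and with exactly which operations are counted, echoing the caveats in the surrounding text.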

Modern processors typically include a floating point unit (FPU), the part of the microprocessor responsible for floating point operations; a FLOPS figure is essentially the speed of the FPU. Factors that a raw FLOPS measurement fails to capture include whether the microprocessor is running under a heavy or light load and exactly which operations are counted as floating point operations.

The Standard Performance Evaluation Corporation (SPEC) is a nonprofit corporation founded by technicians interested in creating standardized benchmark tests for measuring computer performance. Its tests are intensive measures of such factors as integer performance and floating point performance.
