How do we measure speed in computation?
- Floating-Point Operations per Second (FLOPS)
    - Count the number of floating-point calculations (arithmetic operations) performed per second.
    - Not MIPS (millions of instructions per second), since MIPS also counts non-arithmetic instructions such as data movement or conditional branches.
$\text{FLOPS} = \text{sockets} \times \frac{\text{cores}}{\text{socket}} \times \frac{\text{cycles}}{\text{second}} \times \frac{\text{FLOPs}}{\text{cycle}}$
- MFLOPS (megaFLOPS) = 1,000,000 FLOPS
- GFLOPS (gigaFLOPS) = 1,000,000,000 FLOPS
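The peak-FLOPS formula above can be sketched directly in code. The hardware numbers below (a 2-socket machine with 8 cores per socket, a 2.5 GHz clock, and 16 FLOPs per cycle via SIMD units) are hypothetical illustrations, not measurements of any real CPU:

```python
def peak_flops(sockets, cores_per_socket, cycles_per_second, flops_per_cycle):
    """Theoretical peak FLOPS of a machine, per the formula above."""
    return sockets * cores_per_socket * cycles_per_second * flops_per_cycle

# Hypothetical example: 2 sockets x 8 cores x 2.5 GHz x 16 FLOPs/cycle
peak = peak_flops(sockets=2, cores_per_socket=8,
                  cycles_per_second=2.5e9, flops_per_cycle=16)
print(f"{peak:.2e} FLOPS = {peak / 1e9:.0f} GFLOPS")  # 640 GFLOPS
```

Note that this is a theoretical ceiling: real workloads rarely sustain it, because memory stalls and non-arithmetic instructions keep the floating-point units idle part of the time.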