With graphics cards (GPUs) and processors (CPUs), one often hears the term gigaflops, or the abbreviation GFLOPS.
Gigaflops is a unit of measurement
used to measure the performance of a computer's floating-point unit, commonly referred to as an FPU. One gigaflop corresponds to one billion (1,000,000,000) FLOPS, that is, floating-point operations per second.
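To make the unit concrete, a theoretical peak gigaflops figure can be estimated from a chip's core count, clock speed, and floating-point operations per clock cycle. The sketch below uses hypothetical example numbers (4 cores, 3.0 GHz, 8 FLOPs per cycle); real values depend on the specific CPU and its vector units.

```python
# Back-of-the-envelope peak-GFLOPS estimate (all specs are assumed examples).
cores = 4               # number of CPU cores (hypothetical)
clock_ghz = 3.0         # clock speed in GHz = billions of cycles per second
flops_per_cycle = 8     # FLOPs each core can issue per cycle (hypothetical)

# GHz already counts in billions, so the result is directly in gigaflops.
peak_gflops = cores * clock_ghz * flops_per_cycle
print(peak_gflops)  # 96.0 gigaflops theoretical peak
```

Real-world throughput is usually well below this theoretical peak, since memory bandwidth and instruction mix rarely keep every FPU busy on every cycle.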
The masters of the gigaflops
are the GPUs. Even integrated GPUs that are built into the CPU, such as the Intel HD 4000, reach over 300 gigaflops, while the CPU itself only reaches 10-30 gigaflops depending on the model, i.e. more than ten times the CPU's floating-point throughput.
The term gigaflops
is very popular: Giga stands for one billion, and FLOPS is an acronym for "floating point operations per second". Because gigaflops measure how many billions of floating-point calculations a processor can perform per second, they serve as a good indicator of a processor's raw computing power. However, since they do not measure integer calculations, gigaflops cannot be used as a comprehensive measure of a processor's overall performance.
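The idea of "floating-point operations per second" can be illustrated with a minimal timing sketch: count the floating-point operations performed in a loop and divide by the elapsed time. Note that a pure-Python loop is far slower than compiled code, so the number this prints illustrates the calculation, not the hardware's actual capability.

```python
import time

def measure_gflops(n=5_000_000):
    """Naive FLOPS measurement: time n multiply-add steps (2 FLOPs each).
    Illustrative only -- interpreted Python overhead dominates, so the
    result is far below what the FPU can really do."""
    x = 1.000001
    acc = 0.0
    start = time.perf_counter()
    for _ in range(n):
        acc = acc * x + 1.0   # one multiply + one add = 2 FLOPs
    elapsed = time.perf_counter() - start
    flops = 2 * n / elapsed   # floating-point operations per second
    return flops / 1e9        # convert FLOPS to gigaflops

print(f"{measure_gflops():.4f} gigaflops (pure Python)")
```

A benchmark written in C with vectorized code, or one running on a GPU, would report values thousands of times higher, which is exactly the gap the article describes.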