What is FLOPS in field of deep learning?

I'm not sure my answer is 100% correct, but this is what I understand.

  • FLOPS = Floating point operations per second

  • FLOPs = Floating point operations

FLOPS is a unit of speed. FLOPs is a unit of amount.
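To make the distinction concrete, here is a tiny sketch. The layer size and the 10 TFLOPS figure are made-up illustrative assumptions, not measurements of any real model or GPU:

```python
# FLOPs (a count of work) vs FLOPS (a rate of work per second).

def dense_layer_flops(in_features: int, out_features: int) -> int:
    """FLOPs for one forward pass of a fully connected layer:
    each output needs in_features multiplies and in_features adds
    (including the bias add)."""
    return out_features * 2 * in_features

model_flops = dense_layer_flops(4096, 4096)   # FLOPs: an amount
hardware_flops_per_sec = 10e12                # FLOPS: assumed 10 TFLOPS GPU

seconds = model_flops / hardware_flops_per_sec
print(f"{model_flops:,} FLOPs / {hardware_flops_per_sec:.0e} FLOPS "
      f"= {seconds:.2e} s")
```

Dividing an amount (FLOPs) by a rate (FLOPS) gives a time, which is exactly how back-of-the-envelope training-time estimates work.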


Confusingly, both FLOPs (floating point operations) and FLOPS (floating point operations per second) are used in reference to machine learning. FLOPs is often used to describe how many operations are required to run a single instance of a given model, like VGG19. This is the usage of FLOPs in both of the links you posted, though unfortunately the opengenus link mistakenly expands FLOPs as 'floating point operations per second'.

You will see FLOPS used to describe the computing power of given hardware, like GPUs, which is useful when thinking about how powerful a given piece of hardware is, or conversely, how long it may take to train a model on that hardware.

Sometimes people write FLOPS when they mean FLOPs. It is usually clear from the context which one they mean.