What is 'Gbytes seconds'?

I couldn't find better documentation than the man page where that description appears. I think 1 GByte second is 1 GByte of memory used for one second. So if your code uses 1 GB for one minute and then 2 GB for two minutes, the accumulated memory usage is 1*60 + 2*120 = 300 GByte seconds.
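
As a quick illustration of that arithmetic (a hypothetical sketch, not output from any particular tool), you can accumulate the product of memory and duration over each phase of a job:

```python
# Hypothetical sketch: accumulate GByte-seconds over the phases of a job.
# Each phase is (memory_in_GB, duration_in_seconds); the values mirror the example above.
phases = [(1, 60), (2, 120)]  # 1 GB for one minute, then 2 GB for two minutes

gbyte_seconds = sum(mem_gb * seconds for mem_gb, seconds in phases)
print(gbyte_seconds)  # 1*60 + 2*120 = 300
```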


The gigabyte-second unit expresses the amount of memory allocated to a task multiplied by the number of seconds that task runs.

Example:

  • Task 1: 1 GB * 10 seconds = 10 GB-sec

  • Task 2: 128 MB * 10 seconds ≈ 0.128 GB * 10 seconds = 1.28 GB-sec

  • Task 3: 8 GB * 10 seconds = 80 GB-sec
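
A minimal sketch of the same calculation (the helper name and the decimal MB-to-GB conversion are just illustrative assumptions):

```python
# Illustrative helper: memory (in GB) times runtime (in seconds) gives GB-sec.
# 128 MB is treated as 0.128 GB here, matching the decimal conversion in the list above.
def gb_seconds(memory_gb: float, seconds: float) -> float:
    return memory_gb * seconds

print(gb_seconds(1.0, 10))    # Task 1: 1 GB * 10 s    -> 10 GB-sec
print(gb_seconds(0.128, 10))  # Task 2: 0.128 GB * 10 s ≈ 1.28 GB-sec
print(gb_seconds(8.0, 10))    # Task 3: 8 GB * 10 s    -> 80 GB-sec
```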

In practice it is often used as a DevOps metric or as a unit for billing services that are designed to run for short periods and then terminate, such as serverless function offerings on PaaS clouds.