A top-like utility for monitoring CUDA activity on a GPU

To get real-time insight into the resources being used, run:

nvidia-smi -l 1

This will loop and refresh the view every second.
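If you want something easier to log or parse than the full dashboard, nvidia-smi also has a query/CSV mode (the exact fields available depend on your driver version), for example:

nvidia-smi --query-gpu=timestamp,utilization.gpu,utilization.memory,memory.used,memory.total --format=csv -l 1

Redirect that to a file if you want to keep a utilization trace over time.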

If you do not want the output of each iteration to pile up in your console scrollback, you can instead do:

watch -n0.1 nvidia-smi

Where 0.1 is the time interval, in seconds.
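If you want changes to stand out between refreshes, watch can also highlight differences from the previous update:

watch -d -n 0.5 nvidia-smi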



I find gpustat very useful. It can be installed with pip install gpustat, and it prints a breakdown of usage by process or user.
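Depending on the version you have installed, gpustat can also refresh in place and show the owning command and PID, along the lines of:

gpustat -cp -i 1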



I'm not aware of anything that combines this information, but you can use the nvidia-smi tool to get the raw data, like so (thanks to @jmsu for the tip on -l):

$ nvidia-smi -q -g 0 -d UTILIZATION -l

==============NVSMI LOG==============

Timestamp                       : Tue Nov 22 11:50:05 2011

Driver Version                  : 275.19

Attached GPUs                   : 2

GPU 0:1:0
    Utilization
        Gpu                     : 0 %
        Memory                  : 0 %
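If you would rather read these counters from your own code, the NVML Python bindings expose the same data; here is a minimal sketch, assuming the pynvml package is available (pip install nvidia-ml-py or pip install pynvml):

import pynvml

# Query per-GPU core/memory utilization and memory usage via NVML,
# the same library nvidia-smi itself is built on.
pynvml.nvmlInit()
for i in range(pynvml.nvmlDeviceGetCount()):
    handle = pynvml.nvmlDeviceGetHandleByIndex(i)
    util = pynvml.nvmlDeviceGetUtilizationRates(handle)  # percentages
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)         # bytes
    print(f"GPU {i}: {util.gpu}% core, {util.memory}% memory, "
          f"{mem.used / 1024**2:.0f} / {mem.total / 1024**2:.0f} MiB")
pynvml.nvmlShutdown()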

Just use watch nvidia-smi; it refreshes the output every 2 seconds by default.


You can also use watch -n 5 nvidia-smi (-n 5 sets the refresh interval to 5 seconds).
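If your driver ships it, nvidia-smi also has a built-in streaming mode that prints one line of stats per device per interval, which avoids wrapping it in watch at all:

nvidia-smi dmon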