How to check GPU usage on an AWS EC2 GPU instance?
Use:
nvidia-smi -h
to see the options. The -d (--display) argument restricts the query output to selected sections: MEMORY, UTILIZATION, ECC, TEMPERATURE, POWER, CLOCK, COMPUTE, PIDS, PERFORMANCE, SUPPORTED_CLOCKS, PAGE_RETIREMENT, ACCOUNTING, ENCODER_STATS.
Example:
nvidia-smi -i 0 -l -q -d UTILIZATION
Here -i 0 selects the first GPU, -q prints the long-form query output, -d UTILIZATION limits it to the utilization sections, and -l repeats the query until you stop it.
The output is something like:
==============NVSMI LOG==============

Timestamp                           : Thu Apr 11 03:48:37 2019
Driver Version                      : 384.183
CUDA Version                        : 9.0

Attached GPUs                       : 1
GPU 00000000:00:1E.0
    Utilization
        Gpu                         : 9 %
        Memory                      : 11 %
        Encoder                     : 0 %
        Decoder                     : 0 %
    GPU Utilization Samples
        Duration                    : 18446744073709.22 sec
        Number of Samples           : 99
        Max                         : 10 %
        Min                         : 0 %
        Avg                         : 0 %
    Memory Utilization Samples
        Duration                    : 18446744073709.22 sec
        Number of Samples           : 99
        Max                         : 14 %
        Min                         : 0 %
        Avg                         : 0 %
    ENC Utilization Samples
        Duration                    : 18446744073709.22 sec
        Number of Samples           : 99
        Max                         : 0 %
        Min                         : 0 %
        Avg                         : 0 %
    DEC Utilization Samples
        Duration                    : 18446744073709.22 sec
        Number of Samples           : 99
        Max                         : 0 %
        Min                         : 0 %
        Avg                         : 0 %
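If you want something easier to parse or plot than the -q output, nvidia-smi also has a CSV query mode on reasonably recent drivers. Below is a minimal sketch that samples GPU and memory utilization once per second and appends it to a file; the file name gpu_util.csv is just an example, not anything standard:

#!/usr/bin/env bash
# Append one CSV row per GPU per second: timestamp, GPU index,
# GPU utilization and memory utilization.
nvidia-smi \
    --query-gpu=timestamp,index,utilization.gpu,utilization.memory \
    --format=csv,noheader \
    -l 1 >> gpu_util.csv

Run nvidia-smi --help-query-gpu to see the full list of fields you can query.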
Is this NVIDIA gear? If so, try nvidia-smi -i 3 -l -q -d
to see GPU and memory utilization statistics (among other info); -i 3 selects the GPU by index, so use the index of the GPU you actually want to watch.
Note that this only works with 1) old NVIDIA drivers (18x.xx), or 2) NVIDIA Tesla GPUs.
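If you only need a live view rather than a log, two common alternatives (assuming a reasonably recent driver; neither is specific to EC2):

# Refresh the standard nvidia-smi summary table every second.
watch -n 1 nvidia-smi

# Or stream one line of per-GPU stats (SM/memory utilization, power, temperature) per second.
nvidia-smi dmon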