Prevent /usr/lib/xorg/Xorg from using GPU Memory in Ubuntu 20.04 Server
On a fresh Ubuntu 20.04 Server machine with two Nvidia GPU cards and an i7-5930K CPU, running nvidia-smi shows that about 170 MiB of GPU memory is being used by /usr/lib/xorg/Xorg.
Since this system is used for deep learning, we would like to free up as much GPU memory as possible.
Question: How can we prevent GNOME and Xorg from taking up 179 MiB of GPU memory?
Output of nvidia-smi
Sat Oct 3 20:27:19 2020
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 450.66 Driver Version: 450.66 CUDA Version: 11.0 |
|-------------------------------+----------------------+----------------------+
| GPU Name Persistence-M| Bus-Id Disp.A | Volatile Uncorr. ECC |
| Fan Temp Perf Pwr:Usage/Cap| Memory-Usage | GPU-Util Compute M. |
| | | MIG M. |
|===============================+======================+======================|
| 0 GeForce GTX 1080 Off | 00000000:02:00.0 Off | N/A |
| 0% 54C P8 11W / 210W | 7MiB / 8119MiB | 0% Default |
| | | N/A |
+-------------------------------+----------------------+----------------------+
| 1 GeForce GTX 1080 Off | 00000000:03:00.0 Off | N/A |
| 0% 50C P8 10W / 210W | 179MiB / 8116MiB | 0% Default |
| | | N/A |
+-------------------------------+----------------------+----------------------+
+-----------------------------------------------------------------------------+
| Processes: |
| GPU GI CI PID Type Process name GPU Memory |
| ID ID Usage |
|=============================================================================|
| 0 N/A N/A 1109 G /usr/lib/xorg/Xorg 4MiB |
| 1 N/A N/A 1109 G /usr/lib/xorg/Xorg 166MiB |
| 1 N/A N/A 1189 G /usr/bin/gnome-shell 9MiB |
+-----------------------------------------------------------------------------+
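As a side note, one quick way to see which processes are holding the NVIDIA device nodes (assuming the usual /dev/nvidia* device files are present) is fuser from the psmisc package:
sudo fuser -v /dev/nvidia*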
I had the same issue. Many of the suggestions say to edit /etc/X11/xorg.conf, which doesn't exist for me on Ubuntu 20.04. I needed to keep X11 because I occasionally use X2go or X11 forwarding over SSH. I did manage to find the file /usr/share/X11/xorg.conf.d/10-nvidia.conf and commented out all of its lines. I was then able to restart X11 with:
sudo systemctl restart display-manager
Voilà, no more GNOME/X11 on GPU.
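For reference, the stock file on my system looked roughly like the sketch below (exact contents vary with the driver version); commenting every line out removes the OutputClass section that binds Xorg to the NVIDIA GPUs:
# /usr/share/X11/xorg.conf.d/10-nvidia.conf (approximate contents before commenting out)
Section "OutputClass"
    Identifier "nvidia"
    MatchDriver "nvidia-drm"
    Driver "nvidia"
    Option "AllowEmptyInitialConfiguration"
    ModulePath "/usr/lib/x86_64-linux-gnu/nvidia/xorg"
EndSection
After the restart, neither Xorg nor gnome-shell shows up in nvidia-smi anymore: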
$ nvidia-smi
Wed Feb 3 19:44:02 2021
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 450.102.04 Driver Version: 450.102.04 CUDA Version: 11.0 |
|-------------------------------+----------------------+----------------------+
| GPU Name Persistence-M| Bus-Id Disp.A | Volatile Uncorr. ECC |
| Fan Temp Perf Pwr:Usage/Cap| Memory-Usage | GPU-Util Compute M. |
| | | MIG M. |
|===============================+======================+======================|
| 0 GeForce RTX 208... Off | 00000000:01:00.0 Off | N/A |
| 27% 33C P8 1W / 250W | 882MiB / 11019MiB | 0% Default |
| | | N/A |
+-------------------------------+----------------------+----------------------+
| 1 GeForce GTX 1070 Off | 00000000:4D:00.0 Off | N/A |
| 0% 41C P8 10W / 151W | 2MiB / 8119MiB | 0% Default |
| | | N/A |
+-------------------------------+----------------------+----------------------+
+-----------------------------------------------------------------------------+
| Processes: |
| GPU GI CI PID Type Process name GPU Memory |
| ID ID Usage |
|=============================================================================|
| 0 N/A N/A 1075324 C python 879MiB |
+-----------------------------------------------------------------------------+
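If you want to script the same change (for example on several machines), prefixing every line with a comment character can be done with sed; this is just a convenience sketch assuming the same file path, and it is worth keeping a backup:
sudo cp /usr/share/X11/xorg.conf.d/10-nvidia.conf /usr/share/X11/xorg.conf.d/10-nvidia.conf.bak
sudo sed -i 's/^/#/' /usr/share/X11/xorg.conf.d/10-nvidia.conf
sudo systemctl restart display-manager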
I have a setup with an Nvidia RTX 2080 GPU. I have observed that remoting into the machine via xrdp leads to Xorg using only 14 MiB of GPU memory. If a physical screen is connected, it consumes the memory as shown in the output posted above.
Edit: Even Chrome Remote Desktop leads to Xorg consuming just 14 MiB of GPU RAM.
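To check this on your own setup, per-GPU memory use can be queried directly with nvidia-smi's standard query flags, for example:
nvidia-smi --query-gpu=index,name,memory.used --format=csv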