Force integrated graphics

As of now, applications that use graphics tend to end up on the NVIDIA GPU:

$ nvidia-smi
Mon Jun 12 14:26:03 2023       
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 530.41.03              Driver Version: 530.41.03    CUDA Version: 12.1     |
|-----------------------------------------+----------------------+----------------------+
| GPU  Name                  Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf            Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                                         |                      |               MIG M. |
|=========================================+======================+======================|
|   0  NVIDIA GeForce RTX 3060         Off| 00000000:01:00.0  On |                  N/A |
|  0%   40C    P8               12W / 170W|   1284MiB / 12288MiB |     11%      Default |
|                                         |                      |                  N/A |
+-----------------------------------------+----------------------+----------------------+
                                                                                         
+---------------------------------------------------------------------------------------+
| Processes:                                                                            |
|  GPU   GI   CI        PID   Type   Process name                            GPU Memory |
|        ID   ID                                                             Usage      |
|=======================================================================================|
|    0   N/A  N/A      1299      G   /usr/lib/Xorg                               523MiB |
|    0   N/A  N/A      1391      G   /usr/bin/gnome-shell                        108MiB |
|    0   N/A  N/A      1592      G   /usr/bin/gnome-software                      20MiB |
|    0   N/A  N/A      1817      G   /usr/lib/xdg-desktop-portal-gnome            69MiB |
|    0   N/A  N/A      2794      G   /usr/bin/kitty                                3MiB |
|    0   N/A  N/A      8396    C+G   ...4175876,13528862819532032333,262144      552MiB |
+---------------------------------------------------------------------------------------+

I was thinking of a use case: keep the GPU's VRAM permanently free for machine-learning models. This should be possible, if a little unusual. Given the recent uptick in deep-learning interest on the internet, someone or other must have run into the same problem.
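For a quick read on how much headroom the desktop leaves, nvidia-smi's query flags (standard options) trim the output down to just the memory figures:

$ nvidia-smi --query-gpu=memory.used,memory.free,memory.total --format=csv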

While looking for an adaptation of an Ubuntu script for my freshly installed Arch Linux, I received the following pointers from the Matrix channel:

  1. hyprland-nvidia
  2. r/linux_gaming/…/possible_to_use_intel_igpu_on_wayland_but_nvidia
  3. gh/ewagner12/all-ways-egpu

I haven't felt the need to walk this path yet. I'll update this post with details if I get around to executing it; for now, it remains a link stash.

Workaround

Coincidentally, a relative of mine needed more VRAM for ML experiments (they're all the rage, it seems). Out of our discussions came the following workaround, which is nice if you have a second machine and can run the GPU machine headless, without a display.

It's possible to reconnect the monitor to the motherboard's DisplayPort/HDMI output so it runs off the integrated graphics, and to disable the display manager. The NVIDIA driver keeps working, but display rendering no longer touches the discrete GPU, so its VRAM stays available.

systemctl disable gdm --now 
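Note that --now stops gdm immediately and takes the graphical session down with it, so run this from a TTY or over SSH. Re-running nvidia-smi afterwards should show the Xorg and gnome-shell entries gone from the Processes table, with their VRAM returned:

$ nvidia-smi   # the Processes table should now be empty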

To revert, it’s always possible to use:

systemctl enable gdm --now 

Since I still have my ThinkPad X1C, I should be able to use this method if it comes to that. The above assumes GNOME (and therefore gdm) is the display manager; that's the case for me and for the person I'm corresponding with. Other display managers would need their own unit name in the systemctl commands (sddm, lightdm, and so on).
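If it comes to that, the GPU machine is just an SSH target from the laptop; a one-off check would look something like this (user and gpu-box are placeholders for the actual login and hostname):

$ ssh user@gpu-box nvidia-smi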

(Comments disabled. Email me instead.)