Nvidia configuration

From wikinotes
 
 
Latest revision as of 00:16, 28 March 2022

Tutorials

arch wiki https://wiki.archlinux.org/title/NVIDIA
ubuntu wiki: Xinerama how to https://help.ubuntu.com/community/XineramaHowTo
reddit triple monitor gaming on linux https://www.reddit.com/r/linux_gaming/comments/6n648l/triple_monitor_linux_gaming_gtx_1060_6gb/
nvidia forums: triple monitor https://forums.developer.nvidia.com/t/can-we-use-xrandr-to-configure-two-monitors-as-one-big-display-using-nvidia-drivers/35389

Windows

High Dynamic Range

Increases the rendered value range (darker low-value, brighter high-value).

Nvidia Control Panel:
  Display:
    Change Resolution:
      (dropdown) Output dynamic range: Full

Surround

Surround lets you have one large screen spanning multiple monitors.

HeliosDisplayManagement can keep track of desktop/nvidia profiles and easily toggle them. https://github.com/falahati/HeliosDisplayManagement

Linux

HDMI underscan

NOTE:

If your TV/monitor has a setting to disable overscan, use that instead.

Interactively adjust

- nvidia-settings:
  - Xserver display configuration:
    - underscan: 40  # adjust, click apply to preview

It is persisted in your /etc/X11/xorg.conf

Section "Screen"
    Identifier     "Screen0"
    Device         "Device0"
    Monitor        "Monitor0"
    Option         "metamodes" "1920x1080_60 +0+0 {viewportout=1840x1035+40+22}"  # <-- viewportout
    # ...
EndSection
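The viewportout geometry is just the mode size minus the margins, with the vertical margin scaled to the aspect ratio. A sketch of the arithmetic (underscan_viewport is a made-up helper; nvidia-settings' own rounding differs slightly -- it persisted 1035 rather than 1036 above):

```shell
# compute a centered {viewportout=WxH+X+Y} geometry from a horizontal margin,
# keeping the vertical margin proportional to the aspect ratio
underscan_viewport() {
    local w="$1" h="$2" hmargin="$3"
    local vmargin=$(( hmargin * h / w ))           # scale margin by aspect ratio
    echo "$(( w - 2 * hmargin ))x$(( h - 2 * vmargin ))+${hmargin}+${vmargin}"
}

underscan_viewport 1920 1080 40
```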

High Dynamic Range

Increases the rendered value range (darker low-value, brighter high-value).
This is very important for ENB mods.

nvidia-settings:
  GPU - (GeForce …):
    DFP-1 - (Samsung …):
      Controls Tab (top):
        Color Controls:
          Color Range: Full
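The same setting is also exposed on the CLI as the ColorRange display attribute (0 = Full, 1 = Limited). A hedged sketch -- the color_range_assignment helper is made up for illustration, and HDMI-0 is a placeholder for your own display name (list yours with nvidia-settings -q dpys):

```shell
# build the per-display attribute assignment string (0 = Full, 1 = Limited)
color_range_assignment() {
    printf '[DPY:%s]/ColorRange=%s' "$1" "$2"
}

color_range_assignment HDMI-0 0   # prints the attribute string

# only attempt the assignment when nvidia-settings is actually available
if command -v nvidia-settings >/dev/null; then
    nvidia-settings --assign "$(color_range_assignment HDMI-0 0)" || true
fi
```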

Multi Display

TwinView

TwinView is the Linux version of Surround (one large screen, spanning multiple monitors).
Old documentation says max 2 monitors, but more are possible.

TwinView is usually enabled by default. You can verify it:

nvidia-settings --query TwinView

# in /etc/X11/xorg.conf -- this may not be necessary:
Section "Screen"
  Option "TwinView" "1"
  # ...
EndSection

Using the nvidiaXineramaInfo option, you can tell nvidia to hide xinerama info,
which makes your Xorg setup appear as one large triple-wide monitor.
All games I have tried have required this, but apparently some do not.

Section "Screen"
  Option "nvidiaXineramaInfo" "Off"
  # ...
EndSection

You may consider having a second xorg.conf. After making the change, you can restart X with:

systemctl restart lxdm.service  # restart your DisplayManager of choice
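A sketch of the second-xorg.conf idea -- activate_profile and the profile names are hypothetical, and the demo runs against a temp directory instead of /etc/X11 so nothing on the system is touched:

```shell
# keep two xorg.conf variants side by side and activate one by copying it into place
# (on a real system confdir would be /etc/X11)
confdir="$(mktemp -d)"
printf 'Section "Screen"\n  Option "nvidiaXineramaInfo" "Off"\nEndSection\n' > "$confdir/xorg.conf.triple"
printf 'Section "Screen"\nEndSection\n' > "$confdir/xorg.conf.single"

activate_profile() {
    cp "$confdir/xorg.conf.$1" "$confdir/xorg.conf"
    echo "activated $1"
}

activate_profile triple
grep -c nvidiaXineramaInfo "$confdir/xorg.conf"   # the triple-monitor profile is now active

# on a real system, follow up by restarting your display manager:
# systemctl restart lxdm.service
```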

Xinerama

Xinerama is an older technology that presents multiple screens as one display.
It is slower, and it is not compatible alongside TwinView.

Screen Tearing

You can prevent screen tearing by enabling ForceFullCompositionPipeline on each monitor
(or alternatively, only on the monitors where you want to prevent tearing).

Runtime Fix

A temporary fix you can apply at runtime:

nvidia-settings --assign CurrentMetaMode="$(nvidia-settings --query CurrentMetaMode | sed 's/}/, ForceFullCompositionPipeline=On}/g' | awk -F'::' '{ print $2 }')"

Or, in long form:

# single monitor
nvidia-settings --assign \
    CurrentMetaMode="nvidia-auto-select +0+0 { ForceFullCompositionPipeline = On }"

# for multi monitor, query CurrentMetaMode,
# then add {ForceFullCompositionPipeline=On} to each monitor you want to reduce screen tearing on
nvidia-settings --query CurrentMetaMode

nvidia-settings --assign \
    CurrentMetaMode="DPY-3: nvidia-auto-select @1920x1080 +0+0 {ViewPortIn=1920x1080, ViewPortOut=1920x1080+0+0, ForceFullCompositionPipeline=On}, \
                     DPY-5: nvidia-auto-select @2560x1440 +1920+0 {ViewPortIn=2560x1440, ViewPortOut=2560x1440+0+0, ForceFullCompositionPipeline=On}, \
                     DPY-6: nvidia-auto-select @1920x1080 +4480+0 {ViewPortIn=1920x1080, ViewPortOut=1920x1080+0+0, ForceFullCompositionPipeline=On}"
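You can sanity-check the sed/awk transformation from the runtime one-liner against a sample query result (the sample string below only mimics the shape of nvidia-settings --query CurrentMetaMode output):

```shell
# simulate the query output: a header, then '::', then the actual metamode list
sample="Attribute 'CurrentMetaMode' (wintermute:0.0): id=50, switchable=yes, source=xconfig :: DPY-3: nvidia-auto-select @1920x1080 +0+0 {ViewPortIn=1920x1080, ViewPortOut=1920x1080+0+0}"

# sed injects ForceFullCompositionPipeline into each {...} options map,
# awk strips everything up to and including the '::' separator
newmode="$(printf '%s' "$sample" \
    | sed 's/}/, ForceFullCompositionPipeline=On}/g' \
    | awk -F'::' '{ print $2 }')"

echo "$newmode"
```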

Permanent Fix

A permanent fix, configured in xorg.conf:

# /etc/X11/xorg.conf  (alt: /etc/X11/xorg.conf.d/*.conf)

Section "Screen"
    # ...

    # add your {ForceFullCompositionPipeline=On} to the MetaModes options map,
    # disable the indirect GLX protocol, and enable triple buffering
    Option         "MetaModes" "nvidia-auto-select +0+0 {ForceFullCompositionPipeline=On}"
    Option         "AllowIndirectGLXProtocol" "off"
    Option         "TripleBuffer" "on"
EndSection

vconsole/framebuffer tweaks

If you have no framebuffer/vconsole/tty when booting,
you may need to disable some features.

# /boot/loader/entries/archlinux.conf

# ...
options ... rdblacklist=nouveau nomodeset nofb  # maybe get rid of nofb?
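To confirm the options actually reached the kernel after rebooting, you can inspect the running kernel's command line:

```shell
# show the parameters the running kernel actually booted with
cat /proc/cmdline

# check for a specific flag; exits cleanly either way
grep -o 'nomodeset' /proc/cmdline || echo "nomodeset not set"
```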

UCSI CCG Suspend/Resume

If you're getting errors about UCSI_CCG failures on boot,
you can blacklist the kernel module so it never loads.

I'm not certain this is worth it.

# /etc/modprobe.d/nvidia.conf

blacklist ucsi_ccg
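After rebooting, you can confirm the module stayed out (check_ucsi is a made-up helper name; it also tolerates systems without lsmod):

```shell
# report whether the ucsi_ccg kernel module is currently loaded
check_ucsi() {
    if command -v lsmod >/dev/null && lsmod | grep -q '^ucsi_ccg'; then
        echo "ucsi_ccg is loaded"
    else
        echo "ucsi_ccg is not loaded"
    fi
}

check_ucsi
```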

Misc Optimization

Powermizer

I haven't noticed a significant difference, but some have.
It is also configurable from the nvidia-settings UI.

# enable powermizer
nvidia-settings -a [gpu:0]/GPUPowerMizerMode=1 > /dev/null

# disable powermizer
nvidia-settings -a [gpu:0]/GPUPowerMizerMode=0 > /dev/null
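A small sketch wrapping the two commands above -- powermizer_value is a hypothetical helper mapping readable names to the mode values used there, and the nvidia-settings call only runs when the tool is actually present:

```shell
# map a readable name to the GPUPowerMizerMode value
# (0 = adaptive, 1 = prefer maximum performance, per the commands above)
powermizer_value() {
    case "$1" in
        adaptive)        echo 0 ;;
        max-performance) echo 1 ;;
        *) echo "unknown mode: $1" >&2; return 1 ;;
    esac
}

# apply only when nvidia-settings is available (requires a running X session)
if command -v nvidia-settings >/dev/null; then
    nvidia-settings -a "[gpu:0]/GPUPowerMizerMode=$(powermizer_value max-performance)" > /dev/null || true
fi

powermizer_value adaptive
```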