I just got a new monitor to use as an external monitor for my laptop. While I was setting it up I noticed that the monitor's display size wasn't correctly detected. The Xorg server does a good job auto-configuring, however this caught my eye:

(Note: display configuration is something the X.org server now hands off to xrandr for dynamic use of monitors. man xrandr even reports that it tries to keep a constant DPI. I'm not sure just why it does that, but I found a good way to get this done.)
xdpyinfo | grep -B2 resolution
  dimensions:    1920x1080 pixels (508x286 millimeters)
  resolution:    96x96 dots per inch
The monitor I got is a 21.5″ monitor, so I figured the DPI was off. I decided to calculate the physical size myself (this is a square-pixel monitor):
res_horz=1920
res_vert=1080
res_diag=$(echo "scale=5;sqrt($res_horz^2+$res_vert^2)" | bc)
siz_diag=21.5
siz_horz=$(echo "scale=5;($siz_diag/$res_diag)*$res_horz*25.4" | bc)
siz_vert=$(echo "scale=5;($siz_diag/$res_diag)*$res_vert*25.4" | bc)
echo "$siz_horz"x"$siz_vert"
475.48800x267.46200
Online DPI calculators (1, 2) confirmed my doubt, and so did xrandr:
em_ds_h=$(xrandr | grep VGA-0 | rev | cut -d " " -f 3 | rev | sed 's/mm//')
em_ds_v=$(xrandr | grep VGA-0 | rev | cut -d " " -f 1 | rev | sed 's/mm//')
em_ds="$em_ds_h"x"$em_ds_v"
echo $em_ds
477x268
My calculated value and theirs are a couple of millimeters off overall, so I just used theirs. I created a configuration to define the display size to the Xorg server. A basic configuration to define the display size can be done like this:
cat /usr/share/X11/xorg.conf.d/90-monitor-disp-size.conf
Section "Monitor"
    Identifier  "<default monitor>"
    DisplaySize 477 268
EndSection
(Arch Linux and most distros use /etc/X11/xorg.conf.d/.) However, this won't work on the external monitor, so I expanded on it (more than it probably needed to be) by defining both monitors and the related sections:
Section "Monitor"
    Identifier  "Internal - Pavilion Laptop"
    DisplaySize 304.5 228.6
EndSection

Section "Monitor"
    Identifier  "External - Samsung Syncmaster SA350"
    VendorName  "Samsung"
    ModelName   "SA300/SA350"
    DisplaySize 476 267.7
EndSection

Section "Device"
    Identifier "ATI Radeon Mobility IGP 330M"
    Option     "Monitor-VGA-0" "External - Samsung Syncmaster SA350"
    Option     "Monitor-LVDS"  "Internal - Pavilion Laptop"
EndSection

Section "Screen"
    Identifier "Default Screen"
    Monitor    "Internal - Pavilion Laptop"
EndSection

Section "ServerLayout"
    Identifier "Default Layout"
    Screen     "Default Screen"
EndSection
I added VendorName and ModelName, but I'm not sure they uniquely identify the monitor to the Xorg server. VendorName, I believe, is just for reference; ModelName can usually be discovered by doing:
grep "Monitor name" /var/log/Xorg.0.log
Monitor-VGA-0 and Monitor-LVDS name the ports and therefore should uniquely identify the monitors (xrandr -q shows them, and both are found in the Xorg log).
After a bit of research I discovered there is a good amount of history of the Xorg server having trouble correctly detecting the display size. I believe this may be related to certain drivers: I've been told the open-source ATI driver has had problems, and I've read of other people elsewhere with similar issues. Defining the display size in the configuration and telling the Xorg server not to use the auto-detected value can be done by adding this to the Device section (for Nvidia drivers use Option "UseEDID" "FALSE" instead):
Option "NoDDC"
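In context, that option would land in the Device section defined earlier. A sketch (whether and how NoDDC is honored depends on the driver):

```
Section "Device"
    Identifier "ATI Radeon Mobility IGP 330M"
    Option     "NoDDC"
    Option     "Monitor-VGA-0" "External - Samsung Syncmaster SA350"
    Option     "Monitor-LVDS"  "Internal - Pavilion Laptop"
EndSection
```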
Unfortunately, this didn't work either and left me completely at a loss. Unsure how to go further defining the display size in the Xorg server configuration, I decided to define it through xrandr.
xrandr can define the display size with the --fbmm option:

xrandr --output VGA-0 --auto --fbmm 476x267.7
--auto uses the default/preferred mode of the monitor.
I wonder why in 2012 we Linux users still have to use alien language to set the DPI on our HD monitors. Great tutorial anyway.
res_diag=$(echo "scale=2;sqrt($res_horz^2+$res_vert)" | bc)
wants to be
res_diag=$(echo "scale=2;sqrt($res_horz^2+$res_vert^2)" | bc)
I think.
And that would explain why my value is off. Updated post, good catch PaX.
about nvidia, this one saved my life: http://analogbit.com/fix_nvidia_edid
So how do you run both monitors at the same time with different DPI?
xrandr takes an --output option. That is how you set a different DPI per monitor.