A bit late, but this thread has been slightly getting on my nerves.
On Thu, 6 Oct 2011 16:44:40 +0100
Matthew Garrett wrote:
> On Thu, Oct 06, 2011 at 11:35:08AM -0400, Simo Sorce wrote:
> > I am sure display manager can easily grow a button to say something
> > along the lines of: change font resolution to better fit multiple
> > monitors. so that when someone that has widely varying DPIs between
> > monitors plugs a second monitor in they can press that button and
> > get whatever default you like best for that use case.
> We could do that, but you'd still need toolkit support for triggering
> a re-render of everything.
Try changing the font size in Xfce (or GNOME 2; not sure about GNOME 3
or KDE). Every GTK app re-renders. The same happens when I change the
DPI manually. So where is the missing support?
Now to the actual problem:
a) Xorg+randr does its best to report the correct DPI per display, and
probably should (or does) fall back to some default value when the
reported screen dimensions are obviously bogus. You could also add a
white-list/black-list for devices known to report incorrect data. I
have no reason not to believe ajax that X is already doing its best in
this area. It apparently also allows overriding the data in case the
user knows the reported values are wrong. BTW, for my display it's
reported correctly ;-)
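To illustrate what randr actually reports: `xrandr --query` prints the
physical size in mm next to each connected output, and the DPI follows
directly from that. A minimal sketch in Python, parsing a hard-coded
sample line rather than live xrandr output:

```python
import re

# Sample line in the shape of `xrandr --query` output; a real tool
# would run xrandr and parse every "connected" line.
SAMPLE = ("HDMI-1 connected 1920x1080+0+0 "
          "(normal left inverted right x axis y axis) 509mm x 286mm")

def parse_dpi(line):
    """Compute (horizontal, vertical) DPI from one xrandr output line."""
    m = re.search(r"(\d+)x(\d+)\+\d+\+\d+.*?(\d+)mm x (\d+)mm", line)
    if m is None:
        return None  # disconnected output, or no physical size reported
    px_w, px_h, mm_w, mm_h = map(int, m.groups())
    # 25.4 mm per inch; DPI = pixels / physical size in inches
    return (px_w * 25.4 / mm_w, px_h * 25.4 / mm_h)

dpi_x, dpi_y = parse_dpi(SAMPLE)
print(round(dpi_x, 1), round(dpi_y, 1))  # ~95.8 x 95.9 for this panel
```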
b) DPI is the physical resolution of the pixels. In theory we always
either know it or the user can measure it (you can measure the size of
your monitor or TV, or measure the area a projector shines on, ...).
However, making use of it is pointless on TVs and projectors -- you
watch them from a distance, so you don't expect to read a 12pt font
from several meters away.
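The arithmetic behind both halves of this point is simple enough to
sketch; the 531 mm monitor width and the viewing distances below are
made-up examples:

```python
import math

MM_PER_INCH = 25.4

def measured_dpi(pixels, size_mm):
    """DPI from a pixel count and a physically measured size in mm."""
    return pixels * MM_PER_INCH / size_mm

def point_size_arcmin(points, distance_mm):
    """Apparent size, in arc minutes, of text at a viewing distance.

    1 pt = 1/72 inch, so the physical height is fixed by the point
    size; only the viewing distance changes how big it looks.
    """
    size_mm = points / 72 * MM_PER_INCH
    return math.degrees(size_mm / distance_mm) * 60  # small-angle approx.

# A 1920-pixel-wide monitor measured at ~531 mm: a bit under 92 DPI.
print(round(measured_dpi(1920, 531), 1))
# The same 12 pt text subtends ~29 arcmin at 0.5 m from a monitor, but
# only ~4.9 arcmin at 3 m from a TV -- hence "12 pt" is useless there.
print(round(point_size_arcmin(12, 500), 1))
print(round(point_size_arcmin(12, 3000), 1))
```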
c) DEs, toolkits and apps. They should IMHO do their best to make use
of the data X provides, not try to override it. Allow the user to
enter the correct DPI (you can let them check the physical resolution
of the device, or match a physical ruler against a virtual one, or
something like that). When adding another screen, the DPI of the first
one should be used, since current toolkits don't support rendering,
say, half of a window with one DPI and the other half with a different
one (and to me it actually looks incredibly hard to make that work).
At the same time the user should be asked whether they want some kind
of "magnifier" on the new screen. This setting should be remembered,
so the next time the user plugs in the same device the same settings
are applied.
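A rough sketch of how that remembering could look; every name here is
hypothetical, and a real DE would want to key the stored settings on
something stable like the EDID vendor/product/serial rather than the
output connector name, so the device is recognized wherever it is
plugged in:

```python
class MonitorPrefs:
    """In-memory sketch; a real DE would persist this to disk."""

    def __init__(self):
        self._prefs = {}  # device_id -> settings dict

    def on_plug(self, device_id, primary_dpi):
        """Settings for a newly plugged monitor.

        First time: inherit the primary screen's DPI (toolkits can't
        mix DPIs within one window) and no magnifier; the DE would
        then ask the user and call remember() with their answer.
        """
        if device_id not in self._prefs:
            return {"dpi": primary_dpi, "magnifier": None}
        return self._prefs[device_id]

    def remember(self, device_id, settings):
        self._prefs[device_id] = settings

prefs = MonitorPrefs()
first = prefs.on_plug("DEL:0xa0b1:serial12345", primary_dpi=96)
prefs.remember("DEL:0xa0b1:serial12345", {"dpi": 96, "magnifier": 1.5})
again = prefs.on_plug("DEL:0xa0b1:serial12345", primary_dpi=96)
print(first, again)
```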
d) Points are a real unit of measure -- 1/72 inch. So it makes no
sense for them to come out differently sized on different screens
unless explicitly zoomed in/out. If you want to set a wrong DPI, don't
use points for UI font sizes; it simply does not make sense.
Unfortunately small/big/huge is pointless too: with a high enough DPI
even "huge" can be small, and the other way around (or when viewing
from a short or great distance).
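For reference, the point-to-pixel arithmetic that makes this work:

```python
# Because 1 pt = 1/72 inch, the pixel size of a font follows directly
# from the DPI: px = pt * dpi / 72.  A "12 pt" font only comes out the
# same physical size everywhere if each screen's real DPI is used.
def pt_to_px(points, dpi):
    return points * dpi / 72

print(pt_to_px(12, 96))   # 16.0 px at the common 96 DPI default
print(pt_to_px(12, 192))  # 32.0 px on a 192 DPI (HiDPI) panel
```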