Discussion:
Pixels Per Inch needs to be standardized
Alberto Salvia Novella
2016-05-04 15:44:51 UTC
As a result of the conversation at
(https://developer.blender.org/T48292),

I would like to propose having a standard way of advertising Pixels Per
Inch, so applications can know its value independently of the desktop
environment in use.

Where shall I post this suggestion?

Thanks in advance.
Mattias Andrée
2016-05-04 15:49:18 UTC
What's wrong with dots per inch? And if you want to
make a new standard, don't base it on inches, base it
on centimetres.

On Wed, 4 May 2016 17:44:51 +0200
Post by Alberto Salvia Novella
As a result of the conversation at
(https://developer.blender.org/T48292),
I would like to propose having a standard way of
advertising Pixels Per Inch, so applications can know its
value independently of the desktop environment in use.
Where shall I post this suggestion?
Thanks in advance.
Alberto Salvia Novella
2016-05-04 17:01:09 UTC
Post by Mattias Andrée
What's wrong with dots per inch?
How can an application reliably know the current pixel density
of the desktop?
Mattias Andrée
2016-05-04 17:45:30 UTC
On Wed, 4 May 2016 19:01:09 +0200
Post by Alberto Salvia Novella
Post by Mattias Andrée
What's wrong with dots per inch?
How can an application reliably know the current
pixel density of the desktop?
Well, you cannot know anything reliably. The EDID
does contain all the information you need for DPI, though
with limited precision. X.org reports a bogus DPI. But
if we pretend that all monitors' dimensions are in whole
centimetres, then the number of pixels per centimetre
can be calculated:

ppc_x = output_width_px(monitor) / output_width_cm(monitor);
ppc_y = output_height_px(monitor) / output_height_cm(monitor);

Notice that this is easier to calculate than the pixels per inch.

ppi_x = output_width_px(monitor) / output_width_cm(monitor) * 2.540;
ppi_y = output_height_px(monitor) / output_height_cm(monitor) * 2.540;

But why is "pixels" preferred over "dots"?
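
A minimal worked example of the calculation above, in C, using made-up
numbers: a hypothetical 1920x1080 panel whose EDID reports 52 cm x 29 cm.
Nothing here comes from a real monitor.

#include <stdio.h>

int main(void)
{
    /* Hypothetical monitor: 1920x1080 pixels, EDID size 52 cm x 29 cm. */
    double ppc_x = 1920.0 / 52.0;      /* ~36.9 pixels per centimetre */
    double ppc_y = 1080.0 / 29.0;      /* ~37.2 pixels per centimetre */
    double ppi_x = ppc_x * 2.54;       /* ~93.8 pixels per inch */
    double ppi_y = ppc_y * 2.54;       /* ~94.6 pixels per inch */

    printf("ppc: %.1f x %.1f, ppi: %.1f x %.1f\n", ppc_x, ppc_y, ppi_x, ppi_y);
    return 0;
}

With whole-centimetre dimensions the result is only approximate, which is
the limited precision mentioned above.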
Jasper St. Pierre
2016-05-04 18:12:38 UTC
What are the dimensions of a projector, whose pixels-per-inch or
dots-per-inch value depends on how far away the projector is from
the wall, or, in a keystoned case, isn't even constant across the
display?

For limited scenarios, you can make it work (with caution, see [0]).
But we cannot calculate a sensible DPI value in the general case.

[0] https://lists.fedoraproject.org/pipermail/devel/2011-October/157671.html
Post by Mattias Andrée
On Wed, 4 May 2016 19:01:09 +0200
Post by Alberto Salvia Novella
Post by Mattias Andrée
What's wrong with dots per inch?
How can an application reliably know the current
pixel density of the desktop?
Well, you cannot know anything reliably. The EDID
does contain all the information you need for DPI, though
with limited precision. X.org reports a bogus DPI. But
if we pretend that all monitors' dimensions are in whole
centimetres, then the number of pixels per centimetre
can be calculated:
ppc_x = output_width_px(monitor) / output_width_cm(monitor);
ppc_y = output_height_px(monitor) / output_height_cm(monitor);
Notice that this is easier to calculate than the pixels per inch.
ppi_x = output_width_px(monitor) / output_width_cm(monitor) * 2.540;
ppi_y = output_height_px(monitor) / output_height_cm(monitor) * 2.540;
But why is "pixels" preferred over "dots"?
--
Jasper
Mattias Andrée
2016-05-04 18:17:46 UTC
For projectors, I think it would be best if the dimensions
could be configured.

On Wed, 4 May 2016 11:12:38 -0700
Post by Jasper St. Pierre
What are the dimensions of a projector, whose
pixels-per-inch or dots-per-inch value depends on
how far away the projector is from the wall, or, in a
keystoned case, isn't even constant across the display?
For limited scenarios, you can make it work (with
caution, see [0]). But we cannot calculate a sensible DPI
value in the general case.
[0]
https://lists.fedoraproject.org/pipermail/devel/2011-October/157671.html
On Wed, May 4, 2016 at 10:45 AM, Mattias Andrée
Post by Mattias Andrée
On Wed, 4 May 2016 19:01:09 +0200
Post by Alberto Salvia Novella
Post by Mattias Andrée
What's wrong with dots per inch?
How can an application reliably know the
current pixel density of the desktop?
Well, you cannot know anything reliably. The EDID
does contain all the information you need for DPI, though
with limited precision. X.org reports a bogus DPI. But
if we pretend that all monitors' dimensions are in whole
centimetres, then the number of pixels per centimetre
can be calculated:
ppc_x = output_width_px(monitor) / output_width_cm(monitor);
ppc_y = output_height_px(monitor) / output_height_cm(monitor);
Notice that this is easier to calculate than the pixels
per inch.
ppi_x = output_width_px(monitor) / output_width_cm(monitor) * 2.540;
ppi_y = output_height_px(monitor) / output_height_cm(monitor) * 2.540;
But why is "pixels" preferred over "dots"?
n***@laposte.net
2016-05-04 20:04:53 UTC
----- Original Message -----
From: "Jasper St. Pierre"
Post by Jasper St. Pierre
What are the dimensions of a projector, whose pixels-per-inch or
dots-per-inch value depends on how far away the projector is from
the wall,
Please, not the projector strawman again.

Any perpendicular screen surface is trivial to handle, once you add the viewing distance to the mix. And those are at least 95% of the cases, and the only ones needing calibration in the first place.

Thus, for each screen:

1. detect the kind of screen (projector, TV, set-top monitor, laptop screen, smartphone screen) → udev job via EDID and other hardware hints

2. auto-set a typical viewing distance (use THX or SMPTE recommendations for projectors; they're about the same anyway. It will just work for users who followed the recommendations; those with a non-standard setup can do 3)

3. let the user and the app environment provide a more accurate value if they want (via sensors, user input, user lies, whatever). One distance value! One field or slider! No harder than the zoom slider DEs expose now, and a lot more universal (so user prefs need not be hardware-specific)

4. expose the resulting computed horizontal and vertical pixel angle for an eye at the viewing distance from the center of the screen (a worked example follows this list)

5. let apps do smart things with the result
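
A minimal sketch of the computation behind step 4, assuming a flat screen
viewed head-on; the function name and the example sizes and distances are
made up for illustration:

#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

/* Angular size, in degrees, of one pixel for an eye looking at the centre
 * of a flat, perpendicular screen. */
static double pixel_angle_deg(double screen_width_cm, int width_px,
                              double viewing_distance_cm)
{
    double pixel_width_cm = screen_width_cm / width_px;
    /* Small flat object seen face-on: angle = 2 * atan(size / (2 * distance)). */
    return 2.0 * atan(pixel_width_cm / (2.0 * viewing_distance_cm)) * 180.0 / M_PI;
}

int main(void)
{
    /* A 34.4 cm wide, 1920 px laptop panel viewed from 50 cm ... */
    printf("laptop:    %.4f deg/px\n", pixel_angle_deg(34.4, 1920, 50.0));
    /* ... versus a 200 cm wide, 1920 px projection viewed from 300 cm. */
    printf("projector: %.4f deg/px\n", pixel_angle_deg(200.0, 1920, 300.0));
    return 0;
}

With these made-up numbers the laptop and the projector end up with nearly
the same angle per pixel, even though their physical pixel sizes differ by
a factor of almost six, which is the point of exposing the angle rather
than a raw DPI value.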

And you're done. Handles 100% of the sane cases of known screens, lets app people invent smart autozoom strategies, is far simpler than the colour calibration that people actually do despite the hardships, and is more accurate than hardcoding magic values with no relationship to the actual hardware.

Hell, even manufacturers of dumb analog amps have known for a long time that it is not too hard to ask users for speaker distances, because consumers would rather enter a distance once than have unbalanced sound (because speed-of-sound << speed-of-light, even small differences matter).

Why are we still arguing that computer UI is harder to do than analog amp UI?

Regards,
--
Nicolas Mailhot
Pekka Paalanen
2016-05-04 18:56:46 UTC
On Wed, 4 May 2016 17:44:51 +0200
Post by Alberto Salvia Novella
As a result of the conversation had at
(https://developer.blender.org/T48292)
Hi,

too bad the discussion does not explain why you need the ppi, or
what it would be used for. There are many uses for ppi that
people think are proper, but fall apart very quickly in practice.
Post by Alberto Salvia Novella
I would like to propose having a standard way of advertising Pixels Per
Inch, so applications can know its value independently of the desktop
environment in use.
Seems like this would call for lots of standard things:
- creating a representation of an output
- signalling which output a window is showing on, and updating it
dynamically as the window moves and outputs are plugged in and out
- delivering output ppi every time it changes permanently (video
mode change)

Assuming you had all that, what would you then do when your window
is shown on more than one output with differing ppis?

All this assuming the issues Jasper mentioned were already solved.

Or, how would a DE know which ppi it needs to advertise at a time?
Should there be only one ppi per window at a time?


Thanks,
pq
Alberto Salvia Novella
2016-05-05 01:24:29 UTC
Post by Pekka Paalanen
too bad the discussion does not explain why you need the ppi, or
what it would be used for.
What I am asking for is a standard way to advertise the desktop scale
factor. It does not necessarily need to be about pixel density; it
could just be a multiplier like x1.5 or x2.
Post by Pekka Paalanen
Or, how would a DE know which ppi it needs to advertise at a time?
One scale factor per screen.
Pekka Paalanen
2016-05-05 10:28:32 UTC
On Thu, 5 May 2016 03:24:29 +0200
Post by Alberto Salvia Novella
Post by Pekka Paalanen
too bad the discussion does not explain why you need the ppi, or
what it would be used for.
What I am asking for is a standard way to advertise the desktop scale
factor. It does not necessarily need to be about pixel density; it
could just be a multiplier like x1.5 or x2.
Post by Pekka Paalanen
Or, how would a DE know which ppi it needs to advertise at a time?
One scale factor per screen.
Oh scale factor! Yes, that is a completely different thing. Please
do talk about a scale factor instead of dpi or ppi. People will
respond much better, while dpi tends to raise hard prejudice (with
me too) due to its history of abuse and misconceptions.

So this is about HiDPI? That would be a good term to use too, as
HiDPI support also uses a scale factor, not dpi.

FWIW, Wayland offers it like this: each wl_output (usually
represents a single monitor) has an integer scale factor
associated. The client/app/toolkit gets told on which outputs a
window is shown on, and then the app can choose what size and
factor to draw in. The compositor automatically accounts for the
mismatch between the draw factor and output factor. (wl_outputs
also advertise resolution and physical size, if applicable, so it
is also possible to compute ppi.) This is built into the core of the
Wayland display protocol.


Thanks,
pq
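
A rough, untested sketch of reading the per-output scale factor and
physical size Pekka describes, using libwayland-client; it assumes the
compositor advertises wl_output version 2 or later and simply records
the last output seen:

#include <stdio.h>
#include <string.h>
#include <wayland-client.h>

/* Last-seen values; a real client would keep one record per wl_output. */
static int32_t phys_w_mm, phys_h_mm, px_w, px_h, out_scale = 1;

static void geometry(void *d, struct wl_output *o, int32_t x, int32_t y,
                     int32_t pw_mm, int32_t ph_mm, int32_t subpixel,
                     const char *make, const char *model, int32_t transform)
{
    phys_w_mm = pw_mm;
    phys_h_mm = ph_mm;
}

static void mode(void *d, struct wl_output *o, uint32_t flags,
                 int32_t w, int32_t h, int32_t refresh)
{
    if (flags & WL_OUTPUT_MODE_CURRENT) {
        px_w = w;
        px_h = h;
    }
}

static void done(void *d, struct wl_output *o) { }

static void scale(void *d, struct wl_output *o, int32_t factor)
{
    out_scale = factor;
}

static const struct wl_output_listener output_listener = {
    geometry, mode, done, scale
};

static void global(void *d, struct wl_registry *reg, uint32_t name,
                   const char *interface, uint32_t version)
{
    if (strcmp(interface, "wl_output") == 0) {
        struct wl_output *out =
            wl_registry_bind(reg, name, &wl_output_interface,
                             version < 2 ? version : 2);
        wl_output_add_listener(out, &output_listener, NULL);
    }
}

static void global_remove(void *d, struct wl_registry *reg, uint32_t name) { }

static const struct wl_registry_listener registry_listener = {
    global, global_remove
};

int main(void)
{
    struct wl_display *dpy = wl_display_connect(NULL);
    if (!dpy)
        return 1;
    struct wl_registry *reg = wl_display_get_registry(dpy);
    wl_registry_add_listener(reg, &registry_listener, NULL);
    wl_display_roundtrip(dpy);   /* discover globals, bind wl_output */
    wl_display_roundtrip(dpy);   /* receive the output's geometry/mode/scale */

    printf("scale factor: %d\n", out_scale);
    if (phys_w_mm > 0 && phys_h_mm > 0)
        printf("ppi: %.1f x %.1f\n",
               px_w * 25.4 / phys_w_mm, px_h * 25.4 / phys_h_mm);

    wl_display_disconnect(dpy);
    return 0;
}

Compile with -lwayland-client. A fuller client would track one record per
output and use the done event to know when a batch of properties is
complete.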
Alberto Salvia Novella
2016-05-05 13:33:48 UTC
each wl_output (usually represents a single monitor) has an integer
scale factor associated.
Thank you, this seems to be what I was talking about.

Do you know if there is another way of figuring out the scale factor
under the X server and Mir?
Pekka Paalanen
2016-05-06 08:16:57 UTC
On Thu, 5 May 2016 15:33:48 +0200
Post by Alberto Salvia Novella
each wl_output (usually represents a single monitor) has an integer
scale factor associated.
Thank you, this seems what I was talking about.
Do you know if there is another way of figuring out the scale factor
under the X server and Mir?
I do not, unfortunately.


Thanks,
pq
Alberto Salvia Novella
2016-05-06 14:44:05 UTC
Post by Pekka Paalanen
I do not, unfortunately.
Then I will ask on the xorg mailing list. Thank you.
Thomas U. Grüttmüller
2017-03-26 00:41:06 UTC
Post by Alberto Salvia Novella
I would like to propose having a standard way of advertising Pixels Per
Inch, so applications can know its value independently of the desktop
environment in use.
The X server already advertises the DPI of the monitor.
I found this in /var/log/Xorg.0.log:

[ 6882.546] (==) intel(0): DPI set to (96, 96)

Here, the resolution is set to 96 DPI although in reality, my screen has
120 DPI. And you know what: I want it to stay this way. Please don’t set
it to the true value. Or at least provide some means to change it back
to 96 DPI manually. The point in having a higher definition screen is to
fit a lot of stuff on it.

Thank you.
Thomas
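
For comparison, a rough, untested sketch that prints both the
core-protocol DPI (the value forced to 96 in the log above) and the
per-output DPI that RandR derives from the EDID size, using libX11 and
libXrandr (compile with -lX11 -lXrandr):

#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/extensions/Xrandr.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy)
        return 1;

    /* What the core protocol reports -- the value forced to 96 in the log. */
    int scr = DefaultScreen(dpy);
    printf("core X DPI: %.1f x %.1f\n",
           DisplayWidth(dpy, scr) * 25.4 / DisplayWidthMM(dpy, scr),
           DisplayHeight(dpy, scr) * 25.4 / DisplayHeightMM(dpy, scr));

    /* What RandR reports per connected output, from the EDID size. */
    XRRScreenResources *res = XRRGetScreenResources(dpy, DefaultRootWindow(dpy));
    for (int i = 0; i < res->noutput; i++) {
        XRROutputInfo *out = XRRGetOutputInfo(dpy, res, res->outputs[i]);
        if (out->connection == RR_Connected && out->crtc &&
            out->mm_width > 0 && out->mm_height > 0) {
            XRRCrtcInfo *crtc = XRRGetCrtcInfo(dpy, res, out->crtc);
            printf("%s: %.1f x %.1f DPI\n", out->name,
                   crtc->width * 25.4 / out->mm_width,
                   crtc->height * 25.4 / out->mm_height);
            XRRFreeCrtcInfo(crtc);
        }
        XRRFreeOutputInfo(out);
    }
    XRRFreeScreenResources(res);
    XCloseDisplay(dpy);
    return 0;
}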
Kai Uwe Broulik
2017-03-26 07:00:46 UTC
The point in having a higher definition screen is to fit a lot of stuff on it.
The point in having a higher definition screen is to have crisper fonts and graphics.

See how opinions differ? X.org lying to us by forcing 96 dpi is a terrible thing and one of the major complaints we in Plasma get (e.g. the login screen being unreadably small).

Cheers,
Kai Uwe 
Thomas U. Grüttmüller
2017-03-26 12:55:07 UTC
Post by Kai Uwe Broulik
The point in having a higher definition screen is to fit a lot of stuff on it.
The point in having a higher definition screen is to have crisper fonts and graphics.
If you scale everything by 120/96, raster graphics will look ugly, and
raster fonts, too.

Greetings,
Thomas
n***@laposte.net
2017-03-27 07:39:52 UTC
Post by Thomas U. Grüttmüller
If you scale everything by 120/96, raster graphics will look ugly, and
raster fonts, too.
There is no room for raster fonts in a HiDPI world. The tech has moved on. Live with it, or keep buying 96 dpi hardware while it's still available, but don't break everyone's hardware support to simulate your preferences.

Sincerely,
--
Nicolas Mailhot