
X Logical Font Description and HiDPI

The problem: the X server serves fonts at a fixed resolution of 100dpi rather than at the current window system resolution (as reported by xdpyinfo | grep -F resolution).
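For reference, the two numbers can be put side by side like this (a sketch assuming the xlsfonts utility is installed; the Courier New XLFD is the one discussed further down and is only an example):

    # Resolution the window system itself reports:
    xdpyinfo | grep -F resolution

    # Resolution baked into the instance the core font backend serves for a
    # scalable request (the RESOLUTION_X/RESOLUTION_Y font properties,
    # assuming the backend sets them):
    xlsfonts -ll -fn '-monotype-courier new-medium-r-normal--0-120-0-0-m-0-iso10646-1' \
        | grep -E 'RESOLUTION_[XY]'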

A bit of theory. There are legacy server-side fonts, which are sent to X clients over the network (via TCP or a UNIX socket) either by the X server itself or by a separate X Font Server (one or several). Unlike the usual client-side fonts (Xft, GTK 2+, Qt 2+), the "server" backend (also called the core X font backend) does not support anti-aliasing, but it does support network transparency (that is, bitmaps without any alpha channel are sent over the network). At the application level, server-side fonts are specified not by an Xft/Fontconfig pattern (most often something like the familiar DejaVu Sans Mono:size=12:antialias=true), but by an XLFD string. On a local machine, the same font file can be registered in both font backends at once and thus be available both to modern GTK- and Qt-based applications and to legacy ones (Xt, Athena, Motif, GTK 1.2, Qt 1.x).
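To make the distinction concrete, here is how the same sort of request looks through each backend, using xterm only because it happens to speak both (the font names are illustrative; any XLFD registered on the machine and any Fontconfig face will do):

    # Core (server-side) backend: the font is named by an XLFD, rasterized
    # by the X server and shipped to the client as bitmaps:
    xterm -fn '-misc-fixed-medium-r-normal--13-120-75-75-c-70-iso10646-1'

    # Client-side backend (Xft/Fontconfig): the font is named by a pattern
    # and rendered, with anti-aliasing, inside the client itself:
    xterm -fa 'DejaVu Sans Mono' -fs 12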

Historically, there were raster server-side fonts (*.pcf), and a raster has a resolution of its own (not necessarily the same as the window system resolution). That is why XLFD has fields such as RESOLUTION_X and RESOLUTION_Y. For a raster font not to look ugly when rendered onto the screen while still having the requested rasterized glyph size (PIXEL_SIZE), the raster resolution must be close to the screen resolution, which is why raster fonts were usually shipped with native resolutions of 75dpi and 100dpi (and why we still have directories such as /usr/share/fonts/X11/75dpi and /usr/share/fonts/X11/100dpi). So, the two lines below represent the same 12pt font

-bitstream-charter-bold-r-normal--12-120-75-75-p-75-iso8859-1
-bitstream-charter-bold-r-normal--17-120-100-100-p-107-iso8859-1

with a rasterized glyph size of

  • 12px at 75dpi, and
  • 17px at 100dpi, respectively (the arithmetic is checked just below).
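The relationship between the three size fields is simply PIXEL_SIZE = POINT_SIZE/10 * RESOLUTION_Y / 72 (approximately; a point is taken as 1/72 of an inch here, while some X documentation uses 72.27), which is easy to check from a shell:

    # 12pt (POINT_SIZE = 120 decipoints) at the resolutions of interest:
    echo 'scale=2; (120 / 10) * 75  / 72' | bc   # 12.50 -> shipped as the 12px bitmap
    echo 'scale=2; (120 / 10) * 100 / 72' | bc   # 16.66 -> shipped as the 17px bitmap
    echo 'scale=2; (120 / 10) * 162 / 72' | bc   # 27.00 -> what 12pt works out to at the 162dpi discussed below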

But, in addition to raster fonts, there are vector, or outline, fonts (TrueType, OpenType, Adobe Type 1), which can be scaled by any factor and still look good when rendered onto the screen. Some X server implementations (notably XSun) also supported the Adobe Type 3 format, where glyphs are described in the Turing-complete PostScript language.

Of course, the concept of raster resolution does not apply to vector fonts, so I can request zeroes (0) or even asterisks (*) in the RESOLUTION_X and RESOLUTION_Y fields, and, in theory, my X server should give me exactly the font requested. This is directly stated in the Arch Linux Wiki article at the link above:

Scalable fonts were designed to be resized. A scalable font name, as shown in the example below, has zeroes in the pixel and point size fields, the two resolution fields, and the average width field.

...

To specify a scalable font at a particular size you only need to provide a value for the POINT_SIZE field, the other size related values can remain at zero. The POINT_SIZE value is in tenths of a point, so the entered value must be the desired point size multiplied by ten.

So, either of the following two queries should return a 12pt Courier New font at the window system resolution:

-monotype-courier new-medium-r-normal--*-120-*-*-m-*-iso10646-1
-monotype-courier new-medium-r-normal--0-120-0-0-m-0-iso10646-1
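Either name can be handed straight to a core-font client to see which instance the server actually picks; xfd, which draws every glyph of the matched font, is handy for eyeballing the resulting glyph size (assuming, of course, that the Monotype Courier New face is registered with the core backend on the machine):

    # Show the glyphs of whatever instance the server matches for the
    # zero-filled scalable request:
    xfd -fn '-monotype-courier new-medium-r-normal--0-120-0-0-m-0-iso10646-1'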

Or so I thought. The thing is, after migrating from 96-115dpi monitors to a 162dpi 4K monitor, I noticed that my carefully selected vector fonts had suddenly become too small.

And it turned out that unless you explicitly set the RESOLUTION_X and RESOLUTION_Y fields to 162 (and no one in his right mind would do so -- it would require rewriting dozens of Xresources lines every time one changed monitors), the X server defaults to rendering the font at 100dpi instead of 162. The difference between 17 and 27 pixels (a factor of 1.62 = 162 / 100) is quite noticeable. Here's an example from a modern Debian 10 box:

Debian 10, Courier New 12pt at 162dpi
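For what it's worth, the size difference in the screenshot is easy to reproduce side by side with plain xterm: the first request leaves the resolution fields at zero, the second hard-codes the panel's 162dpi (a sketch, again assuming Courier New is registered with the core backend):

    # Zero resolution fields: the server falls back to ~100dpi, i.e. roughly
    # 17px glyphs for a 12pt request:
    xterm -fn '-monotype-courier new-medium-r-normal--0-120-0-0-m-0-iso10646-1'

    # Explicit 162dpi fields: the same 12pt request now comes out at ~27px:
    xterm -fn '-monotype-courier new-medium-r-normal--0-120-162-162-m-0-iso10646-1'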

I thought this regression was a consequence of people gradually cutting obsolete subsystems out of X11, but in Debian Woody, released in 2002 and shipping a 2.2 kernel, I saw exactly the same thing:

Debian 3, Courier New 12pt at 162dpi

The only difference is that Debian Woody renders fonts in a "cleaner" manner, apparently applying hinting on the server side before sending the bitmaps over the network.

So this is not a regression. The problem has always been there and equally affects all vector font types (TrueType, OpenType, Type 1).

Now, the question. Is there a way, without hard-coding the window system resolution into user settings for each individual resource, to get by with less pain than the approach recommended by the author of the Sharing Xresources between systems article?

Is it possible to solve the problem by changing the global configuration of the X server itself or the libraries it relies on (libfreetype, libxfont)?
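For context, the kind of per-host workaround I would like to avoid looks roughly like this (a sketch relying on xrdb's cpp preprocessing; the DPI macro name, the resource name and the file paths are placeholders):

    # Session startup (sketch): extract the horizontal resolution once and
    # hand it to xrdb as a cpp macro named DPI:
    dpi=$(xdpyinfo | awk '/resolution:/ {print $2}' | cut -dx -f1)
    xrdb -DDPI="$dpi" -merge ~/.Xresources

    # ...and then every single XLFD in ~/.Xresources still has to spell out
    # the macro, e.g.:
    #   XTerm*font: -monotype-courier new-medium-r-normal--0-120-DPI-DPI-m-0-iso10646-1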

