Fwd: Details on using xcb_send_request()?

2021-02-04 Thread Junk Mail
While this question talks about XCB, I suppose it is more related to the 
protocol than to usage of the XCB library, so I am forwarding this e-mail 
here.

 Beginning of forwarded message 

03.02.2021, 23:47, "Junk Mail":

Of course I could simply read what is written here:
https://gitlab.freedesktop.org/xorg/lib/libxcb/-/blob/master/src/xcbext.h#L62

but still, I felt the need for examples showing its usage, and I found this:
https://github.com/StarchLinux/libxcb/blob/master/xproto.c#L1808

    xcb_parts[2].iov_base = (char *) &xcb_out;
    xcb_parts[2].iov_len = sizeof(xcb_out);
    xcb_parts[3].iov_base = 0;
    xcb_parts[3].iov_len = -xcb_parts[2].iov_len & 3; // why?
    xcb_parts[4].iov_base = (char *) value_list;
    xcb_parts[4].iov_len = xcb_popcount(value_mask) * sizeof(uint32_t);
    xcb_parts[5].iov_base = 0;
    xcb_parts[5].iov_len = -xcb_parts[4].iov_len & 3; // why?

It seems like there are "delimiters" (indexes 3 and 5 of xcb_parts, which is 
of type struct iovec), and their lengths are always the negated length of the 
previous index AND'd with 3. I have found nothing obvious that could shed 
light here:
https://gitlab.freedesktop.org/xorg/lib/libxcb/-/blob/master/src/xcb_out.c#L221

So if I read this code correctly, indexes 3 and 5 are sent directly to the X 
server, but I failed to find whether this is something required by the X 
Window System protocol spec. Could anyone shed light on this?

 End of forwarded message 
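(For reference, a standalone illustration of what the puzzling expression 
computes. In the X11 wire protocol, request lengths are counted in 4-byte 
units, and "-len & 3" is a common idiom for the number of padding bytes 
needed to round len up to the next multiple of 4:)

    #include <stdio.h>

    int main(void) {
        /* For unsigned len, -len & 3 == (4 - len % 4) % 4:
         * the number of bytes needed to pad len to a 4-byte boundary. */
        for (unsigned len = 0; len < 8; len++)
            printf("len=%u -> pad=%u\n", len, -len & 3);
        return 0;
    }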


Re: X Logical Font Description and HiDPI

2021-02-04 Thread Walter Harms
Hi Andrey,
Not being an expert on X11 fonts, I would like to condense your question
a bit:

When you use a scalable font:
> -monotype-courier new-medium-r-normal--*-120-*-*-m-*-iso10646-1
> -monotype-courier new-medium-r-normal--0-120-0-0-m-0-iso10646-1

you expect a font scaled according to your DPI setting?

Did you ever change your DPI setting?
(https://linuxreviews.org/HOWTO_set_DPI_in_Xorg)

You may try to set Xft.dpi: 162 in your .Xdefaults and see what happens, as 
sketched below (I know older programs work fine, most of the time).
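(A minimal way to make the setting take effect for the current session; the 
value 162 is the one suggested above, and loading .Xdefaults via xrdb is just 
one option:)

    $ echo "Xft.dpi: 162" >> ~/.Xdefaults
    $ xrdb -merge ~/.Xdefaults

Note that Xft.dpi only affects client-side (Xft) rendering; it does not 
change how the core font backend rasterizes XLFD fonts.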

re,
 wh

From: xorg, on behalf of Andrey ``Bass'' Shcheglov
Sent: Wednesday, 3 February 2021, 12:07:27
To: xorg@lists.x.org
Subject: X Logical Font Description and HiDPI

Hello,

*The problem*: the X server serves fonts at a fixed resolution of 100 dpi, 
rather than the current window system resolution (`xdpyinfo | grep -F 
resolution`; see the example below).
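(For illustration only, the kind of output one would expect on a 162 dpi 
display; the numbers are hypothetical:)

    $ xdpyinfo | grep -F resolution
      resolution:    162x162 dots per inch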

*A bit of theory*. There are legacy server-side fonts which are sent to X 
clients over the network (via TCP or UNIX socket) either by the X server 
itself, or by a separate X Font Server (single or multiple). Unlike the usual 
client-side fonts (Xft, GTK 2+, Qt 2+), the "server" backend (also called the 
core X font backend) does not support anti-aliasing, but supports network 
transparency (that is, bitmaps, without any alpha channel, are sent over the 
network). At the application level, server-side fonts are specified not as an 
`XftFontStruct` (which most often translates into the familiar "DejaVu Sans 
Mono:size=12:antialias=true"), but as an XLFD. If we are talking about a 
local machine, then the same font file can be registered in both subsystems 
at once and be available both to modern GTK- and Qt-based applications and to 
legacy ones (Xt, Athena, Motif, GTK 1.2, Qt 1.x).
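(As a concrete illustration: a minimal sketch of how a legacy client asks the 
server for a core font by XLFD. The pattern below is similar to the examples 
later in this mail, and error handling is kept to a bare minimum; build with 
`cc demo.c -lX11`.)

    #include <stdio.h>
    #include <X11/Xlib.h>

    int main(void) {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy)
            return 1;
        /* Core (server-side) fonts are requested by an XLFD pattern;
         * the server, not the client, rasterizes the glyphs. */
        XFontStruct *fs = XLoadQueryFont(dpy,
            "-bitstream-charter-bold-r-normal--*-120-*-*-p-*-iso8859-1");
        if (fs) {
            printf("ascent=%d, descent=%d\n", fs->ascent, fs->descent);
            XFreeFont(dpy, fs);
        } else {
            puts("font not found");
        }
        XCloseDisplay(dpy);
        return 0;
    }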

Historically, server fonts were bitmap, or raster (*.pcf), and a raster has a 
resolution of its own (not necessarily the same as the window system 
resolution). Therefore, XLFD has fields such as RESOLUTION_X and RESOLUTION_Y. 
For a raster font not to look ugly when rendered onto the screen and still have 
the requested rasterized glyph size, the raster resolution must be close to the 
screen resolution; therefore, raster fonts were usually shipped with native 
resolutions of 75 dpi and 100 dpi (that's why we have directories such as 
/usr/share/fonts/X11/75dpi and /usr/share/fonts/X11/100dpi). So, the below 
lines represent the same 12 pt font

> -bitstream-charter-bold-r-normal--12-120-75-75-p-75-iso8859-1
> -bitstream-charter-bold-r-normal--17-120-100-100-p-107-iso8859-1

with a rasterized glyph size of

 * 12 px at 75 dpi, and
 * 17 px at 100 dpi, respectively.
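These pixel sizes follow from the usual conversion, pixels = points / 72 * dpi:

    12 pt / 72 * 75 dpi  = 12.5  (shipped as the 12 px bitmap)
    12 pt / 72 * 100 dpi = 16.7  (shipped as the 17 px bitmap)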

But, in addition to bitmap fonts, there are vector, or outline fonts (TrueType, 
OpenType, Adobe Type 1), which can be scaled by any factor and still look good 
when rendered onto the screen. Some X-server implementations (notably, XSun) 
also supported the Adobe Type 3 format, where glyphs were described using the 
Turing-complete PostScript language.

Of course, the concept of raster resolution does not apply to vector fonts, so 
I can request zeroes (`0`) or even asterisks (`*`) in the RESOLUTION_X and 
RESOLUTION_Y fields, and, in theory, my X server should give me exactly the 
font requested. This is directly stated in the _Arch Wiki_ article on XLFD:

> Scalable fonts were designed to be resized. A scalable font name, as shown in 
> the example below, has zeroes in the pixel and point size fields, the two 
> resolution fields, and the average width field.
>
> ...
>
> To specify a scalable font at a particular size you only need to provide a 
> value for the POINT_SIZE field, the other size related values can remain at 
> zero. The POINT_SIZE value is in tenths of a point, so the entered value must 
> be the desired point size multiplied by ten.

So, either of the following two queries should return a 12 pt `Courier New` 
font at the window system resolution:

> -monotype-courier new-medium-r-normal--*-120-*-*-m-*-iso10646-1
> -monotype-courier new-medium-r-normal--0-120-0-0-m-0-iso10646-1
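(One way to check what the server actually resolves such a pattern to; both 
tools ship with the standard X utilities:)

    $ xlsfonts -fn '-monotype-courier new-medium-r-normal--0-120-0-0-m-0-iso10646-1'
    $ xfd -fn '-monotype-courier new-medium-r-normal--*-120-*-*-m-*-iso10646-1'

xlsfonts lists the matching font names; xfd displays the glyphs of the font 
the server picks.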

*Or so I thought*. The thing is, having migrated from 96...115 dpi monitors 
to a 162 dpi 4K monitor, I noticed that my carefully chosen vector fonts 
suddenly became too small.

And it turned out that unless you explicitly set the RESOLUTION_X and 
RESOLUTION_Y fields to 162 (and no one in their right mind will do so -- it 
would require rewriting dozens of Xresources lines every time one changes 
monitors), the X server defaults to rendering the font at 100 dpi instead of 
162. The 
difference between 17 and 27 pixels (the factor of 1.62 = 162 / 100) is quite 
noticeable; I observed this on a modern Debian 10 box.
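(For completeness, this is what an explicitly resolution-qualified pattern 
would look like; a hypothetical example with the RESOLUTION_X and 
RESOLUTION_Y fields set to 162:)

    -monotype-courier new-medium-r-normal--*-120-162-162-m-*-iso10646-1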

I thought this regression was a consequence of people gradually