On 3/11/18 9:49 PM, Simon Ser wrote:
+ <interface name="zxdg_toplevel_decoration_manager_v1" version="1">
+ <description summary="window decoration manager">
+ This interface permits choosing between client-side and server-side
+ window decorations for a toplevel surface.
+ A window decoration is a user interface component used to move, resize
+ or change a window's state. It can be managed either by the client (part of
+ the surface) or by the server.
You mention decorations only, but this (or some other text below maybe)
should mention shadows as well. You may consider them as part of the
decoration, but xdg_surface.set_window_geometry() is designed to include
decoration and exclude shadows. Yet, e.g. GTK+ allows resizing using the
shadows.
Nit: I think GTK+ also allows moving from any dead zone in the surface. :-)
xdg-shell's wording includes drop-shadows as part of decorations:
Client-side decorations often have invisible portions like drop-shadows which
should be ignored for the purposes of aligning, placing and constraining.
So "decorations" here stands for both visible parts such as the titlebar and
invisible parts such as drop-shadows. Do you think the protocol needs to
disambiguate these concepts?
You’re right, though I would rather be too precise (while not
exhaustive) than not enough, just in case. But others may have a
different opinion, so let’s wait for more reviews here.
+ <request name="set_mode">
+ <description summary="set the decoration mode">
+ Set the toplevel surface decoration mode.
+ After requesting a decoration mode, the compositor will respond by
+ emitting an xdg_surface.configure event. The client should then update
+ its content, drawing it with or without decorations depending on the
+ received mode. The client must also acknowledge the configure when
+ committing the new content (see xdg_surface.ack_configure).
+ The compositor can ignore this request.
+ <arg name="mode" type="uint" enum="mode" summary="the decoration mode"/>
I’m still not sure about this request.
Why would an SSD-capable client want to switch back and forth between CSD
and SSD? What is the use case here? (Preferably a user use case.)
One reason is consistency with xdg-shell.
Not all states can be client-initiated though.
But there are also real use-cases:
- A compositor might prefer SSDs or CSDs depending on the window container. For
instance, a tiling compositor might prefer SSDs for tiled windows and CSDs for
floating windows. Windows can be moved between a tiling and a floating
container at run-time.
That I’m 100% ok with, your point is clear and makes a lot of sense. I
would even object if someone wanted to remove that. :-)
- Clients might expose a user setting that allows toggling SSDs. For instance,
Chrome/Chromium already has such a feature. Requiring the user to restart the
app or to quickly close and reopen the main window offers poor UX.
I’m not convinced a client should have such a setting. For consistency
and UX, I would rather have a central place for this setting (the
compositor). If we make switching harder (though still possible), then we
can hopefully limit the number of clients wanting to support such a
setting. Since we must support the destructor anyway, on both sides, I
think that is enough for this feature.
Last but not least: it should be much, much clearer that the compositor
is in charge here. This is not about magic SSD: clients must support CSD
in all cases and should not error out if this global is absent. And even
when it is present, the compositor may still want CSD in some cases.
I've added this in the protocol description:
Note that even if the server supports server-side window decorations, clients
must still support client-side decorations.
I think people already got that part, and it does no harm to say it
again, but I was speaking about the configure event. It may happen *at
any time* and the client must be prepared (e.g. for the cases you mentioned).
Quentin “Sardem FF7” Glidic
wayland-devel mailing list