Re: minimized and stick windows

2013-05-16 Thread Pekka Paalanen
On Wed, 15 May 2013 15:26:02 +0200
Alexander Preisinger alexander.preisin...@gmail.com wrote:

 2013/5/15 Pekka Paalanen ppaala...@gmail.com
 
  On Wed, 15 May 2013 14:20:21 +0200
  Alexander Preisinger alexander.preisin...@gmail.com wrote:
 
   Hello,
  
    I thought a bit about it and would like to present my ideas.
    I mainly thought about it from the shell/compositor side, for when I want
    to minimize or maximize surfaces from key bindings, like in some window
   managers.
   
    For example, the client can still request the minimize, maximize, fullscreen
   and
    toplevel actions, but now the compositor responds with a state_update
    event.
    The compositor can also send this state_update when the compositor wants to
    change the window on its own (like some task bar or compositor key
    bindings).
    The client can then save the state and act accordingly (like hiding some
    menus if maximized or fullscreen).
  
    diff --git a/protocol/wayland.xml b/protocol/wayland.xml
    index 3bce022..e0f2c4a 100644
    --- a/protocol/wayland.xml
    +++ b/protocol/wayland.xml
    @@ -811,6 +811,14 @@
           <arg name="output" type="object" interface="wl_output"
     	   allow-null="true"/>
         </request>
    
    +    <request name="set_minimized">
    +      <description summary="minimize the surface">
    +	Minimize the surface.
    +
    +	The compositor responds with a state_update event.
    +      </description>
    +    </request>
    +
         <request name="set_title">
           <description summary="set surface title">
    	Set a short title for the surface.
    @@ -867,6 +875,30 @@
           <arg name="height" type="int"/>
         </event>
    
    +    <enum name="state">
    +      <description summary="different states for a surface">
    +      </description>
    +      <entry name="toplevel" value="1" summary="surface is neither maximized, minimized nor fullscreen"/>
    +      <entry name="maximized" value="2" summary="surface is maximized"/>
    +      <entry name="minimized" value="3" summary="surface is minimized"/>
    +      <entry name="fullscreen" value="4" summary="surface is fullscreen"/>
    +    </enum>
    +
    +    <event name="state_update">
    +      <description summary="update surface state">
    +	Tells the surface which state it has on the output.
    +
    +	This event is sent in response to a set_maximized, set_minimized
    +	or set_fullscreen request to acknowledge the request. The client
    +	can update its own state if it wants to keep track of it.
    +
    +	The compositor also sends this event if it wants the surface
    +	minimized or maximized, for example when the user clicks a task
    +	list item or uses a compositor key binding for fullscreen.
    +      </description>
    +      <arg name="state" type="uint" summary="new surface state"/>
    +    </event>
    +
         <event name="popup_done">
           <description summary="popup interaction is done">
    	The popup_done event is sent out when a popup grab is broken,
  
  
   I don't know about multiple window applications and maybe missed some
  other
   use cases, but I hope this isn't too wrong of an idea. At least this
  should
   hopefully not break the protocol too much.
 
  If I understood right, here you have the client asking the compositor
  for permission, and then the compositor orders the client to be in a
  certain state and will compose it as such, regardless of what the client
  actually draws.
 
  This won't work, fixing the races it causes will complicate the
  protocol and cause roundtrips.
 
  The client draws its window, hence the client is in charge of how it
  looks, and the compositor cannot force that.
 
 
 Hence, it must be the compositor proposing to the client that it should
  e.g. maximize. If the client does that at some point, perhaps first
  sending a few new frames since it was animating, the client will tell
  the compositor it will now go maximized, and then the very next frame
  it draws will be maximized. This avoids flicker.
 
 
 Yes that seems logical. So the update_state should then be a
 request/suggest_state event?

Yup, something like that.

 It seems I am tainted by using tiling window managers, which sometimes
 force the size.

You can never really force a size.

You can tell a client, with the wl_shell_surface.geometry event, that this
window should be made at most this size. The client can choose a
smaller size, but should choose the largest possible size fitting into
the suggested size. Well-behaved clients will do just that.

If a client is not well-behaved, it will look like crap, and there's
nothing to fix that. But note that well-behaved clients may still choose a
smaller window than suggested, so you need to prepare for that in a tiling WM.
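For illustration, the fallback scaling a tiling WM could apply might look like this (a hypothetical helper, not Weston code):

```c
#include <assert.h>

/* Given the size the tiling WM suggested for a tile and the size a
 * (possibly misbehaving) client actually committed, compute the factor
 * to scale the surface down so it still fits the tile.  Well-behaved
 * clients (actual size <= suggested size) map to 1.0, i.e. no scaling. */
static double
tile_fit_scale(int suggested_w, int suggested_h, int actual_w, int actual_h)
{
	double sx = (double)suggested_w / actual_w;
	double sy = (double)suggested_h / actual_h;
	double s = sx < sy ? sx : sy;

	return s < 1.0 ? s : 1.0;
}
```

Because the client never sees global coordinates, it keeps working unchanged while the compositor composites it at the reduced size.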

The "never expose global coordinates" property does allow a tiling WM to
deal gracefully with misbehaving clients, though. You can simply scale
the surface down to the size you really wanted, and the client will
continue working as if nothing strange happened. Only that one client
will look bad, but it still completely shows, and will not obscure
other 

Re: minimized and stick windows

2013-05-16 Thread Pekka Paalanen
On Wed, 15 May 2013 12:27:17 -0700
Bill Spitzak spit...@gmail.com wrote:

 Pekka Paalanen wrote:
 
  Minimize is a little special, since the client does not need to react
  specially for it to look right.
 
 The client does have to react if there is a floating panel that also has 
 to disappear.
 
 For example the floating shared toolbox with 2 main windows. It should 
 only disappear when *both* main windows are minimized.

You very conveniently removed my next sentence, where I already took
this into account.
- pq
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: [RFC] libinputmapper: Input device configuration for graphic-servers

2013-05-16 Thread Todd Showalter
On Thu, May 16, 2013 at 1:37 AM, Peter Hutterer
peter.hutte...@who-t.net wrote:

 why are gamepads and joysticks different? buttons, a few axes that may or
 may not map to x/y and the rest is device-specific.
 this may be in the thread, but I still haven't gone through all msgs here.

Joysticks are designed for a different purpose (flight sims), and
so have a different set of controls.  For example, on a lot of
joysticks there is a throttle, which is a constrained axis you can
set to any position and it will stay there until you move it again.
Button placement on joysticks tends to be more arbitrary as well.

In terms of raw functionality they're similar, but the differences
are large enough (especially in the way they're used) that they are
better treated separately.

   Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.


Re: [RFC] libinputmapper: Input device configuration for graphic-servers

2013-05-16 Thread David Herrmann
Hi Peter

On Thu, May 16, 2013 at 7:37 AM, Peter Hutterer
peter.hutte...@who-t.net wrote:
 On Sun, May 12, 2013 at 04:20:59PM +0200, David Herrmann wrote:
[..]
 So what is the proposed solution?
 My recommendation is that compositors still search for devices via
 udev and use device drivers like libxkbcommon, so linux evdev handling
 is still controlled by the compositor. However, I'd like to see
 something like my libinputmapper proposal being used for device
 detection and classification.

 libinputmapper provides an inmap_evdev object which reads device
 information from an evdev fd or a sysfs /sys/class/input/input<num>
 path, performs some heuristics to classify it, and searches its global
 database for known fixups for broken devices.
 It then provides capabilities to the caller, which allow them to see
 what drivers to load for the device. And it provides a very simple
 mapping table that allows applying fixup mappings for broken devices.
 These mappings are simple 1-to-1 mappings that are supposed to be
 applied before drivers handle the input. This is to avoid
 device-specific fixup in the drivers and move all this to the
 inputmapper. An example would be a remapping for gamepads that report
 BTN_A instead of BTN_NORTH, but we cannot fix them in the kernel for
 backwards-compatibility reasons. The gamepad-driver can then assume
 that if it receives BTN_NORTH, it is guaranteed to be BTN_NORTH and
 doesn't need to special case xbox360/etc. controllers, because they're
 broken.

 I think evdev is exactly that interface and apparently it doesn't work.

 if you want a mapping table, you need a per-client table because sooner or
 later you have a client that needs BTN_FOO when the kernel gives you BTN_BAR
 and you can't change the client to fix it.

 i.e. the same issue evdev has now, having a global remapping table just
 moves the problem down by 2 years.

 a mapping table is good, but you probably want two stages of mapping: one
 that's used in the compositor for truly broken devices that for some reason
 can't be fixed in the kernel, and one that's used on a per-client basis. and
 you'll likely want to be able to override the client-specific one from
 outside the client too.

IMHO, the problem with evdev is that it doesn't provide device
classes. The only class we have is "this is an input device". All
other event-flags can be combined in whatever way we want.

So about 10 years ago, when the first gamepad driver was introduced, we
added some mapping that was unique to this device (the device was
probably unique, too). Some time later, we added some other
gamepad-like driver with a different mapping (as it was probably a
very different device-type back then, and we didn't see it coming
that this would become a wide-spread device-type).
However, today we notice that a GamePad is an established type of
device (like a touchpad), but we have tons of different mappings in
the kernel for backwards-compatibility reasons. I can see that this
kind of development can happen again (and very likely it _will_ happen
again) and it will happen for all kinds of devices.

But that's why I designed the proposal from a compositor's view
instead of from a kernel's view.

A touchpad driver of the compositor needs to know exactly what kind of
events it gets from the kernel. If it gets wrong events, it will
misbehave. As we cannot guarantee that all kernel drivers behave the
same way, the compositor's touchpad driver needs to work around all
these little details on a per-device basis.
To avoid this, I tried to abstract the touchpad-protocol and moved
per-device handling into a separate library. It detects all devices
that can serve as a touchpad and fixes trivial (1-to-1 mapping)
incompatibilities. This removes all per-device handling from the
touchpad driver, and it can expect all the input it gets to conform to
a touchpad protocol.
And in fact, it removes this from all the compositor's input drivers.
So I think of it more like a lib-detect-and-make-compat.
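As a sketch of the kind of 1-to-1 fixup table meant here (names are illustrative, not the actual libinputmapper API):

```c
#include <assert.h>
#include <stddef.h>

/* Kernel input-event-codes values (BTN_A aliases BTN_SOUTH). */
#define BTN_A     0x130
#define BTN_NORTH 0x133

struct inmap_entry {
	unsigned int from;
	unsigned int to;
};

/* Hypothetical fixup for a pad that reports BTN_A where BTN_NORTH is
 * meant; applied once per event, before any compositor input driver
 * sees it, so the gamepad driver never has to special-case the pad. */
static const struct inmap_entry pad_fixup[] = {
	{ BTN_A, BTN_NORTH },
};

static unsigned int
inmap_translate(const struct inmap_entry *map, size_t n, unsigned int code)
{
	size_t i;

	for (i = 0; i < n; i++)
		if (map[i].from == code)
			return map[i].to;
	return code;	/* no fixup needed, pass through */
}
```

The point is that the mapping is a plain table, so broken devices are corrected by data, not by driver code.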

All devices that do not fall into one of the categories (I called it
capability), will be handled as custom devices. So if we want an input
driver for a new fancy device, then we need a custom driver, anyway
(or adjust a generic driver to handle both). If at some point it turns
out that this kind of device becomes more established, we can add a
new capability for it. Or we try extending an existing capability in a
backwards-compatible way. We can then remove the custom-device
handling from the input-driver and instead extend/write a generic
driver for the new capability.


So I cannot follow why you think this will have the same problems as
evdev. Or, let's ask the inverse question: how does this differ from
the X11 model, where we move the custom device handling into the
drivers?

 libinputmapper would use some static heuristics for all this, but
 additionally parse user-configuration. A configuration file contains
 [match] entries, which specify device-configurations to load 

[PATCH 0/2] Updated scaling patch

2013-05-16 Thread alexl
From: Alexander Larsson al...@redhat.com

Here is a new version of the scaling work, based on the feedback
from Pekka. Changes in this version are:

* Better documentation in general, and about coordinate spaces
  in particular.
* Scaling is an integer
* Updated wl_compositor version as needed
* Added a flag to wl_output.mode to mark resolutions that
  are scaled
* Support grouping of property changes to wl_output using a
  wl_output.done event

Alexander Larsson (2):
  protocol: Allow output changes to be treated atomically
  protocol: Support scaled outputs and surfaces

 protocol/wayland.xml | 119 ---
 1 file changed, 104 insertions(+), 15 deletions(-)

-- 
1.8.1.4



[PATCH 2/2] protocol: Support scaled outputs and surfaces

2013-05-16 Thread alexl
From: Alexander Larsson al...@redhat.com

This adds the wl_surface.set_buffer_scale request, and a wl_output.scale
event. These together let us support automatic upscaling of old
clients on very high resolution monitors, while allowing new clients
to take advantage of this to render at the higher resolution when the
surface is displayed on the scaled output.

It is similar to set_buffer_transform in that the buffer is stored in
transformed pixels (in this case scaled). This means that if an output
is scaled, we can directly use the pre-scaled buffer with additional data,
rather than having to scale it.
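As a sketch of the surface-size rule this implies (a hypothetical helper, not code from the patch):

```c
#include <assert.h>
#include <stdbool.h>

/* The surface size in surface local coordinates is the buffer size run
 * through the inverse buffer_transform (90/270-degree rotations swap
 * width and height) and then divided by buffer_scale.  The buffer size
 * is required to be an integer multiple of the scale. */
static void
surface_size_from_buffer(int buf_w, int buf_h, bool transform_swaps_axes,
			 int scale, int *surf_w, int *surf_h)
{
	if (transform_swaps_axes) {
		int tmp = buf_w;
		buf_w = buf_h;
		buf_h = tmp;
	}
	*surf_w = buf_w / scale;
	*surf_h = buf_h / scale;
}
```

So a client attaching a 800x600 buffer with buffer_scale 2 gets a 400x300 surface, which the compositor can show 1:1 on an output with scale 2.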

Additionally this adds a scaled flag to the wl_output.mode flags
so that clients know which resolutions are native and which are scaled.

Also, in places where the documentation was previously not clear as to
what coordinate system was used this was fleshed out.

It also adds a scaling_factor event to wl_output that specifies the
scaling of an output.

This is meant to be used for outputs with a very high DPI to tell the
client that this particular output has subpixel precision. Coordinates
in other parts of the protocol, like input events, relative window
positioning and output positioning are still in the compositor space
rather than the scaled space. However, input has subpixel precision
so you can still get input at full resolution.

This setup means global properties like mouse acceleration/speed,
pointer size, monitor geometry, etc can be specified in a mostly
similar resolution even on a multimonitor setup where some monitors
are low dpi and some are e.g. retina-class outputs.
---
 protocol/wayland.xml | 107 ---
 1 file changed, 93 insertions(+), 14 deletions(-)

diff --git a/protocol/wayland.xml b/protocol/wayland.xml
index d3ae149..acfb140 100644
--- a/protocol/wayland.xml
+++ b/protocol/wayland.xml
@@ -173,7 +173,7 @@
     </event>
   </interface>
 
-  <interface name="wl_compositor" version="2">
+  <interface name="wl_compositor" version="3">
     <description summary="the compositor singleton">
       A compositor.  This object is a singleton global.  The
       compositor is in charge of combining the contents of multiple
@@ -709,7 +709,7 @@
 
	The x and y arguments specify the locations of the upper left
	corner of the surface relative to the upper left corner of the
-	parent surface.
+	parent surface, in surface local coordinates.
 
	The flags argument controls details of the transient behaviour.
       </description>
@@ -777,6 +777,10 @@
	in any of the clients surfaces is reported as normal, however,
	clicks in other clients surfaces will be discarded and trigger
	the callback.
+
+	The x and y arguments specify the locations of the upper left
+	corner of the surface relative to the upper left corner of the
+	parent surface, in surface local coordinates.
       </description>
 
       <arg name="seat" type="object" interface="wl_seat" summary="the wl_seat whose pointer is used"/>
@@ -860,6 +864,9 @@
 
	The client is free to dismiss all but the last configure
	event it received.
+
+	The width and height arguments specify the size of the window
+	in surface local coordinates.
       </description>
 
       <arg name="edges" type="uint"/>
@@ -876,11 +883,16 @@
     </event>
   </interface>
 
-  <interface name="wl_surface" version="2">
+  <interface name="wl_surface" version="3">
     <description summary="an onscreen surface">
       A surface is a rectangular area that is displayed on the screen.
       It has a location, size and pixel contents.
 
+      The size of a surface (and relative positions on it) is described
+      in surface local coordinates, which may differ from the buffer
+      local coordinates of the pixel content, in case a buffer_transform
+      or a buffer_scale is used.
+
       Surfaces are also used for some special purposes, e.g. as
       cursor images for pointers, drag icons, etc.
     </description>
@@ -895,20 +907,25 @@
       <description summary="set the surface contents">
	Set a buffer as the content of this surface.
 
+	The new size of the surface is calculated based on the buffer
+	size transformed by the inverse buffer_transform and the
+	inverse buffer_scale. This means that the supplied buffer
+	must be an integer multiple of the buffer_scale.
+
	The x and y arguments specify the location of the new pending
-	buffer's upper left corner, relative to the current buffer's
-	upper left corner. In other words, the x and y, and the width
-	and height of the wl_buffer together define in which directions
-	the surface's size changes.
+	buffer's upper left corner, relative to the current buffer's upper
+	left corner, in surface local coordinates. In other words, the
+	x and y, combined with the new surface size define in which
+	directions the surface's size changes.
 
	Surface contents are 

Re: [PATCH 2/2] protocol: Support scaled outputs and surfaces

2013-05-16 Thread Jason Ekstrand
On May 16, 2013 8:44 AM, al...@redhat.com wrote:

 From: Alexander Larsson al...@redhat.com

 This adds the wl_surface.set_buffer_scale request, and a wl_output.scale
 event. These together let us support automatic upscaling of old
 clients on very high resolution monitors, while allowing new clients
 to take advantage of this to render at the higher resolution when the
 surface is displayed on the scaled output.

 It is similar to set_buffer_transform in that the buffer is stored in
 transformed pixels (in this case scaled). This means that if an output
 is scaled, we can directly use the pre-scaled buffer with additional data,
 rather than having to scale it.

 Additionally this adds a scaled flag to the wl_output.mode flags
 so that clients know which resolutions are native and which are scaled.

 Also, in places where the documentation was previously not clear as to
 what coordinate system was used this was fleshed out.

 It also adds a scaling_factor event to wl_output that specifies the
 scaling of an output.

 This is meant to be used for outputs with a very high DPI to tell the
 client that this particular output has subpixel precision. Coordinates
 in other parts of the protocol, like input events, relative window
 positioning and output positioning are still in the compositor space
 rather than the scaled space. However, input has subpixel precision
 so you can still get input at full resolution.

 This setup means global properties like mouse acceleration/speed,
 pointer size, monitor geometry, etc can be specified in a mostly
 similar resolution even on a multimonitor setup where some monitors
 are low dpi and some are e.g. retina-class outputs.

This looks better.

I still think we can solve this problem better if the clients, instead of
providing some sort of pre-scaled buffer that matches the output's
arbitrary scale factor, simply told the compositor which output they
rendered for. Then everything would be in that output's coordinates. If the
surface ever lands on a different output, the compositor can scale
everything relative to the selected output. Surfaces which do not specify
an output would just get scaled by the output's factor. This has three advantages.

1. Everything is still pixel-perfect and there are no input rounding errors.

2. There is no confusion about things like subsurface positioning, and
clients can place subsurfaces at any pixel, not just at multiples of the
scale factor. (Subsurfaces would have to inherit their "rendered for
output X" setting from the parent to keep everything sane.)

3. Since surfaces are scaled relative to their preferred output, the user
can specify arbitrary scaling factors for each output and is not
restricted to integers.

I proposed this in more detail in a previous email but no one bothered to
respond to it.
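A rough sketch of the scaling rule described above (hypothetical, not an existing Wayland interface):

```c
#include <assert.h>

/* A surface tagged with the output it rendered for is shown 1:1 on
 * that output, and scaled by the ratio of scale factors anywhere else.
 * A surface that names no output (rendered_for_scale <= 0 here) falls
 * back to being scaled by the output's own factor. */
static double
surface_scale_on_output(double output_scale, double rendered_for_scale)
{
	if (rendered_for_scale <= 0.0)	/* no output specified */
		return output_scale;
	return output_scale / rendered_for_scale;
}
```

Since the ratio is a plain division, nothing here restricts per-output factors to integers, which is the third advantage claimed.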

Thanks,
--Jason Ekstrand


Re: minimized and stick windows

2013-05-16 Thread Bill Spitzak



Pekka Paalanen wrote:

For example the floating shared toolbox with 2 main windows. It should 
only disappear when *both* main windows are minimized.


You very conveniently removed my next sentence, where I already took
this into account.
- pq


Sorry, obviously I did not read very carefully:

 Actually, if you think about a multi-window application, minimize 
needs to work the same way, so that application can hide all relevant 
windows (but maybe not *all* windows).


I think it is also important to note that the compositor cannot even
hide the window the minimize is for. This is because that hide should be
in sync with the hiding of the other windows, so the client should do all
of them.



Re: [RFC] libinputmapper: Input device configuration for graphic-servers

2013-05-16 Thread Bill Spitzak

David Herrmann wrote:


/**
* @INMAP_CAP_ACCELEROMETER
* Accelerometer interface
*
* Accelerometer devices report linear acceleration data as ABS_X/Y/Z
* and rotational acceleration as ABS_RX/Y/Z.
*
* @TODO this collides with ABS_X/Y of absolute pointing devices
*   introduce ABS_ACCELX/Y/Z
*/


If actual position is called ABS_X and the first derivative is called 
REL_X, then it does not make sense for the second derivative to be 
called ABS again. The events should be called ACCEL_X or REL_REL_X or 
REL2_X.



Re: [PATCH 2/2] protocol: Support scaled outputs and surfaces

2013-05-16 Thread Bill Spitzak

al...@redhat.com wrote:


Coordinates
in other parts of the protocol, like input events, relative window
positioning and output positioning are still in the compositor space
rather than the scaled space. However, input has subpixel precision
so you can still get input at full resolution.


If I understand this correctly, this means that a client that is aware 
of the high-dpi is still unable to make a surface with a size that is 
not a multiple of the scale, or to move the x/y by an amount that is not 
a multiple of the scale, or position subsurfaces at this level of accuracy.


The only way I can see to make it work is that all protocol must be in 
buffer space (or perhaps in buffer space after the rotation/reflection 
defined by buffer_transform). This also has the advantage (imho) of 
getting rid of one of the coordinate spaces a client has to think about.



Re: [PATCH 2/2] protocol: Support scaled outputs and surfaces

2013-05-16 Thread Bill Spitzak

Jason Ekstrand wrote:

I still think we can solve this problem better if the clients, instead 
of providing some sort of pre-scaled buffer that matches the output's 
arbitrary scale factor, simply told the compositor which output they 
rendered for.


That is equivalent to providing a scale factor, except that the scale 
factor has to match one of the outputs.


A client will not be able to make a low-dpi surface if there are only 
high-dpi outputs, which seems pretty limiting.


You could say that the scaler api would be used in that case, but this 
brings up the big question of why this api and the scaler are different 
when they serve the same purpose?



Re: [PATCH 2/2] protocol: Support scaled outputs and surfaces

2013-05-16 Thread Jason Ekstrand
On May 16, 2013 1:11 PM, Bill Spitzak spit...@gmail.com wrote:

 Jason Ekstrand wrote:

 I still think we can solve this problem better if the clients, instead
of providing some sort of pre-scaled buffer that matches the output's
arbitrary scale factor, simply told the compositor which output they
rendered for.


 That is equivalent to providing a scale factor, except that the scale
factor has to match one of the outputs.

What I didn't mention here but did before is that this could be combined
with an integer scale factor in case you want to render at a multiple.  If
you throw that in, I think it covers all of the interesting cases.


 A client will not be able to make a low-dpi surface if there are only
high-dpi outputs, which seems pretty limiting.

If you want a low DPI surface you can just not specify the scale/output at
all. Then it will just assume something like 100dpi and scale.


 You could say that the scaler api would be used in that case, but this
brings up the big question of why this api and the scaler are different
when they serve the same purpose?

The point of this api is to allow surfaces to render the same size on
different density outputs. The point of the scaler api is to allow a
surface to render at a different resolution than its specified size. The
two are orthogonal.

--Jason Ekstrand


Re: [PATCH 2/2] Move the EDID parsing to its own file

2013-05-16 Thread Graeme Gill
Bill Spitzak wrote:
 The Y of the primaries can be used as an alternative method of specifying
 the whitepoint (convert the 3 Yxy colors to XYZ, add them, then convert
 back to Yxy, and the xy is the whitepoint, I think).

Correct, but this assumes the display is perfectly additive. Real world ones 
mightn't be, so
Yxy or XYZ x RGBW does provide extra information.
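That additive computation can be written out concretely (a sketch; the helper name is made up):

```c
#include <assert.h>
#include <math.h>

/* Convert each primary from xyY to XYZ (X = x*Y/y, Z = (1-x-y)*Y/y),
 * sum the three XYZ vectors, and take the chromaticity of the sum.
 * This assumes a perfectly additive display, as noted above. */
static void
whitepoint_from_primaries(const double x[3], const double y[3],
			  const double Y[3], double *wx, double *wy)
{
	double X = 0.0, Ysum = 0.0, Z = 0.0;
	int i;

	for (i = 0; i < 3; i++) {
		X += x[i] * Y[i] / y[i];
		Ysum += Y[i];
		Z += (1.0 - x[i] - y[i]) * Y[i] / y[i];
	}
	*wx = X / (X + Ysum + Z);
	*wy = Ysum / (X + Ysum + Z);
}
```

Feeding in the sRGB primaries with their standard luminances recovers the D65 white point (x ≈ 0.3127, y ≈ 0.3290), which checks the formula.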

 I think the reason rgb sets are specified as 4 pairs of xy (the three
 primaries and the whitepoint) instead of 3 triples is to remove the
 arbitrary multiplier that would make it 9 numbers instead of 8.

A lot of display folk are very chromaticity diagram oriented - they are used to 
just
specifying xy, and it's nice to have the white point be explicit so you can 
more easily
check what the white point color temperature is.

Graeme Gill.



Is light-weight window manager possible with Wayland?

2013-05-16 Thread Michael Pozhidaev
Hello!

Maybe the things I am asking now are well-known, but although I have tried to
read various materials about Wayland and Weston, some details remain
unclear to me. I would like to clarify them and would be very grateful
for any help!

Is the creation of a light-weight window manager possible for Wayland? I mean
a window manager that does window size and position manipulation in the
same manner as, for example, DWM does.

I think it is very convenient when it is possible to write your own WM
with less than 2000 lines of code and implement any custom behaviour you
want. I am not interested in any visual effects or other features
usually discussed as Wayland or Weston advantages.

Is the shell term used in Weston the thing I need? Can I implement tiling
WM behaviour with it? I searched for a tiling window manager for Wayland
and found ADWC:

http://www.phoronix.com/scan.php?page=news_item&px=MTA5MTA

This page says that ADWC supports multiple monitors and is an experimental
fork of the Weston reference compositor for Wayland. I think any clone of
Weston itself cannot be a real light-weight window manager.

Am I wrong in my conclusions about the shell for Weston? Thank you for any
help! :))

-- 
Michael Pozhidaev. Tomsk, Russia.
Russian info page: http://www.marigostra.ru/