Re: [ANNOUNCE] libevdev - a library to wrap the evdev kernel interface

2013-06-27 Thread Todd Showalter
On Thu, Jun 27, 2013 at 12:52 AM, Peter Hutterer
peter.hutte...@who-t.net wrote:

> For the last month or so I've been spending some time on a helper library
> for evdev devices.

Looks nice!

I assume this doesn't abstract away the need to be root to access events.
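[Editor's note: libevdev only wraps a file descriptor the caller can already open, so access control stays with root/udev ACLs/the input group. A hypothetical sketch of probing that from Python; the function names are invented for illustration, not libevdev API:]

```python
# Check which evdev nodes the current process can actually read.
# Illustrative only: libevdev itself does not handle permissions.
import errno
import glob
import os

def describe_open_error(err: int) -> str:
    """Translate an errno from open(2) on an event node into a hint."""
    if err == errno.EACCES:
        return "permission denied: need root, a udev ACL, or input group"
    if err == errno.ENOENT:
        return "device node is gone (device unplugged?)"
    return os.strerror(err)

def readable_event_nodes(pattern: str = "/dev/input/event*") -> list:
    """Return the event nodes this process could open read-only."""
    nodes = []
    for path in sorted(glob.glob(pattern)):
        try:
            fd = os.open(path, os.O_RDONLY | os.O_NONBLOCK)
        except OSError as e:
            print(path, "->", describe_open_error(e.errno))
        else:
            os.close(fd)
            nodes.append(path)
    return nodes
```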

Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: [RFC] libinputmapper: Input device configuration for graphic-servers

2013-05-16 Thread Todd Showalter
On Thu, May 16, 2013 at 1:37 AM, Peter Hutterer
peter.hutte...@who-t.net wrote:

> why are gamepads and joysticks different? buttons, a few axes that may or
> may not map to x/y and the rest is device-specific.
> this may be in the thread, but I still haven't gone through all msgs here.

Joysticks are designed for a different purpose (flight sims), and
so have a different set of controls.  For example, on a lot of
joysticks there is a throttle, which is a constrained axis you can
set to any position and it will stay there until you move it again.
Button placement on joysticks tends to be more arbitrary as well.

In terms of raw functionality they're similar, but the differences
are large enough (especially in the way they're used) that they are
better treated separately.

   Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.


Re: Gamepad focus model (Re: Input and games.)

2013-05-13 Thread Todd Showalter
On Mon, May 13, 2013 at 2:33 AM, David Herrmann dh.herrm...@gmail.com wrote:

> That is why the kernel provides PHYS and UNIQ fields for every
> input device (they might be empty if not implemented, but at least
> they're supposed to be there..). PHYS provides the physical location
> for the device. UNIQ provides a unique identification.

PHYS is potentially useful, but not reliable; there are often
several USB ports near each other, and the player may not re-plug the
device in the same one, especially if they are on the back of the
machine.  With wireless setups we can't be sure that PHYS will be
identical between connections.  Still, it's useful as a hint.

UNIQ has been an empty string on every USB input device I've
encountered.  I need to get bluetooth set up (my PC has no bluetooth
hardware, I need to buy a card or something), so I can't say whether
wireless devices provide useful UNIQ values.
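[Editor's note: the identity heuristic being discussed, trust UNIQ when the device fills it in, fall back to PHYS as a weak hint, can be sketched as below. Field and function names are invented for illustration, not any real API:]

```python
# Guess whether two device records describe the same physical device.
# UNIQ is meant to be a stable serial number; PHYS is only a port hint.
def same_physical_device(a: dict, b: dict) -> str:
    """Compare (vendor, product, uniq, phys) records.
    Returns 'yes', 'likely' or 'unknown'."""
    if (a["vendor"], a["product"]) != (b["vendor"], b["product"]):
        return "unknown"            # different model entirely
    if a["uniq"] and b["uniq"]:
        # Both devices report a serial: treat it as decisive.
        return "yes" if a["uniq"] == b["uniq"] else "unknown"
    if a["phys"] and a["phys"] == b["phys"]:
        # Same port: useful, but the user may have re-plugged
        # elsewhere, so only "likely".
        return "likely"
    return "unknown"
```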

   Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.


Re: [RFC] libinputmapper: Input device configuration for graphic-servers

2013-05-12 Thread Todd Showalter
On Sun, May 12, 2013 at 10:20 AM, David Herrmann dh.herrm...@gmail.com wrote:

> So what is the proposed solution?
> My recommendation is, that compositors still search for devices via
> udev and use device drivers like libxkbcommon. So linux evdev handling
> is still controlled by the compositor. However, I'd like to see
> something like my libinputmapper proposal being used for device
> detection and classification.


I could work with this.  Right now I'm somewhat tackling the whole
problem head-on; I've got a server with an inotify watch on /dev/input
that binds a udp port.  Connect to the udp port, get a stream of
events for all devices the server cares about (which in my case is
gamepads), with suitable remapping.

The idea isn't necessarily to do things precisely that way, it's
more a proof of concept tool; it's easy enough to replace the udp
socket with a fifo or just jack it in to something as a back-end.
Likewise, it could just be processing data streams rather than dealing
directly with /dev/input.  The remapping could also be sourced out to
your library.
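[Editor's note: the forwarded stream can literally be the kernel's struct input_event records. A sketch of packing/unpacking such records for a datagram stream; the 'llHHi' layout matches a 64-bit Linux build, which is an assumption, since the struct is architecture-dependent:]

```python
# struct input_event is (tv_sec, tv_usec, type, code, value).
# Native 'llHHi' gives the 64-bit Linux layout (24 bytes).
import struct

INPUT_EVENT = struct.Struct("llHHi")

def pack_event(sec, usec, etype, code, value) -> bytes:
    """Pack one event record for the wire."""
    return INPUT_EVENT.pack(sec, usec, etype, code, value)

def unpack_events(data: bytes):
    """Split a datagram back into individual event tuples."""
    for off in range(0, len(data) - INPUT_EVENT.size + 1, INPUT_EVENT.size):
        yield INPUT_EVENT.unpack_from(data, off)
```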

I haven't had as much time to devote to it as I'd like, but I'm
hoping I can get the code up soon.

  Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.


Re: [PATCH 0/2] Support for high DPI outputs via scaling

2013-05-08 Thread Todd Showalter
On Wed, May 8, 2013 at 6:51 AM,  al...@redhat.com wrote:

> I'm working on trying to make high DPI (i.e. retina-class) outputs
> work well on Linux. I've written a proposal here:
>
> https://docs.google.com/document/d/1rvtiZb_Sm9C9718IoYQgnpzkirdl-wJZBBu_qLgaYyY/edit?usp=sharing

I'm dubious about handling things this way.  This is what gets
done in iOS and OSX, and it's what Microsoft tried decades ago; at one
point in the win16/win32 era you were supposed to do everything in
"twips" and "mickeys" (a twip was a device-independent
usually-about-a-pixel measurement, a mickey was a unit of mouse
movement...).

There are problems with this.

We're moving to a HiDPI world.  With 3DTV failing in the market,
it's looking like the TV manufacturers are going to push hard on 4K,
and once 4K panels are widely available they'll work their way down to
PCs fairly quickly.  By the time Wayland is moving into mainstream
adoption, if 4K isn't the standard yet for 20"-24" monitors it will
be imminent.

We're at that point in iOS now; with the exception of legacy
hardware, the iPad Mini and the (presumably soon to be discontinued)
iPad 2, everything is retina and has a 2.0 scale factor.  We're
probably less than a year away from the end of Apple selling any iOS
devices with a 1.0 scale factor, and two to three years at most from
the point where support for 1.0 scale devices is a consideration for
developers.  Apple needed something to bridge the gap between the
devices, but that gap is just about behind us now, and the scale
factor is going to remain as a minor programming wart for a long time
to come; something that trips up new developers.

Windows blew it with their DPI adjustments; in my experience
changing the DPI significantly on Windows (up to and including win7)
breaks things.  Even OS tools render wrong, and many times I've run
into games that don't handle mouse positions properly when the DPI is
changed.  If you run Stardock's Fallen Enchantress at double DPI, for
example, the mouse pointer can move across the entire screen, but as
far as the game is concerned the actual location of the pointer is
scaled by 0.5, constrained to the upper left quarter of the screen.
This is not an uncommon problem.

More fundamentally, HiDPI doesn't really capture the nature of
the problem.  As an example, I have a living room PC I built mostly
for gaming.  I've got a 46" HDMI TV it's hooked up to, and according
to my measuring tape I'm sitting about 10' from it (sorry for the
furlong/firkin/fortnight measurement system, but it seems like metric
hasn't made it to monitor discussions yet).

The rough standard for monitors these days is a 22" monitor viewed
from an 18" distance.  Ish.  YMMV.  But that's about what most people
do.  Most monitors these days are 1920x1080, since that's what TV
panels are, and my tv is no different.  So, similar triangles: a
22" monitor at 18" would be the equivalent of a 146" monitor at 120".
The practical effect of this is that normal text is nearly impossible
to read.

The problem is entirely about the angle subtended by a pixel from
the position of your eye. If you aren't thinking about it in those
terms, you're misunderstanding the problem.  Oddly, Apple got it right
in their *marketing* for the retina iPhone, but they blew it in
implementation; if I plug a mac mini into my TV, it still assumes it's
a 22 monitor I'm sitting 18 away from, and there's no way I know of
to convince it otherwise.  It's all but unusable as a living room PC
unless you have a crazy-big TV, are sitting way to close, or are part
falcon.

The angle subtended by a pixel to the viewer is the problem.

The reason why this matters is that you can't know what that angle
*is*, outside of specialized environments.  My 46" TV is 48dpi, and
that could presumably be determined by EDID (assuming Sony didn't
screw the table up), but there's no way for the computer or tv to know
how far away from it I'm sitting, and that's a critical variable; at
the distance I sit, a pixel subtends only 7.5% of the angle it would
if I was sitting at 18".
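[Editor's note: the subtended angle reduces to a one-line formula. A sketch using the figures quoted in this mail; the 96dpi desktop value is my assumption for a typical 22" 1920x1080 panel:]

```python
# Angle one pixel subtends at the eye:
#   theta = 2 * atan(1 / (2 * dpi * distance)), distance in inches.
import math

def pixel_angle_arcmin(dpi: float, distance_in: float) -> float:
    """Angular size of one pixel, in arc-minutes."""
    theta = 2.0 * math.atan(1.0 / (2.0 * dpi * distance_in))
    return math.degrees(theta) * 60.0

desk = pixel_angle_arcmin(96, 18)    # ~96dpi desktop monitor at 18"
couch = pixel_angle_arcmin(48, 120)  # 48dpi TV viewed from 10 feet
```

The same dpi looks wildly different at different distances, which is the whole point: dpi alone is not enough information.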

This is one of the major flaws with the OSX/iOS approach; they
went for convenience (integer scale based somewhat on DPI and somewhat
on screen size) rather than solving the real problem.  So, we've got
the iPad Mini displaying 160dpi using the same gui as the iPad 2
displaying 130dpi, and we've got TVs displaying pixel-for-pixel the
same gui as desktop displays.

Ultimately, the answer may just be to do what you're planning and
make sure that there's some sort of simple tool to let the user set
their view distance.  Whatever we do, though, it needs to deal with
this problem.

   Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.


Re: [PATCH 0/2] Support for high DPI outputs via scaling

2013-05-08 Thread Todd Showalter
, or games trying to
decide how large they should render their GUI text.

   Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.


Re: [PATCH 0/2] Support for high DPI outputs via scaling

2013-05-08 Thread Todd Showalter
On Wed, May 8, 2013 at 3:51 PM, Jason Ekstrand ja...@jlekstrand.net wrote:

> Also, I agree that you're going to have mixed setups.  Like you said, people
> replace their laptops fairly frequently.  However, I have a monitor sitting
> on my desk that I bought in 2004 and it's still working perfectly with no
> dead pixels.  Even if I go out next year and buy a super hi-res laptop,
> I'll still plug it into that monitor as an external.  I think we can expect
> mixed setups for quite some time.

True; I guess my point there is that I'm betting that in the next
year or so the CCFL on your LCD monitor will go, and you're going to
be junking it.  It seems like the backlight on CCFL LCD monitors has a
MTTF of around a decade, and aftermarket replacement of CCFLs is
neither easy nor financially sensible from what I've seen.

> Also, you have to remember what drives buying monitor A over monitor B.  A
> lot of the reason to bump from 1280x1024 to 1600x1200 or 1920x1080 is screen
> space.  You can put a lot more windows (and therefore get more work done) on
> a 1920x1080 screen than on the old 1280x1024 screen.  With the bump to 4k,
> you don't get a bump in space, just resolution.  Therefore, I don't know how
> jumpy people are going to be to replace the 1920x1080 screen with one that's
> more expensive but doesn't grant them extra room to work.  I don't think the
> switch to 4k will be as rapid as you are suggesting.

What I think is going to happen is the TV makers will be pushing
4K for the home theatre market, and will hope to convince people
with TVs to upgrade to 4K.  There's been a lot of noise over 4K this
year, mostly because 3DTV has been dead in the water, TVs are becoming
super cheap commodity items (the price of a big TV has come down by a
literal order of magnitude in the past decade; I remember seeing 50"
plasma TVs for $20K CDN, now I can have one for less than $2K CDN, and
that doesn't even take a decade of inflation into account), and the
manufacturers are desperate to find a way to convince people that
their current TV is horribly obsolete and needs replacing.  It sounds
like the push is going to be for 4K TVs paired with 4K-capable bluray.

Once they shift manufacturing over to 4K panels, they're going to
want to spin down the 1080p panel lines, and when that happens the
1080p monitors start to move to a less desirable part of the
supply/demand curve.

That's why we're at 1080p now, when five years ago it was fairly
easy to find 1600x1200 and 1920x1200 panels.  The TV market brings
economy of scale to a specific panel size, and whatever they settle on
is what's going to be cheap.  Computers users aren't driving the
market.

The recent upswing in the availability of LED backlights may
change things; I'm not sure.  In theory, LED backlights ought to have
a far longer MTTF; at that point I'm guessing that it's going to be
the longevity of the power supply that dominates the service life of
monitors.  Maybe that means hardware will stick around longer.

The point, though, is that while we can argue timeframe, we're
clearly rapidly approaching peak 1080p, and the future isn't lower
def.

>> Is there a way to bypass the scaling?  When I'm using the mouse in
>> a game, I want to know the pixel position of the pointer, with no
>> scaling applied.  I'm going to be drawing the gui at native res, and I
>> want the mouse to interact with it at native res.
>
> I think that's supposed to be solved by wl_pointer giving sub-pixel
> accuracy.

I suppose as long as I know the scale factor, I can reverse the scaling.
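[Editor's note: reversing the scaling is a single multiply. A sketch, assuming wl_fixed's documented 1/256 precision and that one surface unit maps to buffer_scale native pixels:]

```python
# wl_fixed_t carries coordinates in 1/256ths of a surface unit.
def wl_fixed_to_float(f: int) -> float:
    return f / 256.0

def surface_to_buffer_px(fixed_coord: int, buffer_scale: int) -> float:
    """Surface-local wl_fixed coordinate -> buffer (native) pixels."""
    return wl_fixed_to_float(fixed_coord) * buffer_scale
```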

   Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.


Re: Gamepad focus model (Re: Input and games.)

2013-05-07 Thread Todd Showalter
On Tue, May 7, 2013 at 3:23 AM, Pekka Paalanen ppaala...@gmail.com wrote:

> Yeah, like Daniel said, there is no concept of a return value.
>
> When a client creates a new object, the server can only either agree,
> or disconnect the client with a protocol error. Any other behaviour
> requires specialized handling, and causes a roundtrip, where the client
> must either wait for a reply before continuing, or risk having further
> requests ignored without any obvious way to know what got ignored in
> the end. Both cases are unacceptable.

Ok.  I was assuming that cases where you had fundamental
capability change in the server (ie: input devices appearing or
disappearing) were rare and special enough to warrant a round trip.

> When a client sends a request, that creates a new protocol object, then
> from the client's point of view, the object is created on that instant,
> before the request has even been submitted to the wire. This allows the
> client to immediately send more requests on that new object, without
> waiting for a roundtrip in between. The same works also in the reverse
> direction, when the server creates protocol objects by sending events.
>
> A major design principle in Wayland is to minimize roundtrips, as it
> leads to better performance and lower overhead.

Fair enough.  We're talking about rare events here, so I wouldn't
have called it essential, but if that's an organizing principle of the
project then so be it.

> It's not about the gamepad capabilities at all. It's just an
> assignment, configured in the server: this input device belongs to
> player N.

The place where that becomes a problem is with controller
batteries.  As an example, I've got a PS3, and my wife uses it to
watch netflix (it's a streaming tv/movie service, for those who
haven't heard of it).  It uses the PS3 controller as a remote, to do
things like play/pause.

It's not uncommon for the battery in the controller to run flat
while she's watching.  I've got a second controller, and we typically
charge one while the other is in use, but fairly often the controller
she's using runs flat.  When that happens, we have a second charged
controller, but to use it we have to reboot the PS3, because without
rebooting it connects as Player 2, and netflix only listens to Player
1.  As far as I know there's no simple way to tell the gamepad to
reconnect as Player 1, short of rebooting the machine and rerunning
all the controller handshaking.

When a gamepad goes away and then it reappears or another appears,
it's *probably* the same player.  So what I'm thinking is that it
makes more sense to have the wl_gamepad go into a disconnected
state, and then reactivate when the next gamepad appears, rather than
creating a new wl_gamepad.

> If the gamepad later comes back online, it is like it was hotplugged
> again: a new wl_gamepad object is sent, with the same player id as
> before.

This would work too.  The main thing is dealing well with the
single player case where the player is replacing a gamepad.  This
could be because:

- they wandered out of RF range when they were getting a drink
- they want to play the game with a different gamepad
- the gamepad they were using ran out of power and is now plugged in via usb
- the gamepad they were using ran out of power and is being replaced
with a charged gamepad
- someone tripped over the usb cord and yanked it out and then plugged
it back in

> Yeah, the main point of the leave event is to say you don't get any
> more input events from this device, until it comes back, and it also
> implies that the client should forget all temporary state of the
> gamepad, like which buttons were down.

Yes.

> Immediately following an enter event, or in the enter event, a new set
> of current state is sent. Notice, that this should not be done by
> sending e.g. fake button-down events. We have a protocol design policy,
> that input events from user actions are never manufactured.

My temptation would actually be to say that when focus goes to a
new application, we treat buttons that are down as if they were up;
don't send a release when they are lifted.  So, if I'm holding down
SELECT when focus enters the client window and then release it, press
it and release it, the client sees the press and the second release,
but not the initial release.

That doesn't work with axis values, but if the client cares about
deltas it's going to have to clear them on focus change anyways, since
it has already been said that the protocol will not be sending deltas.
 If we were sending deltas we could make things a little cleaner in
some ways, but it does expand the protocol and I'm not sure it does so
usefully.
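[Editor's note: the "treat held buttons as up, swallow the stale release" policy described above can be sketched as a tiny client-side filter. All names are invented for illustration, not protocol API:]

```python
# Drops the stale release of buttons that were already held when
# gamepad focus entered the surface.
class GamepadFocusFilter:
    def __init__(self):
        self._suppress = set()  # buttons held at focus-in

    def on_enter(self, held_buttons):
        # Buttons down at focus-in are treated as if they were up.
        self._suppress = set(held_buttons)

    def on_button(self, button, pressed):
        """Return the event to deliver to the game, or None to drop it."""
        if pressed:
            # A real press cancels any pending suppression.
            self._suppress.discard(button)
            return (button, True)
        if button in self._suppress:
            # First release of a button held across focus-in: swallow it.
            self._suppress.discard(button)
            return None
        return (button, False)
```

With this, the SELECT example from the mail plays out as described: the release after focus-in is dropped, the following press and release are delivered.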

   Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.


Re: Gamepad focus model (Re: Input and games.)

2013-05-07 Thread Todd Showalter
On Tue, May 7, 2013 at 1:02 PM, Pekka Paalanen ppaala...@gmail.com wrote:

>> This would work too.  The main thing is dealing well with the
>> single player case where the player is replacing a gamepad.  This
>> could be because:
>>
>> - they wandered out of RF range when they were getting a drink
>> - they want to play the game with a different gamepad
>> - the gamepad they were using ran out of power and is now plugged in via usb
>> - the gamepad they were using ran out of power and is being replaced
>> with a charged gamepad
>> - someone tripped over the usb cord and yanked it out and then plugged
>> it back in
>
> Yeah, sure, and that's all just heuristics inside the server. The
> server needs to make sure the player id becomes what the user
> wants, even if one wl_gamepad object is deleted and another created.

The client needs to look at a new wl_gamepad when it shows up and
decide whether it's a new player or an existing player who is
reconnecting,  As long as it's easy for the client to do that, I think
we're good.

> The problem you described with PS3 should be solvable with the
> mysterious gamepad configuration GUI I talked about before, somehow.

Partly, though I think the default case should be that if a
controller disappears and another (or the same one) appears, the
assumption is it's the player that just left coming back.  The number
of times that isn't true isn't likely to be statistically significant.

>> My temptation would actually be to say that when focus goes to a
>> new application, we treat buttons that are down as if they were up;
>> don't send a release when they are lifted.  So, if I'm holding down
>> SELECT when focus enters the client window and then release it, press
>> it and release it, the client sees the press and the second release,
>> but not the initial release.
>
> It depends. If a gamepad enters with button A down, and then the
> user presses button B down, is the application supposed to respond
> to B or A+B?

In my experience games that use gamepads don't usually use the
gamepad buttons as modifiers; it can happen, but it's awkward to
explain to the player and often awkward to actually perform with the
hands.  What you get more often is some sort of lock-on, where holding
a button down makes player motion relative to a target (so you can
circle-strafe around an opponent, for example).  In cases like this
the focus switch is likely to have broken the player's context
anyways.

Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.


Re: Input and games.

2013-05-06 Thread Todd Showalter
On 2013-05-06, at 2:54 AM, Pekka Paalanen ppaala...@gmail.com wrote:

>> I don't think there's any problem in principle with the gamepad
>> events being delivered to the same client that has keyboard focus.
>> The only annoying thing is if (in a multiplayer game) someone can
>> screw you by sending you a well-timed IM that pops up a window and
>> steals focus, but honestly I think that's more an argument against
>> focus stealing than it is for not attaching gamepad focus to keyboard
>> focus.
>
> Focus stealing indeed, there has been some discussion about that.
>
> The problem is, that a wl_seat may not have a keyboard, hence it does
> not have a keyboard focus. And if there are multiple wl_seats, one for
> each player, as a user I don't want to individually assign each player's
> focus to the game.

That seems like an argument for ganging gamepads into a single seat, 
preferably one with a keyboard. I presume we want the normal case to be the 
easy case, and I think the normal case is one game running that has the focus 
of the keyboard, mouse and gamepads.

It's important to support other scenarios, but I think that is the one that 
has to Just Work with as little user effort as possible.

> I could imagine the Wii pointer exposed as a wl_pointer with the
> gamepad... hrm, that's another curious input device that does not fit
> well in our categories: it needs a cursor image, but provides absolute
> positions unlike a mouse, right?

Sort of. It's actually more complex in some ways, because the position 
actually comes from a camera in the end of the wiimote looking at a couple of 
infra red LEDs.  It also has accelerometers and hot-docking peripherals, and a 
built-in speaker.

It can lose sight of the screen, at which point the pointer is in an 
undefined location. The accelerometers mean the pointer can have an 
orientation; you can in principle rotate the pointer.  It can change capability 
somewhat drastically depending on what is jacked in.

It's kind of neat, but if you're writing something that uses it, you're 
going to be writing a lot of device-specific code. I'm not sure how much of a 
useful abstraction can be built around it.

The other thing is, the pointer is driven by a camera looking at LEDs, but 
IIRC decoding that happens on the host machine; it just gets a stream of 
intensity pixmaps from the device and uses that to calculate position. Which 
means there are potentially a lot of interesting things you could do with it if 
you know a little signal processing and how to wire up LEDs.

Todd. 

--
  Todd Showalter, President
  Electron Jump Games, Inc.



Re: Gamepad focus model (Re: Input and games.)

2013-05-06 Thread Todd Showalter
On Mon, May 6, 2013 at 8:36 AM, Pekka Paalanen ppaala...@gmail.com wrote:

> Into wl_seat, we should add a capability bit for gamepad. When the bit
> is set, a client can send wl_seat::get_gamepad_manager request, which
> creates a new wl_gamepad_manager object. (Do we actually need a
> capability bit?)

There are options here:

- have the capability bit; if the bit is set the client can request a
manager -- has to deal with the case where the client sent the request
but the caps bit wasn't set, presumably by returning the protocol
equivalent of NULL or -1

- leave out the caps bit, client requests the manager if they want it,
they get NULL equivalent if there are no gamepads

- leave out the caps bit, gamepad manager is always there, but can be
expected to return 0 if asked to enumerate gamepads when none are
connected

> A wl_gamepad_manager will send an event for each physical gamepad (as
> it dynamically appears, if hotplugged later) associated with this
> particular wl_seat, creating a wl_gamepad object for each.
>
> A wl_gamepad object will send an event about the player id as the first
> thing, and also if it later changes.

Some gamepads don't have player id controls, so we can't rely on
them, but supporting them where we can is useful.  I think it's best
viewed as a really forceful hint as to the player's ID, where
otherwise we're stuck doing heuristics with plugging.

> If a gamepad is hot-unplugged, a wl_gamepad event will notify about
> that, and the wl_gamepad object becomes inert (does not send any
> events, ignores all but the destroy request).

Dealing gracefully with things like wireless gamepads running
their batteries flat or moving out of radio range is important, which
is what I assume this is to deal with.  I presume the idea here is
that if the player moves back into range or replaces the batteries,
the wl_gamepad object revives?

> Gamepad input events are delivered according to the keyboard focus of
> the related wl_seat. If there is no keyboard to focus, then use the
> pointer focus, or something. It doesn't really affect the protocol
> design how the focus is assigned. However, would we need a
> wl_gamepad::enter,leave events? Probably, along with events for initial
> state. Or maybe enter/leave should be wl_gamepad_manager events?

I think we need enter/leave events.  The client can be responsible
for cleaning up its own state, though if an initial state is sent on
focus gain that makes things much easier.

I don't see anything here that raises any flags for me; at least
at first reading it seems quite usable.

Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.


Re: Axis events to keyboard focus (Re: Input and games.)

2013-05-06 Thread Todd Showalter
On Mon, May 6, 2013 at 9:06 PM, Vincent Povirk madewokh...@gmail.com wrote:

> A compositor could just drop events that aren't over a focused window,
> and it would solve Todd's problem, unless he's also expecting to be
> able to scroll things without hovering over them.

My problem is that I expect one of two cases:

1) all focus follows the pointer (this is what I prefer)

2) click to focus

I can live with either, though I vastly prefer
focus-follows-pointer.  The problem in OSX is that it's this broken
mushing of the two systems together; scroll wheel focus follows the
pointer, keyboard is click-to-focus.

Because I interact with document browsers (ie: web browsers, pdf
readers) mostly via the scroll wheel, I often get into a case where I
launch a document browser from somewhere else (IRC, a terminal,
emacs...) and the launching window retains keyboard focus, but I've
got scrollwheel focus on the document browser.  When I hit the
close-tab or close-window hotkey, it gets routed to the *launching*
window (IRC or whatever) because *keyboard* focus never left that
window.  So the wrong window closes, and I'm annoyed at my computer
for doing something stupid.

As the user, my subconscious expectation is that the hotkey will
be sent to the window with which I've been interacting, which is
precisely what doesn't happen.  Instead, it may well go to a window I
haven't interacted with for several minutes, or possibly even longer.

This gets particularly nasty if you walk away from the computer
for a while and then come back.  You wind up having to stop and stare
at the window decorations and the system menu to determine which
window has real focus.  Or you forget, and close the wrong window.
I've thrown my hands up and set absolutely everything in OSX that I
can to confirm-on-close, because otherwise I get burned too often.

I'm dubious about any focus model that requires me to remember
that window A has one kind of focus, and window B has another
simultaneously.  It inevitably leads to pilot error.

   Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.


Re: Input and games.

2013-05-05 Thread Todd Showalter
On Sun, May 5, 2013 at 12:55 PM, Pekka Paalanen ppaala...@gmail.com wrote:

> In a wl_seat, we have one kbd focus, and one pointer focus. These
> two are unrelated, except sometimes some pointer action may change
> the kbd focus. Most of the time, they have no relation.

As a total aside, OSX has this and it drives me nuts.  Scrollwheel
focus follows the pointer, keyboard focus doesn't.  In practise what
that means is that whenever I'm on OSX I wind up closing the wrong
thing.  Example:

- running an irc client and firefox
- colleague sends an url, I click on it
- firefox brings up the url, I mouse over to it and scroll through
with the scroll wheel
- I'm done with the link, clover-w to close the tab, and it closes my
IRC session instead, because keyboard focus never left the irc window

I've had to use OSX for a couple of years now because of some iOS
projects we've been working on, and this still bites me at least once
a day.  It's *completely* counterintuitive GUI behaviour.

> I was thinking of adding a third one: the gamepad focus. It could
> be independent from kbd and pointer foci, or maybe it is assigned
> with the kbd focus. Or maybe the gamepad focus is assigned to the
> surface having any wl_seat's the kbd focus, whose client has bound
> to the gamepad.
>
> In any case, we have the fundamental problem: which client gets the
> gamepad events at a point in time?
>
> There can be several clients bound to any gamepad, and the target
> (focus) must be switchable intuitively.
>
> Is it wrong to think a wl_seat as a user--a player, that may have a
> gamepad?
>
> It's just too tempting for me to think that each player corresponds
> to a particular wl_seat.

I don't think there's any problem in principle with the gamepad
events being delivered to the same client that has keyboard focus.
The only annoying thing is if (in a multiplayer game) someone can
screw you by sending you a well-timed IM that pops up a window and
steals focus, but honestly I think that's more an argument against
focus stealing than it is for not attaching gamepad focus to keyboard
focus.

I don't see any reason why you couldn't have two (or N, for some
reasonable N) games running at the same time, using the same gamepad,
and only the program with focus sees gamepad events.  There are some
tricky cases, if the game wants to have multiple windows with no
containing root window, for example, but maybe that's one of those
"well, don't do that, then" cases.

Having given it some thought, I'd be inclined to be cautious about
how much you consider the gamepad-with-builtin-keyboard case.  They
really made those things to make MMOs viable on game consoles.  As far
as I know, not a lot of people have them, and the main argument for
them is on consoles which don't have native keyboards.  On a PC, the
kinds of games that need keyboards are the kinds of games you tend to
want access to the mouse.  That's not to say that nobody will ever use
a gamepad keyboard in a game on Linux, but I'd argue it's on thin
enough ground that I wouldn't let it drive the design considerations.

  Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-05-03 Thread Todd Showalter
On Fri, May 3, 2013 at 3:34 AM, Pekka Paalanen ppaala...@gmail.com wrote:

 Yup. Whatever we do, we get it wrong for someone, so there needs to be
 a GUI to fix it. But should that GUI be all games' burden, or servers'
 burden...

 Along with the GUI is the burden of implementing the default
 heuristics, which may require platform specific information.

I don't know that you need a GUI to fix it as long as you're
willing to lay down some policy.  We could go with basic heuristics:

- if a gamepad unplugs from a specific usb port and some other gamepad
re-plugs in the same port before any other gamepads appear, it's the
same player

- if a gamepad unplugs from a specific usb port and then appears in
another before any other gamepads appear, it's the same player

- otherwise, you get whatever mad order falls out of the code

I think that covers the common case; if people start swapping
multiple controllers around between ports, they might have to re-jack
things to get the gamepad-player mapping they like, but that's going
to be rare.

 I can summarize my question to this:

 Which one is better for the end user: have device assingment to seats
 heuristics and GUI in the server, and seats to players mapping GUI
 in every game; or have it all in every game?

Heuristics mean less work for the player and behaviour the player
can learn to anticipate.  I say go with that.  I think the moment you
present people with a gui plugboard and ask them to patch-cable
controllers to player IDs, you're in a bad place.

I could see it being an advanced option that a savvy player could
bring up to fix things without rejacking the hardware, but the less
technically savvy are going to have a far easier time just physically
unplugging and replugging gamepads than they are figuring out a GUI
they've never (or rarely) seen before.

   Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.


Re: Input and games.

2013-05-03 Thread Todd Showalter
On Fri, May 3, 2013 at 6:42 AM, Pekka Paalanen ppaala...@gmail.com wrote:

 Sure, the heuristics can cover a lot, but there is still the mad case,
 and also the initial setup (system started with 3 new gamepads hooked
 up), where one may want to configure manually. The GUI is just my
 reminder, that sometimes it is necessary to configure manually, and
 there must be some way to do it when wanted.

 Even if it's just "press the home button in one gamepad at a time", to
 assign players 1 to N.

If there's going to be a gamepad setup gui, my preference would be
for it to be a system thing rather than a game thing.  Partly because
I'm lazy/cheap and don't want to have to do themed versions of it for
every game I do, but also partly because otherwise it's something else
that someone can half-ass or get wrong.

 Well, yes. But the question was not whether we should have heuristics
 or a GUI. The question is, do we want the heuristics *and* the GUI in
 the server or the games? The GUI is a fallback, indeed, for those who
 want it, and so is also the wl_seat-player mapping setup in a game.

 If we do the heuristics in the server, there is very little we have to
 do in the protocol for it. Maybe just allow to have human-readable
 names for wl_seats. The "press home button" to assign players would be
 easy to implement. The drawback is that the server's player 1 might not
 be the game's player 1, so we need some thought to make them match.

 If we do the heuristics in the games, we have to think about what
 meta data of the gamepads we need to transmit. You said something about
 a hash of some things before. If we have just a single hash, we cannot
 implement the heuristics you described above, so it will need some
 thought. Also, if we want to drive things like player id lights in
 gamepads, that needs to be considered in the protocol.

 Maybe there could be some scheme, where we would not need to have the
 wl_seat-player mapping configurable in games after all, if one goes
 with server side heuristics. There are also the things Daniel wrote
 about, which link directly to what we can do.

I vote do it on the server, however it winds up being done.  It
means the client is isolated from a whole bunch of things it would
otherwise need to explicitly support, and it means that things happen
consistently between games.  It also means that any bugs in the
process will be addressable without shipping a new build of the game.

 Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.


Re: Input and games.

2013-05-03 Thread Todd Showalter
On Fri, May 3, 2013 at 12:12 PM, Daniel Stone dan...@fooishbar.org wrote:

 I think edge resistance/edge snapping really wants pointer warping as 
 well.

 It's really difficult to achieve a nicely responsive and fluid UI
 (i.e. doing this without jumps) when you're just warping the pointer.
 To be honest, I'd prefer to see an interface where, upon a click, you
 could set an acceleration (deceleration) factor which was valid for
 the duration of that click/drag only.  We already have drag & drop
 working kind of like this, so it's totally possible to do for relative
 (i.e. wl_pointer) devices.  The only two usecases I've seen come up
 for pointer warping are this and pointer confinement, which I'd rather
 do specifically than through warping - which is a massive minefield I
 really, really want to avoid.

Decelerate/accelerate would cover all the cases I can think of.

 But it's also a totally orthogonal discussion. :)

True enough.  :)

  Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.


Re: Input and games.

2013-05-02 Thread Todd Showalter
On Thu, May 2, 2013 at 5:44 AM, Pekka Paalanen ppaala...@gmail.com wrote:
 On Tue, 30 Apr 2013 09:14:48 -0400
 Todd Showalter t...@electronjump.com wrote:

 I'm getting set up to write code.  Someone kindly gave me a bash
 script to pull down all the components, so once I get things set up
 properly I'll see if I can get a patch together.

 Excellent!

The day job is interfering a bit, but I'm hoping to be able to
start working on this shortly.

 The question is, is a gamepad an object, or is a *set* of gamepads
 an object?

 Both, just like a wl_pointer can be one or more physical mice. Whether a
 wl_pointer is backed by several mice, the clients have no way to know,
 or separate events by the physical device.

 The interfaces are abstract in that sense.

Right.  From a game point of view, we don't want to do the
conflated-device thing; it makes some sense to have two mice
controlling a single pointer on a single device (the thinkpad nub
mouse + usb mouse case), but it never makes sense to have multiple
gamepads generating events for a single virtual gamepad.  The game
needs to be able to tell them apart.

 I'd rather the display server sorted it out, honestly, I just
 wasn't sure how much policy people were comfortable with pushing into
 the display server.

 I think we can put lots of policy in the server. A Wayland server is not
 just a generic display server like X, but is actually tied to the GUI
 paradigms, shell, and the desktop environment. In principle, every DE
 will have its own server, and code re-use is punted as an
 implementation detail. We prefer to communicate intent (set_fullscreen)
 rather than primitive actions (set window size & position it to 0,0 &
 raise).

Ok, good.

 For example, a window manager with all its policies is just a component
 inside a Wayland server. It's also intended to be user configurable,
 like a modern DE.

Fair enough.  So, I'll need to fork Weston if I want to build my
fever dream combo of Quicksilver and Sawfish, then.  :)

 Ok, that makes sense.  So, from the game point of view, if each
 gamepad lives in its own wl_seat, how does the game detect that new
 gamepads have arrived or gone away?  I assume there are wl_seat
 create/destroy events?

 wl_seats are global objects in the protocol, and yes, we have events for
 globals to come and go dynamically. The events are in the wl_registry
 interface.

Ok, so in principle the game just watches for wl_seats appearing
and disappearing, and checks to see if they have gamepads attached.

 If just a gamepad goes away and later comes back, the wl_seat could
 even stay around in between. There can also be seats without a gamepad,
 so it is still the game's responsibility to decide which wl_seats it
 takes as players.

This is the icky problem for whoever handles it.  If a gamepad
disappears and then appears again attached to a different usb port, or
if a gamepad disappears and a different pad appears at the port where
the old one was, is it the same wl_seat?

 Which reminds me: maybe we should add a name string event to wl_seat
 interface? This way a game, if need be, can list the seats by name
 given by the user, and the user can then pick which ones are actual
 players. (It is a standard procedure to send initial state of an object
 right after binding/creating it.) I imagine it might be useful for other
 apps, too.

 Unless it's enough to just pick the wl_seats that have a gamepad?

 Hmm, is this actually any better than just handing all gamepads
 individually without any wl_seats, and let the game sort sort them out?
 How far can we assume that a wl_seat == a player, for *every*
 existing wl_seat? And which player is which wl_seat?

That's why I was assuming originally that gamepads would all be
attached to a single wl_seat and come in with pad_index values.
However it winds up getting wrapped in protocol, what the game is
interested in (if it cares about more than one gamepad, which it may
not) is figuring out when those gamepads appear and disappear, how
they map to players, and what input each player is generating.

Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.


Re: Input and games.

2013-04-26 Thread Todd Showalter
6 - BUTTON_LEFT_STICK -- left stick click
7 - BUTTON_RIGHT_STICK -- right stick click
8 - BUTTON_START -- start button

Controllers may have other buttons, and if so they must map to
index values higher than those of the standard buttons.  Nonstandard
buttons can only be understood in the context of the information
delivered via the wl_gamepad::connect event.

There is perhaps a question here about whether to deal with things
like controller keyboards; some game controllers have keyboards that
connect to them for things like in-game messaging.  Arguably they
belong within the gamepad protocol if they're going to be handled,
since they're per-player keyboards.  That said, they're also uncommon.
 If they are going to be handled, it also makes sense to ask whether
this is actually something where we should be bringing in the wl_seat
abstraction, but that might be abusing wl_seat.  The alternative would
be to do something like use keyboard keysyms, but set the high bit.
Regardless, I'm not sold on including them in the protocol at all.
Call it an open question.

Arguments:
time -- uint -- standard event timestamp
pad_index -- uint -- which gamepad this is
button_index -- uint -- index of the button that changed state
state -- uint -- 0 for released, 1 for pressed


wl_gamepad::accelerometer
An optional part of the protocol; an orientation event indicates a
change in accelerometer data.  Accelerometer data is assumed to be
generated as a three axis vector; some hardware apparently produces
quaternions, which is interesting, but quaternions (at least,
normalized quaternions) don't give you velocity, just orientation.
Besides, without float values in the protocol the data the quaternion
encoding gets icky fast; quaternions are normalized 4d vectors, which
means they need a lot of precision below the decimal.

This is a part of the protocol that is being included for
discussion; I'm not sold on it.

Arguments:
time -- uint -- standard event timestamp
pad_index -- uint -- which gamepad this is
accel_index -- uint -- index of the accelerometer that changed state
x -- uint -- accelerometer x axis, mapped such that 1.0f == 2^15 - 1
y -- uint -- accelerometer y axis, mapped such that 1.0f == 2^15 - 1
z -- uint -- accelerometer z axis, mapped such that 1.0f == 2^15 - 1


wl_gamepad::sysbutton -- gamepad system button event
A sysbutton event occurs when the system button (the ps button on
a ps3 controller, the glowy x button on the xbox 360 controller, the
home button on the wii controller) is pressed.  While this information
might be passed on to the application, it is somewhat expected that
this event will be trapped and acted upon by the window manager.

Arguments:
time -- uint -- standard event timestamp
pad_index -- uint -- which gamepad this is


wl_gamepad::extended -- gamepad hardware-specific extended event
This is an optional extension to the protocol; a method of
handling extra data created by gamepads beyond the standard protocol.
Most extended information would pass through more standard messages;
extra buttons, sticks or trigger values should use those messages with
higher index values.  This message is for handling anything that
doesn't fit the standard model.  I'm not sold on this event either; it
might just be a bad idea.

Arguments:
time -- uint -- standard event timestamp
pad_index -- uint -- which gamepad this is
subtype -- uint -- ordinal identifying the event subtype
a -- uint -- first parameter
b -- uint -- second parameter
c -- uint -- third parameter
d -- uint -- fourth parameter



 Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.


Re: Input and games.

2013-04-26 Thread Todd Showalter
On Fri, Apr 26, 2013 at 8:40 PM, Jason Ekstrand ja...@jlekstrand.net wrote:

 I think you forgot reply-all.  I add wayland-devel again.

Blast.  Sorry about that.  Thanks!

 There is, actually:

 expanded = (base << 7) | (base >> 1);

 ie: repeat the bit pattern down into the lower bits.  Examples:

 11111111 -> (111111110000000) | (1111111) -> 111111111111111
 00000000 -> (000000000000000) | (0000000) -> 000000000000000
 10000000 -> (100000000000000) | (1000000) -> 100000001000000
 10110010 -> (101100100000000) | (1011001) -> 101100101011001

 And so forth.  It's the same scheme you use when doing color
 channel expansion.  I haven't seen a rigorous mathematical proof that
 it's correct, but I'd be surprised if someone more so inclined than I
 hasn't come up with one.

 Wow, I've never seen that one before.  And yes, it is provably exactly
 correct (up to a little integer round-off because of the implicit right
 shift by 1).  I guess I learned a new trick today; that's really cool!

AFAIK folks in graphics hardware have been using that trick at the
hardware level to do color channel expansion (ie: turning RGB565 into
RGB888 or the like) since at least the 90s, but like a lot of the more
clever bit manipulation tricks it's not that widely disseminated.  I
actually came up with it independently back in the 90s and was pretty
proud of myself before a co-worker I was explaining it to shot me down
with "oh, *that*, yeah, that's what my raytracer does." :)

I meant to mention in my original reply that although most
physical hardware (especially historical hardware) is linearly mapped
signed byte or unsigned byte axis values, I think a protocol that's
going to be relatively future proof needs to handle higher precision
and convert well to float.  Most games are going to want the sticks to
map cleanly to either digital up/down/left/right buttons or to [-1.0
.. 1.0] ranges, and both of those are easy translations from the [-32k
.. 32k] range.

  Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.


Fwd: Input and games.

2013-04-26 Thread Todd Showalter
I failed to reply-all before, so I'm forwarding this back to the list.

On Fri, Apr 26, 2013 at 5:46 PM, Jason Ekstrand ja...@jlekstrand.net wrote:

 My first general comment is about floating point.  I'm not 100% sure what
 all went into the design decision to make wl_fixed have 8 bits of fractional
 precision vs. 12 or 16.  I'm guessing that they wanted the increased integer
 capability, but you'd have to ask Kristian about that.  My understanding is
 that most game controllers work with ranges of [0,1] or [-1,1] which would
 be wasteful to put into wl_fixed.  Looking below, it seems as if you're
 fairly consistently picking a 16 bit fractional part.  That breaks out of
 the norm of the wire format a bit, but I think it's justified in this case.
 The big thing is to be consistent which it looks like you're doing anyway.

In my experience, most game controllers actually return byte
values which you wind up interpreting either as signed or unsigned
depending on what makes sense.  Certainly that's been the case
historically.  In games we typically do something like:

stick.x = ((float)raw_x) / ((raw_x >= 0) ? 127.0f : 128.0f);
stick.y = ((float)raw_y) / ((raw_y >= 0) ? 127.0f : 128.0f);

 Another concern is how to map [0, 255] onto [0, 2^15 - 1] cleanly.
 Unfortunately, there is no good way to do this so that 0 -> 0 and 255 ->
 2^15 - 1.  Perhaps that doesn't matter much for games since you're sensing
 human movements which will be slightly different for each controller anyway.

There is, actually:

expanded = (base << 7) | (base >> 1);

ie: repeat the bit pattern down into the lower bits.  Examples:

11111111 -> (111111110000000) | (1111111) -> 111111111111111
00000000 -> (000000000000000) | (0000000) -> 000000000000000
10000000 -> (100000000000000) | (1000000) -> 100000001000000
10110010 -> (101100100000000) | (1011001) -> 101100101011001

And so forth.  It's the same scheme you use when doing color
channel expansion.  I haven't seen a rigorous mathematical proof that
it's correct, but I'd be surprised if someone more so inclined than I
hasn't come up with one.

[wl_gamepad::connect and disconnect]

 Do we really need connect and disconnect timestamped?  Are those timestamps
 going to be reliable/useful?  When you plug in a device, it takes a second
 or two just to detect and show up in /dev.  On that time scale, "when did I
 see the event?" is just as accurate as any timestamp.

It seemed like all events had timestamps as part of the protocol,
so I didn't know how fundamental that was to the underlying system.
The only reason connect and disconnect might need to be timestamped is
if events are going to be batched up, you might possibly have ordering
issues with delivery.  If that's not a problem and the underlying
system doesn't require timestamps, they can go.

[wl_gamepad::button]

 The trigger_index is 0 for left stick values and 1 for right stick
 values.  Hardware with more triggers can potentially supply higher
 values; the pressure-sensitive buttons on the ps3 controller would go
 here, for instance.

 Could you be more clear about what other pressure-sensitive buttons on the
 PS3 controller you're referring to here?  I know they went a bit overboard
 on pressure sensitivity in the PS3 controller and seem to recall that even
 buttons like triangle etc. were pressure-sensitive.  That said, those
 buttons should map as buttons not triggers so that they can be picked up in
 a canonical way.  Are you simply planning to double-report events there?

I included this as a this data could work without breaking the
protocol, but it's not essential.

In the particular case of the ps3 (and all of the dual shock
controllers, IIRC), all of the buttons are pressure sensitive with a
[0..255] range except start, select, home (on pads that have it)
and the stick clicks.  The face buttons, the dpad and all four
shoulder buttons are pressure sensitive.  Whether it's worth exporting
that is another question entirely; I've heard rumour that the ps4
controller removes pressure sensing from a lot of the buttons.

[wl_gamepad::extended]

 My feeling on this would be to wait until we have a use-case for it.  We can
 always bump the version and add an event if it comes up.  I think that's
 better than just assuming we can do something sensible with four generic
 parameters.

This is partly in response to things like the razer Wiimote-like
contraption that apparently spits out piles of quaternions, and also
things like hardcore flightsticks that have things like fixed-range
throttles.  I'm not convinced it's needed either, but I figured if I
was making a proposed protocol it was worth throwing it in for the
sake of discussion.

Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.


Re: Input and games.

2013-04-24 Thread Todd Showalter
On Wed, Apr 24, 2013 at 11:03 AM, Jason Ekstrand ja...@jlekstrand.net wrote:

 I realize that my little Android project shouldn't be the sole driver of
 protocol decisions, but I don't think that is the only case where game
 controller events would come from something that's not evdev.  As another
 example, people have talked about Wayland on FreeBSD; how does FreeBSD
 handle game controllers?  Can we assume that some sort of evdev fd passing
 will work there and on Linux in any sort of reasonable way?

I haven't run FreeBSD for a while, but my memory of how game
controllers were handled is that it was not far removed from just
throwing the USB packets at the client and letting the client figure
it out.  Hopefully it has improved.

The core of my argument here is that there should be a standard
gamepad coming through the event system, much like the standard mouse
does.  The standard gamepad would be:

- left analog stick
- right analog stick
- left analog trigger
- right analog trigger
- dpad
- home button (ps3 ps, xbox glowy x, wii home)
- start button
- left shoulder button
- right shoulder button
- face top button (ps3 triangle, xbox Y)
- face left button (ps3 square, xbox X)
- face right button (ps3 circle, xbox B)
- face bottom button (ps3 x, xbox A)

An actual gamepad could generate more events than this (xbox has a
back button, ps3 has a select button, ps3 also has accelerometers...),
and some stripped down gamepads might not be able to produce all
events (no analog triggers, perhaps, or no home button), but what the
gamepad has that matches the spec should produce the standard events.

As a game I want to be able to say: "oh, a gamepad, ABS_X and
ABS_Y are the stick, BTN_FACE_SOUTH is jump, BTN_FACE_EAST is shoot,
and BTN_START brings up the pause menu".  I don't want to have to
dlopen("libGameTrans.so", RTLD_NOW), then call a function to march the
event list looking for supported gamepads, then call functions to hook
up translation layers and so forth.  We don't make clients do this for
mice, they shouldn't have to for gamepads.

We're into the weeds on evdev a bit because that's what happens to
be producing these events on Linux, but my ultimate concern is what a
client has to do in order to use a gamepad in the Wayland world.  I
would like that process to be as sane and trouble-free as possible,
regardless of what Wayland is sitting on top of.  So, I would prefer
things to Just Work on your project as well.

   Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.


Re: Input and games.

2013-04-24 Thread Todd Showalter
On Wed, Apr 24, 2013 at 5:03 PM, Rick Yorgason r...@firefang.com wrote:

 The core of my argument here is that there should be a standard
 gamepad coming through the event system, much like the standard mouse
 does.  The standard gamepad would be: snip

 For reference, in the Windows XP days joystick input was done with
 DirectInput, which was designed to be as flexible as possible, work with any
 input device, and even went as far as to let you query human-readable names
 for devices and buttons.

 Now they've deprecated DirectInput for the much simpler XInput, which lays
 out the controller like so:

 http://msdn.microsoft.com/en-ca/library/windows/desktop/microsoft.directx_sdk.reference.xinput_gamepad%28v=vs.85%29.aspx

They can get away with that these days because things are so much
more homogeneous; when DirectInput was new, PC game
controllers were all over the place in terms of functionality;
gamepads tended to be modelled on the SNES controller (with no analog
controls) and joysticks were becoming grotesquely baroque collections
of axis values, force feedback systems and buttons as the flight
simulator market gradually specialized itself to death.

These days everyone has pretty much settled down to the same
formula, because it works reasonably and is generally familiar.  There
are things the basic gamepad doesn't cover (racing wheels, proper
flight sim controls, motion devices...), and maybe some day it would
be nice to have standards for those too, but I think right now the
time is more than ripe for gamepads.

 That's a nice set of buttons/axes to use as a standard abstraction, although
 it would be nice if they had built that on top of DirectInput's flexibility.
 Having a sane default configuration is great, but in XInput it comes at the
 cost of not allowing players to customize their controls to support more
 exotic hardware. It would be amazing if Wayland/evdev was designed around
 this middle-ground.

In its current state with evdev, it appears what we get is a
stream of axis and button events, but the index of those events is
arbitrary (ie: the ps3 right stick doesn't use the same axes as the
xbox 360 controller, and the buttons have no overlap at all), and
there's no way to query what the actual axis or button values are
without a priori knowledge.  You need to know that if the device
string is "Microsoft X-Box 360 pad" you need to map the incoming
events according to the template for that device.

I don't mind that for the functionality beyond the basic gamepad
abstraction, but without the basic gamepad abstraction there it makes
something that should be simple a hassle both for the developer and
the end user.  If we can get the core gamepad into evdev and pass the
nonstandard events through at higher index values, I think we get
everything we want out of it, at least on Linux.

 One thing I would expect a joystick abstraction to do that I don't expect a
 mouse abstraction to do is, if I plug two mice into a system I expect them
 both to control the same cursor, but with joysticks I always want to know
 which joystick is sending each message.

Yes, definitely.  Which also leads to the whole device naming
question; ie: if someone unplugs a controller and plugs it back in,
how do you make sure player 2 stays player 2?  But I think as long as
the rules are simple (if one controller disappears and reappears,
assume it's the same player, if multiple controllers disappear and
reappear at the same time, well, pilot error, user gets whatever they
get), it's largely a problem that can be papered over.

 (By the way, I like Todd's north/east/south/west abstraction for the face
 buttons. It's probably also safe to abstract start/back into
 startnext/backselect. XInput notably does not allow access to the home
 button, and even on Linux it would probably be bad form for games to use the
 home button, but a low-level protocol would need to define it so it could be
 used for things like Steam's Big Picture or media centre compositors.)

Our game engine has been running on all sorts of stuff over the
years, so we've seen controls called all sorts of things.  The best
part is the XBox controllers vs. the Nintendo controllers; Nintendo is
(clockwise from the top) xaby, while XBox is ybax.  So, the x and y
buttons are in the swapped and the a and b buttons are swapped.  As a
result, I tend to prefer something more abstract.

Actually, IIRC for a while in the original PlayStation libraries
you had the option of referring to the face buttons as if they were a
second dpad; there was a parallel set of #defines for the button masks
that were RPAD_UP, RPAD_RIGHT and so forth.

   Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.


Re: Input and games.

2013-04-20 Thread Todd Showalter
On Sat, Apr 20, 2013 at 12:20 PM, Daniel danl...@terra.es wrote:

 This is useful for desktop software too. I'm thinking of Stellarium or
 Google Earth, where moving the mouse is expected to move the
 environment, not the pointer itself.

Games is really perhaps shorthand here; there are a lot of tools
and so forth that have similar behavior and operating requirements to
games, but aren't strictly games per se.  If you have an architectural
walkthrough program that lets you navigate a building and make
alterations, that's not really something you'd call a game, but it is
operating under many of the same constraints.  It's more obvious in
things using 3D, but even the 2D side can use it in places.

I could easily see (for example) wanting to be able to do drag &
drop within a window on a canvas larger than the window can display;
say it's something like dia or visio or the like.  I drag an icon from
the sidebar into the canvas, and if it gets to the edge of the canvas
window the canvas scrolls and the dragged object (and the pointer)
parks at the window edge.

It's useful behavior.  I can definitely see why adding it to the
protocol makes things more annoying, but I've a strong suspicion it's
one of those things that if you leave it out you'll find that down the
road there's a lot of pressure to find a way to hack it in.

Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.


Re: Input and games.

2013-04-19 Thread Todd Showalter
: the timestamp of the first
event we got), that's all we really need; if we need to relate it to
the wall clock, we can call gettimeofday() and compare.  If the time
units aren't fixed (ie: if they're just monotonically increasing IDs
that don't actually encode time values and are only useful for
establishing order), the results for games will be unfortunate.

Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.


Input and games.

2013-04-18 Thread Todd Showalter
I'm a game developer, and we're hoping to have our games working
properly with Wayland.  Input is a particular point of interest for
me. The traditional desktop input model is what tends to drive input
interfaces, but games have somewhat unique requirements that at times
mesh badly with the standard desktop model.

Is there a roadmap for input support I can look over?  Is there
anything I can do to help make Wayland game-friendly?

Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.