Re: [Ubuntu-phone] [Design/UX/Apps] Ubuntu behaviour factors

2013-03-18 Thread Michał Sawicz
On 17.03.2013 09:05, Robert Bruce Park wrote:
> On 13-03-16 01:46 AM, Michał Sawicz wrote:
>> Case in point - notifications - default behaviour for the desktop
>> is to have them non-interactive, click-through. That won't be good
>> enough for a phone, where that bubble could take a significant
>> portion of the screen, and there's no way to interact with what's
>> behind them. It should be possible to dismiss them. On a tablet,
> however, this might be dependent on whether there's a pointer
>> device.
> 
> I think this is actually a perfect case *in favor* of my argument,
> though. Consider this:
> 
> The notifications as they currently exist on the desktop become
> transparent when moused over, and pass clicks through them. You can
> take that exact same widget, add a little bit of code that says
> "dismiss myself when a finger swipes over me in a certain direction"
> 
> If you run that widget on a normal phone, the user doesn't have a
> mouse, so they never see the click-through effect because they are
> physically incapable of performing a mouse click -- even though the
> code to pass mouse clicks through is still present. Phone users then
> see the notifications and can dismiss them with a finger swipe.
> 
> On a tablet, you get the same thing -- the notification can be swiped
> away with a finger, or clicked through with a mouse. If the tablet
> user happens to have a bluetooth mouse, they can mouseover the
> notifications to dim them, click through them, or they can put their
> finger on the touchscreen to swipe it away.

Yeah, if we can pull it off like that (and UX design agrees), I'm all for.

> You are probably right that there are going to be corner-cases, but
> the larger point I'm trying to make is that there are probably many
> fewer corner cases than you actually expect. Just write the widgets to
> respond to different input devices in the correct way, and everything
> else will fall into place.

That very well may be - I do hope that there will only be a minimal set
of places where we do differentiate; I just wanted those few not to rely
solely on the abstract / arbitrary notion of a form factor, when in fact
the cause is a lot simpler.

> I'll give an example of something that failed to do this currently --
> if we forget about the Touch image for a second, and remember back to
> when we were running the desktop images on the Nexus tablets, there
> was a hilariously terrible situation in which if you tried to scroll a
> window with touch gestures, what actually ended up happening was that
> Ubuntu would interpret this as a mouse click+drag, and perform a text
> selection rather than a scroll action. This is a case of those widgets
> not being written to understand the difference between what a
> touchscreen is, and what a mouse is, and was trying to map touchscreen
> input onto a set of assumptions about how a mouse is supposed to be
> used. Don't do that. All of the widgets in the SDK should fully
> understand what a touchscreen is, and what a mouse is, and should be
> able to Do What I Mean regardless of what input device I'm actually
> using. That means it knows to select text on a click+drag, and it
> knows to scroll on a touch-swipe. The same widget can accept both of
> those inputs, and then you don't need to write any special-casing code
> to figure out whether you're on a phone or a TV or anything, you
> simply do the right thing in all cases.

+1! You did bring up an important thing here - touch != pointer.

>>> Again, don't do things "only when there's a means". Provide all
>>> input options simultaneously and then the user will simply choose
>>> whichever one is the easiest to use given the input devices that
>>> they have access to. I think this can be done in a very seamless
>>> and transparent way -- eg, a button will look identical whether
>>> you are expecting it to be clicked on, touch-tapped on,
>>> keyboard-navigated to, or TV remote selected... and regardless of
>>> which input device is used to activate the button, the button
>>> activation will be identical anyway.
> 
>> Sure, that probably is one example where you can have that
>> implemented and it simply won't be used. But the notion of focus is
>> limited to text entry fields.
> 
> That's a shame, because there was a day when that wasn't true. Made it
> pretty easy for keyboard-only use of a desktop, as you could simply
> tab around between different widgets and activate different buttons
> and things without having to know all kinds of cryptic shortcuts.

Wait, I never said we won't do keyboard navigation :D. This is still a
must, for accessibility purposes if not anything else.

> Thanks for reading my rant, apologies for the length ;-)

Don't :), that was kind of the point of this thread :)

Thanks,
-- 
Michał Sawicz 
Canonical Services Ltd.




Re: [Ubuntu-phone] [Design/UX/Apps] Ubuntu behaviour factors

2013-03-17 Thread Robert Bruce Park

On 13-03-16 01:46 AM, Michał Sawicz wrote:
> Case in point - notifications - default behaviour for the desktop
> is to have them non-interactive, click-through. That won't be good
> enough for a phone, where that bubble could take a significant
> portion of the screen, and there's no way to interact with what's
> behind them. It should be possible to dismiss them. On a tablet,
> however, this might be dependent on whether there's a pointer
> device.

I think this is actually a perfect case *in favor* of my argument,
though. Consider this:

The notifications as they currently exist on the desktop become
transparent when moused over, and pass clicks through them. You can
take that exact same widget, add a little bit of code that says
"dismiss myself when a finger swipes over me in a certain direction"

If you run that widget on a normal phone, the user doesn't have a
mouse, so they never see the click-through effect because they are
physically incapable of performing a mouse click -- even though the
code to pass mouse clicks through is still present. Phone users then
see the notifications and can dismiss them with a finger swipe.

On a tablet, you get the same thing -- the notification can be swiped
away with a finger, or clicked through with a mouse. If the tablet
user happens to have a bluetooth mouse, they can mouseover the
notifications to dim them, click through them, or they can put their
finger on the touchscreen to swipe it away.
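
To make that concrete, here's a rough Qt Quick sketch of that one widget
(names like "bubble" and "dismissed" are made up, and the real notify-osd
does its click-through at a lower level than QML):

    import QtQuick 2.0

    Rectangle {
        id: bubble
        width: 300; height: 80
        color: "black"
        // only a mouse can hover, so phone users never see the dimming
        opacity: area.containsMouse ? 0.3 : 0.9

        signal dismissed()

        MouseArea {
            id: area
            anchors.fill: parent
            hoverEnabled: true
            propagateComposedEvents: true

            property real pressX: 0
            onPressed: pressX = mouse.x
            onPositionChanged: {
                // a sideways drag -- finger or mouse -- dismisses it
                if (pressed && Math.abs(mouse.x - pressX) > bubble.width / 3)
                    bubble.dismissed()
            }
            // plain clicks are left unaccepted so they can fall through
            // (a real shell needs input shaping for true click-through)
            onClicked: mouse.accepted = false
        }
    }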

You are probably right that there are going to be corner-cases, but
the larger point I'm trying to make is that there are probably many
fewer corner cases than you actually expect. Just write the widgets to
respond to different input devices in the correct way, and everything
else will fall into place.

I'll give an example of something that failed to do this currently --
if we forget about the Touch image for a second, and remember back to
when we were running the desktop images on the Nexus tablets, there
was a hilariously terrible situation in which if you tried to scroll a
window with touch gestures, what actually ended up happening was that
Ubuntu would interpret this as a mouse click+drag, and perform a text
selection rather than a scroll action. This is a case of those widgets
not being written to understand the difference between what a
touchscreen is, and what a mouse is, and was trying to map touchscreen
input onto a set of assumptions about how a mouse is supposed to be
used. Don't do that. All of the widgets in the SDK should fully
understand what a touchscreen is, and what a mouse is, and should be
able to Do What I Mean regardless of what input device I'm actually
using. That means it knows to select text on a click+drag, and it
knows to scroll on a touch-swipe. The same widget can accept both of
those inputs, and then you don't need to write any special-casing code
to figure out whether you're on a phone or a TV or anything, you
simply do the right thing in all cases.

>> Again, don't do things "only when there's a means". Provide all
>> input options simultaneously and then the user will simply choose
>> whichever one is the easiest to use given the input devices that
>> they have access to. I think this can be done in a very seamless
>> and transparent way -- eg, a button will look identical whether
>> you are expecting it to be clicked on, touch-tapped on,
>> keyboard-navigated to, or TV remote selected... and regardless of
>> which input device is used to activate the button, the button
>> activation will be identical anyway.
> 
> Sure, that probably is one example where you can have that
> implemented and it simply won't be used. But the notion of focus is
> limited to text entry fields.

That's a shame, because there was a day when that wasn't true. Made it
pretty easy for keyboard-only use of a desktop, as you could simply
tab around between different widgets and activate different buttons
and things without having to know all kinds of cryptic shortcuts.
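
Rough Qt Quick sketch of what I mean -- two focusable items, Tab hops
between them, Return activates whichever one has focus (the rectangles
and console.log calls are just placeholders):

    import QtQuick 2.0

    Row {
        spacing: 8

        Rectangle {
            id: first
            width: 80; height: 40
            color: activeFocus ? "darkgrey" : "grey"
            focus: true                    // initial focus
            KeyNavigation.tab: second      // Tab moves focus along
            Keys.onReturnPressed: console.log("first activated")
        }
        Rectangle {
            id: second
            width: 80; height: 40
            color: activeFocus ? "darkgrey" : "grey"
            KeyNavigation.tab: first
            Keys.onReturnPressed: console.log("second activated")
        }
    }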

Thanks for reading my rant, apologies for the length ;-)


Re: [Ubuntu-phone] [Design/UX/Apps] Ubuntu behaviour factors

2013-03-16 Thread Gilbert Röhrbein

On 15.03.2013 19:57, Michał Sawicz wrote:

We need to come up with a list of factors based on which we decide which
layout, behaviour, user experience is applied.


Can we brainstorm this somewhere other than a mailing list?
A wiki or an Etherpad?



* what mode was requested (touch, PC, 10', ?)
* what input methods are available (pointer, touch, key, camera?)
* component dimensions
* which stage is the app running in
* device form factor

> ...

We need many more use cases to verify whether that approach would work,
so bring them on!

Maybe there are other factors to take into account? Or maybe you have
ideas about a completely different approach to this?
...
Comments, rants, all feedback welcome :)


I'm thinking of guidelines that make app designers aware of the important 
factors over the lifetime of an app, so that they take them into account. 
And I'm also thinking of a system that lets developers respond to notable 
changes in the factors they care about.


This would be helpful for fine-grained, app-specific responsiveness. 
For example, changing the layout of some buttons depending on whether the 
device is held and used with the left or the right hand. (Very fancy, I 
think :) )
The guidelines would make you aware of this problem and of some common 
solutions. The system would analyse touch events and "device phone-ness" 
and provide a Qt signal and a getter for the corresponding entry in the 
guidelines.
But it would also be nice for more conventional use cases like device 
form factor, I think :)
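
To make that concrete, I imagine something like the sketch below -- the 
"DeviceFactors" singleton and its properties are purely hypothetical, 
nothing like that exists in the SDK today:

    import QtQuick 2.0

    Item {
        // DeviceFactors is hypothetical -- imagine the platform
        // registering such a singleton with properties (the getters)
        // and notify signals for each guideline factor
        Row {
            // buttons flip sides when the device is used left-handed
            layoutDirection: DeviceFactors.dominantHand === "left"
                             ? Qt.RightToLeft : Qt.LeftToRight
            // ... buttons ...
        }

        Connections {
            target: DeviceFactors
            onDominantHandChanged:
                console.log("now held in the",
                            DeviceFactors.dominantHand, "hand")
        }
    }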


More coarse-grained, OS-enforced responsiveness is, I believe, much more 
involved. It needs decisions on what the OS does when running with default 
settings. There are some threads and posts about "making Ubuntu more 
tablet friendly" and the like. Examples would be changing window 
decorations, like the size of window close buttons, or the scrolling 
direction.


As I said, I strongly propose that you distinguish between this 
app-specific responsiveness and OS-enforced responsiveness. I also want 
to propose writing guidelines on the topic of responsiveness and exposing 
these guidelines to developers in the form of signals and getters :)


What do you think of these three points?

Gilbert






Re: [Ubuntu-phone] [Design/UX/Apps] Ubuntu behaviour factors

2013-03-16 Thread Michał Sawicz
On 15.03.2013 23:31, Robert Bruce Park wrote:
> On 13-03-15 11:57 AM, Michał Sawicz wrote:
--8<--
>> * click-through notifications - only when there’s a pointer device
>> - *not* whether it’s a desktop or a phone
> 
> Ok, but I think it's a bad idea to start writing code that looks like
> "if (mouse) allow_click; elif (touch) allow_swipe"
> 
> I think ideally you should just create the widget such that it can
> reasonably sanely interact with both a mouse and a touchscreen at the
> same time (ie, it knows what to do when it gets clicked on, AND it
> knows what to do when a finger swipes over it on the touchscreen). In
> this way, you can simply render the identical widget on all possible
> form factors (convergence!) and then regardless of what input devices
> the user happens to actually have, we end up behaving The Right Way.

Of course - that's the ideal case, and we should strive to achieve that,
but there will be instances where we need to tweak the behaviours,
'cause the interaction is, after all, very much different.

Case in point - notifications - default behaviour for the desktop is to
have them non-interactive, click-through. That won't be good enough for
a phone, where that bubble could take a significant portion of the
screen, and there's no way to interact with what's behind them. It
should be possible to dismiss them. On a tablet, however, this might be
dependent on whether there's a pointer device.

Now we come onto the "why can't I dismiss notifications on my desktop?",
or worse "why can't I dismiss notifications on my tablet after I've
docked it?". I agree those are very difficult questions to answer - and
we should avoid them at all cost - and that's on user experience design
level.

>> * directional navigation - only when there’s means of that
>> navigation - a keyboard or a remote - *not* whether it’s a phone or
>> a tv
> 
> Again, don't do things "only when there's a means". Provide all input
> options simultaneously and then the user will simply choose whichever
> one is the easiest to use given the input devices that they have
> access to. I think this can be done in a very seamless and transparent
> way -- eg, a button will look identical whether you are expecting it
> to be clicked on, touch-tapped on, keyboard-navigated to, or TV remote
> selected... and regardless of which input device is used to activate
> the button, the button activation will be identical anyway.

Sure, that probably is one example where you can have that implemented
and it simply won't be used. But the notion of focus is limited to text
entry fields.

> Another example, any kind of list that you might want to scroll
> through, should at all times accept arrow-key navigation, and also
> touch swipe dragging, and also mouse-wheel scrolling, etc. Whichever
> one the user happens to use will work quite naturally, and the other
> options that the user might not have access to are harmlessly ignored.

Sure, agreed.

> One thing you really have to consider is that there's going to be a
> million different unpredictable combinations here -- maybe somebody
> has a desktop PC, but it's hooked up to a data projector and they have
> an IR receiver so they can use a TV remote with their PC. So maybe
> this user has a keyboard, and a mouse, and a TV remote, but no
> touchscreen. Or maybe the user has a touchscreen laptop, but they also
> have a mouse plugged in. Or maybe they have a tablet with a bluetooth
> keyboard ;-)
> 
> So what I'm trying to say is, you won't possibly be able to predict
> and special-case every combination of input devices. Best to just
> allow all input devices to work at all times, and then the user can
> just use whatever comes naturally to them.

Yes, that is best and should be that whenever possible.

Cheers,
-- 
Michał Sawicz 
Canonical Services Ltd.





Re: [Ubuntu-phone] [Design/UX/Apps] Ubuntu behaviour factors

2013-03-15 Thread Zisu Andrei
>
> Again, don't do things "only when there's a means". Provide all input
> options simultaneously and then the user will simply choose whichever
> one is the easiest to use given the input devices that they have
> access to. I think this can be done in a very seamless and transparent
> way -- eg, a button will look identical whether you are expecting it
> to be clicked on, touch-tapped on, keyboard-navigated to, or TV remote
> selected... and regardless of which input device is used to activate
> the button, the button activation will be identical anyway.
>
> Another example, any kind of list that you might want to scroll
> through, should at all times accept arrow-key navigation, and also
> touch swipe dragging, and also mouse-wheel scrolling, etc. Whichever
> one the user happens to use will work quite naturally, and the other
> options that the user might not have access to are harmlessly ignored.


I am not entirely sure what you're saying has anything to do with this
discussion.

Michał Sawicz:

>  * which stage is the app running in


How would this be a behaviour factor?

Also, I wrote an article a while ago, but I'm not sure how to submit it
for the Unity team to review (I emailed Jono Bacon about it, he hasn't
answered yet - maybe someone could help me?). It is very much about
improving keyboard interaction for non-touch use.

What you are proposing is very welcome, as it is becoming really difficult
to use Unity with a mouse - the buttons are getting bigger and bigger, and
I am not sure why, because touch isn't even enabled on my laptop, and it's
giving me a hard time getting around with the trackpad.


Zisu Andrei


On 15 March 2013 22:31, Robert Bruce Park  wrote:

>
> On 13-03-15 11:57 AM, Michał Sawicz wrote:
> > In the shell we managed to stay away from "if (tablet) this; elif
> > (phone) that" in favour of differentiating based on available
> > space.
>
> Excellent.
>
> > * click-through notifications - only when there’s a pointer device
> > - *not* whether it’s a desktop or a phone
>
> Ok, but I think it's a bad idea to start writing code that looks like
> "if (mouse) allow_click; elif (touch) allow_swipe"
>
> I think ideally you should just create the widget such that it can
> reasonably sanely interact with both a mouse and a touchscreen at the
> same time (ie, it knows what to do when it gets clicked on, AND it
> knows what to do when a finger swipes over it on the touchscreen). In
> this way, you can simply render the identical widget on all possible
> form factors (convergence!) and then regardless of what input devices
> the user happens to actually have, we end up behaving The Right Way.
>
> > * directional navigation - only when there’s means of that
> > navigation - a keyboard or a remote - *not* whether it’s a phone or
> > a tv
>
> Again, don't do things "only when there's a means". Provide all input
> options simultaneously and then the user will simply choose whichever
> one is the easiest to use given the input devices that they have
> access to. I think this can be done in a very seamless and transparent
> way -- eg, a button will look identical whether you are expecting it
> to be clicked on, touch-tapped on, keyboard-navigated to, or TV remote
> selected... and regardless of which input device is used to activate
> the button, the button activation will be identical anyway.
>
> Another example, any kind of list that you might want to scroll
> through, should at all times accept arrow-key navigation, and also
> touch swipe dragging, and also mouse-wheel scrolling, etc. Whichever
> one the user happens to use will work quite naturally, and the other
> options that the user might not have access to are harmlessly ignored.
>
> One thing you really have to consider is that there's going to be a
> million different unpredictable combinations here -- maybe somebody
> has a desktop PC, but it's hooked up to a data projector and they have
> an IR receiver so they can use a TV remote with their PC. So maybe
> this user has a keyboard, and a mouse, and a TV remote, but no
> touchscreen. Or maybe the user has a touchscreen laptop, but they also
> have a mouse plugged in. Or maybe they have a tablet with a bluetooth
> keyboard ;-)
>
> So what I'm trying to say is, you won't possibly be able to predict
> and special-case every combination of input devices. Best to just
> allow all input devices to work at all times, and then the user can
> just use whatever comes naturally to them.
>
> > I hate to see "tablet, phone, desktop, TV" differentiation when
> > we're trying to have a converged platform. We need, all of us, to
> > work with a more holistic approach.
>
> I agree ;-)

Re: [Ubuntu-phone] [Design/UX/Apps] Ubuntu behaviour factors

2013-03-15 Thread Robert Bruce Park

On 13-03-15 11:57 AM, Michał Sawicz wrote:
> In the shell we managed to stay away from "if (tablet) this; elif 
> (phone) that" in favour of differentiating based on available
> space.

Excellent.

> * click-through notifications - only when there’s a pointer device
> - *not* whether it’s a desktop or a phone

Ok, but I think it's a bad idea to start writing code that looks like
"if (mouse) allow_click; elif (touch) allow_swipe"

I think ideally you should just create the widget such that it can
reasonably sanely interact with both a mouse and a touchscreen at the
same time (ie, it knows what to do when it gets clicked on, AND it
knows what to do when a finger swipes over it on the touchscreen). In
this way, you can simply render the identical widget on all possible
form factors (convergence!) and then regardless of what input devices
the user happens to actually have, we end up behaving The Right Way.

> * directional navigation - only when there’s means of that
> navigation - a keyboard or a remote - *not* whether it’s a phone or
> a tv

Again, don't do things "only when there's a means". Provide all input
options simultaneously and then the user will simply choose whichever
one is the easiest to use given the input devices that they have
access to. I think this can be done in a very seamless and transparent
way -- eg, a button will look identical whether you are expecting it
to be clicked on, touch-tapped on, keyboard-navigated to, or TV remote
selected... and regardless of which input device is used to activate
the button, the button activation will be identical anyway.
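
In QML terms, something like this rough sketch (not the SDK's actual
Button, and the "triggered" signal is made up) -- the same signal fires
whether the activation comes from a click, a tap or a key press:

    import QtQuick 2.0

    Rectangle {
        id: button
        width: 120; height: 40
        // a visible focus ring only matters to keyboard/remote users,
        // but it costs nothing to have it all the time
        color: activeFocus ? "darkgrey" : "grey"
        border.width: activeFocus ? 2 : 0

        signal triggered()

        // mouse clicks and touch taps both land here
        MouseArea {
            anchors.fill: parent
            onClicked: button.triggered()
        }

        // keyboard -- or a TV remote mapped to key events -- lands here
        focus: true
        Keys.onReturnPressed: button.triggered()
        Keys.onSpacePressed: button.triggered()
    }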

Another example, any kind of list that you might want to scroll
through, should at all times accept arrow-key navigation, and also
touch swipe dragging, and also mouse-wheel scrolling, etc. Whichever
one the user happens to use will work quite naturally, and the other
options that the user might not have access to are harmlessly ignored.
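
A minimal sketch of that, assuming nothing beyond stock Qt Quick:

    import QtQuick 2.0

    ListView {
        width: 200; height: 300
        focus: true    // gives the view arrow-key navigation
        model: 50
        delegate: Text { text: "Item " + index; height: 30 }
        highlight: Rectangle { width: 200; height: 30; color: "lightgrey" }
        // finger flicks and mouse-wheel scrolling come from the
        // underlying Flickable; inputs the user doesn't have are
        // simply never delivered, so nothing checks the device type
    }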

One thing you really have to consider is that there's going to be a
million different unpredictable combinations here -- maybe somebody
has a desktop PC, but it's hooked up to a data projector and they have
an IR receiver so they can use a TV remote with their PC. So maybe
this user has a keyboard, and a mouse, and a TV remote, but no
touchscreen. Or maybe the user has a touchscreen laptop, but they also
have a mouse plugged in. Or maybe they have a tablet with a bluetooth
keyboard ;-)

So what I'm trying to say is, you won't possibly be able to predict
and special-case every combination of input devices. Best to just
allow all input devices to work at all times, and then the user can
just use whatever comes naturally to them.

> I hate to see "tablet, phone, desktop, TV" differentiation when
> we're trying to have a converged platform. We need, all of us, to
> work with a more holistic approach.

I agree ;-)



Re: [Ubuntu-phone] [Design/UX/Apps] Ubuntu behaviour factors

2013-03-15 Thread shaneguignard
Yes, you answered all my questions. 
And I agree.

Sent on the TELUS Mobility network with BlackBerry

-Original Message-
From: Michał Sawicz 
Date: Fri, 15 Mar 2013 22:28:16 
To: Shane Guignard
Cc: rdvlaunch...@gmail.com; 

Subject: Re: [Ubuntu-phone] [Design/UX/Apps] Ubuntu behaviour factors

On 15.03.2013 22:02, Shane Guignard wrote:
> Monitor the HDMI input? Or user-selectable upon hook-up - go to PC mode
> or remain in handset mode?
> If on a tablet, run the tablet interface with touch.
> 
> How is it that we plan on communicating with a PC? HDMI with
> Bluetooth peripherals or a USB dock?

Not sure I got all of your questions right, but here goes:

Obviously monitoring events for video /outputs/ will be one part of the
story. Whether we'll display a dialog to choose the mode - we'll have to
see, but preferably we'd just go to the one making more sense, or the
one previously selected, if any.

If you're asking about connecting peripherals to the device - it will
depend on the device itself, but all modes should be supported.

Cheers,
-- 
Michał Sawicz 
Canonical Services Ltd.




Re: [Ubuntu-phone] [Design/UX/Apps] Ubuntu behaviour factors

2013-03-15 Thread Michał Sawicz
On 15.03.2013 22:02, Shane Guignard wrote:
> Monitor the HDMI input? Or user-selectable upon hook-up - go to PC mode
> or remain in handset mode?
> If on a tablet, run the tablet interface with touch.
> 
> How is it that we plan on communicating with a PC? HDMI with
> Bluetooth peripherals or a USB dock?

Not sure I got all of your questions right, but here goes:

Obviously monitoring events for video /outputs/ will be one part of the
story. Whether we'll display a dialog to choose the mode - we'll have to
see, but preferably we'd just go to the one making more sense, or the
one previously selected, if any.

If you're asking about connecting peripherals to the device - it will
depend on the device itself, but all modes should be supported.

Cheers,
-- 
Michał Sawicz 
Canonical Services Ltd.





Re: [Ubuntu-phone] [Design/UX/Apps] Ubuntu behaviour factors

2013-03-15 Thread Shane Guignard
Monitor the HDMI input? Or user-selectable upon hook-up - go to PC mode
or remain in handset mode?
If on a tablet, run the tablet interface with touch.

How is it that we plan on communicating with a PC? HDMI with Bluetooth
peripherals or a USB dock?


On Fri, Mar 15, 2013 at 3:18 PM, Michał Sawicz
wrote:

> On 15.03.2013 20:15, rdvlaunch...@gmail.com wrote:
> > Saviq, tvoss & Kaleo,
> >
> > Do not forget that many phones and tablets can output using HDMI to
> > HDTVs. Right now Android chops off the top and bottom portions of their
> 16:10 aspect ratio screens to accommodate an HDTV 16:9 aspect ratio. I
> > suspect that attaching a phone or tablet through HDMI to a HDTV will be
> > supported as it is popular with gamers, presentations and when these
> > devices are used as convenient media players.
>
> I would think
>
> On 13-03-15 02:57 PM, Michał Sawicz wrote:
> --8<--
> > What's more, each display needs to be configured separately (i.e. phone
> > remains in touch mode when connected to TV with 10' interface). Each
> > device will have a default mode associated, and external displays we'll
> > try and associate a default mode
> -->8--
>
> answers that question?
> --
> Michał Sawicz 
> Canonical Services Ltd.
>
>


-- 
Shane Guignard


Re: [Ubuntu-phone] [Design/UX/Apps] Ubuntu behaviour factors

2013-03-15 Thread Michał Sawicz
On 15.03.2013 20:15, rdvlaunch...@gmail.com wrote:
> Saviq, tvoss & Kaleo,
> 
> Do not forget that many phones and tablets can output using HDMI to
> HDTVs. Right now Android chops off the top and bottom portions of their
> 16:10 aspect ratio screens to accommodate an HDTV 16:9 aspect ratio. I
> suspect that attaching a phone or tablet through HDMI to a HDTV will be
> supported as it is popular with gamers, presentations and when these
> devices are used as convenient media players.

I would think

On 13-03-15 02:57 PM, Michał Sawicz wrote:
--8<--
> What's more, each display needs to be configured separately (i.e. phone
> remains in touch mode when connected to TV with 10' interface). Each
> device will have a default mode associated, and external displays we'll
> try and associate a default mode
-->8--

answers that question?
-- 
Michał Sawicz 
Canonical Services Ltd.





Re: [Ubuntu-phone] [Design/UX/Apps] Ubuntu behaviour factors

2013-03-15 Thread rdvlaunch...@gmail.com

On 13-03-15 02:57 PM, Michał Sawicz wrote:

Hey all,

During the phone and tablet development we have several times had to
tackle the "it's supposed to look/behave/be like that on the phone, and
like this on the tablet" problem.

In the shell we managed to stay away from "if (tablet) this; elif
(phone) that" in favour of differentiating based on available space.

Since then an argument was made that that's not good enough - we need to
be able to have the 10' interface on a tablet, or the desktop interface
on a TV - regardless of screen size.

But whether the UI looks/behaves in a certain way can’t only be
determined by the single factor of what interaction mode was selected.
What’s more, each display needs to be configured separately (i.e. phone
remains in touch mode when connected to TV with 10’ interface). Each
device will have a default mode associated with it, and for external
displays we'll try to associate a default mode based on some properties of
the display itself, and then potentially remember the setup for the future.

We need to come up with a list of factors based on which we decide which
layout, behaviour, user experience is applied.

The list of questions that come to mind (in no particular order):

* what mode was requested (touch, PC, 10', ?)
* what input methods are available (pointer, touch, key, camera?)
* component dimensions
* which stage is the app running in
* device form factor

That seems like a lot, I know, but most of the time you would only
decide based on a subset of those - it very much depends on the case at
hand.

Some examples:
* search entry in dash - it moves “up” over the header title on the
phone, because it wouldn’t fit otherwise - real-estate is enough to
decide on the behaviour here
* click-through notifications - only when there’s a pointer device -
*not* whether it’s a desktop or a phone
* directional navigation - only when there’s means of that navigation -
a keyboard or a remote - *not* whether it’s a phone or a tv

We need many more use cases to verify whether that approach would work,
so bring them on!

Maybe there are other factors to take into account? Or maybe you have
ideas about a completely different approach to this?

I hate to see "tablet, phone, desktop, TV" differentiation when we're
trying to have a converged platform. We need, all of us, to work with a
more holistic approach.

Comments, rants, all feedback welcome :)

P.S.
Responsive Web Design seems very much related
http://en.wikipedia.org/wiki/Responsive_web_design

Cheers,
--
Saviq, tvoss & Kaleo





Saviq, tvoss & Kaleo,

Do not forget that many phones and tablets can output using HDMI to HDTVs. 
Right now Android chops off the top and bottom portions of their 16:10 aspect 
ratio screens to accommodate an HDTV 16:9 aspect ratio. I suspect that 
attaching a phone or tablet through HDMI to a HDTV will be supported as it is 
popular with gamers, presentations and when these devices are used as 
convenient media players.




[Ubuntu-phone] [Design/UX/Apps] Ubuntu behaviour factors

2013-03-15 Thread Michał Sawicz
Hey all,

During the phone and tablet development we have several times had to
tackle the "it's supposed to look/behave/be like that on the phone, and
like this on the tablet" problem.

In the shell we managed to stay away from "if (tablet) this; elif
(phone) that" in favour of differentiating based on available space.

Since then an argument was made that that's not good enough - we need to
be able to have the 10' interface on a tablet, or the desktop interface
on a TV - regardless of screen size.

But whether the UI looks/behaves in a certain way can’t only be
determined by the single factor of what interaction mode was selected.
What’s more, each display needs to be configured separately (i.e. phone
remains in touch mode when connected to TV with 10’ interface). Each
device will have a default mode associated with it, and for external
displays we'll try to associate a default mode based on some properties of
the display itself, and then potentially remember the setup for the future.

We need to come up with a list of factors based on which we decide which
layout, behaviour, user experience is applied.

The list of questions that come to mind (in no particular order):

* what mode was requested (touch, PC, 10', ?)
* what input methods are available (pointer, touch, key, camera?)
* component dimensions
* which stage is the app running in
* device form factor

That seems like a lot, I know, but most of the time you would only
decide based on a subset of those - it very much depends on the case at
hand.

Some examples:
* search entry in dash - it moves “up” over the header title on the
phone, because it wouldn’t fit otherwise - real-estate is enough to
decide on the behaviour here
* click-through notifications - only when there’s a pointer device -
*not* whether it’s a desktop or a phone
* directional navigation - only when there’s means of that navigation -
a keyboard or a remote - *not* whether it’s a phone or a tv

We need many more use cases to verify whether that approach would work,
so bring them on!

Maybe there are other factors to take into account? Or maybe you have
ideas about a completely different approach to this?

I hate to see "tablet, phone, desktop, TV" differentiation when we're
trying to have a converged platform. We need, all of us, to work with a
more holistic approach.

Comments, rants, all feedback welcome :)

P.S.
Responsive Web Design seems very much related
http://en.wikipedia.org/wiki/Responsive_web_design

Cheers,
--
Saviq, tvoss & Kaleo


