On 21/09/13 11:05, Nick Sabalausky wrote:
So? Does everything have to be targeted at new/casual users? Can't
experienced users have stuff that's made for them? Who ever said
command lines are still intended for everybody? Keep in mind, a
programmer is NOT a casual or new user. But in any case, please don't
mistake "Windows vs Linux" as a "one size fits all" topic, because you
seem to be steering things that way.

There's a difference between difficulty that is inherent, versus difficulty that is unnecessary and arises out of a lack of concern for usability.

Or, in the case being discussed here, more likely it arises out of historical priorities that apply much less today. I would imagine that back in the early days of UNIX, processing key-presses was much more expensive than it is today, and there was thus a strong functional benefit in minimizing the amount of typing. (That still applies to an extent today if you're typing commands over a slow ssh connection, for example.)

If we were designing command-line scripting from scratch, today, we'd do something very different and it would definitely be much more user-friendly, and no one would lose from that -- both experts and novices would benefit.

Rant: Seems to be a big trend in computing these days. Everything is all
about catering to Average Joe Numbskull and Little Miss Facebook, and to
hell with anyone who has more advanced experience and needs, for whom
"usable by anyone's grandmother" is the least of their concerns.

Average Joes need their tools, sure, but so do the rest of us.

Speaking as a hopefully non-average Joe ... making things usable by anyone's grandmother doesn't necessarily have to come at the cost of making things less good for experts. Well-done usability design makes life easier for experts as well as for novices.

The problem is that because experts are as good as they are, they are much more capable of dealing with unnecessary complexity. And, having mastered unnecessary complexity, it's then that bit more difficult to invest the time to learn a new, simpler way of doing things, because it means investing time to re-learn how to do stuff _you can already do_, and that learning curve means you'll go through a period of being less capable (because you're learning) than you are with your existing toolkit. And then of course there's all the legacy stuff that does things using the old tools and which you know you'll have to keep using, so that's another reason to stick with what you know ... and thus lock-in happens.

Case in point: C++ vs. D. Is anyone here going to claim that, _as a language_, D is not significantly more user-friendly than C++? Yet it's no less powerful -- in fact, the enhanced user-friendliness frees up experts to do more things better.
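To make that concrete, here's a minimal sketch of my own (purely illustrative, nothing canonical) of the kind of thing D's standard library makes a readable pipeline, where classic C++ would need hand-rolled loops or a fair amount of iterator ceremony:

    // Illustrative example (my own): sum the squares of the even
    // numbers in 1..10 using std.algorithm/std.range and UFCS chaining.
    import std.algorithm : filter, map, sum;
    import std.range : iota;

    void main()
    {
        auto result = iota(1, 11)        // the numbers 1 through 10
            .filter!(n => n % 2 == 0)    // keep only the evens
            .map!(n => n * n)            // square each one
            .sum;                        // 4 + 16 + 36 + 64 + 100
        assert(result == 220);
    }

And none of that costs the expert anything -- you can still drop to raw loops and pointers when you need them -- which is exactly the point: the friendliness and the power aren't in competition.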

You do realize that in the time you've spent taking a friendly OS
discussion and single-handedly trying[1] to turn it into yet another
ill-informed OS flamewar (congratulations, btw) you could have already
learned quite a bit about using a unix command line?

[1] Don't deny it. Your intent to bait was obvious a few posts back, but
due to your good standing here I've been giving you a chance.

For what it's worth, I think you may have missed the humour of Manu's posts. Over-the-top criticism, in the right tone of voice, can be a pretty good way to get people who are used to a particular way of doing things to re-evaluate their assumptions. It's important to occasionally engage in mockery and caricature of the things that we value, because it helps to re-engage our critical faculties about what is good and bad.

Specifically in this case: the user-friendliness of GNU/Linux distros has come a _huge_ way in the last 10 years, but there's no reason why they shouldn't be every bit as surface-friendly as (maybe even more so than) the popular commercial OSes, while retaining all the power that experts need and want. It's a terrible shame that more attention is not given to this surface-friendliness, and it's striking how resistant many old-school free software people are to usability-oriented improvements _that don't necessarily constrain them_.

** Example 1 **
I was a longstanding KDE user until, with the 12.04 release of Ubuntu, I switched over to using Unity. I found it much more usable and effective in all sorts of ways, but initially I was frustrated because there were superficially fewer config options available. It was striking how quickly I realized _I didn't miss them_ and that most of that configurability I'd had with KDE was a distraction rather than something that assisted me. As someone wrote round about that time, there's a tendency for customisability to be an excuse for lack of design.

** Example 2 **
The first GNU/Linux distro I ever installed on my own machine was Ubuntu 5.10, which I decided I should try out after seeing a video of Mark Shuttleworth's talk at DebConf 2005. Coming from Windows there was a fairly steep learning curve, but what buggered it for me was that my wireless card wasn't supported, and getting it working involved a complicated procedure of editing various config files to get the system to use a proprietary Windows driver. I just couldn't get it to work.

Then, I tried OpenSUSE 10.0, which had YaST -- a GUI config tool which could handle all the complicated under-the-bonnet stuff I needed: it just needed to be pointed at the proprietary driver and it would sort out the rest itself.

So, SUSE was what kept me using Linux, and after a period of finding all the ways to shoot myself in the foot (e.g. installing RPMs from 3rd-party repos and seeing how they'd break all my system dependencies...), and getting used to Linux-y ways of doing stuff rather than Windows-y, I got to the point where I switched back over to Ubuntu, was comfortable doing the command-line config for the wireless driver, and have never really looked back.

The point is, without a distro that catered to novices who had no clue how to use the command line, I'd have been sunk. And as it is, I've been able to get to the point of becoming much more capable, thanks to there being a tool available that let me initially bypass the depth of complexity of the system.

** Example 3 **
... when colleagues first tried to get me to install Linux on my system, way back in 2001. Much laughter in the office when I suggested that I thought Windows was a more effective, better-made OS. By coincidence, the same day, we had a visitor coming to give a presentation, which she had on a 1.44 MB floppy disk (I know, I know, this sounds like an archaeological dig...). Cue much amusement on my part as all the guys in the office tried to remember the command-line instructions to mount a floppy on Red Hat. Today, of course, it's a given that if you insert a disk, your Linux distro should auto-detect it and work out how to mount it, but back then, this kind of usability issue wasn't really considered important -- even though auto-detection is just as beneficial to the expert as to the novice.

I had a video card driver problem the other day. The bundled
auto-update app failed, and totally broke my computer.
I had to download kernel source, and run some scripts to compile
some sort of shim that made the video driver compatible with my
kernel to get it working again... absolutely astounding.

Uh... you do realize that this is because Linux actually *lets* you
fix things? If something like this happened on Windows, the only
real solution is to nuke the system from orbit and start from
ground zero again (i.e. reinstall). One can hardly expect that
repairing a broken car engine should require no thought.


Nothing like that has EVER happened to me in a few decades of Windows.
In my experience as a Linux user, these sorts of problems are a daily
chore.

I've had stuff like that happen on Windows. Not on my own system within
the last few years, but over "a few decades"? Oh hell yea.

OTOH, I don't think I've had such trouble with Linux in at least as
long. I think 2002 was probably the last time.

It's worth remembering that the ability to go under the bonnet and fix things, while in principle it's available to everyone, is not an advantage that's perceptible to many users. To a great many users, _any_ breakage that can't be fixed through the regular OS GUI is a "take it to the experts or else just reinstall it" show-stopper.

So, if the _typical_ problems of your OS require under-the-bonnet maintenance, then this is a usability problem that it's worth trying to address.

Speaking of which, I managed to totally break my computer last night /
this morning too.


No shit. Should I be surprised? ;)

[...]

but the hardy little thing just kept going. It was
causing subtle breakages like my printer mysteriously failing to
work, and when I finally figured out the problem, I downloaded a
new kernel and recompiled it.

... speechless ;)

[...]

I rest my case.

Ok, now I know you're just trying to troll. But I've never seen you
troll before so you should know better.

He made it perfectly clear he had been messing around with his own
internals. *Plus* you know perfectly well messing around with Windows
internals can also lead to problems requiring expert-skill recovery
techniques, so really, you *know* that you know better, so cut the
shit.

Yes, Linux sucks. And guess what? So does Windows. I use both, by
choice. End of story.

I think it's generally true that (these days) most of the under-the-bonnet maintenance I have to do on GNU/Linux is down to the fact that I ask more of the system than I do of Windows, and I do more risky stuff that requires me to take on more responsibility. (In fact, I hardly use Windows at all these days, and I ask my Ubuntu setup to do all sorts of things I'd never have dreamed of doing back when Windows was my main OS.)

On the other hand, I think it's daft not to recognize the fact that Windows is in many ways better at helping the user avoid having to go beneath the surface to fix problems, and that surface-level friendliness makes a big difference in how easy it is to use in practice. It solves _more users' problems more of the time_.

This ought to be somewhere GNU/Linux can clean up, because it ought to be possible to have that surface friendliness while also being _easier_ to go under the hood. (Though as I discovered as a novice Linux user, that ease of going under the hood can also be a great way to screw things up for yourself. It's a bit like how giving someone a handful of basic martial arts moves can be a great way to get them beaten up...)

I think the main difference is quality-assurance. Windows software is
more likely to be released only after it's reasonably proven that it
works.

Like Debian.

Debian's QA is different to that of Windows. Debian tests to ensure that the software is bug-free -- they don't as a rule consider usability challenges to be bugs, and they will sometimes favour inferior technical solutions for non-technical reasons (e.g. driver licensing).

I actually think that they're right to have that strong free software focus, and that in the long term it also results in better software, but on a short-term basis it does result in more user-facing problems.

Whereas, whatever extensive criticisms one can make of Microsoft Windows, one thing that has to be acknowledged is that they have a very strong focus on minimizing user-facing problems or making it trivial to deal with them.

Bottom line: we all know that GNU/Linux is fundamentally a better OS than Windows. We all know that many of the claims about user-friendliness are FUD, and that many of the real problems arise out of Windows lock-in in the computing space (driver support being the most obvious). But there are certain usability issues that are particularly damaging for non-technical users, which do arise much more regularly on GNU/Linux than on Windows systems, and we shouldn't deny this or claim that it's tolerable. It's a fair criticism that there are not enough actors in the GNU/Linux world focusing on design for usability, and this ought to change.

Best wishes,

    -- Joe
