This was a fascinating discussion for me -- thank you -- I listened to
it twice. I regret however that no transcript is available. I know how
much work that is -- I once interviewed someone in French for over 30
minutes, then transcribed it, then translated it -- and group
interviews are even harder -- but there's no better way to keep such
an important discussion findable or memorable. As well, if I may say
so, the Ogg audio could benefit from metadata (cf. my previous post)
with the participants' names and the licence, for example. Here's how
I did it last time with vorbis-tools v1.1.1:

./oggenc --downmix -q 2 \
  --title='Sean Daly Interviews Ashley Highfield, BBC Director of Future Media and Technology, for Groklaw' \
  --artist='Ashley Highfield' \
  --date='November 14, 2007' \
  --genre='Speech' \
  --comment 'copyright=(c)2007 Pamela Jones.' \
  --comment 'location=telephone interview' \
  --comment 'organization=Groklaw (http://www.groklaw.net)' \
  --comment 'license=Creative Commons Attribution-NonCommercial 2.0 (see http://creativecommons.org/licenses/by-nc/2.0/)' \
  /Users/SD/Groklaw/AshleyHighfield/1002.AshleyHighfield_BBC.optimized.stereo.wav \
  -o /Users/SD/Groklaw/AshleyHighfield/AshleyHighfield_BBC.vorbis.ogg

This metadata is visible in players, and can be extracted or updated
with vorbiscomment, which of course can be called in scripts to import
external text into the Ogg container. One could advance the argument
that this is useless, since neither Microsoft nor Google nor Apple
care about Ogg metadata. But I believe that when the container allows
metadata, it should certainly be used; the challenge in the years to
come with audiovisual material will be how to find files (and how to
seek passages inside a file).
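For instance, here is a minimal sketch of that round trip with
vorbiscomment; the file name interview.ogg and the tag values are just
placeholders:

```shell
#!/bin/sh
# Sketch: importing external text into an Ogg container with vorbiscomment
# (vorbis-tools). interview.ogg and the tag values below are placeholders;
# tag names follow the usual Vorbis comment convention.

# External text, one NAME=value pair per line:
cat > comments.txt <<'EOF'
ORGANIZATION=Groklaw (http://www.groklaw.net)
LOCATION=telephone interview
EOF

# If vorbis-tools is installed, write the tags into the container
# and list them back out:
if command -v vorbiscomment >/dev/null 2>&1; then
    vorbiscomment -w -c comments.txt interview.ogg   # -w replaces all tags
    vorbiscomment -l interview.ogg                   # -l lists the tags
fi
```

The same -c/-l pair makes it easy to shuttle metadata between the Ogg
file and whatever external format a script can produce.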

My point of view concurs with the guests': accessibility is not
top-of-mind right now, either with developers or with clients -- who
certainly have a central role to play in insisting upon it. I agree
with the assessment that there has been regression; I think the two
major factors for this have been the dominance of MSIE which has
encouraged developer laziness in terms of standards implementation
(and perhaps vendor laziness for screen readers &c), and the rise of
Flash, which has really been catastrophic for accessibility. Perhaps
Adobe has realized what a sorry state it has left the web in; I think
they are making helpful progress with the XML-based XMP initiative.

I think it would be helpful in this context to insist that content
tagging -- metadata -- must be easily extractable; after all, we can't
know in advance how content will be repurposed. I populate Ogg
metadata not because today's search tools can handle it, but because
I have faith tomorrow's will. Think about those zillions of U-Matic
and beta cassettes in the vaults: the cassettes have text labels with
metadata, and the video itself starts with a card full of metadata.
Using these two metadata carriers is simply common sense; the label
aids in finding a cassette without viewing every film, and the
embedded metadata aids in identifying the video when the original
medium is unavailable.

The only way I think to succeed at accessibility is to test, test, test.
An easy way for a non-disabled person to check accessibility of a
website is to use the links or lynx browsers. Years ago, tables
blocked these browsers and indeed were criticized for poor
accessibility. Today, Flash is the guilty party. Testing doesn't have
to be ad hoc focus groups (although such can be useful); there are
surely communities of disabled users who would be happy to contribute
feedback from early on in development (assuming of course modern
non-monolithic development methods). But anyone who has had a major
site in production knows that problems crop up all the time and
ongoing testing is the way to catch problems quickly. Many basic tests
can be automated.
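As a sketch of what one such automated text-only check might look like
-- the URL and the required phrase are hypothetical, and it assumes
lynx is installed, skipping quietly otherwise:

```shell
#!/bin/sh
# Sketch: automated text-only smoke test. Dump a page as lynx renders
# it and check that essential content survives without images, CSS, or
# Flash. URL and REQUIRED are hypothetical values for illustration.
URL="http://www.example.org/"
REQUIRED="Contact us"

if command -v lynx >/dev/null 2>&1; then
    if lynx -dump -nolist "$URL" | grep -q "$REQUIRED"; then
        echo "PASS: '$REQUIRED' is reachable in a text-only rendering"
    else
        echo "FAIL: '$REQUIRED' is invisible to text-only browsers"
    fi
else
    echo "SKIP: lynx not installed"
fi
```

A handful of checks like this, run after every deployment, will catch
the most obvious regressions long before a focus group would.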

In the USA, there is an incentive to provide accessibility, Section
508 ( http://www.section508.gov/ ) which mandates accessibility for
people with disabilities, although its effectiveness is questioned.
Shortly after the Hurricane Katrina disaster, the government
(mis)managed to put up an MSIE-only site for victim relief and there
were reported cases of victims who could not apply for emergency
benefits.

Some GNU/Linux users, when they are shut out, have drawn an analogy
with the disabled. This is not so far off the mark, because sites
inaccessible to the disabled, or to *nix users, or Mac users, or
cellphone or GPS gadget users, are the direct result of coding to a
target platform instead of coding to standards.

There is a movement to separate content from presentation (CSS, content
management systems, ...) and that's without question the right
direction, but I take the view that the imperfection of today's
standards is not a major problem, as long as tools exist to get
metadata in and out or to associate it. I think the real enemy of
accessibility is monolithic development which shows its inflexibility
at the first turn. When data transformations are seen as a flow, with
chainlinks, with branches, it's easier to substitute better links in
the chain, or to add more later as we go along, as new
information comes in. Again, my embedded Ogg metadata is of limited
use today, but the chainlink tools already exist to extract the
metadata and format it in XML or vice versa.
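To illustrate, here is a minimal sketch of one such chainlink; the
NAME=value lines below stand in for the output of vorbiscomment -l,
and the element names simply mirror the tag names:

```shell
#!/bin/sh
# Sketch: convert Vorbis comment lines (NAME=value, as emitted by
# `vorbiscomment -l file.ogg`) into a trivial XML fragment.
# The sample lines below stand in for real vorbiscomment output.
cat > comments.txt <<'EOF'
ARTIST=Ashley Highfield
GENRE=Speech
EOF

{
    echo '<oggcomments>'
    # Split each line at the first '=' into an element name and content:
    while IFS='=' read -r name value; do
        printf '  <%s>%s</%s>\n' "$name" "$value" "$name"
    done < comments.txt
    echo '</oggcomments>'
} > comments.xml

cat comments.xml
```

Swapping this link for a better one later -- say, one that emits RDF
or escapes reserved characters properly -- costs nothing, which is
exactly the flexibility monolithic development lacks.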

Sean
-
Sent via the backstage.bbc.co.uk discussion group.  To unsubscribe, please 
visit http://backstage.bbc.co.uk/archives/2005/01/mailing_list.html.  
Unofficial list archive: http://www.mail-archive.com/[email protected]/
