[MCN-L] IP SIG: Op-Ed on Israeli Fair Use Provision

1970-01-16 Thread Amalyah Keshet
MCN conference regulars will remember speaker Jonathan Band.  Below, a nice 
piece he wrote on Fair Use.

Amalyah Keshet
Chair, MCN IP SIG   www.mcn.edu
Blog  www.musematic.net 


-- 

Below is a link to an op-ed piece I wrote for the Jerusalem Post in support of 
the fair use provision in the new Israeli Copyright Law.



http://www.jpost.com/servlet/Satellite?cid=1206446110027&pagename=JPost%2FJPArticle%2FShowFull




Jonathan Band PLLC
policybandwidth
21 Dupont Circle NW
8th Floor
Washington, D.C.  20036
voice: 202-296-5675
  fax: 202-872-0884
email: jband at policybandwidth.com
  web: www.policybandwidth.com




[MCN-L] Fw: Software for testing ISPs

1970-01-16 Thread Amalyah Keshet
From Fred von Lohmann at the Electronic Frontier Foundation http://www.eff.org/

http://www.eff.org/deeplinks/2008/03/keeping-isps-honest

- Original Message - 

Software for Keeping ISPs Honest
Posted by Peter Eckersley

Yesterday's announcement of a détente between Comcast and BitTorrent
was great news. Unfortunately, the general problem of ISPs doing
strange things to Internet traffic without telling their customers is
likely to continue in the future. EFF and many other organizations are
working on software to test ISPs for unusual (mis)behavior. In this
detailed post, we have a round-up of the tools that are out there
right now, and others that are in development...

The Backstory

When you sign up for an Internet connection, you expect it to actually
be an Internet connection. You expect that you can run whatever
applications and protocols you choose over the link, or indeed that
you can write your own software and run that.

There is a disturbing trend, however, of ISPs stepping in to meddle
with your communications, deciding that some applications and
protocols are more suitable than others. Or deciding that they can
inject advertisements into your queries for domain names, or your
browser's exchanges with web sites. Or deciding that encrypted traffic
should be throttled across the board.

Whatever you may think about the merits of these practices, we think
it's obvious that consumers have a right to know what they're paying
for. Only then can they exert pressure on an ISP to change its ways, or
vote with their wallets and take their business elsewhere. As we
argued in a recent submission to the FCC, ISPs should (at a minimum)
disclose the nature of their network management practices.

But disclosure will never be enough. Internet users need to be able to
test networks themselves to make sure that packets and web pages
arrive as they were sent, to make sure that DNS queries are correctly
answered, and that ISPs comply with the Internet's standards. That
needs to happen around the whole planet.
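
As a very small illustration of the kind of check an end user can
already run today (a sketch only, not one of the tools surveyed
below), the following Python snippet looks for one common form of DNS
meddling: resolvers that answer queries for nonexistent names with the
address of an ad server instead of an error.

import random
import socket
import string

# Sketch: detect NXDOMAIN rewriting. Resolve a random name that should
# not exist; a well-behaved resolver reports a lookup failure, while a
# resolver that monetizes typos returns a real address. The test domain
# below is made up for illustration, not a real test service.
label = "".join(random.choices(string.ascii_lowercase, k=16))
name = label + ".nxdomain-test-" + label + ".example.com"
try:
    addr = socket.gethostbyname(name)
except socket.gaierror:
    print("OK: the resolver reported", name, "as nonexistent")
else:
    print("Suspicious:", name, "resolved to", addr,
          "- the resolver may be rewriting NXDOMAIN answers")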

There are lots of approaches to ISP testing

Before we start talking about all the tools that are popping up for
ISP testing, it's worth noting that there are a lot of different ways
to test a network, with many different pros and cons. For instance,
the software may:

• Actively send synthetic, pre-determined test traffic, or
passively observe the way the network treats natural traffic;
• If the traffic is synthetic, the testing software may try to cope
with the complex variation in operating systems and network
environments, or try to simplify things by creating or insisting on a
known test environment;
• Passive testing systems may focus on one or a small number of
protocols, or they may try to test for interference in any protocol
that is present;
• The software may (1) be unilateral, just trying to detect
interference or delay by examining what's happening on a single
computer's network connection, or (2) be multi-party, synchronizing
and comparing records from computers that are talking to each other,
or (3) be in between, only having authoritative records from one end
but possessing special knowledge about how the other end will behave;
• Non-unilateral testing systems may rely on a central server, or
they may just try to coordinate records in a peer-to-peer fashion;
• Software may operate at the packet level, measuring integrity,
latency and reliability on a per-packet basis, or it may operate at a
higher level, confirming (for example) that web pages arrive intact or
that a link is running at a certain speed, without worrying about any
of the individual packets.

It's a good idea for the Internet community to be pursuing most of
these different possibilities, because they're all useful in different
situations, and we don't yet know which techniques will prove to be
the most important.
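
To make the first of those distinctions concrete, here is a rough
sketch (not one of the real tools described below) of an active,
synthetic-traffic test in Python: a cooperating server that you control
echoes back a SHA-256 digest of whatever it received, and the client
checks that the digest matches the random payload it sent, so any
in-path modification of the application data shows up as a mismatch.
The host name and port are placeholders.

import hashlib
import os
import socket
import sys

PORT = 9999                 # arbitrary example port on the test server
PAYLOAD_SIZE = 64 * 1024    # 64 KB of random test data

def run_server():
    # Accept one connection, read the payload, reply with its digest.
    # (socket.create_server needs Python 3.8 or later.)
    with socket.create_server(("", PORT)) as srv:
        conn, _addr = srv.accept()
        with conn:
            data = b""
            while True:
                chunk = conn.recv(65536)
                if not chunk:
                    break
                data += chunk
            conn.sendall(hashlib.sha256(data).digest())

def run_client(host):
    payload = os.urandom(PAYLOAD_SIZE)
    with socket.create_connection((host, PORT)) as conn:
        conn.sendall(payload)
        conn.shutdown(socket.SHUT_WR)   # signal that we are done sending
        remote_digest = conn.recv(32)
    if remote_digest == hashlib.sha256(payload).digest():
        print("Payload arrived intact")
    else:
        print("Payload was altered, truncated, or never arrived")

if __name__ == "__main__":
    if sys.argv[1:] == ["server"]:
        run_server()
    else:
        run_client(sys.argv[1])   # e.g. python thisscript.py test.example.org

In the terms above, this is active, synthetic, multi-party, and
higher-level rather than per-packet; it says nothing about which packet
was touched, only that something was.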

Existing and soon-to-be-released tools and data

Last year, EFF released a simple utility called pcapdiff (many thanks
to the people who've sent us patches and bug reports; we'll be
releasing version 0.2 shortly). EFF is also working on a much more
elaborate tool for testing ISPs, which will be called Switzerland.
More on that below.
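
The core idea behind pcapdiff is straightforward: both endpoints
capture the conversation (for example with tcpdump), and the two
capture files are then compared to find packets that one side sent but
the other never received, or that arrived modified or injected. Below
is a rough, unofficial sketch of that comparison in Python using scapy;
the real tool is considerably more careful (it matches packets by IP ID
and tolerates reordering, among other things), and the capture file
names are placeholders.

# Sketch only; requires scapy (pip install scapy).
from scapy.all import IP, Raw, rdpcap

def fingerprints(pcap_path):
    # Crude per-packet fingerprint: source, destination and raw payload.
    prints = set()
    for pkt in rdpcap(pcap_path):
        if IP in pkt and Raw in pkt:
            prints.add((pkt[IP].src, pkt[IP].dst, bytes(pkt[Raw].load)))
    return prints

sent = fingerprints("sender_capture.pcap")        # captured at the sender
received = fingerprints("receiver_capture.pcap")  # captured at the receiver

dropped = sent - received     # sent, but never seen by the receiver
injected = received - sent    # seen by the receiver, but never sent

print(len(dropped), "packets dropped or modified in transit")
print(len(injected), "packets injected or modified in transit")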

An Italian group has developed an ISO CD image that can be used to
test an Internet connection (the CD uses pcapdiff, too!). It's called
The Gemini Project. Deploying a whole temporary operating system on a
CD is a great example of the "simplify the test environment" approach
we described above.

Vuze, the company formed around the Azureus BitTorrent client, has
released a plugin that counts the number of RST packets sent to your
BT client. These statistics are interesting, but remember that there
are legitimate RST packets, and the presence of TCP RSTs isn't
evidence that they were spoofed by an intermediary.
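
For readers curious what such a counter amounts to, here is a minimal
sketch in the same spirit (an illustration only, not the Vuze plugin
itself): it passively watches one port for sixty seconds and tallies
TCP RSTs. Port 6881 is just an example BitTorrent port, and, as noted
above, a raw count is only a hint, since many RSTs are perfectly
legitimate.

# Sketch only; requires scapy (pip install scapy) and usually root privileges.
from scapy.all import TCP, sniff

RST_FLAG = 0x04
rst_count = 0

def note_rst(pkt):
    global rst_count
    if TCP in pkt and pkt[TCP].flags & RST_FLAG:
        rst_count += 1
        print("RST #%d: %s" % (rst_count, pkt.summary()))

# Watch traffic to/from the example port for 60 seconds, then report.
sniff(filter="tcp port 6881", prn=note_rst, store=False, timeout=60)
print("Saw %d TCP RST packets in 60 seconds" % rst_count)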

The problem of ISPs making arrangements with advertisers to inject
extra ads and tracking mechanisms into web pages prompted researchers
at the University of Washington and 

[MCN-L] ISKO UK event: Recording the Living World - London, 30 March 2010

1970-01-16 Thread Aida Slavic
= Apologies for cross-posting =

You are cordially invited to the following ISKO UK afternoon event:

RECORDING THE LIVING WORLD

30 March 2010 - 15.30-19.00

VENUE: University College London, Roberts Building, Torrington Place, 
WC1E 7JE

FEE: £5 (ISKO members and students FREE)

Creepy-crawlies, mammoths and lotus blossom, all have a place at the 
Natural History Museum. Today the data painstakingly collected over 
centuries by naturalists and other scientists are being liberated from 
their institutional archives, made available in rejuvenated catalogues 
and published on the Web.

Diane Tough will talk about recent developments in the methods of 
collection description at the Museum, whose library is one of the 
foremost resources for researchers in molecular biology, biodiversity, 
systematics, taxonomy, and the history of science, and consists of over 
one million books and half a million artworks.

Graham Higley will tell us about the Encyclopaedia of Life - an 
ambitious project that aims to 
build an online resource in which every species on earth will have its 
own web page. This international enterprise consists of five major 
projects: the Species Pages Group, the Biodiversity Informatics Group, 
the Scanning and Digitization Group, the Learning and Education Group, 
and the Biodiversity Synthesis Group. Together they are creating an 
unparalleled resource for the life sciences that covers every aspect of 
the study, research, recording and documentation of living creatures.

This ISKO UK Seminar is organized in cooperation with the UCL Department 
for Information Studies.

To read more and to book your place go to the event page 
http://www.iskouk.org/living_world_mar2010.htm

We look forward to seeing you on 30 March!




