Re: [Wikimedia-l] Wikimedia Travel Guide: Board statement

2012-09-07 Thread emijrp
2012/9/7 Thomas Dalton thomas.dal...@gmail.com

 On 7 September 2012 20:40, Daniel Zahn dz...@wikimedia.org wrote:
  On Thu, Sep 6, 2012 at 7:40 AM, Rodrigo Tetsuo Argenton
  rodrigo.argen...@gmail.com wrote:
 
  and I'm wondering how they will sue for importing content that is on
  a free license...
 
  Well, it's Creative Commons Attribution-ShareAlike, so they will have
  to attribute Wikitravel and the original author(s) on every single page.
 
  http://wikitravel.org/shared/Copyleft

 Why attribute Wikitravel? The license only requires the author to be
 attributed, not the first publisher. Ideally, content will be copied
 across with the page history intact, which will constitute the
 appropriate attribution (in the same way it does on Wikipedia). I
 haven't been following too closely - are there any dumps available? It
 seems unlikely it will be possible to get any now (I doubt IB can be
 compelled to provide one).


WikiTeam made some backups in 2011, but with only the last version of every
page, because the server was weak and unable to provide the complete history:
http://code.google.com/p/wikiteam/downloads/list?can=2&q=wikitravel

Indeed, they have started to protect themselves against content
export: http://wikitravel.org/en/Special:Export
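Even if Special:Export is restricted, a MediaWiki installation that still has its api.php enabled exposes the same current-revision content through the standard query API. A minimal sketch of building such a request follows; the api.php location for Wikitravel is an assumption, and only standard MediaWiki parameters are used:

```python
from urllib.parse import urlencode

def current_revision_url(api_base, titles):
    """Build a MediaWiki API request for the latest wikitext of the
    given pages, roughly what a Special:Export of current revisions
    would return. api_base is an assumed endpoint path."""
    params = {
        "action": "query",
        "titles": "|".join(titles),
        "prop": "revisions",
        "rvprop": "content|user|timestamp",
        "format": "xml",
    }
    return api_base + "?" + urlencode(params)

# Hypothetical endpoint -- the real api.php path on Wikitravel may differ.
print(current_revision_url("http://wikitravel.org/wiki/en/api.php", ["Kaprun"]))
```

The resulting URL can be fetched with any HTTP client; whether the server answers depends on the wiki's configuration.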


  ___
 Wikimedia-l mailing list
 Wikimedia-l@lists.wikimedia.org
 Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l




-- 
Emilio J. Rodríguez-Posada. E-mail: emijrp AT gmail DOT com
Pre-doctoral student at the University of Cádiz (Spain)
Projects: AVBOT http://code.google.com/p/avbot/ |
StatMediaWiki http://statmediawiki.forja.rediris.es |
WikiEvidens http://code.google.com/p/wikievidens/ |
WikiPapers http://wikipapers.referata.com |
WikiTeam http://code.google.com/p/wikiteam/
Personal website: https://sites.google.com/site/emijrp/


Re: [Wikimedia-l] Wikimedia Travel Guide: Board statement

2012-09-07 Thread emijrp
When we use 1911 Britannica texts, we attribute only the encyclopedia,
not its authors, so we can write "This text comes from Wikitravel."

Anyway, if we are going to use Wikitravel texts, writing a script to
scrape just the usernames from the histories is trivial:
http://wikitravel.org/wiki/en/index.php?title=Kaprun&action=history
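For illustration, such a scraper could be as small as a regular expression over the rendered history page. This sketch assumes the default MediaWiki skin, where each revision links to the contributor's user page; the sample HTML is a stand-in for a fetched page:

```python
import re

def extract_usernames(history_html):
    """Collect contributor usernames from a MediaWiki history page,
    keeping first-seen order and dropping duplicates. Assumes the
    default skin, which marks user links with title="User:Name"."""
    seen = []
    for name in re.findall(r'title="User:([^"]+)"', history_html):
        if name not in seen:
            seen.append(name)
    return seen

# Tiny stand-in for a fetched history page:
sample = ('<a href="/wiki/User:Alice" title="User:Alice">Alice</a>'
          '<a href="/wiki/User:Bob" title="User:Bob">Bob</a>'
          '<a href="/wiki/User:Alice" title="User:Alice">Alice</a>')
print(extract_usernames(sample))  # → ['Alice', 'Bob']
```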

2012/9/7 Daniel Zahn dz...@wikimedia.org

 On Fri, Sep 7, 2012 at 12:47 PM, Thomas Dalton thomas.dal...@gmail.com
 wrote:

  Why attribute Wikitravel? The license only requires the author to be
  attributed, not the first publisher. Ideally, content will be copied
  across with the page history intact

 Really? It says: "(2) credit the author, licensor and/or other
 parties (such as a wiki or journal) in the manner they specify;"

 ...and yeah, we don't appear to have the full history.

 --
 Daniel Zahn dz...@wikimedia.org
 Operations Engineer







Re: [Wikimedia-l] Wikimedia Travel Guide: Board statement

2012-09-07 Thread emijrp
How to re-use Wikitravel guides: http://www.webcitation.org/6AVYKMbhE

Example of credits:
http://wikitravel.org/wiki/en/index.php?title=Singapore&action=credits

2012/9/7 Nathan nawr...@gmail.com

 On Fri, Sep 7, 2012 at 4:11 PM, emijrp emi...@gmail.com wrote:
  When we use 1911 Britannica texts, we attribute only the encyclopedia,
  not its authors, so we can write "This text comes from Wikitravel."
 
  Anyway, if we are going to use Wikitravel texts, writing a script to
  scrape just the usernames from the histories is trivial:
  http://wikitravel.org/wiki/en/index.php?title=Kaprun&action=history
 

 That's not really accurate, and it's not how Wikimedia projects expect
 to be credited either. Neither Wikipedia nor Wikitravel is the author
 of any project content (other than, for Wikitravel, content written by
 employees in the course of their employment).







[Wikimedia-l] Endangered languages, a new project by Google

2012-06-21 Thread emijrp
A new project by Google to protect endangered languages around the world:
http://www.endangeredlanguages.com

English blog post:
http://googleblog.blogspot.com.es/2012/06/endangered-languages-project-supporting.html
Spanish translation:
http://googleespana.blogspot.com.es/2012/06/la-conservacion-de-las-lenguas-en.html

Remember, there is a deadline
https://en.wikipedia.org/wiki/Wikipedia:There_is_a_deadline

: )



Re: [Wikimedia-l] speedydeletion.wikia.com launched

2012-06-11 Thread emijrp
Mike is the terror of deletionists.

Good work.

2012/6/10 Mike Dupont jamesmikedup...@googlemail.com

 Hi,
 I have launched speedydeletion.wikia.com; it is updated every 30 minutes
 with the proposed-deletion and speedy-deletion articles from
 en.wikipedia.org (not-notable articles and hoaxes, not others).
 The sources for the script are all on GitHub and are a merger of the
 pywikipediabot and WikiTeam codebases.
 hope you enjoy it,
 thanks,
 mike
 --
 James Michael DuPont
 Member of Free Libre Open Source Software Kosova http://flossk.org
 Contributor FOSM, the CC-BY-SA map of the world http://fosm.org
 Mozilla Rep https://reps.mozilla.org/u/h4ck3rm1k3






Re: [Wikimedia-l] Study: Nobody cares about your copyright

2012-05-21 Thread emijrp
Lol, a 14-year term. Good luck; that is a lost battle.

I think the more useful approach is to spread the word about free licenses,
which allow content to be used NOW.

2012/5/21 Samuel Klein meta...@gmail.com

 14 years is a fine place to start.  Are there any existing campaigns
 pushing for it?  S.

 On Mon, May 21, 2012 at 2:22 PM, David Gerard dger...@gmail.com wrote:
  On 21 May 2012 18:59, Samuel Klein meta...@gmail.com wrote:
 
  I don't think the right term here is 0 years.  It is also not life
  + 70.  Perhaps 7 + 7.
 
 
  I suggested 14 as a likely figure because that figure is already in
  common currency - as it was the term in the UK (Statute of Anne) and
  in the US (Copyright Act of 1790).
 
  And then Sage Ross turned up the recent study suggesting a 15-year
  term would be the correct length to maximise artistic production
  (though I think the number is a bit conveniently close to 14 years and
  would like to see multiple competing studies that show their working):
 
  http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1436186
 
  The Economist also ran an editorial pushing 14 years:
 
  http://www.economist.com/node/1547223
 
  So, yeah, 14 year term is the meme.
 
 
  - d.
 



 --
 Samuel Klein  identi.ca:sj   w:user:sj  +1 617
 529 4266







Re: [Wikimedia-l] Fire Drill Re: Wikimedia sites not easy to archive (Was Re: Knol is closing tomorrow )

2012-05-18 Thread emijrp
There is no such 10 GB limit; see
http://archive.org/details/ARCHIVETEAM-YV-6360017-6399947 (a 238 GB item).

ArchiveTeam/WikiTeam is uploading some dumps to the Internet Archive. If you
want to join the effort, use the mailing list
https://groups.google.com/group/wikiteam-discuss to avoid wasting resources.

2012/5/18 Mike Dupont jamesmikedup...@googlemail.com

 Hello People,
 I have completed my first set in uploading the osm/fosm dataset (350gb
 unpacked) to archive.org
 http://osmopenlayers.blogspot.de/2012/05/upload-finished.html

 We can do something similar with Wikipedia. The bucket size of
 archive.org is 10 GB, so we need to split up the data in a way that is
 still useful. I did this by putting each object on one line; each file
 contains its full data records plus the parts that belong to the
 previous and next blocks, so you can process the blocks almost
 stand-alone.

 mike
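The overlapping-block layout Mike describes could be sketched roughly as follows; the block and overlap sizes here are illustrative, not the ones used for the OSM upload:

```python
def split_with_overlap(records, block_size, overlap):
    """Split line-oriented records into blocks whose edges repeat
    `overlap` records from the neighbouring blocks, so each block can
    be processed almost stand-alone."""
    blocks = []
    i = 0
    while i < len(records):
        start = max(0, i - overlap)                       # tail of previous block
        end = min(len(records), i + block_size + overlap) # head of next block
        blocks.append(records[start:end])
        i += block_size
    return blocks

records = ["record-%d" % n for n in range(10)]
for block in split_with_overlap(records, block_size=4, overlap=1):
    print(block)
```

Each block's overlap region duplicates data, trading a little storage for the ability to process any single file without fetching its neighbours.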







[Wikimedia-l] Knol is closing tomorrow

2012-04-30 Thread emijrp
Dear all;

First it was Encarta, then the printed Britannica. Tomorrow, Knol.[1][2] It
is not a good moment for Wikipedia's rivals.

We at Archive Team are attempting to download all 700,000 Knols,[3] for the
sake of history. Join us in #archiveteam on EFnet.

Regards,
emijrp

[1] http://knol.google.com/k
[2] http://news.bbc.co.uk/2/hi/technology/7144970.stm
[3] http://db.tt/GNrEh61y



Re: [Wikimedia-l] [Wikitech-l] Release of educational videos under creative commons

2012-04-25 Thread emijrp
2012/4/24 Samuel Klein meta...@gmail.com

 Where's the latest thread on the Timed Media Handler progress?

 I am meeting with MIT Open CourseWare tomorrow - they want to expand
 the set of videos they released last year under CC-SA, starting with
 categories / vids that would be fill gaps on Wikipedia.  Any thoughts
 on how to make that collaboration more effective would be welcome.

 SJ


You can upload them to the Internet Archive if Wikipedia has temporary
issues with videos. Once the problems are fixed, we can move them from the
Internet Archive to Wikimedia Commons.



Re: [Wikimedia-l] Harvard University: Drop paywalled journals, publish open access

2012-04-25 Thread emijrp
2012/4/24 David Gerard dger...@gmail.com

 Tangential, but highly relevant to the goal of free content:


 http://www.guardian.co.uk/science/2012/apr/24/harvard-university-journal-publishers-prices

 Exasperated by rising subscription costs charged by academic
 publishers, Harvard University has encouraged its faculty members to
 make their research freely available through open access journals and
 to resign from publications that keep articles behind paywalls.


 http://isites.harvard.edu/icb/icb.do?keyword=k77982&tabgroupid=icb.tabgroup143448

 The Library has never received anything close to full reimbursement
 for these expenditures from overhead collected by the University on
 grant and research funds. The Faculty Advisory Council to the Library,
 representing university faculty in all schools and in consultation
 with the Harvard Library leadership,  reached this conclusion: major
 periodical subscriptions, especially to electronic journals published
 by historically key providers, cannot be sustained: continuing these
 subscriptions on their current footing is financially untenable. Doing
 so would seriously erode collection efforts in many other areas,
 already compromised.


Meanwhile, on Wikipedia we accept gifts like these[1] to add links to
paywalled content.

[1] https://en.wikipedia.org/wiki/Wikipedia:HighBeam



Re: [Wikimedia-l] [Wikipedia-l] Fwd: Harvard Library releases 12M bibliographic records under CC0

2012-04-25 Thread emijrp
2012/4/25 Federico Leva (Nemo) nemow...@gmail.com

 Thanks for sharing; I had read about it in the NYT, but nothing was said
 about the license.
 So now the USA has more open bibliographic data than Germany/Europe? :)
 lobid.org is a very nice initiative, but other catalog systems have very
 complex interactions between hundreds or thousands of entities, and it's
 very hard to change the licenses.
 The main problem is usually deduplication and the quality of the records;
 is there any information on this for Harvard's data?

 Mateus Nobre, 25/04/2012 19:44:

 Add ALL at Wikisource!


 Wikisource? This is only metadata.


Perhaps it is OK for Wikidata.



Re: [Wikimedia-l] [Wikitech-l] Release of educational videos under creative commons

2012-04-24 Thread emijrp
2012/4/24 Andrew Cates andrew.ca...@soschildren.org

 Um. https://commons.wikimedia.org/wiki/Category:Video appear to all be
 images relevant to the topic of videos. Not themselves videos.


Sorry, a missing "s": https://commons.wikimedia.org/wiki/Category:Videos


 Owner? Not personally but I am the CEO of the charity which owns them and
 I can release them.

 Is there any actual video anywhere on WP and in what format?


Yes https://en.wikipedia.org/wiki/Category:Articles_containing_video_clips

The preferred format is Theora:
https://commons.wikimedia.org/wiki/Commons:Video


 Andrew
 ==


(Please press "Reply all" so that your e-mails also go to the mailing list,
not just to me.)


 On Tue, Apr 24, 2012 at 1:26 PM, emijrp emi...@gmail.com wrote:

 2012/4/24 Andrew Cates andrew.ca...@soschildren.org

 I am wondering about releasing many hundreds of Africa educational videos
 under creative commons.

 They are the videos currently at www.our-africa.org which is a child
 generated reference site about Africa.

  There is a lot of material in the videos which could be edited and
  used to improve Wikipedia articles (a solar kettle in operation, a
  maize plant grinding maize, a variety of musical instruments in use,
  different religious festivals, cocoa plantations, etc.)

 However at present Wikipedia does not seem to support or want video
 material.


 Wikipedia supports video via the [[Media:]] tag. Also, Wikimedia Commons
 has a small collection of them:
 https://commons.wikimedia.org/wiki/Category:Video



 Does anyone have a feel whether this is likely to change?


 http://www.videoonwikipedia.com

 Also, this message is more relevant to the Wikimedia or Commons mailing
 lists (cc:). If you are the owner of those videos and want to donate
 them, some people can help you with the process.


 Andrew
 ___
 Wikitech-l mailing list
 wikitec...@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l











Re: [Wikimedia-l] [Wikipedia-l] Fwd: Harvard Library releases 12M bibliographic records under CC0

2012-04-24 Thread emijrp
Very good news for Open Library, and for us too.

2012/4/24 Samuel Klein s...@wikimedia.org

 This is big news -- though still only part of Harvard's full
 collection of records.

 Following the British Library's release of 3M bib records under CC0 18
 months ago:
 http://isites.harvard.edu/icb/icb.do?keyword=k77982&pageid=icb.page498373

 David Weinberger writes:

  This is the largest contribution of full bib records we know of.
 
  Stuart Shieber [of the Berkman Center] (and of the Office of Scholarly
  Communication) was the driving force behind this.
 
  Woohoo!
 
  David W.

 ___
 Wikipedia-l mailing list
 wikipedi...@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikipedia-l






Re: [Wikimedia-l] Wiki Travel Guide

2012-04-11 Thread emijrp
2012/4/11 John Vandenberg jay...@gmail.com

 I agree that travel content is within the scope.  In addition to the
 content itself, which helps other people, the process of writing and
 communicating travel information is educational for the writer.

 There was a session about WikiTravel at RCC2011 Canberra, where the
 need to change hosts was discussed, along with forking generally.


 https://en.wikiversity.org/wiki/University_of_Canberra/RCC2011/All_about_Wikitravel

 The wikiteam dumps of wikitravel are a bit old now; is someone working
 on making fresh dumps publicly available?
 https://code.google.com/p/wikiteam/downloads/list?can=2&q=wikitravel
 https://groups.google.com/d/topic/wikiteam-discuss/0gSFlnxeKOo/discussion


Hi;

Furthermore, the WikiTravel dumps generated by WikiTeam include only the
last revision of every article. The WikiTravel server is a bit weak, and
full-history exports fail.

We would like to download a full dump, if one is available.
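One way to fetch a full history without overwhelming a weak server is to page through revisions in small slices rather than requesting everything at once. This sketch only builds the request parameters; the parameter names are the standard MediaWiki API ones, and the slice size is illustrative:

```python
def history_params(title, rvlimit=50, rvcontinue=None):
    """Build one MediaWiki API request for a slice of a page's revision
    history. The server returns a continuation token with each response;
    passing it back as rvcontinue fetches the next slice, so the full
    history is assembled from many small, gentle requests."""
    params = {
        "action": "query",
        "titles": title,
        "prop": "revisions",
        "rvprop": "ids|user|timestamp|content",
        "rvlimit": str(rvlimit),
        "format": "json",
    }
    if rvcontinue is not None:
        params["rvcontinue"] = rvcontinue  # token from the previous response
    return params

print(history_params("Kaprun"))
```

Adding a short sleep between requests would keep the load even lower; whether a given server honours these requests is, of course, up to its operators.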

Regards,
emijrp
