Re: [Wikitech-l] [Wikimedia-l] Wikidata now officially has more total edits than English language Wikipedia

2019-03-20 Thread Emilio J. Rodríguez-Posada
On Wed., Mar 20, 2019 at 7:48, Ariel Glenn WMF () wrote: > Only 45 minutes later, the gap is already over 2000 revisions: > > [ariel@bigtrouble wikidata-huge]$ python3 ./compare_sizes.py > Last enwiki revid is 888606979 and last wikidata revid is 888629401 > 2019-03-20 06:46:03: diff is
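The thread quotes the output of Ariel's compare_sizes.py but not the script itself. A minimal sketch of the same comparison, assuming the standard MediaWiki action API (the function names here are mine, not from the original script):

```python
import json
from urllib.request import urlopen


def latest_revid(api_url):
    """Return the newest revision ID reported by list=recentchanges."""
    url = (api_url
           + "?action=query&list=recentchanges&rcprop=ids"
           + "&rclimit=1&format=json")
    with urlopen(url) as resp:
        return json.load(resp)["query"]["recentchanges"][0]["revid"]


def revid_gap(enwiki_revid, wikidata_revid):
    """Gap between the two wikis' latest revision IDs
    (positive when Wikidata is ahead)."""
    return wikidata_revid - enwiki_revid


# Example with the revids quoted in the thread:
# revid_gap(888606979, 888629401) == 22422
#
# Live usage (network access required):
# en = latest_revid("https://en.wikipedia.org/w/api.php")
# wd = latest_revid("https://www.wikidata.org/w/api.php")
# print(f"diff is {revid_gap(en, wd)}")
```

Note that a wiki's latest revision ID is a rough proxy for total edits, which is how the thread uses it; deleted and imported revisions mean the two numbers are not exactly equal.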

Re: [Wikitech-l] Download a wiki?

2018-05-20 Thread Emilio J. Rodríguez-Posada
WikiTeam bat signal. Dump delivered. 2018-05-19 6:52 GMT+02:00 Bart Humphries : > Great, thanks! > > I have a convention this weekend, so it'll probably be Monday > evening/Tuesday before I can really do anything else with that dump. > > On Fri, May 18, 2018, 3:32 PM

Re: [Wikitech-l] Global user pages deployed to all wikis

2015-02-22 Thread Emilio J. Rodríguez-Posada
2015-02-21 16:21 GMT+01:00 MZMcBride z...@mzmcbride.com: Emilio J. Rodríguez-Posada wrote: It seems so. In my case, years ago I created a lot of redirects to my English userpage from many Wikipedia languages, and now I have to request deletion of all of them. Not very useful. Not very

Re: [Wikitech-l] Global user pages deployed to all wikis

2015-02-21 Thread Emilio J. Rodríguez-Posada
2015-02-21 8:45 GMT+01:00 Pine W wiki.p...@gmail.com: Is it necessary to request deletion of a local user page in order to get the global page to be automatically transcluded? Pine It seems so. In my case, years ago I created a lot of redirects to my English userpage from many Wikipedia

Re: [Wikitech-l] Global user pages deployed to all wikis

2015-02-20 Thread Emilio J. Rodríguez-Posada
Hello, thanks for this, it is a good feature. Does it work for Wiktionary, Wikisource, etc.? Does it show the same userpage on every sister project? In that case, would a trick to show different content using a switch and the {{SITENAME}} magic word work? 2015-02-19 2:06 GMT+01:00 Legoktm
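The switch trick asked about here should work, because a global user page is rendered by each local wiki, so {{SITENAME}} expands to the local project's name. A sketch in wikitext, assuming the ParserFunctions extension (enabled on all Wikimedia wikis):

```wikitext
{{#switch: {{SITENAME}}
 | Wiktionary = Content shown only on Wiktionaries.
 | Wikisource = Content shown only on Wikisources.
 | #default   = Content shown on all other projects.
}}
```

Placed on the global user page at Meta, each sister project would pick its own branch at render time.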

Re: [Wikitech-l] Jake requests enabling access and edit access to Wikipedia via TOR

2014-01-13 Thread Emilio J. Rodríguez-Posada
Just create a page editable by everybody (as user talk pages are editable by blocked users): * [[Wikipedia:Edit suggestions by TOR users]] Redirect to it with a notice when a TOR node clicks an edit tab. Later, any user can add the suggestions to the articles, if they are OK. Anyway, TOR

Re: [Wikitech-l] [wikiteam-discuss:699] Tarballs of all 2004-2012 Commons files now available at archive.org

2013-10-13 Thread Emilio J. Rodríguez-Posada
Nice work Nemo! 2013/10/13 Federico Leva (Nemo) nemow...@gmail.com WikiTeam has just finished archiving all Wikimedia Commons files up to 2012 (and some more) on the Internet Archive: https://archive.org/details/wikimediacommons So far it's

Re: [Wikitech-l] GMail sending lots of Wikimedia mail to spam again

2013-08-06 Thread Emilio J. Rodríguez-Posada
Meanwhile you can fix the filters and add the "Never send it to Spam" option. I did it for all my wiki filters... 2013/8/6 Bináris wikipo...@gmail.com 2013/8/5 Risker risker...@gmail.com Really? I've not once had that message. As best I can tell, it is affecting EVERY

Re: [Wikitech-l] GMail sending lots of Wikimedia mail to spam again

2013-08-05 Thread Emilio J. Rodríguez-Posada
What is the explanation for this? My spam folder is full of emails from wiki mailing lists too. Perhaps many users don't know how to unsubscribe, mark the mail as spam instead, and Google's filter has learned from it? 2013/8/5 Mathieu Stumpf psychosl...@culture-libre.org On Monday, 5 August 2013 at 23:01 +0530,

Re: [Wikitech-l] How's the SSL thing going?

2013-07-31 Thread Emilio J. Rodríguez-Posada
It was so obvious that intelligence agencies were doing that. It was discussed in past threads on this mailing list too. Also, I have read that SSL is not secure either. So, bleh... 2013/7/31 David Gerard dger...@gmail.com Jimmy just tweeted this:

Re: [Wikitech-l] Listing missing words of wiktionnaries

2013-07-23 Thread Emilio J. Rodríguez-Posada
http://storage.googleapis.com/books/ngrams/books/datasetsv2.html 2013/7/23 Bináris wikipo...@gmail.com Once you have a list of words which are used on the web (this must come from an outside source; it can't be done within Wiktionary), the easiest way is to run a bot, e.g. Pywikipedia.
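The workflow discussed here (take an external word list, then check which words lack Wiktionary entries) can be sketched with the standard MediaWiki action API; the helper names below are mine, and this is an illustration, not Pywikipedia's actual interface:

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen


def missing_titles(api_response):
    """Extract titles flagged as missing from an action=query response."""
    pages = api_response["query"]["pages"]
    return sorted(p["title"] for p in pages.values() if "missing" in p)


def check_wiktionary(words, api_url="https://en.wiktionary.org/w/api.php"):
    """Query a Wiktionary for a batch of titles; return those that do
    not exist yet. Network access required."""
    params = urlencode({
        "action": "query",
        "titles": "|".join(words[:50]),  # the API caps one request at 50 titles
        "format": "json",
    })
    with urlopen(api_url + "?" + params) as resp:
        return missing_titles(json.load(resp))
```

Feeding this a frequency-sorted list (e.g. from the Google Books ngrams dataset linked above, after filtering out OCR noise) would yield candidate redlinks for a bot or a human to review.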