Re: [Wikitech-l] [Xmldatadumps-l] XML dumps/Media mirrors update

2012-05-30 Thread Mike Dupont
Ok, I merged the code from wikiteam and have a full-history dump script
that uploads to archive.org.
Next step is to fix the bucket metadata in the script.
mike
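
For the curious, the "bucket metadata" refers to archive.org's S3-like upload
API, where x-archive-meta-* headers on the PUT become the item's metadata.
Below is a minimal sketch of that step, assuming the requests library; the
item name, file name, keys and metadata values are placeholders, and this is
not the actual script:

# Hedged sketch: upload a zip to an archive.org item and set its metadata
# via the S3-like endpoint (s3.us.archive.org).
import requests

ACCESS_KEY = "IA_ACCESS_KEY"   # from your archive.org S3 keys page
SECRET_KEY = "IA_SECRET_KEY"
ITEM = "wikipedia-delete-2012-05"        # example item name
FILENAME = "archive-2012-05-28.zip"      # example local file

headers = {
    "authorization": "LOW %s:%s" % (ACCESS_KEY, SECRET_KEY),
    "x-archive-auto-make-bucket": "1",   # create the item if it is missing
    # x-archive-meta-* headers become the item ("bucket") metadata
    "x-archive-meta-mediatype": "web",
    "x-archive-meta-title": "Wikipedia speedy-deletion candidates, May 2012",
    "x-archive-meta-description": "Full-history Special:Export dumps of "
                                  "pages tagged for speedy deletion.",
}

with open(FILENAME, "rb") as f:
    resp = requests.put(
        "https://s3.us.archive.org/%s/%s" % (ITEM, FILENAME),
        data=f,
        headers=headers,
    )
resp.raise_for_status()
print("uploaded with status", resp.status_code)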

On Tue, May 29, 2012 at 3:08 AM, Mike  Dupont
jamesmikedup...@googlemail.com wrote:
 Well, I have now updated the script to include the XML dump in raw
 format. I will have to add more information to the archive.org item, at
 least a basic readme.
 The other thing is that the pywikipedia bot does not seem to support the
 full history, so I will have to move over to the wikiteam version and
 rework it.
 I just spent 2 hours on this, so I am pretty happy with the first version.

 mike

 On Tue, May 29, 2012 at 1:52 AM, Hydriz Wikipedia ad...@alphacorp.tk wrote:
 This is quite nice, though the item's metadata is a bit sparse :)

 On Tue, May 29, 2012 at 3:40 AM, Mike Dupont jamesmikedup...@googlemail.com
 wrote:

 The first version of the script is ready; it gets the page versions, puts
 them in a zip, and uploads that to archive.org:
 https://github.com/h4ck3rm1k3/pywikipediabot/blob/master/export_deleted.py

 Here is an example of the output:
 http://archive.org/details/wikipedia-delete-2012-05

 http://ia601203.us.archive.org/24/items/wikipedia-delete-2012-05/archive2012-05-28T21:34:02.302183.zip

 I will cron this, and it should give us a start on saving deleted data.
 Articles will be exported once a day, even if they were exported
 yesterday, as long as they are in one of the categories.

 mike

 On Mon, May 21, 2012 at 7:21 PM, Mike  Dupont
 jamesmikedup...@googlemail.com wrote:
  Thanks! And run that once per day; they don't get deleted that quickly.
  mike
 
  On Mon, May 21, 2012 at 9:11 PM, emijrp emi...@gmail.com wrote:
  Create a script that makes a request to Special:Export using this
  category as the feed:
  https://en.wikipedia.org/wiki/Category:Candidates_for_speedy_deletion
 
  More info:
  https://www.mediawiki.org/wiki/Manual:Parameters_to_Special:Export
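 
  For illustration, a minimal sketch of such a script (not Mike's exporter),
  assuming the requests library, the public API's list=categorymembers, and
  the documented Special:Export parameters:

  # Hedged sketch: collect the titles in the category via the API, then ask
  # Special:Export for full-history XML of those pages.
  import requests

  API = "https://en.wikipedia.org/w/api.php"
  EXPORT = "https://en.wikipedia.org/wiki/Special:Export"
  CATEGORY = "Category:Candidates_for_speedy_deletion"

  # One batch shown; a real script would follow 'cmcontinue' to page
  # through all category members.
  data = requests.get(API, params={
      "action": "query",
      "list": "categorymembers",
      "cmtitle": CATEGORY,
      "cmlimit": "500",
      "format": "json",
  }).json()
  titles = [m["title"] for m in data["query"]["categorymembers"]]

  # POST the titles to Special:Export, requesting the full history.
  xml = requests.post(EXPORT, data={
      "pages": "\n".join(titles),
      "history": "1",
      "action": "submit",
  }).text

  with open("speedy-deletion-candidates.xml", "w", encoding="utf-8") as f:
      f.write(xml)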
 
 
  2012/5/21 Mike Dupont jamesmikedup...@googlemail.com
 
  Well, I would be happy to save items like this:
  http://en.wikipedia.org/wiki/Template:Db-a7
  would it be possible to extract them easily?
  mike
 
  On Thu, May 17, 2012 at 2:23 PM, Ariel T. Glenn ar...@wikimedia.org
  wrote:
   There are a few other reasons articles get deleted: copyright issues,
   personal identifying data, etc.  This makes maintaining the sort of
   mirror you propose problematic, although a similar mirror is here:
   http://deletionpedia.dbatley.com/w/index.php?title=Main_Page
  
   The dumps contain only data publicly available at the time of the run,
   without deleted data.
  
   The articles aren't permanently deleted, of course.  The revision texts
   live on in the database, so a query on the toolserver, for example,
   could be used to get at them, but that would need to be for research
   purposes.
  
   Ariel
  
   On 17-05-2012 (Thu), at 13:30 +0200, Mike Dupont wrote:
   Hi,
   I am thinking about how to collect articles deleted based on the
   "not notable" criterion.
   Is there any way we can extract them from the mysql binlogs? How are
   these mirrors working? I would be interested in setting up a mirror of
   deleted data, at least that which is not spam/vandalism, based on tags.
   mike
  
   On Thu, May 17, 2012 at 1:09 PM, Ariel T. Glenn 
 ar...@wikimedia.org
   wrote:
We now have three mirror sites, yay!  The full list is linked to
 from
http://dumps.wikimedia.org/ and is also available at
   
   
 http://meta.wikimedia.org/wiki/Mirroring_Wikimedia_project_XML_dumps#Current_Mirrors
   
Summarizing, we have:
   
C3L (Brazil) with the last 5 known good dumps,
Masaryk University (Czech Republic) with the last 5 known good
 dumps,
Your.org (USA) with the complete archive of dumps, and
   
for the latest version of uploaded media, Your.org with
http/ftp/rsync
access.
   
Thanks to Carlos, Kevin and Yenya respectively at the above sites
 for
volunteering space, time and effort to make this happen.
   
As people noticed earlier, a series of media tarballs per-project
(excluding commons) is being generated.  As soon as the first run
 of
these is complete we'll announce its location and start generating
them
on a semi-regular basis.
   
As we've been getting the bugs out of the mirroring setup, it is
getting
easier to add new locations.  Know anyone interested?  Please let
 us
know; we would love to have them.
   
Ariel
   
   
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
  
  
  
  
  
  
   ___
   Wikitech-l mailing list
   Wikitech-l@lists.wikimedia.org
   https://lists.wikimedia.org/mailman/listinfo/wikitech-l
 
 
 
  --
  James Michael DuPont
  Member of Free Libre Open Source Software Kosova http://flossk.org
  Contributor FOSM, the CC-BY-SA map of 

Re: [Wikitech-l] [Xmldatadumps-l] XML dumps/Media mirrors update

2012-05-30 Thread Mike Dupont
The code is here: https://github.com/h4ck3rm1k3/wikiteam

On Wed, May 30, 2012 at 6:26 AM, Mike  Dupont
jamesmikedup...@googlemail.com wrote:
 Ok, I merged the code from wikiteam and have a full-history dump script
 that uploads to archive.org.
 Next step is to fix the bucket metadata in the script.
 mike

 On Tue, May 29, 2012 at 3:08 AM, Mike  Dupont
 jamesmikedup...@googlemail.com wrote:
 Well, I have now updated the script to include the XML dump in raw
 format. I will have to add more information to the archive.org item, at
 least a basic readme.
 The other thing is that the pywikipedia bot does not seem to support the
 full history, so I will have to move over to the wikiteam version and
 rework it.
 I just spent 2 hours on this, so I am pretty happy with the first version.

 mike

 On Tue, May 29, 2012 at 1:52 AM, Hydriz Wikipedia ad...@alphacorp.tk wrote:
 This is quite nice, though the item's metadata is a bit sparse :)

 On Tue, May 29, 2012 at 3:40 AM, Mike Dupont jamesmikedup...@googlemail.com
 wrote:

 The first version of the script is ready; it gets the page versions, puts
 them in a zip, and uploads that to archive.org:
 https://github.com/h4ck3rm1k3/pywikipediabot/blob/master/export_deleted.py

 Here is an example of the output:
 http://archive.org/details/wikipedia-delete-2012-05

 http://ia601203.us.archive.org/24/items/wikipedia-delete-2012-05/archive2012-05-28T21:34:02.302183.zip

 I will cron this, and it should give us a start on saving deleted data.
 Articles will be exported once a day, even if they were exported
 yesterday, as long as they are in one of the categories.

 mike

 On Mon, May 21, 2012 at 7:21 PM, Mike  Dupont
 jamesmikedup...@googlemail.com wrote:
  Thanks! And run that once per day; they don't get deleted that quickly.
  mike
 
  On Mon, May 21, 2012 at 9:11 PM, emijrp emi...@gmail.com wrote:
  Create a script that makes a request to Special:Export using this
  category as the feed:
  https://en.wikipedia.org/wiki/Category:Candidates_for_speedy_deletion
 
  More info:
  https://www.mediawiki.org/wiki/Manual:Parameters_to_Special:Export
 
 
  2012/5/21 Mike Dupont jamesmikedup...@googlemail.com
 
  Well, I would be happy to save items like this:
  http://en.wikipedia.org/wiki/Template:Db-a7
  would it be possible to extract them easily?
  mike
 
  On Thu, May 17, 2012 at 2:23 PM, Ariel T. Glenn ar...@wikimedia.org
  wrote:
   There are a few other reasons articles get deleted: copyright issues,
   personal identifying data, etc.  This makes maintaining the sort of
   mirror you propose problematic, although a similar mirror is here:
   http://deletionpedia.dbatley.com/w/index.php?title=Main_Page
  
   The dumps contain only data publicly available at the time of the run,
   without deleted data.
  
   The articles aren't permanently deleted, of course.  The revision texts
   live on in the database, so a query on the toolserver, for example,
   could be used to get at them, but that would need to be for research
   purposes.
  
   Ariel
  
   On 17-05-2012 (Thu), at 13:30 +0200, Mike Dupont wrote:
   Hi,
   I am thinking about how to collect articles deleted based on the
   "not notable" criterion.
   Is there any way we can extract them from the mysql binlogs? How are
   these mirrors working? I would be interested in setting up a mirror of
   deleted data, at least that which is not spam/vandalism, based on tags.
   mike
  
   On Thu, May 17, 2012 at 1:09 PM, Ariel T. Glenn 
 ar...@wikimedia.org
   wrote:
We now have three mirror sites, yay!  The full list is linked to
 from
http://dumps.wikimedia.org/ and is also available at
   
   
 http://meta.wikimedia.org/wiki/Mirroring_Wikimedia_project_XML_dumps#Current_Mirrors
   
Summarizing, we have:
   
C3L (Brazil) with the last 5 known good dumps,
Masaryk University (Czech Republic) with the last 5 known good
 dumps,
Your.org (USA) with the complete archive of dumps, and
   
for the latest version of uploaded media, Your.org with
http/ftp/rsync
access.
   
Thanks to Carlos, Kevin and Yenya respectively at the above sites
 for
volunteering space, time and effort to make this happen.
   
As people noticed earlier, a series of media tarballs per-project
(excluding commons) is being generated.  As soon as the first run
 of
these is complete we'll announce its location and start generating
them
on a semi-regular basis.
   
As we've been getting the bugs out of the mirroring setup, it is
getting
easier to add new locations.  Know anyone interested?  Please let
 us
know; we would love to have them.
   
Ariel
   
   
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
  
  
  
  
  
  
   ___
   Wikitech-l mailing list
   Wikitech-l@lists.wikimedia.org
   https://lists.wikimedia.org/mailman/listinfo/wikitech-l
 
 

Re: [Wikitech-l] git review version update, May 2012

2012-05-30 Thread Raimond Spekking
On 26.05.2012 at 20:02, Amir E. Aharoni wrote:
 `git review` says that a new version of git-review is available on PyPI.
 
 The last update created some unwanted surprises, so I decided to avoid
 updating it for now. What do our Git experts suggest?
 
 Thank you,

Grmbl. I updated git-review on my Win7 system and now I am unable to send
to Gerrit:

Ray42@RAY-PC
/d/F_Programmierung/xampp/htdocs/core/extensions/EducationProgram
(i18nfixes)
$ git review
Traceback (most recent call last):
  File "c:\Python27\Scripts\git-review", line 735, in <module>
    main()
  File "c:\Python27\Scripts\git-review", line 704, in main
    if not set_hooks_commit_msg(remote, hook_file):
  File "c:\Python27\Scripts\git-review", line 148, in set_hooks_commit_msg
    os.chmod(target_file, os.path.stat.S_IREAD | os.path.stat.S_IEXEC)
WindowsError: [Error 2] Das System kann die angegebene Datei nicht
finden: '.git\\hooks\\commit-msg'


(the last sentence means: The system cannot find the file
'.git\\hooks\\commit-msg')

Any ideas how to fix this?

Raimond.



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Xmldatadumps-l] XML dumps/Media mirrors update

2012-05-30 Thread Huib Laurens
I'm still interested in running a mirror too, as noted on Meta and sent
out earlier by mail as well.

I'm just wondering: why is there no rsync possibility from the main server?
It's strange that we need to rsync from a mirror.

-- 
Kind regards,

Huib Laurens

Certified cPanel Specialist
Certified Kaspersky Specialist

WickedWay Webhosting, webhosting the wicked way!

www.wickedway.nl - www.wickedway.be
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] [Xmldatadumps-l] XML dumps/Media mirrors update

2012-05-30 Thread Hydriz Wikipedia
Eh, mirrors rsync directly from dataset1001.wikimedia.org; see
"rsync dataset1001.wikimedia.org::".

However, the system limits rsync access to the mirrors only, to prevent
others from rsyncing directly from Wikimedia.
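
Once an IP is on that list, the mirroring itself is just plain rsync against
the module the listing shows. A hedged sketch follows; the module name and
destination path below are placeholders, not the real configuration (take
the module name from the listing or from the mirroring page on meta):

# Hedged sketch: list the rsync modules the dataset host offers, then
# mirror one of them into a local directory.
import subprocess

HOST = "dataset1001.wikimedia.org"

# Equivalent of "rsync dataset1001.wikimedia.org::" -- prints the modules
# the server is willing to serve to this IP.
subprocess.run(["rsync", HOST + "::"], check=True)

MODULE = "MODULE_NAME"   # hypothetical; use the name shown by the listing
subprocess.run(
    ["rsync", "-av", "--delete",
     "%s::%s/" % (HOST, MODULE),
     "/srv/dumps-mirror/"],
    check=True,
)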

On Wed, May 30, 2012 at 4:52 PM, Huib Laurens sterke...@gmail.com wrote:

 I'm still intressted in running a mirror also, like noted on Meta and send
 out earlier per mail also.

 I'm just wondering, why is there no rsync possibility from the main server?
 Its strange when we need to rsync from a mirror.

 --
 *Kind regards,

 Huib Laurens**

 Certified cPanel Specialist
 Certified Kaspersky Specialist
 **
 WickedWay Webhosting, webhosting the wicked way!

 www.wickedway.nl - www.wickedway.be* .
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l




-- 
Regards,
Hydriz

We've created the greatest collection of shared knowledge in history. Help
protect Wikipedia. Donate now: http://donate.wikimedia.org
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] [Xmldatadumps-l] XML dumps/Media mirrors update

2012-05-30 Thread Huib Laurens
Ok, cool.

And how do I get Wikimedia to allow our IP to rsync?

Best,

Huib

On Wed, May 30, 2012 at 10:54 AM, Hydriz Wikipedia ad...@alphacorp.tkwrote:

 Eh, mirrors rsync directly from dataset1001.wikimedia.org, see rsync
 dataset1001.wikimedia.org::

 However, the system limits the rsyncers to only mirrors, to prevent others
 from rsyncing directly from Wikimedia.

 On Wed, May 30, 2012 at 4:52 PM, Huib Laurens sterke...@gmail.com wrote:

  I'm still intressted in running a mirror also, like noted on Meta and
 send
  out earlier per mail also.
 
  I'm just wondering, why is there no rsync possibility from the main
 server?
  Its strange when we need to rsync from a mirror.
 
  --
  *Kind regards,
 
  Huib Laurens**
 
  Certified cPanel Specialist
  Certified Kaspersky Specialist
  **
  WickedWay Webhosting, webhosting the wicked way!
 
  www.wickedway.nl - www.wickedway.be* .
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l
 



 --
 Regards,
 Hydriz

 We've created the greatest collection of shared knowledge in history. Help
 protect Wikipedia. Donate now: http://donate.wikimedia.org
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l




-- 
Kind regards,

Huib Laurens
WickedWay.nl

Webhosting the wicked way.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] [Xmldatadumps-l] XML dumps/Media mirrors update

2012-05-30 Thread Hydriz Wikipedia
Ariel will do that :)

BTW, just dig around in the puppet configuration repository on Gerrit
and you can find out more :)

On Wed, May 30, 2012 at 4:58 PM, Huib Laurens sterke...@gmail.com wrote:

 Ok, cool.

 And how will I get wikimedia to allow our IP to rsync?

 Best,

 Huib

 On Wed, May 30, 2012 at 10:54 AM, Hydriz Wikipedia ad...@alphacorp.tk
 wrote:

  Eh, mirrors rsync directly from dataset1001.wikimedia.org, see rsync
  dataset1001.wikimedia.org::
 
  However, the system limits the rsyncers to only mirrors, to prevent
 others
  from rsyncing directly from Wikimedia.
 
  On Wed, May 30, 2012 at 4:52 PM, Huib Laurens sterke...@gmail.com
 wrote:
 
   I'm still intressted in running a mirror also, like noted on Meta and
  send
   out earlier per mail also.
  
   I'm just wondering, why is there no rsync possibility from the main
  server?
   Its strange when we need to rsync from a mirror.
  
   --
   *Kind regards,
  
   Huib Laurens**
  
   Certified cPanel Specialist
   Certified Kaspersky Specialist
   **
   WickedWay Webhosting, webhosting the wicked way!
  
   www.wickedway.nl - www.wickedway.be* .
   ___
   Wikitech-l mailing list
   Wikitech-l@lists.wikimedia.org
   https://lists.wikimedia.org/mailman/listinfo/wikitech-l
  
 
 
 
  --
  Regards,
  Hydriz
 
  We've created the greatest collection of shared knowledge in history.
 Help
  protect Wikipedia. Donate now: http://donate.wikimedia.org
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l
 



 --
 Kind regards,

 Huib Laurens
 WickedWay.nl

 Webhosting the wicked way.
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l




-- 
Regards,
Hydriz

We've created the greatest collection of shared knowledge in history. Help
protect Wikipedia. Donate now: http://donate.wikimedia.org
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] git review version update, May 2012

2012-05-30 Thread Antoine Musso
On 30/05/12 at 08:55, Raimond Spekking wrote:
 (the last sentence means: The system cannot find the file
 '.git\\hooks\\commit-msg')
 
 Any ideas how to fix this?

Set up your hook on that repository?

http://www.mediawiki.org/wiki/Git/Workflow has the answer:

-
First, you need to download the commit-msg hook script and place it in the
right directory in your cloned copy of the repository. The script is
available from https://gerrit.wikimedia.org/r/tools/hooks/commit-msg and
must be placed in the repository subdirectory .git/hooks/

1) With a browser:
Download the script from the repo using Save As..., then save it into
wikimedia-git-repos/examples/.git/hooks/. Voilà!

2) With wget:
Change to the repository directory (for example, cd
wikimedia-git-repos/examples/), then:
wget -P .git/hooks https://gerrit.wikimedia.org/r/tools/hooks/commit-msg

3) With curl:
curl https://gerrit.wikimedia.org/r/tools/hooks/commit-msg >
.git/hooks/commit-msg

You also need to ensure the hook is executable. In Linux you do this with:
  chmod u+x .git/hooks/commit-msg

Whenever you commit a change locally, the hook script will generate a
unique Change-Id for you.
-
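
If you would rather script it (for example on Windows, where wget and curl
may not be installed), here is a rough Python equivalent of the steps above,
using only the standard library. It mirrors what git-review's
set_hooks_commit_msg() tries to do, but it is an illustration, not
git-review's actual code:

# Hedged sketch: fetch Gerrit's commit-msg hook into .git/hooks and make it
# executable.
import os
import stat
import urllib.request

HOOK_URL = "https://gerrit.wikimedia.org/r/tools/hooks/commit-msg"
HOOK_PATH = os.path.join(".git", "hooks", "commit-msg")

os.makedirs(os.path.dirname(HOOK_PATH), exist_ok=True)
urllib.request.urlretrieve(HOOK_URL, HOOK_PATH)

# chmod u+x, keeping whatever mode bits are already set
mode = os.stat(HOOK_PATH).st_mode
os.chmod(HOOK_PATH, mode | stat.S_IXUSR)
print("installed", HOOK_PATH)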




-- 
Antoine hashar Musso


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] [Xmldatadumps-l] XML dumps/Media mirrors update

2012-05-30 Thread Huib Laurens
Ok.

I mailed Ariel about this; if all goes well, I can have the mirror up and
running by Friday.

Best,
Huib

On Wed, May 30, 2012 at 10:59 AM, Hydriz Wikipedia ad...@alphacorp.tkwrote:

 Ariel will do that :)

 BTW just dig around inside their puppet configuration repository on Gerrit
 and you can know more :)

 On Wed, May 30, 2012 at 4:58 PM, Huib Laurens sterke...@gmail.com wrote:

  Ok, cool.
 
  And how will I get wikimedia to allow our IP to rsync?
 
  Best,
 
  Huib
 
  On Wed, May 30, 2012 at 10:54 AM, Hydriz Wikipedia ad...@alphacorp.tk
  wrote:
 
   Eh, mirrors rsync directly from dataset1001.wikimedia.org, see rsync
   dataset1001.wikimedia.org::
  
   However, the system limits the rsyncers to only mirrors, to prevent
  others
   from rsyncing directly from Wikimedia.
  
   On Wed, May 30, 2012 at 4:52 PM, Huib Laurens sterke...@gmail.com
  wrote:
  
I'm still intressted in running a mirror also, like noted on Meta and
   send
out earlier per mail also.
   
I'm just wondering, why is there no rsync possibility from the main
   server?
Its strange when we need to rsync from a mirror.
   
--
*Kind regards,
   
Huib Laurens**
   
Certified cPanel Specialist
Certified Kaspersky Specialist
**
WickedWay Webhosting, webhosting the wicked way!
   
www.wickedway.nl - www.wickedway.be* .
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
   
  
  
  
   --
   Regards,
   Hydriz
  
   We've created the greatest collection of shared knowledge in history.
  Help
   protect Wikipedia. Donate now: http://donate.wikimedia.org
   ___
   Wikitech-l mailing list
   Wikitech-l@lists.wikimedia.org
   https://lists.wikimedia.org/mailman/listinfo/wikitech-l
  
 
 
 
  --
  Kind regards,
 
  Huib Laurens
  WickedWay.nl
 
  Webhosting the wicked way.
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l
 



 --
 Regards,
 Hydriz

 We've created the greatest collection of shared knowledge in history. Help
 protect Wikipedia. Donate now: http://donate.wikimedia.org
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l




-- 
Kind regards,

Huib Laurens
WickedWay.nl

Webhosting the wicked way.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] [Xmldatadumps-l] XML dumps/Media mirrors update

2012-05-30 Thread Hydriz Wikipedia
Do you have a URL that you can reveal so that some of us can have a sneak
peek? :P

On Wed, May 30, 2012 at 5:16 PM, Huib Laurens sterke...@gmail.com wrote:

 Ok.

 I mailed Ariel about this, if all goes will I can have the mirror up and
 running by Friday.

 Best,
 Huib

 On Wed, May 30, 2012 at 10:59 AM, Hydriz Wikipedia ad...@alphacorp.tk
 wrote:

  Ariel will do that :)
 
  BTW just dig around inside their puppet configuration repository on
 Gerrit
  and you can know more :)
 
  On Wed, May 30, 2012 at 4:58 PM, Huib Laurens sterke...@gmail.com
 wrote:
 
   Ok, cool.
  
   And how will I get wikimedia to allow our IP to rsync?
  
   Best,
  
   Huib
  
   On Wed, May 30, 2012 at 10:54 AM, Hydriz Wikipedia ad...@alphacorp.tk
   wrote:
  
Eh, mirrors rsync directly from dataset1001.wikimedia.org, see rsync
dataset1001.wikimedia.org::
   
However, the system limits the rsyncers to only mirrors, to prevent
   others
from rsyncing directly from Wikimedia.
   
On Wed, May 30, 2012 at 4:52 PM, Huib Laurens sterke...@gmail.com
   wrote:
   
 I'm still intressted in running a mirror also, like noted on Meta
 and
send
 out earlier per mail also.

 I'm just wondering, why is there no rsync possibility from the main
server?
 Its strange when we need to rsync from a mirror.

 --
 *Kind regards,

 Huib Laurens**

 Certified cPanel Specialist
 Certified Kaspersky Specialist
 **
 WickedWay Webhosting, webhosting the wicked way!

 www.wickedway.nl - www.wickedway.be* .
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

   
   
   
--
Regards,
Hydriz
   
We've created the greatest collection of shared knowledge in history.
   Help
protect Wikipedia. Donate now: http://donate.wikimedia.org
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
   
  
  
  
   --
   Kind regards,
  
   Huib Laurens
   WickedWay.nl
  
   Webhosting the wicked way.
   ___
   Wikitech-l mailing list
   Wikitech-l@lists.wikimedia.org
   https://lists.wikimedia.org/mailman/listinfo/wikitech-l
  
 
 
 
  --
  Regards,
  Hydriz
 
  We've created the greatest collection of shared knowledge in history.
 Help
  protect Wikipedia. Donate now: http://donate.wikimedia.org
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l
 



 --
 Kind regards,

 Huib Laurens
 WickedWay.nl

 Webhosting the wicked way.
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l




-- 
Regards,
Hydriz

We've created the greatest collection of shared knowledge in history. Help
protect Wikipedia. Donate now: http://donate.wikimedia.org
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] [Xmldatadumps-l] XML dumps/Media mirrors update

2012-05-30 Thread Huib Laurens
Sure :)

http://mirror.fr.wickedway.nl

Later on we will duplicate this to a Dutch mirror as well :)

Best,
Huib

On Wed, May 30, 2012 at 11:18 AM, Hydriz Wikipedia ad...@alphacorp.tkwrote:

 Do you have a url that you can reveal so that some of us can have a sneak
 peak? :P

 On Wed, May 30, 2012 at 5:16 PM, Huib Laurens sterke...@gmail.com wrote:

  Ok.
 
  I mailed Ariel about this, if all goes will I can have the mirror up and
  running by Friday.
 
  Best,
  Huib
 
  On Wed, May 30, 2012 at 10:59 AM, Hydriz Wikipedia ad...@alphacorp.tk
  wrote:
 
   Ariel will do that :)
  
   BTW just dig around inside their puppet configuration repository on
  Gerrit
   and you can know more :)
  
   On Wed, May 30, 2012 at 4:58 PM, Huib Laurens sterke...@gmail.com
  wrote:
  
Ok, cool.
   
And how will I get wikimedia to allow our IP to rsync?
   
Best,
   
Huib
   
On Wed, May 30, 2012 at 10:54 AM, Hydriz Wikipedia 
 ad...@alphacorp.tk
wrote:
   
 Eh, mirrors rsync directly from dataset1001.wikimedia.org, see
 rsync
 dataset1001.wikimedia.org::

 However, the system limits the rsyncers to only mirrors, to prevent
others
 from rsyncing directly from Wikimedia.

 On Wed, May 30, 2012 at 4:52 PM, Huib Laurens sterke...@gmail.com
 
wrote:

  I'm still intressted in running a mirror also, like noted on Meta
  and
 send
  out earlier per mail also.
 
  I'm just wondering, why is there no rsync possibility from the
 main
 server?
  Its strange when we need to rsync from a mirror.
 
  --
  *Kind regards,
 
  Huib Laurens**
 
  Certified cPanel Specialist
  Certified Kaspersky Specialist
  **
  WickedWay Webhosting, webhosting the wicked way!
 
  www.wickedway.nl - www.wickedway.be* .
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l
 



 --
 Regards,
 Hydriz

 We've created the greatest collection of shared knowledge in
 history.
Help
 protect Wikipedia. Donate now: http://donate.wikimedia.org
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

   
   
   
--
Kind regards,
   
Huib Laurens
WickedWay.nl
   
Webhosting the wicked way.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
   
  
  
  
   --
   Regards,
   Hydriz
  
   We've created the greatest collection of shared knowledge in history.
  Help
   protect Wikipedia. Donate now: http://donate.wikimedia.org
   ___
   Wikitech-l mailing list
   Wikitech-l@lists.wikimedia.org
   https://lists.wikimedia.org/mailman/listinfo/wikitech-l
  
 
 
 
  --
  Kind regards,
 
  Huib Laurens
  WickedWay.nl
 
  Webhosting the wicked way.
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l
 



 --
 Regards,
 Hydriz

 We've created the greatest collection of shared knowledge in history. Help
 protect Wikipedia. Donate now: http://donate.wikimedia.org
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l




-- 
Kind regards,

Huib Laurens
WickedWay.nl

Webhosting the wicked way.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] git review version update, May 2012

2012-05-30 Thread Raimond Spekking
On 30.05.2012 at 11:00, Antoine Musso wrote:
 Le 30/05/12 08:55, Raimond Spekking a écrit :
 (the last sentence means: The system cannot find the file
 '.git\\hooks\\commit-msg')

 Any ideas how to fix this?
 
 Set up your hook on that repository?
 

Yeah, I missed that. Thanks for the help :-)

But in my defence: the former git-review did this automatically during the
first commit to an extension.

Raimond.



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Gerrit question: pushing to another branch

2012-05-30 Thread Ryan Kaldari

How do you create the new branch on gerrit?

Ryan Kaldari

On 5/29/12 6:22 AM, Jeroen De Dauw wrote:

Hey,

Thanks all for pointing to the already existing solution :)

One note I'd like to add for people who want to do this: you need to
create the branch in Gerrit first, or git review will get mad at you.

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil.
--
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] git review version update, May 2012

2012-05-30 Thread Marcin Cieslak
 Raimond Spekking raimond.spekk...@gmail.com wrote:
 os.chmod(target_file, os.path.stat.S_IREAD | os.path.stat.S_IEXEC)
 WindowsError: [Error 2] Das System kann die angegebene Datei nicht
 finden: '.git\\hooks\\commit-msg'


 (the last sentence means: The system cannot find the file
 '.git\\hooks\\commit-msg')

 Any ideas how to fix this?

Maybe it is related to the fact that, if git submodules are used and a
relatively new git is used (and the moon is waning), there might no longer
be a .git subdirectory in the submodule (i.e. the extension); there is only
a .git file pointing at the real git directory kept by the parent repository.

I think aaronsw fixed this upstream with this change:

https://review.openstack.org/#/c/7166/

(it got merged, and should be included in 1.17, the latest release of
git-review).
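
Background for anyone hitting this: with a recent git, a submodule's .git is
not a directory but a small file containing a "gitdir:" pointer to the real
git directory kept by the parent repository. Below is a hedged sketch of
resolving that pointer, as an illustration of the idea rather than the code
from the linked change:

# Hedged sketch: find the real git directory for a checkout that may be a
# submodule, where ".git" is a pointer file rather than a directory.
import os

def resolve_git_dir(worktree="."):
    dot_git = os.path.join(worktree, ".git")
    if os.path.isdir(dot_git):
        return dot_git                       # ordinary repository
    with open(dot_git) as f:                 # submodule: ".git" is a file
        line = f.read().strip()
    assert line.startswith("gitdir:")
    path = line[len("gitdir:"):].strip()
    return os.path.normpath(os.path.join(worktree, path))

# The commit-msg hook then belongs under <real git dir>/hooks/
print(os.path.join(resolve_git_dir("."), "hooks", "commit-msg"))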

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Gerrit question: pushing to another branch

2012-05-30 Thread Marcin Cieslak
 Ryan Kaldari rkald...@wikimedia.org wrote:
 How do you create the new branch on gerrit?

In Gerrit Web UI:

Admin -> Projects -> (choose project) -> Branches -> Create branch
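
If you would rather not click through the UI, newer Gerrit versions also
accept a create-branch command over SSH, so this can be scripted. A hedged
sketch follows, with placeholder project, branch and starting revision; it
assumes your account has permission to create references on that project:

# Hedged sketch: create a branch through Gerrit's SSH interface.
import subprocess

GERRIT_HOST = "gerrit.wikimedia.org"
PROJECT = "mediawiki/extensions/Example"   # hypothetical project
BRANCH = "new-branch"                       # hypothetical branch name
REVISION = "master"                         # where the branch should start

subprocess.run(
    ["ssh", "-p", "29418", GERRIT_HOST,
     "gerrit", "create-branch", PROJECT, BRANCH, REVISION],
    check=True,
)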

//Saper


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l