>>>>> On Fri, 6 Jul 2018, Kent Fredric wrote:

> On Thu, 5 Jul 2018 12:32:20 -0500
> William Hubbs <willi...@gentoo.org> wrote:

>> I looked at this first, and it is very hard on the server.
>> Every pull or clone you do to update things works like an initial
>> clone, so it takes pretty massive resources.

> Surely, then, the recommended approach involves:

> 1. Selecting pages [1]
> 2. Limiting clone depth [2]

> Or at least, encouraging the use of by_rev [3]
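
For reference, the commands behind [1], [2], and [3] look roughly like
this (just a sketch; the wiki URL and the page names are placeholders
that I haven't verified against our wiki):

    # [1] Import only the listed pages instead of the whole wiki:
    git clone -c remote.origin.pages='Some_page Another_page' \
        mediawiki::https://wiki.gentoo.org

    # [2] Shallow import, i.e. only the latest revision of each page:
    git clone -c remote.origin.shallow=true \
        mediawiki::https://wiki.gentoo.org

    # [3] Afterwards, fetch new revisions wiki-wide instead of
    # querying each tracked page individually:
    git config remote.origin.fetchStrategy by_rev
    git fetch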

That will change it from completely unusable to barely usable.
Still, it isn't something I would want to use on a daily basis.

I have tried it when helping with the conversion of the GLEPs, but
only for fetching, so I don't know what would happen when trying to
push a page back to the wiki.

Ulrich

> 1: https://github.com/Git-Mediawiki/Git-Mediawiki/blob/master/docs/User-manual.md#limit-the-pages-to-be-imported
> 2: https://github.com/Git-Mediawiki/Git-Mediawiki/blob/master/docs/User-manual.md#shallow-imports
> 3: https://github.com/Git-Mediawiki/Git-Mediawiki/blob/master/docs/User-manual.md#optimizing-git-fetch
