How far back do you need to go?
On Sun, Jan 22, 2023 at 10:25 AM Adhittya Ramadhan
wrote:
>
> On 17 Jan 2023 at 11:23, "Eric Andrew Lewis" <
> eric.andrew.le...@gmail.com> wrote:
>
>> Hi,
>>
>> I am interested in performing analysis on recently created pages on
>> English Wikipedia.
>>
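For a question like this, one hedged starting point (an assumption on my part, not the poster's actual method) is the MediaWiki API's `list=recentchanges` restricted to `rctype=new`, which returns recently created pages without downloading a full dump:

```python
from urllib.parse import urlencode

# Assumption: the standard English Wikipedia API endpoint.
API = "https://en.wikipedia.org/w/api.php"

def new_pages_params(namespace=0, limit=50):
    """Query parameters for recently *created* pages: the
    recentchanges list, restricted to 'new' entries."""
    return {
        "action": "query",
        "list": "recentchanges",
        "rctype": "new",              # page creations only
        "rcnamespace": str(namespace),
        "rclimit": str(limit),
        "format": "json",
    }

url = API + "?" + urlencode(new_pages_params())
```

Paging further back than the recentchanges window keeps requires the dumps, which is why the reply asks how far back the analysis needs to go.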
That eswiki page is in namespace "wgNamespaceNumber":104; the FR page is in
"wgNamespaceNumber":116.
On Sun, Feb 13, 2022 at 2:17 PM Erik del Toro wrote:
> Hello.
>
> I am doing some conversions for aarddict (https://aarddict.org/), an offline
> Wikipedia and Wiktionary app. I use mw2slob and the N0 files
Are you limiting your count to namespace 0?
On Thu, Aug 20, 2020 at 10:45 AM Yuki Kumagai
wrote:
> Hiya
>
> I have a question about wikipedia xml database dump. Apologies if this
> wasn't an appropriate place for asking a question.
> On a wikipedia page, it's mentioned that the current number
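On the counting point, a minimal sketch (not the poster's code) that streams a dump with `iterparse` and counts only namespace-0 pages, so the full multi-GB file never has to fit in memory. The tag handling strips the XML namespace prefix that real dumps carry on every element:

```python
import xml.etree.ElementTree as ET

def count_ns0_pages(source):
    """Count <page> elements whose <ns> child is 0 (the main/article
    namespace) in a MediaWiki XML dump. `source` is a filename or
    file-like object; each <page> subtree is cleared after
    inspection so memory stays flat on large dumps."""
    count = 0
    for _, elem in ET.iterparse(source, events=("end",)):
        if elem.tag.rsplit("}", 1)[-1] == "page":
            for child in elem:
                if child.tag.rsplit("}", 1)[-1] == "ns":
                    if (child.text or "").strip() == "0":
                        count += 1
                    break
            elem.clear()  # release the processed subtree
    return count
```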
Normal editing won't cause issues, but a delete/move/restore history merge
can cause things to look out of order if you are using child/parent
On Fri, Jan 17, 2020 at 8:52 PM Christopher Wolfram
wrote:
> Thanks Ariel.
>
> So the revisions are in order of revision id which are assigned
>
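A hedged illustration of the point above (the field names are hypothetical, not taken from the dump schema): when history merges can scramble id order, sorting by timestamp first is safer than trusting that a child revision's id always exceeds its parent's:

```python
def chronological(revisions):
    """Return revisions sorted by (timestamp, id). After a
    delete/move/restore history merge, ids alone may not increase
    monotonically along the parent chain, but the stored timestamps
    still reflect when each edit was saved."""
    return sorted(revisions, key=lambda r: (r["timestamp"], r["id"]))

def id_order_broken(revisions):
    """True if any revision's parent has a *larger* id than the
    revision itself -- a telltale sign of a history merge."""
    by_id = {r["id"]: r for r in revisions}
    return any(
        r["parent"] in by_id and r["parent"] > r["id"]
        for r in revisions
    )
```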
the dumps do not contain any images, just the description text that goes
along with them. Platonides got you a raw copy of the actual files
On Fri, Mar 18, 2016 at 6:31 AM, D. Hansen
wrote:
> Hi Platonides
>
> First let me thank you very much!
>
> 99 GByte, how
Have you tried 7zip?
On Fri, Jan 15, 2016 at 8:30 PM, Richard Farmbrough <
rich...@farmbrough.co.uk> wrote:
> I have problems bunzip2ing pages-articles files. WinRAR fails at 37 GB, and
> bunzip2 fails at some point well past 14 GB, though it "helpfully" cleans up
> after itself.
>
> Bunzip2 v 1.0.6
>
>
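One likely culprit (an assumption; the thread doesn't confirm it) is that the large dumps are *multistream* bzip2 files, which some tools stop reading at the first stream boundary. Python's `bz2.open` handles concatenated streams transparently, so a small streaming reader can both verify the file and avoid materialising the full ~99 GB of XML on disk:

```python
import bz2

def stream_bz2(path, chunk_size=1 << 20):
    """Yield decompressed chunks from a .bz2 file. bz2.open reads
    *all* concatenated streams, unlike some archivers that stop
    after the first one."""
    with bz2.open(path, "rb") as f:
        while True:
            block = f.read(chunk_size)
            if not block:
                break
            yield block
```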
OK, reports a few minutes old:
http://tools.wmflabs.org/betacommand-dev/reports/commonswiki_svg_list.txt.7z
On Sat, Jun 20, 2015 at 1:38 AM, Ariel T. Glenn agl...@wikimedia.org
wrote:
On 20-06-2015, Sat, at 00:46 +0200, Federico Leva
(Nemo) wrote:
D. Hansen, 19/06/2015
I can run a database report Monday. But keep in mind that the wiki isn't
static and what you want changes at a very rapid rate
On Friday, June 19, 2015, D. Hansen sammelacco...@tageskurier.de wrote:
Hi!
I have tried to get a list of all .svg-files on commons.wikipedia.
Of course I could
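For completeness, a hedged sketch of how such a list could be pulled live from the API instead of a one-off report (endpoint and paging behaviour are my assumptions, not taken from the thread): `list=allimages` filtered by MIME type, following `aicontinue` until the listing is exhausted:

```python
from urllib.parse import urlencode

# Assumption: the Wikimedia Commons API endpoint.
COMMONS_API = "https://commons.wikimedia.org/w/api.php"

def svg_batch_params(continue_token=None):
    """Parameters for one page of Commons SVG filenames via
    list=allimages, filtered to image/svg+xml."""
    params = {
        "action": "query",
        "list": "allimages",
        "aimime": "image/svg+xml",  # SVGs only
        "ailimit": "500",           # maximum for anonymous clients
        "format": "json",
    }
    if continue_token:
        params["aicontinue"] = continue_token
    return params

first_page_url = COMMONS_API + "?" + urlencode(svg_batch_params())
```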
You would probably need to pull from wikidata
On Fri, Oct 3, 2014 at 10:31 AM, Ditty Mathew ditty...@gmail.com wrote:
I am trying to get paired articles from Simple English Wikipedia and
English Wikipedia. For that I am looking for language links for Simple
English Wikipedia. Is it available?
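A sketch of the Wikidata route suggested above (the response shape is assumed from the `wbgetentities` JSON format): look up the entity by its Simple English title, then read the `enwiki` sitelink off the result:

```python
from urllib.parse import urlencode

# Assumption: the standard Wikidata API endpoint.
WIKIDATA_API = "https://www.wikidata.org/w/api.php"

def pairing_params(simple_title):
    """Parameters asking Wikidata for the entity whose simplewiki
    sitelink matches `simple_title`, returning only its sitelinks."""
    return {
        "action": "wbgetentities",
        "sites": "simplewiki",
        "titles": simple_title,
        "props": "sitelinks",
        "format": "json",
    }

def enwiki_title(entity):
    """Extract the paired English Wikipedia title from one entity
    dict in the wbgetentities response, or None if there is none."""
    link = entity.get("sitelinks", {}).get("enwiki")
    return link["title"] if link else None
```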
http://en.wikipedia.org/w/api.php?action=query&meta=siteinfo&siprop=namespaces|namespacealiases&format=jsonfm
should be what you need.
On Tue, Jul 2, 2013 at 3:11 PM, Byrial Jensen byr...@vip.cybercity.dk wrote:
Hi, is there a file somewhere with a list of all namespace names and
numbers for all
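The reply above gives the right query; as a sketch of consuming its output (the JSON shape is assumed from the classic, pre-formatversion=2 response), namespace numbers can be mapped to names like this:

```python
def namespace_names(siteinfo_json):
    """Map namespace id -> name from an
    action=query&meta=siteinfo&siprop=namespaces response.
    The main namespace (0) has no 'canonical' key, so fall
    back to the local '*' name."""
    table = {}
    for key, ns in siteinfo_json["query"]["namespaces"].items():
        table[int(key)] = ns.get("canonical", ns.get("*", ""))
    return table
```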
I am looking to create a script for creating manual dumps for those
wikis that either don't or won't publish their own dumps and that I don't
have server access to. To that end I am writing a Python dump creator;
however, I would like to ensure that my format is the same as the
existing one. I could
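For matching the existing dump layout, a minimal sketch (heavily simplified; the real export schema published on mediawiki.org carries many more elements, such as per-revision ids, timestamps, contributors, and sha1 hashes) of emitting one <page> in the export shape:

```python
import xml.etree.ElementTree as ET

def page_element(title, ns, wikitext):
    """Build a stripped-down <page> element mirroring MediaWiki's
    XML export format. Real dumps nest far more metadata inside
    both <page> and <revision>; this shows only the skeleton."""
    page = ET.Element("page")
    ET.SubElement(page, "title").text = title
    ET.SubElement(page, "ns").text = str(ns)
    revision = ET.SubElement(page, "revision")
    ET.SubElement(revision, "text").text = wikitext
    return page

xml_text = ET.tostring(page_element("Example", 0, "Hello"), encoding="unicode")
```

Validating generated output against the official export XSD would be the safest way to confirm the format really matches.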