> I need the Wikipedia dump from 2002 for my machine learning thesis.
> Could you please provide me with the data?
Is this what you are looking for?
https://dumps.wikimedia.org/archive/
Regards
Aron Bergman
___
Xmldatadumps-l mailing list
Hello,
I need the Wikipedia dump from 2002 for my machine learning thesis.
Could you please provide me with the data?
Thank you,
___
Xmldatadumps-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/xmldatadumps-l
Hi,
I would be interested to know how many pages are in
enwiki-latest-pages-articles.xml. My own count gives 19.4 million pages.
Can this be, at least roughly, confirmed?
On the internet I only find these numbers:
5,861,178 - I guess these are all namespace 0 pages
47,826,337 - these are all pages
The number should be around 19,414,056, the same number of pages as in the
stubs-articles file.
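For what it's worth, such a count can be reproduced by streaming the dump and counting `<page>` elements; a minimal sketch (the inline sample stands in for the real dump file, which you would open with `open('enwiki-latest-pages-articles.xml', 'rb')` instead):

```python
import io
import xml.etree.ElementTree as ET

def count_pages(xml_stream):
    """Stream a MediaWiki XML dump and count <page> elements
    without loading the whole file into memory."""
    count = 0
    for event, elem in ET.iterparse(xml_stream, events=("end",)):
        # Tags carry the export namespace, e.g.
        # '{http://www.mediawiki.org/xml/export-0.10/}page',
        # so match on the local name only.
        if elem.tag.rsplit("}", 1)[-1] == "page":
            count += 1
            elem.clear()  # free the element as we go
    return count

# Tiny inline sample in place of the real dump
sample = io.StringIO(
    '<mediawiki xmlns="http://www.mediawiki.org/xml/export-0.10/">'
    '<page><title>A</title></page>'
    '<page><title>B</title></page>'
    '</mediawiki>'
)
print(count_pages(sample))  # prints 2 for this sample
```

Note this counts every `<page>` regardless of namespace, which is why the total exceeds the namespace-0 article count.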
On Tue, May 28, 2019 at 8:35 AM Sigbert Klinke
wrote:
> Hi,
>
> I would be interested to know how many pages are in
> enwiki-latest-pages-articles.xml. My own count gives 19.4 million pages.
> Can this be, at