Hello there,

I'm a beginner programmer working on a project that uses Wikipedia page views, 
trying to reproduce a paper. I was pulling JSON from http://stats.grok.se, but 
the last month hasn't been updated. I need the last 30 days of Wikipedia views 
for several pages, but the hourly dump files are 70 MB compressed. Is it 
possible to query just the data I need through some kind of Wikipedia database 
directly?
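For context, here's roughly the kind of request I've been making, as a minimal 
Python sketch. The `latest30` path and the `daily_views` key reflect the 
stats.grok.se JSON format as I understand it, so please correct me if I have 
them wrong:

```python
import json
import urllib.request


def grok_url(lang, article):
    # Build the stats.grok.se JSON URL for the last 30 days of views.
    return "http://stats.grok.se/json/%s/latest30/%s" % (lang, article)


def fetch_views(lang, article):
    # Fetch and parse the JSON; the service returns a dict whose
    # "daily_views" field maps dates to view counts.
    with urllib.request.urlopen(grok_url(lang, article)) as resp:
        data = json.load(resp)
    return data["daily_views"]
```

This is where I'm stuck: the counts for the most recent month come back empty 
or stale.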


Otherwise, regarding the files (located at 
http://dumps.wikimedia.org/other/pagecounts-raw/2016/2016-01/), is there a 
recommended way to open and process them? I'm using a high-level language, so 
loading files that size into memory sounds like it will overwhelm my computer.
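To make the question concrete, here's the kind of streaming approach I've been 
considering, a minimal sketch assuming each line of a decompressed hourly file 
is space-separated as `project page_title count bytes` (the pagecounts-raw 
format as I understand it):

```python
import gzip


def count_views(path, project, titles):
    # Stream one gzipped hourly pagecounts file line by line, summing
    # view counts for the requested page titles, without ever loading
    # the whole file into memory.
    wanted = set(titles)
    totals = dict.fromkeys(wanted, 0)
    with gzip.open(path, "rt", encoding="utf-8", errors="replace") as f:
        for line in f:
            parts = line.split(" ")
            if len(parts) != 4:
                continue  # skip malformed lines
            proj, title, count = parts[0], parts[1], parts[2]
            if proj == project and title in wanted:
                totals[title] += int(count)
    return totals
```

Would something like this be workable for 30 days' worth of hourly files, or 
is there a smarter way to get just a handful of pages?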


As I said, I'm fairly new to all of this, so I apologize in advance if these 
questions seem silly.


Thank you for any help,

Dominic Della Sera

_______________________________________________
Analytics mailing list
[email protected]
https://lists.wikimedia.org/mailman/listinfo/analytics