Hi, Nick,

I am not on the Analytics team, but, assuming you mean the English-language
Wikipedia, all the internal links and the external links are available
for download in SQL format at:

https://dumps.wikimedia.org/enwiki/20180220/ (search for "pagelinks" and
"externallinks"). The latest dump is running now and may be ready in a few
days as well. The files are quite large because they include the link titles
and other properties, but with a little automation it is easy to count the
pairs in the direction you want. You may also need the "all-titles" or
"all-titles-in-ns0" files to map link titles to page ids.

Maybe someone else can offer an easier option?

On Fri, Mar 9, 2018 at 4:42 PM, Nick Bell <bhin...@gmail.com> wrote:

>  Dear Analytics Team,
>
> I’m doing a project on Wikipedia for my Maths degree, and I was hoping you
> could help me acquire some data about Wikipedia.
>
> I would like to get the number of incoming internal links and outgoing
> internal links for every page, if possible. I could limit this if needs be,
> as I am aware this totals around 11 million values.
>
> I have minimal programming experience, so if this is unreasonable or
> impossible please let me know. I very much appreciate your time considering
> my request.
>
>
>
> Many thanks,
>
>
> Nicholas Bell
>
> Mathematics Undergraduate
>
> University of Bristol
>
>


-- 
Jaime Crespo
<http://wikimedia.org>
_______________________________________________
Analytics mailing list
Analytics@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/analytics
