Thank you! I don't protect non-API wikis at all; they have a room in the
museum. I only want to be sure that we can do everything with the API.
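(Side note: the red/blue check can also be done with two raw calls to the
MediaWiki action API: one query for the page's links, and one for the linked
titles themselves, where nonexistent pages come back flagged as "missing".
A minimal sketch; the endpoint constant and helper names here are my own,
not part of any framework:)

```python
from urllib.parse import urlencode

# Assumed endpoint; substitute your wiki's api.php.
API = "https://en.wikipedia.org/w/api.php"

def links_query_url(title):
    # Request 1: list the pages linked from `title`.
    return API + "?" + urlencode({
        "action": "query", "titles": title,
        "prop": "links", "pllimit": "max", "format": "json",
    })

def existence_query_url(titles):
    # Request 2: query the linked titles themselves; pages that do
    # not exist are returned with a "missing" marker.
    return API + "?" + urlencode({
        "action": "query", "titles": "|".join(titles), "format": "json",
    })

def split_red_blue(query_result):
    # Partition the titles in an existence-query result (parsed JSON)
    # into red (missing) and blue (existing) links.
    red, blue = [], []
    for page in query_result["query"]["pages"].values():
        (red if "missing" in page else blue).append(page["title"])
    return red, blue
```

For example, feeding `split_red_blue` the parsed JSON of the second request
separates the nonexistent titles from the existing ones without touching any
HTML.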

2013/4/22 Alex S.H. Lin <[email protected]>

> (sorry for poor english grammar )
>
>  I think you want to distinguish whether a linked page exists or not in
> wikitext.
> You don't need to analyze HTML tags now; just use
>
>    - Page().linkedPages() to get all linked Page() objects in the page you
>    want.
>    - getall(site, pagelist) to load the details of all the links.
>    - pagelist[x].get() to get the wikitext.
>
> If a page does not exist, you will get a NoPage exception.
>
> That uses two API requests, and it is easy to do.
>
>
> 2013/4/22 Bináris <[email protected]>
>
>> Can we solve everything through the API? I have a script that I wrote a
>> few years ago, and I had to analyze HTML code to distinguish "red" and
>> "blue" links on a page, because there was no API function for that.
>>
>> --
>> Bináris
>> _______________________________________________
>> Pywikipedia-l mailing list
>> [email protected]
>> https://lists.wikimedia.org/mailman/listinfo/pywikipedia-l
>>
>>
>
>
> --
> I always keep my spirit minority.
>
>
>


-- 
Bináris
