On Sat, Mar 14, 2026 at 10:32 AM Daniel Kinzler via Wikitech-l
<[email protected]> wrote:
> Tools like this should continue to work fine if they authenticate when making 
> the API requests.  We don't want to break community tools, but we can't 
> distinguish them from commercial scrapers, which we want to rate limit. So 
> the way to fix the tools is to make the user log in, or to run the tools on 
> WMCS.
>
> If there are problems with making tools authenticate for API calls, 
> please let us know.

The main issue with a tool like Depictor is that there are two kinds
of requests: authenticated requests that make edits on Commons (which
work fine), and unauthenticated read-only requests coming from the
frontend to get images, fetch metadata, etc. The latter are far more
plentiful and probably look fairly random from the outside (a typical
Depictor session can easily generate 50 requests per minute). As far
as I know these requests can't be authenticated (passing a 'token'
parameter doesn't work). I did add the Api-User-Agent header, but
I've still seen the dreaded HTTP 429 error even after adding it.
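
For reference, the frontend reads look roughly like this (a sketch, not
Depictor's actual code; the endpoint parameters and the User-Agent string
are illustrative):

```typescript
// Anonymous read-only call to the Commons API from the browser.
// "origin=*" enables anonymous CORS; Api-User-Agent identifies the
// tool per API etiquette, but does NOT authenticate the request.
const COMMONS_API = "https://commons.wikimedia.org/w/api.php";

function buildQueryUrl(params: Record<string, string>): string {
  const search = new URLSearchParams({
    ...params,
    format: "json",
    origin: "*",
  });
  return `${COMMONS_API}?${search.toString()}`;
}

async function fetchMetadata(titles: string): Promise<unknown> {
  const url = buildQueryUrl({ action: "query", prop: "imageinfo", titles });
  const res = await fetch(url, {
    headers: {
      // Hypothetical tool identifier; the real one would name the
      // Toolforge tool and a contact address.
      "Api-User-Agent": "Depictor/1.0 (toolforge.org tool; contact)",
    },
  });
  if (res.status === 429) {
    // Rate limited despite the header -- the request is still anonymous.
    throw new Error("rate limited");
  }
  return res.json();
}
```

Dozens of calls like this fire per minute during a session, one per image
or metadata lookup, and none of them carry any credentials.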

Maybe one solution would be to move the unauthenticated requests to
the backend (the PHP API layer) as well, because then it would be
clear that they're coming from toolforge.org? Unfortunately, that
would require a lot of refactoring of this app, and of many older
ones that I doubt still have maintainers.
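
The refactoring would roughly amount to a thin read-proxy like the
following (sketched in TypeScript for brevity; Depictor's real backend is
PHP, and the /api-proxy route and User-Agent string are made up):

```typescript
// Hypothetical server-side proxy: the browser calls the tool's own
// /api-proxy endpoint instead of the Commons API directly, so every
// read originates from Toolforge and can be identified as such.
const COMMONS_API = "https://commons.wikimedia.org/w/api.php";

// Map an incoming proxy request like "/api-proxy?action=query&titles=..."
// to the upstream API URL, forwarding the query string unchanged.
function upstreamUrl(requestUrl: string): string {
  const incoming = new URL(requestUrl, "http://toolforge.invalid");
  return `${COMMONS_API}?${incoming.searchParams.toString()}`;
}

// Server-side fetch: the request now comes from the tool's backend,
// not from the visitor's browser.
async function proxyRead(requestUrl: string): Promise<string> {
  const res = await fetch(upstreamUrl(requestUrl), {
    headers: { "User-Agent": "Depictor-proxy/0.1 (maintainer contact)" },
  });
  return res.text();
}
```

Simple enough in isolation, but every frontend call site would have to be
rewritten to target the proxy, which is where the refactoring cost comes
from.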


-- Hay
_______________________________________________
Wikitech-l mailing list -- [email protected]
To unsubscribe send an email to [email protected]
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/