https://bugzilla.wikimedia.org/show_bug.cgi?id=30332

--- Comment #1 from Reedy <[email protected]> 2011-08-11 19:52:11 UTC ---
(In reply to comment #0)
> At the moment, it's very hard to code a bot that can filter out blacklisted
> URLs whilst leaving other URLs.
> 
> A simple thing to help with this would be to have the API pass back all
> problematic URLs on the page, thus reducing the number of attempts to two (try
> -> filter -> try again) rather than a loop of changing one domain every time,
> as at present.
> 
> Thanks.

I don't actually think this is an API bug:

            case EditPage::AS_SPAM_ERROR:
                $this->dieUsageMsg( array( 'spamdetected', $result['spam'] ) );

It's just what gets returned by EditPage...

Might be something worth fixing as part of bug 29246.
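For illustration, the workflow the reporter describes looks roughly like this. This is a Python sketch with a simulated save routine, not real MediaWiki API calls: `BLACKLIST`, `try_edit`, and `save_with_retries` are hypothetical stand-ins that mimic how `AS_SPAM_ERROR` reports only one matched URL per failed save.

```python
# Hypothetical sketch of the bot-side loop the reporter describes:
# the API reports only ONE blacklisted URL per failed save, so the bot
# must retry once per bad domain instead of filtering them all at once.

BLACKLIST = {"spam.example", "evil.example"}  # stand-in for the spam blacklist


def try_edit(urls):
    """Simulated save: mimics AS_SPAM_ERROR by reporting the first match only."""
    for url in urls:
        if any(domain in url for domain in BLACKLIST):
            return {"error": "spamdetected", "spam": url}  # one URL per attempt
    return {"result": "Success"}


def save_with_retries(urls):
    """Current behaviour: strip one reported URL per round trip."""
    urls = list(urls)
    attempts = 0
    while True:
        attempts += 1
        result = try_edit(urls)
        if "error" not in result:
            return urls, attempts
        urls.remove(result["spam"])  # drop the single URL the API reported


links = ["http://spam.example/a", "http://ok.example/b", "http://evil.example/c"]
kept, attempts = save_with_retries(links)
print(kept)      # only the non-blacklisted link survives
print(attempts)  # one attempt per blacklisted URL, plus the final success
```

If the error response listed every matching URL at once, as requested, the loop would collapse to one filter pass plus a single retry regardless of how many domains matched.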
