https://bugzilla.wikimedia.org/show_bug.cgi?id=63605
--- Comment #3 from JuneHyeon Bae <[email protected]> ---

When I run some scripts (e.g. "redirect.py broken"), the error "pywikibot.exceptions.PageNotFound: Page http://uncyclopedia.kr/w/index.php?title=%ED%8A%B9%EC%88%98%EA%B8%B0%EB%8A%A5:BrokenRedirects&limit=500&useskin=monobook could not be retrieved. Check your family file." is raised in pywikibot-compat/pywikibot/comns/http.py.

I traced the error, and the problem is the "forceHTTPS=deleted" line in pywikibot-compat/login-data/uncyclopedia-ko-DvtBot-login.data. When pywikibot logs in, the server response includes

Set-Cookie: forceHTTPS=deleted; expires=Thu, 01-Jan-1970 00:00:01 GMT; path=/; domain=.uncyclopedia.kr; httponly

and this cookie is saved to the login.data file. From then on, every request carries "Cookie: forceHTTPS=deleted", and the server answers with a 302 redirect to "Location: https://uncyclopedia.kr/w/index.php?title=%ED%8A%B9%EC%88%98%EA%B8%B0%EB%8A%A5:BrokenRedirects&limit=500&useskin=monobook" (note the https) even though SSL is disabled on both the server and the client. The resulting HTTPError is then wrongly reported as a PageNotFound exception.

The root cause: when a user logs in and SSL is disabled, MediaWiki sets the "forceHTTPS" cookie's value to "". When PHP's setcookie("aaa", "") is called with an empty value, PHP automatically expires the cookie by sending the value "deleted" with an expiry date one second past the epoch. Most browsers invalidate such a cookie and work normally, but pywikibot-compat saves the cookie without checking its expiry date, so "forceHTTPS=deleted" ends up in every request. MediaWiki, in turn, does not check that the forceHTTPS cookie's value is "true"; it only checks that the cookie exists. It therefore decides it should force HTTPS and returns a 302 redirect to the https page.

This patch fixes the problem by checking the cookie's expiry date.

--
You are receiving this mail because: You are on the CC list for the bug.
_______________________________________________
Wikibugs-l mailing list
[email protected]
https://lists.wikimedia.org/mailman/listinfo/wikibugs-l
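A minimal Python sketch of the expiry check the comment describes, using only the standard library. This is an illustration of the idea, not the actual pywikibot-compat patch; the function name live_cookies is hypothetical. It parses a Set-Cookie header and drops any cookie whose "expires" attribute is already in the past, which is exactly the case for PHP's "deleted" cookie with a 1970 expiry:

```python
# Hypothetical sketch: drop expired cookies before persisting them,
# instead of saving every Set-Cookie value the server sends.
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime
from http.cookies import SimpleCookie


def live_cookies(set_cookie_header):
    """Return {name: value} for cookies that are not already expired."""
    jar = SimpleCookie()
    jar.load(set_cookie_header)
    result = {}
    for name, morsel in jar.items():
        expires = morsel["expires"]
        if expires:
            try:
                when = parsedate_to_datetime(expires)
            except (TypeError, ValueError):
                when = None  # unparseable date: keep the cookie
            if when is not None and when <= datetime.now(timezone.utc):
                continue  # already expired: the server is deleting it
        result[name] = morsel.value
    return result


# The exact header from the bug report: the cookie is discarded.
header = ("forceHTTPS=deleted; expires=Thu, 01-Jan-1970 00:00:01 GMT; "
          "path=/; domain=.uncyclopedia.kr; httponly")
print(live_cookies(header))  # {}
```

A session cookie without an "expires" attribute is kept unchanged, so normal login cookies still reach the .data file; only server-side deletions like forceHTTPS=deleted are filtered out.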
