whym created this task.
whym added a project: Pywikibot-archivebot.py.
Restricted Application added subscribers: pywikibot-bugs-list, Aklapper.
Restricted Application added a project: Pywikibot.

TASK DESCRIPTION
  MediaWiki enforces an upper limit on page size, and archive pages can grow 
beyond it. The limit is exposed in the siteinfo API as 'maxarticlesize'.
  
  https://www.mediawiki.org/wiki/Manual:$wgMaxArticleSize
  
https://www.mediawiki.org/wiki/Special:ApiSandbox#action=query&format=json&meta=siteinfo&siprop=general%7Cnamespaces%7Cnamespacealiases%7Cstatistics
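  
  For illustration, a minimal sketch of reading the limit out of a siteinfo 
response. The JSON here is a trimmed, hand-written sample (the real response 
carries many more fields); with the default $wgMaxArticleSize of 2048 KB the 
API reports the value in bytes.

```python
import json

# Trimmed sample of action=query&meta=siteinfo&siprop=general&format=json;
# not a real API response, just the shape relevant here.
sample = '{"query": {"general": {"sitename": "Wikipedia", "maxarticlesize": 2097152}}}'


def max_article_size(siteinfo_json: str) -> int:
    """Extract the page-size limit (in bytes) from a siteinfo response."""
    data = json.loads(siteinfo_json)
    return data['query']['general']['maxarticlesize']
```

  In Pywikibot itself the value should be reachable through the cached site 
info rather than a raw request.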
  
  When an archive page reaches maxarticlesize, the bot should stop adding 
content to it and start archiving somewhere else.
  
  For size-based archiving this should be straightforward: cap the configured 
maximum archive size at maxarticlesize.
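  
  A sketch of the capping, assuming the option keeps archivebot's '200K'-style 
size strings (the helper names here are hypothetical, not archivebot's actual 
functions):

```python
import re


def parse_size(value: str) -> int:
    """Parse a size string like '200K' or '2M' into bytes (hypothetical helper)."""
    match = re.fullmatch(r'(\d+)\s*([KM]?)', value.strip(), re.IGNORECASE)
    if not match:
        raise ValueError(f'invalid size: {value!r}')
    number, unit = int(match.group(1)), match.group(2).upper()
    return number * {'': 1, 'K': 1024, 'M': 1024 ** 2}[unit]


def effective_max_size(configured: str, maxarticlesize: int) -> int:
    """Cap the user-configured archive size at the wiki's hard limit (bytes)."""
    return min(parse_size(configured), maxarticlesize)
```

  With a 2 MiB wiki limit, a user setting of '3M' would simply be clamped down 
to the limit.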
  
  For time-based archiving it does not seem so simple. One possibility would 
be to create a new archive page by adding a suffix (e.g. 2020 → 2020_(2)), but 
that would complicate the implementation and clutter the index page of 
archives. There might be a better way to handle this.
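  
  The suffix idea could be sketched like this (purely illustrative; 'existing' 
stands for the set of archive titles that have already hit maxarticlesize):

```python
def next_archive_title(base: str, existing: set) -> str:
    """Pick the first free title: base, then 'base_(2)', 'base_(3)', ..."""
    if base not in existing:
        return base
    n = 2
    while f'{base}_({n})' in existing:
        n += 1
    return f'{base}_({n})'
```

  The index-page problem remains: anything generating the archive listing 
would need to know about the extra suffixed titles.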
  
  Any thoughts?
  
  Original discussion: 
https://commons.wikimedia.org/w/index.php?title=User_talk:ArchiverBot&oldid=436612913#Not_archiving_a_page

TASK DETAIL
  https://phabricator.wikimedia.org/T276937

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: whym
Cc: Aklapper, pywikibot-bugs-list, whym, Jyoo1011, JohnsonLee01, SHEKH, 
Dijkstra, Khutuck, Zkhalido, MJL, Viztor, Wenyi, Tbscho, MayS, Mdupont, JJMC89, 
Dvorapa, Altostratus, Avicennasis, mys_721tx, jayvdb, Masti, Alchimista
_______________________________________________
pywikibot-bugs mailing list
pywikibot-bugs@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/pywikibot-bugs