Daniel, I don't think urllib.urlretrieve is fit for the job, since the
script strips some parts of the wiki pages (the ToC and the go-back
links) before writing them to disk. If there's some other reason to use
urllib.urlretrieve, please let me know... :)
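For reference, here is a rough sketch of the fetch-and-strip idea. It is
not the actual code in doc.py, and the HTML markers below are only
placeholders I made up:

import re
import urllib

def fetch_doc(url, dest):
    # urlopen lets us post-process the page before saving it;
    # urlretrieve would write the raw response straight to disk.
    html = urllib.urlopen(url).read()
    # placeholder patterns: the real ToC / go-back markup differs
    html = re.sub(r'(?s)<div class="table-of-contents">.*?</div>', '', html)
    html = re.sub(r'(?s)<p class="go-back">.*?</p>', '', html)
    f = open(dest, 'w')
    f.write(html)
    f.close()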
I'm going to add the script to bughelper.main right now, and I've added
these comments to debian/rules, just in case:
--- debian/rules 2007-02-08 08:18:05 +0000
+++ debian/rules 2007-02-15 14:16:51 +0000
@@ -10,6 +10,10 @@
 build/bughelper::
 	docbook2x-man debian/bughelper.1.docbook
 	docbook2x-man debian/bugxml.1.docbook
+	#
+	# uncomment to get a fresh copy of doc pages before release
+	#
+	#python debian/doc.py
 
 clean::
 	rm -f bughelper.1 bugxml.1
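Keeping the call commented out means normal package builds stay offline;
the idea is to uncomment it (or run "python debian/doc.py" from the
source tree by hand) only when refreshing the shipped doc pages right
before a release, as the comment says.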
BTW, I had patched a few Python scripts before, but this was the first
one I wrote from scratch, so it took me a while... :/
** Changed in: bughelper (upstream)
Status: Needs Info => Fix Committed
--
RFE: add documentation to bughelper
https://launchpad.net/bugs/79222