On 3/4/2010 03:58, Clayton wrote:
On 03/04/2010 09:41 AM, Nino Novak wrote:
There is no "easy" one-click way to download any multi-page part of
any website or Wiki - at least not from within the website/Wiki
without some preparation.

The OooWiki uses Books for this, but someone has to create the Book -
they cannot (at this point) be auto-generated.

But it would definitely be a nice-to-have feature for the documentation
wiki to have a mechanism that automatically updates the existing
"official" wikibooks whenever one of the underlying documents changes,
or on a daily basis. Maybe it's already doable with a
script/bot/cronjob?

How would this work?

I've pondered this one quite a lot, wishing it could be totally
automated, and I haven't been able to come up with anything reliable or
practical that could manage the whole process.

The closest I've come to an idea that would have a chance of working is
to parse the maintained TOC file and convert it into a Book file.

The TOC file is a known and maintained part of any Wiki Book. The
syntax of both the TOC file and the Book file is known... and in theory
the TOC could be converted into a Book via an external transform.

Simple Pseudocode:
   - Use the MW API to extract the XML for a given TOC
   - Parse the TOC XML and convert/transform it to Book XML
   - Use API to upload the Book XML

The transform is the challenge... it could be done with Ant/XSLT/Saxon,
for example.
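
For what it's worth, here is a rough sketch of how the first two steps
might look in Python (using the requests library). The API endpoint, the
TOC page title and the Book markup it emits are assumptions - the real
TOC format and the Collection/Book syntax on the OOoWiki would need to
be checked against existing pages - and it grabs the raw wikitext rather
than the XML export, just to keep the sketch short:

import re
import requests

API = "https://wiki.services.openoffice.org/w/api.php"  # assumed endpoint

def fetch_wikitext(title):
    """Fetch the current wikitext of a page via the MediaWiki API."""
    params = {
        "action": "query",
        "prop": "revisions",
        "rvprop": "content",
        "titles": title,
        "format": "json",
    }
    data = requests.get(API, params=params).json()
    page = next(iter(data["query"]["pages"].values()))
    return page["revisions"][0]["*"]

def toc_to_book(toc_wikitext, book_title):
    """Turn a bulleted TOC of [[Page|Label]] links into Book wikitext.

    The output format here is illustrative only; the real saved-book
    syntax should be copied from an existing Book page on the wiki.
    """
    lines = ["{{saved_book}}", "", "== " + book_title + " =="]
    for target, label in re.findall(r"\[\[([^|\]]+)\|?([^\]]*)\]\]", toc_wikitext):
        lines.append(":[[" + target.strip() + "|" + (label.strip() or target.strip()) + "]]")
    return "\n".join(lines)

if __name__ == "__main__":
    # Hypothetical page title, for illustration only
    toc = fetch_wikitext("Documentation/OOo3_User_Guides/Writer_Guide_TOC")
    print(toc_to_book(toc, "Writer Guide"))

Step three, uploading the result, would be an authenticated action=edit
call with the generated text; that part is left out here.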

C.
Something like that should work for creating books automagically, but it may need something more to keep them up-to-date. Would some process need to check the revision date on each page?
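
Probably, yes. The MediaWiki API can return the timestamp of the last
revision for a batch of pages, so a daily job could compare each chapter
against the Book page and only regenerate when something is newer. A
rough sketch along those lines (again Python with requests; the endpoint
is assumed and the page titles would be whatever the real chapters are
called):

import requests

API = "https://wiki.services.openoffice.org/w/api.php"  # assumed endpoint

def last_edit_times(titles):
    """Return {title: timestamp of latest revision} for a batch of pages (max 50)."""
    params = {
        "action": "query",
        "prop": "revisions",
        "rvprop": "timestamp",
        "titles": "|".join(titles),
        "format": "json",
    }
    pages = requests.get(API, params=params).json()["query"]["pages"].values()
    # Assumes all pages exist; titles come back in the API's normalized form
    return {p["title"]: p["revisions"][0]["timestamp"] for p in pages}

def book_is_stale(book_page, chapter_pages):
    """True if any chapter was edited after the Book page was last saved."""
    # Pass titles in normalized form (spaces, not underscores) so the lookups match
    stamps = last_edit_times([book_page] + chapter_pages)
    book_time = stamps.pop(book_page)
    # ISO 8601 timestamps compare correctly as plain strings
    return any(t > book_time for t in stamps.values())

That check plus the TOC-to-Book conversion above could then be run from
a cron job, which would also answer Nino's question about a
script/bot/cronjob.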

--
/tj/


