For #1) Wouldn't we have to keep doing that whenever a change is made to the website as a result? I'm concerned with it falling out of sync with the production website.
True, and this is a significant concern. Whoever updated the web site would also have to upload those changes to build/, which can (and will) become problematic.
Also, the output on minotaur should be viewable via ViewCVS anyway, correct? (We were doing manual updates to the website to fix the breadcrumb issue--we went through ViewCVS to find the correct pages to update IIRC.)
Sort of. minotaur is a different location, and I don't even know if it has CVS (I just upload the darn site using FTP--actually *cough* Dreamweaver does it for me). Once a forrestbot is set up, it can be done via a web-based interface, although that may take a while. I believe they're waiting for some new servers or something, and their hair started going gray some time ago... Hopefully, when the forrestbot is built, they'll let infrastructure@ know, and fop-dev can take the steps to create a forrestbot to handle xml-fop.
FWIW, as far as I understand, forrestbot is not much more than:
- run the /forrest/ command
- if successful, cp -R build/ /www/xml.apache.org/fop
- let fop-dev@ know whether or not it was successful when it's done
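In other words, as far as I understand it, the whole thing could be sketched as a small shell function along these lines. The function name, parameters, and the plain echo standing in for the "notify fop-dev@" step are all mine, not the real forrestbot; it's just a sketch of the three steps above.

```shell
#!/bin/sh
# Hypothetical sketch of a forrestbot-style run. The build command and
# deploy target are passed in as parameters; "echo" stands in for the
# real notification to fop-dev@.
deploy_site() {
    build_cmd="$1"    # e.g. the forrest command
    target="$2"       # e.g. /www/xml.apache.org/fop
    if $build_cmd; then
        # Build succeeded: copy the generated site into place.
        cp -R build "$target" && echo "forrestbot: deploy OK"
    else
        # Build failed: report it instead of deploying.
        echo "forrestbot: build FAILED"
    fi
}
```

Called as, say, `deploy_site forrest /www/xml.apache.org/fop`, it either copies build/ to the live location and reports success, or reports the failure without touching the site.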
For #2) Oh ye of little faith! ;) Aren't we eventually going to have a full-site PDF anyway? Then non-internet installations can just save a copy of the PDF locally after they download the software. (I guess they have to be connected to the 'Net at some point, though...) Saving the full-site PDF once we have it may be more efficient.
D'oh! I agree. Having a wholesite.pdf file is a *much* better solution. As Emily Litella would say, "Never mind..."
Web Maestro Clay -- <[EMAIL PROTECTED]> - <http://homepage.mac.com/webmaestro/> My religion is simple. My religion is kindness. - HH The 14th Dalai Lama of Tibet