Bob and I have been working on the wiki breakdown. Here's what's done:

- the ikiwiki package is installed and basically configured on frontend.
- it is connected to the subversion source repository in the administration project, https://savannah.gnu.org/svn/?group=administration. (More details another time.)
- the html for our pages (there are about 150, most short) is saved in that svn repo; the pages aren't visible on our web site yet, though. I retrieved them by hand from the Wayback Machine, starting from
  http://web.archive.org/web/20130818055223/http://savannah.gnu.org/maintenance/FrontPage
  It had all the actual content that I remember ever being there.

Here are my next steps:

- write something to convert that dumped html to markdown; then at least we'll have the documentation online again. I hope to get the basic task done in the next day or two. (A rough sketch of what I have in mind is at the end of this note.)
- set up a post-svn-commit hook so that commits automatically cause the wiki to be updated; also set up email on commit. (Also sketched at the end.)
- enable editing through the web.

Obviously there is much more to say/document eventually, but that's the current state. The wiki content itself is also badly, sadly, madly stale.

By the way, neither Bob nor I could find a way to extract the actual text from Zope's Data.fs file. The provided fsdump tool gives a text transcription of it:

  env PYTHONPATH=/usr/lib/zope2.10/lib/python \
    /usr/lib/zope2.10/bin/fsdump.py \
    /var/lib/zope2.10-20120719/instance/default/var/Data.fs | head

But this seemed to just report transactions, referring to opaque data pointers. No actual content. We found lots of people asking questions about it on the web, but no usable answers. If Zope were actually running, it would be possible to export the "objects" (pages) in various ways, but of course we can't get Zope to start. Hence the brute-force save from wayback.

Yay for wayback. They did better backups than anything we had ...

karl
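
P.S. For the html-to-markdown step, here is roughly the kind of loop I have in mind. This is an untested sketch: it assumes pandoc is available on frontend, and the html/ and wiki/ directory names are just placeholders for wherever the dumped pages and the ikiwiki source pages actually live in the checkout.

  # Untested sketch: convert each saved html page into a .mdwn page
  # (ikiwiki's default markdown extension), keeping the base filename.
  for f in html/*.html; do
    base=$(basename "$f" .html)
    pandoc -f html -t markdown "$f" -o "wiki/$base.mdwn"
  done

Whatever tool we end up using, the Wayback copies carry the archive's injected toolbar and navigation chrome, so some hand cleanup of the results will probably be needed regardless.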
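
For the post-commit piece, the Subversion hook would look something like the sketch below. The setup-file path is made up, and ikiwiki's documented way of doing this is to have it generate and install its own compiled post-commit wrapper; calling ikiwiki directly from the hook is just the simplest thing that could work.

  #!/bin/sh
  # Sketch of $REPOS/hooks/post-commit.  Subversion passes the
  # repository path and the new revision number as arguments.
  REPOS="$1"
  REV="$2"

  # Regenerate the wiki from the just-committed page sources.
  # (Setup file path is a placeholder.)
  ikiwiki --setup /etc/ikiwiki/savannah.setup --refresh

  # Commit mail (e.g. Subversion's mailer.py) could be triggered from
  # here as well, once we settle on recipients.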