I am sorry that this is a bit off the usual topic of FriCAS as such...

On 23 March 2013 14:01, Waldek Hebisch <[email protected]> wrote:
>
> Yes. You can see error message. The line:
>
> AttributeError: isBibliographyExportable
>
> seem to be key. Also, I saw some messages about problems
> loading CMFBibliographyAT.
>
My first suspicion is that this might be a Python version related issue.
Some parts of Zope and ZWiki use legacy features of Python that have been
dropped in newer versions of Python.

>> Having this old VM accessible only behind an Apache proxy
>> running on an up to date current version of Linux seemed like more
>> than adequate protection from the hostile Internet to me.
>
> Both Ubuntu and wiki software.

Were you successful at upgrading both of these? I have to admit that I am
interested in using newer versions, but it has just never seemed like a
priority to me.

> Running wiki in VM gives reasonable
> protection for host so I decided to run it. However thinking
> that this is "only a VM" is wrong IMHO:
>

Using VirtualBox (and most other VM host packages) makes it relatively easy
and cheap to keep regular snapshots. These days I usually use XEN on SuSE
Linux with BTRFS for VM storage. BTRFS has some nice snapshot features.

> ...
> - It is good to have several lines of defence. If attacker breaks
> one the other may limit damage and allow faster detection and
> recovery from breakin.
>

I think that usually one should first seriously evaluate the level of risk,
both in terms of the magnitude of the consequences and the probability of a
successful attack, compared to the level of effort required to prevent it.
But you might have other motivations for wanting to do this, e.g.
educational ones.

> Beside security obsolete system may cause other problems:
> ...

Yes.

>
> I wrote about backup above but let me explain it in more detail.
> AFAICS the actual wiki content (that is pages written by users,
> including revision) has volume below 100 MB. Then there are
> files obtained from outside of similar volume, but by their
> nature static. Some parts of wiki software contain unique
> customization, but that is of limited volume (latexwiki +
> mathaction subdirectory are below 1.7MB). So ideally we
> should be able to dump few hundred megabytes of data from
> the old system, install a new one, put data back and have
> wiki running on the new system. Making regular backups
> of order of few megabytes is not a problem. Actually,
> for incremental backups volume would be much lower.
>
> OTOH backing up virtual disk may take 10 GB which is problematic.
>

You are right. But it seems to me that the best "backup" for this sort of
application is in fact replication. If several co-operating instances of the
site exist on the Internet, then all that is necessary to back up the site
is to periodically synchronize page content. There is a simple and effective
Zope tool for doing this called ZSyncer, which works with any Zope-based
application. I used it between the University of Washington site and my
local version, and Ralf and I have used it to synchronize his copy of the
wiki with my local version.

https://pypi.python.org/pypi/Products.ZSyncer

ZSyncer only transfers the actual page content. The page is re-rendered on
the new site when it is next accessed.

Of course it would also be very useful if the source code for ZWiki and the
mathaction plugin were under version control on a site like GitHub.

One thing that might make sense, and would also reduce the size of the wiki
VM itself, would be to off-load the LaTeX-to-image conversion to a separate
VM host using a system such as

http://www.forkosh.com/mimetex.html
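Just to make the idea concrete (this is only a rough sketch, not part of any
existing setup): mimetex.cgi renders the LaTeX expression passed in its CGI
query string and returns a GIF image, so the wiki could fetch rendered
formulas from a separate rendering host over HTTP. The host name
"render.example.org" and the /cgi-bin/mimetex.cgi path below are assumptions
and would need to match wherever mimetex is actually installed.

    # Hypothetical sketch: fetch a rendered formula from a separate
    # mimetex host instead of running the TeX toolchain inside the wiki VM.
    import urllib.parse
    import urllib.request

    RENDER_HOST = "http://render.example.org"   # assumed rendering host

    def render_formula(latex):
        # mimetex takes the LaTeX source as its query string and
        # returns a GIF image of the rendered formula.
        url = RENDER_HOST + "/cgi-bin/mimetex.cgi?" + urllib.parse.quote(latex)
        with urllib.request.urlopen(url) as response:
            return response.read()   # raw GIF bytes, ready to cache or serve

    if __name__ == "__main__":
        gif = render_formula(r"\int_0^1 x^2 \, dx = \frac{1}{3}")
        with open("formula.gif", "wb") as f:
            f.write(gif)

The wiki side would then only need to cache and serve the returned images,
which keeps LaTeX and its dependencies entirely out of the wiki VM.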
Regards,
Bill Page.
