** Changed in: zorba
Assignee: (unassigned) => Matthias Brantner (matthias-brantner)
Memory overload when streaming large file
Status in Zorba - The XQuery Processor:
We tried to insert every 1000th Wikipedia article into a Zorba
collection. Unfortunately, the machine's main memory quickly fills up,
and swapping starts once the zorba process reaches ~93% memory
consumption. The code is attached. The wiki.xml file (36GB) is a recent
Wikipedia dump and can be obtained at
http://download.wikimedia.org/enwiki/latest/enwiki-latest-pages-articles.xml.bz2
(~7.8GB compressed). The English Wikipedia currently contains around
4,000,000 articles.