Public bug reported:

We tried to insert every 1000th Wikipedia article into a Zorba collection. Unfortunately, the machine's main memory fills up quickly, and swapping sets in once the Zorba process reaches ~93% memory consumption.

Code is attached. The wiki.xml file (36 GB) is a recent Wikipedia dump and can be obtained at http://download.wikimedia.org/enwiki/latest/enwiki-latest-pages-articles.xml.bz2 (~7.8 GB compressed). The English Wikipedia currently contains around 4,000,000 articles.
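The attached XQuery is not included in this notification, but the intended bounded-memory behavior can be sketched in Python with xml.etree.ElementTree.iterparse. This is an illustrative sketch only: the function name, the sampling step, and the title extraction are assumptions, not taken from the report. The key point is that each fully parsed <page> subtree is discarded as soon as it has been inspected, so memory stays roughly constant regardless of file size, which is what the Zorba run was expected (but failed) to do.

```python
import xml.etree.ElementTree as ET
from io import BytesIO


def sample_articles(source, step=1000):
    """Stream a MediaWiki-style dump and return the <title> of every
    `step`-th <page> element, without building the whole tree in memory."""
    titles = []
    count = 0
    for _event, elem in ET.iterparse(source, events=("end",)):
        # MediaWiki dumps namespace their elements; compare local names only.
        if elem.tag.rsplit("}", 1)[-1] == "page":
            count += 1
            if count % step == 0:
                for child in elem:
                    if child.tag.rsplit("}", 1)[-1] == "title":
                        titles.append(child.text)
                        break
            # Free the finished subtree. The emptied element object itself
            # stays attached to the root, but that residue is tiny compared
            # to retaining every full article.
            elem.clear()
    return titles


# Tiny synthetic dump standing in for the 36 GB wiki.xml.
doc = "<mediawiki>" + "".join(
    "<page><title>A%d</title></page>" % i for i in range(1, 26)
) + "</mediawiki>"
print(sample_articles(BytesIO(doc.encode()), step=10))
```

If Zorba's streaming mode held on to every parsed article node instead of releasing it after insertion into the collection, that would match the steady memory growth observed here.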

** Affects: zorba
     Importance: Undecided
     Assignee: Matthias Brantner (matthias-brantner)
         Status: New

-- 
You received this bug notification because you are a member of Zorba
Coders, which is the registrant for Zorba.
https://bugs.launchpad.net/bugs/1016053

Title:
  Memory overload when streaming large file

Status in Zorba - The XQuery Processor:
  New


