[basex-talk] Creating more than a million databases per session: Out Of Memory

2016-10-15 Thread Bram Vanroy | KU Leuven
Hi all, I've talked before about how we restructured our data to drastically improve search times on a 500 million token corpus. [1] Now, after some minor improvements, I am trying to import the generated XML files into BaseX. The result would be 100,000s to millions of BaseX databases - as we expec

Re: [basex-talk] Creating more than a million databases per session: Out Of Memory

2016-10-15 Thread Marco Lettere
Hi Bram, not being much into the issue of creating databases at this scale, I'm not sure whether the OOM problems you are facing are related to BaseX or the JVM, actually. Anyway, something rather simple you could try is to behave "in between". Instead of opening a single session for the create statem
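The "in between" idea above - one session per batch of CREATE DB commands rather than one session for everything - can be sketched roughly as follows. This is only an illustration: the batch size, the `open_session` callable, and the `(db_name, path)` pairs are assumptions, and the commented-out lines show where the official BaseXClient Python module would plug in against a running server.

```python
# Sketch of batching CREATE DB commands across sessions, so that each
# session (and the server-side state tied to it) is released before the
# next batch starts. Names and batch size are illustrative assumptions.
from itertools import islice

def batches(items, size):
    """Yield successive lists of at most `size` items from `items`."""
    it = iter(items)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk

def create_in_batches(databases, open_session, batch_size=10000):
    """Run CREATE DB for each (db_name, xml_path) pair, one session per batch.

    `open_session` is expected to return an object with execute()/close(),
    e.g.  lambda: BaseXClient.Session('localhost', 1984, 'admin', 'admin')
    from the official BaseX Python client (assumption, not tested here).
    """
    for chunk in batches(databases, batch_size):
        session = open_session()
        try:
            for db_name, xml_path in chunk:
                session.execute('CREATE DB %s %s' % (db_name, xml_path))
        finally:
            session.close()  # release before the next batch
```

Whether this actually avoids the OOM depends on where the memory is held (client heap vs. server heap), so it is worth watching both JVMs while a batch runs.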

[basex-talk] Making connections to a database

2016-10-15 Thread Thomas Daly
Hello, I'm developing an application server with node.js. When the server starts, it creates a single connection to a BaseX database, to allow subsequent user queries to the database. I notice that when I connect separately to the BaseX database with the BaseXClient command line tool, the app

[basex-talk] (no subject)

2016-10-15 Thread Shaun Flynn
Hello there, I am trying to parse a large CSV file into XML with the following XQuery: let $file := fetch:text("D:\BPLAN\tlk.txt") let $convert := csv:parse($file, map { 'header' : true(), 'separator' : 'tab'}) return fn:put( {for $row in $convert/csv/record return {$row/*} }, "D:\BPLAN\
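The XQuery preview above appears to have lost its element constructors (the angle-bracketed tags around the `{...}` expressions seem to have been eaten by the archiver), so as a readable stand-in, here is a rough Python analogue of what `csv:parse` with `header: true()` and a tab separator produces: a `csv/record/*` tree whose element names come from the header row. The element names mirror BaseX's `csv:parse` output; the sample data is made up for illustration.

```python
# Parse header-first, tab-separated text into a <csv><record>...</record></csv>
# tree using only the standard library, mimicking the shape that BaseX's
# csv:parse produces with 'header': true() and 'separator': 'tab'.
import csv
import io
import xml.etree.ElementTree as ET

def tsv_to_xml(text):
    """Convert tab-separated text (first line = header) into a csv/record tree."""
    reader = csv.DictReader(io.StringIO(text), delimiter='\t')
    root = ET.Element('csv')
    for row in reader:
        record = ET.SubElement(root, 'record')
        for name, value in row.items():
            # Assumes header names are valid XML element names.
            ET.SubElement(record, name).text = value
    return root

sample = "name\tcode\nAlpha\tA1\nBeta\tB2\n"
tree = tsv_to_xml(sample)
print(ET.tostring(tree, encoding='unicode'))
```

For a genuinely large file, streaming the rows (rather than building one tree in memory, which is what `fn:put` on a single document forces) would be the next thing to look at.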