[basex-talk] Creating more than a million databases per session: Out Of Memory

2016-10-15 Thread Bram Vanroy | KU Leuven
Hi all, I've talked before about how we restructured our data to drastically improve search times on a 500-million-token corpus. [1] Now, after some minor improvements, I am trying to import the generated XML files into BaseX. The result would be 100,000s to millions of BaseX databases - as we

[basex-talk] (no subject)

2016-10-15 Thread Shaun Flynn
Hello there, I am trying to parse a large CSV file into XML with the following XQuery: let $file := fetch:text("D:\BPLAN\tlk.txt") let $convert := csv:parse($file, map { 'header' : true(), 'separator' : 'tab'}) return fn:put( {for $row in $convert/csv/record return {$row/*} },
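The archive appears to have stripped the angle-bracket element constructors from the query above. A minimal reconstruction sketch, assuming the intended wrapper elements are csv and record (matching the $convert/csv/record path in the original) and a hypothetical output path, since the second argument to fn:put was cut off:

```xquery
(: Parse a tab-separated file and store the result as a new XML document. :)
let $file := fetch:text("D:\BPLAN\tlk.txt")
let $convert := csv:parse($file, map { 'header': true(), 'separator': 'tab' })
return fn:put(
  <csv>{
    (: with 'header': true(), csv:parse yields csv/record elements
       whose children are named after the header columns :)
    for $row in $convert/csv/record
    return <record>{ $row/* }</record>
  }</csv>,
  "D:\BPLAN\tlk.xml"  (: hypothetical target path, not in the original :)
)
```

Note that fn:put is an updating function, so the query must run in a context that allows updates; for plain serialization to disk, file:write would be an alternative.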

[basex-talk] Making connections to a database

2016-10-15 Thread Thomas Daly
Hello, I'm developing an application server with Node.js. When the server starts, it creates a single connection to a BaseX database, to allow subsequent user queries to the database. I notice that when I connect separately to the BaseX database with the BaseXClient command line tool, the