Hello there,

I am trying to convert a large CSV file into XML with the following XQuery:

let $file := fetch:text("D:\BPLAN\tlk.txt")
let $convert := csv:parse($file, map { 'header': true(), 'separator': 'tab' })
return fn:put(
  <tlks>
  {for $row in $convert/csv/record
  return <tlk>{$row/*}</tlk>
  }</tlks>,
  "D:\BPLAN\tlk.xml"
)

Using the GUI, it runs out of memory -- when I click on the bottom right-hand
corner (where the memory usage is shown), a dialog suggests increasing the
memory and restarting with -Xmx <size>.

I tried this from the MS-DOS prompt, but -Xmx no longer appears to be
accepted as a parameter.
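For reference, this is what I attempted: a sketch for the Windows command
prompt, assuming BaseX's launcher scripts pick up JVM options from the
BASEX_JVM environment variable rather than as a direct command-line
parameter (the 4g heap size is only an illustrative value):

```shell
:: Set JVM options for the BaseX launcher scripts via the BASEX_JVM
:: environment variable, then start the GUI with the larger heap.
:: Assumes the scripts read BASEX_JVM; 4g is an example value only.
set BASEX_JVM=-Xmx4g
basexgui
```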

Is there a better method for parsing large CSV files? I then want to add the
resulting tlk.xml file to a new database.

Kindest Regards
Shaun Connelly-Flynn
