You might want to spawn the individual document inserts using xdmp:spawn()
to avoid the timeouts.
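
For example, a minimal sketch of the idea (assuming MarkLogic 7 or later,
where xdmp:spawn-function is available; on earlier versions you would
xdmp:spawn a separate module instead, and the URI below is just a
placeholder):

```xquery
(: hand the insert off to the task server; each spawned task runs as
   its own transaction, so the calling query stays short-lived and
   never accumulates enough work to hit the request timeout :)
xdmp:spawn-function(
  function() { xdmp:document-insert("/example/doc.xml", <doc/>) },
  <options xmlns="xdmp:eval">
    <update>true</update>
  </options>)
```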

 

~Tim Meagher

 

From: [email protected]
[mailto:[email protected]] On Behalf Of Todd Gochenour
Sent: Monday, February 20, 2012 12:23 PM
To: MarkLogic Developer Discussion
Subject: Re: [MarkLogic Dev General] Processing Large Documents?

 

The XQuery I have for performing the chunking is timing out after 9 minutes
(running in Query Console). There are 156,000 'rows' total in this extract.
I'm now reading the Developer's Guide chapter on Understanding Transactions
to figure out how I might optimize this query. My query reads:

 

declare function local:random-hex($length as xs:integer) as xs:string {
  string-join(
    for $n in 1 to $length
    return xdmp:integer-to-hex(xdmp:random(15)),
    ""
  )
};

declare function local:generate-uuid-v4() as xs:string {
  string-join(
    (local:random-hex(8), local:random-hex(4), local:random-hex(4),
     local:random-hex(4), local:random-hex(12)),
    "-"
  )
};


for $row in /*/*/table_data/row
let $record := element {$row/../@name} {
  for $field in $row/field[text()]
  return element {$field/@name} {$field/text()}
}
return
  xdmp:document-insert(
    concat(/*/*/@name, '/', name($record), '/',
      name($record), '_', local:generate-uuid-v4(), '.xml'),
    $record);
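
One way to restructure this along the lines Tim suggests (a sketch only --
the batch size of 1000 is an arbitrary choice, xdmp:spawn-function assumes
MarkLogic 7+, and note that $prefix must be captured outside the anonymous
function because the spawned task has no document context of its own):

```xquery
let $size   := 1000
let $rows   := /*/*/table_data/row
let $prefix := string(/*/*/@name)
for $i in 0 to xs:integer(floor((count($rows) - 1) div $size))
let $batch := subsequence($rows, $i * $size + 1, $size)
return
  (: each batch becomes one task-server transaction, so no single
     transaction holds 156,000 inserts :)
  xdmp:spawn-function(
    function() {
      for $row in $batch
      let $record := element {$row/../@name} {
        for $field in $row/field[text()]
        return element {$field/@name} {$field/text()}
      }
      return xdmp:document-insert(
        concat($prefix, '/', name($record), '/',
          name($record), '_', local:generate-uuid-v4(), '.xml'),
        $record)
    },
    <options xmlns="xdmp:eval">
      <update>true</update>
    </options>)
```

The driver query then only walks the rows and enqueues work, which should
finish well inside the timeout; the inserts themselves proceed in the
background on the task server.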

_______________________________________________
General mailing list
[email protected]
http://developer.marklogic.com/mailman/listinfo/general
