The XDMP-MEMORY message does mean that the host couldn't allocate the needed memory. In this case that was probably because the transaction was too large to fit in memory. If you aren't already using 4.1-3, I'd upgrade - just in case this is a known problem that has already been fixed.

If 4.1-3 doesn't help, then I suppose you could increase the swap space... but I don't think you'd like the performance. You might be able to reduce the sizes of the group-level caches, but that might lead to *CACHEFULL errors.

So as Geert suggested, clearing the forest is probably the fastest solution. Or if you don't mind spending more time on it, you could delete in blocks of 1000 documents.

  let $path := "/RxNorm/rxnsat/"  (: the directory from your error message :)
  for $doc in xdmp:directory($path, 'infinity')[1 to 1000]
  return xdmp:document-delete(xdmp:node-uri($doc))

You could automate this using xdmp:spawn(). You could also use cts:uris() with a cts:directory-query(), if you have the URI lexicon enabled.
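Putting those two ideas together, a self-respawning batch delete might look something like this (just a sketch - it assumes the URI lexicon is enabled, and the module name "delete-batch.xqy" is hypothetical; you'd install it in your modules database and kick it off with one xdmp:spawn call):

```xquery
(: delete-batch.xqy -- deletes up to 1000 documents from the directory,
   then respawns itself until nothing is left :)
let $uris := cts:uris((), "limit=1000",
               cts:directory-query("/RxNorm/rxnsat/", "infinity"))
return (
  for $uri in $uris
  return xdmp:document-delete($uri),
  (: each spawned task runs in its own transaction,
     so memory use stays bounded per batch :)
  if (exists($uris))
  then xdmp:spawn("delete-batch.xqy")
  else ()
)
```

Since each spawned task is its own transaction, no single transaction ever holds more than 1000 deletes, which is the point of batching in the first place.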

-- Mike

On 2009-12-09 05:59, Lee, David wrote:
My joys of success were premature.
I ran into memory problems trying to load the full set of documents, it died 
after about 1mil.
So I tried to delete the directory and now I’m getting

Exception running: :query
com.marklogic.xcc.exceptions.XQueryException: XDMP-MEMORY: xdmp:directory-delete
("/RxNorm/rxnsat/") -- Memory exhausted
in /eval, on line 1

Arg !!!!

I’ve tried changing various memory settings to no avail. Any clue how to
delete this directory?
Or should I start deleting the files piecemeal?

Suggestions welcome.

-David


----------------------------------------
David A. Lee
Senior Principal Software Engineer
Epocrates, Inc.
[email protected]<mailto:[email protected]>
812-482-5224

_______________________________________________
General mailing list
[email protected]
http://xqzone.com/mailman/listinfo/general