Hi,

 

I want to check if there is likely to be any problem with memory exhaustion
in the following scenario.

 

I will have text documents stored in a MarkLogic database that I want to
update using a large number of consecutive search/replaces, then finally
convert to XML.

 

It seems obvious to me that I could easily run out of memory if I adopt this
approach (and have hundreds of replaces applied to large text documents). In
this trivial example, I am simply converting the word "Document" to
"DOCUMENT" in three steps (which I would obviously do in a single replace
for real), just to show the method I originally considered...

 

    let $Text :=
      ".............................................................. (large text document).............................."
    let $NewText1 := fn:replace($Text, "Doc", "DOC")
    let $NewText2 := fn:replace($NewText1, "ume", "UME")
    let $NewText3 := fn:replace($NewText2, "nt", "NT")
    let $XML := xdmp:unquote($NewText3)
    return
      $XML

 

I am assuming that each variable holds its own variant of the text document,
so memory would quickly become exhausted.

 

However, if I use xdmp:set(), would that solve the problem, since the first
variable's content is replaced in place and the later variables hold no
content at all?...

 

    let $Text :=
      ".............................................................. (large text document).............................."
    let $NewText1 := fn:replace($Text, "Doc", "DOC")
    let $NewText2 := xdmp:set($NewText1, fn:replace($NewText1, "ume", "UME"))
    let $NewText3 := xdmp:set($NewText1, fn:replace($NewText1, "nt", "NT"))
    let $XML := xdmp:unquote($NewText1)
    return
      $XML
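
Incidentally, since xdmp:set() returns the empty sequence, I believe
$NewText2 and $NewText3 above never hold any content anyway; the same idea
could be written with throwaway bindings (just a sketch of what I mean):

    let $Text :=
      "... (large text document) ..."
    let $NewText := fn:replace($Text, "Doc", "DOC")
    let $_ := xdmp:set($NewText, fn:replace($NewText, "ume", "UME"))
    let $_ := xdmp:set($NewText, fn:replace($NewText, "nt", "NT"))
    return xdmp:unquote($NewText)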

 

Or should I still expect the old text to be occupying memory (for lack of
string garbage collection)?
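
For comparison, the same chain of replaces could also be expressed as one
recursive function over parallel sequences of patterns and replacements, so
only one named string is in scope at a time. This is only a sketch
(local:replace-all is a made-up helper, and whether intermediate strings are
reclaimed promptly still depends on the server's garbage collector):

    declare function local:replace-all(
      $text as xs:string,
      $patterns as xs:string*,
      $replacements as xs:string*
    ) as xs:string
    {
      if (fn:empty($patterns))
      then $text
      else local:replace-all(
             fn:replace($text, $patterns[1], $replacements[1]),
             fn:subsequence($patterns, 2),
             fn:subsequence($replacements, 2))
    };

    let $Text := "... (large text document) ..."
    let $Result := local:replace-all($Text, ("Doc", "ume", "nt"), ("DOC", "UME", "NT"))
    return xdmp:unquote($Result)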

 

Thanks,

 

Neil.

 

 

 

_______________________________________________
General mailing list
[email protected]
http://xqzone.com/mailman/listinfo/general
