I have a MarkLogic database document that is 50 MB, and my query below

xquery version "1.0-ml";

for $m in doc("/dlap/angel.bfwpub.com.xml")//response//results//result//doc//arr/str
where fn:contains(fn:string-join(($m/text()), ""), "http://angel.bfwpub.com")
return
  <results>{ $m/../../@entityid }{ $m/../../@itemid }{ $m/../@name }{ $m }</results>

kept triggering the "a script on this page may be busy" pop-up in Query
Console and never finished rendering all the results before the browser
froze completely.
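I suspect the freeze comes from Query Console trying to render every result
at once, so one workaround I considered was rendering only a slice of the
results with fn:subsequence. This is just a sketch, using the same document
URI and paths as above; it still evaluates the full query server-side:

xquery version "1.0-ml";

(: render only the first 100 matches to keep the browser responsive;
   fn:subsequence takes the sequence, a start position, and a length :)
let $matches :=
  for $m in doc("/dlap/angel.bfwpub.com.xml")//response//results//result//doc//arr/str
  where fn:contains(fn:string-join(($m/text()), ""), "http://angel.bfwpub.com")
  return
    <results>{ $m/../../@entityid }{ $m/../../@itemid }{ $m/../@name }{ $m }</results>
return fn:subsequence($matches, 1, 100)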

I wrote a utility to break the 50 MB XML document into 50 documents of
approximately 1 MB each.

I gave the resulting documents the same file name with an integer
appended, starting at 0 up to 49.
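For what it is worth, the splitting could presumably also be done inside
MarkLogic itself with xdmp:document-insert. The sketch below is untested;
chunking on <result> elements and the 50-way split are assumptions based on
the shape of my data:

xquery version "1.0-ml";

(: split the large document into 50 smaller ones, $chunk-size <result>
   elements per output document, reusing the naming scheme above :)
let $results := doc("/dlap/angel.bfwpub.com.xml")//response//results//result
let $chunk-size := xs:integer(fn:ceiling(fn:count($results) div 50))
for $i in (0 to 49)
let $chunk := fn:subsequence($results, $i * $chunk-size + 1, $chunk-size)
where fn:exists($chunk)
return
  xdmp:document-insert(
    fn:concat("DLAP/angel.bfwpub.com", $i, ".xml"),
    <response><results>{ $chunk }</results></response>)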
I created the following XQuery with a function:

xquery version "1.0-ml";

declare function local:stuff($word as xs:int, $s as xs:string)
{
  let $a := fn:concat("DLAP/angel.bfwpub.com", $word, ".xml")
  for $m in doc($a)//doc//arr/str
  where fn:contains(fn:string-join(($m/text()), ""), $s)
  return
    <results>{ $m/../../@entityid }{ $m/../../@itemid }{ $m/../@name }{ $m }</results>
};

for $f in (0 to 49)
return local:stuff($f, "http://angel.bfwpub.com")

I got the same problem as before. By breaking it down into chunks of 5,
e.g. for $f in (0 to 4), capturing the results, and then continuing with
for $f in (5 to 9), I was able to get all the results that I needed.
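One small improvement I considered is parameterizing the batch number so
that only one value has to change between runs. This is just a sketch; the
document URIs and element paths are the same assumptions as above:

xquery version "1.0-ml";

(: run one batch of five documents per execution;
   change $batch to 0, 1, ..., 9 between runs :)
declare variable $batch as xs:int := 0;

declare function local:stuff($word as xs:int, $s as xs:string)
{
  let $a := fn:concat("DLAP/angel.bfwpub.com", $word, ".xml")
  for $m in doc($a)//doc//arr/str
  where fn:contains(fn:string-join(($m/text()), ""), $s)
  return
    <results>{ $m/../../@entityid }{ $m/../../@itemid }{ $m/../@name }{ $m }</results>
};

for $f in ($batch * 5 to $batch * 5 + 4)
return local:stuff($f, "http://angel.bfwpub.com")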
I have some other documents which are much larger, so my manual process
may be too time-consuming or error-prone. Does anyone have a better
solution? Any information will be greatly appreciated.