Dec,

XQSync should do the trick; I've xqsync'd many more than 400,000 documents.
It sounds like you're using the INPUT_QUERY option. Are you using cts:uris()
for that INPUT_QUERY? Can you share that query?
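For reference, here is a minimal sketch of the xqsync properties I have in mind. The hosts, ports, credentials, and collection name are placeholders, and the exact property names are from memory, so double-check them against your copy of XQSync:

```properties
# Sketch of an xqsync.properties for a query-driven copy (all values are placeholders)
INPUT_CONNECTION_STRING=xcc://user:pass@staging-host:8010
OUTPUT_CONNECTION_STRING=xcc://user:pass@live-host:8011

# INPUT_QUERY should return the URIs of the documents to copy.
# cts:uris() with a cts:query is usually the efficient way to do this.
INPUT_QUERY=cts:uris("", (), cts:collection-query("candidates"))
```

Using cts:uris() (rather than a FLWOR over fn:doc()) keeps the URI-selection query lexicon-backed, which matters at the 400k scale you mention.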

--Mark

-----Original Message-----
From: [email protected] 
[mailto:[email protected]] On Behalf Of Declan Newman
Sent: Friday, July 02, 2010 6:42 AM
To: [email protected]
Subject: [MarkLogic Dev General] Database sync

Hello all,

I need to copy approximately 400,000 documents from one database
(staging) to another (live), selected by a query. I have looked into
XQSync, which seems to meet most of the requirements, except that it
times out given the volume (even when splitting the query into several
queries using ";;"). I have written a simple extension to XQSync that
will do the job, but in the event of a failure it starts back at the
beginning. I would rather not start inserting documents into
collections if I can avoid it.

Has anyone done something similar and found a nicer solution? Thanks
for any help.

Cheers,

Dec

-- 
Declan Newman, Senior Software Engineer,
Semantico, Floor 1, 21-23 Dyke Road, Brighton BN1 3FE
<http://www.semantico.com/>
<mailto:[email protected]>
<tel:+44-1273-358247>  <fax:+44-1273-723232>

Check out all our latest news and thinking on the Discovery blog
- http://blogs.semantico.com/discovery-blog/

Follow Semantico on Twitter
- http://twitter.com/semantico

_______________________________________________
General mailing list
[email protected]
http://developer.marklogic.com/mailman/listinfo/general
