Does anyone have experience running Data Reviewer against very large data
sets (>2 million features)?  I am running table-to-table attribute checks
on a utility network migration, comparing an Oracle database export from a
FRAMME environment against an ArcGIS feature class to validate that the
attributes migrated properly.  The Oracle table has more than 2 million
records and the feature class has more than 170,000.  Just one
table-to-table check on one attribute runs in excess of 16 hours (overnight)
without completing.  I have 5 checks to run on this one feature class, and
somewhere in the vicinity of 25 feature classes with similar sizes and
testing requirements.  I have applied SQL filter queries to the batch
checks and written a Python script to run them in sequence to minimize
processing overhead, but it is still taking a very long time.  I can tell
it is working, since my test runs are writing to the Reviewer tables, but
even those take an hour or two on a new 64-bit machine with only a few
hundred records to compare.  Something seems remarkably off to me.
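
For reference, this is the core of the comparison I am effectively asking
Reviewer to perform, sketched in plain Python against CSV exports of the two
tables (file paths and field names here are hypothetical, just to show the
logic):

```python
import csv

def load_table(path, key_field, value_field):
    """Read a CSV export into a dict keyed on the join field."""
    with open(path, newline="") as f:
        return {row[key_field]: row[value_field] for row in csv.DictReader(f)}

def compare_attribute(source, target):
    """Return target keys whose attribute value differs from the source,
    plus target keys with no matching source record."""
    mismatched = [k for k, v in target.items() if k in source and source[k] != v]
    missing = [k for k in target if k not in source]
    return mismatched, missing

# Hypothetical usage:
# oracle = load_table("framme_export.csv", "FACILITYID", "MATERIAL")
# fc = load_table("fc_export.csv", "FACILITYID", "MATERIAL")
# mismatched, missing = compare_attribute(oracle, fc)
```

A dictionary lookup like this is effectively a hash join, so the comparison
itself should scale roughly linearly with record count, which is part of why
the Reviewer run times surprise me.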

Thanks,
Keith McKinnon

-- 

*Twenty years from now you will be more disappointed by the things that you
didn't do than by the ones you did do. So throw off the bowlines. Sail away
from the safe harbor. Catch the trade winds in your sails. Explore. Dream.
Discover.*
- Mark Twain
_______________________________________________
Geowanking mailing list
[email protected]
http://geowanking.org/mailman/listinfo/geowanking_geowanking.org