I've installed a local OpenStreetMap stack for a project I'm currently working on, and we need to load a very large amount of data into our OpenStreetMap database. The original data is stored in a FileGDB of around 40 GB. It has been transferred into the OSM structure (with negative ids and no version, changeset, etc. tags).
My question is: what would be the ideal way to proceed with loading this initial set of data? I've tried bulk_upload.py, which takes about an hour to process 1/30000 of the data, so either my local API 0.6 needs to be tuned or it is not made to handle such large data updates. Osmosis requires that there be changeset and version tags, which do not exist in the data I currently have. Writing directly to the database without going through the API would presumably speed up the process, but I've not found a way to do this. Thank you in advance for your advice.
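As an aside, one common workaround for the missing tags is to stamp synthetic version/changeset attributes onto the raw OSM XML before handing it to Osmosis. A minimal sketch, assuming the data is in plain .osm XML form (the attribute values "1" and the inline sample data are placeholders, not from the original data set):

```python
# Illustrative workaround: add the version/changeset attributes Osmosis
# insists on to an OSM XML extract that lacks them.  The value "1" for
# both attributes is a synthetic placeholder.
import xml.etree.ElementTree as ET

def add_osmosis_attrs(tree, version="1", changeset="1"):
    """Stamp version/changeset attributes onto every node/way/relation."""
    for elem in tree.getroot():
        if elem.tag in ("node", "way", "relation"):
            elem.set("version", version)
            elem.set("changeset", changeset)
    return tree

# Tiny inline example; a real run would use ET.parse(...) on the exported
# .osm file and tree.write(...) afterwards instead of an inline string.
src = '<osm version="0.6"><node id="-1" lat="0" lon="0"/><way id="-2"/></osm>'
tree = add_osmosis_attrs(ET.ElementTree(ET.fromstring(src)))
print(ET.tostring(tree.getroot(), encoding="unicode"))
```

Once the attributes are present, Osmosis should accept the file; if your Osmosis build includes the --write-apidb task, that may also be a route for writing into the API database tables directly instead of going through the REST API, though I'd verify that against the Osmosis documentation for your version.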
_______________________________________________
talk mailing list
[email protected]
http://lists.openstreetmap.org/listinfo/talk

