Hi Brian,

after your last hints, I've now been indexing for around 14 days, and it
seems I might have to wait another fourteen?

I started with

-bash-3.2$ php util.index.php --index
Rank: 0, total to do: 0
Rank: 1, total to do: 0
Rank: 2, total to do: 0
Rank: 3, total to do: 0
Rank: 4, total to do: 0
Rank: 5, total to do: 0
Rank: 6, total to do: 0
Rank: 7, total to do: 0
Rank: 8, total to do: 0
Rank: 9, total to do: 0
Rank: 10, total to do: 0
Rank: 11, total to do: 0
Rank: 12, total to do: 0
Rank: 13, total to do: 0
Rank: 14, total to do: 0
Rank: 15, total to do: 0
Rank: 16, total to do: 0
Rank: 17, total to do: 0
Rank: 18, total to do: 0
Rank: 19, total to do: 0
Rank: 20, total to do: 0
Rank: 21, total to do: 0
Rank: 22, total to do: 0
Rank: 23, total to do: 0
Rank: 24, total to do: 0
Rank: 25, total to do: 74241
355538: 1, 74241 remaining (15:32:37)
360464: 1, 74240 remaining (15:34:12)
365465: 2, 74239 remaining (15:35:47)

and now I'm here:

483447: 233, 56315 remaining (06:11:00)
483448: 706, 56082 remaining (12:15:01)
483449: 470, 55376 remaining (05:27:16)

If I'm correct, this means about 20,000 of 70,000 in 14 days?
Projecting this into the future: once this is done, I can start over with
two months of diff-file indexing.
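A quick sanity check of that estimate, using the "remaining" counts from the log
output above (74,241 at the start, 55,376 in the latest line) and my rough guess
of 14 days between the two readings:

```python
# Rough throughput estimate from the indexer's "remaining" counts.
# The two counts are taken from the log lines above; days_elapsed is
# my estimate of the wall-clock time between them.
start_remaining = 74241   # first log line
now_remaining = 55376     # latest log line
days_elapsed = 14

done = start_remaining - now_remaining      # items indexed so far
rate_per_day = done / days_elapsed          # average items per day
days_left = now_remaining / rate_per_day    # naive linear projection

print(f"{done} done, {rate_per_day:.0f}/day, ~{days_left:.0f} days left")
```

So at the observed average rate it could even be longer than another fourteen
days, assuming the rate stays roughly constant.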

Is there no other option, such as downloading a compressed tarball of the
SQL dump, instead of running the full procedure on every machine?
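For reference, the kind of workflow I have in mind (just a sketch with
standard PostgreSQL tools, assuming the finished database is a Postgres
database named "nominatim" — the names here are my assumption, not anything
official):

```shell
# Hypothetical: dump the fully indexed database once...
pg_dump -Fc nominatim > nominatim.dump   # custom format is compressed

# ...copy nominatim.dump to each new machine, then restore:
createdb nominatim
pg_restore -d nominatim nominatim.dump
```

That would replace weeks of re-indexing per machine with a single download
and restore, if such a dump were available somewhere.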

If my assumptions are correct, it looks like I'd better stop my Amazon
experiments, as they are becoming far too expensive. Since you've always had
helpful hints: is there a way out for me?

Kind regards
Frans
_______________________________________________
Geocoding mailing list
[email protected]
http://lists.openstreetmap.org/listinfo/geocoding
