I had thought along those lines, but won't that create a new index shapefile on every invocation?
I thought you would have to create a number of index shapefiles and then combine their contents later.
On 7/14/06, Frank Warmerdam <[EMAIL PROTECTED]> wrote:
John Preston wrote:
> I'm trying to use gdaltindex and then shptree to create a tile index
> for 50000 .tif files but when I try to run gdaltindex I get:
>
> gdaltindex data/jamaica/tileindex_res_1.shp res_1/*.tif
> bash: gdaltindex: Argument list too long
>
> I expect that the problem is that the shell is expanding the *.tif
> into one long string to pass to gdaltindex, and this is too long. How
> can I get around this.
John,
On Unix/Linux or Cygwin you should be able to do something like:
find res_1 -name '*.tif' -print | xargs --max-args=50 gdaltindex \
data/jamaica/tileindex_res_1.shp
Basically, this uses the find command to collect the list of names
(instead of wildcards, which run up against command-line length limits)
and pipes the list to xargs, which invokes gdaltindex on up to 50
names at a time.
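The batching behaviour is easy to see in isolation. The sketch below substitutes echo for gdaltindex (so it runs without GDAL installed) and uses a hypothetical scratch directory of dummy .tif files; each output line stands in for one gdaltindex invocation, so 120 files with --max-args=50 yields 3 batches:

```shell
# Create 120 dummy .tif files in a hypothetical demo directory.
mkdir -p /tmp/res_1_demo
for i in $(seq 1 120); do touch "/tmp/res_1_demo/tile_$i.tif"; done

# echo stands in for gdaltindex here; each line printed corresponds to
# one invocation, each carrying at most 50 file names.
find /tmp/res_1_demo -name '*.tif' -print \
    | xargs --max-args=50 echo gdaltindex index.shp \
    | wc -l
```

This prints 3, confirming that xargs split the 120 names into three calls of 50, 50, and 20 arguments rather than one over-long command line.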
Good luck,
--
---------------------------------------+--------------------------------------
I set the clouds in motion - turn up | Frank Warmerdam, [EMAIL PROTECTED]
light and sound - activate the windows | http://pobox.com/~warmerdam
and watch the world go round - Rush | President OSGF, http://osgeo.org
--
Ludwig M Brinckmann
phone: 020 7254 1181
mobile: 07949 460787
