(working from pretty close to head...) A follow-up question on how to create cogeo files...
I appear to need to do additional work to set BLOCK_OFFSET_?_?. How do I go
about this? I'm trying to create a cogeo without gdal_translate in Java by
using Create and CreateCopy like this:

https://gist.github.com/schwehr/39680bec7fd8e3e3f840122ea3bafc65

However, I get this failure when checking the result:

validate_cloud_optimized_geotiff.py cogeo.tif
Traceback (most recent call last):
  File "validate_cloud_optimized_geotiff.py", line 144, in validate
    data_offset = int(main_band.GetMetadataItem('BLOCK_OFFSET_0_0', 'TIFF'))
TypeError: int() argument must be a string, a bytes-like object or a number, not 'NoneType'

If I use gdal_translate on the result from the Java, it validates.

gdal_translate -co COMPRESS=LZW -co TILED=YES -co COPY_SRC_OVERVIEWS=YES cogeo.tif cogeo2.tif
validate_cloud_optimized_geotiff.py cogeo2.tif
cogeo2.tif is a valid cloud optimized GeoTIFF

gdalinfo -mdd all -listmdd -mm cogeo.tif > cogeo.tif.info
gdalinfo -mdd all -listmdd -mm cogeo2.tif > cogeo2.tif.info
diff -u cogeo{,2}.tif.info
--- cogeo.tif.info	2018-02-14 10:54:27.050423325 -0800
+++ cogeo2.tif.info	2018-02-14 10:54:41.862227514 -0800
@@ -1,5 +1,5 @@
 Driver: GTiff/GeoTIFF
-Files: cogeo.tif
+Files: cogeo2.tif
 Size is 743, 372
 Coordinate System is:
 GEOGCS["WGS 84",
@@ -19,8 +19,8 @@
 Metadata:
   AREA_OR_POINT=Area
 Metadata (DERIVED_SUBDATASETS):
-  DERIVED_SUBDATASET_1_NAME=DERIVED_SUBDATASET:LOGAMPLITUDE:cogeo.tif
-  DERIVED_SUBDATASET_1_DESC=log10 of amplitude of input bands from cogeo.tif
+  DERIVED_SUBDATASET_1_NAME=DERIVED_SUBDATASET:LOGAMPLITUDE:cogeo2.tif
+  DERIVED_SUBDATASET_1_DESC=log10 of amplitude of input bands from cogeo2.tif
 Image Structure Metadata:
   COMPRESSION=LZW
   INTERLEAVE=BAND

gdalinfo -mdd all -listmdd -mm cogeo.tif
Driver: GTiff/GeoTIFF
Files: cogeo.tif
Size is 743, 372
Coordinate System is:
GEOGCS["WGS 84",
    DATUM["WGS_1984",
        SPHEROID["WGS 84",6378137,298.257223563,
            AUTHORITY["EPSG","7030"]],
        AUTHORITY["EPSG","6326"]],
    PRIMEM["Greenwich",0],
    UNIT["degree",0.0174532925199433],
    AUTHORITY["EPSG","4326"]]
Origin = (116.349975262217256,39.975075059082918)
Pixel Size = (0.000134747292618,-0.000134747292618)
Metadata domains:
  IMAGE_STRUCTURE
  (default)
  DERIVED_SUBDATASETS
Metadata:
  AREA_OR_POINT=Area
Metadata (DERIVED_SUBDATASETS):
  DERIVED_SUBDATASET_1_NAME=DERIVED_SUBDATASET:LOGAMPLITUDE:cogeo.tif
  DERIVED_SUBDATASET_1_DESC=log10 of amplitude of input bands from cogeo.tif
Image Structure Metadata:
  COMPRESSION=LZW
  INTERLEAVE=BAND
Corner Coordinates:
Upper Left  ( 116.3499753,  39.9750751) (116d20'59.91"E, 39d58'30.27"N)
Lower Left  ( 116.3499753,  39.9249491) (116d20'59.91"E, 39d55'29.82"N)
Upper Right ( 116.4500925,  39.9750751) (116d27' 0.33"E, 39d58'30.27"N)
Lower Right ( 116.4500925,  39.9249491) (116d27' 0.33"E, 39d55'29.82"N)
Center      ( 116.4000339,  39.9500121) (116d24' 0.12"E, 39d57' 0.04"N)
Band 1 Block=256x256 Type=Float32, ColorInterp=Gray
  Description = B5
    Computed Min/Max=0.000,0.000
  Overviews: 372x186, 186x93, 93x47, 47x24, 24x12

On Sun, Dec 10, 2017 at 5:34 PM, Kurt Schwehr <schw...@gmail.com> wrote:

> Thanks Even for the feedback. There have been a few offline discussions
> going on, and Even added some notes to the document on Trac:
>
> https://trac.osgeo.org/gdal/wiki/CloudOptimizedGeoTIFF?action=diff&version=11
>
> This stems from me switching Earth Engine from LZW to DEFLATE when
> exporting GeoTIFFs (and I added tiling). We have a report from a user that
> ENVI 5.4 (and 5.1) Classic can't read the resulting images but QGIS &
> ArcGIS can. I'm reverting exports back to LZW compression.
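As I understand it, BLOCK_OFFSET_0_0 in the band's TIFF metadata domain is
simply the absolute file offset of the first tile/strip of data, and
GetMetadataItem() returns None when there is no materialized block to point
at. A rough pure-Python sketch of where that number lives on disk (classic
little-endian TIFF only; first_data_offset is my own illustrative helper,
not GDAL's implementation):

```python
import struct

def first_data_offset(buf):
    """Offset of the first strip/tile of the first IFD of a classic
    little-endian TIFF, or None if neither StripOffsets (tag 273) nor
    TileOffsets (tag 324) is present. LONG (4-byte) entries assumed."""
    if buf[:4] != b"II*\x00":
        raise ValueError("only classic little-endian TIFF handled here")
    (ifd_off,) = struct.unpack_from("<I", buf, 4)       # offset of first IFD
    (n_entries,) = struct.unpack_from("<H", buf, ifd_off)
    for i in range(n_entries):
        tag, ftype, count, value = struct.unpack_from(
            "<HHII", buf, ifd_off + 2 + 12 * i)          # 12-byte IFD entries
        if tag in (273, 324):  # StripOffsets / TileOffsets
            if count == 1:
                return value                  # single entry is stored inline
            (first,) = struct.unpack_from("<I", buf, value)  # value -> array
            return first
    return None
```

If the first IFD of the Java-written file has no StripOffsets/TileOffsets
entry pointing at real data, there is nothing for BLOCK_OFFSET_0_0 to
report, which would explain the None.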
>
> On Wed, Nov 15, 2017 at 6:02 AM, Even Rouault <even.roua...@spatialys.com>
> wrote:
>
>> On mardi 14 novembre 2017 14:20:58 CET Kurt Schwehr wrote:
>>
>> > Hi Even,
>> >
>> > I have some follow-up questions on Cloud Optimized GeoTIFFs:
>>
>> The main constraint of C.O.G. is that all the IFD definitions are at the
>> beginning of the file, to avoid seeking to various points in it. Other
>> parameters are pretty much unspecified.
>>
>> > * Is there a preferred/better INTERLEAVE if there is more than one
>> > band?
>>
>> Depends on access patterns. If, as soon as you process one pixel, you
>> need the value for all bands, then INTERLEAVE=PIXEL is better, and it
>> will result in smaller StripOffsets/TileOffsets and
>> StripByteCounts/TileByteCounts arrays.
>>
>> > * Is there a preferred tile blocksize? You have 512 in your examples.
>> > Are there any major trade-offs between using 128, 256, 512, or 1024
>> > for x and y block sizes?
>>
>> Too-small blocksizes will result in larger ...Offsets and ...ByteCounts
>> arrays.
>>
>> > * Should tiles be square? Does it matter?
>>
>> No.
>>
>> > * Is it better to skip tiling for small images? If so, at what
>> > threshold do you think the switch should happen? 1024?
>>
>> I'm not sure that matters. But it is not wise to have an image whose one
>> dimension is smaller than the corresponding block dimension (as blocks
>> are not truncated).
>>
>> > * Is DEFLATE preferred for compression type over LZW for lossless
>> > compression?
>>
>> Unspecified. DEFLATE is more CPU intensive, but if network time is the
>> limiting factor, it is worth it, as it is more efficient.
>>
>> > * If the writer isn't constrained by compute power, are there
>> > preferred ZLEVEL and PREDICTOR values? Is there a time cost for
>> > decompressing ZLEVEL=9 over 1?
>>
>> PREDICTOR has negligible CPU influence (just an add/diff on integer
>> values), but will not always result in smaller file sizes. It depends on
>> the dataset.
>>
>> If I trust https://github.com/inikep/lzbench , the decompression time
>> for Deflate/zlib doesn't seem to vary much with ZLEVEL, so the higher
>> the better. I don't know for LZW.
>>
>> > I'm a little confused by this code from
>> > validate_cloud_optimized_geotiff.py:
>> >
>> >     if main_band.XSize >= 512 or main_band.YSize >= 512:
>> >         if check_tiled:
>> >             block_size = main_band.GetBlockSize()
>> >             if block_size[0] == main_band.XSize and block_size[0] > 1024:
>> >                 errors += ["The file is greater than 512xH or Wx512," +
>> >                            "but is not tiled"]
>> >
>> > Will the above correctly fail an image that is (say) 256x2048 if it
>> > is not tiled?
>>
>> No, it will pass this test, since in that case block_size[0] == xsize ==
>> 256. But for such a narrow image, it should probably warn if it is not
>> tiled, as the number of strips, if left at the default strip height,
>> will be larger than really necessary.
>>
>> Even
>>
>> --
>> Spatialys - Geospatial professional services
>> http://www.spatialys.com
>
>
> --
> --
> http://schwehr.org

--
--
http://schwehr.org
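Even's point above about small blocksizes inflating the TileOffsets and
TileByteCounts arrays is easy to quantify: each array carries one entry per
tile, so halving the blocksize roughly quadruples both. A quick arithmetic
sketch (tile_index_bytes is illustrative; classic-TIFF 4-byte LONG entries
assumed, though byte counts may be stored as SHORTs in practice):

```python
import math

def tile_index_bytes(width, height, blocksize, entry_bytes=4):
    """Bytes used by EACH of the TileOffsets and TileByteCounts arrays
    for one band, assuming 4-byte LONG entries and square tiles."""
    tiles = math.ceil(width / blocksize) * math.ceil(height / blocksize)
    return tiles * entry_bytes

# For a 40000 x 40000 scene, per array and per band:
for bs in (1024, 512, 256, 128):
    print(bs, tile_index_bytes(40000, 40000, bs))
```

Since a COG reader has to fetch these arrays before it can locate any tile,
keeping them small presumably helps first-request latency as well as file
size.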
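Even's reading of the validator snippet quoted above can be confirmed by
lifting the condition into a standalone function (names mirror the snippet;
this is a sketch, not the real script):

```python
def tiling_check_errors(xsize, ysize, block_size):
    """Restatement of the quoted validator condition: returns the list of
    errors the check would emit for a band of the given size/block size."""
    errors = []
    if xsize >= 512 or ysize >= 512:
        if block_size[0] == xsize and block_size[0] > 1024:
            errors.append(
                "The file is greater than 512xH or Wx512, but is not tiled")
    return errors

# An untiled 256x2048 image has full-width strips, so block_size == (256, 1):
print(tiling_check_errors(256, 2048, (256, 1)))    # [] -- slips through
print(tiling_check_errors(2048, 2048, (2048, 1)))  # error fires here
```

As Even says, the 256x2048 case passes because block_size[0] equals xsize
but is not greater than 1024; only wide untiled images trip the error.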
_______________________________________________
gdal-dev mailing list
gdal-dev@lists.osgeo.org
https://lists.osgeo.org/mailman/listinfo/gdal-dev