Can we aspire to translate having the same set of resampling algorithms as
the warper?
I see the warper adds min, max, mod, q1, q3, sum
I especially wanted sum for OVERVIEW_RESAMPLING in COG, and I can see where
it's done and ... can maybe see my way through that ... but the 600 lines
of code
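For what it's worth, the arithmetic a "sum" overview resampler has to do is simple in the toy case. Here's a pure-Python sketch (my own illustration, not GDAL's code, which also handles edge blocks, nodata and the full set of data types):

```python
def sum_downsample_2x(grid):
    """Downsample a 2D list-of-lists by 2x, summing each 2x2 block.

    A toy sketch of what 'sum' resampling computes per overview level;
    GDAL's real implementation is far more general.
    """
    rows = len(grid) // 2
    cols = len(grid[0]) // 2
    out = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            out[r][c] = (grid[2 * r][2 * c] + grid[2 * r][2 * c + 1]
                         + grid[2 * r + 1][2 * c] + grid[2 * r + 1][2 * c + 1])
    return out

# a 4x4 grid of ones sums to a 2x2 grid of fours
print(sum_downsample_2x([[1] * 4 for _ in range(4)]))
```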
What's the task? What about batching the geometry and/or fields? Can you
run on just the first feature - does that work?
How many features, how big is the task?
On Wed, 27 Sept 2023, 13:16 Scott via gdal-dev,
wrote:
> Any tips for using ogr2ogr to use only a specified amount of RAM? I'm
> not
Are there any formats that record "coverage" topology? What I'm worried
about is: when shapes are encoded as blob geometry with initially identical
coordinates at shared vertices, is there any process that can validate or
record that particular coords should have the same values even after
>> ==704== Rerun with --leak-check=full to see details of leaked memory
>> ==704==
>> ==704== ERROR SUMMARY: 0 errors from 0 contexts (suppressed: 0 from 0)
>>
>>
>> ogrinfo /tmp/newdir
>> Illegal instruction (core dumped)
>>
>> Cheers, Mike
>>
>
On this URL I get an error; as far as I understand, the server doesn't
support range downloading. The file otherwise works fine.
gdalinfo failed - unable to open '/vsicurl/
https://erddap.emodnet.eu/erddap/files/biology_6640_benthos_NorthSea_e4af_0f0e_6a73/04_2021_6640_diva_benthos_erddap.nc
I don't understand how jammy is "old" when the full build is itself using
"BASE_IMAGE=ubuntu:22.04"
But, I'm out of my depth in these emails and trying to learn, thanks!
Cheers, Mike
On Wed, 7 Feb 2024, 00:46 Javier Jimenez Shaw, wrote:
> Could you set up your VMs to include those SSE
When I translate this GeoTIFF to 10% original size, it's very very slow if
OVERVIEW_LEVEL=NONE is set.
The GeoTIFF has no overviews.
export dsn="/vsicurl/https://github.com/mdsumner/cog-example/raw/main/cog/sentinel-image.tif"
## takes *forever*
gdal_translate $dsn out.tif -outsize 219 226 -oo
May I please ask for assistance with this code? I'm trying to close a
dataset with subdatasets and then reopen if the requested subdataset is
found.
It segfaults for all but the last subdataset name ... and I'm at a loss for
what I'm doing wrong.
Thank you.
Code below, and this gist documents
L_OF_RASTER, nullptr, nullptr, nullptr);
> CPLFree(pszSubdatasetSource);
> break;
>}
>else {
> CPLFree(pszSubdatasetSource);
>}
> }
>}
>
>poSrcDS->ReleaseRef();
>return 1;
> }
>
> Even
>
>
I'm getting Illegal instruction / core dumped on ogrinfo of a directory:
ogr2ogr /tmp/newdir
https://github.com/SymbolixAU/geojsonsf/raw/master/inst/examples/geo_melbourne.geojson
-f "ESRI Shapefile"
ogrinfo /tmp/newdir/
Illegal instruction (core dumped)
I've worked back through some docker
> On Sat, Feb 3, 2024 at 12:46 PM Even Rouault
> wrote:
>
>> Michael,
>>
>> I'm wondering if there might not be something wrong with your build or
>> runtime environment. Or there's something subtle, because that works fine
>> for me with my dev build or in the ghcr.
just to follow up, I got it all working in latest GDAL:
script at
https://github.com/mdsumner/cog-example/blob/main/gti/cop90.py
creates dsn
/vsicurl/https://github.com/mdsumner/cog-example/raw/main/gti/cop90.gti.fgb
which works nicely, thanks!
Cheers, Mike
On Tue, Jan 30, 2024 at 10:57
I would start with
gdalwarp out.tif
set -ts to something small at first to get a visual, like -ts 1024 0 (the
y zero means the aspect ratio is figured out sanely from the x size). With
no arguments you get the best-preserving grid from all resolved inputs; set
some or all of -t_srs -tr -te -ts for
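To illustrate the `-ts 1024 0` behaviour, here's a sketch of the height calculation (the exact rounding GDAL applies is my assumption; this just shows the aspect-preserving idea):

```python
def warp_height_for_width(src_xsize, src_ysize, target_width):
    """Sketch of the height `gdalwarp -ts W 0` would pick: the value
    that preserves the source aspect ratio (rounding is an assumption)."""
    return max(1, round(src_ysize * target_width / src_xsize))

print(warp_height_for_width(36000, 17999, 1024))  # 512 for a near-2:1 grid
```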
>>> ==704== HEAP SUMMARY:
>>> ==704== in use at exit: 25,486 bytes in 216 blocks
>>> ==704== total heap usage: 15,761 allocs, 15,545 frees, 2,390,169 bytes
>>> allocated
>>> ==704==
>>> ==704== LEAK SUMMARY:
>>> ==704==def
>>>> ==704== Copyright (C) 2002-2017, and GNU GPL'd, by Julian Seward et al.
>>>> ==704== Using Valgrind-3.18.1 and LibVEX; rerun with -h for copyright
>>>> info
>>>> ==704== Command: ogrinfo /tmp/newdir
>>>> ==704==
>>>> INFO: Open of `/tmp/ne
Awesome, thanks Even, I'm having fun with this one.
For anyone interested I created Python to parse the OpenTopography COP90
VRT (I have to wget it locally as I don't know how to hit the URL for the
xml yet).
ah thanks, all very helpful - no, it's not done with master (for various
reasons), I'll follow up with details if relevant; mostly I was just
excited to get a usable workflow for the entire process.
Cheers, Mike
On Tue, 30 Jan 2024, 22:46 Even Rouault, wrote:
> Michael,
>
> You need to attach
" (type "run") to get more useful information
>
> Even
> Le 03/02/2024 à 02:35, Michael Sumner via gdal-dev a écrit :
>
> I'm getting Illegal instruction / core dumped on ogrinfo of a directory:
>
> ogr2ogr /tmp/newdir
> https://github.com/SymbolixAU/geojsonsf/ra
Hi, I'm getting some pushback on my code style. I just want the raw bytes
of a file in memory in Python, for a manageable tiny dataset.
Is there anything especially wrong with the following? It's essentially
the same as the answer here:
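Since the code in question didn't make it into the archive, here is the minimal standard-library way I know to get a file's raw bytes in memory (my own suggestion, not necessarily the style being debated):

```python
import os
import tempfile
from pathlib import Path


def file_bytes(path):
    """Read the raw bytes of a file into memory; fine for a manageable
    tiny dataset (the whole file is held in RAM)."""
    return Path(path).read_bytes()


# demo with a throwaway file
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"GDAL")
    name = f.name
print(file_bytes(name))  # b'GDAL'
os.remove(name)
```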
Or a grouping function that returned the cell index for neighbours and the
weighting involved in whatever calculation summary is wanted.
Maybe the warper could return this as a starting point, rather than doing
the "task at hand"?
On Wed, Apr 24, 2024 at 8:51 PM Even Rouault via gdal-dev
not read 4029 bytes
>
> gdalinfo failed - unable to open '/vsicurl/
> https://login:pas...@n5eil01u.ecs.nsidc.org/AMSA/AU_SI12.001/2012.07.02/AMSR_U2_L3_SeaIce12km_B04_20120702.he5
> '.
>
> Thanks
>
> Joaquim
>
> *From:* gdal-dev *On Behalf
On Fri, Apr 26, 2024 at 5:37 AM lefsky--- via gdal-dev <
gdal-dev@lists.osgeo.org> wrote:
> I'd like to have a version of gdal2tiles that handles image types other
> than uint8. Is there a reason why it doesn't handle those types of images?
> It appears I'd have to modify the output format from
This time, with:
https://n5eil01u.ecs.nsidc.org/ATLAS/ATL20.004/2018.10.14/ATL20-01_20181001010615_00370101_004_01.h5
NETCDF gets the geotransform (from x_grid, y_grid, which are reported in
the metadata) but no crs:
gdalinfo NETCDF:ATL20-02_20181001010615_00370101_004_01.h5 -sd 1 -nomd
Driver:
Ok, so my naive edits are clearly not enough - they still get written as
Byte, so it's deeper in the target spec and spread across a few places I'm
not ready to get across yet.
Happy to pursue in the longer term though.
Cheers, Mike
On Tue, Apr 30, 2024 at 7:00 AM Michael Sumner wrote:
>
>
> On
on gdal.org directly :
> https://gdal.org/search.html?q=gdal.warp&check_keywords=yes&area=default
>
> that leads to https://gdal.org/api/python/utilities.html#osgeo.gdal.Warp
>
> Even
> Le 29/04/2024 à 00:52, Michael Sumner via gdal-dev a écrit :
>
> I'm confused about how to browse the
I'm confused about how to browse the Python API docs. I used to just web
search "osgeo.gdal Warp", then scan down the first result page to find
"Warp(", and I had the documentation I needed.
Where is that now? (Why can't I find it, sorry - grepping the sources for
"Warp(" only results in RFC
outlier.
>
> (Someone, super bored, should author a book of horror stories about
> netCDF and HDF georeferencing. There's a lot of material. Although likely
> not to be a best seller)
>
> Even
> Le 30/04/2024 à 09:45, Michael Sumner via gdal-dev a écrit :
>
>
This HDF5 (requires Earthdata credentials, your "Authorization: Bearer
" in GDAL_HTTP_HEADERS, or equivalent) presents without geolocation
arrays.
gdalinfo "/vsicurl/https://n5eil01u.ecs.nsidc.org/AMSA/AU_SI12.001/2012.07.02/AMSR_U2_L3_SeaIce12km_B04_20120702.he5" -sd 26
Driver: HDF5Image/HDF5 Dataset
d11'27.54"S)
> Lower Right ( 395.000,-395.000) (135d 0' 0.00"E, 41d23'59.41"S)
> Center ( 0.000, 20.000) ( 0d 0' 0.01"E, 88d 8'51.76"S)
> Band 1 Block=632x1 Type=Int32, ColorInterp=Undefined
> NoData Value=0
>
>
>
>
> > And, can index be *value* in any contexts?
>
> If you use a raster with a signed data type, that could be negative
> values (assuming I understand your question)
>
>
ah I see, arbitrary integer values map to a colour: the n colours 0:(n-1)
match the ordered n values in the raster - that is
Hi, can we specify overview sizes exactly? I have this odd grid that is
36000x17999, and I consequently get yucky overview sizes:
gdal_create -outsize 36000 17999 -ot Int8 -co SPARSE_OK=YES -a_srs
EPSG:4326 -a_ullr 0 17999 0 36000 weird.tif
gdal_translate weird.tif cog.tif -of COG
gdalinfo
Excellent! Thanks for the answers.
When I've explored a bit more I might implement the overview sizes just so
we can match downstream tools (the current motivation is like-for-like
performance comparison, I haven't looked there but I think odc goes very
low level to eke out speed).
Cheers, Mike
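For anyone following along, the halving I'd expect overview sizes to come from can be sketched like this (assuming round-up halving, (n + 1) // 2 per level - my reading, and the COG driver's stopping rule may differ):

```python
def overview_sizes(xsize, ysize, min_dim=256):
    """Sketch of overview level sizes from repeated round-up halving
    (an assumption about GDAL's behaviour, not a quote of its code)."""
    sizes = []
    while max(xsize, ysize) > min_dim:
        xsize, ysize = (xsize + 1) // 2, (ysize + 1) // 2
        sizes.append((xsize, ysize))
    return sizes

# the odd 36000 x 17999 grid discussed above
print(overview_sizes(36000, 17999))
```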
Is the palette_file .txt format documented?
https://gdal.org/programs/gdalattachpct.html
It's mentioned in a few utilities, and created by tests but I couldn't find
an existing example or a description (I guessed, incorrectly at first,
leaving out the index column). I take it that it is 0-255
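In case it helps anyone searching later, here's a parser for the palette file format as I guessed it - one whitespace-separated "index R G B [A]" entry per line (this layout is an assumption; check the gdalattachpct test data for the real spec):

```python
def parse_pct(text):
    """Parse a colour table in the guessed-at format discussed above:
    'index R G B [A]' per line, '#' lines treated as comments
    (both assumptions, not a documented spec)."""
    table = {}
    for line in text.splitlines():
        parts = line.split()
        if not parts or parts[0].startswith("#"):
            continue
        table[int(parts[0])] = tuple(int(v) for v in parts[1:])
    return table

print(parse_pct("0 0 0 0\n1 255 0 0\n255 255 255 255"))
```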
> For that particular file, I see that the "feature_id" variable
> (corresponding to the "feature_id" dimension) has a cf_role =
> "timeseries_id" attribute, and that the global metadata has a
> featureType = "timeSeries" attribute. So given
>
>
Oh, my bad. It's documented to split into SDS for multiple CRS.
Ouch, that's not how I thought it was working, but it makes sense.
Thanks, Mike
On Thu, Apr 4, 2024 at 1:26 AM Michael Sumner wrote:
> this works for me, there are ten items in the filelist:
>
> gdalinfo "STACIT:\"
>
this works for me, there are ten items in the filelist:
gdalinfo "STACIT:\"https://earth-search.aws.element84.com/v1/search?collections=sentinel-2-c1-l2a&bbox=0,0,10,10&datetime=2023-01-01T00:00:00Z/2023-12-31T23:59:59Z\":asset=visual" -oo MAX_ITEMS=10
and with 20 it's also fine, filelist of 20.
but, with
Here's an (ahem) extremely important discussion on the prospects for xarray
to extend from the coordinates-only model (like that of netcdf) for
georeferencing:
https://discourse.pangeo.io/t/example-which-highlights-the-limitations-of-netcdf-style-coordinates-for-large-geospatial-rasters/
I'm
small relative epsilon, like 1e-8, is
> sufficient.
>
> Even
>
>
> Le 03/04/2024 à 22:56, Michael Sumner via gdal-dev a écrit :
>
> Here's an (ahem) extremely important discussion on the prospects for
> xarray to extend from the coordinates-only model (like that of n
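A sketch of the relative-epsilon comparison suggested above, for deciding whether two shared-vertex coordinates are "the same" (the fallback near zero is my own addition, not from the thread):

```python
def coords_equal(a, b, rel_eps=1e-8):
    """Compare coordinates with a small relative epsilon, like 1e-8:
    equal when the difference is within rel_eps of the larger magnitude
    (falls back to an absolute comparison near zero)."""
    return abs(a - b) <= rel_eps * max(abs(a), abs(b), 1.0)

print(coords_equal(145.000000001, 145.000000002))  # True
print(coords_equal(145.0, 145.1))                  # False
```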
This source has an array on 'feature_id' with 2729077 values, with various
fields
elevation, longitude, latitude, qBtmVertRunoff, qBucket, etc
'/vsis3/noaa-nwm-retro-v2.0-pds/full_physics/2017/20170401.CHRTOUT_DOMAIN1.comp'
It is accessible via the mdim api.
Structurally it is basically a
well actually, I think what I'm asking for is the intended behaviour, but
there's an error.
Is it meant to detect sets of variables on 1D dimensions and present them
as layers? That's what would make sense to me.
Still exploring.
Cheers, Mike
On Tue, Apr 2, 2024 at 5:36 AM Michael Sumner
Hello, I have this process to convert a URL to a MEM dataset.
It's not the target data but it's representative, a netcdf with band-level
and dataset-level metadata. We can clear the dataset level with
COPY_SRC_MDD, but I can't see how to do that for the band level. We're
hoping to keep this
This data source has an odd georeferencing, it's a 36000x17999 raster in
the extent -179.995 -89.995 180.005 89.995.
vrt://NETCDF:/vsicurl/
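The georeferencing isn't so odd once the geotransform is derived: the extent divides evenly by the raster size. A sketch of the GDAL-style geotransform arithmetic (my illustration, using the numbers above):

```python
def geotransform(xmin, ymin, xmax, ymax, xsize, ysize):
    """GDAL-style geotransform (ulx, xres, 0, uly, 0, -yres) derived
    from an extent and a raster size."""
    xres = (xmax - xmin) / xsize
    yres = (ymax - ymin) / ysize
    return (xmin, xres, 0.0, ymax, 0.0, -yres)

# the 36000 x 17999 grid on extent -179.995 -89.995 180.005 89.995
gt = geotransform(-179.995, -89.995, 180.005, 89.995, 36000, 17999)
print(gt)  # both resolutions come out at 0.01
```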
I forgot to mention that it needs earthdata credentials set up, basically
your "Authorization: Bearer " in GDAL_HTTP_HEADERS or similar
config.
https://urs.earthdata.nasa.gov/documentation/for_users/user_token
You can't download or stream these files without that set or logging in
(the file used
r more
> than 360 degrees of longitude. This value, when computed, is passed as a
> hint OGRCoordinateTransformation so that it can post-correct longitudes to
> apply a +/- 360 degree offset, to be in the range of the source dataset.
> I've relaxed the sanity check to allow slightly more th
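The +/- 360 degree post-correction described above can be sketched as a simple wrap into the source dataset's longitude range (a toy version; the real OGRCoordinateTransformation hint logic is richer):

```python
def wrap_lon(lon, west=-180.0):
    """Post-correct a longitude by +/- 360 so it falls in
    [west, west + 360); a toy version of the offset hint
    described above."""
    return (lon - west) % 360.0 + west

print(wrap_lon(190.0))   # -170.0
print(wrap_lon(-190.0))  # 170.0
```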