Hi,
How can I read and display cloud-optimized GeoTIFFs (COGs), exported from 
Google Earth Engine (GEE) to Google Cloud Storage (GCS) buckets, in 
OpenLayers?
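For context on what I have found so far: recent OpenLayers versions ship an `ol/source/GeoTIFF` source that streams COGs directly over HTTP range requests, so no tile server is required. A minimal sketch of how that might look (the bucket name, object name, and map target id are placeholders of mine; the bucket would also need CORS configured and read access granted to the user):

```javascript
import Map from 'ol/Map';
import TileLayer from 'ol/layer/WebGLTile';
import GeoTIFF from 'ol/source/GeoTIFF';

// Hypothetical object URL; in production this would be a signed or
// otherwise authenticated URL scoped to the user's own bucket.
const cogUrl =
  'https://storage.googleapis.com/user-aoi-bucket/aoi1_S2_TCI_01Aug2020.tif';

const source = new GeoTIFF({
  sources: [{ url: cogUrl }],
});

const map = new Map({
  target: 'map', // id of the map <div> in the page
  layers: [new TileLayer({ source })],
  // The source reads projection and extent from the COG's own metadata.
  view: source.getView(),
});
```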

The web map app will serve many users with time-series imagery across many 
sites and dates. Each user's area of interest (AOI) gets its own bucket, so 
there will be many buckets, each unique to one user AOI. Each bucket will 
hold different image types (8-bit true color and false color, 32-bit NDVI, 
etc.). Individual images aren't huge (roughly 1 to 100 MB), and a user's 
bucket will hold 1-10 GB for a few years of data. A given user can access 
only the images in their own bucket, and will most often access the most 
recent images there. Each COG in the time series will follow the filename 
format Bucket_Sensor_Type_Date.tif, with image dates in 01Aug2020 format.
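Since the expected access pattern is "latest image first", the date embedded in that filename format can be parsed from a bucket listing to pick the newest object without touching any pixel data. A small sketch of that parsing (the function names and sample filenames are mine, not an established API):

```javascript
const MONTHS = {
  Jan: '01', Feb: '02', Mar: '03', Apr: '04', May: '05', Jun: '06',
  Jul: '07', Aug: '08', Sep: '09', Oct: '10', Nov: '11', Dec: '12',
};

// Parse Bucket_Sensor_Type_Date.tif, with dates like 01Aug2020.
function parseCogName(filename) {
  const m = filename.match(
    /^(.+)_([^_]+)_([^_]+)_(\d{2})([A-Za-z]{3})(\d{4})\.tif$/
  );
  if (!m) throw new Error(`unexpected COG filename: ${filename}`);
  const [, bucket, sensor, type, day, mon, year] = m;
  const month = MONTHS[mon];
  if (!month) throw new Error(`unknown month: ${mon}`);
  // ISO dates sort correctly as plain strings.
  return { bucket, sensor, type, dateIso: `${year}-${month}-${day}` };
}

// Most recent image of a given type in a bucket listing.
function latestOfType(filenames, type) {
  return filenames
    .filter((f) => parseCogName(f).type === type)
    .sort((a, b) => parseCogName(a).dateIso.localeCompare(parseCogName(b).dateIso))
    .pop();
}
```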
My plan is to run GEE scripts manually at user setup to export past years 
of data over the user's AOI, and then schedule the scripts to run once or 
twice daily to identify and export any new images for each user AOI. Once 
new COGs are exported to a user's bucket, further code runs to create or 
update the latest derived images and publish them for web mapping, making 
them available to the user.
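On the export side, Earth Engine's `Export.image.toCloudStorage` can write COGs directly via `formatOptions: {cloudOptimized: true}`, so no post-processing step is needed to make the files cloud-optimized. A sketch for the Code Editor (the image, bucket name, and AOI geometry are placeholders; a scheduled daily run would more likely use the GEE Python API than the Code Editor):

```javascript
// Placeholder AOI and image; a real script would filter an ImageCollection
// by the user's AOI and by dates newer than the last completed export.
var aoi = ee.Geometry.Rectangle([-122.1, 37.2, -122.0, 37.3]);
var image = ee.ImageCollection('COPERNICUS/S2_SR')
  .filterBounds(aoi)
  .filterDate('2020-08-01', '2020-08-02')
  .first()
  .select(['B4', 'B3', 'B2']);

Export.image.toCloudStorage({
  image: image,
  description: 'aoi1_S2_TCI_01Aug2020',
  bucket: 'user-aoi-bucket',               // one bucket per user AOI
  fileNamePrefix: 'aoi1_S2_TCI_01Aug2020', // Bucket_Sensor_Type_Date
  region: aoi,
  scale: 10,
  fileFormat: 'GeoTIFF',
  formatOptions: { cloudOptimized: true }, // write a COG directly
});
```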

Thank you,
Arun

-- 
You received this message because you are subscribed to the Google Groups 
"OpenLayers Dev" group.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/openlayers-dev/b588ed3e-25a4-4334-8026-092136f6bf4an%40googlegroups.com.
