Hello Rob,

As you may know, I have long experience with geospatial data, and I'm now
investigating Spark... So I'd be very interested in further answers, and also
in helping move this great idea forward!

For instance, I'd say that implementing classical geospatial algorithms
(classification, feature extraction, pyramid generation, and so on) as a
geo-extension library for Spark would be a natural fit, and it would be easier
to build on top of the GeoTrellis API.
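
Just to make that concrete, here is a very rough sketch of what I have in
mind (the Tile class and the toy per-cell thresholding are placeholders I'm
making up for the example, not the actual GeoTrellis API):

  import org.apache.spark.SparkContext

  // Placeholder for a raster tile: a flat array of cell values plus its
  // dimensions. The real GeoTrellis raster types would take its place.
  case class Tile(cols: Int, rows: Int, cells: Array[Double]) {
    def mapCells(f: Double => Double): Tile = copy(cells = cells.map(f))
  }

  object RasterOnSpark {
    def main(args: Array[String]): Unit = {
      val sc = new SparkContext("local[*]", "raster-on-spark")

      // Imagine these tiles were cut from one large raster.
      val tiles = Seq(
        Tile(2, 2, Array(1.0, 2.0, 3.0, 4.0)),
        Tile(2, 2, Array(5.0, 6.0, 7.0, 8.0)))

      // Each tile is processed independently across the cluster: a toy
      // per-cell classification (thresholding), standing in for real
      // geospatial algorithms.
      val classified = sc.parallelize(tiles)
        .map(_.mapCells(v => if (v > 4.0) 1.0 else 0.0))

      classified.collect().foreach(t => println(t.cells.mkString(",")))
      sc.stop()
    }
  }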

My only question, for now, is that GeoTrellis has its own notion of lineage
and Spark does as well, so some harmonization work may be needed to
serialize and schedule them. Maybe Pickles could help with the serialization
part...
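
(I haven't tried Pickles with Spark myself; as a point of comparison, Spark
can already be told to use Kryo for its own serialization purely through
configuration, which might cover the tile/raster objects even before the
lineage question is settled. A minimal sketch, nothing GeoTrellis-specific:)

  import org.apache.spark.{SparkConf, SparkContext}

  // Switch Spark's internal serializer to Kryo so custom raster/tile
  // objects are serialized more compactly when shuffled or cached.
  val conf = new SparkConf()
    .setAppName("raster-serialization")
    .setMaster("local[*]")
    .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")

  val sc = new SparkContext(conf)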

Sorry if I've missed something (or said anything silly ^^)... I'm heading over
to the thread you mentioned now!

Looking forward ;)

Cheers
andy


On Thu, Nov 7, 2013 at 8:49 PM, Rob Emanuele <lossy...@gmail.com> wrote:

> Hello,
>
> I'm a developer on the GeoTrellis project (http://geotrellis.github.io).
> We do fast raster processing over large data sets, from web-time
> (sub-100ms) processing for live endpoints to distributed raster analysis
> over clusters using Akka clustering.
>
> There's currently discussion underway about moving to support a Spark
> backend for doing large scale distributed raster analysis. You can see the
> discussion here:
> https://groups.google.com/forum/#!topic/geotrellis-user/wkUOhFwYAvc. Any
> contributions to the discussion would be welcome.
>
> My question to the list is, is there currently any development towards a
> geospatial data story for Spark, that is, using Spark for large scale
> raster/vector spatial data analysis? Is there anyone using Spark currently
> for this sort of work?
>
> Thanks,
> Rob Emanuele
>
