Hi Florian,

I think having a pipeline that performs a "point-in-polygon check" on GPS point data, e.g., to monitor/analyze cars in cities on the fly, could be valuable in many use cases that I came across in the past.
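As a side note, the point-in-polygon check itself does not strictly require PostGIS when the polygon is already known to the pipeline element. A minimal, self-contained sketch in plain Java using the ray-casting (even-odd) rule could look like this; the class and method names are hypothetical and not part of the StreamPipes API:

```java
// Minimal point-in-polygon check via the ray-casting (even-odd) rule.
// Hypothetical sketch, not StreamPipes API; assumes a simple, non-self-
// intersecting polygon given as parallel vertex arrays.
public class PointInPolygon {

    /** Returns true if point (px, py) lies inside the polygon whose
     *  vertices are (x[i], y[i]), using the even-odd rule. */
    public static boolean contains(double[] x, double[] y, double px, double py) {
        boolean inside = false;
        for (int i = 0, j = x.length - 1; i < x.length; j = i++) {
            // Toggle for every polygon edge crossed by a horizontal ray
            // extending left from the query point.
            if ((y[i] > py) != (y[j] > py)
                    && px < (x[j] - x[i]) * (py - y[i]) / (y[j] - y[i]) + x[i]) {
                inside = !inside;
            }
        }
        return inside;
    }

    public static void main(String[] args) {
        // Unit square as a toy "geofence"
        double[] x = {0, 1, 1, 0};
        double[] y = {0, 0, 1, 1};
        System.out.println(contains(x, y, 0.5, 0.5)); // true  (inside)
        System.out.println(contains(x, y, 2.0, 0.5)); // false (outside)
    }
}
```

This only covers the case of a static polygon shipped with the PE; once geofences live in a database and change at runtime, a PostGIS-backed lookup as discussed below is the more natural fit.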
I have some questions/remarks:

1) Since we currently also provide the geo PEs as part of the Docker image containing all JVM-based algorithms ("streampipes lite" in the installer), this would introduce the necessity of having PostGIS as another mandatory external service, even though it is only used by certain geo-related pipeline elements that depend on this database to retrieve certain information, e.g., polygons. Thus, we might need to think of a concept for "bundling" pipeline elements in a more categorical way, e.g., a pre-processing bundle, a basic analytics bundle, a geo analytics bundle, an ML bundle. That, however, is a completely different topic for a new discussion or wiki entry.

2) Regarding this specific use case of a point-in-polygon check PE: I assume that the geofences checked against do not change frequently, right? So what do you think of having a pipeline element where you can "upload" the geometry information for the polygon(s) as part of the controller dialog [1]? Could this be a feasible alternative?

Other than that, I think we could add PostGIS to the CLI development setup so we have an instance for testing. However, I guess we would need a solution for the init script, as you also pointed out.

Cheers
Patrick

[1] https://github.com/apache/incubator-streampipes-extensions/blob/20927075b2264a7b059d81a3d756958499360947/streampipes-processors-transformation-jvm/src/main/java/org/apache/streampipes/processors/transformation/jvm/processor/csvmetadata/CsvMetadataEnrichmentController.java

> On 24.05.2020 at 13:11, Florian Micklich <[email protected]> wrote:
>
> Hi Patrick,
>
> By internal use I mean that the user doesn't have to specify, e.g.,
> host or db-name to use this feature. All necessary settings will be
> set internally by default.
>
> An example would be a sink called geofence. A single polygon or
> multipolygon geometry will be stored in the DB, and another PE
> extracts the geometry from the geofence table and adds it to the
> stream. The user only has to specify which geofence table to use.
>
> A scenario would be: I have a pipeline that defines my polygon as a
> geofence result. This geofence is stored in the DB.
> In another pipeline I have floating car positions, and if a car enters
> my geofence, I get a message or some other kind of operation.
>
> Another internal use would be to enrich the stream with information
> extracted from raster data. An example would be having a point and
> wanting the height value at this position. Free SRTM data is stored in
> the DB and the value can be extracted from this internal source.
>
> I scrapped the script part. The necessary settings and controls will
> be set and checked in the corresponding onInvocation method.
>
> Greetings
> Florian
>
>
>> On 23.05.2020 at 09:54 +0200, Patrick Wiener wrote:
>>
>> Hi Florian,
>>
>> What kind of scenario do you have in mind for what you referred to
>> as "internal use"? Do you want to query some information from the
>> database? Could you elaborate on that?
>>
>> In addition, I'm wondering what this init SQL script does. Does it
>> create certain databases, schemas, tables, etc.?
>>
>> Cheers
>> Patrick
>>
>>> On 22.05.2020 at 19:42, Florian Micklich <[email protected]> wrote:
>>>
>>> Hiho,
>>>
>>> I want to start implementing a new service for the StreamPipes
>>> installer called postgis which, as the name suggests, uses
>>> PostGIS [1].
>>>
>>> This service is intended for a geo sink as well as for some
>>> internal use in certain geo PEs.
>>>
>>> Therefore I already created a docker-compose file and also an
>>> initial setup script (for testing I had to comment out some parts):
>>>
>>> version: "2.0"
>>>
>>> services:
>>>   postgis:
>>>     image: postgis/postgis
>>>     ports:
>>>       - 54321:5432
>>>     environment:
>>>       POSTGRES_USER: geo_streampipes
>>>       POSTGRES_PASSWORD: [LIKEPOSTGRES]
>>>     volumes:
>>>       - ./scripts/init.sql:/docker-entrypoint-initdb.d/01_init.sql
>>> #    extra_hosts:
>>> #      - host.docker.internal:${HOST_DOCKER_INTERNAL}
>>> #    networks:
>>> #      spnet:
>>>
>>> volumes:
>>>   postgis:
>>>
>>> # networks:
>>> #   spnet:
>>> #     external: true
>>>
>>> So how can this become part of the StreamPipes installer as a
>>> service?
>>>
>>> Another option would be to create our own Docker image
>>> "streampipes_postgis" with the initial script already baked in.
>>>
>>> Our own image would be nicer and cleaner from my point of view.
>>>
>>> What do you think?
>>>
>>> Greetings
>>> Florian
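Regarding the init script discussed above: a minimal init.sql for this setup might look like the following. This is only a sketch under assumed names (a geofence table, MultiPolygon geometries in SRID 4326); none of it is an agreed-upon schema:

```sql
-- Hypothetical init.sql sketch; all table and column names are assumptions.
-- Enable the PostGIS extension in the target database.
CREATE EXTENSION IF NOT EXISTS postgis;

-- Table holding geofence geometries written by the geofence sink.
CREATE TABLE IF NOT EXISTS geofence (
    name TEXT PRIMARY KEY,
    geom GEOMETRY(MultiPolygon, 4326) NOT NULL
);

-- Example of the check another PE could run against this table:
-- does the car at (lon, lat) lie inside the geofence named 'city_center'?
-- SELECT ST_Contains(g.geom, ST_SetSRID(ST_MakePoint(8.4037, 49.0069), 4326))
-- FROM geofence g WHERE g.name = 'city_center';
```

Mounted as in the compose file above (./scripts/init.sql into /docker-entrypoint-initdb.d/), the official postgres entrypoint executes the script once when the data volume is first initialized, which may already be enough for the bundled-image variant.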
