bossenti opened a new issue, #1322: URL: https://github.com/apache/streampipes/issues/1322
### Apache StreamPipes version

dev (current development state)

### Affected StreamPipes components

Backend

### What happened?

When querying the data lake API for measurements with the `columns` parameter (which filters the returned data to the selected columns), the following behavior is observed. The queried data lake measure contains data from the machine data simulator in `density` mode.

1) Providing invalid columns

If we query invalid columns, e.g. `test1234`, via `http://localhost:80/streampipes-backend/api/v4/datalake/measurements/test?columns=test1234`, the returned result does not contain any records -> expected ✅

2) Providing valid columns

If we query valid columns, e.g. `time` and `density`, via `http://localhost:80/streampipes-backend/api/v4/datalake/measurements/test?columns=time,density`, the returned result contains only the given columns -> expected ✅

3) Providing a combination of valid and invalid columns

If we query a combination of valid and invalid columns, e.g. `time`, `density`, and `test1234`, via `http://localhost:80/streampipes-backend/api/v4/datalake/measurements/test?columns=time,density,test1234`, the returned result contains the data of the valid columns plus the invalid columns filled with `null`:

```
[
  ['2023-02-19T15:44:57.831Z', 43.70249938964844, None],
  ['2023-02-19T15:44:58.839Z', 43.871578216552734, None],
  ...
]
```

This behavior is independent of the order of the column names.

### How to reproduce?

As described above

### Expected behavior

The API should return an empty data measurement, to be consistent with scenario `1)`.

### Additional technical information

_No response_

### Are you willing to submit a PR?

None
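The expected behavior can be sketched as a small column-filtering function. This is a hypothetical illustration, not StreamPipes code: the function name `filter_columns` and its signature are assumptions. It rejects the whole request when any requested column is unknown, which makes scenario `3)` consistent with scenario `1)`:

```python
# Hypothetical sketch (not the StreamPipes implementation): filter rows to the
# requested columns, returning an empty result if ANY requested column is
# unknown, instead of padding unknown columns with None.

def filter_columns(rows, available, requested):
    """Restrict each row to `requested` columns.

    `rows` is a list of lists aligned with `available` (the measure's column
    names). Returns [] if any requested column does not exist, mirroring the
    "invalid columns -> no records" behavior of scenario 1).
    """
    if not set(requested) <= set(available):
        return []  # at least one invalid column: return no records
    idx = [available.index(c) for c in requested]
    return [[row[i] for i in idx] for row in rows]


# Example with the density measure from the issue:
available = ["time", "density", "mass_flow"]
rows = [
    ["2023-02-19T15:44:57.831Z", 43.70249938964844, 3.1],
    ["2023-02-19T15:44:58.839Z", 43.871578216552734, 3.2],
]

only_valid = filter_columns(rows, available, ["time", "density"])
mixed = filter_columns(rows, available, ["time", "density", "test1234"])
```

With this behavior, `only_valid` contains the two rows restricted to `time` and `density`, while `mixed` is empty rather than containing rows padded with `None` for `test1234`.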
