bgruenefeld opened a new issue #2124:
URL: https://github.com/apache/iotdb/issues/2124
I have created a data model with the following structure:
root.storagegroup.city.device0001
The time series measured by each device is its energy consumption, sampled in 15-minute intervals.
So for example:
root.storagegroup.city.device0001.consumption
There are currently 1000 devices in my data model:
root.storagegroup.city.device0001
root.storagegroup.city.device0002
...
...
root.storagegroup.city.device1000
I would like to query the time series data in a time range (3 days) for all
devices of a city with a single SQL query.
For this I use the following SQL string:
**select consumption from root.storagegroup.city where time > 1546300800000
and time < 1546560000000**
The result of this query should contain 1000 time series with 288 (3 x 96)
values each.
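As a sanity check of that expected count, here is a small sketch (the timestamps and the 15-minute interval are taken from the query above; the exact count can shift by one depending on whether the boundary timestamps fall on the strict `>`/`<` bounds):

```python
# Expected number of 15-minute data points per device over the queried range.
start_ms = 1546300800000           # time > start (exclusive bound in the query)
end_ms = 1546560000000             # time < end (exclusive bound in the query)

range_days = (end_ms - start_ms) / (24 * 60 * 60 * 1000)
points_per_day = 24 * 60 // 15     # 96 fifteen-minute intervals per day

expected = int(range_days * points_per_day)
print(range_days, points_per_day, expected)  # 3.0 96 288
```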
However, for some time series the result contains far too many values, up to
11,808.
I modified the query a little and found out that the number of
expected data points is correct for some time series and too large for
others.
for example:
**select count(consumption) from root.storagegroup.city.device0222 where
time > 1546300800000 and time < 1546560000000**
result is **288** (which is the expected amount)
but the following query
**select count(consumption) from root.storagegroup.city.device0020 where
time > 1546300800000 and time < 1546560000000**
result is **11,808**, which is far more than the expected 288
(interestingly, 41 × 288 = 11,808).
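That factor is an exact integer multiple, which would be consistent with the same 288-point window existing 41 times over for that device (a hypothesis only, e.g. repeated ingestion of the same data; not a confirmed cause):

```python
# The observed count is an exact multiple of the expected count.
observed = 11808
expected = 288

factor, remainder = divmod(observed, expected)
print(factor, remainder)  # 41 0
```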
So my question is:
How can this happen?
**Am I making a mistake in my SQL query, or is the error somewhere else?**
- OS: Windows 10
- Version 0.11.0 RC3