alamb commented on issue #4809:
URL: https://github.com/apache/arrow-datafusion/issues/4809#issuecomment-1372172978

   > Are you aware of any approaches other than the two we have so far (`time_bucket_gapfill` and `generate_series`)?
   
   I don't know of any upcoming SQL standard for this, but I didn't look hard at it either.
   
   This use case is common. It is often called bucketing with "gap filling" or "interpolation" in other SQL implementations. This type of query is not easy to express in ANSI SQL, so databases often offer some sort of SQL extension for it.
   
   Here are some example extensions I found:
   * Timescale provides `time_bucket_gapfill`. More details:
      * *Blog*: [Mind the gap: Using SQL functions for time-series analysis](https://www.timescale.com/blog/sql-functions-for-time-series-analysis/)
      * *Docs*: [Timescale Documentation | Gapfilling and interpolation](https://docs.timescale.com/timescaledb/latest/how-to-guides/hyperfunctions/gapfilling-interpolation/)
   * Vertica offers [TIMESERIES](https://www.vertica.com/docs/9.3.x/HTML/Content/Authoring/AnalyzingData/TimeSeries/UsingTimeSeriesAnalytics.htm)
   * Teradata offers [GROUP BY TIME](https://docs.teradata.com/r/Teradata-VantageTM-Time-Series-Tables-and-Operations/March-2019/Time-Series-Aggregates-and-SELECT-Extensions)
   * QuestDB offers [LATEST ON](https://questdb.io/docs/reference/sql/latest-on)
   … (I stopped searching) … 
   
   All of these extensions have two main features:
   * Some way to define regular windows of time across a contiguous interval of time, even if there are no time values in those windows
   * An interpolation policy specifying what value to use (e.g. previous value, NULL, etc.) when there is no data for a window in the database.
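   To make the two features concrete, here is a minimal Python sketch of the semantics (not DataFusion code; the function name `bucket_gapfill` and the `fill` policy names `"null"`/`"locf"` are hypothetical, loosely modeled on Timescale's NULL-fill and last-observation-carried-forward behavior):

   ```python
   from datetime import datetime, timedelta

   def bucket_gapfill(rows, start, stop, width, fill="null"):
       """Assign each (timestamp, value) row to a fixed-width bucket and emit
       one row per bucket across [start, stop), even for empty buckets.
       fill="null" leaves gaps as None; fill="locf" carries the previous
       bucket's value forward."""
       # Keep the last value observed in each bucket (a stand-in for a real
       # aggregate such as avg or last).
       by_bucket = {}
       for ts, value in rows:
           n = int((ts - start).total_seconds() // width.total_seconds())
           by_bucket[start + n * width] = value

       out, prev, t = [], None, start
       while t < stop:
           if t in by_bucket:
               prev = by_bucket[t]
               out.append((t, prev))
           else:
               # Gap: apply the interpolation policy.
               out.append((t, prev if fill == "locf" else None))
           t += width
       return out

   rows = [(datetime(2023, 1, 1, 0, 5), 1.0), (datetime(2023, 1, 1, 0, 35), 2.0)]
   filled = bucket_gapfill(rows, datetime(2023, 1, 1), datetime(2023, 1, 1, 1),
                           timedelta(minutes=15), fill="locf")
   # → values [1.0, 1.0, 2.0, 2.0]: the empty 0:15 and 0:45 buckets still
   #   appear, filled with the previous value.
   ```

   The SQL extensions above bundle the same two pieces: the bucket grid comes from the extension's window clause, and the gap values come from its interpolation functions.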


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
