[ https://issues.apache.org/jira/browse/SPARK-20110?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16974177#comment-16974177 ]

hurelhuyag commented on SPARK-20110:
------------------------------------

I just faced the same problem, on Spark 2.4.4. I don't understand what the 
difference is: the two queries do the same thing. If the first one is wrong, 
then the second one should be wrong too.
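
For reference, a minimal sketch of the two queries being compared (the 
DataFrame {{df}} and the column names are assumptions taken from the report 
quoted below):

{code}
// Assumes a SparkSession `spark` and a DataFrame `df` with a struct
// column `auth` containing a timestamp field `sysEntryTimestamp`.
import org.apache.spark.sql.functions.window
import spark.implicits._

// Query 1: window directly over the nested timestamp field. On 2.1.0
// (and still on 2.4.4) this fails with the AnalysisException quoted below.
val q1 = df.groupBy(window($"auth.sysEntryTimestamp", "2 minutes")).count()

// Query 2: the same aggregation after copying the timestamp to a
// top-level column. Semantically identical, but this one succeeds.
val q2 = df
  .withColumn("sysEntryTimestamp", $"auth.sysEntryTimestamp")
  .groupBy(window($"sysEntryTimestamp", "2 minutes"))
  .count()
{code}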

> Windowed aggregation does not work when the timestamp is a nested field
> -----------------------------------------------------------------------
>
>                 Key: SPARK-20110
>                 URL: https://issues.apache.org/jira/browse/SPARK-20110
>             Project: Spark
>          Issue Type: Bug
>          Components: Input/Output
>    Affects Versions: 2.1.0
>            Reporter: Alexis Seigneurin
>            Priority: Major
>              Labels: bulk-closed
>
> I am loading data into a DataFrame with nested fields. I want to perform a 
> windowed aggregation on a timestamp from a nested field:
> {code}
>   .groupBy(window($"auth.sysEntryTimestamp", "2 minutes"))
> {code}
> I get the following error:
> {quote}
> org.apache.spark.sql.AnalysisException: Multiple time window expressions 
> would result in a cartesian product of rows, therefore they are currently 
> not supported.
> {quote}
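> For context, a minimal self-contained reproduction (the schema here is an 
> assumption for illustration; any timestamp nested inside a struct column 
> behaves the same way):
> {code}
> import org.apache.spark.sql.SparkSession
> import org.apache.spark.sql.functions.{struct, window}
>
> val spark = SparkSession.builder.master("local[*]").getOrCreate()
> import spark.implicits._
>
> // Build a DataFrame with the timestamp nested inside a struct column.
> val df = Seq(("a", java.sql.Timestamp.valueOf("2017-03-27 10:00:00")))
>   .toDF("id", "ts")
>   .withColumn("auth", struct($"ts".as("sysEntryTimestamp")))
>
> // Grouping on the nested timestamp raises the AnalysisException above.
> df.groupBy(window($"auth.sysEntryTimestamp", "2 minutes")).count().show()
> {code}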
> This works fine if I first extract the timestamp to a separate column:
> {code}
>   .withColumn("sysEntryTimestamp", $"auth.sysEntryTimestamp")
>   .groupBy(
>     window($"sysEntryTimestamp", "2 minutes")
>   )
> {code}
> Please see the whole sample:
> - batch: 
> https://databricks-prod-cloudfront.cloud.databricks.com/public/4027ec902e239c93eaaa8714f173bcfc/4683710270868386/4278399007363210/3769253384867782/latest.html
> - Structured Streaming: 
> https://databricks-prod-cloudfront.cloud.databricks.com/public/4027ec902e239c93eaaa8714f173bcfc/4683710270868386/4278399007363192/3769253384867782/latest.html


