[ 
https://issues.apache.org/jira/browse/SPARK-12558?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Maciej Bryński updated SPARK-12558:
-----------------------------------
    Description: 
Hi,
I have the following issue when trying to use functions in a GROUP BY clause.
Example:

{code}
from pyspark.sql import HiveContext

sqlCtx = HiveContext(sc)
rdd = sc.parallelize([{'test_date': 1451400761}])
df = sqlCtx.createDataFrame(rdd)
df.registerTempTable("df")
{code}

Now, when I use a single function, it works fine.
{code}
sqlCtx.sql("select cast(test_date as timestamp) from df group by cast(test_date 
as timestamp)").collect()

[Row(test_date=datetime.datetime(2015, 12, 29, 15, 52, 41))]
{code}

When I use more than one function, I get an AnalysisException:
{code}
sqlCtx.sql("select date(cast(test_date as timestamp)) from df group by 
date(cast(test_date as timestamp))").collect()

Py4JJavaError: An error occurred while calling o38.sql.
: org.apache.spark.sql.AnalysisException: expression 'test_date' is neither present in the group by, nor is it an aggregate function. Add to group by or wrap in first() (or first_value) if you don't care which value you get.;
{code}
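
As a possible workaround (just a sketch, I haven't verified it on 1.6), computing the nested expression once under an alias in a subquery, or doing the same grouping through the DataFrame API with to_date, seems to sidestep the analyzer mismatch:

{code}
# Workaround sketch (unverified on 1.6): alias the nested expression in a
# subquery so the outer GROUP BY only has to match a plain column.
sqlCtx.sql("""
    select d
    from (select date(cast(test_date as timestamp)) as d from df) t
    group by d
""").collect()

# Same idea through the DataFrame API, grouping by to_date of the casted column.
from pyspark.sql import functions as F
df.groupBy(F.to_date(df.test_date.cast("timestamp")).alias("d")).count().collect()
{code}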

  was:
Hi,
I have the following issue when trying to use functions in a GROUP BY clause.
Example:

{code}
rdd = sc.parallelize([{'test_date': 1451400761}])
df = sqlCtx.createDataFrame(rdd)
df.registerTempTable("df")
{code}

Now, when I use a single function, it works fine.
{code}
sqlCtx.sql("select cast(test_date as timestamp) from df group by cast(test_date 
as timestamp)").collect()

[Row(test_date=datetime.datetime(2015, 12, 29, 15, 52, 41))]
{code}

When I use more than one function, I get an AnalysisException:
{code}
sqlCtx.sql("select date(cast(test_date as timestamp)) from df group by 
date(cast(test_date as timestamp))").collect()

Py4JJavaError: An error occurred while calling o38.sql.
: org.apache.spark.sql.AnalysisException: expression 'test_date' is neither present in the group by, nor is it an aggregate function. Add to group by or wrap in first() (or first_value) if you don't care which value you get.;
{code}


> AnalysisException when multiple functions applied in GROUP BY clause
> --------------------------------------------------------------------
>
>                 Key: SPARK-12558
>                 URL: https://issues.apache.org/jira/browse/SPARK-12558
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.6.0
>            Reporter: Maciej Bryński
>
> Hi,
> I have the following issue when trying to use functions in a GROUP BY clause.
> Example:
> {code}
> sqlCtx = HiveContext(sc)
> rdd = sc.parallelize([{'test_date': 1451400761}])
> df = sqlCtx.createDataFrame(rdd)
> df.registerTempTable("df")
> {code}
> Now, when I use a single function, it works fine.
> {code}
> sqlCtx.sql("select cast(test_date as timestamp) from df group by 
> cast(test_date as timestamp)").collect()
> [Row(test_date=datetime.datetime(2015, 12, 29, 15, 52, 41))]
> {code}
> When I use more than one function, I get an AnalysisException:
> {code}
> sqlCtx.sql("select date(cast(test_date as timestamp)) from df group by 
> date(cast(test_date as timestamp))").collect()
> Py4JJavaError: An error occurred while calling o38.sql.
> : org.apache.spark.sql.AnalysisException: expression 'test_date' is neither present in the group by, nor is it an aggregate function. Add to group by or wrap in first() (or first_value) if you don't care which value you get.;
> {code}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
