[ 
https://issues.apache.org/jira/browse/SPARK-27767?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16865912#comment-16865912
 ] 

Dylan Guedes edited comment on SPARK-27767 at 6/17/19 8:02 PM:
---------------------------------------------------------------

[~smilegator] by the way, I just checked and there is a (minor) difference: 
when you use `range()` and alias it as a sub-query called `x`, for instance, 
the default name for the column becomes `x.id` instead of just `x`, which is 
the behaviour in Postgres. For instance:

{code:sql}
SELECT * FROM range(-32766, -32764) x;
{code}
In Spark, it looks like you have to reference these values as `x.id`, whereas 
in Postgres you can refer to them as just `x`. 
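
To make the difference concrete, here is a sketch (the Spark side uses the column name `id` produced by `range()`; the Postgres side assumes `generate_series`, its counterpart to `range()`):

{code:sql}
-- Spark SQL: the generated column is named `id`, so it is qualified as x.id
SELECT x.id FROM range(-32766, -32764) x;

-- Postgres: the alias names the single output column, so plain `x` works
SELECT x FROM generate_series(-32766, -32764) x;
{code}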

EDIT: By the way, this call also does not work:

{code:sql}
SELECT range(1, 100) OVER () FROM empsalary
{code}
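
For comparison, in Postgres no window clause is needed here: `generate_series` is a set-returning function that can appear directly in the SELECT list. A sketch of the Postgres-side call (assuming the same `empsalary` table):

{code:sql}
-- Postgres: set-returning function in the SELECT list, no OVER () clause;
-- each row of empsalary is paired with every value of the series
SELECT generate_series(1, 100) FROM empsalary;
{code}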




> Built-in function: generate_series
> ----------------------------------
>
>                 Key: SPARK-27767
>                 URL: https://issues.apache.org/jira/browse/SPARK-27767
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Xiao Li
>            Priority: Major
>
> [https://www.postgresql.org/docs/9.1/functions-srf.html]
> generate_series(start, stop): Generate a series of values, from start to stop 
> with a step size of one
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
