[ 
https://issues.apache.org/jira/browse/SPARK-19451?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Xiao Li updated SPARK-19451:
----------------------------
    Component/s:     (was: Spark Core)
                 SQL

> rangeBetween method should accept Long value as boundary
> --------------------------------------------------------
>
>                 Key: SPARK-19451
>                 URL: https://issues.apache.org/jira/browse/SPARK-19451
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.6.1, 2.0.2
>            Reporter: Julien Champ
>            Assignee: Jiang Xingbo
>             Fix For: 2.2.1, 2.3.0
>
>
> Hi there,
> There seems to be a major limitation in Spark window functions and the 
> rangeBetween method.
> If I have the following code:
> {code:title=Example|borderStyle=solid}
>     val tw =  Window.orderBy("date")
>       .partitionBy("id")
>       .rangeBetween( from , 0)
> {code}
> Everything works as long as the *from* value is not too large, even though 
> the rangeBetween() method accepts Long parameters.
> But if I set *from* to *-2160000000L*, it does not work!
> It is probably caused by this part of the between() method of the 
> WindowSpec class, called by rangeBetween():
> {code:title=between() method|borderStyle=solid}
>     val boundaryStart = start match {
>       case 0 => CurrentRow
>       case Long.MinValue => UnboundedPreceding
>       case x if x < 0 => ValuePreceding(-start.toInt)
>       case x if x > 0 => ValueFollowing(start.toInt)
>     }
> {code}
> (note the *.toInt*: the Long boundary is truncated to an Int)
> Does anybody know if there's a way to solve / patch this behavior?
> Any help will be appreciated.
> Thanks
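
For reference, the truncation the reporter points at can be reproduced outside Spark entirely; a minimal standalone sketch (the object name is illustrative, not Spark code):

```scala
// Long.toInt keeps only the low 32 bits, so any value outside the
// Int range [-2147483648, 2147483647] silently wraps around.
object RangeBoundaryOverflow {
  def main(args: Array[String]): Unit = {
    val from = -2160000000L      // below Int.MinValue, valid as a Long
    val truncated = from.toInt   // wraps to a *positive* Int: 2134967296
    println(s"$from.toInt = $truncated")
    // between() would then build ValuePreceding(-truncated), i.e. a
    // boundary with the wrong sign and magnitude instead of the one asked for.
  }
}
```

This is why boundaries within Int range appear to work while larger Long values produce nonsense: no error is raised, the value just wraps.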



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
