[ 
https://issues.apache.org/jira/browse/SPARK-40412?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yuming Wang resolved SPARK-40412.
---------------------------------
    Resolution: Invalid

Spark SQL does not support {{limit n, m}}. Please contact Huawei Cloud support.
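For engines that do accept MySQL/Hive-style {{limit n, m}}, the reporter's first two queries also show that paging without an explicit ORDER BY is nondeterministic. A portable, deterministic rewrite uses ROW_NUMBER(); this is a sketch against an in-memory SQLite table named t (a hypothetical stand-in for the reporter's table), not Spark itself:

```python
import sqlite3

# Hypothetical table for illustration; the original issue's table is unknown.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER)")
conn.executemany("INSERT INTO t VALUES (?)", [(i,) for i in range(1, 41)])

# MySQL/Hive-style `LIMIT 10, 20` means "skip 10 rows, take 20".
# Rewritten with ROW_NUMBER() so the page is well defined even when the
# query is wrapped in a subquery: the explicit ORDER BY fixes the order.
rows = conn.execute("""
    SELECT id FROM (
        SELECT id, ROW_NUMBER() OVER (ORDER BY id) AS rn FROM t
    ) WHERE rn BETWEEN 11 AND 30
""").fetchall()
print([r[0] for r in rows])  # rows 11 through 30
```

The same ROW_NUMBER() pattern runs unchanged in Spark SQL 2.4, which supports window functions but not the {{limit n, m}} syntax.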

> limit(x,y) + subquery causes data loss and out-of-order results
> ----------------------------
>
>                 Key: SPARK-40412
>                 URL: https://issues.apache.org/jira/browse/SPARK-40412
>             Project: Spark
>          Issue Type: Bug
>          Components: Shuffle
>    Affects Versions: 2.4.5
>         Environment: hive on spark
> hive 3.1.0
> spark 2.4.5
>            Reporter: FengJia
>            Priority: Major
>              Labels: hiveonspark, limit
>
> select * 
> from(
> select * from
> table
> limit 10,20
> )
> The result has only 10 rows, they are not rows 11 through 20, and the order is wrong.
>  
> select * from
> table
> limit 10,20
> The result has 20 rows, in order from row 11 to row 30.
> select * 
> from(
> select * from
> table
> order by id
> limit 10,20
> )
> The result has 20 rows, and the order is also row 11 to row 30.
>  



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
