Yes, after removing "order by", the limit works.

2019-06-13 

lk_hadoop 



From: ShaoFeng Shi <[email protected]>
Sent: 2019-06-07 10:45
Subject: Re: Re: Re: jdbc query with limit not work
To: "user"<[email protected]>
Cc:

Because the query has an "order by"? The sorting needs to happen on the Kylin 
side, so the limit couldn't be pushed down to HBase.
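As a sketch, the query from this thread could be rewritten without the ORDER BY 
(assuming the ordering can be dropped or done on the client side) so that the 
limit can reach the storage scan:

```sql
-- Sketch: same GROUP BY query, but without ORDER BY, so the LIMIT
-- no longer forces sorting on the Kylin side before truncation.
SELECT "SH_FETCH_SALE_BASE_FACT_ALL_NEW"."GOODS_SPEC"
FROM "GJST"."SH_FETCH_SALE_BASE_FACT_ALL_NEW" "SH_FETCH_SALE_BASE_FACT_ALL_NEW"
GROUP BY "SH_FETCH_SALE_BASE_FACT_ALL_NEW"."GOODS_SPEC"
LIMIT 1000
```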


Best regards,


Shaofeng Shi 史少锋
Apache Kylin PMC
Email: [email protected]


Apache Kylin FAQ: https://kylin.apache.org/docs/gettingstarted/faq.html
Join Kylin user mail group: [email protected]
Join Kylin dev mail group: [email protected]

lk_hadoop <[email protected]> 于2019年5月28日周二 下午1:28写道:

I've found the reason: the 
GTCubeStorageQueryBase.enableStorageLimitIfPossible method changes the 
limit push-down behavior.

2019-05-28 

lk_hadoop 



From: "lk_hadoop"<[email protected]>
Sent: 2019-05-28 10:38
Subject: Re: Re: Re: jdbc query with limit not work
To: "user"<[email protected]>,"lk_hadoop"<[email protected]>
Cc:

I was using JDBC; I just added the limit clause at the end of the query SQL. 
When I run a query with aggregate functions, I can see a log line like:
storage.StorageContext:167 : Enabling limit push down: 200000 at level: 
LIMIT_ON_RETURN_SIZE
But when I query what values a dimension has, for example:
"SELECT "SH_FETCH_SALE_BASE_FACT_ALL_NEW"."GOODS_SPEC" FROM 
"GJST"."SH_FETCH_SALE_BASE_FACT_ALL_NEW" "SH_FETCH_SALE_BASE_FACT_ALL_NEW"  
GROUP BY "SH_FETCH_SALE_BASE_FACT_ALL_NEW"."GOODS_SPEC" ORDER BY 
"SH_FETCH_SALE_BASE_FACT_ALL_NEW"."GOODS_SPEC" limit 1000"
I can't see any limit push-down log.
But what I really want is the limit push-down level LIMIT_ON_SCAN.
I have configured the property:
kylin.storage.limit-push-down-enabled=true
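For reference, a sketch of how that setting sits in kylin.properties (note: 
enabling it alone does not guarantee LIMIT_ON_SCAN, since the query shape must 
also allow the limit to be applied during the storage scan):

```
# Allow Kylin to push LIMIT down to the storage (HBase) scan
kylin.storage.limit-push-down-enabled=true
```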

2019-05-28 

lk_hadoop 



From: "lk_hadoop"<[email protected]>
Sent: 2019-05-28 10:03
Subject: Re: Re: jdbc query with limit not work
To: "user"<[email protected]>
Cc:



2019-05-28 

lk_hadoop 



From: JiaTao Tao <[email protected]>
Sent: 2019-05-27 19:47
Subject: Re: jdbc query with limit not work
To: "user"<[email protected]>
Cc:

Hi
Try setting "kylin.query.max-return-rows" to a larger value (>1042201) and 
re-run your query.
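For reference, that threshold lives in kylin.properties; a sketch (the value 
below is only an example, it just needs to exceed the 1042201 rows reported in 
the error):

```
# Example value only: anything above the 1042201 rows from the error works
kylin.query.max-return-rows=2000000
```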



-- 



Regards!
Aron Tao


lk_hadoop <[email protected]> 于2019年5月27日周一 上午11:00写道:

hi, all:
    I'm using Kylin 2.6.1. When I use the JDBC Driver to connect to Kylin and 
query data, I get this error:

org.apache.kylin.rest.exception.InternalErrorException: Query returned 1042201 
rows exceeds threshold 1000000
while executing SQL: "SELECT "SH_FETCH_SALE_BASE_FACT_ALL_NEW"."GOODS_SPEC" 
FROM "GJST"."SH_FETCH_SALE_BASE_FACT_ALL_NEW" "SH_FETCH_SALE_BASE_FACT_ALL_NEW" 
 GROUP BY "SH_FETCH_SALE_BASE_FACT_ALL_NEW"."GOODS_SPEC" ORDER BY 
"SH_FETCH_SALE_BASE_FACT_ALL_NEW"."GOODS_SPEC" limit 1000"

Why?

2019-05-27


lk_hadoop 
