OK, I have already set mine up:

  <property>
    <name>hive.limit.optimize.fetch.max</name>
    <value>50000</value>
    <description>
      Maximum number of rows allowed for a smaller subset of data for
      simple LIMIT, if it is a fetch query.
      Insert queries are not restricted by this limit.
    </description>
  </property>
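For reference, the same property can also be set per session without editing hive-site.xml — a sketch, assuming a Hive/Spark Thrift Server session (e.g. via beeline). Note that this fetch cap only applies when the LIMIT optimization itself is switched on via hive.limit.optimize.enable:

```sql
-- Per-session override (sketch): cap simple fetch LIMIT queries at 50K rows.
SET hive.limit.optimize.enable=true;
SET hive.limit.optimize.fetch.max=50000;
```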

I am surprised that yours was missing. What did you set yours to?
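The view-based workaround mentioned further down the thread can be sketched like this (the view name is hypothetical; tableA is the example table from the thread):

```sql
-- Hypothetical capped view over the base table; users query the view
-- instead of the table, so a LIMIT is always applied.
CREATE VIEW tableA_capped AS
SELECT * FROM tableA LIMIT 50000;

SELECT * FROM tableA_capped;
```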







Dr Mich Talebzadeh



LinkedIn:
https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com


*Disclaimer:* Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.



On 2 August 2016 at 10:18, Chanh Le <giaosu...@gmail.com> wrote:

> I tried and it works perfectly.
>
> Regards,
> Chanh
>
>
> On Aug 2, 2016, at 3:33 PM, Mich Talebzadeh <mich.talebza...@gmail.com>
> wrote:
>
> OK
>
> Try that
>
> Another, more tedious, option is to create views in Hive on top of the
> tables and put a LIMIT in those views.
>
> But try that parameter first and see whether it does anything.
>
> HTH
>
>
>
>
>
> On 2 August 2016 at 09:13, Chanh Le <giaosu...@gmail.com> wrote:
>
>> Hi Mich,
>> I use Spark Thrift Server; basically it acts like Hive.
>>
>> I see that there is property in Hive.
>>
>> hive.limit.optimize.fetch.max
>>
>>    - Default Value: 50000
>>    - Added In: Hive 0.8.0
>>
>> Maximum number of rows allowed for a smaller subset of data for simple
>> LIMIT, if it is a fetch query. Insert queries are not restricted by this
>> limit.
>>
>>
>> Is that related to the problem?
>>
>>
>>
>>
>> On Aug 2, 2016, at 2:55 PM, Mich Talebzadeh <mich.talebza...@gmail.com>
>> wrote:
>>
>> This is a classic problem on any RDBMS.
>>
>> Set a limit on the number of rows returned through JDBC, for example a
>> maximum of 50K rows.
>>
>> What is your JDBC connection going to? Meaning which RDBMS if any?
>>
>> HTH
>>
>>
>>
>>
>> On 2 August 2016 at 08:41, Chanh Le <giaosu...@gmail.com> wrote:
>>
>>> Hi everyone,
>>> I setup STS and use Zeppelin to query data through JDBC connection.
>>> A problem we are facing is that users often forget to put a LIMIT in
>>> the query, which causes the cluster to hang.
>>>
>>> SELECT * FROM tableA;
>>>
>>> Is there any way to configure a default limit?
>>>
>>>
>>> Regards,
>>> Chanh
>>
>>
>>
>>
>
>
