Hi Ashish,

Can you please share the query profile? I am curious: did you start with
hash_join disabled, or did you disable it for troubleshooting?

Gautam

On Thu, Oct 25, 2018 at 12:01 AM Ashish Pancholi <apanch...@chambal.com>
wrote:

> ---------- Forwarded message ---------
> From: Ashish Pancholi <apanch...@chambal.com>
> Date: Thu, Oct 25, 2018 at 12:21 PM
> Subject: How to resolve error- "One or more nodes ran out of memory while
> executing the query"?
> To: <user-subscr...@drill.apache.org>, <user@drill.apache.org>
>
>
>
> I am using `Apache Drill` version `1.14` on a `Windows` system and running
> Drill using the command:
>
>     sqlline.bat -u "jdbc:drill:zk=local"
>
>
>  I am trying to execute a join query on two compressed, archived
> (`.tar.gz`) CSV files.
>
> Query:
>
>     SELECT * FROM
>     dfs.`C:\Users\admin\Desktop\DRILL_FILES\csvFileParquet\TBL_MOREDATA-20180924181406.tar.gz` AS Table0
>     INNER JOIN
>     dfs.`C:\Users\admin\Desktop\DRILL_FILES\csvFileParquet\TBL_MOREDATA1-20180924181406.tar.gz` AS Table1
>     ON Table0.columns[0] = Table1.columns[0]
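>
> Selecting only the needed columns instead of `*` might reduce the memory
> the join has to carry, though I have not confirmed that it helps here. A
> sketch (the `columns[1]` references are hypothetical; only `columns[0]`
> appears in the real query above):
>
>     SELECT Table0.columns[0] AS join_key,
>            Table0.columns[1] AS t0_value,
>            Table1.columns[1] AS t1_value
>     FROM dfs.`C:\Users\admin\Desktop\DRILL_FILES\csvFileParquet\TBL_MOREDATA-20180924181406.tar.gz` AS Table0
>     INNER JOIN dfs.`C:\Users\admin\Desktop\DRILL_FILES\csvFileParquet\TBL_MOREDATA1-20180924181406.tar.gz` AS Table1
>     ON Table0.columns[0] = Table1.columns[0]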
>
>
> But an out-of-memory error occurred:
>
>     org.apache.drill.common.exceptions.UserRemoteException: RESOURCE ERROR:
>     One or more nodes ran out of memory while executing the query. Unable to
>     allocate buffer of size 131072 (rounded from 86104) due to memory limit
>     (630194176). Current allocation: 630108434
>     Fragment 0:0
>     [Error Id: 585c0644-5fd5-446e-b9b3-d48e0771eb2a on DESKTOP-SM3E3KM:31010]
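>
> Note that the memory limit in the message, 630194176 bytes, is only about
> 600 MB. One runtime knob often suggested for this class of error is
> `planner.memory.max_query_memory_per_node`, which caps (in bytes) the
> direct memory a single query's buffered operators may use per node; a
> minimal sketch, assuming the Drill 1.14 option name, and raising it only
> helps when the node actually has memory to spare:
>
>     ALTER SESSION SET `planner.memory.max_query_memory_per_node` = 4294967296;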
>
>
> To resolve the issue, I tried updating the `config\drill-env.sh` file, but
> the issue remains the same. It looks like changes to the script are not
> taking effect: I am setting the direct memory beyond the system memory
> (RAM), yet every time Drill starts up peacefully, without even complaining
> that the memory has been exceeded. My settings:
>
>     export DRILLBIT_MAX_PROC_MEM=12G
>     export DRILL_HEAP=2G
>     export DRILL_MAX_DIRECT_MEMORY=10G
>
>
> whereas my system's main memory is only 8 GB.
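>
> I wonder whether this is because `drill-env.sh` is a bash script that only
> the Unix launch scripts read, so `sqlline.bat` on Windows may never pick it
> up. As an experiment (assuming, without having confirmed it, that
> `sqlline.bat` honors the same environment variable names), the settings
> could be made in the command prompt before launching:
>
>     rem Hypothetical: set in the same cmd window that then launches Drill
>     set DRILL_HEAP=2G
>     set DRILL_MAX_DIRECT_MEMORY=6G
>     sqlline.bat -u "jdbc:drill:zk=local"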
>
> *Please help me to resolve the out-of-memory error*. I have even run the
> queries below, in order to follow the troubleshooting instructions (a check
> of their effective values follows the list), but the issue remains the same.
>
>
>    -     alter session set `planner.enable_hashagg` = false;
>    -     alter session set `planner.enable_hashjoin` = false;
>    -     alter session set `planner.width.max_per_node` = 3;
>    -     alter system set `planner.width.max_per_query` = 100;
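>
> For reference, the effective values of these options can be checked through
> Drill's `sys.options` system table (the exact columns vary by version, so
> `SELECT *` is the safest form):
>
>     SELECT * FROM sys.options WHERE name LIKE 'planner.%';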
>
> --
> Thanks
> Ashish Pancholi
>
