[ 
https://issues.apache.org/jira/browse/SPARK-18804?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15744246#comment-15744246
 ] 

Gopal Nagar edited comment on SPARK-18804 at 12/13/16 6:00 AM:
---------------------------------------------------------------

Apologies for marking this JIRA as a bug; this may not be a bug in Spark. I 
wanted to get some input on how to make the join effective, because in my case 
the job fails despite having enough resources.

Any advice on how to track down and debug the issue would help.
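
For reference, a rough diagnostic sketch (not from the original report) of the 
checks that usually come first: partition counts and join-key skew. It assumes 
the Spark 1.6 DataFrame API, that the tables are registered (temp tables or 
Hive-backed), and reuses the table/column names from the query quoted below.

    from pyspark import SparkContext
    from pyspark.sql import SQLContext

    sc = SparkContext(appName="join-diagnostics")
    sqlContext = SQLContext(sc)

    # Assumes "table1"/"table2" are visible to this SQLContext.
    t1 = sqlContext.table("table1")
    t2 = sqlContext.table("table2")

    # How many partitions each side brings into the shuffle.
    print(t1.rdd.getNumPartitions(), t2.rdd.getNumPartitions())

    # Top join-key frequencies; a few very hot keys (skew) can stall a shuffle
    # join even when the nodes still show free memory.
    t1.groupBy("col").count().orderBy("count", ascending=False).show(20)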


was (Author: gopalnaga...@gmail.com):
Apologies for marking this JIRA as a bug; this may not be a bug in Spark. I 
wanted to get some input on how to make the join effective, because in my case 
the job fails despite having enough resources.


> Join doesn't work in Spark on Bigger tables
> -------------------------------------------
>
>                 Key: SPARK-18804
>                 URL: https://issues.apache.org/jira/browse/SPARK-18804
>             Project: Spark
>          Issue Type: Question
>          Components: Input/Output
>    Affects Versions: 1.6.1
>            Reporter: Gopal Nagar
>
> Hi All,
> Spark 1.6.1 is installed on a 3-node AWS EMR cluster with 32 GB RAM and 80 GB 
> storage per node. I am trying to join two tables (1.2 GB and 900 MB) with 
> 4,607,818 and 14,273,378 rows respectively. The job runs in client mode on 
> the YARN cluster manager.
> If I put a limit of 100 in the select query it works fine, but if I try to 
> join the entire data set, the query runs for 3-4 hours and is finally 
> terminated. I can always see 18 GB free on each node.
> I have tried increasing the number of executors/cores/partitions, but it 
> still doesn't work. I have tried this in PySpark, submitted with the 
> spark-submit command, but it doesn't run. Please advise.
> Join Query 
> --------------
> SELECT * FROM table1 AS t1 JOIN table2 AS t2 ON t1.col = t2.col LIMIT 100;
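
For context, a minimal PySpark sketch (not from the report) of one commonly 
suggested variant: marking the smaller table for a broadcast join so the larger 
side is not shuffled. Table and column names are taken from the query above; 
which name maps to which size is not stated in the report, and whether ~900 MB 
really fits in executor memory on 32 GB nodes is an assumption, since the 
default spark.sql.autoBroadcastJoinThreshold is only about 10 MB (hence the 
explicit hint).

    from pyspark import SparkContext
    from pyspark.sql import SQLContext
    from pyspark.sql.functions import broadcast

    sc = SparkContext(appName="broadcast-join-sketch")
    sqlContext = SQLContext(sc)

    t1 = sqlContext.table("table1")   # assumed to be the larger (~1.2 GB) side
    t2 = sqlContext.table("table2")   # assumed to be the smaller (~900 MB) side

    # Hint Spark to ship t2 to every executor instead of shuffling t1; this
    # only helps if the broadcast side really fits in memory.
    joined = t1.join(broadcast(t2), t1["col"] == t2["col"])
    print(joined.count())   # forces the full join, unlike LIMIT 100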



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
