[ https://issues.apache.org/jira/browse/PHOENIX-942?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

James Taylor resolved PHOENIX-942.
----------------------------------
    Resolution: Not a Problem

We currently support only a hash join strategy, so there are limits on what
you can join. Try increasing your memory settings if you believe the rows
returned by the RHS table query should fit into memory.
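
For reference, the knob that caps the client-built hash cache for the RHS of
the join is phoenix.query.maxServerCacheBytes (from Phoenix's QueryServices).
Below is a minimal sketch of raising it through JDBC connection properties;
the ZooKeeper quorum "localhost", the class name, and the 1 GB value are
placeholders, and the same property can also be set in hbase-site.xml:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.util.Properties;

    public class RaiseHashCacheLimit {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            // Cap on the serialized hash cache built from the RHS query
            // before it is sent to the region servers; the default is on
            // the order of 100 MB. The 1 GB value is only illustrative.
            props.setProperty("phoenix.query.maxServerCacheBytes",
                    String.valueOf(1024L * 1024L * 1024L));
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:phoenix:localhost", props)) {
                // Execute the join query on conn as usual.
            }
        }
    }

Note that the client JVM heap must also be large enough to hold the cache
while it is being serialized: the OutOfMemoryError in the trace below is
thrown on the client, inside HashCacheClient.serialize.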

> exception in join over large dataset
> ------------------------------------
>
>                 Key: PHOENIX-942
>                 URL: https://issues.apache.org/jira/browse/PHOENIX-942
>             Project: Phoenix
>          Issue Type: Improvement
>    Affects Versions: 3.0.0
>            Reporter: saravanan
>
> I am trying to do a normal join operation on two tables, but I am getting
> an exception. My tables have 10 million and 23 million rows.
> This is my query:
>  
> select "ga__dmne","wl__dmne" from "ff_ga_main" inner join "ff_wl_main" wl on("evdt");
>  
> and the exception is:
>  
> java.sql.SQLException: Encountered exception in hash plan [0] execution.
>         at org.apache.phoenix.execute.HashJoinPlan.iterator(HashJoinPlan.java:139)
>         at org.apache.phoenix.jdbc.PhoenixStatement.executeQuery(PhoenixStatement.java:202)
>         at org.apache.phoenix.jdbc.PhoenixStatement.execute(PhoenixStatement.java:188)
>         at org.apache.phoenix.jdbc.PhoenixPreparedStatement.execute(PhoenixPreparedStatement.java:146)
>         at org.apache.phoenix.jdbc.PhoenixPreparedStatement.execute(PhoenixPreparedStatement.java:151)
>         at org.apache.phoenix.jdbc.PhoenixConnection.executeStatements(PhoenixConnection.java:207)
>         at org.apache.phoenix.util.PhoenixRuntime.executeStatements(PhoenixRuntime.java:257)
>         at org.apache.phoenix.util.PhoenixRuntime.main(PhoenixRuntime.java:201)
> Caused by: java.lang.OutOfMemoryError: Requested array size exceeds VM limit
>         at java.util.Arrays.copyOf(Arrays.java:2271)
>         at java.io.ByteArrayOutputStream.grow(ByteArrayOutputStream.java:113)
>         at java.io.ByteArrayOutputStream.ensureCapacity(ByteArrayOutputStream.java:93)
>         at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:140)
>         at java.io.DataOutputStream.write(DataOutputStream.java:107)
>         at org.apache.phoenix.util.TupleUtil.write(TupleUtil.java:148)
>         at org.apache.phoenix.join.HashCacheClient.serialize(HashCacheClient.java:101)
>         at org.apache.phoenix.join.HashCacheClient.addHashCache(HashCacheClient.java:77)
>         at org.apache.phoenix.execute.HashJoinPlan$1.call(HashJoinPlan.java:112)
>         at org.apache.phoenix.execute.HashJoinPlan$1.call(HashJoinPlan.java:107)
>         at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>         at java.lang.Thread.run(Thread.java:744)
>  
>  
> Can someone help?



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
