[ https://issues.apache.org/jira/browse/PHOENIX-2691?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15153432#comment-15153432 ]

Hadoop QA commented on PHOENIX-2691:
------------------------------------

{color:red}-1 overall{color}.  Here are the results of testing the latest attachment
  http://issues.apache.org/jira/secure/attachment/12788562/PHOENIX-2691_v2.patch
  against master branch at commit 45a9d670bbb5e659fb967cfdbc6fc1ced43fba12.
  ATTACHMENT ID: 12788562

    {color:green}+1 @author{color}.  The patch does not contain any @author tags.

    {color:green}+1 tests included{color}.  The patch appears to include 3 new or modified tests.

    {color:green}+1 javac{color}.  The applied patch does not increase the total number of javac compiler warnings.

    {color:red}-1 javadoc{color}.  The javadoc tool appears to have generated 19 warning messages.

    {color:green}+1 release audit{color}.  The applied patch does not increase the total number of release audit warnings.

    {color:red}-1 lineLengths{color}.  The patch introduces the following lines longer than 100:
    +        conn.createStatement().execute("UPSERT INTO test1 VALUES('1', 'val', 100, 'a', ARRAY ['b'], 1, 2)");
    +        conn.createStatement().execute("UPSERT INTO test1 VALUES('2', 'val', 100, 'a', ARRAY ['b'], 3, 4)");
    +        conn.createStatement().execute("UPSERT INTO test1 VALUES('3', 'val', 100, 'a', ARRAY ['b','c'], 5, 6)");
    +        ResultSet rs = conn.createStatement().executeQuery("SELECT c, SUM(f + g) AS sumone, d, e\n" +
    +                        throw new SQLExceptionInfo.Builder(SQLExceptionCode.UNSUPPORTED_GROUP_BY_EXPRESSIONS)
    +    NO_TABLE_SPECIFIED_FOR_WILDCARD_SELECT(1057, "42Y10", "No table specified for wildcard select."),
    +    UNSUPPORTED_GROUP_BY_EXPRESSIONS(1058, "43A14", "Only a single VARBINARY, ARRAY, or nullable BINARY type may be referenced in a GROUP BY."),
    +    DEFAULT_COLUMN_FAMILY_ON_SHARED_TABLE(1069, "43A69", "Default column family not allowed on VIEW or shared INDEX."),
    +        conn.createStatement().execute("CREATE TABLE T1 (PK VARCHAR PRIMARY KEY, c1 VARCHAR, c2 VARBINARY, C3 VARCHAR ARRAY, c4 VARBINARY, C5 VARCHAR ARRAY, C6 BINARY(10)) ");
    +            assertEquals(SQLExceptionCode.UNSUPPORTED_GROUP_BY_EXPRESSIONS.getErrorCode(), e.getErrorCode());

    {color:green}+1 core tests{color}.  The patch passed unit tests.

Test results: 
https://builds.apache.org/job/PreCommit-PHOENIX-Build/259//testReport/
Javadoc warnings: 
https://builds.apache.org/job/PreCommit-PHOENIX-Build/259//artifact/patchprocess/patchJavadocWarnings.txt
Console output: 
https://builds.apache.org/job/PreCommit-PHOENIX-Build/259//console

This message is automatically generated.

> Exception while unpacking resultset containing VARCHAR ARRAY of unspecified 
> length
> ----------------------------------------------------------------------------------
>
>                 Key: PHOENIX-2691
>                 URL: https://issues.apache.org/jira/browse/PHOENIX-2691
>             Project: Phoenix
>          Issue Type: Bug
>    Affects Versions: 4.6.0, 4.7.0
>            Reporter: Nick Dimiduk
>            Assignee: James Taylor
>             Fix For: 4.7.0
>
>         Attachments: 2691.00.patch, PHOENIX-2691.patch, PHOENIX-2691_v2.patch
>
>
> I have an aggregation query that consistently throws either an
> IllegalArgumentException or an OutOfMemoryError, always at the same place.
> Either way, the stack trace is nearly identical:
> {noformat}
> java.lang.IllegalArgumentException
>         at java.nio.Buffer.position(Buffer.java:244)
>         at org.apache.phoenix.schema.types.PArrayDataType.createPhoenixArray(PArrayDataType.java:1098)
>         at org.apache.phoenix.schema.types.PArrayDataType.toObject(PArrayDataType.java:339)
>         at org.apache.phoenix.schema.types.PVarcharArray.toObject(PVarcharArray.java:65)
>         at org.apache.phoenix.schema.types.PDataType.toObject(PDataType.java:985)
>         at org.apache.phoenix.compile.ExpressionProjector.getValue(ExpressionProjector.java:75)
>         at org.apache.phoenix.jdbc.PhoenixResultSet.getString(PhoenixResultSet.java:601)
>         at sqlline.Rows$Row.<init>(Rows.java:183)
>         at sqlline.BufferedRows.<init>(BufferedRows.java:38)
>         at sqlline.SqlLine.print(SqlLine.java:1650)
>         at sqlline.Commands.execute(Commands.java:833)
>         at sqlline.Commands.sql(Commands.java:732)
>         at sqlline.SqlLine.dispatch(SqlLine.java:808)
>         at sqlline.SqlLine.begin(SqlLine.java:681)
>         at sqlline.SqlLine.start(SqlLine.java:398)
>         at sqlline.SqlLine.main(SqlLine.java:292)
> {noformat}
> or
> {noformat}
> java.lang.OutOfMemoryError: Java heap space
>         at java.lang.reflect.Array.newArray(Native Method)
>         at java.lang.reflect.Array.newInstance(Array.java:75)
>         at org.apache.phoenix.schema.types.PArrayDataType.createPhoenixArray(PArrayDataType.java:1091)
>         at org.apache.phoenix.schema.types.PArrayDataType.toObject(PArrayDataType.java:339)
>         at org.apache.phoenix.schema.types.PVarcharArray.toObject(PVarcharArray.java:65)
>         at org.apache.phoenix.schema.types.PDataType.toObject(PDataType.java:985)
>         at org.apache.phoenix.compile.ExpressionProjector.getValue(ExpressionProjector.java:75)
>         at org.apache.phoenix.jdbc.PhoenixResultSet.getString(PhoenixResultSet.java:601)
>         at sqlline.Rows$Row.<init>(Rows.java:183)
>         at sqlline.BufferedRows.<init>(BufferedRows.java:38)
>         at sqlline.SqlLine.print(SqlLine.java:1650)
>         at sqlline.Commands.execute(Commands.java:833)
>         at sqlline.Commands.sql(Commands.java:732)
>         at sqlline.SqlLine.dispatch(SqlLine.java:808)
>         at sqlline.SqlLine.begin(SqlLine.java:681)
>         at sqlline.SqlLine.start(SqlLine.java:398)
>         at sqlline.SqlLine.main(SqlLine.java:292)
> {noformat}
> Stepping through with the debugger, it appears the {{VARCHAR ARRAY}} value is
> not parsed correctly. The special case of two nulls is not accounted for in
> {{RowKeyValueAccessor#getLength()}}. This results in the offsets being
> slightly wrong, so the value cannot be materialized correctly. Depending
> on what's in the adjacent bytes, either an invalid {{position}} call is made,
> resulting in the {{IllegalArgumentException}}, or
> {{PArrayDataType.createPhoenixArray}} attempts to allocate an array of
> ridiculous size, resulting in the OOM.
> It appears the types of the columns returned in the {{KeyValue}} in the 
> {{Tuple currentRow}} are ordered {{VARCHAR}}, {{VARCHAR ARRAY}}, {{INTEGER}}. 
> I can share the KeyValue bytes with you offline if that will help in 
> debugging.
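
The offset-shift failure mode described in the report can be sketched generically. The class and helper below ({{OffsetDemo}}, {{readElementCount}}) are hypothetical and do NOT reflect Phoenix's actual serialization format; they only illustrate why a slightly wrong offset produces exactly the two symptoms in the stack traces: an {{IllegalArgumentException}} from {{Buffer.position}} when the offset lands past the buffer limit, or a huge allocation when it lands in-bounds but reads a garbage element count.

```java
import java.nio.ByteBuffer;

// Hypothetical sketch (not Phoenix's wire format): a reader that trusts a
// computed offset into a serialized byte[]. If the offset calculation
// miscounts (e.g. a two-null run), the reader either positions past the
// buffer limit or reads an arbitrary value as the element count.
public class OffsetDemo {

    // Read a 4-byte element count at the given offset.
    static int readElementCount(byte[] serialized, int offset) {
        ByteBuffer buf = ByteBuffer.wrap(serialized);
        buf.position(offset); // throws IllegalArgumentException if offset > limit
        return buf.getInt();  // garbage value if the offset is merely shifted
    }

    public static void main(String[] args) {
        byte[] serialized = new byte[8];
        // Suppose the real count (3) lives at offset 4 (big-endian int).
        ByteBuffer.wrap(serialized).putInt(4, 3);

        System.out.println(readElementCount(serialized, 4)); // correct offset -> 3

        // A shifted-but-in-bounds offset yields an arbitrary count, which a
        // createPhoenixArray-style allocator would use as the array size:
        // large garbage here means OutOfMemoryError.
        System.out.println(readElementCount(serialized, 0));

        // An offset shifted past the end reproduces the first stack trace.
        try {
            readElementCount(serialized, 100);
        } catch (IllegalArgumentException e) {
            System.out.println("IllegalArgumentException from Buffer.position");
        }
    }
}
```

Which of the two errors surfaces depends only on where the shifted offset lands relative to the buffer limit, which matches the report's observation that adjacent bytes determine the symptom.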



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
