[ https://issues.apache.org/jira/browse/PHOENIX-2162?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14658448#comment-14658448 ]

Josh Mahonin commented on PHOENIX-2162:
---------------------------------------

Thanks for the report [~rcardin]

Changing the data types didn't work for you? Given the following DDL, what 
happens if you run the Spark code below in spark-shell?

DDL
{code}
CREATE TABLE ARRAY_TEST_TABLE_SHORT (ID BIGINT NOT NULL PRIMARY KEY, SHORTARRAY SMALLINT[]);
{code}

Spark
{code}
import org.apache.phoenix.spark._
val dataSet = List((1L, Array[java.lang.Short](1.toShort,2.toShort,3.toShort)))
sc.parallelize(dataSet).saveToPhoenix("ARRAY_TEST_TABLE_SHORT", Seq("ID","SHORTARRAY"), zkUrl = Some("localhost"))
{code}
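For context on why the snippet above uses Array[java.lang.Short] rather than Array[Short]: the ClassCastException in the stack trace is a JVM-level issue. Scala's Array[Short] compiles to a primitive short[] (JVM class name {{[S}}), which is not an Object[], while a boxed java.lang.Short[] is. A quick plain-Java sketch (hypothetical class name, no Spark or Phoenix needed) showing the distinction:

{code}
// A primitive short[] has JVM class "[S" and is NOT an Object[], which is
// exactly the failing cast in the stack trace. A boxed java.lang.Short[]
// is a reference array and IS an Object[].
public class ArrayCastDemo {
    public static void main(String[] args) {
        short[] primitive = {1, 2, 3};                     // class [S
        Short[] boxed = {(short) 1, (short) 2, (short) 3}; // class [Ljava.lang.Short;

        System.out.println(primitive.getClass().getName());          // [S
        System.out.println(boxed.getClass().getName());              // [Ljava.lang.Short;
        System.out.println(boxed instanceof Object[]);               // true
        System.out.println((Object) primitive instanceof Object[]);  // false: the failing cast
    }
}
{code}

The same holds on the Scala side, which is why boxing the elements explicitly (as in the snippet above) may sidestep the cast.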

> Exception trying to write an ARRAY of UNSIGNED_SMALLINT
> -------------------------------------------------------
>
>                 Key: PHOENIX-2162
>                 URL: https://issues.apache.org/jira/browse/PHOENIX-2162
>             Project: Phoenix
>          Issue Type: Bug
>    Affects Versions: 4.5.0
>         Environment: - Windows 7
> - Spark 1.3.1
> - Scala 2.10.5
> - HBase 1.0.1.1
>            Reporter: Riccardo Cardin
>
> I am using Phoenix version 4.5.0 and the phoenix-spark plugin to write an 
> ARRAY of UNSIGNED_SMALLINT into HBase. As stated in the documentation, this 
> type is mapped to the Java type java.lang.Short.
> Using the saveToPhoenix method on an RDD and passing a Scala Array of Short, 
> I obtain the following stack trace:
> {noformat}
> Caused by: java.lang.ClassCastException: [S cannot be cast to [Ljava.lang.Object;
>       at org.apache.phoenix.schema.types.PUnsignedSmallintArray.isCoercibleTo(PUnsignedSmallintArray.java:81)
>       at org.apache.phoenix.expression.LiteralExpression.newConstant(LiteralExpression.java:174)
>       at org.apache.phoenix.expression.LiteralExpression.newConstant(LiteralExpression.java:157)
>       at org.apache.phoenix.expression.LiteralExpression.newConstant(LiteralExpression.java:144)
>       at org.apache.phoenix.compile.UpsertCompiler$UpsertValuesCompiler.visit(UpsertCompiler.java:872)
>       at org.apache.phoenix.compile.UpsertCompiler$UpsertValuesCompiler.visit(UpsertCompiler.java:856)
>       at org.apache.phoenix.parse.BindParseNode.accept(BindParseNode.java:47)
>       at org.apache.phoenix.compile.UpsertCompiler.compile(UpsertCompiler.java:745)
>       at org.apache.phoenix.jdbc.PhoenixStatement$ExecutableUpsertStatement.compilePlan(PhoenixStatement.java:550)
>       at org.apache.phoenix.jdbc.PhoenixStatement$ExecutableUpsertStatement.compilePlan(PhoenixStatement.java:538)
>       at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:318)
>       at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:311)
>       at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
>       at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:309)
>       at org.apache.phoenix.jdbc.PhoenixStatement.execute(PhoenixStatement.java:239)
>       at org.apache.phoenix.jdbc.PhoenixPreparedStatement.execute(PhoenixPreparedStatement.java:173)
>       at org.apache.phoenix.jdbc.PhoenixStatement.executeBatch(PhoenixStatement.java:1315)
> {noformat}
> Changing the type of the column to CHAR(1) ARRAY and using an Array of 
> String, the write operation succeeds.
> I've tried forcing the use of an Array[java.lang.Short], to avoid a mismatch 
> between Scala and Java types, but I obtained the same error.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
