Hi Riccardo,

I think you've run into a mismatch between Scala and Java types: a Scala
Array[Short] compiles to a primitive short[] on the JVM (the [S in your
stack trace), which can't be cast to Object[]. Could you please file a
JIRA ticket for this with all the info above?

You should be able to work around this by first converting your array
contents to java.lang.Short. I just tried this out and it worked for me:

DDL:
CREATE TABLE ARRAY_TEST_TABLE_SHORT (ID BIGINT NOT NULL PRIMARY KEY,
SHORTARRAY SMALLINT[]);

Spark:

val dataSet = List((1L, Array[java.lang.Short](1.toShort, 2.toShort, 3.toShort)))
sc.parallelize(dataSet).saveToPhoenix("ARRAY_TEST_TABLE_SHORT",
  Seq("ID", "SHORTARRAY"), zkUrl = Some("localhost"))
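If your data already lives in a Scala Array[Short], one way to box it is
to map each element through Short.box (a sketch of the conversion only;
the variable names are mine, not from your code):

val primitive: Array[Short] = Array(1.toShort, 2.toShort, 3.toShort)
// Array[Short] is a primitive short[] on the JVM, which is why the cast
// to Object[] fails. Boxing each element gives Array[java.lang.Short],
// i.e. a Short[], which *is* an Object[] and should coerce cleanly.
val boxed: Array[java.lang.Short] = primitive.map(Short.box)

You can then put `boxed` into the tuples you parallelize, as in the
example above.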


Best of luck,

Josh



On Tue, Aug 4, 2015 at 6:49 AM, Riccardo Cardin <riccardo.car...@gmail.com>
wrote:

> Hi all,
>
> I am using Phoenix version 4.5.0 and the phoenix-spark plugin to write
> into HBase an ARRAY of UNSIGNED_SMALLINT. As stated in the documentation,
> this type is mapped to the java type java.lang.Short.
>
> Using the saveToPhoenix method on an RDD and passing a Scala Array of
> Short, I obtain the following stack trace:
>
> Caused by: java.lang.ClassCastException: [S cannot be cast to
> [Ljava.lang.Object;
> at
> org.apache.phoenix.schema.types.PUnsignedSmallintArray.isCoercibleTo(PUnsignedSmallintArray.java:81)
> at
> org.apache.phoenix.expression.LiteralExpression.newConstant(LiteralExpression.java:174)
> at
> org.apache.phoenix.expression.LiteralExpression.newConstant(LiteralExpression.java:157)
> at
> org.apache.phoenix.expression.LiteralExpression.newConstant(LiteralExpression.java:144)
> at
> org.apache.phoenix.compile.UpsertCompiler$UpsertValuesCompiler.visit(UpsertCompiler.java:872)
> at
> org.apache.phoenix.compile.UpsertCompiler$UpsertValuesCompiler.visit(UpsertCompiler.java:856)
> at org.apache.phoenix.parse.BindParseNode.accept(BindParseNode.java:47)
> at
> org.apache.phoenix.compile.UpsertCompiler.compile(UpsertCompiler.java:745)
> at
> org.apache.phoenix.jdbc.PhoenixStatement$ExecutableUpsertStatement.compilePlan(PhoenixStatement.java:550)
> at
> org.apache.phoenix.jdbc.PhoenixStatement$ExecutableUpsertStatement.compilePlan(PhoenixStatement.java:538)
> at
> org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:318)
> at
> org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:311)
> at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
> at
> org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:309)
> at
> org.apache.phoenix.jdbc.PhoenixStatement.execute(PhoenixStatement.java:239)
> at
> org.apache.phoenix.jdbc.PhoenixPreparedStatement.execute(PhoenixPreparedStatement.java:173)
> at
> org.apache.phoenix.jdbc.PhoenixStatement.executeBatch(PhoenixStatement.java:1315)
>
> If I change the column type to CHAR(1) ARRAY and use an Array of
> String, the write operation succeeds.
>
> What am I doing wrong?
>
> Thanks a lot,
> Riccardo
