[
https://issues.apache.org/jira/browse/PHOENIX-3311?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15696880#comment-15696880
]
Kalyan commented on PHOENIX-3311:
---------------------------------
Thanks Josh Mahonin. Both solutions are duplicates, so we can ignore this one.
I tested with Spark 1.6 on Scala 2.10 and Spark 2.0 on Scala 2.11; the code works in both.
We need to prepare two jars, one supporting the old versions and one the new.
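As a rough sketch of the two-jar idea (Phoenix itself builds with Maven, so this is only an sbt-flavored illustration; the project names, directories and version strings below are assumptions, not the actual Phoenix build layout):

// Hypothetical cross-build: one artifact per Spark/Scala pair.
lazy val phoenixSpark16 = (project in file("phoenix-spark"))
  .settings(
    name := "phoenix-spark",
    scalaVersion := "2.10.6",
    // Spark 1.6.x still exposes org.apache.spark.Logging.
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.3" % "provided"
  )

lazy val phoenixSpark20 = (project in file("phoenix-spark2"))
  .settings(
    name := "phoenix-spark2",
    scalaVersion := "2.11.8",
    // Spark 2.0.x removed the public Logging trait, so this module needs
    // sources that no longer reference it.
    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0" % "provided"
  )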
> phoenix-spark("4.8.0-HBase-1.2") is not compatible with spark 2.0
> -----------------------------------------------------------------
>
> Key: PHOENIX-3311
> URL: https://issues.apache.org/jira/browse/PHOENIX-3311
> Project: Phoenix
> Issue Type: Bug
> Affects Versions: 4.8.0
> Reporter: Kui Xiang
> Assignee: Kalyan
> Priority: Critical
> Labels: phoenix, spark2.0.0
>
> sbt:
> libraryDependencies += "org.apache.phoenix" % "phoenix-spark" %
> "4.8.0-HBase-1.2"
> scala:
> import org.apache.phoenix.spark._
> will fail to compile with errors like
> [error] missing or invalid dependency detected while loading class file
> 'ProductRDDFunctions.class'.
> [error] Could not access type Logging in package org.apache.spark,
> [error] because it (or its dependencies) are missing. Check your build
> definition for
> [error] missing or conflicting dependencies. (Re-run with `-Ylog-classpath`
> to see the problematic classpath.)
> [error] A full rebuild may help if 'ProductRDDFunctions.class' was compiled
> against an incompatible version of org.apache.spark.
> [error] one error found
> [debug] Compilation failed (CompilerInterface)
> [error] (compile:compileIncremental) Compilation failed
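For reference, a minimal build.sbt sketch that reproduces the report (the version strings are illustrative assumptions): against Spark 2.0 the import above fails because org.apache.spark.Logging is no longer public, while the same project compiles against Spark 1.6 with Scala 2.10.

scalaVersion := "2.11.8"  // fails to compile; use "2.10.6" with Spark 1.6.x to compile

libraryDependencies ++= Seq(
  // phoenix-spark 4.8.0's ProductRDDFunctions was built against Spark 1.x,
  // where org.apache.spark.Logging was still public.
  "org.apache.spark"   %% "spark-core"    % "2.0.0" % "provided",
  "org.apache.phoenix" %  "phoenix-spark" % "4.8.0-HBase-1.2"
)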