[
https://issues.apache.org/jira/browse/HBASE-16179?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15527475#comment-15527475
]
Sean Busbey commented on HBASE-16179:
-------------------------------------
So you'd like us to just mandate Scala 2.10 for both 1.6 and 2.0, right?
Reading through the threads when dev@spark talks about scala versions I got the
sense that Scala 2.10 won't be long for the 2.y line. When that time comes,
what would we do? I'm not necessarily asking that we address this now, just
want to know that we've thought of a general approach to supportability. We
haven't taken a stance as a community yet on support timelines for Scala
versions, but I'd expect we'd treat them similarly to JDK versions, which would
mean we'd keep a Scala 2.10 based Spark 2 module for the remainder of HBase
2.y.
> Fix compilation errors when building hbase-spark against Spark 2.0
> ------------------------------------------------------------------
>
> Key: HBASE-16179
> URL: https://issues.apache.org/jira/browse/HBASE-16179
> Project: HBase
> Issue Type: Bug
> Components: spark
> Reporter: Ted Yu
> Assignee: Ted Yu
> Fix For: 2.0.0
>
> Attachments: 16179.v0.txt, 16179.v1.txt, 16179.v1.txt, 16179.v4.txt,
> 16179.v5.txt
>
>
> I tried building the hbase-spark module against a Spark 2.0 snapshot and got
> the following compilation errors:
> http://pastebin.com/bg3w247a
> Some Spark classes, such as DataTypeParser and Logging, are no longer
> accessible to downstream projects.
> The hbase-spark module should not depend on such classes.
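One common way to drop the dependency on Spark's now-internal Logging trait is for the downstream module to define its own thin logging mixin. The sketch below is a hypothetical illustration, not the patch attached to this issue; it uses java.util.logging to stay dependency-free, whereas a real module would more likely wrap SLF4J, as Spark's original trait did.

```scala
import java.util.logging.{Level, Logger}

// Hypothetical self-contained replacement for the removed
// org.apache.spark.Logging trait, so hbase-spark classes need not
// reach into Spark internals.
trait Logging {
  // Lazily resolve a logger named after the concrete class
  // (stripping the "$" suffix Scala adds to companion objects).
  @transient protected lazy val log: Logger =
    Logger.getLogger(getClass.getName.stripSuffix("$"))

  // By-name parameters avoid building the message string when the
  // corresponding log level is disabled.
  protected def logInfo(msg: => String): Unit =
    if (log.isLoggable(Level.INFO)) log.info(msg)

  protected def logWarning(msg: => String): Unit =
    if (log.isLoggable(Level.WARNING)) log.warning(msg)
}
```

Classes that previously mixed in org.apache.spark.Logging could switch to a trait like this with no changes at the logInfo/logWarning call sites.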
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)