[ https://issues.apache.org/jira/browse/HBASE-24815?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

leookok updated HBASE-24815:
----------------------------
    Description: 
*When building with the Maven command line*

mvn -Dspark.version=2.2.2 -Dscala.version=2.11.7 -Dscala.binary.version=2.11 -Dcheckstyle.skip=true -Dmaven.test.skip=true clean install

the build fails with the following error:

{color:red}[ERROR]{color} [Error] F:\hbase-connectors\spark\hbase-spark\src\main\scala\org\apache\hadoop\hbase\spark\datasources\HBaseTableScanRDD.scala:216: overloaded method value addTaskCompletionListener with alternatives:
  (f: org.apache.spark.TaskContext => Unit)org.apache.spark.TaskContext <and>
  (listener: org.apache.spark.util.TaskCompletionListener)org.apache.spark.TaskContext
 does not take type parameters
{color:red}[ERROR]{color} one error found
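
For context: the error means the call at HBaseTableScanRDD.scala:216 uses the type-parameterized addTaskCompletionListener[U] variant, which TaskContext only gained in Spark 2.4; Spark 2.2.x exposes just the two alternatives listed above. A minimal sketch of a 2.2-compatible rewrite, assuming the listener only needs to run a cleanup hook (close() here is a hypothetical stand-in for whatever the RDD actually releases):

{code:scala}
import org.apache.spark.TaskContext
import org.apache.spark.util.TaskCompletionListener

object ListenerSketch {
  // Hypothetical stand-in for the cleanup done at line 216.
  def close(): Unit = ()

  def register(context: TaskContext): Unit = {
    // Compiles only on Spark 2.4+:
    //   context.addTaskCompletionListener[Unit] { _ => close() }
    // Portable back to Spark 2.2.x: use the explicit listener overload,
    // which exists in both versions and avoids the type parameter.
    context.addTaskCompletionListener(new TaskCompletionListener {
      override def onTaskCompletion(context: TaskContext): Unit = close()
    })
  }
}
{code}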
 
*Another attempt*

mvn -Dspark.version=3.0.0 -Dscala.version=2.12.12 -Dscala.binary.version=2.12 -Dcheckstyle.skip=true -Dmaven.test.skip=true clean install

also fails, with:

{color:red}[ERROR]{color} [Error] F:\hbase-connectors\spark\hbase-spark\src\main\scala\org\apache\hadoop\hbase\spark\HBaseContext.scala:439: object SparkHadoopUtil in package deploy cannot be accessed in package org.apache.spark.deploy
{color:red}[ERROR]{color} [Error] F:\hbase-connectors\spark\hbase-spark\src\main\scala\org\apache\hadoop\hbase\spark\HBaseContext.scala:487: not found: value SparkHadoopUtil
{color:red}[ERROR]{color} two errors found
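
Not part of the original report, but the error itself points at the cause: as of Spark 3.0, org.apache.spark.deploy.SparkHadoopUtil is private[spark], so HBaseContext can no longer reference it. Assuming HBaseContext only uses it to obtain a Hadoop Configuration (an assumption; the actual usage at lines 439/487 may differ), the public SparkContext#hadoopConfiguration accessor is a possible replacement:

{code:scala}
import org.apache.hadoop.conf.Configuration
import org.apache.spark.SparkContext

object ConfSketch {
  // Before (fails on Spark 3.0, where SparkHadoopUtil is private[spark]),
  // assuming the code used something like:
  //   val conf = SparkHadoopUtil.get.conf
  // After: go through the public driver-side API instead.
  def hadoopConf(sc: SparkContext): Configuration = sc.hadoopConfiguration
}
{code}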





> hbase-connectors mvn install error
> ----------------------------------
>
>                 Key: HBASE-24815
>                 URL: https://issues.apache.org/jira/browse/HBASE-24815
>             Project: HBase
>          Issue Type: Bug
>          Components: hbase-connectors
>            Reporter: leookok
>            Priority: Blocker
>


