[jira] [Commented] (SPARK-17563) Add org/apache/spark/JavaSparkListener to make Spark-2.0.0 work with Hive-2.X.X

2016-09-19 Thread Oleksiy Sayankin (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-17563?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15502698#comment-15502698
 ] 

Oleksiy Sayankin commented on SPARK-17563:
--

Found existing issue https://issues.apache.org/jira/browse/HIVE-14029

> Add org/apache/spark/JavaSparkListener to make Spark-2.0.0 work with 
> Hive-2.X.X
> ---
>
> Key: SPARK-17563
> URL: https://issues.apache.org/jira/browse/SPARK-17563
> Project: Spark
>  Issue Type: Bug
>Reporter: Oleksiy Sayankin
>
> According to https://issues.apache.org/jira/browse/SPARK-14358,
> JavaSparkListener was removed in Spark-2.0.0, but Hive-2.X.X still uses
> JavaSparkListener:
> {code}
> package org.apache.hadoop.hive.ql.exec.spark.status.impl;
> import ...
> public class JobMetricsListener extends JavaSparkListener {
> {code}
> Configuring Hive-2.X.X on Spark-2.0.0 then fails with:
> {code}
> 2016-09-16T11:20:57,474 INFO  [stderr-redir-1]: client.SparkClientImpl 
> (SparkClientImpl.java:run(593)) - java.lang.NoClassDefFoundError: 
> org/apache/spark/JavaSparkListener
> {code}
> Please add JavaSparkListener back to Spark-2.0.0.
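SPARK-14358 did not just delete JavaSparkListener: it made org.apache.spark.scheduler.SparkListener an abstract class with no-op default implementations, so Java code can now extend SparkListener directly. A minimal sketch of a listener written against the Spark 2.0 API (the class name and method bodies below are illustrative only, not Hive's actual JobMetricsListener):

{code}
import org.apache.spark.scheduler.SparkListener;
import org.apache.spark.scheduler.SparkListenerJobStart;
import org.apache.spark.scheduler.SparkListenerTaskEnd;

// In Spark 2.0, SparkListener is an abstract class with no-op defaults,
// so the old JavaSparkListener adapter is no longer needed.
public class MetricsCollectingListener extends SparkListener {

  @Override
  public void onJobStart(SparkListenerJobStart jobStart) {
    // a JobMetricsListener-style class would record the job -> stage mapping here
    System.out.println("Job started: " + jobStart.jobId());
  }

  @Override
  public void onTaskEnd(SparkListenerTaskEnd taskEnd) {
    // and accumulate per-stage task metrics here
    System.out.println("Task finished in stage " + taskEnd.stageId());
  }
}
{code}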






[jira] [Commented] (SPARK-17563) Add org/apache/spark/JavaSparkListener to make Spark-2.0.0 work with Hive-2.X.X

2016-09-16 Thread Oleksiy Sayankin (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-17563?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15496622#comment-15496622
 ] 

Oleksiy Sayankin commented on SPARK-17563:
--

Created https://issues.apache.org/jira/browse/HIVE-14777







[jira] [Commented] (SPARK-17563) Add org/apache/spark/JavaSparkListener to make Spark-2.0.0 work with Hive-2.X.X

2016-09-16 Thread Oleksiy Sayankin (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-17563?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15496614#comment-15496614
 ] 

Oleksiy Sayankin commented on SPARK-17563:
--

After three hours of fixing I have found that there are too many changes in the
Spark-2.0.0 API compared to the Spark-1.6.1 API to make the fix an easy one. I was
able to fix the Spark Remote Client subproject, but Hive Query Language gives me a
lot of errors.

{code}
[INFO] Hive ... SUCCESS [  0.883 s]
[INFO] Hive Shims Common .. SUCCESS [  2.424 s]
[INFO] Hive Shims 0.23  SUCCESS [  1.132 s]
[INFO] Hive Shims Scheduler ... SUCCESS [  0.299 s]
[INFO] Hive Shims . SUCCESS [  0.199 s]
[INFO] Hive Storage API ... SUCCESS [  0.851 s]
[INFO] Hive ORC ... SUCCESS [  2.346 s]
[INFO] Hive Common  SUCCESS [  3.567 s]
[INFO] Hive Serde . SUCCESS [  2.513 s]
[INFO] Hive Metastore . SUCCESS [ 10.782 s]
[INFO] Hive Ant Utilities . SUCCESS [  0.818 s]
[INFO] Hive Llap Common ... SUCCESS [  0.859 s]
[INFO] Hive Llap Client ... SUCCESS [  0.337 s]
[INFO] Hive Llap Tez .. SUCCESS [  0.525 s]
[INFO] Spark Remote Client  SUCCESS [  1.547 s]
[INFO] Hive Query Language  FAILURE [ 19.686 s]
[INFO] Hive Service ... SKIPPED
[INFO] Hive Accumulo Handler .. SKIPPED
[INFO] Hive JDBC .. SKIPPED
[INFO] Hive Beeline ... SKIPPED
[INFO] Hive CLI ... SKIPPED
[INFO] Hive Contrib ... SKIPPED
[INFO] Hive HBase Handler . SKIPPED
[INFO] Hive HCatalog .. SKIPPED
[INFO] Hive HCatalog Core . SKIPPED
[INFO] Hive HCatalog Pig Adapter .. SKIPPED
[INFO] Hive HCatalog Server Extensions  SKIPPED
[INFO] Hive HCatalog Webhcat Java Client .. SKIPPED
[INFO] Hive HCatalog Webhcat .. SKIPPED
[INFO] Hive HCatalog Streaming  SKIPPED
[INFO] Hive HPL/SQL ... SKIPPED
[INFO] Hive HWI ... SKIPPED
[INFO] Hive ODBC .. SKIPPED
[INFO] Hive Llap Server ... SKIPPED
[INFO] Hive Shims Aggregator .. SKIPPED
[INFO] Hive TestUtils . SKIPPED
[INFO] Hive Packaging . SKIPPED
[INFO] 
[INFO] BUILD FAILURE
[INFO] 
[INFO] Total time: 49.643 s
[INFO] Finished at: 2016-09-16T18:27:24+03:00
[INFO] Final Memory: 154M/2994M
[INFO] 
[ERROR] Failed to execute goal 
org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on 
project hive-exec: Compilation failure: Compilation failure:
[ERROR] 
/home/osayankin/git/myrepo/hive/ql/src/java/org/apache/hadoop/hive/ql/exec/spark/HiveReduceFunction.java:[28,8]
 org.apache.hadoop.hive.ql.exec.spark.HiveReduceFunction is not abstract and 
does not override abstract method 
call(java.util.Iterator>)
 in org.apache.spark.api.java.function.PairFlatMapFunction
[ERROR] 
/home/osayankin/git/myrepo/hive/ql/src/java/org/apache/hadoop/hive/ql/exec/spark/HiveReduceFunction.java:[40,3]
 
call(java.util.Iterator>)
 in org.apache.hadoop.hive.ql.exec.spark.HiveReduceFunction cannot implement 
call(T) in org.apache.spark.api.java.function.PairFlatMapFunction
[ERROR] return type 
java.lang.Iterable>
 is not compatible with 
java.util.Iterator>
[ERROR] 
/home/osayankin/git/myrepo/hive/ql/src/java/org/apache/hadoop/hive/ql/exec/spark/HiveReduceFunction.java:[38,3]
 method does not override or implement a method from a supertype
[ERROR]
{code}
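The HiveReduceFunction failures above come from another Spark 2.0 Java API change: flat-map style interfaces such as org.apache.spark.api.java.function.PairFlatMapFunction now declare call() to return a java.util.Iterator instead of a java.lang.Iterable. A minimal sketch of an implementation that compiles against Spark 2.0 (a made-up word-pair function, not Hive's actual code):

{code}
import java.util.Arrays;
import java.util.Iterator;

import org.apache.spark.api.java.function.PairFlatMapFunction;

import scala.Tuple2;

// Spark 1.6 declared: Iterable<Tuple2<K, V>> call(T t)
// Spark 2.0 declares: Iterator<Tuple2<K, V>> call(T t)
public class WordPairFunction implements PairFlatMapFunction<String, String, Integer> {

  @Override
  public Iterator<Tuple2<String, Integer>> call(String line) {
    // return an Iterator directly instead of a materialized Iterable
    return Arrays.stream(line.split("\\s+"))
        .map(word -> new Tuple2<String, Integer>(word, 1))
        .iterator();
  }
}
{code}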

[jira] [Commented] (SPARK-17563) Add org/apache/spark/JavaSparkListener to make Spark-2.0.0 work with Hive-2.X.X

2016-09-16 Thread Oleksiy Sayankin (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-17563?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15496232#comment-15496232
 ] 

Oleksiy Sayankin commented on SPARK-17563:
--

Well, I can change JavaSparkListener --> SparkListener in the Hive code and bump
the Spark version 1.6.1 --> 2.0.0 in pom.xml. I guess this will work.
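As a sketch of the proposed change (only the superclass swap is shown; the real JobMetricsListener keeps its existing listener-method overrides, and the Spark version in Hive's pom.xml is bumped from 1.6.1 to 2.0.0 separately):

{code}
// Before (compiles only against Spark 1.x):
//   public class JobMetricsListener extends JavaSparkListener { ... }

// After (Spark 2.0.0: JavaSparkListener is gone and SparkListener is an
// abstract class with no-op default methods):
import org.apache.spark.scheduler.SparkListener;

public class JobMetricsListener extends SparkListener {
  // existing onJobStart / onTaskEnd / ... overrides stay as they are
}
{code}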







[jira] [Commented] (SPARK-17563) Add org/apache/spark/JavaSparkListener to make Spark-2.0.0 work with Hive-2.X.X

2016-09-16 Thread Sean Owen (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-17563?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15496162#comment-15496162
 ] 

Sean Owen commented on SPARK-17563:
---

JobMetricsListener is not part of Spark, right?
It needs to change to be compatible with Spark 2.0.



