Re: [apache-spark][Spark SQL][Debug] Maven Spark build fails while compiling spark-hive-thriftserver_2.12 for Hadoop 2.10.1

2021-09-17 Thread Sean Owen
I don't think that has ever shown up in the CI/CD builds, and I can't recall
anyone reporting this. What did you change? It may be a local environment issue.

On Fri, Sep 17, 2021 at 7:09 AM Enrico Minardi 
wrote:

>
> Hello,
>
>
> the Maven build of Apache Spark 3.1.2 for user-provided Hadoop 2.10.1 with
> Hive and Hive-Thriftserver profiles fails while compiling
> spark-hive-thriftserver_2.12.
>
>
> I am most probably missing something. Could you please help?
>
>
> I have searched the Scala-Maven-Plugin website (
> https://davidb.github.io/scala-maven-plugin/usage.html), Stack Overflow /
> Stack Exchange, and the ASF user-list archive, but could not progress on
> this issue.
>
>
> The Maven version is 3.6.3 and the Scala version is 2.12 (not installed
> locally, as described here: https://davidb.github.io/scala-maven-plugin/index.html).
>
>
> Thank you very much for any suggestion; it would be much appreciated.
>
>
> Kind regards,
>
> Enrico
>
> --
>
>
> The error message is:
>
>
> Failed to execute goal
> net.alchim31.maven:scala-maven-plugin:4.3.0:compile
> (scala-compile-first) on project spark-hive-thriftserver_2.12:
> Execution scala-compile-first of goal
> net.alchim31.maven:scala-maven-plugin:4.3.0:compile failed:
> java.lang.AssertionError: assertion failed: Expected protocol to be 'file'
> or empty in URI
> jar:file:/home/vmuser/.m2/repository/org/eclipse/jetty/jetty-server/9.4.40.v20210413/jetty-server-9.4.40.v20210413.jar!/org/eclipse/jetty/server/AbstractConnectionFactory.class
>
>
> I have adjusted the ./pom.xml as follows:
>
>
> 
>   hadoop-provided
>   
> 2.10.1
> 2.13.0
> 2.4
> servlet-api
>   
> (...)
>
>
> The command I run is:
>
>
> ./dev/make-distribution.sh -X -q -Phadoop-provided -Pyarn -Pkubernetes
> -Pscala-2.12 -Phive -Phive-thriftserver -DzincPort=3036
>
>


[apache-spark][Spark SQL][Debug] Maven Spark build fails while compiling spark-hive-thriftserver_2.12 for Hadoop 2.10.1

2021-09-17 Thread Enrico Minardi

Hello,


the Maven build of Apache Spark 3.1.2 for user-provided Hadoop 2.10.1 with Hive 
and Hive-Thriftserver profiles fails while compiling 
spark-hive-thriftserver_2.12.


I am most probably missing something. Could you please help?


I have searched the Scala-Maven-Plugin website 
(https://davidb.github.io/scala-maven-plugin/usage.html), Stack Overflow / 
Stack Exchange, and the ASF user-list archive, but could not progress on this 
issue.


The Maven version is 3.6.3 and the Scala version is 2.12 (not installed
locally, as described here: https://davidb.github.io/scala-maven-plugin/index.html).



Thank you very much for any suggestion; it would be much appreciated.


Kind regards,

Enrico

--


The error message is:


Failed to execute goal 
net.alchim31.maven:scala-maven-plugin:4.3.0:compile 
(scala-compile-first) on project spark-hive-thriftserver_2.12: 
Execution scala-compile-first of goal 
net.alchim31.maven:scala-maven-plugin:4.3.0:compile failed: 
java.lang.AssertionError: assertion failed: Expected protocol to be 'file' or 
empty in URI 
jar:file:/home/vmuser/.m2/repository/org/eclipse/jetty/jetty-server/9.4.40.v20210413/jetty-server-9.4.40.v20210413.jar!/org/eclipse/jetty/server/AbstractConnectionFactory.class



I have adjusted the ./pom.xml as follows:



  hadoop-provided
  
2.10.1
2.13.0
2.4
servlet-api
  
(...)


The command I run is:


./dev/make-distribution.sh -X -q -Phadoop-provided -Pyarn -Pkubernetes 
-Pscala-2.12 -Phive -Phive-thriftserver -DzincPort=3036
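
For anyone hitting the same assertion, a few quick checks that usually narrow
it down (a sketch only; the repository path is copied from the log above, and
the JDK note follows the Spark 3.1 build documentation):

mvn -version     # confirm Maven 3.6.3 and, more importantly, which JDK Maven resolves
java -version    # Spark 3.1.x is built and tested with JDK 8 or 11
rm -rf ~/.m2/repository/org/eclipse/jetty/jetty-server/9.4.40.v20210413   # re-fetch the jar named in the assertion
./build/mvn -Phadoop-provided -Pyarn -Pkubernetes -Phive -Phive-thriftserver -DskipTests clean package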



spark ./build/mvn test failed on aarch64

2019-06-05 Thread Tianhua huang
Hi all,
Recently I ran './build/mvn test' for Spark on aarch64, and both master and
branch-2.4 failed; log excerpts are below:

..

[INFO] T E S T S
[INFO] ---
[INFO] Running org.apache.spark.util.kvstore.LevelDBTypeInfoSuite
[INFO] Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed:
0.081 s - in org.apache.spark.util.kvstore.LevelDBTypeInfoSuite
[INFO] Running org.apache.spark.util.kvstore.InMemoryStoreSuite
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed:
0.001 s - in org.apache.spark.util.kvstore.InMemoryStoreSuite
[INFO] Running org.apache.spark.util.kvstore.InMemoryIteratorSuite
[INFO] Tests run: 38, Failures: 0, Errors: 0, Skipped: 0, Time elapsed:
0.219 s - in org.apache.spark.util.kvstore.InMemoryIteratorSuite
[INFO] Running org.apache.spark.util.kvstore.LevelDBIteratorSuite
[ERROR] Tests run: 38, Failures: 0, Errors: 38, Skipped: 0, Time elapsed:
0.23 s <<< FAILURE! - in org.apache.spark.util.kvstore.LevelDBIteratorSuite
[ERROR] 
copyIndexDescendingWithStart(org.apache.spark.util.kvstore.LevelDBIteratorSuite)
Time elapsed: 0.2 s <<< ERROR!
java.lang.UnsatisfiedLinkError: Could not load library. Reasons: [no
leveldbjni64-1.8 in java.library.path, no leveldbjni-1.8 in
java.library.path, no leveldbjni in java.library.path,
/usr/local/src/spark/common/kvstore/target/tmp/libleveldbjni-64-1-610267671268036503.8:
/usr/local/src/spark/common/kvstore/target/tmp/libleveldbjni-64-1-610267671268036503.8:
cannot open shared object file: No such file or directory (Possible cause:
can't load AMD 64-bit .so on a AARCH64-bit platform)]
at
org.apache.spark.util.kvstore.LevelDBIteratorSuite.createStore(LevelDBIteratorSuite.java:44)

..

There is a dependency on leveldbjni-all, but there is no native package for
aarch64 in leveldbjni-all-1.8.jar. I found that aarch64 support was added in
https://github.com/fusesource/leveldbjni/pull/82, but it did not make it into
the 1.8 release, and unfortunately the repository has not been updated for
almost two years.

So I have a question: does Spark support aarch64? If yes, how can this problem
be fixed; if not, what is the plan for it? Thank you all!
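
One way to confirm the diagnosis (a sketch; the local-repository path assumes
a default Maven layout):

unzip -l ~/.m2/repository/org/fusesource/leveldbjni/leveldbjni-all/1.8/leveldbjni-all-1.8.jar | grep META-INF/native

The 1.8 jar only bundles x86 Linux, OS X and Windows natives under
META-INF/native, so the JNI loader has nothing to fall back on for aarch64,
which matches the UnsatisfiedLinkError above.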


spark ./build/mvn test failed on aarch64

2019-06-05 Thread Tianhua huang
Hi all,
Recently I ran './build/mvn test' for Spark on aarch64, and both master and
branch-2.4 failed; log excerpts are below:

..

[INFO] T E S T S
[INFO] ---
[INFO] Running org.apache.spark.util.kvstore.LevelDBTypeInfoSuite
[INFO] Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed:
0.081 s - in org.apache.spark.util.kvstore.LevelDBTypeInfoSuite
[INFO] Running org.apache.spark.util.kvstore.InMemoryStoreSuite
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed:
0.001 s - in org.apache.spark.util.kvstore.InMemoryStoreSuite
[INFO] Running org.apache.spark.util.kvstore.InMemoryIteratorSuite
[INFO] Tests run: 38, Failures: 0, Errors: 0, Skipped: 0, Time elapsed:
0.219 s - in org.apache.spark.util.kvstore.InMemoryIteratorSuite
[INFO] Running org.apache.spark.util.kvstore.LevelDBIteratorSuite
[ERROR] Tests run: 38, Failures: 0, Errors: 38, Skipped: 0, Time elapsed:
0.23 s <<< FAILURE! - in org.apache.spark.util.kvstore.LevelDBIteratorSuite
[ERROR] 
copyIndexDescendingWithStart(org.apache.spark.util.kvstore.LevelDBIteratorSuite)
Time elapsed: 0.2 s <<< ERROR!
java.lang.UnsatisfiedLinkError: Could not load library. Reasons: [no
leveldbjni64-1.8 in java.library.path, no leveldbjni-1.8 in
java.library.path, no leveldbjni in java.library.path,
/usr/local/src/spark/common/kvstore/target/tmp/libleveldbjni-64-1-610267671268036503.8:
/usr/local/src/spark/common/kvstore/target/tmp/libleveldbjni-64-1-610267671268036503.8:
cannot open shared object file: No such file or directory (Possible cause:
can't load AMD 64-bit .so on a AARCH64-bit platform)]
at
org.apache.spark.util.kvstore.LevelDBIteratorSuite.createStore(LevelDBIteratorSuite.java:44)

..

There is a dependency on leveldbjni:

<dependency>
  <groupId>org.fusesource.leveldbjni</groupId>
  <artifactId>leveldbjni-all</artifactId>
  <version>1.8</version>
</dependency>



CVE-2018-11804: Apache Spark build/mvn runs zinc, and can expose information from build machines

2018-10-24 Thread Sean Owen
Severity: Low

Vendor: The Apache Software Foundation

Versions Affected:
1.3.x release branch and later, including master

Description:
Spark's Apache Maven-based build includes a convenience script, 'build/mvn',
that downloads and runs a zinc server to speed up compilation. This server
will accept connections from external hosts by default. A specially-crafted
request to the zinc server could cause it to reveal information in files
readable to the developer account running the build. Note that this issue
does not affect end users of Spark, only developers building Spark from
source code.

Mitigation:
Spark users are not affected, as zinc is only a part of the build process.
Spark developers may simply use a local Maven installation's 'mvn' command
to build, and avoid running build/mvn and zinc.
Spark developers building actively-developed branches (2.2.x, 2.3.x, 2.4.x,
master) may update their branches to receive mitigations already patched
onto the build/mvn script.
Spark developers running zinc separately may include "-server 127.0.0.1" in
its command line, and consider additional flags like "-idle-timeout 30m" to
achieve similar mitigation.
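
As a rough illustration of that last mitigation (the launcher invocation and
the -start flag are assumptions; the other two flags are the ones named above):

zinc -start -server 127.0.0.1 -idle-timeout 30m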

Credit:
Andre Protas, Apple Information Security

References:
https://spark.apache.org/security.html




Re: Spark build 1.6.2 error

2016-09-03 Thread Diwakar Dhanuskodi
Sorry, my bad. In both runs I included -Dscala-2.11.

On Sat, Sep 3, 2016 at 12:39 PM, Nachiketa 
wrote:

> I think the difference was adding -Dscala-2.11 to the command line.
>
> I have seen this show up when I miss that.
>
> Regards,
> Nachiketa
>
> On Sat 3 Sep, 2016, 12:14 PM Diwakar Dhanuskodi, <
> diwakar.dhanusk...@gmail.com> wrote:
>
>> Hi,
>>
>> Just re-ran again without killing zinc server process
>>
>> ./make-distribution.sh --name custom-spark --tgz -Phadoop-2.6 -Phive
>> -Pyarn -Dmaven.version=3.0.4 -Dscala-2.11 -X -rf :spark-sql_2.11
>>
>> The build succeeded. Not sure how it worked just by re-running the command
>> again.
>>
>> On Sat, Sep 3, 2016 at 11:44 AM, Diwakar Dhanuskodi <
>> diwakar.dhanusk...@gmail.com> wrote:
>>
>>> Hi,
>>>
>>> java version 7
>>>
>>> mvn command
>>> ./make-distribution.sh --name custom-spark --tgz  -Phadoop-2.6 -Phive
>>> -Phive-thriftserver -Pyarn -Dmaven.version=3.0.4
>>>
>>>
>>> yes, I executed script to change scala version to 2.11
>>> killed  "com.typesafe zinc.Nailgun" process
>>>
>>> re-ran mvn with below command again
>>>
>>> ./make-distribution.sh --name custom-spark --tgz  -Phadoop-2.6 -Phive
>>> -Phive-thriftserver -Pyarn -Dmaven.version=3.0.4 -X -rf :spark-sql_2.11
>>>
>>> Getting same error
>>>
>>> [warn] /home/cloudera/Downloads/spark-1.6.2/sql/core/src/main/
>>> scala/org/apache/spark/sql/sources/interfaces.scala:911: method isDir
>>> in class FileStatus is deprecated: see corresponding Javadoc for more
>>> information.
>>> [warn] status.isDir,
>>> [warn]^
>>> [error] missing or invalid dependency detected while loading class file
>>> 'WebUI.class'.
>>> [error] Could not access term eclipse in package org,
>>> [error] because it (or its dependencies) are missing. Check your build
>>> definition for
>>> [error] missing or conflicting dependencies. (Re-run with
>>> `-Ylog-classpath` to see the problematic classpath.)
>>> [error] A full rebuild may help if 'WebUI.class' was compiled against an
>>> incompatible version of org.
>>> [error] missing or invalid dependency detected while loading class file
>>> 'WebUI.class'.
>>> [error] Could not access term jetty in value org.eclipse,
>>> [error] because it (or its dependencies) are missing. Check your build
>>> definition for
>>> [error] missing or conflicting dependencies. (Re-run with
>>> `-Ylog-classpath` to see the problematic classpath.)
>>> [error] A full rebuild may help if 'WebUI.class' was compiled against an
>>> incompatible version of org.eclipse.
>>> [warn] 17 warnings found
>>> [error] two errors found
>>> [debug] Compilation failed (CompilerInterface)
>>> [error] Compile failed at Sep 3, 2016 11:28:34 AM [21.611s]
>>> [INFO] 
>>> 
>>> [INFO] Reactor Summary:
>>> [INFO]
>>> [INFO] Spark Project Parent POM .. SUCCESS
>>> [5.583s]
>>> [INFO] Spark Project Test Tags ... SUCCESS
>>> [4.189s]
>>> [INFO] Spark Project Launcher  SUCCESS
>>> [12.226s]
>>> [INFO] Spark Project Networking .. SUCCESS
>>> [13.386s]
>>> [INFO] Spark Project Shuffle Streaming Service ... SUCCESS
>>> [6.723s]
>>> [INFO] Spark Project Unsafe .. SUCCESS
>>> [21.231s]
>>> [INFO] Spark Project Core  SUCCESS
>>> [3:46.334s]
>>> [INFO] Spark Project Bagel ... SUCCESS
>>> [7.032s]
>>> [INFO] Spark Project GraphX .. SUCCESS
>>> [19.558s]
>>> [INFO] Spark Project Streaming ... SUCCESS
>>> [50.452s]
>>> [INFO] Spark Project Catalyst  SUCCESS
>>> [1:14.172s]
>>> [INFO] Spark Project SQL . FAILURE
>>> [23.222s]
>>> [INFO] Spark Project ML Library .. SKIPPED
>>> [INFO] Spark Project Tools ... SKIPPED
>>> [INFO] Spark Project Hive  SKIPPED
>>> [INFO] Spark Project Docker Integration Tests  SKIPPED
>>> [INFO] Spark Project REPL  SKIPPED
>>> [INFO] Spark Project YARN Shuffle Service  SKIPPED
>>> [INFO] Spark Project YARN  SKIPPED
>>> [INFO] Spark Project Assembly  SKIPPED
>>> [INFO] Spark Project External Twitter  SKIPPED
>>> [INFO] Spark Project External Flume Sink . SKIPPED
>>> [INFO] Spark Project External Flume .. SKIPPED
>>> [INFO] Spark Project External Flume Assembly . SKIPPED
>>> [INFO] Spark Project External MQTT ... SKIPPED
>>> [INFO] Spark Project External MQTT Assembly .. SKIPPED
>>> [INFO] Spark Project External ZeroMQ . SKIPPED
>>> [INFO] Spark Project External Kafka .. 

Re: Spark build 1.6.2 error

2016-09-03 Thread Nachiketa
I think the difference was adding -Dscala-2.11 to the command line.

I have seen this show up when I miss that.

Regards,
Nachiketa

On Sat 3 Sep, 2016, 12:14 PM Diwakar Dhanuskodi, <
diwakar.dhanusk...@gmail.com> wrote:

> Hi,
>
> Just re-ran again without killing zinc server process
>
> ./make-distribution.sh --name custom-spark --tgz -Phadoop-2.6 -Phive
> -Pyarn -Dmaven.version=3.0.4 -Dscala-2.11 -X -rf :spark-sql_2.11
>
> The build succeeded. Not sure how it worked just by re-running the command
> again.
>
> On Sat, Sep 3, 2016 at 11:44 AM, Diwakar Dhanuskodi <
> diwakar.dhanusk...@gmail.com> wrote:
>
>> Hi,
>>
>> java version 7
>>
>> mvn command
>> ./make-distribution.sh --name custom-spark --tgz  -Phadoop-2.6 -Phive
>> -Phive-thriftserver -Pyarn -Dmaven.version=3.0.4
>>
>>
>> yes, I executed script to change scala version to 2.11
>> killed  "com.typesafe zinc.Nailgun" process
>>
>> re-ran mvn with below command again
>>
>> ./make-distribution.sh --name custom-spark --tgz  -Phadoop-2.6 -Phive
>> -Phive-thriftserver -Pyarn -Dmaven.version=3.0.4 -X -rf :spark-sql_2.11
>>
>> Getting same error
>>
>> [warn]
>> /home/cloudera/Downloads/spark-1.6.2/sql/core/src/main/scala/org/apache/spark/sql/sources/interfaces.scala:911:
>> method isDir in class FileStatus is deprecated: see corresponding Javadoc
>> for more information.
>> [warn] status.isDir,
>> [warn]^
>> [error] missing or invalid dependency detected while loading class file
>> 'WebUI.class'.
>> [error] Could not access term eclipse in package org,
>> [error] because it (or its dependencies) are missing. Check your build
>> definition for
>> [error] missing or conflicting dependencies. (Re-run with
>> `-Ylog-classpath` to see the problematic classpath.)
>> [error] A full rebuild may help if 'WebUI.class' was compiled against an
>> incompatible version of org.
>> [error] missing or invalid dependency detected while loading class file
>> 'WebUI.class'.
>> [error] Could not access term jetty in value org.eclipse,
>> [error] because it (or its dependencies) are missing. Check your build
>> definition for
>> [error] missing or conflicting dependencies. (Re-run with
>> `-Ylog-classpath` to see the problematic classpath.)
>> [error] A full rebuild may help if 'WebUI.class' was compiled against an
>> incompatible version of org.eclipse.
>> [warn] 17 warnings found
>> [error] two errors found
>> [debug] Compilation failed (CompilerInterface)
>> [error] Compile failed at Sep 3, 2016 11:28:34 AM [21.611s]
>> [INFO]
>> 
>> [INFO] Reactor Summary:
>> [INFO]
>> [INFO] Spark Project Parent POM .. SUCCESS
>> [5.583s]
>> [INFO] Spark Project Test Tags ... SUCCESS
>> [4.189s]
>> [INFO] Spark Project Launcher  SUCCESS
>> [12.226s]
>> [INFO] Spark Project Networking .. SUCCESS
>> [13.386s]
>> [INFO] Spark Project Shuffle Streaming Service ... SUCCESS
>> [6.723s]
>> [INFO] Spark Project Unsafe .. SUCCESS
>> [21.231s]
>> [INFO] Spark Project Core  SUCCESS
>> [3:46.334s]
>> [INFO] Spark Project Bagel ... SUCCESS
>> [7.032s]
>> [INFO] Spark Project GraphX .. SUCCESS
>> [19.558s]
>> [INFO] Spark Project Streaming ... SUCCESS
>> [50.452s]
>> [INFO] Spark Project Catalyst  SUCCESS
>> [1:14.172s]
>> [INFO] Spark Project SQL . FAILURE
>> [23.222s]
>> [INFO] Spark Project ML Library .. SKIPPED
>> [INFO] Spark Project Tools ... SKIPPED
>> [INFO] Spark Project Hive  SKIPPED
>> [INFO] Spark Project Docker Integration Tests  SKIPPED
>> [INFO] Spark Project REPL  SKIPPED
>> [INFO] Spark Project YARN Shuffle Service  SKIPPED
>> [INFO] Spark Project YARN  SKIPPED
>> [INFO] Spark Project Assembly  SKIPPED
>> [INFO] Spark Project External Twitter  SKIPPED
>> [INFO] Spark Project External Flume Sink . SKIPPED
>> [INFO] Spark Project External Flume .. SKIPPED
>> [INFO] Spark Project External Flume Assembly . SKIPPED
>> [INFO] Spark Project External MQTT ... SKIPPED
>> [INFO] Spark Project External MQTT Assembly .. SKIPPED
>> [INFO] Spark Project External ZeroMQ . SKIPPED
>> [INFO] Spark Project External Kafka .. SKIPPED
>> [INFO] Spark Project Examples  SKIPPED
>> [INFO] Spark Project External Kafka Assembly . SKIPPED
>> [INFO]
>> 
>> [INFO] BUILD FAILURE

Re: Spark build 1.6.2 error

2016-09-03 Thread Diwakar Dhanuskodi
Hi,

Just re-ran again without killing zinc server process

./make-distribution.sh --name custom-spark --tgz -Phadoop-2.6 -Phive -Pyarn
-Dmaven.version=3.0.4 -Dscala-2.11 -X -rf :spark-sql_2.11

The build succeeded. Not sure how it worked just by re-running the command
again.
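
A likely explanation, offered only as a guess: the zinc server kept state from
before the Scala 2.11 switch, and whatever reset it between the two runs
cleared that state. If it recurs, the server started by build/mvn can be
stopped explicitly (the path below is an assumption for the zinc version
bundled with the 1.6 build scripts):

build/zinc-0.3.9/bin/zinc -shutdown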

On Sat, Sep 3, 2016 at 11:44 AM, Diwakar Dhanuskodi <
diwakar.dhanusk...@gmail.com> wrote:

> Hi,
>
> java version 7
>
> mvn command
> ./make-distribution.sh --name custom-spark --tgz  -Phadoop-2.6 -Phive
> -Phive-thriftserver -Pyarn -Dmaven.version=3.0.4
>
>
> yes, I executed script to change scala version to 2.11
> killed  "com.typesafe zinc.Nailgun" process
>
> re-ran mvn with below command again
>
> ./make-distribution.sh --name custom-spark --tgz  -Phadoop-2.6 -Phive
> -Phive-thriftserver -Pyarn -Dmaven.version=3.0.4 -X -rf :spark-sql_2.11
>
> Getting same error
>
> [warn] /home/cloudera/Downloads/spark-1.6.2/sql/core/src/main/
> scala/org/apache/spark/sql/sources/interfaces.scala:911: method isDir in
> class FileStatus is deprecated: see corresponding Javadoc for more
> information.
> [warn] status.isDir,
> [warn]^
> [error] missing or invalid dependency detected while loading class file
> 'WebUI.class'.
> [error] Could not access term eclipse in package org,
> [error] because it (or its dependencies) are missing. Check your build
> definition for
> [error] missing or conflicting dependencies. (Re-run with
> `-Ylog-classpath` to see the problematic classpath.)
> [error] A full rebuild may help if 'WebUI.class' was compiled against an
> incompatible version of org.
> [error] missing or invalid dependency detected while loading class file
> 'WebUI.class'.
> [error] Could not access term jetty in value org.eclipse,
> [error] because it (or its dependencies) are missing. Check your build
> definition for
> [error] missing or conflicting dependencies. (Re-run with
> `-Ylog-classpath` to see the problematic classpath.)
> [error] A full rebuild may help if 'WebUI.class' was compiled against an
> incompatible version of org.eclipse.
> [warn] 17 warnings found
> [error] two errors found
> [debug] Compilation failed (CompilerInterface)
> [error] Compile failed at Sep 3, 2016 11:28:34 AM [21.611s]
> [INFO] 
> 
> [INFO] Reactor Summary:
> [INFO]
> [INFO] Spark Project Parent POM .. SUCCESS [5.583s]
> [INFO] Spark Project Test Tags ... SUCCESS [4.189s]
> [INFO] Spark Project Launcher  SUCCESS
> [12.226s]
> [INFO] Spark Project Networking .. SUCCESS
> [13.386s]
> [INFO] Spark Project Shuffle Streaming Service ... SUCCESS [6.723s]
> [INFO] Spark Project Unsafe .. SUCCESS
> [21.231s]
> [INFO] Spark Project Core  SUCCESS
> [3:46.334s]
> [INFO] Spark Project Bagel ... SUCCESS
> [7.032s]
> [INFO] Spark Project GraphX .. SUCCESS
> [19.558s]
> [INFO] Spark Project Streaming ... SUCCESS
> [50.452s]
> [INFO] Spark Project Catalyst  SUCCESS
> [1:14.172s]
> [INFO] Spark Project SQL . FAILURE
> [23.222s]
> [INFO] Spark Project ML Library .. SKIPPED
> [INFO] Spark Project Tools ... SKIPPED
> [INFO] Spark Project Hive  SKIPPED
> [INFO] Spark Project Docker Integration Tests  SKIPPED
> [INFO] Spark Project REPL  SKIPPED
> [INFO] Spark Project YARN Shuffle Service  SKIPPED
> [INFO] Spark Project YARN  SKIPPED
> [INFO] Spark Project Assembly  SKIPPED
> [INFO] Spark Project External Twitter  SKIPPED
> [INFO] Spark Project External Flume Sink . SKIPPED
> [INFO] Spark Project External Flume .. SKIPPED
> [INFO] Spark Project External Flume Assembly . SKIPPED
> [INFO] Spark Project External MQTT ... SKIPPED
> [INFO] Spark Project External MQTT Assembly .. SKIPPED
> [INFO] Spark Project External ZeroMQ . SKIPPED
> [INFO] Spark Project External Kafka .. SKIPPED
> [INFO] Spark Project Examples  SKIPPED
> [INFO] Spark Project External Kafka Assembly . SKIPPED
> [INFO] 
> 
> [INFO] BUILD FAILURE
> [INFO] 
> 
> [INFO] Total time: 7:45.641s
> [INFO] Finished at: Sat Sep 03 11:28:34 IST 2016
> [INFO] Final Memory: 49M/415M
> [INFO] 
> 
> [ERROR] Failed to execute goal 
> 

Re: Spark build 1.6.2 error

2016-09-03 Thread Diwakar Dhanuskodi
Hi,

java version 7

mvn command
./make-distribution.sh --name custom-spark --tgz  -Phadoop-2.6 -Phive
-Phive-thriftserver -Pyarn -Dmaven.version=3.0.4


Yes, I executed the script to change the Scala version to 2.11 and killed the
"com.typesafe zinc.Nailgun" process.

I re-ran mvn with the command below:

./make-distribution.sh --name custom-spark --tgz -Phadoop-2.6 -Phive
-Phive-thriftserver -Pyarn -Dmaven.version=3.0.4 -X -rf :spark-sql_2.11

I am getting the same error:

[warn]
/home/cloudera/Downloads/spark-1.6.2/sql/core/src/main/scala/org/apache/spark/sql/sources/interfaces.scala:911:
method isDir in class FileStatus is deprecated: see corresponding Javadoc
for more information.
[warn] status.isDir,
[warn]^
[error] missing or invalid dependency detected while loading class file
'WebUI.class'.
[error] Could not access term eclipse in package org,
[error] because it (or its dependencies) are missing. Check your build
definition for
[error] missing or conflicting dependencies. (Re-run with `-Ylog-classpath`
to see the problematic classpath.)
[error] A full rebuild may help if 'WebUI.class' was compiled against an
incompatible version of org.
[error] missing or invalid dependency detected while loading class file
'WebUI.class'.
[error] Could not access term jetty in value org.eclipse,
[error] because it (or its dependencies) are missing. Check your build
definition for
[error] missing or conflicting dependencies. (Re-run with `-Ylog-classpath`
to see the problematic classpath.)
[error] A full rebuild may help if 'WebUI.class' was compiled against an
incompatible version of org.eclipse.
[warn] 17 warnings found
[error] two errors found
[debug] Compilation failed (CompilerInterface)
[error] Compile failed at Sep 3, 2016 11:28:34 AM [21.611s]
[INFO]

[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM .. SUCCESS [5.583s]
[INFO] Spark Project Test Tags ... SUCCESS [4.189s]
[INFO] Spark Project Launcher  SUCCESS [12.226s]
[INFO] Spark Project Networking .. SUCCESS [13.386s]
[INFO] Spark Project Shuffle Streaming Service ... SUCCESS [6.723s]
[INFO] Spark Project Unsafe .. SUCCESS [21.231s]
[INFO] Spark Project Core  SUCCESS
[3:46.334s]
[INFO] Spark Project Bagel ... SUCCESS [7.032s]
[INFO] Spark Project GraphX .. SUCCESS [19.558s]
[INFO] Spark Project Streaming ... SUCCESS [50.452s]
[INFO] Spark Project Catalyst  SUCCESS
[1:14.172s]
[INFO] Spark Project SQL . FAILURE [23.222s]
[INFO] Spark Project ML Library .. SKIPPED
[INFO] Spark Project Tools ... SKIPPED
[INFO] Spark Project Hive  SKIPPED
[INFO] Spark Project Docker Integration Tests  SKIPPED
[INFO] Spark Project REPL  SKIPPED
[INFO] Spark Project YARN Shuffle Service  SKIPPED
[INFO] Spark Project YARN  SKIPPED
[INFO] Spark Project Assembly  SKIPPED
[INFO] Spark Project External Twitter  SKIPPED
[INFO] Spark Project External Flume Sink . SKIPPED
[INFO] Spark Project External Flume .. SKIPPED
[INFO] Spark Project External Flume Assembly . SKIPPED
[INFO] Spark Project External MQTT ... SKIPPED
[INFO] Spark Project External MQTT Assembly .. SKIPPED
[INFO] Spark Project External ZeroMQ . SKIPPED
[INFO] Spark Project External Kafka .. SKIPPED
[INFO] Spark Project Examples  SKIPPED
[INFO] Spark Project External Kafka Assembly . SKIPPED
[INFO]

[INFO] BUILD FAILURE
[INFO]

[INFO] Total time: 7:45.641s
[INFO] Finished at: Sat Sep 03 11:28:34 IST 2016
[INFO] Final Memory: 49M/415M
[INFO]

[ERROR] Failed to execute goal
net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first)
on project spark-sql_2.11: Execution scala-compile-first of goal
net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed. CompileFailed
-> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute
goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile
(scala-compile-first) on project spark-sql_2.11: Execution
scala-compile-first of goal
net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed.
at
org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:225)

Re: Spark build 1.6.2 error

2016-08-31 Thread Divya Gehlot
Which Java version are you using?

On 31 August 2016 at 04:30, Diwakar Dhanuskodi  wrote:

> Hi,
>
> While building Spark 1.6.2 , getting below error in spark-sql. Much
> appreciate for any help.
>
> ERROR] missing or invalid dependency detected while loading class file
> 'WebUI.class'.
> Could not access term eclipse in package org,
> because it (or its dependencies) are missing. Check your build definition
> for
> missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see
> the problematic classpath.)
> A full rebuild may help if 'WebUI.class' was compiled against an
> incompatible version of org.
> [ERROR] missing or invalid dependency detected while loading class file
> 'WebUI.class'.
> Could not access term jetty in value org.eclipse,
> because it (or its dependencies) are missing. Check your build definition
> for
> missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see
> the problematic classpath.)
> A full rebuild may help if 'WebUI.class' was compiled against an
> incompatible version of org.eclipse.
> [WARNING] 17 warnings found
> [ERROR] two errors found
> [INFO] 
> 
> [INFO] Reactor Summary:
> [INFO]
> [INFO] Spark Project Parent POM .. SUCCESS [4.399s]
> [INFO] Spark Project Test Tags ... SUCCESS [3.443s]
> [INFO] Spark Project Launcher  SUCCESS
> [10.131s]
> [INFO] Spark Project Networking .. SUCCESS
> [11.849s]
> [INFO] Spark Project Shuffle Streaming Service ... SUCCESS [6.641s]
> [INFO] Spark Project Unsafe .. SUCCESS
> [19.765s]
> [INFO] Spark Project Core  SUCCESS
> [4:16.511s]
> [INFO] Spark Project Bagel ... SUCCESS
> [13.401s]
> [INFO] Spark Project GraphX .. SUCCESS
> [1:08.824s]
> [INFO] Spark Project Streaming ... SUCCESS
> [2:18.844s]
> [INFO] Spark Project Catalyst  SUCCESS
> [2:43.695s]
> [INFO] Spark Project SQL . FAILURE
> [1:01.762s]
> [INFO] Spark Project ML Library .. SKIPPED
> [INFO] Spark Project Tools ... SKIPPED
> [INFO] Spark Project Hive  SKIPPED
> [INFO] Spark Project Docker Integration Tests  SKIPPED
> [INFO] Spark Project REPL  SKIPPED
> [INFO] Spark Project YARN Shuffle Service  SKIPPED
> [INFO] Spark Project YARN  SKIPPED
> [INFO] Spark Project Assembly  SKIPPED
> [INFO] Spark Project External Twitter  SKIPPED
> [INFO] Spark Project External Flume Sink . SKIPPED
> [INFO] Spark Project External Flume .. SKIPPED
> [INFO] Spark Project External Flume Assembly . SKIPPED
> [INFO] Spark Project External MQTT ... SKIPPED
> [INFO] Spark Project External MQTT Assembly .. SKIPPED
> [INFO] Spark Project External ZeroMQ . SKIPPED
> [INFO] Spark Project External Kafka .. SKIPPED
> [INFO] Spark Project Examples  SKIPPED
> [INFO] Spark Project External Kafka Assembly . SKIPPED
> [INFO] 
> 
> [INFO] BUILD FAILURE
> [INFO] 
> 
> [INFO] Total time: 12:40.525s
> [INFO] Finished at: Wed Aug 31 01:56:50 IST 2016
> [INFO] Final Memory: 71M/830M
> [INFO] 
> 
> [ERROR] Failed to execute goal 
> net.alchim31.maven:scala-maven-plugin:3.2.2:compile
> (scala-compile-first) on project spark-sql_2.11: Execution
> scala-compile-first of goal 
> net.alchim31.maven:scala-maven-plugin:3.2.2:compile
> failed. CompileFailed -> [Help 1]
> [ERROR]
> [ERROR] To see the full stack trace of the errors, re-run Maven with the
> -e switch.
> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> [ERROR]
> [ERROR] For more information about the errors and possible solutions,
> please read the following articles:
> [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/
> PluginExecutionException
> [ERROR]
> [ERROR] After correcting the problems, you can resume the build with the
> command
> [ERROR]   mvn  -rf :spark-sql_2.11
>
>
>


Re: Spark build 1.6.2 error

2016-08-31 Thread Adam Roberts
Looks familiar: got the zinc server running and using a shared dev box?

Run ps -ef | grep "com.typesafe zinc.Nailgun", look for the zinc server
process, kill it, and try again. Spark branch-1.6 builds fine here from
scratch; I've had plenty of problems thanks to running the zinc server here
(started with build/mvn).
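
Spelled out as a sketch (the pid is a placeholder, and the
change-scala-version script name should be verified against your branch-1.6
checkout):

ps -ef | grep "com.typesafe zinc.Nailgun" | grep -v grep   # find the zinc server
kill <zinc-pid>                                            # stop it so the next build starts cleanly
./dev/change-scala-version.sh 2.11                         # re-run the Scala version switch
./make-distribution.sh --name custom-spark --tgz -Phadoop-2.6 -Phive -Phive-thriftserver -Pyarn -Dscala-2.11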




From:   Nachiketa <nachiketa.shu...@gmail.com>
To: Diwakar Dhanuskodi <diwakar.dhanusk...@gmail.com>
Cc: user <user@spark.apache.org>
Date:   31/08/2016 12:17
Subject:Re: Spark build 1.6.2 error



Hi Diwakar,

Could you please share the entire Maven command that you are using to
build, and also the JDK version you are using?

Also, could you please confirm that you executed the script to change the
Scala version to 2.11 before starting the build? Thanks.

Regards,
Nachiketa

On Wed, Aug 31, 2016 at 2:00 AM, Diwakar Dhanuskodi <
diwakar.dhanusk...@gmail.com> wrote:
Hi, 

While building Spark 1.6.2 , getting below error in spark-sql. Much 
appreciate for any help.

ERROR] missing or invalid dependency detected while loading class file 
'WebUI.class'.
Could not access term eclipse in package org,
because it (or its dependencies) are missing. Check your build definition 
for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see 
the problematic classpath.)
A full rebuild may help if 'WebUI.class' was compiled against an 
incompatible version of org.
[ERROR] missing or invalid dependency detected while loading class file 
'WebUI.class'.
Could not access term jetty in value org.eclipse,
because it (or its dependencies) are missing. Check your build definition 
for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see 
the problematic classpath.)
A full rebuild may help if 'WebUI.class' was compiled against an 
incompatible version of org.eclipse.
[WARNING] 17 warnings found
[ERROR] two errors found
[INFO] 

[INFO] Reactor Summary:
[INFO] 
[INFO] Spark Project Parent POM .. SUCCESS 
[4.399s]
[INFO] Spark Project Test Tags ... SUCCESS 
[3.443s]
[INFO] Spark Project Launcher  SUCCESS 
[10.131s]
[INFO] Spark Project Networking .. SUCCESS 
[11.849s]
[INFO] Spark Project Shuffle Streaming Service ... SUCCESS 
[6.641s]
[INFO] Spark Project Unsafe .. SUCCESS 
[19.765s]
[INFO] Spark Project Core  SUCCESS 
[4:16.511s]
[INFO] Spark Project Bagel ... SUCCESS 
[13.401s]
[INFO] Spark Project GraphX .. SUCCESS 
[1:08.824s]
[INFO] Spark Project Streaming ... SUCCESS 
[2:18.844s]
[INFO] Spark Project Catalyst  SUCCESS 
[2:43.695s]
[INFO] Spark Project SQL . FAILURE 
[1:01.762s]
[INFO] Spark Project ML Library .. SKIPPED
[INFO] Spark Project Tools ... SKIPPED
[INFO] Spark Project Hive  SKIPPED
[INFO] Spark Project Docker Integration Tests  SKIPPED
[INFO] Spark Project REPL  SKIPPED
[INFO] Spark Project YARN Shuffle Service  SKIPPED
[INFO] Spark Project YARN  SKIPPED
[INFO] Spark Project Assembly  SKIPPED
[INFO] Spark Project External Twitter  SKIPPED
[INFO] Spark Project External Flume Sink . SKIPPED
[INFO] Spark Project External Flume .. SKIPPED
[INFO] Spark Project External Flume Assembly . SKIPPED
[INFO] Spark Project External MQTT ... SKIPPED
[INFO] Spark Project External MQTT Assembly .. SKIPPED
[INFO] Spark Project External ZeroMQ . SKIPPED
[INFO] Spark Project External Kafka .. SKIPPED
[INFO] Spark Project Examples  SKIPPED
[INFO] Spark Project External Kafka Assembly . SKIPPED
[INFO] 

[INFO] BUILD FAILURE
[INFO] 

[INFO] Total time: 12:40.525s
[INFO] Finished at: Wed Aug 31 01:56:50 IST 2016
[INFO] Final Memory: 71M/830M
[INFO] 

[ERROR] Failed to execute goal 
net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) 
on project spark-sql_2.11: Execution scala-compile-first of goal 
net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed. CompileFailed 
-> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the 
-e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.

Re: Spark build 1.6.2 error

2016-08-31 Thread Nachiketa
Hi Diwakar,

Could you please share the entire Maven command that you are using to build,
and also the JDK version you are using?

Also, could you please confirm that you executed the script to change the
Scala version to 2.11 before starting the build? Thanks.

Regards,
Nachiketa

On Wed, Aug 31, 2016 at 2:00 AM, Diwakar Dhanuskodi <
diwakar.dhanusk...@gmail.com> wrote:

> Hi,
>
> While building Spark 1.6.2 , getting below error in spark-sql. Much
> appreciate for any help.
>
> ERROR] missing or invalid dependency detected while loading class file
> 'WebUI.class'.
> Could not access term eclipse in package org,
> because it (or its dependencies) are missing. Check your build definition
> for
> missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see
> the problematic classpath.)
> A full rebuild may help if 'WebUI.class' was compiled against an
> incompatible version of org.
> [ERROR] missing or invalid dependency detected while loading class file
> 'WebUI.class'.
> Could not access term jetty in value org.eclipse,
> because it (or its dependencies) are missing. Check your build definition
> for
> missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see
> the problematic classpath.)
> A full rebuild may help if 'WebUI.class' was compiled against an
> incompatible version of org.eclipse.
> [WARNING] 17 warnings found
> [ERROR] two errors found
> [INFO] 
> 
> [INFO] Reactor Summary:
> [INFO]
> [INFO] Spark Project Parent POM .. SUCCESS [4.399s]
> [INFO] Spark Project Test Tags ... SUCCESS [3.443s]
> [INFO] Spark Project Launcher  SUCCESS
> [10.131s]
> [INFO] Spark Project Networking .. SUCCESS
> [11.849s]
> [INFO] Spark Project Shuffle Streaming Service ... SUCCESS [6.641s]
> [INFO] Spark Project Unsafe .. SUCCESS
> [19.765s]
> [INFO] Spark Project Core  SUCCESS
> [4:16.511s]
> [INFO] Spark Project Bagel ... SUCCESS
> [13.401s]
> [INFO] Spark Project GraphX .. SUCCESS
> [1:08.824s]
> [INFO] Spark Project Streaming ... SUCCESS
> [2:18.844s]
> [INFO] Spark Project Catalyst  SUCCESS
> [2:43.695s]
> [INFO] Spark Project SQL . FAILURE
> [1:01.762s]
> [INFO] Spark Project ML Library .. SKIPPED
> [INFO] Spark Project Tools ... SKIPPED
> [INFO] Spark Project Hive  SKIPPED
> [INFO] Spark Project Docker Integration Tests  SKIPPED
> [INFO] Spark Project REPL  SKIPPED
> [INFO] Spark Project YARN Shuffle Service  SKIPPED
> [INFO] Spark Project YARN  SKIPPED
> [INFO] Spark Project Assembly  SKIPPED
> [INFO] Spark Project External Twitter  SKIPPED
> [INFO] Spark Project External Flume Sink . SKIPPED
> [INFO] Spark Project External Flume .. SKIPPED
> [INFO] Spark Project External Flume Assembly . SKIPPED
> [INFO] Spark Project External MQTT ... SKIPPED
> [INFO] Spark Project External MQTT Assembly .. SKIPPED
> [INFO] Spark Project External ZeroMQ . SKIPPED
> [INFO] Spark Project External Kafka .. SKIPPED
> [INFO] Spark Project Examples  SKIPPED
> [INFO] Spark Project External Kafka Assembly . SKIPPED
> [INFO] 
> 
> [INFO] BUILD FAILURE
> [INFO] 
> 
> [INFO] Total time: 12:40.525s
> [INFO] Finished at: Wed Aug 31 01:56:50 IST 2016
> [INFO] Final Memory: 71M/830M
> [INFO] 
> 
> [ERROR] Failed to execute goal 
> net.alchim31.maven:scala-maven-plugin:3.2.2:compile
> (scala-compile-first) on project spark-sql_2.11: Execution
> scala-compile-first of goal 
> net.alchim31.maven:scala-maven-plugin:3.2.2:compile
> failed. CompileFailed -> [Help 1]
> [ERROR]
> [ERROR] To see the full stack trace of the errors, re-run Maven with the
> -e switch.
> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> [ERROR]
> [ERROR] For more information about the errors and possible solutions,
> please read the following articles:
> [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/
> PluginExecutionException
> [ERROR]
> [ERROR] After correcting the problems, you can resume the build with the
> command
> [ERROR]   mvn  -rf :spark-sql_2.11
>
>
>


-- 
Regards,
-- Nachiketa


Spark build 1.6.2 error

2016-08-30 Thread Diwakar Dhanuskodi
Hi,

While building Spark 1.6.2, I am getting the below error in spark-sql. I
would much appreciate any help.

ERROR] missing or invalid dependency detected while loading class file
'WebUI.class'.
Could not access term eclipse in package org,
because it (or its dependencies) are missing. Check your build definition
for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see
the problematic classpath.)
A full rebuild may help if 'WebUI.class' was compiled against an
incompatible version of org.
[ERROR] missing or invalid dependency detected while loading class file
'WebUI.class'.
Could not access term jetty in value org.eclipse,
because it (or its dependencies) are missing. Check your build definition
for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see
the problematic classpath.)
A full rebuild may help if 'WebUI.class' was compiled against an
incompatible version of org.eclipse.
[WARNING] 17 warnings found
[ERROR] two errors found
[INFO]

[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM .. SUCCESS [4.399s]
[INFO] Spark Project Test Tags ... SUCCESS [3.443s]
[INFO] Spark Project Launcher  SUCCESS [10.131s]
[INFO] Spark Project Networking .. SUCCESS [11.849s]
[INFO] Spark Project Shuffle Streaming Service ... SUCCESS [6.641s]
[INFO] Spark Project Unsafe .. SUCCESS [19.765s]
[INFO] Spark Project Core  SUCCESS
[4:16.511s]
[INFO] Spark Project Bagel ... SUCCESS [13.401s]
[INFO] Spark Project GraphX .. SUCCESS
[1:08.824s]
[INFO] Spark Project Streaming ... SUCCESS
[2:18.844s]
[INFO] Spark Project Catalyst  SUCCESS
[2:43.695s]
[INFO] Spark Project SQL . FAILURE
[1:01.762s]
[INFO] Spark Project ML Library .. SKIPPED
[INFO] Spark Project Tools ... SKIPPED
[INFO] Spark Project Hive  SKIPPED
[INFO] Spark Project Docker Integration Tests  SKIPPED
[INFO] Spark Project REPL  SKIPPED
[INFO] Spark Project YARN Shuffle Service  SKIPPED
[INFO] Spark Project YARN  SKIPPED
[INFO] Spark Project Assembly  SKIPPED
[INFO] Spark Project External Twitter  SKIPPED
[INFO] Spark Project External Flume Sink . SKIPPED
[INFO] Spark Project External Flume .. SKIPPED
[INFO] Spark Project External Flume Assembly . SKIPPED
[INFO] Spark Project External MQTT ... SKIPPED
[INFO] Spark Project External MQTT Assembly .. SKIPPED
[INFO] Spark Project External ZeroMQ . SKIPPED
[INFO] Spark Project External Kafka .. SKIPPED
[INFO] Spark Project Examples  SKIPPED
[INFO] Spark Project External Kafka Assembly . SKIPPED
[INFO]

[INFO] BUILD FAILURE
[INFO]

[INFO] Total time: 12:40.525s
[INFO] Finished at: Wed Aug 31 01:56:50 IST 2016
[INFO] Final Memory: 71M/830M
[INFO]

[ERROR] Failed to execute goal
net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first)
on project spark-sql_2.11: Execution scala-compile-first of goal
net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed. CompileFailed
-> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions,
please read the following articles:
[ERROR] [Help 1]
http://cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the
command
[ERROR]   mvn  -rf :spark-sql_2.11


Re: sbt for Spark build with Scala 2.11

2016-05-16 Thread Eric Richardson
Good news - and Java 8 as well. I saw Matei after his talk at Scala days
and he said he would look into a 2.11 default but it seems that is already
the plan. Scala 2.12 is getting closer as well.

On Mon, May 16, 2016 at 2:55 PM, Ted Yu  wrote:

> For 2.0, I believe that is the case.
>
> Jenkins jobs have been running against Scala 2.11:
>
> [INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ 
> java8-tests_2.11 ---
>
>
> FYI
>
>
> On Mon, May 16, 2016 at 2:45 PM, Eric Richardson 
> wrote:
>
>> On Thu, May 12, 2016 at 9:23 PM, Luciano Resende 
>> wrote:
>>
>>> Spark has moved to build using Scala 2.11 by default in master/trunk.
>>>
>>
>> Does this mean that the pre-built binaries for download will also move to
>> 2.11 as well?
>>
>>
>>>
>>>
>>> As for the 2.0.0-SNAPSHOT, it is actually the version of master/trunk
>>> and you might be missing some modules/profiles for your build. What command
>>> did you use to build ?
>>>
>>> On Thu, May 12, 2016 at 9:01 PM, Raghava Mutharaju <
>>> m.vijayaragh...@gmail.com> wrote:
>>>
 Hello All,

 I built Spark from the source code available at
 https://github.com/apache/spark/. Although I haven't specified the
 "-Dscala-2.11" option (to build with Scala 2.11), from the build messages I
 see that it ended up using Scala 2.11. Now, for my application sbt, what
 should be the spark version? I tried the following

 val spark = "org.apache.spark" %% "spark-core" % "2.0.0-SNAPSHOT"
 val sparksql = "org.apache.spark" % "spark-sql_2.11" % "2.0.0-SNAPSHOT"

 and scalaVersion := "2.11.8"

 But this setting of spark version gives sbt error

 unresolved dependency: org.apache.spark#spark-core_2.11;2.0.0-SNAPSHOT

 I guess this is because the repository doesn't contain 2.0.0-SNAPSHOT.
 Does this mean, the only option is to put all the required jars in the lib
 folder (unmanaged dependencies)?

 Regards,
 Raghava.

>>>
>>>
>>>
>>> --
>>> Luciano Resende
>>> http://twitter.com/lresende1975
>>> http://lresende.blogspot.com/
>>>
>>
>>
>


Re: sbt for Spark build with Scala 2.11

2016-05-16 Thread Ted Yu
For 2.0, I believe that is the case.

Jenkins jobs have been running against Scala 2.11:

[INFO] --- scala-maven-plugin:3.2.2:testCompile
(scala-test-compile-first) @ java8-tests_2.11 ---


FYI


On Mon, May 16, 2016 at 2:45 PM, Eric Richardson 
wrote:

> On Thu, May 12, 2016 at 9:23 PM, Luciano Resende 
> wrote:
>
>> Spark has moved to build using Scala 2.11 by default in master/trunk.
>>
>
> Does this mean that the pre-built binaries for download will also move to
> 2.11 as well?
>
>
>>
>>
>> As for the 2.0.0-SNAPSHOT, it is actually the version of master/trunk and
>> you might be missing some modules/profiles for your build. What command did
>> you use to build ?
>>
>> On Thu, May 12, 2016 at 9:01 PM, Raghava Mutharaju <
>> m.vijayaragh...@gmail.com> wrote:
>>
>>> Hello All,
>>>
>>> I built Spark from the source code available at
>>> https://github.com/apache/spark/. Although I haven't specified the
>>> "-Dscala-2.11" option (to build with Scala 2.11), from the build messages I
>>> see that it ended up using Scala 2.11. Now, for my application sbt, what
>>> should be the spark version? I tried the following
>>>
>>> val spark = "org.apache.spark" %% "spark-core" % "2.0.0-SNAPSHOT"
>>> val sparksql = "org.apache.spark" % "spark-sql_2.11" % "2.0.0-SNAPSHOT"
>>>
>>> and scalaVersion := "2.11.8"
>>>
>>> But this setting of spark version gives sbt error
>>>
>>> unresolved dependency: org.apache.spark#spark-core_2.11;2.0.0-SNAPSHOT
>>>
>>> I guess this is because the repository doesn't contain 2.0.0-SNAPSHOT.
>>> Does this mean, the only option is to put all the required jars in the lib
>>> folder (unmanaged dependencies)?
>>>
>>> Regards,
>>> Raghava.
>>>
>>
>>
>>
>> --
>> Luciano Resende
>> http://twitter.com/lresende1975
>> http://lresende.blogspot.com/
>>
>
>


Re: sbt for Spark build with Scala 2.11

2016-05-16 Thread Eric Richardson
On Thu, May 12, 2016 at 9:23 PM, Luciano Resende 
wrote:

> Spark has moved to build using Scala 2.11 by default in master/trunk.
>

Does this mean that the pre-built binaries for download will also move to
2.11 as well?


>
>
> As for the 2.0.0-SNAPSHOT, it is actually the version of master/trunk and
> you might be missing some modules/profiles for your build. What command did
> you use to build ?
>
> On Thu, May 12, 2016 at 9:01 PM, Raghava Mutharaju <
> m.vijayaragh...@gmail.com> wrote:
>
>> Hello All,
>>
>> I built Spark from the source code available at
>> https://github.com/apache/spark/. Although I haven't specified the
>> "-Dscala-2.11" option (to build with Scala 2.11), from the build messages I
>> see that it ended up using Scala 2.11. Now, for my application sbt, what
>> should be the spark version? I tried the following
>>
>> val spark = "org.apache.spark" %% "spark-core" % "2.0.0-SNAPSHOT"
>> val sparksql = "org.apache.spark" % "spark-sql_2.11" % "2.0.0-SNAPSHOT"
>>
>> and scalaVersion := "2.11.8"
>>
>> But this setting of spark version gives sbt error
>>
>> unresolved dependency: org.apache.spark#spark-core_2.11;2.0.0-SNAPSHOT
>>
>> I guess this is because the repository doesn't contain 2.0.0-SNAPSHOT.
>> Does this mean, the only option is to put all the required jars in the lib
>> folder (unmanaged dependencies)?
>>
>> Regards,
>> Raghava.
>>
>
>
>
> --
> Luciano Resende
> http://twitter.com/lresende1975
> http://lresende.blogspot.com/
>


Re: sbt for Spark build with Scala 2.11

2016-05-13 Thread Raghava Mutharaju
Thank you for the response.

I used the following command to build from source

build/mvn -Dhadoop.version=2.6.4 -Phadoop-2.6 -DskipTests clean package

Would this put the required jars into .ivy2 during the build process? If so,
how can I make the Spark distribution runnable, so that I can use it on other
machines as well (make-distribution.sh no longer exists in the Spark root
folder)?
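
On master/branch-2.0 the script moved rather than disappeared; a sketch
reusing the profiles from the mvn command above:

./dev/make-distribution.sh --name custom --tgz -Phadoop-2.6 -Dhadoop.version=2.6.4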

For compiling my application, I put in the following lines in the build.sbt

packAutoSettings
val spark = "org.apache.spark" %% "spark-core" % "2.0.0-SNAPSHOT"
val sparksql = "org.apache.spark" % "spark-sql_2.11" % "2.0.0-SNAPSHOT"

lazy val root = (project in file(".")).
  settings(
name := "sparkel",
version := "0.1.0",
scalaVersion := "2.11.8",
libraryDependencies += spark,
libraryDependencies += sparksql
  )
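
One way (a sketch, not the only option) to make a locally built 2.0.0-SNAPSHOT
resolvable by sbt without copying jars into lib/ is to install the artifacts
into the local Maven repository and then let sbt look there:

./build/mvn -Phadoop-2.6 -Dhadoop.version=2.6.4 -DskipTests clean install

with Resolver.mavenLocal added to the resolvers in build.sbt.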


Regards,
Raghava.


On Fri, May 13, 2016 at 12:23 AM, Luciano Resende 
wrote:

> Spark has moved to build using Scala 2.11 by default in master/trunk.
>
> As for the 2.0.0-SNAPSHOT, it is actually the version of master/trunk and
> you might be missing some modules/profiles for your build. What command did
> you use to build ?
>
> On Thu, May 12, 2016 at 9:01 PM, Raghava Mutharaju <
> m.vijayaragh...@gmail.com> wrote:
>
>> Hello All,
>>
>> I built Spark from the source code available at
>> https://github.com/apache/spark/. Although I haven't specified the
>> "-Dscala-2.11" option (to build with Scala 2.11), from the build messages I
>> see that it ended up using Scala 2.11. Now, for my application sbt, what
>> should be the spark version? I tried the following
>>
>> val spark = "org.apache.spark" %% "spark-core" % "2.0.0-SNAPSHOT"
>> val sparksql = "org.apache.spark" % "spark-sql_2.11" % "2.0.0-SNAPSHOT"
>>
>> and scalaVersion := "2.11.8"
>>
>> But this setting of spark version gives sbt error
>>
>> unresolved dependency: org.apache.spark#spark-core_2.11;2.0.0-SNAPSHOT
>>
>> I guess this is because the repository doesn't contain 2.0.0-SNAPSHOT.
>> Does this mean, the only option is to put all the required jars in the lib
>> folder (unmanaged dependencies)?
>>
>> Regards,
>> Raghava.
>>
>
>
>
> --
> Luciano Resende
> http://twitter.com/lresende1975
> http://lresende.blogspot.com/
>



-- 
Regards,
Raghava
http://raghavam.github.io


Re: sbt for Spark build with Scala 2.11

2016-05-12 Thread Luciano Resende
Spark has moved to build using Scala 2.11 by default in master/trunk.

As for the 2.0.0-SNAPSHOT, it is actually the version of master/trunk and
you might be missing some modules/profiles for your build. What command did
you use to build ?

On Thu, May 12, 2016 at 9:01 PM, Raghava Mutharaju <
m.vijayaragh...@gmail.com> wrote:

> Hello All,
>
> I built Spark from the source code available at
> https://github.com/apache/spark/. Although I haven't specified the
> "-Dscala-2.11" option (to build with Scala 2.11), from the build messages I
> see that it ended up using Scala 2.11. Now, for my application sbt, what
> should be the spark version? I tried the following
>
> val spark = "org.apache.spark" %% "spark-core" % "2.0.0-SNAPSHOT"
> val sparksql = "org.apache.spark" % "spark-sql_2.11" % "2.0.0-SNAPSHOT"
>
> and scalaVersion := "2.11.8"
>
> But this setting of spark version gives sbt error
>
> unresolved dependency: org.apache.spark#spark-core_2.11;2.0.0-SNAPSHOT
>
> I guess this is because the repository doesn't contain 2.0.0-SNAPSHOT.
> Does this mean, the only option is to put all the required jars in the lib
> folder (unmanaged dependencies)?
>
> Regards,
> Raghava.
>



-- 
Luciano Resende
http://twitter.com/lresende1975
http://lresende.blogspot.com/


sbt for Spark build with Scala 2.11

2016-05-12 Thread Raghava Mutharaju
Hello All,

I built Spark from the source code available at
https://github.com/apache/spark/. Although I haven't specified the
"-Dscala-2.11" option (to build with Scala 2.11), from the build messages I
see that it ended up using Scala 2.11. Now, for my application sbt, what
should be the spark version? I tried the following

val spark = "org.apache.spark" %% "spark-core" % "2.0.0-SNAPSHOT"
val sparksql = "org.apache.spark" % "spark-sql_2.11" % "2.0.0-SNAPSHOT"

and scalaVersion := "2.11.8"

But this setting of spark version gives sbt error

unresolved dependency: org.apache.spark#spark-core_2.11;2.0.0-SNAPSHOT

I guess this is because the repository doesn't contain 2.0.0-SNAPSHOT. Does
this mean, the only option is to put all the required jars in the lib
folder (unmanaged dependencies)?

Regards,
Raghava.


Re: Spark build failure with com.oracle:ojdbc6:jar:11.2.0.1.0

2016-05-09 Thread Luciano Resende
The manual download is no longer required on the latest trunk code.

On Monday, May 9, 2016, Andrew Lee <alee...@hotmail.com> wrote:

> In fact, it does require ojdbc from Oracle which also requires a username
> and password. This was added as part of the testing scope for
> Oracle's docker.
>
>
> I notice this PR and commit in branch-2.0 according to
> https://issues.apache.org/jira/browse/SPARK-12941.
>
> In the comment, I'm not sure what does it mean by installing the JAR
> locally while Spark QA test run. IF this is the case,
>
> it means someone downloaded the JAR from Oracle and manually added to the
> local build machine that is building Spark branch-2.0 or internal maven
> repository that will serve this ojdbc JAR.
>
>
> 
>
> commit 8afe49141d9b6a603eb3907f32dce802a3d05172
>
> Author: thomastechs <thomas.sebast...@tcs.com>
>
> Date:   Thu Feb 25 22:52:25 2016 -0800
>
>
> [SPARK-12941][SQL][MASTER] Spark-SQL JDBC Oracle dialect fails to map
> string datatypes to Oracle VARCHAR datatype
>
>
>
> ## What changes were proposed in this pull request?
>
>
>
> This Pull request is used for the fix SPARK-12941, creating a data
> type mapping to Oracle for the corresponding data type"Stringtype" from
>
> dataframe. This PR is for the master branch fix, where as another PR is
> already tested with the branch 1.4
>
>
>
> ## How was the this patch tested?
>
>
>
> (Please explain how this patch was tested. E.g. unit tests,
> integration tests, manual tests)
>
> This patch was tested using the Oracle docker .Created a new
> integration suite for the same.The oracle.jdbc jar was to be downloaded
> from the maven repository.Since there was no jdbc jar available in the
> maven repository, the jar was downloaded from oracle site manually and
> installed in the local; thus tested. So, for SparkQA test case run, the
> ojdbc jar might be manually placed in the local maven
> repository(com/oracle/ojdbc6/11.2.0.2.0) while Spark QA test run.
>
>
>
> Author: thomastechs <thomas.sebast...@tcs.com>
>
>
>
> Closes #11306 from thomastechs/master.
> 
>
>
> Meanwhile, I also notice that the ojdbc groupID provided by Oracle
> (official website
> https://blogs.oracle.com/dev2dev/entry/how_to_get_oracle_jdbc)  is
> different.
>
>
> 
>
>   <dependency>
>     <groupId>com.oracle.jdbc</groupId>
>     <artifactId>ojdbc6</artifactId>
>     <version>11.2.0.4</version>
>     <scope>test</scope>
>   </dependency>
>
> 
>
> as opposed to the one in Spark branch-2.0
>
> external/docker-integration-tests/pom.xml
>
>
> 
>
>   <dependency>
>     <groupId>com.oracle</groupId>
>     <artifactId>ojdbc6</artifactId>
>     <version>11.2.0.1.0</version>
>     <scope>test</scope>
>   </dependency>
>
> 
>
>
> The version is out of date and not available from the Oracle Maven repo.
> The PR was created awhile back, so the solution may just cross Oracle's
> maven release blog.
>
>
> Just my inference based on what I see form git and JIRA, however, I do see
> a fix required to patch pom.xml to apply the correct groupId and version #
> for ojdbc6 driver.
>
>
> Thoughts?
>
>
>
> Get Oracle JDBC drivers and UCP from Oracle Maven Repository (without IDEs),
> by Nirmala Sundarappa-Oracle, Feb 15, 2016:
> https://blogs.oracle.com/dev2dev/entry/how_to_get_oracle_jdbc
>
>
>
>
>
>
>
> --
> *From:* Mich Talebzadeh <mich.talebza...@gmail.com>
> *Sent:* Tuesday, May 3, 2016 1:04 AM
> *To:* Luciano Resende
> *Cc:* Hien Luu; ☼ R Nair (रविशंकर नायर); user
> *Subject:* Re: Spark build failure with com.oracle:ojdbc6:jar:11.2.0.1.0
>
> which version of Spark are you using?
>
> Dr Mich Talebzadeh
>
>
>
> LinkedIn: https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>
>
>
> http://talebzadehmich.wordpress.com
>
>
>
> On 3 May 2016 at 02:13, Luciano Resende <luckbr1...@gmail.com> wrote:
>
>> You might have a settings.xml that is forcing your internal Maven
>> repository to be the mirror of external repositories and thus not finding
>> the dependency.
>>
>> On Mon, May 2, 2016 at 6:11 PM, Hien Luu <hien...@gmail.com>

Re: Spark build failure with com.oracle:ojdbc6:jar:11.2.0.1.0

2016-05-09 Thread Andrew Lee
In fact, it does require ojdbc from Oracle, which also requires a username and
password. This was added as part of the testing scope for Oracle's Docker
integration tests.

I noticed this PR and commit in branch-2.0, per
https://issues.apache.org/jira/browse/SPARK-12941.

In the comment, I'm not sure what is meant by installing the JAR locally while
the Spark QA tests run. If this is the case, it means someone downloaded the
JAR from Oracle and manually added it either to the local build machine that
builds Spark branch-2.0 or to an internal Maven repository that serves this
ojdbc JAR.




commit 8afe49141d9b6a603eb3907f32dce802a3d05172

Author: thomastechs <thomas.sebast...@tcs.com>

Date:   Thu Feb 25 22:52:25 2016 -0800


[SPARK-12941][SQL][MASTER] Spark-SQL JDBC Oracle dialect fails to map 
string datatypes to Oracle VARCHAR datatype



## What changes were proposed in this pull request?



This Pull request is used for the fix SPARK-12941, creating a data type 
mapping to Oracle for the corresponding data type"Stringtype" from

dataframe. This PR is for the master branch fix, where as another PR is already 
tested with the branch 1.4



## How was the this patch tested?



(Please explain how this patch was tested. E.g. unit tests, integration 
tests, manual tests)

This patch was tested using the Oracle docker. Created a new integration
suite for the same. The oracle.jdbc jar was to be downloaded from the maven
repository. Since there was no jdbc jar available in the maven repository, the
jar was downloaded from the oracle site manually and installed in the local; thus
tested. So, for the SparkQA test case run, the ojdbc jar might be manually placed
in the local maven repository (com/oracle/ojdbc6/11.2.0.2.0) while the Spark QA
tests run.



Author: thomastechs <thomas.sebast...@tcs.com>



Closes #11306 from thomastechs/master.




Meanwhile, I also notice that the ojdbc groupID provided by Oracle (official 
website https://blogs.oracle.com/dev2dev/entry/how_to_get_oracle_jdbc)  is 
different.




<dependency>
  <groupId>com.oracle.jdbc</groupId>
  <artifactId>ojdbc6</artifactId>
  <version>11.2.0.4</version>
  <scope>test</scope>
</dependency>


as opposed to the one in Spark branch-2.0

external/docker-integration-tests/pom.xml




<dependency>
  <groupId>com.oracle</groupId>
  <artifactId>ojdbc6</artifactId>
  <version>11.2.0.1.0</version>
  <scope>test</scope>
</dependency>



The version is out of date and not available from the Oracle Maven repo. The PR
was created a while back, so it likely just predates the Oracle Maven repository
release announced in that blog.


This is just my inference based on what I see from git and JIRA; however, a fix does
appear to be required to patch pom.xml with the correct groupId and version for the
ojdbc6 driver.
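
If the coordinates were updated as suggested, the build would presumably also need the
Oracle Maven repository declared somewhere in the POM hierarchy. A rough sketch only
(per the blog above, that repository additionally requires OTN registration and
credentials configured in settings.xml, which is not shown here):

<repository>
  <id>maven.oracle.com</id>
  <url>https://maven.oracle.com</url>
</repository>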


Thoughts?



Get Oracle JDBC drivers and UCP from Oracle Maven Repository (without IDEs), by
Nirmala Sundarappa-Oracle, Feb 15, 2016:
https://blogs.oracle.com/dev2dev/entry/how_to_get_oracle_jdbc


From: Mich Talebzadeh <mich.talebza...@gmail.com>
Sent: Tuesday, May 3, 2016 1:04 AM
To: Luciano Resende
Cc: Hien Luu; ☼ R Nair (रविशंकर नायर); user
Subject: Re: Spark build failure with com.oracle:ojdbc6:jar:11.2.0.1.0

which version of Spark are using?


Dr Mich Talebzadeh



LinkedIn  
https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com



On 3 May 2016 at 02:13, Luciano Resende <luckbr1...@gmail.com> wrote:
You might have a settings.xml that is forcing your internal Maven repository to 
be the mirror of external repositories and thus not finding the dependency.

On Mon, May 2, 2016 at 6:11 PM, Hien Luu <hien...@gmail.com> wrote:
Not I am not.  I am considering downloading it manually and place it in my 
local repository.

On Mon, May 2, 2016 at 5:54 PM, ☼ R Nair (रविशंकर नायर) <ravishankar.n...@gmail.com> wrote:

Oracle jdbc is not part of Maven repository,  are you keeping a downloaded file 
in your local repo?

Best, RS

On May 2, 2016 8:51 PM, "Hien Luu" <hien...@gmail.com> wrote:
Hi all,

I am running into a build problem with com.oracle:ojdbc6:jar:11.2.0.1.0.  It 
kept getting "Operation timed out" while building Spark Project Docker 
Integration Tests module (see the error below).

Has anyone run this problem before? If so, how did you resolve around this 
problem?

[INFO] Reactor Summary:

[INFO]

[INFO] Spark Project Parent POM ... SUCCESS [  2.423 s]

[INFO] Spark Project Test Tags  SUCCESS [  0.712 s]

[INFO] Spark Project Sketch ... SUCCESS [  0.498 s]

[INFO] Spark Project Networking ... SUCCESS [  1.743 s]

[INFO] Spark Project Shuffle Streaming Service 

Re: Spark build failure with com.oracle:ojdbc6:jar:11.2.0.1.0

2016-05-03 Thread Mich Talebzadeh
which version of Spark are you using?

Dr Mich Talebzadeh



LinkedIn: https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com



On 3 May 2016 at 02:13, Luciano Resende  wrote:

> You might have a settings.xml that is forcing your internal Maven
> repository to be the mirror of external repositories and thus not finding
> the dependency.
>
> On Mon, May 2, 2016 at 6:11 PM, Hien Luu  wrote:
>
>> Not I am not.  I am considering downloading it manually and place it in
>> my local repository.
>>
>> On Mon, May 2, 2016 at 5:54 PM, ☼ R Nair (रविशंकर नायर) <
>> ravishankar.n...@gmail.com> wrote:
>>
>>> Oracle jdbc is not part of Maven repository,  are you keeping a
>>> downloaded file in your local repo?
>>>
>>> Best, RS
>>> On May 2, 2016 8:51 PM, "Hien Luu"  wrote:
>>>
 Hi all,

 I am running into a build problem with
 com.oracle:ojdbc6:jar:11.2.0.1.0.  It kept getting "Operation timed out"
 while building Spark Project Docker Integration Tests module (see the error
 below).

 Has anyone run this problem before? If so, how did you resolve around
 this problem?

 [INFO] Reactor Summary:

 [INFO]

 [INFO] Spark Project Parent POM ... SUCCESS [
 2.423 s]

 [INFO] Spark Project Test Tags  SUCCESS [
 0.712 s]

 [INFO] Spark Project Sketch ... SUCCESS [
 0.498 s]

 [INFO] Spark Project Networking ... SUCCESS [
 1.743 s]

 [INFO] Spark Project Shuffle Streaming Service  SUCCESS [
 0.587 s]

 [INFO] Spark Project Unsafe ... SUCCESS [
 0.503 s]

 [INFO] Spark Project Launcher . SUCCESS [
 4.894 s]

 [INFO] Spark Project Core . SUCCESS [
 17.953 s]

 [INFO] Spark Project GraphX ... SUCCESS [
 3.480 s]

 [INFO] Spark Project Streaming  SUCCESS [
 6.022 s]

 [INFO] Spark Project Catalyst . SUCCESS [
 8.664 s]

 [INFO] Spark Project SQL .. SUCCESS [
 12.440 s]

 [INFO] Spark Project ML Local Library . SUCCESS [
 0.498 s]

 [INFO] Spark Project ML Library ... SUCCESS [
 8.594 s]

 [INFO] Spark Project Tools  SUCCESS [
 0.162 s]

 [INFO] Spark Project Hive . SUCCESS [
 9.834 s]

 [INFO] Spark Project HiveContext Compatibility  SUCCESS [
 1.428 s]

 [INFO] Spark Project Docker Integration Tests . FAILURE
 [02:32 min]

 [INFO] Spark Project REPL . SKIPPED

 [INFO] Spark Project Assembly . SKIPPED

 [INFO] Spark Project External Flume Sink .. SKIPPED

 [INFO] Spark Project External Flume ... SKIPPED

 [INFO] Spark Project External Flume Assembly .. SKIPPED

 [INFO] Spark Project External Kafka ... SKIPPED

 [INFO] Spark Project Examples . SKIPPED

 [INFO] Spark Project External Kafka Assembly .. SKIPPED

 [INFO] Spark Project Java 8 Tests . SKIPPED

 [INFO]
 

 [INFO] BUILD FAILURE

 [INFO]
 

 [INFO] Total time: 03:53 min

 [INFO] Finished at: 2016-05-02T17:44:57-07:00

 [INFO] Final Memory: 80M/1525M

 [INFO]
 

 [ERROR] Failed to execute goal on project
 spark-docker-integration-tests_2.11: Could not resolve dependencies for
 project
 org.apache.spark:spark-docker-integration-tests_2.11:jar:2.0.0-SNAPSHOT:
 Failed to collect dependencies at com.oracle:ojdbc6:jar:11.2.0.1.0: Failed
 to read artifact descriptor for com.oracle:ojdbc6:jar:11.2.0.1.0: Could not
 transfer artifact com.oracle:ojdbc6:pom:11.2.0.1.0 from/to uber-artifactory
 (http://artifactory.uber.internal:4587/artifactory/repo/): Connect to
 artifactory.uber.internal:4587 [artifactory.uber.internal/10.162.11.61]
 failed: Operation timed out -> [Help 1]


>>
>>
>> --
>> Regards,
>>
>
>
>
> --
> Luciano Resende
> http://twitter.com/lresende1975
> 

Re: Spark build failure with com.oracle:ojdbc6:jar:11.2.0.1.0

2016-05-02 Thread Luciano Resende
You might have a settings.xml that is forcing your internal Maven
repository to be the mirror of external repositories and thus not finding
the dependency.
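
An illustrative settings.xml fragment of the kind meant here (the id and URL are made
up); a mirrorOf value of * routes every request through the internal repository, so an
artifact it does not host, such as com.oracle:ojdbc6, can never be resolved:

<mirrors>
  <mirror>
    <id>internal-repo</id>
    <mirrorOf>*</mirrorOf>
    <url>http://artifactory.example.internal/artifactory/repo/</url>
  </mirror>
</mirrors>

Either narrow the mirror or make sure the internal repository proxies (or hosts) the
missing artifact.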

On Mon, May 2, 2016 at 6:11 PM, Hien Luu  wrote:

> Not I am not.  I am considering downloading it manually and place it in my
> local repository.
>
> On Mon, May 2, 2016 at 5:54 PM, ☼ R Nair (रविशंकर नायर) <
> ravishankar.n...@gmail.com> wrote:
>
>> Oracle jdbc is not part of Maven repository,  are you keeping a
>> downloaded file in your local repo?
>>
>> Best, RS
>> On May 2, 2016 8:51 PM, "Hien Luu"  wrote:
>>
>>> Hi all,
>>>
>>> I am running into a build problem with
>>> com.oracle:ojdbc6:jar:11.2.0.1.0.  It kept getting "Operation timed out"
>>> while building Spark Project Docker Integration Tests module (see the error
>>> below).
>>>
>>> Has anyone run this problem before? If so, how did you resolve around
>>> this problem?
>>>
>>> [INFO] Reactor Summary:
>>>
>>> [INFO]
>>>
>>> [INFO] Spark Project Parent POM ... SUCCESS [
>>> 2.423 s]
>>>
>>> [INFO] Spark Project Test Tags  SUCCESS [
>>> 0.712 s]
>>>
>>> [INFO] Spark Project Sketch ... SUCCESS [
>>> 0.498 s]
>>>
>>> [INFO] Spark Project Networking ... SUCCESS [
>>> 1.743 s]
>>>
>>> [INFO] Spark Project Shuffle Streaming Service  SUCCESS [
>>> 0.587 s]
>>>
>>> [INFO] Spark Project Unsafe ... SUCCESS [
>>> 0.503 s]
>>>
>>> [INFO] Spark Project Launcher . SUCCESS [
>>> 4.894 s]
>>>
>>> [INFO] Spark Project Core . SUCCESS [
>>> 17.953 s]
>>>
>>> [INFO] Spark Project GraphX ... SUCCESS [
>>> 3.480 s]
>>>
>>> [INFO] Spark Project Streaming  SUCCESS [
>>> 6.022 s]
>>>
>>> [INFO] Spark Project Catalyst . SUCCESS [
>>> 8.664 s]
>>>
>>> [INFO] Spark Project SQL .. SUCCESS [
>>> 12.440 s]
>>>
>>> [INFO] Spark Project ML Local Library . SUCCESS [
>>> 0.498 s]
>>>
>>> [INFO] Spark Project ML Library ... SUCCESS [
>>> 8.594 s]
>>>
>>> [INFO] Spark Project Tools  SUCCESS [
>>> 0.162 s]
>>>
>>> [INFO] Spark Project Hive . SUCCESS [
>>> 9.834 s]
>>>
>>> [INFO] Spark Project HiveContext Compatibility  SUCCESS [
>>> 1.428 s]
>>>
>>> [INFO] Spark Project Docker Integration Tests . FAILURE
>>> [02:32 min]
>>>
>>> [INFO] Spark Project REPL . SKIPPED
>>>
>>> [INFO] Spark Project Assembly . SKIPPED
>>>
>>> [INFO] Spark Project External Flume Sink .. SKIPPED
>>>
>>> [INFO] Spark Project External Flume ... SKIPPED
>>>
>>> [INFO] Spark Project External Flume Assembly .. SKIPPED
>>>
>>> [INFO] Spark Project External Kafka ... SKIPPED
>>>
>>> [INFO] Spark Project Examples . SKIPPED
>>>
>>> [INFO] Spark Project External Kafka Assembly .. SKIPPED
>>>
>>> [INFO] Spark Project Java 8 Tests . SKIPPED
>>>
>>> [INFO]
>>> 
>>>
>>> [INFO] BUILD FAILURE
>>>
>>> [INFO]
>>> 
>>>
>>> [INFO] Total time: 03:53 min
>>>
>>> [INFO] Finished at: 2016-05-02T17:44:57-07:00
>>>
>>> [INFO] Final Memory: 80M/1525M
>>>
>>> [INFO]
>>> 
>>>
>>> [ERROR] Failed to execute goal on project
>>> spark-docker-integration-tests_2.11: Could not resolve dependencies for
>>> project
>>> org.apache.spark:spark-docker-integration-tests_2.11:jar:2.0.0-SNAPSHOT:
>>> Failed to collect dependencies at com.oracle:ojdbc6:jar:11.2.0.1.0: Failed
>>> to read artifact descriptor for com.oracle:ojdbc6:jar:11.2.0.1.0: Could not
>>> transfer artifact com.oracle:ojdbc6:pom:11.2.0.1.0 from/to uber-artifactory
>>> (http://artifactory.uber.internal:4587/artifactory/repo/): Connect to
>>> artifactory.uber.internal:4587 [artifactory.uber.internal/10.162.11.61]
>>> failed: Operation timed out -> [Help 1]
>>>
>>>
>
>
> --
> Regards,
>



-- 
Luciano Resende
http://twitter.com/lresende1975
http://lresende.blogspot.com/


Re: Spark build failure with com.oracle:ojdbc6:jar:11.2.0.1.0

2016-05-02 Thread Hien Luu
No, I am not.  I am considering downloading it manually and placing it in my
local repository.

On Mon, May 2, 2016 at 5:54 PM, ☼ R Nair (रविशंकर नायर) <
ravishankar.n...@gmail.com> wrote:

> Oracle jdbc is not part of Maven repository,  are you keeping a downloaded
> file in your local repo?
>
> Best, RS
> On May 2, 2016 8:51 PM, "Hien Luu"  wrote:
>
>> Hi all,
>>
>> I am running into a build problem with com.oracle:ojdbc6:jar:11.2.0.1.0.
>> It kept getting "Operation timed out" while building Spark Project Docker
>> Integration Tests module (see the error below).
>>
>> Has anyone run this problem before? If so, how did you resolve around
>> this problem?
>>
>> [INFO] Reactor Summary:
>>
>> [INFO]
>>
>> [INFO] Spark Project Parent POM ... SUCCESS [
>> 2.423 s]
>>
>> [INFO] Spark Project Test Tags  SUCCESS [
>> 0.712 s]
>>
>> [INFO] Spark Project Sketch ... SUCCESS [
>> 0.498 s]
>>
>> [INFO] Spark Project Networking ... SUCCESS [
>> 1.743 s]
>>
>> [INFO] Spark Project Shuffle Streaming Service  SUCCESS [
>> 0.587 s]
>>
>> [INFO] Spark Project Unsafe ... SUCCESS [
>> 0.503 s]
>>
>> [INFO] Spark Project Launcher . SUCCESS [
>> 4.894 s]
>>
>> [INFO] Spark Project Core . SUCCESS [
>> 17.953 s]
>>
>> [INFO] Spark Project GraphX ... SUCCESS [
>> 3.480 s]
>>
>> [INFO] Spark Project Streaming  SUCCESS [
>> 6.022 s]
>>
>> [INFO] Spark Project Catalyst . SUCCESS [
>> 8.664 s]
>>
>> [INFO] Spark Project SQL .. SUCCESS [
>> 12.440 s]
>>
>> [INFO] Spark Project ML Local Library . SUCCESS [
>> 0.498 s]
>>
>> [INFO] Spark Project ML Library ... SUCCESS [
>> 8.594 s]
>>
>> [INFO] Spark Project Tools  SUCCESS [
>> 0.162 s]
>>
>> [INFO] Spark Project Hive . SUCCESS [
>> 9.834 s]
>>
>> [INFO] Spark Project HiveContext Compatibility  SUCCESS [
>> 1.428 s]
>>
>> [INFO] Spark Project Docker Integration Tests . FAILURE
>> [02:32 min]
>>
>> [INFO] Spark Project REPL . SKIPPED
>>
>> [INFO] Spark Project Assembly . SKIPPED
>>
>> [INFO] Spark Project External Flume Sink .. SKIPPED
>>
>> [INFO] Spark Project External Flume ... SKIPPED
>>
>> [INFO] Spark Project External Flume Assembly .. SKIPPED
>>
>> [INFO] Spark Project External Kafka ... SKIPPED
>>
>> [INFO] Spark Project Examples . SKIPPED
>>
>> [INFO] Spark Project External Kafka Assembly .. SKIPPED
>>
>> [INFO] Spark Project Java 8 Tests . SKIPPED
>>
>> [INFO]
>> 
>>
>> [INFO] BUILD FAILURE
>>
>> [INFO]
>> 
>>
>> [INFO] Total time: 03:53 min
>>
>> [INFO] Finished at: 2016-05-02T17:44:57-07:00
>>
>> [INFO] Final Memory: 80M/1525M
>>
>> [INFO]
>> 
>>
>> [ERROR] Failed to execute goal on project
>> spark-docker-integration-tests_2.11: Could not resolve dependencies for
>> project
>> org.apache.spark:spark-docker-integration-tests_2.11:jar:2.0.0-SNAPSHOT:
>> Failed to collect dependencies at com.oracle:ojdbc6:jar:11.2.0.1.0: Failed
>> to read artifact descriptor for com.oracle:ojdbc6:jar:11.2.0.1.0: Could not
>> transfer artifact com.oracle:ojdbc6:pom:11.2.0.1.0 from/to uber-artifactory
>> (http://artifactory.uber.internal:4587/artifactory/repo/): Connect to
>> artifactory.uber.internal:4587 [artifactory.uber.internal/10.162.11.61]
>> failed: Operation timed out -> [Help 1]
>>
>>


-- 
Regards,


Re: Spark build failure with com.oracle:ojdbc6:jar:11.2.0.1.0

2016-05-02 Thread Ted Yu
From the output of dependency:tree on the master branch:

[INFO]

[INFO] Building Spark Project Docker Integration Tests 2.0.0-SNAPSHOT
[INFO]

[WARNING] The POM for com.oracle:ojdbc6:jar:11.2.0.1.0 is missing, no
dependency information available
[INFO]
...
[INFO] +- com.oracle:ojdbc6:jar:11.2.0.1.0:test

Are you building behind a proxy?
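
If so, Maven usually needs a proxy entry in ~/.m2/settings.xml along these lines (host,
port and exclusions are illustrative):

<proxies>
  <proxy>
    <id>corporate-proxy</id>
    <active>true</active>
    <protocol>http</protocol>
    <host>proxy.example.com</host>
    <port>8080</port>
    <nonProxyHosts>localhost|*.internal</nonProxyHosts>
  </proxy>
</proxies>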

On Mon, May 2, 2016 at 5:51 PM, Hien Luu  wrote:

> Hi all,
>
> I am running into a build problem with com.oracle:ojdbc6:jar:11.2.0.1.0.
> It kept getting "Operation timed out" while building Spark Project Docker
> Integration Tests module (see the error below).
>
> Has anyone run this problem before? If so, how did you resolve around this
> problem?
>
> [INFO] Reactor Summary:
>
> [INFO]
>
> [INFO] Spark Project Parent POM ... SUCCESS [
> 2.423 s]
>
> [INFO] Spark Project Test Tags  SUCCESS [
> 0.712 s]
>
> [INFO] Spark Project Sketch ... SUCCESS [
> 0.498 s]
>
> [INFO] Spark Project Networking ... SUCCESS [
> 1.743 s]
>
> [INFO] Spark Project Shuffle Streaming Service  SUCCESS [
> 0.587 s]
>
> [INFO] Spark Project Unsafe ... SUCCESS [
> 0.503 s]
>
> [INFO] Spark Project Launcher . SUCCESS [
> 4.894 s]
>
> [INFO] Spark Project Core . SUCCESS [
> 17.953 s]
>
> [INFO] Spark Project GraphX ... SUCCESS [
> 3.480 s]
>
> [INFO] Spark Project Streaming  SUCCESS [
> 6.022 s]
>
> [INFO] Spark Project Catalyst . SUCCESS [
> 8.664 s]
>
> [INFO] Spark Project SQL .. SUCCESS [
> 12.440 s]
>
> [INFO] Spark Project ML Local Library . SUCCESS [
> 0.498 s]
>
> [INFO] Spark Project ML Library ... SUCCESS [
> 8.594 s]
>
> [INFO] Spark Project Tools  SUCCESS [
> 0.162 s]
>
> [INFO] Spark Project Hive . SUCCESS [
> 9.834 s]
>
> [INFO] Spark Project HiveContext Compatibility  SUCCESS [
> 1.428 s]
>
> [INFO] Spark Project Docker Integration Tests . FAILURE [02:32
> min]
>
> [INFO] Spark Project REPL . SKIPPED
>
> [INFO] Spark Project Assembly . SKIPPED
>
> [INFO] Spark Project External Flume Sink .. SKIPPED
>
> [INFO] Spark Project External Flume ... SKIPPED
>
> [INFO] Spark Project External Flume Assembly .. SKIPPED
>
> [INFO] Spark Project External Kafka ... SKIPPED
>
> [INFO] Spark Project Examples . SKIPPED
>
> [INFO] Spark Project External Kafka Assembly .. SKIPPED
>
> [INFO] Spark Project Java 8 Tests . SKIPPED
>
> [INFO]
> 
>
> [INFO] BUILD FAILURE
>
> [INFO]
> 
>
> [INFO] Total time: 03:53 min
>
> [INFO] Finished at: 2016-05-02T17:44:57-07:00
>
> [INFO] Final Memory: 80M/1525M
>
> [INFO]
> 
>
> [ERROR] Failed to execute goal on project
> spark-docker-integration-tests_2.11: Could not resolve dependencies for
> project
> org.apache.spark:spark-docker-integration-tests_2.11:jar:2.0.0-SNAPSHOT:
> Failed to collect dependencies at com.oracle:ojdbc6:jar:11.2.0.1.0: Failed
> to read artifact descriptor for com.oracle:ojdbc6:jar:11.2.0.1.0: Could not
> transfer artifact com.oracle:ojdbc6:pom:11.2.0.1.0 from/to uber-artifactory
> (http://artifactory.uber.internal:4587/artifactory/repo/): Connect to
> artifactory.uber.internal:4587 [artifactory.uber.internal/10.162.11.61]
> failed: Operation timed out -> [Help 1]
>
>


Re: Spark build failure with com.oracle:ojdbc6:jar:11.2.0.1.0

2016-05-02 Thread रविशंकर नायर
The Oracle JDBC driver is not in the public Maven repository; are you keeping a
downloaded copy in your local repo?
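
For reference, a manually installed driver that satisfies the branch-2.0 coordinates
would show up in the local repository roughly like this (standard Maven layout; paths
shown for illustration only):

ls ~/.m2/repository/com/oracle/ojdbc6/11.2.0.1.0/
# ojdbc6-11.2.0.1.0.jar  ojdbc6-11.2.0.1.0.pom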

Best, RS
On May 2, 2016 8:51 PM, "Hien Luu"  wrote:

> Hi all,
>
> I am running into a build problem with com.oracle:ojdbc6:jar:11.2.0.1.0.
> It kept getting "Operation timed out" while building Spark Project Docker
> Integration Tests module (see the error below).
>
> Has anyone run this problem before? If so, how did you resolve around this
> problem?
>
> [INFO] Reactor Summary:
>
> [INFO]
>
> [INFO] Spark Project Parent POM ... SUCCESS [
> 2.423 s]
>
> [INFO] Spark Project Test Tags  SUCCESS [
> 0.712 s]
>
> [INFO] Spark Project Sketch ... SUCCESS [
> 0.498 s]
>
> [INFO] Spark Project Networking ... SUCCESS [
> 1.743 s]
>
> [INFO] Spark Project Shuffle Streaming Service  SUCCESS [
> 0.587 s]
>
> [INFO] Spark Project Unsafe ... SUCCESS [
> 0.503 s]
>
> [INFO] Spark Project Launcher . SUCCESS [
> 4.894 s]
>
> [INFO] Spark Project Core . SUCCESS [
> 17.953 s]
>
> [INFO] Spark Project GraphX ... SUCCESS [
> 3.480 s]
>
> [INFO] Spark Project Streaming  SUCCESS [
> 6.022 s]
>
> [INFO] Spark Project Catalyst . SUCCESS [
> 8.664 s]
>
> [INFO] Spark Project SQL .. SUCCESS [
> 12.440 s]
>
> [INFO] Spark Project ML Local Library . SUCCESS [
> 0.498 s]
>
> [INFO] Spark Project ML Library ... SUCCESS [
> 8.594 s]
>
> [INFO] Spark Project Tools  SUCCESS [
> 0.162 s]
>
> [INFO] Spark Project Hive . SUCCESS [
> 9.834 s]
>
> [INFO] Spark Project HiveContext Compatibility  SUCCESS [
> 1.428 s]
>
> [INFO] Spark Project Docker Integration Tests . FAILURE [02:32
> min]
>
> [INFO] Spark Project REPL . SKIPPED
>
> [INFO] Spark Project Assembly . SKIPPED
>
> [INFO] Spark Project External Flume Sink .. SKIPPED
>
> [INFO] Spark Project External Flume ... SKIPPED
>
> [INFO] Spark Project External Flume Assembly .. SKIPPED
>
> [INFO] Spark Project External Kafka ... SKIPPED
>
> [INFO] Spark Project Examples . SKIPPED
>
> [INFO] Spark Project External Kafka Assembly .. SKIPPED
>
> [INFO] Spark Project Java 8 Tests . SKIPPED
>
> [INFO]
> 
>
> [INFO] BUILD FAILURE
>
> [INFO]
> 
>
> [INFO] Total time: 03:53 min
>
> [INFO] Finished at: 2016-05-02T17:44:57-07:00
>
> [INFO] Final Memory: 80M/1525M
>
> [INFO]
> 
>
> [ERROR] Failed to execute goal on project
> spark-docker-integration-tests_2.11: Could not resolve dependencies for
> project
> org.apache.spark:spark-docker-integration-tests_2.11:jar:2.0.0-SNAPSHOT:
> Failed to collect dependencies at com.oracle:ojdbc6:jar:11.2.0.1.0: Failed
> to read artifact descriptor for com.oracle:ojdbc6:jar:11.2.0.1.0: Could not
> transfer artifact com.oracle:ojdbc6:pom:11.2.0.1.0 from/to uber-artifactory
> (http://artifactory.uber.internal:4587/artifactory/repo/): Connect to
> artifactory.uber.internal:4587 [artifactory.uber.internal/10.162.11.61]
> failed: Operation timed out -> [Help 1]
>
>


Spark build failure with com.oracle:ojdbc6:jar:11.2.0.1.0

2016-05-02 Thread Hien Luu
Hi all,

I am running into a build problem with com.oracle:ojdbc6:jar:11.2.0.1.0.
It keeps getting "Operation timed out" while building the Spark Project Docker
Integration Tests module (see the error below).

Has anyone run into this problem before? If so, how did you work around it?

[INFO] Reactor Summary:

[INFO]

[INFO] Spark Project Parent POM ... SUCCESS [
2.423 s]

[INFO] Spark Project Test Tags  SUCCESS [
0.712 s]

[INFO] Spark Project Sketch ... SUCCESS [
0.498 s]

[INFO] Spark Project Networking ... SUCCESS [
1.743 s]

[INFO] Spark Project Shuffle Streaming Service  SUCCESS [
0.587 s]

[INFO] Spark Project Unsafe ... SUCCESS [
0.503 s]

[INFO] Spark Project Launcher . SUCCESS [
4.894 s]

[INFO] Spark Project Core . SUCCESS [
17.953 s]

[INFO] Spark Project GraphX ... SUCCESS [
3.480 s]

[INFO] Spark Project Streaming  SUCCESS [
6.022 s]

[INFO] Spark Project Catalyst . SUCCESS [
8.664 s]

[INFO] Spark Project SQL .. SUCCESS [
12.440 s]

[INFO] Spark Project ML Local Library . SUCCESS [
0.498 s]

[INFO] Spark Project ML Library ... SUCCESS [
8.594 s]

[INFO] Spark Project Tools  SUCCESS [
0.162 s]

[INFO] Spark Project Hive . SUCCESS [
9.834 s]

[INFO] Spark Project HiveContext Compatibility  SUCCESS [
1.428 s]

[INFO] Spark Project Docker Integration Tests . FAILURE [02:32
min]

[INFO] Spark Project REPL . SKIPPED

[INFO] Spark Project Assembly . SKIPPED

[INFO] Spark Project External Flume Sink .. SKIPPED

[INFO] Spark Project External Flume ... SKIPPED

[INFO] Spark Project External Flume Assembly .. SKIPPED

[INFO] Spark Project External Kafka ... SKIPPED

[INFO] Spark Project Examples . SKIPPED

[INFO] Spark Project External Kafka Assembly .. SKIPPED

[INFO] Spark Project Java 8 Tests . SKIPPED

[INFO]


[INFO] BUILD FAILURE

[INFO]


[INFO] Total time: 03:53 min

[INFO] Finished at: 2016-05-02T17:44:57-07:00

[INFO] Final Memory: 80M/1525M

[INFO]


[ERROR] Failed to execute goal on project
spark-docker-integration-tests_2.11: Could not resolve dependencies for
project
org.apache.spark:spark-docker-integration-tests_2.11:jar:2.0.0-SNAPSHOT:
Failed to collect dependencies at com.oracle:ojdbc6:jar:11.2.0.1.0: Failed
to read artifact descriptor for com.oracle:ojdbc6:jar:11.2.0.1.0: Could not
transfer artifact com.oracle:ojdbc6:pom:11.2.0.1.0 from/to uber-artifactory
(http://artifactory.uber.internal:4587/artifactory/repo/): Connect to
artifactory.uber.internal:4587 [artifactory.uber.internal/10.162.11.61]
failed: Operation timed out -> [Help 1]


Re: [spark] build/sbt gen-idea error

2016-04-12 Thread Sean Owen
We just removed the gen-idea plugin.
Just import the Maven project into IDEA or Eclipse.

On Tue, Apr 12, 2016 at 4:52 PM, ImMr.K <875061...@qq.com> wrote:
> But how to import spark repo into idea or eclipse?
>
>
>
> -- Original message --
> From: Ted Yu 
> Sent: 2016-04-12 23:38
> To: ImMr.K <875061...@qq.com>
> Cc: user 
> Subject: Re: build/sbt gen-idea error
>
> gen-idea doesn't seem to be a valid command:
>
> [warn] Ignoring load failure: no project loaded.
> [error] Not a valid command: gen-idea
> [error] gen-idea
>

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: [spark] build/sbt gen-idea error

2016-04-12 Thread ImMr.K
But how to import spark repo into idea or eclipse?




-- Original message --
From: Ted Yu <yuzhih...@gmail.com>
Sent: 2016-04-12 23:38
To: ImMr.K <875061...@qq.com>
Cc: user <user@spark.apache.org>
Subject: Re: build/sbt gen-idea error



gen-idea doesn't seem to be a valid command:
[warn] Ignoring load failure: no project loaded.
[error] Not a valid command: gen-idea
[error] gen-idea




On Tue, Apr 12, 2016 at 8:28 AM, ImMr.K <875061...@qq.com> wrote:
Hi,
I have cloned spark and ,
cd spark
build/sbt gen-idea


got the following output:




Using /usr/java/jre1.7.0_09 as default JAVA_HOME.
Note, this will be overridden by -java-home if it is set.
[info] Loading project definition from /home/king/github/spark/project/project
[info] Loading project definition from 
/home/king/.sbt/0.13/staging/ad8e8574a5bcb2d22d23/sbt-pom-reader/project
[warn] Multiple resolvers having different access mechanism configured with 
same name 'sbt-plugin-releases'. To avoid conflict, Remove duplicate project 
resolvers (`resolvers`) or rename publishing resolver (`publishTo`).
[info] Loading project definition from /home/king/github/spark/project
org.apache.maven.model.building.ModelBuildingException: 1 problem was 
encountered while building the effective model for 
org.apache.spark:spark-parent_2.11:2.0.0-SNAPSHOT
[FATAL] Non-resolvable parent POM: Could not transfer artifact 
org.apache:apache:pom:14 from/to central ( 
http://repo.maven.apache.org/maven2): Error transferring file: Connection timed 
out from  
http://repo.maven.apache.org/maven2/org/apache/apache/14/apache-14.pom and 
'parent.relativePath' points at wrong local POM @ line 22, column 11


at 
org.apache.maven.model.building.DefaultModelProblemCollector.newModelBuildingException(DefaultModelProblemCollector.java:195)
at 
org.apache.maven.model.building.DefaultModelBuilder.readParentExternally(DefaultModelBuilder.java:841)
at 
org.apache.maven.model.building.DefaultModelBuilder.readParent(DefaultModelBuilder.java:664)
at 
org.apache.maven.model.building.DefaultModelBuilder.build(DefaultModelBuilder.java:310)
at 
org.apache.maven.model.building.DefaultModelBuilder.build(DefaultModelBuilder.java:232)
at 
com.typesafe.sbt.pom.MvnPomResolver.loadEffectivePom(MavenPomResolver.scala:61)
at com.typesafe.sbt.pom.package$.loadEffectivePom(package.scala:41)
at 
com.typesafe.sbt.pom.MavenProjectHelper$.makeProjectTree(MavenProjectHelper.scala:128)
at 
com.typesafe.sbt.pom.MavenProjectHelper$.makeReactorProject(MavenProjectHelper.scala:49)
at com.typesafe.sbt.pom.PomBuild$class.projectDefinitions(PomBuild.scala:28)
at SparkBuild$.projectDefinitions(SparkBuild.scala:347)
at sbt.Load$.sbt$Load$$projectsFromBuild(Load.scala:506)
at sbt.Load$$anonfun$27.apply(Load.scala:446)
at sbt.Load$$anonfun$27.apply(Load.scala:446)
at scala.collection.immutable.Stream.flatMap(Stream.scala:442)
at sbt.Load$.loadUnit(Load.scala:446)
at sbt.Load$$anonfun$18$$anonfun$apply$11.apply(Load.scala:291)
at sbt.Load$$anonfun$18$$anonfun$apply$11.apply(Load.scala:291)
at 
sbt.BuildLoader$$anonfun$componentLoader$1$$anonfun$apply$4$$anonfun$apply$5$$anonfun$apply$6.apply(BuildLoader.scala:91)
at 
sbt.BuildLoader$$anonfun$componentLoader$1$$anonfun$apply$4$$anonfun$apply$5$$anonfun$apply$6.apply(BuildLoader.scala:90)
at sbt.BuildLoader.apply(BuildLoader.scala:140)
at sbt.Load$.loadAll(Load.scala:344)
at sbt.Load$.loadURI(Load.scala:299)
at sbt.Load$.load(Load.scala:295)
at sbt.Load$.load(Load.scala:286)
at sbt.Load$.apply(Load.scala:140)
at sbt.Load$.defaultLoad(Load.scala:36)
at sbt.BuiltinCommands$.liftedTree1$1(Main.scala:492)
at sbt.BuiltinCommands$.doLoadProject(Main.scala:492)
at sbt.BuiltinCommands$$anonfun$loadProjectImpl$2.apply(Main.scala:484)
at sbt.BuiltinCommands$$anonfun$loadProjectImpl$2.apply(Main.scala:484)
at sbt.Command$$anonfun$applyEffect$1$$anonfun$apply$2.apply(Command.scala:59)
at sbt.Command$$anonfun$applyEffect$1$$anonfun$apply$2.apply(Command.scala:59)
at sbt.Command$$anonfun$applyEffect$2$$anonfun$apply$3.apply(Command.scala:61)
at sbt.Command$$anonfun$applyEffect$2$$anonfun$apply$3.apply(Command.scala:61)
at sbt.Command$.process(Command.scala:93)
at sbt.MainLoop$$anonfun$1$$anonfun$apply$1.apply(MainLoop.scala:96)
at sbt.MainLoop$$anonfun$1$$anonfun$apply$1.apply(MainLoop.scala:96)
at sbt.State$$anon$1.process(State.scala:184)
at sbt.MainLoop$$anonfun$1.apply(MainLoop.scala:96)
at sbt.MainLoop$$anonfun$1.apply(MainLoop.scala:96)
at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
at sbt.MainLoop$.next(MainLoop.scala:96)
at sbt.MainLoop$.run(MainLoop.scala:89)
at sbt.MainLoop$$anonfun$runWithNewLog$1.apply(MainLoop.scala:68)
at sbt.MainLoop$$anonfun$runWithNewLog$1.apply(MainLoop.scala:63)
at sbt.Using.apply(Using.scala:24)
at sbt.MainLoop$.runWithNewLog(MainLoop.scala:63)
at sbt.MainLoop$.runAndClearLast(MainLoop.scala:46)
at sbt.MainLoop$.runLoggedLoop(MainLoop.scala:30)
at sbt.MainLoop$.run

Re: Re:[spark] build/sbt gen-idea error

2016-04-12 Thread Marco Mistroni
Have you tried the sbt-eclipse plugin? Then you can run sbt eclipse and have your
Spark project directly in Eclipse.
Please Google it and you should be able to find your way.
If not, ping me and I'll send you the plugin (I'm replying from my phone).
Hth
On 12 Apr 2016 4:53 pm, "ImMr.K" <875061...@qq.com> wrote:

But how to import spark repo into idea or eclipse?



-- Original message --
*From:* Ted Yu <yuzhih...@gmail.com>
*Sent:* 2016-04-12 23:38
*To:* ImMr.K <875061...@qq.com>
*Cc:* user <user@spark.apache.org>
*Subject:* Re: build/sbt gen-idea error

gen-idea doesn't seem to be a valid command:

[warn] Ignoring load failure: no project loaded.
[error] Not a valid command: gen-idea
[error] gen-idea

On Tue, Apr 12, 2016 at 8:28 AM, ImMr.K <875061...@qq.com> wrote:

> Hi,
> I have cloned spark and ,
> cd spark
> build/sbt gen-idea
>
> got the following output:
>
>
> Using /usr/java/jre1.7.0_09 as default JAVA_HOME.
> Note, this will be overridden by -java-home if it is set.
> [info] Loading project definition from
> /home/king/github/spark/project/project
> [info] Loading project definition from
> /home/king/.sbt/0.13/staging/ad8e8574a5bcb2d22d23/sbt-pom-reader/project
> [warn] Multiple resolvers having different access mechanism configured
> with same name 'sbt-plugin-releases'. To avoid conflict, Remove duplicate
> project resolvers (`resolvers`) or rename publishing resolver (`publishTo`).
> [info] Loading project definition from /home/king/github/spark/project
> org.apache.maven.model.building.ModelBuildingException: 1 problem was
> encountered while building the effective model for
> org.apache.spark:spark-parent_2.11:2.0.0-SNAPSHOT
> [FATAL] Non-resolvable parent POM: Could not transfer artifact
> org.apache:apache:pom:14 from/to central (
> http://repo.maven.apache.org/maven2): Error transferring file: Connection
> timed out from
> http://repo.maven.apache.org/maven2/org/apache/apache/14/apache-14.pom
> and 'parent.relativePath' points at wrong local POM @ line 22, column 11
>
> at
> org.apache.maven.model.building.DefaultModelProblemCollector.newModelBuildingException(DefaultModelProblemCollector.java:195)
> at
> org.apache.maven.model.building.DefaultModelBuilder.readParentExternally(DefaultModelBuilder.java:841)
> at
> org.apache.maven.model.building.DefaultModelBuilder.readParent(DefaultModelBuilder.java:664)
> at
> org.apache.maven.model.building.DefaultModelBuilder.build(DefaultModelBuilder.java:310)
> at
> org.apache.maven.model.building.DefaultModelBuilder.build(DefaultModelBuilder.java:232)
> at
> com.typesafe.sbt.pom.MvnPomResolver.loadEffectivePom(MavenPomResolver.scala:61)
> at com.typesafe.sbt.pom.package$.loadEffectivePom(package.scala:41)
> at
> com.typesafe.sbt.pom.MavenProjectHelper$.makeProjectTree(MavenProjectHelper.scala:128)
> at
> com.typesafe.sbt.pom.MavenProjectHelper$.makeReactorProject(MavenProjectHelper.scala:49)
> at
> com.typesafe.sbt.pom.PomBuild$class.projectDefinitions(PomBuild.scala:28)
> at SparkBuild$.projectDefinitions(SparkBuild.scala:347)
> at sbt.Load$.sbt$Load$$projectsFromBuild(Load.scala:506)
> at sbt.Load$$anonfun$27.apply(Load.scala:446)
> at sbt.Load$$anonfun$27.apply(Load.scala:446)
> at scala.collection.immutable.Stream.flatMap(Stream.scala:442)
> at sbt.Load$.loadUnit(Load.scala:446)
> at sbt.Load$$anonfun$18$$anonfun$apply$11.apply(Load.scala:291)
> at sbt.Load$$anonfun$18$$anonfun$apply$11.apply(Load.scala:291)
> at
> sbt.BuildLoader$$anonfun$componentLoader$1$$anonfun$apply$4$$anonfun$apply$5$$anonfun$apply$6.apply(BuildLoader.scala:91)
> at
> sbt.BuildLoader$$anonfun$componentLoader$1$$anonfun$apply$4$$anonfun$apply$5$$anonfun$apply$6.apply(BuildLoader.scala:90)
> at sbt.BuildLoader.apply(BuildLoader.scala:140)
> at sbt.Load$.loadAll(Load.scala:344)
> at sbt.Load$.loadURI(Load.scala:299)
> at sbt.Load$.load(Load.scala:295)
> at sbt.Load$.load(Load.scala:286)
> at sbt.Load$.apply(Load.scala:140)
> at sbt.Load$.defaultLoad(Load.scala:36)
> at sbt.BuiltinCommands$.liftedTree1$1(Main.scala:492)
> at sbt.BuiltinCommands$.doLoadProject(Main.scala:492)
> at sbt.BuiltinCommands$$anonfun$loadProjectImpl$2.apply(Main.scala:484)
> at sbt.BuiltinCommands$$anonfun$loadProjectImpl$2.apply(Main.scala:484)
> at
> sbt.Command$$anonfun$applyEffect$1$$anonfun$apply$2.apply(Command.scala:59)
> at
> sbt.Command$$anonfun$applyEffect$1$$anonfun$apply$2.apply(Command.scala:59)
> at
> sbt.Command$$anonfun$applyEffect$2$$anonfun$apply$3.apply(Command.scala:61)
> at
> sbt.Command$$anonfun$applyEffect$2$$anonfun$apply$3.apply(Command.scala:61)
> at sbt.Command$.process(Command.scala:93)
> at sbt.MainLoop$$anonfun$1$$anonfun$apply$1.apply(MainLoop.scala:96)
> at sbt.MainLoop$$anonfun$1$$anonfun$apply$1

Re: [spark] build/sbt gen-idea error

2016-04-12 Thread Ted Yu
See
https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools#UsefulDeveloperTools-IDESetup

On Tue, Apr 12, 2016 at 8:52 AM, ImMr.K <875061...@qq.com> wrote:

> But how to import spark repo into idea or eclipse?
>
>
>
> -- Original message --
> *From:* Ted Yu <yuzhih...@gmail.com>
> *Sent:* 2016-04-12 23:38
> *To:* ImMr.K <875061...@qq.com>
> *Cc:* user <user@spark.apache.org>
> *Subject:* Re: build/sbt gen-idea error
>
> gen-idea doesn't seem to be a valid command:
>
> [warn] Ignoring load failure: no project loaded.
> [error] Not a valid command: gen-idea
> [error] gen-idea
>
> On Tue, Apr 12, 2016 at 8:28 AM, ImMr.K <875061...@qq.com> wrote:
>
>> Hi,
>> I have cloned spark and ,
>> cd spark
>> build/sbt gen-idea
>>
>> got the following output:
>>
>>
>> Using /usr/java/jre1.7.0_09 as default JAVA_HOME.
>> Note, this will be overridden by -java-home if it is set.
>> [info] Loading project definition from
>> /home/king/github/spark/project/project
>> [info] Loading project definition from
>> /home/king/.sbt/0.13/staging/ad8e8574a5bcb2d22d23/sbt-pom-reader/project
>> [warn] Multiple resolvers having different access mechanism configured
>> with same name 'sbt-plugin-releases'. To avoid conflict, Remove duplicate
>> project resolvers (`resolvers`) or rename publishing resolver (`publishTo`).
>> [info] Loading project definition from /home/king/github/spark/project
>> org.apache.maven.model.building.ModelBuildingException: 1 problem was
>> encountered while building the effective model for
>> org.apache.spark:spark-parent_2.11:2.0.0-SNAPSHOT
>> [FATAL] Non-resolvable parent POM: Could not transfer artifact
>> org.apache:apache:pom:14 from/to central (
>> http://repo.maven.apache.org/maven2): Error transferring file:
>> Connection timed out from
>> http://repo.maven.apache.org/maven2/org/apache/apache/14/apache-14.pom
>> and 'parent.relativePath' points at wrong local POM @ line 22, column 11
>>
>> at
>> org.apache.maven.model.building.DefaultModelProblemCollector.newModelBuildingException(DefaultModelProblemCollector.java:195)
>> at
>> org.apache.maven.model.building.DefaultModelBuilder.readParentExternally(DefaultModelBuilder.java:841)
>> at
>> org.apache.maven.model.building.DefaultModelBuilder.readParent(DefaultModelBuilder.java:664)
>> at
>> org.apache.maven.model.building.DefaultModelBuilder.build(DefaultModelBuilder.java:310)
>> at
>> org.apache.maven.model.building.DefaultModelBuilder.build(DefaultModelBuilder.java:232)
>> at
>> com.typesafe.sbt.pom.MvnPomResolver.loadEffectivePom(MavenPomResolver.scala:61)
>> at com.typesafe.sbt.pom.package$.loadEffectivePom(package.scala:41)
>> at
>> com.typesafe.sbt.pom.MavenProjectHelper$.makeProjectTree(MavenProjectHelper.scala:128)
>> at
>> com.typesafe.sbt.pom.MavenProjectHelper$.makeReactorProject(MavenProjectHelper.scala:49)
>> at
>> com.typesafe.sbt.pom.PomBuild$class.projectDefinitions(PomBuild.scala:28)
>> at SparkBuild$.projectDefinitions(SparkBuild.scala:347)
>> at sbt.Load$.sbt$Load$$projectsFromBuild(Load.scala:506)
>> at sbt.Load$$anonfun$27.apply(Load.scala:446)
>> at sbt.Load$$anonfun$27.apply(Load.scala:446)
>> at scala.collection.immutable.Stream.flatMap(Stream.scala:442)
>> at sbt.Load$.loadUnit(Load.scala:446)
>> at sbt.Load$$anonfun$18$$anonfun$apply$11.apply(Load.scala:291)
>> at sbt.Load$$anonfun$18$$anonfun$apply$11.apply(Load.scala:291)
>> at
>> sbt.BuildLoader$$anonfun$componentLoader$1$$anonfun$apply$4$$anonfun$apply$5$$anonfun$apply$6.apply(BuildLoader.scala:91)
>> at
>> sbt.BuildLoader$$anonfun$componentLoader$1$$anonfun$apply$4$$anonfun$apply$5$$anonfun$apply$6.apply(BuildLoader.scala:90)
>> at sbt.BuildLoader.apply(BuildLoader.scala:140)
>> at sbt.Load$.loadAll(Load.scala:344)
>> at sbt.Load$.loadURI(Load.scala:299)
>> at sbt.Load$.load(Load.scala:295)
>> at sbt.Load$.load(Load.scala:286)
>> at sbt.Load$.apply(Load.scala:140)
>> at sbt.Load$.defaultLoad(Load.scala:36)
>> at sbt.BuiltinCommands$.liftedTree1$1(Main.scala:492)
>> at sbt.BuiltinCommands$.doLoadProject(Main.scala:492)
>> at sbt.BuiltinCommands$$anonfun$loadProjectImpl$2.apply(Main.scala:484)
>> at sbt.BuiltinCommands$$anonfun$loadProjectImpl$2.apply(Main.scala:484)
>> at
>> sbt.Command$$anonfun$applyEffect$1$$anonfun$apply$2.apply(Command.scala:59)
>> at
>> sbt.Command$$anonfun$applyEffect$1$$anonfun$apply$2.apply(Command.scala:59)
>> at
>> sbt.Command$$anonfun$applyEffect$2$$anonfun$apply$3.apply(Command.sca

Spark build error

2015-11-17 Thread 金国栋
Hi!

I tried to build the Spark source code from GitHub, and I successfully built it
from the command line using `sbt/sbt assembly`. However, I encountered an error
when compiling the project in IntelliJ IDEA (v14.1.5).


The error log is below:
*Error:scala: *
* while compiling:
/Users/ray/Documents/P01_Project/Spark-Github/spark/sql/core/src/main/scala/org/apache/spark/sql/util/QueryExecutionListener.scala*
*during phase: jvm*
 library version: version 2.10.5
compiler version: version 2.10.5
  reconstructed args: -nobootcp -javabootclasspath : -deprecation -feature
-classpath

Re: Spark build error

2015-11-17 Thread Ted Yu
Is the Scala version in IntelliJ the same as the one used by sbt?

Cheers

On Tue, Nov 17, 2015 at 6:45 PM, 金国栋  wrote:

> Hi!
>
> I tried to build spark source code from github, and I successfully built
> it from command line using `*sbt/sbt assembly*`. While I encountered an
> error when compiling the project in Intellij IDEA(V14.1.5).
>
>
> The error log is below:
> *Error:scala: *
> * while compiling:
> /Users/ray/Documents/P01_Project/Spark-Github/spark/sql/core/src/main/scala/org/apache/spark/sql/util/QueryExecutionListener.scala*
> *during phase: jvm*
>  library version: version 2.10.5
> compiler version: version 2.10.5
>   reconstructed args: -nobootcp -javabootclasspath : -deprecation -feature
> -classpath
> 

Re: Spark build error

2015-11-17 Thread Jeff Zhang
This also bothered me for a long time. I suspect the IntelliJ builder
conflicts with the sbt/maven builder.

I resolved this issue by rebuilding Spark in IntelliJ.  You may hit a
compilation issue when building it in IntelliJ; for that you need to put
external/flume-sink/target/java on the source build path.



On Wed, Nov 18, 2015 at 12:02 PM, Ted Yu  wrote:

> Is the Scala version in Intellij the same as the one used by sbt ?
>
> Cheers
>
> On Tue, Nov 17, 2015 at 6:45 PM, 金国栋  wrote:
>
>> Hi!
>>
>> I tried to build spark source code from github, and I successfully built
>> it from command line using `*sbt/sbt assembly*`. While I encountered an
>> error when compiling the project in Intellij IDEA(V14.1.5).
>>
>>
>> The error log is below:
>> *Error:scala: *
>> * while compiling:
>> /Users/ray/Documents/P01_Project/Spark-Github/spark/sql/core/src/main/scala/org/apache/spark/sql/util/QueryExecutionListener.scala*
>> *during phase: jvm*
>>  library version: version 2.10.5
>> compiler version: version 2.10.5
>>   reconstructed args: -nobootcp -javabootclasspath : -deprecation
>> -feature -classpath
>> 

Re: Spark build/sbt assembly

2015-07-30 Thread Rahul Palamuttam
Hi Akhil,

Yes, I did try to remove it, and I tried to build again.
However, that jar keeps getting recreated whenever I run ./build/sbt
assembly.
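
A quick way to check whether the re-fetched jar is actually valid (the corrupt-download
scenario is an assumption, not something confirmed in this thread; it can happen when a
proxy returns an HTML error page that gets saved as the jar):

# Inspect what was actually downloaded; a corrupt "jar" is often an HTML error page
file build/sbt-launch-0.13.7.jar
jar tf build/sbt-launch-0.13.7.jar | head    # fails immediately if the file is not a valid jar
# Remove it and let build/sbt fetch it again from a node with working connectivity
rm build/sbt-launch-0.13.7.jar
./build/sbt assembly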

Thanks,

Rahul P

On Thu, Jul 30, 2015 at 12:38 AM, Akhil Das ak...@sigmoidanalytics.com
wrote:

 Did you try removing this jar? build/sbt-launch-0.13.7.jar

 Thanks
 Best Regards

 On Tue, Jul 28, 2015 at 12:08 AM, Rahul Palamuttam rahulpala...@gmail.com
  wrote:

 Hi All,

 I hope this is the right place to post troubleshooting questions.
 I've been following the install instructions and I get the following error
 when running the following from Spark home directory

 $./build/sbt
 Using /usr/java/jdk1.8.0_20/ as default JAVA_HOME.
 Note, this will be overridden by -java-home if it is set.
 Attempting to fetch sbt
 Launching sbt from build/sbt-launch-0.13.7.jar
 Error: Invalid or corrupt jarfile build/sbt-launch-0.13.7.jar

 However when I run sbt assembly it compiles, with a couple of warnings,
 but
 it works none-the less.
 Is the build/sbt script deprecated? I do notice on one node it works but
 on
 the other it gives me the above error.

 Thanks,

 Rahul P



 --
 View this message in context:
 http://apache-spark-user-list.1001560.n3.nabble.com/Spark-build-sbt-assembly-tp24012.html
 Sent from the Apache Spark User List mailing list archive at Nabble.com.

 -
 To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
 For additional commands, e-mail: user-h...@spark.apache.org





Re: Spark build/sbt assembly

2015-07-30 Thread Akhil Das
Did you try removing this jar? build/sbt-launch-0.13.7.jar

Thanks
Best Regards

On Tue, Jul 28, 2015 at 12:08 AM, Rahul Palamuttam rahulpala...@gmail.com
wrote:

 Hi All,

 I hope this is the right place to post troubleshooting questions.
 I've been following the install instructions and I get the following error
 when running the following from Spark home directory

 $./build/sbt
 Using /usr/java/jdk1.8.0_20/ as default JAVA_HOME.
 Note, this will be overridden by -java-home if it is set.
 Attempting to fetch sbt
 Launching sbt from build/sbt-launch-0.13.7.jar
 Error: Invalid or corrupt jarfile build/sbt-launch-0.13.7.jar

 However when I run sbt assembly it compiles, with a couple of warnings, but
 it works none-the less.
 Is the build/sbt script deprecated? I do notice on one node it works but on
 the other it gives me the above error.

 Thanks,

 Rahul P



 --
 View this message in context:
 http://apache-spark-user-list.1001560.n3.nabble.com/Spark-build-sbt-assembly-tp24012.html
 Sent from the Apache Spark User List mailing list archive at Nabble.com.

 -
 To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
 For additional commands, e-mail: user-h...@spark.apache.org




Re: Spark build/sbt assembly

2015-07-27 Thread Ted Yu
bq. on one node it works but on the other it gives me the above error.

Can you tell us the difference between the environments on the two nodes ?
Does the other node use Java 8 ?

Cheers

On Mon, Jul 27, 2015 at 11:38 AM, Rahul Palamuttam rahulpala...@gmail.com
wrote:

 Hi All,

 I hope this is the right place to post troubleshooting questions.
 I've been following the install instructions and I get the following error
 when running the following from Spark home directory

 $./build/sbt
 Using /usr/java/jdk1.8.0_20/ as default JAVA_HOME.
 Note, this will be overridden by -java-home if it is set.
 Attempting to fetch sbt
 Launching sbt from build/sbt-launch-0.13.7.jar
 Error: Invalid or corrupt jarfile build/sbt-launch-0.13.7.jar

 However when I run sbt assembly it compiles, with a couple of warnings, but
 it works none-the less.
 Is the build/sbt script deprecated? I do notice on one node it works but on
 the other it gives me the above error.

 Thanks,

 Rahul P



 --
 View this message in context:
 http://apache-spark-user-list.1001560.n3.nabble.com/Spark-build-sbt-assembly-tp24012.html
 Sent from the Apache Spark User List mailing list archive at Nabble.com.

 -
 To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
 For additional commands, e-mail: user-h...@spark.apache.org




Re: Spark build/sbt assembly

2015-07-27 Thread Rahul Palamuttam
So just to clarify, I have 4 nodes, all of which use Java 8.
Only one of them is able to successfully execute the build/sbt assembly
command.
However on the 3 others I get the error.

If I run sbt assembly in Spark Home, it works and I'm able to launch the
master and worker processes.

On Mon, Jul 27, 2015 at 11:48 AM, Rahul Palamuttam rahulpala...@gmail.com
wrote:

 All nodes are using java 8.
 I've tried to mimic the environments as much as possible among all nodes.


 On Mon, Jul 27, 2015 at 11:44 AM, Ted Yu yuzhih...@gmail.com wrote:

 bq. on one node it works but on the other it gives me the above error.

 Can you tell us the difference between the environments on the two nodes ?
 Does the other node use Java 8 ?

 Cheers

 On Mon, Jul 27, 2015 at 11:38 AM, Rahul Palamuttam 
 rahulpala...@gmail.com wrote:

 Hi All,

 I hope this is the right place to post troubleshooting questions.
 I've been following the install instructions and I get the following
 error
 when running the following from Spark home directory

 $./build/sbt
 Using /usr/java/jdk1.8.0_20/ as default JAVA_HOME.
 Note, this will be overridden by -java-home if it is set.
 Attempting to fetch sbt
 Launching sbt from build/sbt-launch-0.13.7.jar
 Error: Invalid or corrupt jarfile build/sbt-launch-0.13.7.jar

 However when I run sbt assembly it compiles, with a couple of warnings,
 but
 it works none-the less.
 Is the build/sbt script deprecated? I do notice on one node it works but
 on
 the other it gives me the above error.

 Thanks,

 Rahul P



 --
 View this message in context:
 http://apache-spark-user-list.1001560.n3.nabble.com/Spark-build-sbt-assembly-tp24012.html
 Sent from the Apache Spark User List mailing list archive at Nabble.com.

 -
 To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
 For additional commands, e-mail: user-h...@spark.apache.org






Re: Spark build/sbt assembly

2015-07-27 Thread Rahul Palamuttam
All nodes are using java 8.
I've tried to mimic the environments as much as possible among all nodes.


On Mon, Jul 27, 2015 at 11:44 AM, Ted Yu yuzhih...@gmail.com wrote:

 bq. on one node it works but on the other it gives me the above error.

 Can you tell us the difference between the environments on the two nodes ?
 Does the other node use Java 8 ?

 Cheers

 On Mon, Jul 27, 2015 at 11:38 AM, Rahul Palamuttam rahulpala...@gmail.com
  wrote:

 Hi All,

 I hope this is the right place to post troubleshooting questions.
 I've been following the install instructions and I get the following error
 when running the following from Spark home directory

 $./build/sbt
 Using /usr/java/jdk1.8.0_20/ as default JAVA_HOME.
 Note, this will be overridden by -java-home if it is set.
 Attempting to fetch sbt
 Launching sbt from build/sbt-launch-0.13.7.jar
 Error: Invalid or corrupt jarfile build/sbt-launch-0.13.7.jar

 However when I run sbt assembly it compiles, with a couple of warnings,
 but
 it works none-the less.
 Is the build/sbt script deprecated? I do notice on one node it works but
 on
 the other it gives me the above error.

 Thanks,

 Rahul P



 --
 View this message in context:
 http://apache-spark-user-list.1001560.n3.nabble.com/Spark-build-sbt-assembly-tp24012.html
 Sent from the Apache Spark User List mailing list archive at Nabble.com.

 -
 To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
 For additional commands, e-mail: user-h...@spark.apache.org





Spark build/sbt assembly

2015-07-27 Thread Rahul Palamuttam
Hi All,

I hope this is the right place to post troubleshooting questions.
I've been following the install instructions and I get the following error
when running this from the Spark home directory:

$./build/sbt
Using /usr/java/jdk1.8.0_20/ as default JAVA_HOME.
Note, this will be overridden by -java-home if it is set.
Attempting to fetch sbt
Launching sbt from build/sbt-launch-0.13.7.jar
Error: Invalid or corrupt jarfile build/sbt-launch-0.13.7.jar

However, when I run sbt assembly it compiles with a couple of warnings, but
it works nonetheless.
Is the build/sbt script deprecated? I do notice that on one node it works but on
the other it gives me the above error.

Thanks,

Rahul P



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-build-sbt-assembly-tp24012.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Spark build with Hive

2015-05-20 Thread guoqing0...@yahoo.com.hk
Hi, can Spark 1.3.1 be built with Hive 1.2? It seems Spark 1.3.1
can only be built with Hive 0.13 or 0.12, according to the documentation.

# Apache Hadoop 2.4.X with Hive 13 support
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -Phive-thriftserver 
-DskipTests clean package
# Apache Hadoop 2.4.X with Hive 12 support
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -Phive-0.12.0 
-Phive-thriftserver -DskipTests clean package



guoqing0...@yahoo.com.hk


Re: RE: Spark build with Hive

2015-05-20 Thread guoqing0...@yahoo.com.hk
Thanks very much. Which versions will be supported in the upcoming 1.4?  I hope it
will support more versions.



guoqing0...@yahoo.com.hk
 
From: Cheng, Hao
Date: 2015-05-21 11:20
To: Ted Yu; guoqing0...@yahoo.com.hk
CC: user
Subject: RE: Spark build with Hive
Yes, only 0.12.0 and 0.13.1 are supported currently. Hopefully we can support higher
versions in the next 1 or 2 releases.
 
From: Ted Yu [mailto:yuzhih...@gmail.com] 
Sent: Thursday, May 21, 2015 11:12 AM
To: guoqing0...@yahoo.com.hk
Cc: user
Subject: Re: Spark build with Hive
 
I am afraid even Hive 1.0 is not supported, let alone Hive 1.2
 
Cheers
 
On Wed, May 20, 2015 at 8:08 PM, guoqing0...@yahoo.com.hk 
guoqing0...@yahoo.com.hk wrote:
Hi , is the Spark-1.3.1 can build with the Hive-1.2 ? it seem to Spark-1.3.1 
can only build with 0.13 , 0.12 according to the document .
 
# Apache Hadoop 2.4.X with Hive 13 support
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -Phive-thriftserver -DskipTests clean package
# Apache Hadoop 2.4.X with Hive 12 support
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -Phive-0.12.0 -Phive-thriftserver -DskipTests clean package
 


guoqing0...@yahoo.com.hk
 


RE: RE: Spark build with Hive

2015-05-20 Thread Wang, Daoyuan
In 1.4 I think we still only support 0.12.0 and 0.13.1.

From: guoqing0...@yahoo.com.hk [mailto:guoqing0...@yahoo.com.hk]
Sent: Thursday, May 21, 2015 12:03 PM
To: Cheng, Hao; Ted Yu
Cc: user
Subject: Re: RE: Spark build with Hive

Thanks very much , Which version will be support In the upcome 1.4 ?  I hope it 
will be support more versions.


guoqing0...@yahoo.com.hk

From: Cheng, Hao <hao.ch...@intel.com>
Date: 2015-05-21 11:20
To: Ted Yu <yuzhih...@gmail.com>; guoqing0...@yahoo.com.hk
CC: user <user@spark.apache.org>
Subject: RE: Spark build with Hive
Yes, ONLY support 0.12.0 and 0.13.1 currently. Hopefully we can support higher 
versions in next 1 or 2 releases.

From: Ted Yu [mailto:yuzhih...@gmail.com]
Sent: Thursday, May 21, 2015 11:12 AM
To: guoqing0...@yahoo.com.hk
Cc: user
Subject: Re: Spark build with Hive

I am afraid even Hive 1.0 is not supported, let alone Hive 1.2

Cheers

On Wed, May 20, 2015 at 8:08 PM, guoqing0...@yahoo.com.hk wrote:
Hi , is the Spark-1.3.1 can build with the Hive-1.2 ? it seem to Spark-1.3.1 
can only build with 0.13 , 0.12 according to the document .


# Apache Hadoop 2.4.X with Hive 13 support

mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -Phive-thriftserver 
-DskipTests clean package

# Apache Hadoop 2.4.X with Hive 12 support

mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -Phive-0.12.0 
-Phive-thriftserver -DskipTests clean package


guoqing0...@yahoo.com.hk



Re: Spark build with Hive

2015-05-20 Thread Ted Yu
I am afraid even Hive 1.0 is not supported, let alone Hive 1.2

Cheers

On Wed, May 20, 2015 at 8:08 PM, guoqing0...@yahoo.com.hk wrote:

 Hi , is the Spark-1.3.1 can build with the Hive-1.2 ? it seem to
 Spark-1.3.1 can only build with 0.13 , 0.12 according to the document .

 # Apache Hadoop 2.4.X with Hive 13 support
 mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -Phive-thriftserver 
 -DskipTests clean package
 # Apache Hadoop 2.4.X with Hive 12 support
 mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -Phive-0.12.0 
 -Phive-thriftserver -DskipTests clean package


 --
 guoqing0...@yahoo.com.hk



Re: Spark Build with Hadoop 2.6, yarn - encounter java.lang.NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer

2015-03-06 Thread Todd Nist
First, thanks to everyone for their assistance and recommendations.

@Marcelo

I applied the patch that you recommended and am now able to get into the
shell; thank you, it worked great once I realized that the pom was pointing to
the 1.3.0-SNAPSHOT parent and needed to be bumped down to 1.2.1.

@Zhan

I need to apply this patch next.  I tried to start the spark-thriftserver; it
starts, then fails like this (I have the entries in my spark-defaults.conf, but
not the patch applied):

./sbin/start-thriftserver.sh --master yarn --executor-memory 1024m
--hiveconf hive.server2.thrift.port=10001

15/03/06 12:34:17 INFO ui.SparkUI: Started SparkUI at http://hadoopdev01.opsdatastore.com:4040
15/03/06 12:34:18 INFO impl.TimelineClientImpl: Timeline service address: http://hadoopdev02.opsdatastore.com:8188/ws/v1/timeline/
15/03/06 12:34:18 INFO client.RMProxy: Connecting to ResourceManager at hadoopdev02.opsdatastore.com/192.168.15.154:8050
15/03/06 12:34:18 INFO yarn.Client: Requesting a new application from cluster with 4 NodeManagers
15/03/06 12:34:18 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (8192 MB per container)
15/03/06 12:34:18 INFO yarn.Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
15/03/06 12:34:18 INFO yarn.Client: Setting up container launch context for our AM
15/03/06 12:34:18 INFO yarn.Client: Preparing resources for our AM container
15/03/06 12:34:19 WARN shortcircuit.DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
15/03/06 12:34:19 INFO yarn.Client: Uploading resource file:/root/spark-1.2.1-bin-hadoop2.6/lib/spark-assembly-1.2.1-hadoop2.6.0.jar -> hdfs://hadoopdev01.opsdatastore.com:8020/user/root/.sparkStaging/application_1425078697953_0018/spark-assembly-1.2.1-hadoop2.6.0.jar
15/03/06 12:34:21 INFO yarn.Client: Setting up the launch environment for our AM container
15/03/06 12:34:21 INFO spark.SecurityManager: Changing view acls to: root
15/03/06 12:34:21 INFO spark.SecurityManager: Changing modify acls to: root
15/03/06 12:34:21 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
15/03/06 12:34:21 INFO yarn.Client: Submitting application 18 to ResourceManager
15/03/06 12:34:21 INFO impl.YarnClientImpl: Submitted application application_1425078697953_0018
15/03/06 12:34:22 INFO yarn.Client: Application report for application_1425078697953_0018 (state: ACCEPTED)
15/03/06 12:34:22 INFO yarn.Client:
	 client token: N/A
	 diagnostics: N/A
	 ApplicationMaster host: N/A
	 ApplicationMaster RPC port: -1
	 queue: default
	 start time: 1425663261755
	 final status: UNDEFINED
	 tracking URL: http://hadoopdev02.opsdatastore.com:8088/proxy/application_1425078697953_0018/
	 user: root
15/03/06 12:34:23 INFO yarn.Client: Application report for application_1425078697953_0018 (state: ACCEPTED)
15/03/06 12:34:24 INFO yarn.Client: Application report for application_1425078697953_0018 (state: ACCEPTED)
15/03/06 12:34:25 INFO yarn.Client: Application report for application_1425078697953_0018 (state: ACCEPTED)
15/03/06 12:34:26 INFO yarn.Client: Application report for application_1425078697953_0018 (state: ACCEPTED)
15/03/06 12:34:27 INFO cluster.YarnClientSchedulerBackend: ApplicationMaster registered as Actor[akka.tcp://sparkyar...@hadoopdev08.opsdatastore.com:40201/user/YarnAM#-557112763]
15/03/06 12:34:27 INFO cluster.YarnClientSchedulerBackend: Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> hadoopdev02.opsdatastore.com, PROXY_URI_BASES -> http://hadoopdev02.opsdatastore.com:8088/proxy/application_1425078697953_0018), /proxy/application_1425078697953_0018
15/03/06 12:34:27 INFO ui.JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
15/03/06 12:34:27 INFO yarn.Client: Application report for application_1425078697953_0018 (state: RUNNING)
15/03/06 12:34:27 INFO yarn.Client:
	 client token: N/A
	 diagnostics: N/A
	 ApplicationMaster host: hadoopdev08.opsdatastore.com
	 ApplicationMaster RPC port: 0
	 queue: default
	 start time: 1425663261755
	 final status: UNDEFINED
	 tracking URL: http://hadoopdev02.opsdatastore.com:8088/proxy/application_1425078697953_0018/
	 user: root
15/03/06 12:34:27 INFO cluster.YarnClientSchedulerBackend: Application application_1425078697953_0018 has started running.
15/03/06 12:34:28 INFO netty.NettyBlockTransferService: Server created on 46124
15/03/06 12:34:28 INFO storage.BlockManagerMaster: Trying to register BlockManager
15/03/06 12:34:28 INFO storage.BlockManagerMasterActor: Registering block manager hadoopdev01.opsdatastore.com:46124 with 265.4 MB RAM, BlockManagerId(driver, hadoopdev01.opsdatastore.com, 46124)
15/03/06 12:34:28 INFO storage.BlockManagerMaster: Registered

Re: Spark Build with Hadoop 2.6, yarn - encounter java.lang.NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer

2015-03-06 Thread Zhan Zhang
Hi Todd,

Looks like the thrift server can connect to the metastore, but something is wrong in 
the executors. You can try to get the log with "yarn logs -applicationId xxx" 
to check why it failed. If there is no log (the master or executor is not started 
at all), you can go to the RM webpage and click the link to see why the shell 
failed in the first place.
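
For reference, a minimal sketch of pulling the aggregated logs (assuming YARN log aggregation is enabled; the application ID below is the one from this thread and should be replaced with your own):

  # Fetch the aggregated AM/executor container logs for the failed application
  yarn logs -applicationId application_1425078697953_0018 > thriftserver-app.log
  # Then look for the first exception in the executor sections
  grep -n -A5 "Exception" thriftserver-app.log | head -50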

Thanks.

Zhan Zhang

On Mar 6, 2015, at 9:59 AM, Todd Nist tsind...@gmail.com wrote:

First, thanks to everyone for their assistance and recommendations.

@Marcelo

I applied the patch that you recommended and am now able to get into the shell, 
thank you worked great after I realized that the pom was pointing to the 
1.3.0-SNAPSHOT for parent, need to bump that down to 1.2.1.

@Zhan

Need to apply this patch next. I tried to start the spark-thriftserver; it starts, 
then fails like this (I have the entries in my spark-default.conf, but not the 
patch applied):


./sbin/start-thriftserver.sh --master yarn --executor-memory 1024m --hiveconf 
hive.server2.thrift.port=10001

15/03/06 12:34:17 INFO ui.SparkUI: Started SparkUI at http://hadoopdev01.opsdatastore.com:4040
15/03/06 12:34:18 INFO impl.TimelineClientImpl: Timeline service address: http://hadoopdev02.opsdatastore.com:8188/ws/v1/timeline/
15/03/06 12:34:18 INFO client.RMProxy: Connecting to ResourceManager at 
hadoopdev02.opsdatastore.com/192.168.15.154:8050
15/03/06 12:34:18 INFO yarn.Client: Requesting a new application from cluster 
with 4 NodeManagers
15/03/06 12:34:18 INFO yarn.Client: Verifying our application has not requested 
more than the maximum memory capability of the cluster (8192 MB per container)
15/03/06 12:34:18 INFO yarn.Client: Will allocate AM container, with 896 MB 
memory including 384 MB overhead
15/03/06 12:34:18 INFO yarn.Client: Setting up container launch context for our 
AM
15/03/06 12:34:18 INFO yarn.Client: Preparing resources for our AM container
15/03/06 12:34:19 WARN shortcircuit.DomainSocketFactory: The short-circuit 
local reads feature cannot be used because libhadoop cannot be loaded.
15/03/06 12:34:19 INFO yarn.Client: Uploading resource 
file:/root/spark-1.2.1-bin-hadoop2.6/lib/spark-assembly-1.2.1-hadoop2.6.0.jar 
- 
hdfs://hadoopdev01.opsdatastore.com:8020/user/root/.sparkStaging/application_1425078697953_0018/spark-assembly-1.2.1-hadoop2.6.0.jar
15/03/06 12:34:21 INFO yarn.Client: Setting up the launch environment for our 
AM container
15/03/06 12:34:21 INFO spark.SecurityManager: Changing view acls to: root
15/03/06 12:34:21 INFO spark.SecurityManager: Changing modify acls to: root
15/03/06 12:34:21 INFO spark.SecurityManager: SecurityManager: authentication 
disabled; ui acls disabled; users with view permissions: Set(root); users with 
modify permissions: Set(root)
15/03/06 12:34:21 INFO yarn.Client: Submitting application 18 to ResourceManager
15/03/06 12:34:21 INFO impl.YarnClientImpl: Submitted application 
application_1425078697953_0018
15/03/06 12:34:22 INFO yarn.Client: Application report for 
application_1425078697953_0018 (state: ACCEPTED)
15/03/06 12:34:22 INFO yarn.Client:
 client token: N/A
 diagnostics: N/A
 ApplicationMaster host: N/A
 ApplicationMaster RPC port: -1
 queue: default
 start time: 1425663261755
 final status: UNDEFINED
 tracking URL: http://hadoopdev02.opsdatastore.com:8088/proxy/application_1425078697953_0018/
 user: root
15/03/06 12:34:23 INFO yarn.Client: Application report for 
application_1425078697953_0018 (state: ACCEPTED)
15/03/06 12:34:24 INFO yarn.Client: Application report for 
application_1425078697953_0018 (state: ACCEPTED)
15/03/06 12:34:25 INFO yarn.Client: Application report for 
application_1425078697953_0018 (state: ACCEPTED)
15/03/06 12:34:26 INFO yarn.Client: Application report for 
application_1425078697953_0018 (state: ACCEPTED)
15/03/06 12:34:27 INFO cluster.YarnClientSchedulerBackend: ApplicationMaster 
registered as 
Actor[akka.tcp://sparkyar...@hadoopdev08.opsdatastore.com:40201/user/YarnAM#-557112763]
15/03/06 12:34:27 INFO cluster.YarnClientSchedulerBackend: Add WebUI Filter. 
org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS - 
hadoopdev02.opsdatastore.com, PROXY_URI_BASES - 
http://hadoopdev02.opsdatastore.com:8088/proxy/application_1425078697953_0018),
 /proxy/application_1425078697953_0018
15/03/06 12:34:27 INFO ui.JettyUtils: Adding filter: 
org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
15/03/06 12:34:27 INFO yarn.Client: Application report for 
application_1425078697953_0018 (state: RUNNING)
15/03/06 12:34:27 INFO yarn.Client:
 client token: N/A
 diagnostics: N/A
 ApplicationMaster host: hadoopdev08.opsdatastore.com
 ApplicationMaster RPC port: 0
 queue: default
 start time: 1425663261755
 final status: UNDEFINED
 tracking URL: 

Re: Spark Build with Hadoop 2.6, yarn - encounter java.lang.NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer

2015-03-06 Thread Todd Nist
Hi Zhan,

I applied the patch you recommended,
https://github.com/apache/spark/pull/3409, and it now works. It was failing
with this:

Exception message:
/hadoop/yarn/local/usercache/root/appcache/application_1425078697953_0020/container_1425078697953_0020_01_02/launch_container.sh:
line 14:
$PWD:$PWD/__spark__.jar:$HADOOP_CONF_DIR:/usr/hdp/current/hadoop-client/*:/usr/hdp/current/hadoop-client/lib/*:/usr/hdp/current/hadoop-hdfs-client/*:/usr/hdp/current/hadoop-hdfs-client/lib/*:/usr/hdp/current/hadoop-yarn-client/*:/usr/hdp/current/hadoop-yarn-client/lib/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:/usr/hdp/
${hdp.version}/hadoop/lib/hadoop-lzo-0.6.0.${hdp.version}.jar:/etc/hadoop/conf/secure:$PWD/__app__.jar:$PWD/*:
bad substitution

While the spark-default.conf has these defined:

spark.driver.extraJavaOptions -Dhdp.version=2.2.0.0-2041
spark.yarn.am.extraJavaOptions -Dhdp.version=2.2.0.0-2041


without the patch ${hdp.version} was not being substituted.
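
For completeness, the same two properties can also be passed on the command line; a minimal sketch of that form (assuming start-thriftserver.sh forwards spark-submit options, as the --master and --executor-memory flags earlier in this thread suggest; the hdp.version value is the one used above):

  ./sbin/start-thriftserver.sh --master yarn --executor-memory 1024m \
    --conf spark.driver.extraJavaOptions=-Dhdp.version=2.2.0.0-2041 \
    --conf spark.yarn.am.extraJavaOptions=-Dhdp.version=2.2.0.0-2041 \
    --hiveconf hive.server2.thrift.port=10001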

Thanks for pointing me to that patch, appreciate it.

-Todd

On Fri, Mar 6, 2015 at 1:12 PM, Zhan Zhang zzh...@hortonworks.com wrote:

  Hi Todd,

  Looks like the thrift server can connect to metastore, but something
 wrong in the executors. You can try to get the log with yarn logs
 -applicationID xxx” to check why it failed. If there is no log (master or
 executor is not started at all), you can go to the RM webpage, click the
 link to see why the shell failed in the first place.

  Thanks.

  Zhan Zhang

  On Mar 6, 2015, at 9:59 AM, Todd Nist tsind...@gmail.com wrote:

  First, thanks to everyone for their assistance and recommendations.

  @Marcelo

  I applied the patch that you recommended and am now able to get into the
 shell, thank you worked great after I realized that the pom was pointing to
 the 1.3.0-SNAPSHOT for parent, need to bump that down to 1.2.1.

  @Zhan

  Need to apply this patch next.  I tried to start the spark-thriftserver
 but and it starts, then fails with like this:  I have the entries in my
 spark-default.conf, but not the patch applied.

   ./sbin/start-thriftserver.sh --master yarn --executor-memory 1024m 
 --hiveconf hive.server2.thrift.port=10001

  5/03/06 12:34:17 INFO ui.SparkUI: Started SparkUI at 
 http://hadoopdev01.opsdatastore.com:404015/03/06 12:34:18 INFO 
 impl.TimelineClientImpl: Timeline service address: 
 http://hadoopdev02.opsdatastore.com:8188/ws/v1/timeline/15/03/06 12:34:18 
 INFO client.RMProxy: Connecting to ResourceManager at 
 hadoopdev02.opsdatastore.com/192.168.15.154:805015/03/06 12:34:18 INFO 
 yarn.Client: Requesting a new application from cluster with 4 
 NodeManagers15/03/06 12:34:18 INFO yarn.Client: Verifying our application has 
 not requested more than the maximum memory capability of the cluster (8192 MB 
 per container)15/03/06 12:34:18 INFO yarn.Client: Will allocate AM container, 
 with 896 MB memory including 384 MB overhead15/03/06 12:34:18 INFO 
 yarn.Client: Setting up container launch context for our AM15/03/06 12:34:18 
 INFO yarn.Client: Preparing resources for our AM container15/03/06 12:34:19 
 WARN shortcircuit.DomainSocketFactory: The short-circuit local reads feature 
 cannot be used because libhadoop cannot be loaded.15/03/06 12:34:19 INFO 
 yarn.Client: Uploading resource 
 file:/root/spark-1.2.1-bin-hadoop2.6/lib/spark-assembly-1.2.1-hadoop2.6.0.jar 
 - 
 hdfs://hadoopdev01.opsdatastore.com:8020/user/root/.sparkStaging/application_1425078697953_0018/spark-assembly-1.2.1-hadoop2.6.0.jar15/03/06
  12:34:21 INFO yarn.Client: Setting up the launch environment for our AM 
 container15/03/06 12:34:21 INFO spark.SecurityManager: Changing view acls to: 
 root15/03/06 12:34:21 INFO spark.SecurityManager: Changing modify acls to: 
 root15/03/06 12:34:21 INFO spark.SecurityManager: SecurityManager: 
 authentication disabled; ui acls disabled; users with view permissions: 
 Set(root); users with modify permissions: Set(root)15/03/06 12:34:21 INFO 
 yarn.Client: Submitting application 18 to ResourceManager15/03/06 12:34:21 
 INFO impl.YarnClientImpl: Submitted application 
 application_1425078697953_001815/03/06 12:34:22 INFO yarn.Client: Application 
 report for application_1425078697953_0018 (state: ACCEPTED)15/03/06 12:34:22 
 INFO yarn.Client:
  client token: N/A
  diagnostics: N/A
  ApplicationMaster host: N/A
  ApplicationMaster RPC port: -1
  queue: default
  start time: 1425663261755
  final status: UNDEFINED
  tracking URL: 
 http://hadoopdev02.opsdatastore.com:8088/proxy/application_1425078697953_0018/
  user: root15/03/06 12:34:23 INFO yarn.Client: Application report for 
 

Re: Spark Build with Hadoop 2.6, yarn - encounter java.lang.NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer

2015-03-06 Thread Zhan Zhang
Sorry, my misunderstanding. Looks like it already worked. If you still hit some 
hdp.version problem, you can try it :)

Thanks.

Zhan Zhang

On Mar 6, 2015, at 11:40 AM, Zhan Zhang zzh...@hortonworks.com wrote:

You are using 1.2.1, right? If so, please add a java-opts file in the conf directory and 
give it a try.

[root@c6401 conf]# more java-opts
  -Dhdp.version=2.2.2.0-2041
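
A minimal sketch of creating that file (assuming SPARK_HOME points at the Spark 1.2.1 install; the hdp.version value shown is just the one from this thread):

  echo "-Dhdp.version=2.2.0.0-2041" > $SPARK_HOME/conf/java-opts
  cat $SPARK_HOME/conf/java-opts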

Thanks.

Zhan Zhang

On Mar 6, 2015, at 11:35 AM, Todd Nist tsind...@gmail.com wrote:

 -Dhdp.version=2.2.0.0-2041




Re: Spark Build with Hadoop 2.6, yarn - encounter java.lang.NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer

2015-03-06 Thread Zhan Zhang
You are using 1.2.1, right? If so, please add a java-opts file in the conf directory and 
give it a try.

[root@c6401 conf]# more java-opts
  -Dhdp.version=2.2.2.0-2041

Thanks.

Zhan Zhang

On Mar 6, 2015, at 11:35 AM, Todd Nist tsind...@gmail.com wrote:

 -Dhdp.version=2.2.0.0-2041



Re: Spark Build with Hadoop 2.6, yarn - encounter java.lang.NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer

2015-03-06 Thread Todd Nist
Working great now, after applying that patch; thanks again.

On Fri, Mar 6, 2015 at 2:42 PM, Zhan Zhang zzh...@hortonworks.com wrote:

  Sorry. Misunderstanding. Looks like it already worked. If you still met
 some hdp.version problem, you can try it :)

  Thanks.

  Zhan Zhang

  On Mar 6, 2015, at 11:40 AM, Zhan Zhang zzh...@hortonworks.com wrote:

  You are using 1.2.1 right? If so, please add java-opts  in conf
 directory and give it a try.

  [root@c6401 conf]# more java-opts
   -Dhdp.version=2.2.2.0-2041

  Thanks.

  Zhan Zhang

  On Mar 6, 2015, at 11:35 AM, Todd Nist tsind...@gmail.com wrote:

  -Dhdp.version=2.2.0.0-2041






Re: Spark Build with Hadoop 2.6, yarn - encounter java.lang.NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer

2015-03-05 Thread Marcelo Vanzin
It seems from the excerpt below that your cluster is set up to use the
Yarn ATS, and the code is failing in that path. I think you'll need to
apply the following patch to your Spark sources if you want this to
work:

https://github.com/apache/spark/pull/3938
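
A minimal sketch of applying that pull request to a local checkout (assuming you are inside an apache/spark source tree on the branch you build from):

  # Download the PR as a mailbox-style patch and apply it
  curl -sSL https://github.com/apache/spark/pull/3938.patch -o 3938.patch
  git am 3938.patch          # or: git apply 3938.patch, if you prefer not to create commits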

On Thu, Mar 5, 2015 at 10:04 AM, Todd Nist tsind...@gmail.com wrote:
 org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceInit(YarnClientImpl.java:166)
 at
 org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
 at
 org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:65)
 at
 org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:57)
 at
 org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:140)
 at org.apache.spark.SparkContext.init(SparkContext.scala:348)

-- 
Marcelo

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Spark Build with Hadoop 2.6, yarn - encounter java.lang.NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer

2015-03-05 Thread Zhan Zhang
In addition, if you use HDP 2.2, you may need the following patch (if it is not already in 
1.2.1) to solve a system property issue.

https://github.com/apache/spark/pull/3409

You can follow the link below to set hdp.version in the java options.

http://hortonworks.com/hadoop-tutorial/using-apache-spark-hdp/

Thanks.

Zhan Zhang

On Mar 5, 2015, at 11:09 AM, Marcelo Vanzin van...@cloudera.com wrote:

It seems from the excerpt below that your cluster is set up to use the
Yarn ATS, and the code is failing in that path. I think you'll need to
apply the following patch to your Spark sources if you want this to
work:

https://github.com/apache/spark/pull/3938

On Thu, Mar 5, 2015 at 10:04 AM, Todd Nist tsind...@gmail.com wrote:
org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceInit(YarnClientImpl.java:166)
   at
org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
   at
org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:65)
   at
org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:57)
   at
org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:140)
   at org.apache.spark.SparkContext.init(SparkContext.scala:348)

--
Marcelo

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org




Spark Build with Hadoop 2.6, yarn - encounter java.lang.NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer

2015-03-05 Thread Todd Nist
I am running Spark on a HortonWorks HDP cluster. I have deployed their
prebuilt version, but it is only for Spark 1.2.0, not 1.2.1, and there are a
few fixes and features in 1.2.1 that I would like to leverage.

I just downloaded the spark-1.2.1 source and built it to support Hadoop 2.6
by doing the following:

radtech:spark-1.2.1 tnist$ ./make-distribution.sh --name hadoop2.6
--tgz -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 -Phive
-Phive-thriftserver -DskipTests clean package

When I deploy this to my hadoop cluster and kick off a spark-shell,

$ spark-1.2.1-bin-hadoop2.6]# ./bin/spark-shell --master yarn-client
--driver-memory 512m --executor-memory 512m

Results in  java.lang.NoClassDefFoundError:
org/codehaus/jackson/map/deser/std/StdDeserializer

The full stack trace is below. I have validated that the
$SPARK_HOME/lib/spark-assembly-1.2.1-hadoop2.6.0.jar does in fact contain
the class in question:

jar -tvf spark-assembly-1.2.1-hadoop2.6.0.jar | grep
'org/codehaus/jackson/map/deser/std'
...
 18002 Thu Mar 05 11:23:04 EST 2015
parquet/org/codehaus/jackson/map/deser/std/StdDeserializer.class
  1584 Thu Mar 05 11:23:04 EST 2015
parquet/org/codehaus/jackson/map/deser/std/StdKeyDeserializer$BoolKD.class...

Any guidance on what I missed? If I start the spark-shell in standalone mode
($SPARK_HOME/bin/spark-shell) it comes up fine, so it looks to be related to
starting it under yarn from what I can tell.

TIA for the assistance.

-Todd
Stack Trace

15/03/05 12:12:38 INFO spark.SecurityManager: Changing view acls to: root
15/03/05 12:12:38 INFO spark.SecurityManager: Changing modify acls to: root
15/03/05 12:12:38 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
15/03/05 12:12:38 INFO spark.HttpServer: Starting HTTP Server
15/03/05 12:12:39 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/03/05 12:12:39 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:36176
15/03/05 12:12:39 INFO util.Utils: Successfully started service 'HTTP class server' on port 36176.
Welcome to
    __
 / __/__  ___ _/ /__
_\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.2.1
  /_/

Using Scala version 2.10.4 (OpenJDK 64-Bit Server VM, Java 1.7.0_75)
Type in expressions to have them evaluated.
Type :help for more information.
15/03/05 12:12:43 INFO spark.SecurityManager: Changing view acls to: root
15/03/05 12:12:43 INFO spark.SecurityManager: Changing modify acls to: root
15/03/05 12:12:43 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
15/03/05 12:12:44 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/03/05 12:12:44 INFO Remoting: Starting remoting
15/03/05 12:12:44 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkdri...@hadoopdev01.opsdatastore.com:50544]
15/03/05 12:12:44 INFO util.Utils: Successfully started service 'sparkDriver' on port 50544.
15/03/05 12:12:44 INFO spark.SparkEnv: Registering MapOutputTracker
15/03/05 12:12:44 INFO spark.SparkEnv: Registering BlockManagerMaster
15/03/05 12:12:44 INFO storage.DiskBlockManager: Created local directory at /tmp/spark-16402794-cc1e-42d0-9f9c-99f15eaa1861/spark-118bc6af-4008-45d7-a22f-491bcd1856c0
15/03/05 12:12:44 INFO storage.MemoryStore: MemoryStore started with capacity 265.4 MB
15/03/05 12:12:45 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/03/05 12:12:45 INFO spark.HttpFileServer: HTTP File server directory is /tmp/spark-5d7da34c-58d4-4d60-9b6a-3dce43cab39e/spark-4d65aacb-78bd-40fd-b6c0-53b47e288199
15/03/05 12:12:45 INFO spark.HttpServer: Starting HTTP Server
15/03/05 12:12:45 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/03/05 12:12:45 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:56452
15/03/05 12:12:45 INFO util.Utils: Successfully started service 'HTTP file server' on port 56452.
15/03/05 12:12:45 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/03/05 12:12:45 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
15/03/05 12:12:45 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
15/03/05 12:12:45 INFO ui.SparkUI: Started SparkUI at http://hadoopdev01.opsdatastore.com:4040
15/03/05 12:12:46 INFO impl.TimelineClientImpl: Timeline service address: http://hadoopdev02.opsdatastore.com:8188/ws/v1/timeline/
java.lang.NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at 

Re: Spark Build with Hadoop 2.6, yarn - encounter java.lang.NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer

2015-03-05 Thread Sean Owen
Jackson 1.9.13? And codehaus.jackson.version? That's already set by
the hadoop-2.4 profile.
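
A minimal sketch of checking what value the build actually resolves (help:evaluate prints the effective value of a property under the active profiles; the property name is the one mentioned above):

  mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 \
    help:evaluate -Dexpression=codehaus.jackson.version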

On Thu, Mar 5, 2015 at 6:13 PM, Ted Yu yuzhih...@gmail.com wrote:
 Please add the following to build command:
 -Djackson.version=1.9.3

 Cheers

 On Thu, Mar 5, 2015 at 10:04 AM, Todd Nist tsind...@gmail.com wrote:

 I am running Spark on a HortonWorks HDP Cluster. I have deployed there
 prebuilt version but it is only for Spark 1.2.0 not 1.2.1 and there are a
 few fixes and features in there that I would like to leverage.

 I just downloaded the spark-1.2.1 source and built it to support Hadoop
 2.6 by doing the following:

 radtech:spark-1.2.1 tnist$ ./make-distribution.sh --name hadoop2.6 --tgz
 -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver
 -DskipTests clean package

 When I deploy this to my hadoop cluster and kick of a spark-shell,

 $ spark-1.2.1-bin-hadoop2.6]# ./bin/spark-shell --master yarn-client
 --driver-memory 512m --executor-memory 512m

 Results in  java.lang.NoClassDefFoundError:
 org/codehaus/jackson/map/deser/std/StdDeserializer

 The full stack trace is below. I have validate that the
 $SPARK_HOME/lib/spark-assembly-1.2.1-hadoop2.6.0.jar does infact contain the
 class in question:

 jar -tvf spark-assembly-1.2.1-hadoop2.6.0.jar | grep
 'org/codehaus/jackson/map/deser/std'

 ...
  18002 Thu Mar 05 11:23:04 EST 2015
 parquet/org/codehaus/jackson/map/deser/std/StdDeserializer.class
   1584 Thu Mar 05 11:23:04 EST 2015
 parquet/org/codehaus/jackson/map/deser/std/StdKeyDeserializer$BoolKD.class
 ...

 Any guidance on what I missed ? If i start the spark-shell in standalone
 it comes up fine, $SPARK_HOME/bin/spark-shell so it looks to be related to
 starting it under yarn from what I can tell.

 TIA for the assistance.

 -Todd

 Stack Trace

 15/03/05 12:12:38 INFO spark.SecurityManager: Changing view acls to: root
 15/03/05 12:12:38 INFO spark.SecurityManager: Changing modify acls to:
 root
 15/03/05 12:12:38 INFO spark.SecurityManager: SecurityManager:
 authentication disabled; ui acls disabled; users with view permissions:
 Set(root); users with modify permissions: Set(root)
 15/03/05 12:12:38 INFO spark.HttpServer: Starting HTTP Server
 15/03/05 12:12:39 INFO server.Server: jetty-8.y.z-SNAPSHOT
 15/03/05 12:12:39 INFO server.AbstractConnector: Started
 SocketConnector@0.0.0.0:36176
 15/03/05 12:12:39 INFO util.Utils: Successfully started service 'HTTP
 class server' on port 36176.
 Welcome to
     __
  / __/__  ___ _/ /__
 _\ \/ _ \/ _ `/ __/  '_/
/___/ .__/\_,_/_/ /_/\_\   version 1.2.1
   /_/

 Using Scala version 2.10.4 (OpenJDK 64-Bit Server VM, Java 1.7.0_75)
 Type in expressions to have them evaluated.
 Type :help for more information.
 15/03/05 12:12:43 INFO spark.SecurityManager: Changing view acls to: root
 15/03/05 12:12:43 INFO spark.SecurityManager: Changing modify acls to:
 root
 15/03/05 12:12:43 INFO spark.SecurityManager: SecurityManager:
 authentication disabled; ui acls disabled; users with view permissions:
 Set(root); users with modify permissions: Set(root)
 15/03/05 12:12:44 INFO slf4j.Slf4jLogger: Slf4jLogger started
 15/03/05 12:12:44 INFO Remoting: Starting remoting
 15/03/05 12:12:44 INFO Remoting: Remoting started; listening on addresses
 :[akka.tcp://sparkdri...@hadoopdev01.opsdatastore.com:50544]
 15/03/05 12:12:44 INFO util.Utils: Successfully started service
 'sparkDriver' on port 50544.
 15/03/05 12:12:44 INFO spark.SparkEnv: Registering MapOutputTracker
 15/03/05 12:12:44 INFO spark.SparkEnv: Registering BlockManagerMaster
 15/03/05 12:12:44 INFO storage.DiskBlockManager: Created local directory
 at
 /tmp/spark-16402794-cc1e-42d0-9f9c-99f15eaa1861/spark-118bc6af-4008-45d7-a22f-491bcd1856c0
 15/03/05 12:12:44 INFO storage.MemoryStore: MemoryStore started with
 capacity 265.4 MB
 15/03/05 12:12:45 WARN util.NativeCodeLoader: Unable to load native-hadoop
 library for your platform... using builtin-java classes where applicable
 15/03/05 12:12:45 INFO spark.HttpFileServer: HTTP File server directory is
 /tmp/spark-5d7da34c-58d4-4d60-9b6a-3dce43cab39e/spark-4d65aacb-78bd-40fd-b6c0-53b47e288199
 15/03/05 12:12:45 INFO spark.HttpServer: Starting HTTP Server
 15/03/05 12:12:45 INFO server.Server: jetty-8.y.z-SNAPSHOT
 15/03/05 12:12:45 INFO server.AbstractConnector: Started
 SocketConnector@0.0.0.0:56452
 15/03/05 12:12:45 INFO util.Utils: Successfully started service 'HTTP file
 server' on port 56452.
 15/03/05 12:12:45 INFO server.Server: jetty-8.y.z-SNAPSHOT
 15/03/05 12:12:45 INFO server.AbstractConnector: Started
 SelectChannelConnector@0.0.0.0:4040
 15/03/05 12:12:45 INFO util.Utils: Successfully started service 'SparkUI'
 on port 4040.
 15/03/05 12:12:45 INFO ui.SparkUI: Started SparkUI at
 http://hadoopdev01.opsdatastore.com:4040
 15/03/05 12:12:46 INFO impl.TimelineClientImpl: Timeline service address:
 http://hadoopdev02.opsdatastore.com:8188/ws/v1/timeline/
 java.lang.NoClassDefFoundError:
 

Re: Spark Build with Hadoop 2.6, yarn - encounter java.lang.NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer

2015-03-05 Thread Victor Tso-Guillen
That particular class you did find is under parquet/... which means it was
shaded. Did you build your application against a hadoop2.6 dependency? The
maven central repo only has 2.2 but HDP has its own repos.
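
A minimal sketch of confirming whether the un-relocated class is present at all (jar -tf lists entry paths only, so anything not under the shaded parquet/ prefix would show up here; the jar name is the one from this thread):

  jar -tf spark-assembly-1.2.1-hadoop2.6.0.jar \
    | grep '^org/codehaus/jackson/map/deser/std/StdDeserializer'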

On Thu, Mar 5, 2015 at 10:04 AM, Todd Nist tsind...@gmail.com wrote:

 I am running Spark on a HortonWorks HDP Cluster. I have deployed there
 prebuilt version but it is only for Spark 1.2.0 not 1.2.1 and there are a
 few fixes and features in there that I would like to leverage.

 I just downloaded the spark-1.2.1 source and built it to support Hadoop
 2.6 by doing the following:

 radtech:spark-1.2.1 tnist$ ./make-distribution.sh --name hadoop2.6 --tgz 
 -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver 
 -DskipTests clean package

 When I deploy this to my hadoop cluster and kick of a spark-shell,

 $ spark-1.2.1-bin-hadoop2.6]# ./bin/spark-shell --master yarn-client 
 --driver-memory 512m --executor-memory 512m

 Results in  java.lang.NoClassDefFoundError:
 org/codehaus/jackson/map/deser/std/StdDeserializer

 The full stack trace is below. I have validate that the
 $SPARK_HOME/lib/spark-assembly-1.2.1-hadoop2.6.0.jar does infact contain
 the class in question:

 jar -tvf spark-assembly-1.2.1-hadoop2.6.0.jar | grep 
 'org/codehaus/jackson/map/deser/std'
 ...
  18002 Thu Mar 05 11:23:04 EST 2015  
 parquet/org/codehaus/jackson/map/deser/std/StdDeserializer.class
   1584 Thu Mar 05 11:23:04 EST 2015 
 parquet/org/codehaus/jackson/map/deser/std/StdKeyDeserializer$BoolKD.class...

 Any guidance on what I missed ? If i start the spark-shell in standalone
 it comes up fine, $SPARK_HOME/bin/spark-shell so it looks to be related
 to starting it under yarn from what I can tell.

 TIA for the assistance.

 -Todd
 Stack Trace

 15/03/05 12:12:38 INFO spark.SecurityManager: Changing view acls to: 
 root15/03/05 12:12:38 INFO spark.SecurityManager: Changing modify acls to: 
 root15/03/05 12:12:38 INFO spark.SecurityManager: SecurityManager: 
 authentication disabled; ui acls disabled; users with view permissions: 
 Set(root); users with modify permissions: Set(root)15/03/05 12:12:38 INFO 
 spark.HttpServer: Starting HTTP Server15/03/05 12:12:39 INFO server.Server: 
 jetty-8.y.z-SNAPSHOT15/03/05 12:12:39 INFO server.AbstractConnector: Started 
 SocketConnector@0.0.0.0:3617615/03/05 12:12:39 INFO util.Utils: Successfully 
 started service 'HTTP class server' on port 36176.
 Welcome to
     __
  / __/__  ___ _/ /__
 _\ \/ _ \/ _ `/ __/  '_/
/___/ .__/\_,_/_/ /_/\_\   version 1.2.1
   /_/

 Using Scala version 2.10.4 (OpenJDK 64-Bit Server VM, Java 1.7.0_75)
 Type in expressions to have them evaluated.
 Type :help for more information.15/03/05 12:12:43 INFO spark.SecurityManager: 
 Changing view acls to: root15/03/05 12:12:43 INFO spark.SecurityManager: 
 Changing modify acls to: root15/03/05 12:12:43 INFO spark.SecurityManager: 
 SecurityManager: authentication disabled; ui acls disabled; users with view 
 permissions: Set(root); users with modify permissions: Set(root)15/03/05 
 12:12:44 INFO slf4j.Slf4jLogger: Slf4jLogger started15/03/05 12:12:44 INFO 
 Remoting: Starting remoting15/03/05 12:12:44 INFO Remoting: Remoting started; 
 listening on addresses 
 :[akka.tcp://sparkdri...@hadoopdev01.opsdatastore.com:50544]15/03/05 12:12:44 
 INFO util.Utils: Successfully started service 'sparkDriver' on port 
 50544.15/03/05 12:12:44 INFO spark.SparkEnv: Registering 
 MapOutputTracker15/03/05 12:12:44 INFO spark.SparkEnv: Registering 
 BlockManagerMaster15/03/05 12:12:44 INFO storage.DiskBlockManager: Created 
 local directory at 
 /tmp/spark-16402794-cc1e-42d0-9f9c-99f15eaa1861/spark-118bc6af-4008-45d7-a22f-491bcd1856c015/03/05
  12:12:44 INFO storage.MemoryStore: MemoryStore started with capacity 265.4 
 MB15/03/05 12:12:45 WARN util.NativeCodeLoader: Unable to load native-hadoop 
 library for your platform... using builtin-java classes where 
 applicable15/03/05 12:12:45 INFO spark.HttpFileServer: HTTP File server 
 directory is 
 /tmp/spark-5d7da34c-58d4-4d60-9b6a-3dce43cab39e/spark-4d65aacb-78bd-40fd-b6c0-53b47e28819915/03/05
  12:12:45 INFO spark.HttpServer: Starting HTTP Server15/03/05 12:12:45 INFO 
 server.Server: jetty-8.y.z-SNAPSHOT15/03/05 12:12:45 INFO 
 server.AbstractConnector: Started SocketConnector@0.0.0.0:5645215/03/05 
 12:12:45 INFO util.Utils: Successfully started service 'HTTP file server' on 
 port 56452.15/03/05 12:12:45 INFO server.Server: jetty-8.y.z-SNAPSHOT15/03/05 
 12:12:45 INFO server.AbstractConnector: Started 
 SelectChannelConnector@0.0.0.0:404015/03/05 12:12:45 INFO util.Utils: 
 Successfully started service 'SparkUI' on port 4040.15/03/05 12:12:45 INFO 
 ui.SparkUI: Started SparkUI at 
 http://hadoopdev01.opsdatastore.com:404015/03/05 12:12:46 INFO 
 impl.TimelineClientImpl: Timeline service address: 
 http://hadoopdev02.opsdatastore.com:8188/ws/v1/timeline/
 java.lang.NoClassDefFoundError: 
 

Re: Spark Build with Hadoop 2.6, yarn - encounter java.lang.NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer

2015-03-05 Thread Ted Yu
Please add the following to build command:
-Djackson.version=1.9.3
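
For clarity, a sketch of the full build command from earlier in this thread with that flag added, exactly as suggested (whether the property is actually honored is questioned later in the thread):

  ./make-distribution.sh --name hadoop2.6 --tgz -Pyarn -Phadoop-2.4 \
    -Dhadoop.version=2.6.0 -Djackson.version=1.9.3 \
    -Phive -Phive-thriftserver -DskipTests clean package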

Cheers

On Thu, Mar 5, 2015 at 10:04 AM, Todd Nist tsind...@gmail.com wrote:

 I am running Spark on a HortonWorks HDP Cluster. I have deployed there
 prebuilt version but it is only for Spark 1.2.0 not 1.2.1 and there are a
 few fixes and features in there that I would like to leverage.

 I just downloaded the spark-1.2.1 source and built it to support Hadoop
 2.6 by doing the following:

 radtech:spark-1.2.1 tnist$ ./make-distribution.sh --name hadoop2.6 --tgz 
 -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver 
 -DskipTests clean package

 When I deploy this to my hadoop cluster and kick of a spark-shell,

 $ spark-1.2.1-bin-hadoop2.6]# ./bin/spark-shell --master yarn-client 
 --driver-memory 512m --executor-memory 512m

 Results in  java.lang.NoClassDefFoundError:
 org/codehaus/jackson/map/deser/std/StdDeserializer

 The full stack trace is below. I have validate that the
 $SPARK_HOME/lib/spark-assembly-1.2.1-hadoop2.6.0.jar does infact contain
 the class in question:

 jar -tvf spark-assembly-1.2.1-hadoop2.6.0.jar | grep 
 'org/codehaus/jackson/map/deser/std'
 ...
  18002 Thu Mar 05 11:23:04 EST 2015  
 parquet/org/codehaus/jackson/map/deser/std/StdDeserializer.class
   1584 Thu Mar 05 11:23:04 EST 2015 
 parquet/org/codehaus/jackson/map/deser/std/StdKeyDeserializer$BoolKD.class...

 Any guidance on what I missed ? If i start the spark-shell in standalone
 it comes up fine, $SPARK_HOME/bin/spark-shell so it looks to be related
 to starting it under yarn from what I can tell.

 TIA for the assistance.

 -Todd
 Stack Trace

 15/03/05 12:12:38 INFO spark.SecurityManager: Changing view acls to: 
 root15/03/05 12:12:38 INFO spark.SecurityManager: Changing modify acls to: 
 root15/03/05 12:12:38 INFO spark.SecurityManager: SecurityManager: 
 authentication disabled; ui acls disabled; users with view permissions: 
 Set(root); users with modify permissions: Set(root)15/03/05 12:12:38 INFO 
 spark.HttpServer: Starting HTTP Server15/03/05 12:12:39 INFO server.Server: 
 jetty-8.y.z-SNAPSHOT15/03/05 12:12:39 INFO server.AbstractConnector: Started 
 SocketConnector@0.0.0.0:3617615/03/05 12:12:39 INFO util.Utils: Successfully 
 started service 'HTTP class server' on port 36176.
 Welcome to
     __
  / __/__  ___ _/ /__
 _\ \/ _ \/ _ `/ __/  '_/
/___/ .__/\_,_/_/ /_/\_\   version 1.2.1
   /_/

 Using Scala version 2.10.4 (OpenJDK 64-Bit Server VM, Java 1.7.0_75)
 Type in expressions to have them evaluated.
 Type :help for more information.15/03/05 12:12:43 INFO spark.SecurityManager: 
 Changing view acls to: root15/03/05 12:12:43 INFO spark.SecurityManager: 
 Changing modify acls to: root15/03/05 12:12:43 INFO spark.SecurityManager: 
 SecurityManager: authentication disabled; ui acls disabled; users with view 
 permissions: Set(root); users with modify permissions: Set(root)15/03/05 
 12:12:44 INFO slf4j.Slf4jLogger: Slf4jLogger started15/03/05 12:12:44 INFO 
 Remoting: Starting remoting15/03/05 12:12:44 INFO Remoting: Remoting started; 
 listening on addresses 
 :[akka.tcp://sparkdri...@hadoopdev01.opsdatastore.com:50544]15/03/05 12:12:44 
 INFO util.Utils: Successfully started service 'sparkDriver' on port 
 50544.15/03/05 12:12:44 INFO spark.SparkEnv: Registering 
 MapOutputTracker15/03/05 12:12:44 INFO spark.SparkEnv: Registering 
 BlockManagerMaster15/03/05 12:12:44 INFO storage.DiskBlockManager: Created 
 local directory at 
 /tmp/spark-16402794-cc1e-42d0-9f9c-99f15eaa1861/spark-118bc6af-4008-45d7-a22f-491bcd1856c015/03/05
  12:12:44 INFO storage.MemoryStore: MemoryStore started with capacity 265.4 
 MB15/03/05 12:12:45 WARN util.NativeCodeLoader: Unable to load native-hadoop 
 library for your platform... using builtin-java classes where 
 applicable15/03/05 12:12:45 INFO spark.HttpFileServer: HTTP File server 
 directory is 
 /tmp/spark-5d7da34c-58d4-4d60-9b6a-3dce43cab39e/spark-4d65aacb-78bd-40fd-b6c0-53b47e28819915/03/05
  12:12:45 INFO spark.HttpServer: Starting HTTP Server15/03/05 12:12:45 INFO 
 server.Server: jetty-8.y.z-SNAPSHOT15/03/05 12:12:45 INFO 
 server.AbstractConnector: Started SocketConnector@0.0.0.0:5645215/03/05 
 12:12:45 INFO util.Utils: Successfully started service 'HTTP file server' on 
 port 56452.15/03/05 12:12:45 INFO server.Server: jetty-8.y.z-SNAPSHOT15/03/05 
 12:12:45 INFO server.AbstractConnector: Started 
 SelectChannelConnector@0.0.0.0:404015/03/05 12:12:45 INFO util.Utils: 
 Successfully started service 'SparkUI' on port 4040.15/03/05 12:12:45 INFO 
 ui.SparkUI: Started SparkUI at 
 http://hadoopdev01.opsdatastore.com:404015/03/05 12:12:46 INFO 
 impl.TimelineClientImpl: Timeline service address: 
 http://hadoopdev02.opsdatastore.com:8188/ws/v1/timeline/
 java.lang.NoClassDefFoundError: 
 org/codehaus/jackson/map/deser/std/StdDeserializer
 at java.lang.ClassLoader.defineClass1(Native Method)
 at 

Re: Spark Build with Hadoop 2.6, yarn - encounter java.lang.NoClassDefFoundError: org/codehaus/jackson/map/deser/std/StdDeserializer

2015-03-05 Thread Todd Nist
@Victor,

I'm pretty sure I built it correctly; I specified -Dhadoop.version=2.6.0.
Am I missing something here? I followed the docs on this, but I'm open to
suggestions.

make-distribution.sh --name hadoop2.6 --tgz -Pyarn -Phadoop-2.4
-Dhadoop.version=2.6.0 -Phive -Phive-thriftserver -DskipTests clean
package

@Ted
Well, it is building now with -Djackson.version=1.9.3; I can update in a
few on whether it works.

@Sean
Since it is in the process of building, I will let it finish and try it out,
but do you see any other possible issues with the approach I have taken?

Thanks all for the quick responses.

-Todd

On Thu, Mar 5, 2015 at 1:20 PM, Sean Owen so...@cloudera.com wrote:

 Jackson 1.9.13? and codehaus.jackson.version? that's already set by
 the profile hadoop-2.4.

 On Thu, Mar 5, 2015 at 6:13 PM, Ted Yu yuzhih...@gmail.com wrote:
  Please add the following to build command:
  -Djackson.version=1.9.3
 
  Cheers
 
  On Thu, Mar 5, 2015 at 10:04 AM, Todd Nist tsind...@gmail.com wrote:
 
  I am running Spark on a HortonWorks HDP Cluster. I have deployed there
  prebuilt version but it is only for Spark 1.2.0 not 1.2.1 and there are
 a
  few fixes and features in there that I would like to leverage.
 
  I just downloaded the spark-1.2.1 source and built it to support Hadoop
  2.6 by doing the following:
 
  radtech:spark-1.2.1 tnist$ ./make-distribution.sh --name hadoop2.6 --tgz
  -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver
  -DskipTests clean package
 
  When I deploy this to my hadoop cluster and kick of a spark-shell,
 
  $ spark-1.2.1-bin-hadoop2.6]# ./bin/spark-shell --master yarn-client
  --driver-memory 512m --executor-memory 512m
 
  Results in  java.lang.NoClassDefFoundError:
  org/codehaus/jackson/map/deser/std/StdDeserializer
 
  The full stack trace is below. I have validate that the
  $SPARK_HOME/lib/spark-assembly-1.2.1-hadoop2.6.0.jar does infact
 contain the
  class in question:
 
  jar -tvf spark-assembly-1.2.1-hadoop2.6.0.jar | grep
  'org/codehaus/jackson/map/deser/std'
 
  ...
   18002 Thu Mar 05 11:23:04 EST 2015
  parquet/org/codehaus/jackson/map/deser/std/StdDeserializer.class
1584 Thu Mar 05 11:23:04 EST 2015
 
 parquet/org/codehaus/jackson/map/deser/std/StdKeyDeserializer$BoolKD.class
  ...
 
  Any guidance on what I missed ? If i start the spark-shell in standalone
  it comes up fine, $SPARK_HOME/bin/spark-shell so it looks to be related
 to
  starting it under yarn from what I can tell.
 
  TIA for the assistance.
 
  -Todd
 
  Stack Trace
 
  15/03/05 12:12:38 INFO spark.SecurityManager: Changing view acls to:
 root
  15/03/05 12:12:38 INFO spark.SecurityManager: Changing modify acls to:
  root
  15/03/05 12:12:38 INFO spark.SecurityManager: SecurityManager:
  authentication disabled; ui acls disabled; users with view permissions:
  Set(root); users with modify permissions: Set(root)
  15/03/05 12:12:38 INFO spark.HttpServer: Starting HTTP Server
  15/03/05 12:12:39 INFO server.Server: jetty-8.y.z-SNAPSHOT
  15/03/05 12:12:39 INFO server.AbstractConnector: Started
  SocketConnector@0.0.0.0:36176
  15/03/05 12:12:39 INFO util.Utils: Successfully started service 'HTTP
  class server' on port 36176.
  Welcome to
  __
   / __/__  ___ _/ /__
  _\ \/ _ \/ _ `/ __/  '_/
 /___/ .__/\_,_/_/ /_/\_\   version 1.2.1
/_/
 
  Using Scala version 2.10.4 (OpenJDK 64-Bit Server VM, Java 1.7.0_75)
  Type in expressions to have them evaluated.
  Type :help for more information.
  15/03/05 12:12:43 INFO spark.SecurityManager: Changing view acls to:
 root
  15/03/05 12:12:43 INFO spark.SecurityManager: Changing modify acls to:
  root
  15/03/05 12:12:43 INFO spark.SecurityManager: SecurityManager:
  authentication disabled; ui acls disabled; users with view permissions:
  Set(root); users with modify permissions: Set(root)
  15/03/05 12:12:44 INFO slf4j.Slf4jLogger: Slf4jLogger started
  15/03/05 12:12:44 INFO Remoting: Starting remoting
  15/03/05 12:12:44 INFO Remoting: Remoting started; listening on
 addresses
  :[akka.tcp://sparkdri...@hadoopdev01.opsdatastore.com:50544]
  15/03/05 12:12:44 INFO util.Utils: Successfully started service
  'sparkDriver' on port 50544.
  15/03/05 12:12:44 INFO spark.SparkEnv: Registering MapOutputTracker
  15/03/05 12:12:44 INFO spark.SparkEnv: Registering BlockManagerMaster
  15/03/05 12:12:44 INFO storage.DiskBlockManager: Created local directory
  at
 
 /tmp/spark-16402794-cc1e-42d0-9f9c-99f15eaa1861/spark-118bc6af-4008-45d7-a22f-491bcd1856c0
  15/03/05 12:12:44 INFO storage.MemoryStore: MemoryStore started with
  capacity 265.4 MB
  15/03/05 12:12:45 WARN util.NativeCodeLoader: Unable to load
 native-hadoop
  library for your platform... using builtin-java classes where applicable
  15/03/05 12:12:45 INFO spark.HttpFileServer: HTTP File server directory
 is
 
 /tmp/spark-5d7da34c-58d4-4d60-9b6a-3dce43cab39e/spark-4d65aacb-78bd-40fd-b6c0-53b47e288199
  15/03/05 12:12:45 INFO 

Re: Is it safe to use Scala 2.11 for Spark build?

2014-11-18 Thread Jianshi Huang
Ok, I'll wait until -Pscala-2.11 is more stable and used by more people.

Thanks for the help!

Jianshi

On Tue, Nov 18, 2014 at 3:49 PM, Ye Xianjin advance...@gmail.com wrote:

 Hi Prashant Sharma,

 It's not even ok to build with scala-2.11 profile on my machine.

 Just check out the master(c6e0c2ab1c29c184a9302d23ad75e4ccd8060242)
 run sbt/sbt -Pscala-2.11 clean assembly:

 .. skip the normal part
 info] Resolving org.scalamacros#quasiquotes_2.11;2.0.1 ...
 [warn] module not found: org.scalamacros#quasiquotes_2.11;2.0.1
 [warn]  local: tried
 [warn]
 /Users/yexianjin/.ivy2/local/org.scalamacros/quasiquotes_2.11/2.0.1/ivys/ivy.xml
 [warn]  public: tried
 [warn]
 https://repo1.maven.org/maven2/org/scalamacros/quasiquotes_2.11/2.0.1/quasiquotes_2.11-2.0.1.pom
 [warn]  central: tried
 [warn]
 https://repo1.maven.org/maven2/org/scalamacros/quasiquotes_2.11/2.0.1/quasiquotes_2.11-2.0.1.pom
 [warn]  apache-repo: tried
 [warn]
 https://repository.apache.org/content/repositories/releases/org/scalamacros/quasiquotes_2.11/2.0.1/quasiquotes_2.11-2.0.1.pom
 [warn]  jboss-repo: tried
 [warn]
 https://repository.jboss.org/nexus/content/repositories/releases/org/scalamacros/quasiquotes_2.11/2.0.1/quasiquotes_2.11-2.0.1.pom
 [warn]  mqtt-repo: tried
 [warn]
 https://repo.eclipse.org/content/repositories/paho-releases/org/scalamacros/quasiquotes_2.11/2.0.1/quasiquotes_2.11-2.0.1.pom
 [warn]  cloudera-repo: tried
 [warn]
 https://repository.cloudera.com/artifactory/cloudera-repos/org/scalamacros/quasiquotes_2.11/2.0.1/quasiquotes_2.11-2.0.1.pom
 [warn]  mapr-repo: tried
 [warn]
 http://repository.mapr.com/maven/org/scalamacros/quasiquotes_2.11/2.0.1/quasiquotes_2.11-2.0.1.pom
 [warn]  spring-releases: tried
 [warn]
 https://repo.spring.io/libs-release/org/scalamacros/quasiquotes_2.11/2.0.1/quasiquotes_2.11-2.0.1.pom
 [warn]  spark-staging: tried
 [warn]
 https://oss.sonatype.org/content/repositories/orgspark-project-1085/org/scalamacros/quasiquotes_2.11/2.0.1/quasiquotes_2.11-2.0.1.pom
 [warn]  spark-staging-hive13: tried
 [warn]
 https://oss.sonatype.org/content/repositories/orgspark-project-1089/org/scalamacros/quasiquotes_2.11/2.0.1/quasiquotes_2.11-2.0.1.pom
 [warn]  apache.snapshots: tried
 [warn]
 http://repository.apache.org/snapshots/org/scalamacros/quasiquotes_2.11/2.0.1/quasiquotes_2.11-2.0.1.pom
 [warn]  Maven2 Local: tried
 [warn]
 file:/Users/yexianjin/.m2/repository/org/scalamacros/quasiquotes_2.11/2.0.1/quasiquotes_2.11-2.0.1.pom
 [info] Resolving jline#jline;2.12 ...
 [warn] ::
 [warn] ::  UNRESOLVED DEPENDENCIES ::
 [warn] ::
 [warn] :: org.scalamacros#quasiquotes_2.11;2.0.1: not found
 [warn] ::
 [info] Resolving org.scala-lang#scala-library;2.11.2 ...
 [warn]
 [warn] Note: Unresolved dependencies path:
 [warn] org.scalamacros:quasiquotes_2.11:2.0.1
 ((com.typesafe.sbt.pom.MavenHelper) MavenHelper.scala#L76)
 [warn]   +- org.apache.spark:spark-catalyst_2.11:1.2.0-SNAPSHOT
 [info] Resolving jline#jline;2.12 ...
 [info] Done updating.
 [info] Updating {file:/Users/yexianjin/spark/}streaming-twitter...
 [info] Updating {file:/Users/yexianjin/spark/}streaming-zeromq...
 [info] Updating {file:/Users/yexianjin/spark/}streaming-flume...
 [info] Updating {file:/Users/yexianjin/spark/}streaming-mqtt...
 [info] Resolving jline#jline;2.12 ...
 [info] Done updating.
 [info] Resolving com.esotericsoftware.minlog#minlog;1.2 ...
 [info] Updating {file:/Users/yexianjin/spark/}streaming-kafka...
 [info] Resolving jline#jline;2.12 ...
 [info] Done updating.
 [info] Resolving jline#jline;2.12 ...
 [info] Done updating.
 [info] Resolving jline#jline;2.12 ...
 [info] Done updating.
 [info] Resolving org.apache.kafka#kafka_2.11;0.8.0 ...
 [warn] module not found: org.apache.kafka#kafka_2.11;0.8.0
 [warn]  local: tried
 [warn]
 /Users/yexianjin/.ivy2/local/org.apache.kafka/kafka_2.11/0.8.0/ivys/ivy.xml
 [warn]  public: tried
 [warn]
 https://repo1.maven.org/maven2/org/apache/kafka/kafka_2.11/0.8.0/kafka_2.11-0.8.0.pom
 [warn]  central: tried
 [warn]
 https://repo1.maven.org/maven2/org/apache/kafka/kafka_2.11/0.8.0/kafka_2.11-0.8.0.pom
 [warn]  apache-repo: tried
 [warn]
 https://repository.apache.org/content/repositories/releases/org/apache/kafka/kafka_2.11/0.8.0/kafka_2.11-0.8.0.pom
 [warn]  jboss-repo: tried
 [warn]
 https://repository.jboss.org/nexus/content/repositories/releases/org/apache/kafka/kafka_2.11/0.8.0/kafka_2.11-0.8.0.pom
 [warn]  mqtt-repo: tried
 [warn]
 https://repo.eclipse.org/content/repositories/paho-releases/org/apache/kafka/kafka_2.11/0.8.0/kafka_2.11-0.8.0.pom
 [warn]  cloudera-repo: tried
 [warn]
 https://repository.cloudera.com/artifactory/cloudera-repos/org/apache/kafka/kafka_2.11/0.8.0/kafka_2.11-0.8.0.pom
 [warn]  mapr-repo: tried
 [warn]
 

Is it safe to use Scala 2.11 for Spark build?

2014-11-17 Thread Jianshi Huang
Any notable issues for using Scala 2.11? Is it stable now?

Or can I use Scala 2.11 in my spark application and use Spark dist build
with 2.10 ?

I'm looking forward to migrating to 2.11 for some quasiquote features.
Couldn't make it run in 2.10...

Cheers,
-- 
Jianshi Huang

LinkedIn: jianshi
Twitter: @jshuang
Github  Blog: http://huangjs.github.com/


Re: Is it safe to use Scala 2.11 for Spark build?

2014-11-17 Thread Prashant Sharma
It is safe in the sense that we would help you with a fix if you run into
issues. I have used it, but since I worked on the patch my opinion may be
biased. I am using Scala 2.11 for day-to-day development. You should check
out the build instructions here:
https://github.com/ScrapCodes/spark-1/blob/patch-3/docs/building-spark.md
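
For anyone following along, a hedged sketch of the Maven-based 2.11 build as documented around this time (the helper script name and profile may differ on your checkout, so treat this as illustrative only):

  ./dev/change-version-to-2.11.sh
  mvn -Pscala-2.11 -DskipTests clean package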

Prashant Sharma



On Tue, Nov 18, 2014 at 12:19 PM, Jianshi Huang jianshi.hu...@gmail.com
wrote:

 Any notable issues for using Scala 2.11? Is it stable now?

 Or can I use Scala 2.11 in my spark application and use Spark dist build
 with 2.10 ?

 I'm looking forward to migrate to 2.11 for some quasiquote features.
 Couldn't make it run in 2.10...

 Cheers,
 --
 Jianshi Huang

 LinkedIn: jianshi
 Twitter: @jshuang
 Github  Blog: http://huangjs.github.com/



Re: Is it safe to use Scala 2.11 for Spark build?

2014-11-17 Thread Prashant Sharma
Looks like sbt/sbt -Pscala-2.11 is broken by a recent patch for improving
the maven build.

Prashant Sharma



On Tue, Nov 18, 2014 at 12:57 PM, Prashant Sharma scrapco...@gmail.com
wrote:

 It is safe in the sense we would help you with the fix if you run into
 issues. I have used it, but since I worked on the patch the opinion can be
 biased. I am using scala 2.11 for day to day development. You should
 checkout the build instructions here :
 https://github.com/ScrapCodes/spark-1/blob/patch-3/docs/building-spark.md

 Prashant Sharma



 On Tue, Nov 18, 2014 at 12:19 PM, Jianshi Huang jianshi.hu...@gmail.com
 wrote:

 Any notable issues for using Scala 2.11? Is it stable now?

 Or can I use Scala 2.11 in my spark application and use Spark dist build
 with 2.10 ?

 I'm looking forward to migrate to 2.11 for some quasiquote features.
 Couldn't make it run in 2.10...

 Cheers,
 --
 Jianshi Huang

 LinkedIn: jianshi
 Twitter: @jshuang
 Github  Blog: http://huangjs.github.com/





Re: Is it safe to use Scala 2.11 for Spark build?

2014-11-17 Thread Ye Xianjin
Hi Prashant Sharma, 

It's not even ok to build with scala-2.11 profile on my machine.

Just check out the master(c6e0c2ab1c29c184a9302d23ad75e4ccd8060242)
run sbt/sbt -Pscala-2.11 clean assembly:

.. skip the normal part
info] Resolving org.scalamacros#quasiquotes_2.11;2.0.1 ...
[warn] module not found: org.scalamacros#quasiquotes_2.11;2.0.1
[warn]  local: tried
[warn]   
/Users/yexianjin/.ivy2/local/org.scalamacros/quasiquotes_2.11/2.0.1/ivys/ivy.xml
[warn]  public: tried
[warn]   
https://repo1.maven.org/maven2/org/scalamacros/quasiquotes_2.11/2.0.1/quasiquotes_2.11-2.0.1.pom
[warn]  central: tried
[warn]   
https://repo1.maven.org/maven2/org/scalamacros/quasiquotes_2.11/2.0.1/quasiquotes_2.11-2.0.1.pom
[warn]  apache-repo: tried
[warn]   
https://repository.apache.org/content/repositories/releases/org/scalamacros/quasiquotes_2.11/2.0.1/quasiquotes_2.11-2.0.1.pom
[warn]  jboss-repo: tried
[warn]   
https://repository.jboss.org/nexus/content/repositories/releases/org/scalamacros/quasiquotes_2.11/2.0.1/quasiquotes_2.11-2.0.1.pom
[warn]  mqtt-repo: tried
[warn]   
https://repo.eclipse.org/content/repositories/paho-releases/org/scalamacros/quasiquotes_2.11/2.0.1/quasiquotes_2.11-2.0.1.pom
[warn]  cloudera-repo: tried
[warn]   
https://repository.cloudera.com/artifactory/cloudera-repos/org/scalamacros/quasiquotes_2.11/2.0.1/quasiquotes_2.11-2.0.1.pom
[warn]  mapr-repo: tried
[warn]   
http://repository.mapr.com/maven/org/scalamacros/quasiquotes_2.11/2.0.1/quasiquotes_2.11-2.0.1.pom
[warn]  spring-releases: tried
[warn]   
https://repo.spring.io/libs-release/org/scalamacros/quasiquotes_2.11/2.0.1/quasiquotes_2.11-2.0.1.pom
[warn]  spark-staging: tried
[warn]   
https://oss.sonatype.org/content/repositories/orgspark-project-1085/org/scalamacros/quasiquotes_2.11/2.0.1/quasiquotes_2.11-2.0.1.pom
[warn]  spark-staging-hive13: tried
[warn]   
https://oss.sonatype.org/content/repositories/orgspark-project-1089/org/scalamacros/quasiquotes_2.11/2.0.1/quasiquotes_2.11-2.0.1.pom
[warn]  apache.snapshots: tried
[warn]   
http://repository.apache.org/snapshots/org/scalamacros/quasiquotes_2.11/2.0.1/quasiquotes_2.11-2.0.1.pom
[warn]  Maven2 Local: tried
[warn]   
file:/Users/yexianjin/.m2/repository/org/scalamacros/quasiquotes_2.11/2.0.1/quasiquotes_2.11-2.0.1.pom
[info] Resolving jline#jline;2.12 ...
[warn] ::
[warn] ::  UNRESOLVED DEPENDENCIES ::
[warn] ::
[warn] :: org.scalamacros#quasiquotes_2.11;2.0.1: not found
[warn] ::
[info] Resolving org.scala-lang#scala-library;2.11.2 ...
[warn]
[warn] Note: Unresolved dependencies path:
[warn] org.scalamacros:quasiquotes_2.11:2.0.1 
((com.typesafe.sbt.pom.MavenHelper) MavenHelper.scala#L76)
[warn]  +- org.apache.spark:spark-catalyst_2.11:1.2.0-SNAPSHOT
[info] Resolving jline#jline;2.12 ...
[info] Done updating.
[info] Updating {file:/Users/yexianjin/spark/}streaming-twitter...
[info] Updating {file:/Users/yexianjin/spark/}streaming-zeromq...
[info] Updating {file:/Users/yexianjin/spark/}streaming-flume...
[info] Updating {file:/Users/yexianjin/spark/}streaming-mqtt...
[info] Resolving jline#jline;2.12 ...
[info] Done updating.
[info] Resolving com.esotericsoftware.minlog#minlog;1.2 ...
[info] Updating {file:/Users/yexianjin/spark/}streaming-kafka...
[info] Resolving jline#jline;2.12 ...
[info] Done updating.
[info] Resolving jline#jline;2.12 ...
[info] Done updating.
[info] Resolving jline#jline;2.12 ...
[info] Done updating.
[info] Resolving org.apache.kafka#kafka_2.11;0.8.0 ...
[warn] module not found: org.apache.kafka#kafka_2.11;0.8.0
[warn]  local: tried
[warn]   
/Users/yexianjin/.ivy2/local/org.apache.kafka/kafka_2.11/0.8.0/ivys/ivy.xml
[warn]  public: tried
[warn]   
https://repo1.maven.org/maven2/org/apache/kafka/kafka_2.11/0.8.0/kafka_2.11-0.8.0.pom
[warn]  central: tried
[warn]   
https://repo1.maven.org/maven2/org/apache/kafka/kafka_2.11/0.8.0/kafka_2.11-0.8.0.pom
[warn]  apache-repo: tried
[warn]   
https://repository.apache.org/content/repositories/releases/org/apache/kafka/kafka_2.11/0.8.0/kafka_2.11-0.8.0.pom
[warn]  jboss-repo: tried
[warn]   
https://repository.jboss.org/nexus/content/repositories/releases/org/apache/kafka/kafka_2.11/0.8.0/kafka_2.11-0.8.0.pom
[warn]  mqtt-repo: tried
[warn]   
https://repo.eclipse.org/content/repositories/paho-releases/org/apache/kafka/kafka_2.11/0.8.0/kafka_2.11-0.8.0.pom
[warn]  cloudera-repo: tried
[warn]   
https://repository.cloudera.com/artifactory/cloudera-repos/org/apache/kafka/kafka_2.11/0.8.0/kafka_2.11-0.8.0.pom
[warn]  mapr-repo: tried
[warn]   
http://repository.mapr.com/maven/org/apache/kafka/kafka_2.11/0.8.0/kafka_2.11-0.8.0.pom
[warn]  spring-releases: tried
[warn]   
https://repo.spring.io/libs-release/org/apache/kafka/kafka_2.11/0.8.0/kafka_2.11-0.8.0.pom
[warn]  

Spark Build

2014-10-31 Thread Terry Siu
I am synced up to the Spark master branch as of commit 23468e7e96. I have Maven 
3.0.5, Scala 2.10.3, and SBT 0.13.1. I’ve built the master branch successfully 
previously and am trying to rebuild again to take advantage of the new Hive 
0.13.1 profile. I execute the following command:

$ mvn -DskipTests -Phive-0.13-1 -Phadoop-2.4 -Pyarn clean package

The build fails at the following stage:


[INFO] Using incremental compilation

[INFO] compiler plugin: 
BasicArtifact(org.scalamacros,paradise_2.10.4,2.0.1,null)

[INFO] Compiling 5 Scala sources to 
/home/terrys/Applications/spark/yarn/stable/target/scala-2.10/test-classes...

[ERROR] 
/home/terrys/Applications/spark/yarn/common/src/test/scala/org/apache/spark/deploy/yarn/YarnAllocatorSuite.scala:20:
 object MemLimitLogger is not a member of package org.apache.spark.deploy.yarn

[ERROR] import org.apache.spark.deploy.yarn.MemLimitLogger._

[ERROR] ^

[ERROR] 
/home/terrys/Applications/spark/yarn/common/src/test/scala/org/apache/spark/deploy/yarn/YarnAllocatorSuite.scala:29:
 not found: value memLimitExceededLogMessage

[ERROR] val vmemMsg = memLimitExceededLogMessage(diagnostics, 
VMEM_EXCEEDED_PATTERN)

[ERROR]   ^

[ERROR] 
/home/terrys/Applications/spark/yarn/common/src/test/scala/org/apache/spark/deploy/yarn/YarnAllocatorSuite.scala:30:
 not found: value memLimitExceededLogMessage

[ERROR] val pmemMsg = memLimitExceededLogMessage(diagnostics, 
PMEM_EXCEEDED_PATTERN)

[ERROR]   ^

[ERROR] three errors found

[INFO] 

[INFO] Reactor Summary:

[INFO]

[INFO] Spark Project Parent POM .. SUCCESS [2.758s]

[INFO] Spark Project Common Network Code . SUCCESS [6.716s]

[INFO] Spark Project Core  SUCCESS [2:46.610s]

[INFO] Spark Project Bagel ... SUCCESS [16.776s]

[INFO] Spark Project GraphX .. SUCCESS [52.159s]

[INFO] Spark Project Streaming ... SUCCESS [1:09.883s]

[INFO] Spark Project ML Library .. SUCCESS [1:18.932s]

[INFO] Spark Project Tools ... SUCCESS [10.210s]

[INFO] Spark Project Catalyst  SUCCESS [1:12.499s]

[INFO] Spark Project SQL . SUCCESS [1:10.561s]

[INFO] Spark Project Hive  SUCCESS [1:08.571s]

[INFO] Spark Project REPL  SUCCESS [32.377s]

[INFO] Spark Project YARN Parent POM . SUCCESS [1.317s]

[INFO] Spark Project YARN Stable API . FAILURE [25.918s]

[INFO] Spark Project Assembly  SKIPPED

[INFO] Spark Project External Twitter  SKIPPED

[INFO] Spark Project External Kafka .. SKIPPED

[INFO] Spark Project External Flume Sink . SKIPPED

[INFO] Spark Project External Flume .. SKIPPED

[INFO] Spark Project External ZeroMQ . SKIPPED

[INFO] Spark Project External MQTT ... SKIPPED

[INFO] Spark Project Examples  SKIPPED

[INFO] 

[INFO] BUILD FAILURE

[INFO] 

[INFO] Total time: 11:15.889s

[INFO] Finished at: Fri Oct 31 12:08:55 PDT 2014

[INFO] Final Memory: 73M/829M

[INFO] 

[ERROR] Failed to execute goal 
net.alchim31.maven:scala-maven-plugin:3.2.0:testCompile 
(scala-test-compile-first) on project spark-yarn_2.10: Execution 
scala-test-compile-first of goal 
net.alchim31.maven:scala-maven-plugin:3.2.0:testCompile failed. CompileFailed 
- [Help 1]

[ERROR]

[ERROR] To see the full stack trace of the errors, re-run Maven with the -e 
switch.

[ERROR] Re-run Maven using the -X switch to enable full debug logging.

[ERROR]

[ERROR] For more information about the errors and possible solutions, please 
read the following articles:

[ERROR] [Help 1] 
http://cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException

[ERROR]

[ERROR] After correcting the problems, you can resume the build with the command

[ERROR]   mvn <goals> -rf :spark-yarn_2.10


I could not find MemLimitLogger anywhere in the Spark code. Anybody else 
seen/encounter this?
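
For reference, a minimal way to check whether the symbol was recently removed or renamed (a sketch, assuming a local git clone of the Spark repo) is:

  # search the current tree for any definition or use of the missing object
  git grep -n MemLimitLogger
  # list commits that added or removed references to the missing helper under yarn/
  git log --oneline -S memLimitExceededLogMessage -- yarn/

If git grep returns nothing but git log shows a recent commit, the test is most likely referencing code that was just refactored away.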


Thanks,

-Terry






Re: Spark Build

2014-10-31 Thread Shivaram Venkataraman
Yeah looks like https://github.com/apache/spark/pull/2744 broke the
build. We will fix it soon
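
Until the fix lands, one possible interim workaround (a sketch, assuming the regression really did come in through a recent change under yarn/) is to build from the commit just before the breaking change:

  # show the last few commits that touched the failing test suite
  git log --oneline -5 -- yarn/common/src/test/scala/org/apache/spark/deploy/yarn/YarnAllocatorSuite.scala
  # check out the commit preceding the breaking change (placeholder, substitute the SHA git log prints)
  git checkout <commit-before-the-change>
  mvn -DskipTests -Phive-0.13-1 -Phadoop-2.4 -Pyarn clean package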

On Fri, Oct 31, 2014 at 12:21 PM, Terry Siu terry@smartfocus.com wrote:
 I am synced up to the Spark master branch as of commit 23468e7e96. I have
 Maven 3.0.5, Scala 2.10.3, and SBT 0.13.1. I’ve built the master branch
 successfully previously and am trying to rebuild again to take advantage of
 the new Hive 0.13.1 profile. I execute the following command:

 $ mvn -DskipTests -Phive-0.13-1 -Phadoop-2.4 -Pyarn clean package

 The build fails at the following stage:

 [INFO] Using incremental compilation

 [INFO] compiler plugin:
 BasicArtifact(org.scalamacros,paradise_2.10.4,2.0.1,null)

 [INFO] Compiling 5 Scala sources to
 /home/terrys/Applications/spark/yarn/stable/target/scala-2.10/test-classes...

 [ERROR]
 /home/terrys/Applications/spark/yarn/common/src/test/scala/org/apache/spark/deploy/yarn/YarnAllocatorSuite.scala:20:
 object MemLimitLogger is not a member of package
 org.apache.spark.deploy.yarn

 [ERROR] import org.apache.spark.deploy.yarn.MemLimitLogger._

 [ERROR] ^

 [ERROR]
 /home/terrys/Applications/spark/yarn/common/src/test/scala/org/apache/spark/deploy/yarn/YarnAllocatorSuite.scala:29:
 not found: value memLimitExceededLogMessage

 [ERROR] val vmemMsg = memLimitExceededLogMessage(diagnostics,
 VMEM_EXCEEDED_PATTERN)

 [ERROR]   ^

 [ERROR]
 /home/terrys/Applications/spark/yarn/common/src/test/scala/org/apache/spark/deploy/yarn/YarnAllocatorSuite.scala:30:
 not found: value memLimitExceededLogMessage

 [ERROR] val pmemMsg = memLimitExceededLogMessage(diagnostics,
 PMEM_EXCEEDED_PATTERN)

 [ERROR]   ^

 [ERROR] three errors found

 [INFO]
 

 [INFO] Reactor Summary:

 [INFO]

 [INFO] Spark Project Parent POM .. SUCCESS [2.758s]

 [INFO] Spark Project Common Network Code . SUCCESS [6.716s]

 [INFO] Spark Project Core  SUCCESS
 [2:46.610s]

 [INFO] Spark Project Bagel ... SUCCESS [16.776s]

 [INFO] Spark Project GraphX .. SUCCESS [52.159s]

 [INFO] Spark Project Streaming ... SUCCESS
 [1:09.883s]

 [INFO] Spark Project ML Library .. SUCCESS
 [1:18.932s]

 [INFO] Spark Project Tools ... SUCCESS [10.210s]

 [INFO] Spark Project Catalyst  SUCCESS
 [1:12.499s]

 [INFO] Spark Project SQL . SUCCESS
 [1:10.561s]

 [INFO] Spark Project Hive  SUCCESS
 [1:08.571s]

 [INFO] Spark Project REPL  SUCCESS [32.377s]

 [INFO] Spark Project YARN Parent POM . SUCCESS [1.317s]

 [INFO] Spark Project YARN Stable API . FAILURE [25.918s]

 [INFO] Spark Project Assembly  SKIPPED

 [INFO] Spark Project External Twitter  SKIPPED

 [INFO] Spark Project External Kafka .. SKIPPED

 [INFO] Spark Project External Flume Sink . SKIPPED

 [INFO] Spark Project External Flume .. SKIPPED

 [INFO] Spark Project External ZeroMQ . SKIPPED

 [INFO] Spark Project External MQTT ... SKIPPED

 [INFO] Spark Project Examples  SKIPPED

 [INFO]
 

 [INFO] BUILD FAILURE

 [INFO]
 

 [INFO] Total time: 11:15.889s

 [INFO] Finished at: Fri Oct 31 12:08:55 PDT 2014

 [INFO] Final Memory: 73M/829M

 [INFO]
 

 [ERROR] Failed to execute goal
 net.alchim31.maven:scala-maven-plugin:3.2.0:testCompile
 (scala-test-compile-first) on project spark-yarn_2.10: Execution
 scala-test-compile-first of goal
 net.alchim31.maven:scala-maven-plugin:3.2.0:testCompile failed.
 CompileFailed - [Help 1]

 [ERROR]

 [ERROR] To see the full stack trace of the errors, re-run Maven with the -e
 switch.

 [ERROR] Re-run Maven using the -X switch to enable full debug logging.

 [ERROR]

 [ERROR] For more information about the errors and possible solutions, please
 read the following articles:

 [ERROR] [Help 1]
 http://cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException

 [ERROR]

 [ERROR] After correcting the problems, you can resume the build with the
 command

 [ERROR]   mvn <goals> -rf :spark-yarn_2.10


 I could not find MemLimitLogger anywhere in the Spark code. Anybody else
 seen/encounter this?


 Thanks,

 -Terry





Re: Spark Build

2014-10-31 Thread Terry Siu
Thanks for the update, Shivaram.

-Terry

On 10/31/14, 12:37 PM, Shivaram Venkataraman
shiva...@eecs.berkeley.edu wrote:

Yeah looks like https://github.com/apache/spark/pull/2744 broke the
build. We will fix it soon







Spark build error

2014-08-06 Thread Priya Ch
Hi,

I am trying to build the jars using the command:

mvn -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 -DskipTests clean package

Execution of the above command is throwing the following error:

[INFO] Spark Project Core . FAILURE [  0.295 s]
[INFO] Spark Project Bagel  SKIPPED
[INFO] Spark Project GraphX ... SKIPPED
[INFO] Spark Project ML Library ... SKIPPED
[INFO] Spark Project Streaming  SKIPPED
[INFO] Spark Project Tools  SKIPPED
[INFO] Spark Project Catalyst . SKIPPED
[INFO] Spark Project SQL .. SKIPPED
[INFO] Spark Project Hive . SKIPPED
[INFO] Spark Project REPL . SKIPPED
[INFO] Spark Project YARN Parent POM .. SKIPPED
[INFO] Spark Project YARN Stable API .. SKIPPED
[INFO] Spark Project Assembly . SKIPPED
[INFO] Spark Project External Twitter . SKIPPED
[INFO] Spark Project External Kafka ... SKIPPED
[INFO] Spark Project External Flume ... SKIPPED
[INFO] Spark Project External ZeroMQ .. SKIPPED
[INFO] Spark Project External MQTT  SKIPPED
[INFO] Spark Project Examples . SKIPPED
[INFO] 
[INFO] BUILD FAILURE
[INFO] 
[INFO] Total time: 3.748 s
[INFO] Finished at: 2014-08-07T01:00:48+05:30
[INFO] Final Memory: 24M/175M
[INFO] 
[ERROR] Failed to execute goal
org.apache.maven.plugins:maven-remote-resources-plugin:1.5:process
(default) on project spark-core_2.10: Execution default of goal
org.apache.maven.plugins:maven-remote-resources-plugin:1.5:process
failed: For artifact {null:null:null:jar}: The groupId cannot be
empty. - [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to
execute goal org.apache.maven.plugins:maven-remote-resources-plugin:1.5:process
(default) on project spark-core_2.10: Execution default of goal
org.apache.maven.plugins:maven-remote-resources-plugin:1.5:process
failed: For artifact {null:null:null:jar}: The groupId cannot be
empty.
at 
org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:224)
at 
org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
at 
org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
at 
org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
at 
org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:80)
at 
org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
at 
org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:120)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:347)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:154)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:584)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:213)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:157)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at 
org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
at 
org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
at 
org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
at 
org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
Caused by: org.apache.maven.plugin.PluginExecutionException: Execution
default of goal
org.apache.maven.plugins:maven-remote-resources-plugin:1.5:process
failed: For artifact {null:null:null:jar}: The groupId cannot be
empty.
at 
org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:143)
at 
org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:208)
... 19 more
Caused by: org.apache.maven.artifact.InvalidArtifactRTException: For
artifact {null:null:null:jar}: The groupId cannot be empty.



Can someone help me with this?
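
In case it helps while waiting for replies: this "groupId cannot be empty" failure from maven-remote-resources-plugin is often reported when the local Maven installation is older than what Spark's build documentation requires, or when an artifact in the local repository is corrupted, rather than being a problem in Spark's POMs. A minimal sketch of the usual environment checks (paths assume the default ~/.m2 location) is:

  # confirm the Maven and Java versions against the ones documented for building Spark
  mvn -version
  # clear the possibly corrupted plugin artifacts from the local repository and retry
  rm -rf ~/.m2/repository/org/apache/maven/plugins/maven-remote-resources-plugin
  mvn -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 -DskipTests clean package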