Re: failed to build spark with maven for both 1.0.1 and latest master branch

2014-07-31 Thread yao
Great, thanks Ted. I just did a Maven build on CentOS 6 and everything looks
good, so this seems to be a Mac-specific issue.
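Since the same build passes on CentOS and on one Mac but fails on another, a natural next step is to diff the two environments. A minimal sketch of how one might capture the relevant facts on each machine (this is an illustration, not something from the thread; tools are probed defensively so the script runs even where one is missing):

```shell
# Record which build tools resolve on this machine so a working and a
# failing environment can be compared side by side. POSIX sh only.
report=""
for tool in mvn java sbt; do
  if command -v "$tool" >/dev/null 2>&1; then
    report="$report $tool=$(command -v "$tool")"
  else
    report="$report $tool=missing"
  fi
done
echo "toolchain:$report"
echo "os: $(uname -sr)"
```

Running this on both boxes and diffing the output would show, e.g., a different Maven install location or a missing JDK on the failing machine.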



Re: failed to build spark with maven for both 1.0.1 and latest master branch

2014-07-31 Thread Ted Yu
The following command succeeded (on Linux) on Spark master checked out this
morning:

mvn -Pyarn -Phive -Phadoop-2.4 -DskipTests install

FYI



Re: failed to build spark with maven for both 1.0.1 and latest master branch

2014-07-31 Thread yao
Hi TD,

I've asked my colleagues to do the same thing, but the compile still fails.
However, the Maven build succeeded when I ran it on my personal MacBook (with
the latest MacOS Yosemite), so I guess there might be something wrong in my
build environment. If anyone has tried to compile Spark using Maven under
Mavericks, please let me know your results.

Thanks
Shengzhe



Re: failed to build spark with maven for both 1.0.1 and latest master branch

2014-07-31 Thread Tathagata Das
Does a "mvn clean" or "sbt/sbt clean" help?

TD
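TD's suggestion can be taken one step further: besides the per-module target/ directories that "mvn clean" removes, a stale Spark artifact cached in the local Maven repository can also poison a rebuild. A hedged sketch of that idea, run here against throwaway stand-in directories rather than a real checkout or the real ~/.m2, so it is safe to execute anywhere:

```shell
# Demonstration only: stand-in paths replace a real Spark checkout and
# a real local Maven repository.
SPARK_HOME=$(mktemp -d)            # stand-in for the Spark checkout
M2_REPO=$(mktemp -d)               # stand-in for ~/.m2/repository
mkdir -p "$SPARK_HOME/core/target" "$M2_REPO/org/apache/spark"

# Roughly what "mvn clean" does: drop the per-module target/ directories.
rm -rf "$SPARK_HOME"/*/target
# The extra step: evict cached Spark artifacts from the local repository
# so the next build cannot pick up a stale jar.
rm -rf "$M2_REPO/org/apache/spark"

[ ! -d "$SPARK_HOME/core/target" ] && [ ! -d "$M2_REPO/org/apache/spark" ] \
  && echo "clean ok"
```

On a real machine the second step would target ~/.m2/repository/org/apache/spark; whether that is actually the culprit here is only a guess.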


failed to build spark with maven for both 1.0.1 and latest master branch

2014-07-30 Thread yao
Hi Folks,

Today I tried to build Spark using Maven; however, the following command
failed consistently for both 1.0.1 and the latest master. (BTW, sbt seems to
work fine: sbt/sbt -Dhadoop.version=2.4.0 -Pyarn clean assembly.)

Environment: Mac OS Mavericks
Maven: 3.2.2 (installed by Homebrew)




export M2_HOME=/usr/local/Cellar/maven/3.2.2/libexec
export PATH=$M2_HOME/bin:$PATH
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package
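One thing worth double-checking in a setup like the above: MAVEN_OPTS only
reaches the JVM if it is exported, because mvn is a child process that reads
it from its environment. A small sketch, with sh -c standing in for mvn
(illustration only, not from the original post):

```shell
# MAVEN_OPTS must be exported so child processes (such as the mvn
# launcher) inherit it; a plain assignment would stay local to this shell.
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
sh -c 'echo "child sees: $MAVEN_OPTS"'
```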

Build outputs:

[INFO] Scanning for projects...
[INFO]
------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO]
[INFO] Spark Project Parent POM
[INFO] Spark Project Core
[INFO] Spark Project Bagel
[INFO] Spark Project GraphX
[INFO] Spark Project ML Library
[INFO] Spark Project Streaming
[INFO] Spark Project Tools
[INFO] Spark Project Catalyst
[INFO] Spark Project SQL
[INFO] Spark Project Hive
[INFO] Spark Project REPL
[INFO] Spark Project YARN Parent POM
[INFO] Spark Project YARN Stable API
[INFO] Spark Project Assembly
[INFO] Spark Project External Twitter
[INFO] Spark Project External Kafka
[INFO] Spark Project External Flume
[INFO] Spark Project External ZeroMQ
[INFO] Spark Project External MQTT
[INFO] Spark Project Examples
[INFO]
[INFO]
------------------------------------------------------------------------
[INFO] Building Spark Project Parent POM 1.0.1
[INFO]
------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ spark-parent ---
[INFO]
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (enforce-versions) @
spark-parent ---
[INFO]
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-scala-sources) @
spark-parent ---
[INFO] Source directory:
/Users/syao/git/grid/thirdparty/spark/src/main/scala added.
[INFO]
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @
spark-parent ---
[INFO]
[INFO] --- scala-maven-plugin:3.1.6:add-source (scala-compile-first) @
spark-parent ---
[INFO] Add Test Source directory:
/Users/syao/git/grid/thirdparty/spark/src/test/scala
[INFO]
[INFO] --- scala-maven-plugin:3.1.6:compile (scala-compile-first) @
spark-parent ---
[INFO] No sources to compile
[INFO]
[INFO] --- build-helper-maven-plugin:1.8:add-test-source
(add-scala-test-sources) @ spark-parent ---
[INFO] Test Source directory:
/Users/syao/git/grid/thirdparty/spark/src/test/scala added.
[INFO]
[INFO] --- scala-maven-plugin:3.1.6:testCompile (scala-test-compile-first)
@ spark-parent ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) @
spark-parent ---
[INFO]
[INFO] --- maven-source-plugin:2.2.1:jar-no-fork (create-source-jar) @
spark-parent ---
[INFO]
[INFO]
------------------------------------------------------------------------
[INFO] Building Spark Project Core 1.0.1
[INFO]
------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ spark-core_2.10
---
[INFO]
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (enforce-versions) @
spark-core_2.10 ---
[INFO]
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-scala-sources) @
spark-core_2.10 ---
[INFO] Source directory:
/Users/syao/git/grid/thirdparty/spark/core/src/main/scala added.
[INFO]
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @
spark-core_2.10 ---
[INFO]
[INFO] --- exec-maven-plugin:1.2.1:exec (default) @ spark-core_2.10 ---
Archive:  lib/py4j-0.8.1-src.zip
  inflating: build/py4j/tests/java_map_test.py
 extracting: build/py4j/tests/__init__.py
  inflating: build/py4j/tests/java_gateway_test.py
  inflating: build/py4j/tests/java_callback_test.py
  inflating: build/py4j/tests/java_list_test.py
  inflating: build/py4j/tests/byte_string_test.py
  inflating: build/py4j/tests/multithreadtest.py
  inflating: build/py4j/tests/java_array_test.py
  inflating: build/py4j/tests/py4j_callback_example2.py
  inflating: build/py4j/tests/py4j_example.py
  inflating: build/py4j/tests/py4j_callback_example.py
  inflating: build/py4j/tests/finalizer_test.py
  inflating: build/py4j/tests/java_set_test.py
  inflating: build/py4j/finalizer.py
 extracting: build/py4j/__init__.py
  inflating: build/py4j/java_gateway.py
  inflating: build/py4j/protocol.py
  inflating: build/py4j/java_collections.py
 extracting: build/py4j/version.py
  inflating: build/py4j/compat.py
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @
spark-core_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 6 resources
[INFO] Copying 20 resources
[INFO] Copying 7 resources
[INFO] Copying 3 resources
[INFO]
[INFO] --- scala-maven-plugin:3.1.6:add-source (scala-compile-first) @
spark-core_2.10 ---
[INFO] Add Test Source directory:
/Users/syao/git/grid/thirdparty/spark/core/src/test/scala
[INFO]
[INFO] --- scala-maven-plugin:3.1.6:co