RE: Building Spark without hive libraries
Looks better now, Ted :)

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM ................... SUCCESS [ 39.937 s]
[INFO] Spark Project Launcher ..................... SUCCESS [ 44.718 s]
[INFO] Spark Project Networking ................... SUCCESS [ 11.294 s]
[INFO] Spark Project Shuffle Streaming Service .... SUCCESS [  4.720 s]
[INFO] Spark Project Unsafe ....................... SUCCESS [ 10.705 s]
[INFO] Spark Project Core ......................... SUCCESS [02:52 min]
[INFO] Spark Project Bagel ........................ SUCCESS [  5.937 s]
[INFO] Spark Project GraphX ....................... SUCCESS [ 15.977 s]
[INFO] Spark Project Streaming .................... SUCCESS [ 36.453 s]
[INFO] Spark Project Catalyst ..................... SUCCESS [ 54.381 s]
[INFO] Spark Project SQL .......................... SUCCESS [01:07 min]
[INFO] Spark Project ML Library ................... SUCCESS [01:22 min]
[INFO] Spark Project Tools ........................ SUCCESS [  2.493 s]
[INFO] Spark Project Hive ......................... SUCCESS [ 58.496 s]
[INFO] Spark Project REPL ......................... SUCCESS [  9.278 s]
[INFO] Spark Project YARN ......................... SUCCESS [ 12.424 s]
[INFO] Spark Project Assembly ..................... SUCCESS [01:51 min]
[INFO] Spark Project External Twitter ............. SUCCESS [  7.604 s]
[INFO] Spark Project External Flume Sink .......... SUCCESS [  7.580 s]
[INFO] Spark Project External Flume ............... SUCCESS [  9.526 s]
[INFO] Spark Project External Flume Assembly ...... SUCCESS [  3.163 s]
[INFO] Spark Project External MQTT ................ SUCCESS [ 31.774 s]
[INFO] Spark Project External MQTT Assembly ....... SUCCESS [  8.698 s]
[INFO] Spark Project External ZeroMQ .............. SUCCESS [  6.992 s]
[INFO] Spark Project External Kafka ............... SUCCESS [ 11.487 s]
[INFO] Spark Project Examples ..................... SUCCESS [02:12 min]
[INFO] Spark Project External Kafka Assembly ...... SUCCESS [  9.046 s]
[INFO] Spark Project YARN Shuffle Service ......... SUCCESS [  6.097 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 16:16 min
[INFO] Finished at: 2015-11-25T23:34:35+00:00
[INFO] Final Memory: 90M/1312M

Mich Talebzadeh

Sybase ASE 15 Gold Medal Award 2008
A Winning Strategy: Running the most Critical Financial Data on ASE 15
http://login.sybase.com/files/Product_Overviews/ASE-Winning-Strategy-091908.pdf
Author of the books "A Practitioner's Guide to Upgrading to Sybase ASE 15", ISBN 978-0-9563693-0-7, and co-author of "Sybase Transact SQL Guidelines Best Practices", ISBN 978-0-9759693-0-4
Publications due shortly:
Complex Event Processing in Heterogeneous Environments, ISBN: 978-0-9563693-3-8
Oracle and Sybase, Concepts and Contrasts, ISBN: 978-0-9563693-1-4, volume one out shortly
http://talebzadehmich.wordpress.com

NOTE: The information in this email is proprietary and confidential. This message is for the designated recipient only; if you are not the intended recipient, you should destroy it immediately. Any information in this message shall not be understood as given or endorsed by Peridale Technology Ltd, its subsidiaries or their employees, unless expressly so stated. It is the responsibility of the recipient to ensure that this email is virus free; therefore neither Peridale Ltd, its subsidiaries nor their employees accept any responsibility.
RE: Building Spark without hive libraries
Yep. The user hduser was using the wrong version of Maven:

hduser@rhes564::/usr/lib/spark> build/mvn -X -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -DskipTests clean package > log
Using `mvn` from path: /usr/local/apache-maven/apache-maven-3.3.1/bin/mvn
[WARNING] Rule 0: org.apache.maven.plugins.enforcer.RequireMavenVersion failed with message:
Detected Maven Version: 3.3.1 is not in the allowed range 3.3.3.
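The enforcer failure above is just a version-range check: the Maven on the PATH was 3.3.1 but the pom requires at least 3.3.3. As an aside, the same comparison can be sketched by hand with `sort -V` (the version numbers here are the ones from this thread; this is an illustrative check, not part of Spark's build):

```shell
# Compare a detected Maven version against the minimum the enforcer
# rule demands (3.3.3 for this Spark build, per the warning above).
required="3.3.3"
detected="3.3.1"   # what the old PATH picked up
# sort -V orders version strings numerically; if the detected version
# sorts first and differs from the requirement, it is too old.
oldest=$(printf '%s\n%s\n' "$required" "$detected" | sort -V | head -n 1)
if [ "$oldest" = "$detected" ] && [ "$detected" != "$required" ]; then
  echo "Maven $detected is older than required $required"
else
  echo "Maven $detected satisfies >= $required"
fi
```

Pointing the PATH at an Apache Maven 3.3.3 (or newer) install, or just using the bundled `build/mvn`, clears the rule.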
Re: Building Spark without hive libraries
bq. I have to run this as root otherwise build does not progress

I build Spark as a non-root user and don't have a problem. I suggest you dig a little to see what was stalling the build when run as a non-root user.
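One common cause of a non-root build stalling is leftover root-owned files from an earlier root build: Maven's local repository and Spark's build/ cache belong to whoever ran the build first. A self-contained sketch of the check (it simulates the problem in a temp directory rather than touching the real ~/.m2; the paths in the comment are the ones from this thread):

```shell
# Simulate a repo holding one file without user write permission, then
# locate it the same way you would scan ~/.m2 or /usr/lib/spark/build.
tmp=$(mktemp -d)
mkdir -p "$tmp/repo"
touch "$tmp/repo/ok.jar" "$tmp/repo/locked.jar"
chmod a-w "$tmp/repo/locked.jar"
# "! -perm -u+w" tests the permission bits themselves, so the scan gives
# the same answer whether or not it happens to run as root.
found=$(find "$tmp" -type f ! -perm -u+w -print)
printf '%s\n' "$found"
# Real scan would be:  find ~/.m2 /usr/lib/spark/build -type f ! -perm -u+w
chmod -R u+w "$tmp" && rm -rf "$tmp"
```

If the real scan turns anything up, `chown -R` back to the building user usually unsticks the build.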
RE: Building Spark without hive libraries
Thanks Ted. I have the jar file scala-compiler-2.10.4.jar as well:

pwd
/
find ./ -name scala-compiler-2.10.4.jar
./usr/lib/spark/build/zinc-0.3.5.3/lib/scala-compiler-2.10.4.jar
./usr/lib/spark/build/apache-maven-3.3.3/lib/scala-compiler-2.10.4.jar
./root/.m2/repository/org/scala-lang/scala-compiler/2.10.4/scala-compiler-2.10.4.jar

It sounds like (?) it cannot find that file because I am running the Maven command as root. Do I need to add it somewhere, or set it up on the PATH/CLASSPATH?
Re: Building Spark without hive libraries
bq. ^[[0m[^[[31merror^[[0m] ^[[0mRequired file not found: scala-compiler-2.10.4.jar^[[0m

Can you search for the above jar? I found two locally:

/home/hbase/.ivy2/cache/org.scala-lang/scala-compiler/jars/scala-compiler-2.10.4.jar
/home/hbase/.m2/repository/org/scala-lang/scala-compiler/2.10.4/scala-compiler-2.10.4.jar

On Wed, Nov 25, 2015 at 2:30 PM, Mich Talebzadeh wrote:
> Thanks Ted.
>
> I ran Maven in debug mode as follows:
>
> build/mvn -X -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -DskipTests clean package > log
> Using `mvn` from path: /usr/lib/spark/build/apache-maven-3.3.3/bin/mvn
>
> I still cannot determine the cause of this error.
>
> Thanks,
>
> Mich
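The ^[[0m / ^[[31m noise around the quoted error is ANSI colour escape codes that ended up in the log file when the build's coloured output was redirected. If that makes logs hard to grep, the codes can be stripped first; a small sketch, assuming GNU sed (whose regexes accept the \x1b hex escape):

```shell
# Reproduce a coloured log line like the one quoted above, then strip
# the ANSI SGR escape sequences (ESC [ ... m) with sed.
line=$(printf '\033[0m[\033[31merror\033[0m] \033[0mRequired file not found: scala-compiler-2.10.4.jar\033[0m')
clean=$(printf '%s\n' "$line" | sed -e 's/\x1b\[[0-9;]*m//g')
echo "$clean"
```

The same sed expression applied to the whole log file (`sed -e 's/\x1b\[[0-9;]*m//g' log`) leaves plain-text output.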
Re: Building Spark without hive libraries
Take a look at install_zinc() in build/mvn

Cheers

On Wed, Nov 25, 2015 at 1:30 PM, Mich Talebzadeh wrote:
> Hi,
>
> I am trying to build Spark from source without using Hive. I am getting:
>
> [error] Required file not found: scala-compiler-2.10.4.jar
> [error] See zinc -help for information about locating necessary files
>
> I have to run this as root otherwise the build does not progress. Any help is appreciated.
>
> -bash-3.2# ./make-distribution.sh --name "hadoop2-without-hive" --tgz "-Pyarn,hadoop-provided,hadoop-2.6,parquet-provided"
> +++ dirname ./make-distribution.sh
> ++ cd .
> ++ pwd
> + SPARK_HOME=/usr/lib/spark
> + DISTDIR=/usr/lib/spark/dist
> + SPARK_TACHYON=false
> + TACHYON_VERSION=0.7.1
> + TACHYON_TGZ=tachyon-0.7.1-bin.tar.gz
> + TACHYON_URL=https://github.com/amplab/tachyon/releases/download/v0.7.1/tachyon-0.7.1-bin.tar.gz
> + MAKE_TGZ=false
> + NAME=none
> + MVN=/usr/lib/spark/build/mvn
> + (( 4 ))
> + case $1 in
> + NAME=hadoop2-without-hive
> + shift
> + shift
> + (( 2 ))
> + case $1 in
> + MAKE_TGZ=true
> + shift
> + (( 1 ))
> + case $1 in
> + break
> + '[' -z /usr/java/latest ']'
> + '[' -z /usr/java/latest ']'
> ++ command -v git
> + '[' ']'
> ++ command -v /usr/lib/spark/build/mvn
> + '[' '!' /usr/lib/spark/build/mvn ']'
> ++ /usr/lib/spark/build/mvn help:evaluate -Dexpression=project.version -Pyarn,hadoop-provided,hadoop-2.6,parquet-provided
> ++ grep -v INFO
> ++ tail -n 1
> + VERSION=1.5.2
> ++ /usr/lib/spark/build/mvn help:evaluate -Dexpression=scala.binary.version -Pyarn,hadoop-provided,hadoop-2.6,parquet-provided
> ++ grep -v INFO
> ++ tail -n 1
> + SCALA_VERSION=2.10
> ++ /usr/lib/spark/build/mvn help:evaluate -Dexpression=hadoop.version -Pyarn,hadoop-provided,hadoop-2.6,parquet-provided
> ++ grep -v INFO
> ++ tail -n 1
> + SPARK_HADOOP_VERSION=2.6.0
> ++ /usr/lib/spark/build/mvn help:evaluate -Dexpression=project.activeProfiles -pl sql/hive -Pyarn,hadoop-provided,hadoop-2.6,parquet-provided
> ++ grep -v INFO
> ++ fgrep --count 'hive'
> ++ echo -n
> + SPARK_HIVE=0
> + '[' hadoop2-without-hive == none ']'
> + echo 'Spark version is 1.5.2'
> Spark version is 1.5.2
> + '[' true == true ']'
> + echo 'Making spark-1.5.2-bin-hadoop2-without-hive.tgz'
> Making spark-1.5.2-bin-hadoop2-without-hive.tgz
> + '[' false == true ']'
> + echo 'Tachyon Disabled'
> Tachyon Disabled
> + cd /usr/lib/spark
> + export 'MAVEN_OPTS=-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m'
> + MAVEN_OPTS='-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m'
> + BUILD_COMMAND=("$MVN" clean package -DskipTests $@)
> + echo -e '\nBuilding with...'
>
> Building with...
> $ /usr/lib/spark/build/mvn clean package -DskipTests -Pyarn,hadoop-provided,hadoop-2.6,parquet-provided
>
> + /usr/lib/spark/build/mvn clean package -DskipTests -Pyarn,hadoop-provided,hadoop-2.6,parquet-provided
> Using `mvn` from path: /usr/lib/spark/build/apache-maven-3.3.3/bin/mvn
> [INFO] Scanning for projects...
> [INFO] ------------------------------------------------------------------------
> [INFO] Reactor Build Order:
> [INFO]
> [INFO] Spark Project Parent POM
> [INFO] Spark Project Launcher
> [INFO] Spark Project Networking
> [INFO] Spark Project Shuffle Streaming Service
> [INFO] Spark Project Unsafe
> [INFO] Spark Project Core
> [INFO] Spark Project Bagel
> [INFO] Spark Project GraphX
> [INFO] Spark Project Streaming
> [INFO] Spark Project Catalyst
> [INFO] Spark Project SQL
> [INFO] Spark Project ML Library
> [INFO] Spark Project Tools
> [INFO] Spark Project Hive
> [INFO] Spark Project REPL
> [INFO] Spark Project YARN
> [INFO] Spark Project Assembly
> [INFO] Spark Project External Twitter
> [INFO] Spark Project External Flume Sink
> [INFO] Spark Project External Flume
> [INFO] Spark Project External Flume Assembly
> [INFO] Spark Project External MQTT
> [INFO] Spark Project External MQTT Assembly
> [INFO] Spark Project External ZeroMQ
> [INFO] Spark Project External Kafka
> [INFO] Spark Project Examples
> [INFO] Spark Project External Kafka Assembly
> [INFO] Spark Project YARN Shuffle Service
> [INFO] ------------------------------------------------------------------------
> [INFO] Building Spark Project Parent POM 1.5.2
> [INFO] ------------------------------------------------------------------------
> [INFO] --- maven-clean-plugin:2.6.1:clean (default-clean) @ spark-parent_2.10 ---
> [INFO] Deleting /usr/lib/spark/target
> [INFO] --- maven-enforcer-plugin:1.4:enforce (enforce-versions) @ spark-parent_2.10 ---
> [INFO] --- scala-maven-plugin:3.2.2:add-source (eclipse-ad
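For reference, the pattern install_zinc() follows is roughly "download a pinned tool into build/ once, reuse the cached copy afterwards". Below is a simplified sketch of that pattern, not the actual build/mvn code; the function name, the `.tgz` URL, and the layout are illustrative (the zinc version matches the one seen in the find output above):

```shell
# Fetch-and-cache pattern used by helpers like install_zinc() (sketch).
install_tool() {
  name="$1"; version="$2"; url="$3"
  dest="build/${name}-${version}"
  if [ ! -d "$dest" ]; then          # cache miss: download and unpack
    mkdir -p build
    curl -fsSL "$url" | tar -xz -C build
  fi
  echo "$dest"                       # callers put $dest/bin on their PATH
}

# Cache hit: the directory already exists, so nothing is downloaded.
mkdir -p build/zinc-0.3.5.3
install_tool zinc 0.3.5.3 "https://example.invalid/zinc-0.3.5.3.tgz"
# → build/zinc-0.3.5.3
```

Because the cache lives under build/, a download performed as root can leave files a later non-root build cannot reuse, which is one way the permissions issue in this thread shows up.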