Getting error when trying to start master node after building spark 1.3

2015-12-07 Thread Mich Talebzadeh
 

Thanks, sorted.

 

Actually I used version 1.3.1, and I have now managed to make it work as the
Hive execution engine.
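
A minimal sketch of the kind of settings involved in pointing Hive at a Spark build as its execution engine; the install path and master URL below are illustrative rather than taken from this thread:

# Illustrative values only -- adjust SPARK_HOME and the master URL to your installation
export SPARK_HOME=/usr/lib/spark-1.3.1-bin-hadoop2-without-hive
hive --hiveconf hive.execution.engine=spark \
     --hiveconf spark.master=spark://rhes564:7077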

 

Cheers,

 

 

Mich Talebzadeh

 

Sybase ASE 15 Gold Medal Award 2008

A Winning Strategy: Running the most Critical Financial Data on ASE 15

http://login.sybase.com/files/Product_Overviews/ASE-Winning-Strategy-091908.pdf

Author of "A Practitioner’s Guide to Upgrading to Sybase ASE 15", ISBN 978-0-9563693-0-7, 
and co-author of "Sybase Transact SQL Guidelines Best Practices", ISBN 978-0-9759693-0-4

Publications due shortly:

Complex Event Processing in Heterogeneous Environments, ISBN: 978-0-9563693-3-8

Oracle and Sybase, Concepts and Contrasts, ISBN: 978-0-9563693-1-4, volume one 
out shortly

 

http://talebzadehmich.wordpress.com

 


 

 



Re: Getting error when trying to start master node after building spark 1.3

2015-12-07 Thread Akhil Das
Did you read
http://spark.apache.org/docs/latest/building-spark.html#building-with-hive-and-jdbc-support
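
For reference, the section linked above covers adding Hive and JDBC support to the build; for Spark 1.3.x it amounts to roughly the following (a sketch; the --name value is arbitrary and the exact profiles are listed on that page):

./make-distribution.sh --name "custom-spark" --tgz -Pyarn -Phadoop-2.4 -Phive -Phive-thriftserver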



Thanks
Best Regards



Getting error when trying to start master node after building spark 1.3

2015-12-04 Thread Mich Talebzadeh
Hi,

 

 

I am trying to make Hive work with Spark.

 

I have been told that I need to use Spark 1.3 and build it from source
WITHOUT the Hive libraries.

 

I have built it as follows:

 

./make-distribution.sh --name "hadoop2-without-hive" --tgz "-Pyarn,hadoop-provided,hadoop-2.4,parquet-provided"

 

 

Now the issue I have is that I cannot start the master node.

 

When I try

 

hduser@rhes564::/usr/lib/spark-1.3.0-bin-hadoop2-without-hive/sbin>
./start-master.sh

starting org.apache.spark.deploy.master.Master, logging to
/usr/lib/spark-1.3.0-bin-hadoop2-without-hive/sbin/../logs/spark-hduser-org.apache.spark.deploy.master.Master-1-rhes564.out

failed to launch org.apache.spark.deploy.master.Master:

at java.lang.ClassLoader.loadClass(ClassLoader.java:357)

... 6 more

full log in
/usr/lib/spark-1.3.0-bin-hadoop2-without-hive/sbin/../logs/spark-hduser-org.apache.spark.deploy.master.Master-1-rhes564.out

 

I get

 

Spark Command: /usr/java/latest/bin/java -cp :/usr/lib/spark-1.3.0-bin-hadoop2-without-hive/sbin/../conf:/usr/lib/spark-1.3.0-bin-hadoop2-without-hive/lib/spark-assembly-1.3.0-hadoop2.4.0.jar:/home/hduser/hadoop-2.6.0/etc/hadoop -XX:MaxPermSize=128m -Dspark.akka.logLifecycleEvents=true -Xms512m -Xmx512m org.apache.spark.deploy.master.Master --ip 50.140.197.217 --port 7077 --webui-port 8080



 

Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger

at java.lang.Class.getDeclaredMethods0(Native Method)

at java.lang.Class.privateGetDeclaredMethods(Class.java:2521)

at java.lang.Class.getMethod0(Class.java:2764)

at java.lang.Class.getMethod(Class.java:1653)

at sun.launcher.LauncherHelper.getMainMethod(LauncherHelper.java:494)

at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:486)

Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger

at java.net.URLClassLoader$1.run(URLClassLoader.java:366)

at java.net.URLClassLoader$1.run(URLClassLoader.java:355)

at java.security.AccessController.doPrivileged(Native Method)

at java.net.URLClassLoader.findClass(URLClassLoader.java:354)

at java.lang.ClassLoader.loadClass(ClassLoader.java:424)

at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)

at java.lang.ClassLoader.loadClass(ClassLoader.java:357)

... 6 more
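
The slf4j classes are evidently not on that classpath. A quick way to check whether they were bundled into the assembly at all, using the jar path from the Spark Command line above:

# Counts slf4j entries in the assembly jar; 0 means slf4j is not bundled
jar tf /usr/lib/spark-1.3.0-bin-hadoop2-without-hive/lib/spark-assembly-1.3.0-hadoop2.4.0.jar | grep -c 'org/slf4j'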

 

Any advice will be appreciated.

 

Thanks,

 

Mich

 

 
