Re: spark-shell 1.5 doesn't seem to work in local mode

2015-09-19 Thread Madhu
Thanks guys.

I do have HADOOP_INSTALL set, but Spark 1.4.1 did not seem to mind.
Seems like there's a difference in behavior between 1.5.0 and 1.4.1 for some
reason.

To the best of my knowledge, I just downloaded each tgz and untarred it in
/opt. I adjusted my PATH to point to one or the other, but that should be about
it.

Does 1.5.0 pick up HADOOP_INSTALL?
Wouldn't spark-shell --master local override that?
1.5 seemed to completely ignore --master local.



--
Madhu
https://www.linkedin.com/in/msiddalingaiah




Re: spark-shell 1.5 doesn't seem to work in local mode

2015-09-19 Thread Sean Owen
It sounds a lot like you have some local Hadoop config pointing to a
cluster, and you're picking that up when you run the shell. Look for
HADOOP_* env variables and clear them, and use --master local[*].
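
A quick way to check both of these from inside the shell itself (a sketch; it assumes the running spark-shell and its usual SparkContext sc, and sys.env only shows variables the shell's JVM actually inherited):

scala> sys.env.filter(_._1.startsWith("HADOOP_")).foreach(println)  // e.g. HADOOP_INSTALL, HADOOP_CONF_DIR
scala> sc.master  // should print local[*] if --master took effect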

On Sat, Sep 19, 2015 at 5:14 PM, Madhu  wrote:
> I downloaded spark-1.5.0-bin-hadoop2.6.tgz recently and installed on CentOS.
> All my local Spark code works fine locally.
>
> For some odd reason, spark-shell doesn't work in local mode.
> It looks like it wants to connect to HDFS, even if I use --master local or
> specify local mode in the conf.
> Even sc.textFile(...) is trying to connect to HDFS.
> Here's the conf, which clearly says spark.master is local:
>
> scala> sc.getConf.getAll.foreach(println)
> (spark.repl.class.uri,http://192.168.2.133:60639)
> (spark.app.name,Spark shell)
> (spark.driver.port,57705)
> (spark.fileserver.uri,http://192.168.2.133:38494)
> (spark.app.id,local-1442679054864)
> (spark.driver.host,192.168.2.133)
> (spark.jars,)
> (spark.externalBlockStore.folderName,spark-34654a51-3461-4851-91be-0b78dd4b4bd6)
> (spark.master,local[*])
> (spark.executor.id,driver)
> (spark.submit.deployMode,client)
>
> Just to check my environment, I downloaded spark-1.4.1-bin-hadoop2.6.tgz,
> and spark-shell behaves normally. I can access local files, everything works
> as expected, no exceptions.
>
> Here's the stack trace when I run spark-shell with Spark 1.5:
>
> java.lang.RuntimeException: java.net.ConnectException: Call From
> ltree1/127.0.0.1 to localhost:9000 failed on connection exception:
> java.net.ConnectException: Connection refused; For more details see:
> http://wiki.apache.org/hadoop/ConnectionRefused
> at
> org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
> at
> org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:171)
> at
> org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:163)
> at
> org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:161)
> at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:168)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native 
> Method)
> at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> at 
> org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
> at $iwC$$iwC.<init>(<console>:9)
> at $iwC.<init>(<console>:18)
> at <init>(<console>:20)
> at .<init>(<console>:24)
> at .<clinit>(<console>)
> at .<init>(<console>:7)
> at .<clinit>(<console>)
> at $print(<console>)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at
> org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
> at
> org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
> at 
> org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
> at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
> at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
> at 
> org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
> at
> org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
> at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
> at
> org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:132)
> at
> org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
> at 
> org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
> at
> org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
> at 
> org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
> at
> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
> at
> org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
> at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
> at
> org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
> at 
> org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
> at
> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
> at
> 

Re: spark-shell 1.5 doesn't seem to work in local mode

2015-09-19 Thread Zhan Zhang
It does not matter whether you start Spark in local or another mode. If you
have an hdfs-site.xml somewhere and your Spark configuration points to it,
you will read from and write to HDFS.
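
For example, from inside the running spark-shell (a sketch; sc is the usual SparkContext, and fs.defaultFS is the Hadoop 2.x property name):

scala> sc.hadoopConfiguration.toString  // lists the resource files (core-site.xml, hdfs-site.xml, ...) the Hadoop Configuration was built from
scala> sc.hadoopConfiguration.get("fs.defaultFS")  // hdfs://... here means a cluster config was picked up; file:/// means purely local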

Thanks.

Zhan Zhang





Re: spark-shell 1.5 doesn't seem to work in local mode

2015-09-19 Thread Reynold Xin
Maybe you have an hdfs-site.xml lying around somewhere?
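
One quick way to check from the same spark-shell session (a sketch; it only covers config picked up via the classpath, e.g. through HADOOP_CONF_DIR):

scala> Option(System.getenv("HADOOP_CONF_DIR"))  // directory of Hadoop config files Spark puts on its classpath, if set
scala> sc.hadoopConfiguration.getResource("hdfs-site.xml")  // URL of the hdfs-site.xml that was found, or null if none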

