[ https://issues.apache.org/jira/browse/SPARK-15524?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-15524.
-------------------------------
    Resolution: Not A Problem

This is specific to your environment. Your config specifies a wrong host/port 
somewhere for the YARN/HDFS daemons, or they aren't reachable.
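
The log below gives the clue: the client is trying to reach the YARN
ResourceManager at 0.0.0.0:8032, which is what Hadoop falls back to when
yarn.resourcemanager.hostname (and hence yarn.resourcemanager.address) is
not set in the yarn-site.xml under HADOOP_CONF_DIR. A minimal diagnostic
sketch, reusing the host and path from the report below (assumptions;
adjust for your cluster):

    # Is an RM address configured at all? 0.0.0.0:8032 in the error means
    # the client is falling back to the Hadoop default.
    grep -B1 -A2 'yarn.resourcemanager' /usr/local/hadoop/etc/hadoop/yarn-site.xml

    # Are the YARN/HDFS daemons actually running on the master?
    jps

    # Is the RM's client RPC port reachable from where spark-shell runs?
    nc -vz 192.168.1.110 8032

If the grep comes up empty, setting yarn.resourcemanager.hostname to the
master's address (here presumably 192.168.1.110) in yarn-site.xml on every
node and restarting YARN should resolve this class of error.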

> Strange issue starting spark-shell: screen block
> ------------------------------------------------
>
>                 Key: SPARK-15524
>                 URL: https://issues.apache.org/jira/browse/SPARK-15524
>             Project: Spark
>          Issue Type: Bug
>            Reporter: Posty
>
> When I start spark-shell with YARN I get a strange issue. The screen 
> stops here:
> http://i.stack.imgur.com/kmska.png
> No errors, just this screen. Do you know what the issue could be?
> The only two Spark files I configured were spark-env.sh:
> HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop
> export SPARK_MASTER_IP=192.168.1.110
> export SPARK_LOCAL_IP=192.168.1.110
> export SPARK_EXECUTOR_MEMORY=4G
> And slaves:
> 192.168.1.110
> 192.168.1.111
> I waited for 30 minutes and the screen became unblocked, but now it shows 
> the error below. Any ideas why this could be?
> 16/05/25 11:13:43 ERROR SparkContext: Error initializing SparkContext.
> java.net.ConnectException: Call From master/192.168.1.110 to 0.0.0.0:8032 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
>         at sun.reflect.GeneratedConstructorAccessor12.newInstance(Unknown Source)
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>         at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
>         at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:731)
>         at org.apache.hadoop.ipc.Client.call(Client.java:1472)
>         at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>         at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
>         at com.sun.proxy.$Proxy11.getNewApplication(Unknown Source)
>         at org.apache.hadoop.yarn.api.impl.pb.client.ApplicationClientProtocolPBClientImpl.getNewApplication(ApplicationClientProtocolPBClientImpl.java:217)
>         at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:498)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
>         at com.sun.proxy.$Proxy12.getNewApplication(Unknown Source)
>         at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getNewApplication(YarnClientImpl.java:206)
>         at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createApplication(YarnClientImpl.java:214)
>         at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:132)
>         at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:57)
>         at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:144)
>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:530)
>         at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
>         at $line3.$read$$iwC$$iwC.<init>(<console>:15)
>         at $line3.$read$$iwC.<init>(<console>:24)
>         at $line3.$read.<init>(<console>:26)
>         at $line3.$read$.<init>(<console>:30)
>         at $line3.$read$.<clinit>(<console>)
>         at $line3.$eval$.<init>(<console>:7)
>         at $line3.$eval$.<clinit>(<console>)
>         at $line3.$eval.$print(<console>)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:498)
>         at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
>         at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
>         at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
>         at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
>         at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
>         at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
>         at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
>         at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
>         at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
>         at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
>         at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
>         at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
>         at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
>         at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
>         at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>         at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>         at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
>         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
>         at org.apache.spark.repl.Main$.main(Main.scala:31)
>         at org.apache.spark.repl.Main.main(Main.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:498)
>         at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
>         at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> Caused by: java.net.ConnectException: Connection refused
>         at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
>         at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
>         at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
>         at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
>         at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:494)
>         at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:607)
>         at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:705)
>         at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:368)
>         at org.apache.hadoop.ipc.Client.getConnection(Client.java:1521)
>         at org.apache.hadoop.ipc.Client.call(Client.java:1438)
>         ... 64 more
> 16/05/25 11:13:43 WARN MetricsSystem: Stopping a MetricsSystem that is not running
> java.net.ConnectException: Call From sgd34.dei.uc.pt/10.17.0.88 to 0.0.0.0:8032 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
>         at sun.reflect.GeneratedConstructorAccessor12.newInstance(Unknown Source)
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>         at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
>         at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:731)
>         at org.apache.hadoop.ipc.Client.call(Client.java:1472)
>         at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>         at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
>         at com.sun.proxy.$Proxy11.getNewApplication(Unknown Source)
>         at org.apache.hadoop.yarn.api.impl.pb.client.ApplicationClientProtocolPBClientImpl.getNewApplication(ApplicationClientProtocolPBClientImpl.java:217)
>         at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:498)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
>         at com.sun.proxy.$Proxy12.getNewApplication(Unknown Source)
>         at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getNewApplication(YarnClientImpl.java:206)
>         at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createApplication(YarnClientImpl.java:214)
>         at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:132)
>         at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:57)
>         at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:144)
>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:530)
>         at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
>         at $iwC$$iwC.<init>(<console>:15)
>         at $iwC.<init>(<console>:24)
>         at <init>(<console>:26)
>         at .<init>(<console>:30)
>         at .<clinit>(<console>)
>         at .<init>(<console>:7)
>         at .<clinit>(<console>)
>         at $print(<console>)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:498)
>         at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
>         at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
>         at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
>         at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
>         at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
>         at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
>         at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
>         at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
>         at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
>         at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
>         at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
>         at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
>         at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
>         at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
>         at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>         at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>         at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
>         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
>         at org.apache.spark.repl.Main$.main(Main.scala:31)
>         at org.apache.spark.repl.Main.main(Main.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:498)
>         at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
>         at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> Caused by: java.net.ConnectException: Connection refused
>         at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
>         at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
>         at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
>         at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
>         at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:494)
>         at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:607)
>         at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:705)
>         at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:368)
>         at org.apache.hadoop.ipc.Client.getConnection(Client.java:1521)
>         at org.apache.hadoop.ipc.Client.call(Client.java:1438)
>         ... 64 more
> java.lang.NullPointerException
>         at org.apache.spark.sql.SQLContext$.createListenerAndUI(SQLContext.scala:1367)
>         at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>         at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
>         at $iwC$$iwC.<init>(<console>:15)
>         at $iwC.<init>(<console>:24)
>         at <init>(<console>:26)
>         at .<init>(<console>:30)
>         at .<clinit>(<console>)
>         at .<init>(<console>:7)
>         at .<clinit>(<console>)
>         at $print(<console>)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:498)
>         at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
>         at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
>         at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
>         at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
>         at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
>         at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
>         at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:132)
>         at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
>         at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
>         at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
>         at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
>         at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
>         at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
>         at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
>         at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>         at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>         at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
>         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
>         at org.apache.spark.repl.Main$.main(Main.scala:31)
>         at org.apache.spark.repl.Main.main(Main.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:498)
>         at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
>         at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> <console>:16: error: not found: value sqlContext
>          import sqlContext.implicits._
>                 ^
> <console>:16: error: not found: value sqlContext
>          import sqlContext.sql



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
