Re: Windows - Spark 2 - Standalone - Worker not able to connect to Master

2016-08-01 Thread Nikolay Zhebet
Your exception says that you are having connection trouble with the Spark master.

Check whether it is reachable from the environment where you are trying to run
the job. On Linux, commands such as "telnet 127.0.0.1 7077", "netstat -ntpl |
grep 7077", or "nmap 127.0.0.1 | grep 7077" are suitable for this.

Try the Windows equivalents of these commands and check whether the Spark
master is reachable from the environment you are running in.
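Where telnet is not installed (common on Windows), the same reachability check can be done with Python's standard socket module. This is a minimal sketch; the helper name `is_port_open` is made up for illustration, and 127.0.0.1:7077 is just the default standalone-master address from this thread:

```python
import socket

def is_port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP listener accepts connections on host:port."""
    try:
        # The connection only succeeds if something is listening on the port.
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Check the default Spark standalone master port (result depends on
# whether a master is actually running on this machine).
print(is_port_open("127.0.0.1", 7077))
```

The same script runs unchanged on Windows and Linux, which avoids hunting for a netstat/telnet equivalent.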

2016-08-01 14:35 GMT+03:00 ayan guha :

> No, I confirmed the master is running via the Spark UI at localhost:8080
> On 1 Aug 2016 18:22, "Nikolay Zhebet"  wrote:
>
>> I think you haven't started the Spark master yet, or maybe port 7077 is not
>> the default port for your Spark master.
>>
>> 2016-08-01 4:24 GMT+03:00 ayan guha :
>>
>>> Hi
>>>
>>> I just downloaded Spark 2.0 on my Windows 7 machine to check it out.
>>> However, I am not able to set up a standalone cluster:
>>>
>>>
>>> Step 1: Master setup (successful)
>>>
>>> bin/spark-class org.apache.spark.deploy.master.Master
>>>
>>> It did throw an error about not being able to find winutils, but it
>>> started successfully.
>>>
>>> Step 2: Worker setup (failed)
>>>
>>> bin/spark-class org.apache.spark.deploy.worker.Worker
>>> spark://localhost:7077
>>>
>>> This step fails with the following error:
>>>
>>> 16/08/01 11:21:27 INFO Worker: Connecting to master localhost:7077...
>>> 16/08/01 11:21:28 WARN Worker: Failed to connect to master localhost:7077
>>> org.apache.spark.SparkException: Exception thrown in awaitResult
>>> [remainder of stack trace snipped; quoted in full in the original
>>> message below]
>>>
>>> Am I doing something wrong?
>>>
>>>
>>> --
>>> Best Regards,
>>> Ayan Guha
>>>
>>
>>


Re: Windows - Spark 2 - Standalone - Worker not able to connect to Master

2016-08-01 Thread ayan guha
No, I confirmed the master is running via the Spark UI at localhost:8080
On 1 Aug 2016 18:22, "Nikolay Zhebet"  wrote:

> I think you haven't started the Spark master yet, or maybe port 7077 is not
> the default port for your Spark master.
>
> 2016-08-01 4:24 GMT+03:00 ayan guha :
>
>> Hi
>>
>> I just downloaded Spark 2.0 on my Windows 7 machine to check it out.
>> However, I am not able to set up a standalone cluster:
>>
>>
>> Step 1: Master setup (successful)
>>
>> bin/spark-class org.apache.spark.deploy.master.Master
>>
>> It did throw an error about not being able to find winutils, but it
>> started successfully.
>>
>> Step 2: Worker setup (failed)
>>
>> bin/spark-class org.apache.spark.deploy.worker.Worker
>> spark://localhost:7077
>>
>> This step fails with the following error:
>>
>> 16/08/01 11:21:27 INFO Worker: Connecting to master localhost:7077...
>> 16/08/01 11:21:28 WARN Worker: Failed to connect to master localhost:7077
>> org.apache.spark.SparkException: Exception thrown in awaitResult
>> [remainder of stack trace snipped; quoted in full in the original
>> message below]
>>
>> Am I doing something wrong?
>>
>>
>> --
>> Best Regards,
>> Ayan Guha
>>
>
>


Re: Windows - Spark 2 - Standalone - Worker not able to connect to Master

2016-08-01 Thread Nikolay Zhebet
I think you haven't started the Spark master yet, or maybe port 7077 is not
the default port for your Spark master.

2016-08-01 4:24 GMT+03:00 ayan guha :

> Hi
>
> I just downloaded Spark 2.0 on my Windows 7 machine to check it out.
> However, I am not able to set up a standalone cluster:
>
>
> Step 1: Master setup (successful)
>
> bin/spark-class org.apache.spark.deploy.master.Master
>
> It did throw an error about not being able to find winutils, but it
> started successfully.
>
> Step 2: Worker setup (failed)
>
> bin/spark-class org.apache.spark.deploy.worker.Worker
> spark://localhost:7077
>
> This step fails with the following error:
>
> 16/08/01 11:21:27 INFO Worker: Connecting to master localhost:7077...
> 16/08/01 11:21:28 WARN Worker: Failed to connect to master localhost:7077
> org.apache.spark.SparkException: Exception thrown in awaitResult
> [remainder of stack trace snipped; quoted in full in the original
> message below]
>
> Am I doing something wrong?
>
>
> --
> Best Regards,
> Ayan Guha
>


Windows - Spark 2 - Standalone - Worker not able to connect to Master

2016-07-31 Thread ayan guha
Hi

I just downloaded Spark 2.0 on my Windows 7 machine to check it out. However,
I am not able to set up a standalone cluster:


Step 1: Master setup (successful)

bin/spark-class org.apache.spark.deploy.master.Master

It did throw an error about not being able to find winutils, but it started
successfully.

Step 2: Worker setup (failed)

bin/spark-class org.apache.spark.deploy.worker.Worker spark://localhost:7077

This step fails with the following error:

16/08/01 11:21:27 INFO Worker: Connecting to master localhost:7077...
16/08/01 11:21:28 WARN Worker: Failed to connect to master localhost:7077
org.apache.spark.SparkException: Exception thrown in awaitResult
        at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:77)
        at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:75)
        at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
        at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
        at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
        at scala.PartialFunction$OrElse.apply(PartialFunction.scala:167)
        at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:83)
        at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:88)
        at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:96)
        at org.apache.spark.deploy.worker.Worker$$anonfun$org$apache$spark$deploy$worker$Worker$$tryRegisterAllMasters$1$$anon$1.run(Worker.scala:216)
        at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
        at java.util.concurrent.FutureTask.run(Unknown Source)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
        at java.lang.Thread.run(Unknown Source)
Caused by: java.io.IOException: Failed to connect to localhost/127.0.0.1:7077
        at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:228)
        at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:179)
        at org.apache.spark.rpc.netty.NettyRpcEnv.createClient(NettyRpcEnv.scala:197)
        at org.apache.spark.rpc.netty.Outbox$$anon$1.call(Outbox.scala:191)
        at org.apache.spark.rpc.netty.Outbox$$anon$1.call(Outbox.scala:187)
        ... 4 more
Caused by: java.net.ConnectException: Connection refused: no further information: localhost/127.0.0.1:7077
        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
        at sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source)
        at io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:224)
        at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:289)
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:528)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
        ... 1 more

Am I doing something wrong?


-- 
Best Regards,
Ayan Guha
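[Editor's note] The root cause buried at the bottom of the trace above is an ordinary TCP "Connection refused", which means nothing was accepting connections on localhost:7077 at that moment. The same condition can be reproduced outside Spark with a few lines of Python; this is only an illustrative sketch (the helper name `connect_once` is invented here), not Spark's own connection code:

```python
import socket

def connect_once(host: str, port: int) -> str:
    """Attempt a single TCP connection and report the outcome."""
    try:
        with socket.create_connection((host, port), timeout=2.0):
            return "connected"
    except ConnectionRefusedError:
        # The same OS-level condition the Worker logs as
        # "java.net.ConnectException: Connection refused".
        return "refused"
    except OSError as exc:
        # Timeouts, unresolvable hosts, firewalled ports, etc.
        return f"error: {exc}"

print(connect_once("127.0.0.1", 7077))
```

If this prints "refused" while the master UI at localhost:8080 is up, the master process is running but is not listening on that address/port combination, which narrows the problem to the master's bind address or port configuration.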