Re: Failed to connect to master ...

2017-03-09 Thread ??????????
If you want to debug an app against a remote cluster, submit the jar from the
command line with the Java debug option enabled, then attach your IDE's
debugger to it.
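
A minimal sketch of that workflow (the main class and jar names are
placeholders, and port 5005 is an arbitrary choice; the JDWP flag is the
standard JVM debug agent, not something specific to this thread):

  # Submit with the driver JVM waiting for a debugger on port 5005
  # (suspend=y blocks startup until the IDE attaches).
  spark-submit \
    --master spark://VM_IPAddress:7077 \
    --class com.example.MyTransformer \
    --driver-java-options "-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5005" \
    my-app.jar

Then create a "Remote" debug configuration in IntelliJ pointing at the
driver's host and port 5005.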

---Original---
From: "Shixiong(Ryan) Zhu"<shixi...@databricks.com>
Date: 2017/3/8 15:38:35
To: "Mina Aslani"<aslanim...@gmail.com>;
Cc: "user@spark.apache.org"<user@spark.apache.org>;"ayan 
guha"<guha.a...@gmail.com>;
Subject: Re: Failed to connect to master ...


The Spark master may bind to a different address. Take a look at this page to 
find the correct URL: http://VM_IPAddress:8080/


On Tue, Mar 7, 2017 at 10:13 PM, Mina Aslani <aslanim...@gmail.com> wrote:
Master and worker processes are running!

On Wed, Mar 8, 2017 at 12:38 AM, ayan guha <guha.a...@gmail.com> wrote:
You need to start Master and worker processes before connecting to them.

On Wed, Mar 8, 2017 at 3:33 PM, Mina Aslani <aslanim...@gmail.com> wrote:
Hi,

I am writing a Spark Transformer in Java in IntelliJ and trying to connect to
Spark in a VM using setMaster. I get "Failed to connect to master ..."


I get 17/03/07 16:20:55 WARN StandaloneAppClient$ClientEndpoint: Failed to
connect to master VM_IPAddress:7077

org.apache.spark.SparkException: Exception thrown in awaitResult
at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:77)
at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:75)
at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)



SparkSession spark = SparkSession
  .builder()
  .appName("Java Spark SQL")
  //.master("local[1]")
  .master("spark://VM_IPAddress:7077")
  .getOrCreate();
Dataset<String> lines = spark
  .readStream()
  .format("kafka")
  .option("kafka.bootstrap.servers", brokers)
  .option("subscribe", topic)
  .load()
  .selectExpr("CAST(value AS STRING)")
  .as(Encoders.STRING());



I get the same error when I try master("spark://spark-master:7077").

However, with .master("local[1]") no exception is thrown.

My Kafka is in the same VM and, being new to Spark, I am still trying to
understand:

- Why do I get the above exception and how can I fix it (connect to Spark in
the VM and read from Kafka in the VM)?
- Why is no exception thrown with "local[1]", and how do I set it up to read
from Kafka in the VM?
- How do I stream from Kafka (the data in the topic is in JSON format)?

Your input is appreciated!

Best regards,
Mina

-- 
Best Regards,
Ayan Guha

Re: Failed to connect to master ...

2017-03-07 Thread Shixiong(Ryan) Zhu
The Spark master may bind to a different address. Take a look at this page
to find the correct URL: http://VM_IPAddress:8080/
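
A minimal sketch, assuming a standalone master on the VM (SPARK_HOME and the
IP are placeholders): bind the master explicitly so the spark:// URL shown on
the web UI matches the one the client passes to .master(...).

  # Spark 2.x reads SPARK_MASTER_HOST (older releases used SPARK_MASTER_IP);
  # without it the master may bind to a hostname the client cannot resolve.
  export SPARK_MASTER_HOST=VM_IPAddress
  $SPARK_HOME/sbin/start-master.sh
  # http://VM_IPAddress:8080/ now shows the exact spark://host:port to use.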

On Tue, Mar 7, 2017 at 10:13 PM, Mina Aslani <aslanim...@gmail.com> wrote:

> Master and worker processes are running!
>
> On Wed, Mar 8, 2017 at 12:38 AM, ayan guha <guha.a...@gmail.com> wrote:
>
>> You need to start Master and worker processes before connecting to them.
>>
>> On Wed, Mar 8, 2017 at 3:33 PM, Mina Aslani <aslanim...@gmail.com> wrote:
>>
>>> Hi,
>>>
>>> I am writing a Spark Transformer in Java in IntelliJ and trying to
>>> connect to Spark in a VM using setMaster. I get "Failed to connect to
>>> master ..."
>>>
>>> I get 17/03/07 16:20:55 WARN StandaloneAppClient$ClientEndpoint: Failed
>>> to connect to master VM_IPAddress:7077
>>> org.apache.spark.SparkException: Exception thrown in awaitResult
>>> at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:77)
>>> at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:75)
>>> at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
>>> at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
>>>
>>> SparkSession spark = SparkSession
>>>   .builder()
>>>   .appName("Java Spark SQL")
>>>   //.master("local[1]")
>>>   .master("spark://VM_IPAddress:7077")
>>>   .getOrCreate();
>>>
>>> Dataset<String> lines = spark
>>>   .readStream()
>>>   .format("kafka")
>>>   .option("kafka.bootstrap.servers", brokers)
>>>   .option("subscribe", topic)
>>>   .load()
>>>   .selectExpr("CAST(value AS STRING)")
>>>   .as(Encoders.STRING());
>>>
>>>
>>>
>>> I get the same error when I try master("spark://spark-master:7077").
>>>
>>> However, with .master("local[1]") no exception is thrown.
>>>
>>> My Kafka is in the same VM and, being new to Spark, I am still trying to
>>> understand:
>>>
>>> - Why do I get the above exception and how can I fix it (connect to Spark
>>> in the VM and read from Kafka in the VM)?
>>>
>>> - Why is no exception thrown with "local[1]", and how do I set it up to
>>> read from Kafka in the VM?
>>>
>>> - How do I stream from Kafka (the data in the topic is in JSON format)?
>>>
>>> Your input is appreciated!
>>>
>>> Best regards,
>>> Mina
>>>
>>>
>>>
>>>
>>
>>
>> --
>> Best Regards,
>> Ayan Guha
>>
>
>


Re: Failed to connect to master ...

2017-03-07 Thread Mina Aslani
Master and worker processes are running!

On Wed, Mar 8, 2017 at 12:38 AM, ayan guha <guha.a...@gmail.com> wrote:

> You need to start Master and worker processes before connecting to them.
>
> On Wed, Mar 8, 2017 at 3:33 PM, Mina Aslani <aslanim...@gmail.com> wrote:
>
>> Hi,
>>
>> I am writing a Spark Transformer in Java in IntelliJ and trying to
>> connect to Spark in a VM using setMaster. I get "Failed to connect to
>> master ..."
>>
>> I get 17/03/07 16:20:55 WARN StandaloneAppClient$ClientEndpoint: Failed
>> to connect to master VM_IPAddress:7077
>> org.apache.spark.SparkException: Exception thrown in awaitResult
>> at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:77)
>> at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:75)
>> at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
>> at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
>>
>> SparkSession spark = SparkSession
>>   .builder()
>>   .appName("Java Spark SQL")
>>   //.master("local[1]")
>>   .master("spark://VM_IPAddress:7077")
>>   .getOrCreate();
>>
>> Dataset<String> lines = spark
>>   .readStream()
>>   .format("kafka")
>>   .option("kafka.bootstrap.servers", brokers)
>>   .option("subscribe", topic)
>>   .load()
>>   .selectExpr("CAST(value AS STRING)")
>>   .as(Encoders.STRING());
>>
>>
>>
>> I get the same error when I try master("spark://spark-master:7077").
>>
>> However, with .master("local[1]") no exception is thrown.
>>
>> My Kafka is in the same VM and, being new to Spark, I am still trying to
>> understand:
>>
>> - Why do I get the above exception and how can I fix it (connect to Spark
>> in the VM and read from Kafka in the VM)?
>>
>> - Why is no exception thrown with "local[1]", and how do I set it up to
>> read from Kafka in the VM?
>>
>> - How do I stream from Kafka (the data in the topic is in JSON format)?
>>
>> Your input is appreciated!
>>
>> Best regards,
>> Mina
>>
>>
>>
>>
>
>
> --
> Best Regards,
> Ayan Guha
>


Re: Failed to connect to master ...

2017-03-07 Thread ayan guha
You need to start Master and worker processes before connecting to them.
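
A minimal sketch of doing that with the standalone scripts (SPARK_HOME and
the IP are placeholders; start-slave.sh is the Spark 2.x worker script):

  # Start the master, then a worker that registers with it.
  $SPARK_HOME/sbin/start-master.sh
  $SPARK_HOME/sbin/start-slave.sh spark://VM_IPAddress:7077
  # jps should then list both a Master and a Worker process.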

On Wed, Mar 8, 2017 at 3:33 PM, Mina Aslani <aslanim...@gmail.com> wrote:

> Hi,
>
> I am writing a Spark Transformer in Java in IntelliJ and trying to connect
> to Spark in a VM using setMaster. I get "Failed to connect to master
> ..."
>
> I get 17/03/07 16:20:55 WARN StandaloneAppClient$ClientEndpoint: Failed
> to connect to master VM_IPAddress:7077
> org.apache.spark.SparkException: Exception thrown in awaitResult
> at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:77)
> at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:75)
> at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
> at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
>
> SparkSession spark = SparkSession
>   .builder()
>   .appName("Java Spark SQL")
>   //.master("local[1]")
>   .master("spark://VM_IPAddress:7077")
>   .getOrCreate();
>
> Dataset<String> lines = spark
>   .readStream()
>   .format("kafka")
>   .option("kafka.bootstrap.servers", brokers)
>   .option("subscribe", topic)
>   .load()
>   .selectExpr("CAST(value AS STRING)")
>   .as(Encoders.STRING());
>
>
>
> I get the same error when I try master("spark://spark-master:7077").
>
> However, with .master("local[1]") no exception is thrown.
>
> My Kafka is in the same VM and, being new to Spark, I am still trying to
> understand:
>
> - Why do I get the above exception and how can I fix it (connect to Spark
> in the VM and read from Kafka in the VM)?
>
> - Why is no exception thrown with "local[1]", and how do I set it up to
> read from Kafka in the VM?
>
> - How do I stream from Kafka (the data in the topic is in JSON format)?
>
> Your input is appreciated!
>
> Best regards,
> Mina
>
>
>
>


-- 
Best Regards,
Ayan Guha


Failed to connect to master ...

2017-03-07 Thread Mina Aslani
Hi,

I am writing a Spark Transformer in Java in IntelliJ and trying to connect
to Spark in a VM using setMaster. I get "Failed to connect to master
..."

I get 17/03/07 16:20:55 WARN StandaloneAppClient$ClientEndpoint: Failed to
connect to master VM_IPAddress:7077
org.apache.spark.SparkException: Exception thrown in awaitResult
at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:77)
at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:75)
at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)

SparkSession spark = SparkSession
  .builder()
  .appName("Java Spark SQL")
  //.master("local[1]")
  .master("spark://VM_IPAddress:7077")
  .getOrCreate();

Dataset<String> lines = spark
  .readStream()
  .format("kafka")
  .option("kafka.bootstrap.servers", brokers)
  .option("subscribe", topic)
  .load()
  .selectExpr("CAST(value AS STRING)")
  .as(Encoders.STRING());



I get the same error when I try master("spark://spark-master:7077").

However, with .master("local[1]") no exception is thrown.

My Kafka is in the same VM and, being new to Spark, I am still trying to
understand:

- Why do I get the above exception and how can I fix it (connect to Spark in
the VM and read from Kafka in the VM)?

- Why is no exception thrown with "local[1]", and how do I set it up to read
from Kafka in the VM?

- How do I stream from Kafka (the data in the topic is in JSON format)?
Your input is appreciated!

Best regards,
Mina