Hello,
I'm learning Spark Streaming and I'm really excited about it.
Today I tried packaging some apps and submitting them to a standalone
cluster running locally on my computer.
That worked fine.

Then I created a virtual machine with a bridged network and tried to
submit the same app to that VM from my local PC.
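For reference, the submit command looks roughly like this (the jar path and main class are placeholders; the master host comes from the error message below):

```shell
# Sketch of submitting to a remote standalone master from another machine.
# spark://spark-01:7077 is the master URL seen in the error log;
# the class name and jar are placeholders for my actual app.
./bin/spark-submit \
  --class com.example.MyStreamingApp \
  --master spark://spark-01:7077 \
  --deploy-mode client \
  my-streaming-app.jar
```

In client deploy mode the driver runs on the submitting machine, so the workers on the VM must be able to reach the local PC over the network as well, not just the other way around.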

But I got errors like these:
WARN AppClient$ClientActor: Could not connect to
akka.tcp://sparkMaster@spark-01:7077:
akka.remote.EndpointAssociationException: Association failed with
[akka.tcp://sparkMaster@spark-01:7077]
14/09/24 16:28:48 WARN TaskSchedulerImpl: Initial job has not accepted any
resources; check your cluster UI to ensure that workers are registered and
have sufficient memory

Is it possible to submit an application from another computer to a remote
master with workers?
Does anybody have a suggestion?

Thanks a lot.



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Question-About-Submit-Application-tp15072.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
