Hi Amal,

It seems that you're using a standalone Spark cluster. In that case,
instead of `pio eventserver &`, you would have to run `pio-start-all` to
start the master and slave processes. If you need to change the default
settings, refer to the Spark standalone documentation
(http://spark.apache.org/docs/latest/spark-standalone.html) and start the
master/slave processes yourself.
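If you do end up starting the daemons by hand, a minimal sketch (assuming
SPARK_HOME points at your Spark installation; MYSPARKCLUSTER is a
placeholder for your master host) would be:

    $SPARK_HOME/sbin/start-master.sh
    $SPARK_HOME/sbin/start-slave.sh spark://MYSPARKCLUSTER:7077

The master's web UI (http://MYSPARKCLUSTER:8080 by default) shows the
exact spark:// URL it is listening on, which is what you would pass to
`pio train -- --master ...`.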

Chan

On Mon, Oct 17, 2016 at 1:43 AM, amal kumar <amal.kmr.si...@gmail.com>
wrote:

> Hi,
>
> I am trying to use Spark installed on a remote cluster, separate from
> the machine where PredictionIO is installed (an EC2 instance).
>
> As per my understanding, I have made the following changes:
> 1. Installed a matching version of Spark locally.
> 2. Updated SPARK_HOME in conf/pio-env.sh to point to the locally
> installed Spark.
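>
> For reference, the change in step 2 is a single line, something like
> this (the path is just a placeholder for wherever Spark is unpacked):
>
>   # conf/pio-env.sh -- must match the Spark version on the cluster
>   SPARK_HOME=/path/to/spark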
>
> 'pio status' reports success, and I am also able to start the event
> server with 'pio eventserver --ip 0.0.0.0 &'.
>
>
> But I am getting an error while training the model:
>
> $ pio train -- --master spark://MYSPARKCLUSTER:7077
>
> [INFO] [Remoting] Starting remoting
> [INFO] [Remoting] Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@172.31.6.92:35117]
> [WARN] [AppClient$ClientEndpoint] Failed to connect to master MYSPARKCLUSTER:7077
> [WARN] [AppClient$ClientEndpoint] Failed to connect to master MYSPARKCLUSTER:7077
> [WARN] [AppClient$ClientEndpoint] Failed to connect to master MYSPARKCLUSTER:7077
> [ERROR] [SparkDeploySchedulerBackend] Application has been killed. Reason: All masters are unresponsive! Giving up.
> [WARN] [SparkDeploySchedulerBackend] Application ID is not initialized yet.
> [WARN] [AppClient$ClientEndpoint] Failed to connect to master MYSPARKCLUSTER:7077
> [WARN] [AppClient$ClientEndpoint] Drop UnregisterApplication(null) because has not yet connected to master
> [ERROR] [SparkContext] Error initializing SparkContext.
>
>
>
> Can you advise what I am missing here?
>
>
> Thanks,
> Amal Kumar
>
