Hi,

I am trying to use Spark installed on a remote cluster, separate from the
machine where PredictionIO is installed (on EC2).

As per my understanding, I have made the changes below:
1. installed a matching version of Spark locally
2. updated SPARK_HOME in conf/pio-env.sh to point to the local Spark
installation (example below)
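
For reference, the relevant line in conf/pio-env.sh would look roughly like
this (the path is just an example; it should point at wherever the matching
Spark build was unpacked):

    SPARK_HOME=/opt/spark-1.6.1-bin-hadoop2.6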

'pio status' succeeds, and I am able to start the event server with 'pio
eventserver --ip 0.0.0.0 &'.
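
A quick way to confirm the event server is alive is a curl against its
default port (7070, unless overridden with --port):

    $ curl http://localhost:7070
    {"status":"alive"}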


But I am getting an error while training the model.

$ pio train -- --master spark://MYSPARKCLUSTER:7077

[INFO] [Remoting] Starting remoting
[INFO] [Remoting] Remoting started; listening on addresses :[akka.tcp://[email protected]:35117]
[WARN] [AppClient$ClientEndpoint] Failed to connect to master MYSPARKCLUSTER:7077
[WARN] [AppClient$ClientEndpoint] Failed to connect to master MYSPARKCLUSTER:7077
[WARN] [AppClient$ClientEndpoint] Failed to connect to master MYSPARKCLUSTER:7077
[ERROR] [SparkDeploySchedulerBackend] Application has been killed. Reason: All masters are unresponsive! Giving up.
[WARN] [SparkDeploySchedulerBackend] Application ID is not initialized yet.
[WARN] [AppClient$ClientEndpoint] Failed to connect to master MYSPARKCLUSTER:7077
[WARN] [AppClient$ClientEndpoint] Drop UnregisterApplication(null) because has not yet connected to master
[ERROR] [SparkContext] Error initializing SparkContext.
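
From the output it looks like the driver never manages to reach the master
at all. One thing I plan to double-check is that the spark:// URL exactly
matches the one shown at the top of the master's web UI (port 8080 by
default), and that the port is reachable from the EC2 machine, e.g. with a
raw TCP check (MYSPARKCLUSTER stands in for the real hostname):

    $ nc -vz MYSPARKCLUSTER 7077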



Can you advise what I am missing here?


Thanks,
Amal Kumar
