Hi,

I have set up PredictionIO successfully on a Linux VM. The event server is running 
at http://localhost:7070.

I am using Elasticsearch as the data storage backend, and it is up and running at 
http://localhost:9200.
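
For reference, both respond to a plain HTTP request from inside the VM, e.g.:

$ curl -i http://localhost:7070    # PredictionIO event server
$ curl -i http://localhost:9200    # Elasticsearch info endpoint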

But when I deploy the Recommendation engine, the deployment succeeds, yet I am not 
able to access it at localhost. The command I run and the trace from the successful 
deployment are below.
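
The deploy command is essentially the standard one run from the engine template 
directory (the directory name below is just an example, and I am using the default 
port 8000 here; as mentioned further down I have also tried other ports):

$ cd MyRecommendation       # engine template directory (example name)
$ pio deploy --port 8000    # bind the engine API server to port 8000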

[WARN] [WorkflowUtils$] Non-empty parameters supplied to 
org.template.recommendation.Preparator, but its constructor does not accept any 
arguments. Stubbing with empty parameters.
[WARN] [WorkflowUtils$] Non-empty parameters supplied to 
org.template.recommendation.Serving, but its constructor does not accept any 
arguments. Stubbing with empty parameters.
[INFO] [Remoting] Starting remoting
[INFO] [Remoting] Remoting started; listening on addresses 
:[akka.tcp://[email protected]:40932]
[WARN] [MetricsSystem] Using default name DAGScheduler for source because 
spark.app.id is not set.
[INFO] [Engine] Using persisted model
[INFO] [Engine] Custom-persisted model detected for algorithm 
org.template.recommendation.ALSAlgorithm
[WARN] [ALSModel] User factor does not have a partitioner. Prediction on 
individual records could be slow.
[WARN] [ALSModel] User factor is not cached. Prediction could be slow.
[WARN] [ALSModel] Product factor does not have a partitioner. Prediction on 
individual records could be slow.
[WARN] [ALSModel] Product factor is not cached. Prediction could be slow.
[INFO] [MasterActor] Undeploying any existing engine instance at 
http://0.0.0.0:8000
[WARN] [MasterActor] Nothing at http://0.0.0.0:8000
[INFO] [HttpListener] Bound to /0.0.0.0:8000
[INFO] [MasterActor] Engine is deployed and running. Engine API is live at 
http://0.0.0.0:8000.


$ curl -i -X GET "http://localhost:8000";
curl: (7) Failed to connect to localhost port 8000: Connection refused
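
If it helps with diagnosis, I can also run a quick port check from inside the VM and 
post the output, for example:

$ ss -ltnp | grep 8000             # is anything actually listening on port 8000?
$ curl -i http://127.0.0.1:8000    # retry against the loopback address directly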

I have also tried different ports, but nothing worked. Could someone please help me 
resolve this?

Regards,
Sravya
