I've been able to solve this problem with a system restart, thank you!

On Sun, Mar 26, 2017 at 7:44 PM, Vaghawan Ojha <vaghawan...@gmail.com> wrote:
> But when I restart the whole machine, it works again. It's quite weird.
>
> On Sun, Mar 26, 2017 at 7:16 PM, Vaghawan Ojha <vaghawan...@gmail.com> wrote:
>
>> Sorry, even pio status shows some errors now:
>>
>> [ERROR] [RecoverableZooKeeper] ZooKeeper exists failed after 1 attempts
>> [ERROR] [ZooKeeperWatcher] hconnection-0x39ad12b6, quorum=localhost:2181, baseZNode=/hbase Received unexpected KeeperException, re-throwing exception
>> [WARN] [ZooKeeperRegistry] Can't retrieve clusterId from Zookeeper
>> [ERROR] [StorageClient] Cannot connect to ZooKeeper (ZooKeeper ensemble: localhost). Please make sure that the configuration is pointing at the correct ZooKeeper ensemble. By default, HBase manages its own ZooKeeper, so if you have not configured HBase to use an external ZooKeeper, that means your HBase is not started or configured properly.
>> [ERROR] [Storage$] Error initializing storage client for source HBASE
>> [ERROR] [Console$] Unable to connect to all storage backends successfully. The following shows the error message from the storage backend.
>> [ERROR] [Console$] Data source HBASE was not properly initialized. (org.apache.predictionio.data.storage.StorageClientException)
>> [ERROR] [Console$] Dumping configuration of initialized storage backend sources. Please make sure they are correct.
>> [ERROR] [Console$] Source Name: ELASTICSEARCH; Type: elasticsearch; Configuration: HOME -> /var/www/abc/apache-predictionio-0.10.0-incubating/PredictionIO-0.10.0-incubating/vendors/elasticsearch-1.4.4, HOSTS -> localhost, PORTS -> 9300, CLUSTERNAME -> elasticsearch, TYPE -> elasticsearch
>> [ERROR] [Console$] Source Name: LOCALFS; Type: localfs; Configuration: PATH -> /home/ekbana-php/.pio_store/models, TYPE -> localfs
>> [ERROR] [Console$] Source Name: HBASE; Type: (error); Configuration: (error)
>>
>> How would I actually solve it?
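[Editor's note: every error above bottoms out in the same symptom: nothing answering on the ZooKeeper ensemble that the log reports as quorum=localhost:2181. A minimal way to confirm that before touching any PredictionIO configuration is to test whether the port accepts TCP connections at all. The sketch below is not from the thread; the function name `port_open` and the host/port values are assumptions taken from the log line.]

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # HBase's embedded ZooKeeper listens on 2181 by default,
    # matching quorum=localhost:2181 in the error output above.
    print("ZooKeeper reachable:", port_open("localhost", 2181))
```

If this prints False right after pio-start-all, HBase's embedded ZooKeeper never came up, which matches the error message's own diagnosis that HBase is not started or configured properly.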
>>
>> On Sun, Mar 26, 2017 at 7:13 PM, Vaghawan Ojha <vaghawan...@gmail.com> wrote:
>>
>>> While importing the event data, I get the following error. I am new and I don't understand what these errors mean. pio status shows everything is OK. What's wrong here?
>>>
>>> sudo PredictionIO-0.10.0-incubating/bin/pio import --appid 1 --input my_events.json
>>> [INFO] [Runner$] Submission command:
>>>   /var/www/abc/apache-predictionio-0.10.0-incubating/PredictionIO-0.10.0-incubating/vendors/spark-1.5.1-bin-hadoop2.6/bin/spark-submit
>>>   --class org.apache.predictionio.tools.imprt.FileToEvents
>>>   --files file:/var/www/abc/apache-predictionio-0.10.0-incubating/PredictionIO-0.10.0-incubating/conf/log4j.properties,file:/var/www/abc/apache-predictionio-0.10.0-incubating/PredictionIO-0.10.0-incubating/vendors/hbase-1.0.0/conf/hbase-site.xml
>>>   --driver-class-path /var/www/abc/apache-predictionio-0.10.0-incubating/PredictionIO-0.10.0-incubating/conf:/var/www/abc/apache-predictionio-0.10.0-incubating/PredictionIO-0.10.0-incubating/vendors/elasticsearch-1.4.4/conf:/var/www/abc/apache-predictionio-0.10.0-incubating/PredictionIO-0.10.0-incubating/lib/postgresql-9.4-1204.jdbc41.jar:/var/www/abc/apache-predictionio-0.10.0-incubating/PredictionIO-0.10.0-incubating/lib/mysql-connector-java-5.1.37.jar:/var/www/abc/apache-predictionio-0.10.0-incubating/PredictionIO-0.10.0-incubating/vendors/spark-1.5.1-bin-hadoop2.6/conf:/var/www/abc/apache-predictionio-0.10.0-incubating/PredictionIO-0.10.0-incubating/vendors/hbase-1.0.0/conf
>>>   file:/var/www/abc/apache-predictionio-0.10.0-incubating/PredictionIO-0.10.0-incubating/lib/pio-assembly-0.10.0-incubating.jar
>>>   --appid 1 --input file:/var/www/abc/apache-predictionio-0.10.0-incubating/my_events.json
>>>   --env PIO_STORAGE_SOURCES_HBASE_TYPE=hbase,PIO_ENV_LOADED=1,PIO_STORAGE_REPOSITORIES_METADATA_NAME=pio_meta,PIO_FS_BASEDIR=/home/ekbana-php/.pio_store,PIO_STORAGE_SOURCES_ELASTICSEARCH_HOSTS=localhost,PIO_STORAGE_SOURCES_HBASE_HOME=/var/www/abc/apache-predictionio-0.10.0-incubating/PredictionIO-0.10.0-incubating/vendors/hbase-1.0.0,PIO_HOME=/var/www/abc/apache-predictionio-0.10.0-incubating/PredictionIO-0.10.0-incubating,PIO_FS_ENGINESDIR=/home/ekbana-php/.pio_store/engines,PIO_STORAGE_SOURCES_LOCALFS_PATH=/home/ekbana-php/.pio_store/models,PIO_STORAGE_SOURCES_ELASTICSEARCH_TYPE=elasticsearch,PIO_STORAGE_REPOSITORIES_METADATA_SOURCE=ELASTICSEARCH,PIO_STORAGE_REPOSITORIES_MODELDATA_SOURCE=LOCALFS,PIO_STORAGE_REPOSITORIES_EVENTDATA_NAME=pio_event,PIO_STORAGE_SOURCES_ELASTICSEARCH_CLUSTERNAME=elasticsearch,PIO_STORAGE_SOURCES_ELASTICSEARCH_HOME=/var/www/abc/apache-predictionio-0.10.0-incubating/PredictionIO-0.10.0-incubating/vendors/elasticsearch-1.4.4,PIO_FS_TMPDIR=/home/ekbana-php/.pio_store/tmp,PIO_STORAGE_REPOSITORIES_MODELDATA_NAME=pio_model,PIO_STORAGE_REPOSITORIES_EVENTDATA_SOURCE=HBASE,PIO_CONF_DIR=/var/www/abc/apache-predictionio-0.10.0-incubating/PredictionIO-0.10.0-incubating/conf,PIO_STORAGE_SOURCES_ELASTICSEARCH_PORTS=9300,PIO_STORAGE_SOURCES_LOCALFS_TYPE=localfs
>>> [WARN] [Utils] Your hostname, EK-LT-15 resolves to a loopback address: 127.0.1.1; using 192.168.10.8 instead (on interface wlp6s0)
>>> [WARN] [Utils] Set SPARK_LOCAL_IP if you need to bind to another address
>>> [INFO] [Remoting] Starting remoting
>>> [INFO] [Remoting] Remoting started; listening on addresses :[akka.tcp://sparkDriver@192.168.10.8:36713]
>>> [WARN] [MetricsSystem] Using default name DAGScheduler for source because spark.app.id is not set.
>>> [ERROR] [RecoverableZooKeeper] ZooKeeper exists failed after 1 attempts
>>> [ERROR] [ZooKeeperWatcher] hconnection-0x7bede4ea, quorum=localhost:2181, baseZNode=/hbase Received unexpected KeeperException, re-throwing exception
>>> [WARN] [ZooKeeperRegistry] Can't retrieve clusterId from Zookeeper
>>> [ERROR] [StorageClient] Cannot connect to ZooKeeper (ZooKeeper ensemble: localhost). Please make sure that the configuration is pointing at the correct ZooKeeper ensemble. By default, HBase manages its own ZooKeeper, so if you have not configured HBase to use an external ZooKeeper, that means your HBase is not started or configured properly.
>>> [ERROR] [Storage$] Error initializing storage client for source HBASE
>>> Exception in thread "main" org.apache.predictionio.data.storage.StorageClientException: Data source HBASE was not properly initialized.
>>>         at org.apache.predictionio.data.storage.Storage$$anonfun$10.apply(Storage.scala:282)
>>>         at org.apache.predictionio.data.storage.Storage$$anonfun$10.apply(Storage.scala:282)
>>>         at scala.Option.getOrElse(Option.scala:120)
>>>         at org.apache.predictionio.data.storage.Storage$.getDataObject(Storage.scala:281)
>>>         at org.apache.predictionio.data.storage.Storage$.getPDataObject(Storage.scala:330)
>>>         at org.apache.predictionio.data.storage.Storage$.getPDataObject(Storage.scala:273)
>>>         at org.apache.predictionio.data.storage.Storage$.getPEvents(Storage.scala:394)
>>>         at org.apache.predictionio.tools.imprt.FileToEvents$$anonfun$main$1.apply(FileToEvents.scala:98)
>>>         at org.apache.predictionio.tools.imprt.FileToEvents$$anonfun$main$1.apply(FileToEvents.scala:68)
>>>         at scala.Option.map(Option.scala:145)
>>>         at org.apache.predictionio.tools.imprt.FileToEvents$.main(FileToEvents.scala:68)
>>>         at org.apache.predictionio.tools.imprt.FileToEvents.main(FileToEvents.scala)
>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>         at java.lang.reflect.Method.invoke(Method.java:498)
>>>         at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
>>>         at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
>>>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
>>>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
>>>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>>
>>> ekbana-php@EK-LT-15:/var/www/abc/apache-predictionio-0.10.0-incubating$ sudo PredictionIO-0.10.0-incubating/bin/pio-stop-all
>>> Stopping PredictionIO Event Server...
>>> Stopping HBase...
>>> stopping hbase
>>> Stopping Elasticsearch...
>>> ekbana-php@EK-LT-15:/var/www/abc/apache-predictionio-0.10.0-incubating$ sudo PredictionIO-0.10.0-incubating/bin/pio-start-all
>>> Starting Elasticsearch...
>>> Starting HBase...
>>> starting master, logging to /var/www/abc/apache-predictionio-0.10.0-incubating/PredictionIO-0.10.0-incubating/vendors/hbase-1.0.0/bin/../logs/hbase-root-master-EK-LT-15.out
>>> Waiting 10 seconds for HBase to fully initialize...
>>> Starting PredictionIO Event Server...
>>> ekbana-php@EK-LT-15:/var/www/abc/apache-predictionio-0.10.0-incubating$ sudo PredictionIO-0.10.0-incubating/bin/pio import --appid 1 --input my_events.json
>>> [INFO] [Runner$] Submission command:
>>>   /var/www/abc/apache-predictionio-0.10.0-incubating/PredictionIO-0.10.0-incubating/vendors/spark-1.5.1-bin-hadoop2.6/bin/spark-submit
>>>   --class org.apache.predictionio.tools.imprt.FileToEvents
>>>   --files file:/var/www/abc/apache-predictionio-0.10.0-incubating/PredictionIO-0.10.0-incubating/conf/log4j.properties,file:/var/www/abc/apache-predictionio-0.10.0-incubating/PredictionIO-0.10.0-incubating/vendors/hbase-1.0.0/conf/hbase-site.xml
>>>   --driver-class-path /var/www/abc/apache-predictionio-0.10.0-incubating/PredictionIO-0.10.0-incubating/conf:/var/www/abc/apache-predictionio-0.10.0-incubating/PredictionIO-0.10.0-incubating/vendors/elasticsearch-1.4.4/conf:/var/www/abc/apache-predictionio-0.10.0-incubating/PredictionIO-0.10.0-incubating/lib/postgresql-9.4-1204.jdbc41.jar:/var/www/abc/apache-predictionio-0.10.0-incubating/PredictionIO-0.10.0-incubating/lib/mysql-connector-java-5.1.37.jar:/var/www/abc/apache-predictionio-0.10.0-incubating/PredictionIO-0.10.0-incubating/vendors/spark-1.5.1-bin-hadoop2.6/conf:/var/www/abc/apache-predictionio-0.10.0-incubating/PredictionIO-0.10.0-incubating/vendors/hbase-1.0.0/conf
>>>   file:/var/www/abc/apache-predictionio-0.10.0-incubating/PredictionIO-0.10.0-incubating/lib/pio-assembly-0.10.0-incubating.jar
>>>   --appid 1 --input file:/var/www/abc/apache-predictionio-0.10.0-incubating/my_events.json
>>>   --env PIO_STORAGE_SOURCES_HBASE_TYPE=hbase,PIO_ENV_LOADED=1,PIO_STORAGE_REPOSITORIES_METADATA_NAME=pio_meta,PIO_FS_BASEDIR=/home/ekbana-php/.pio_store,PIO_STORAGE_SOURCES_ELASTICSEARCH_HOSTS=localhost,PIO_STORAGE_SOURCES_HBASE_HOME=/var/www/abc/apache-predictionio-0.10.0-incubating/PredictionIO-0.10.0-incubating/vendors/hbase-1.0.0,PIO_HOME=/var/www/abc/apache-predictionio-0.10.0-incubating/PredictionIO-0.10.0-incubating,PIO_FS_ENGINESDIR=/home/ekbana-php/.pio_store/engines,PIO_STORAGE_SOURCES_LOCALFS_PATH=/home/ekbana-php/.pio_store/models,PIO_STORAGE_SOURCES_ELASTICSEARCH_TYPE=elasticsearch,PIO_STORAGE_REPOSITORIES_METADATA_SOURCE=ELASTICSEARCH,PIO_STORAGE_REPOSITORIES_MODELDATA_SOURCE=LOCALFS,PIO_STORAGE_REPOSITORIES_EVENTDATA_NAME=pio_event,PIO_STORAGE_SOURCES_ELASTICSEARCH_CLUSTERNAME=elasticsearch,PIO_STORAGE_SOURCES_ELASTICSEARCH_HOME=/var/www/abc/apache-predictionio-0.10.0-incubating/PredictionIO-0.10.0-incubating/vendors/elasticsearch-1.4.4,PIO_FS_TMPDIR=/home/ekbana-php/.pio_store/tmp,PIO_STORAGE_REPOSITORIES_MODELDATA_NAME=pio_model,PIO_STORAGE_REPOSITORIES_EVENTDATA_SOURCE=HBASE,PIO_CONF_DIR=/var/www/abc/apache-predictionio-0.10.0-incubating/PredictionIO-0.10.0-incubating/conf,PIO_STORAGE_SOURCES_ELASTICSEARCH_PORTS=9300,PIO_STORAGE_SOURCES_LOCALFS_TYPE=localfs
>>> [WARN] [Utils] Your hostname, EK-LT-15 resolves to a loopback address: 127.0.1.1; using 192.168.10.8 instead (on interface wlp6s0)
>>> [WARN] [Utils] Set SPARK_LOCAL_IP if you need to bind to another address
>>> [INFO] [Remoting] Starting remoting
>>> [INFO] [Remoting] Remoting started; listening on addresses :[akka.tcp://sparkDriver@192.168.10.8:44522]
>>> [WARN] [MetricsSystem] Using default name DAGScheduler for source because spark.app.id is not set.
>>> [ERROR] [RecoverableZooKeeper] ZooKeeper exists failed after 1 attempts
>>> [ERROR] [ZooKeeperWatcher] hconnection-0x1d2d8846, quorum=localhost:2181, baseZNode=/hbase Received unexpected KeeperException, re-throwing exception
>>> [WARN] [ZooKeeperRegistry] Can't retrieve clusterId from Zookeeper
>>> [ERROR] [StorageClient] Cannot connect to ZooKeeper (ZooKeeper ensemble: localhost). Please make sure that the configuration is pointing at the correct ZooKeeper ensemble. By default, HBase manages its own ZooKeeper, so if you have not configured HBase to use an external ZooKeeper, that means your HBase is not started or configured properly.
>>> [ERROR] [Storage$] Error initializing storage client for source HBASE
>>> Exception in thread "main" org.apache.predictionio.data.storage.StorageClientException: Data source HBASE was not properly initialized.
>>>         at org.apache.predictionio.data.storage.Storage$$anonfun$10.apply(Storage.scala:282)
>>>         at org.apache.predictionio.data.storage.Storage$$anonfun$10.apply(Storage.scala:282)
>>>         at scala.Option.getOrElse(Option.scala:120)
>>>         at org.apache.predictionio.data.storage.Storage$.getDataObject(Storage.scala:281)
>>>         at org.apache.predictionio.data.storage.Storage$.getPDataObject(Storage.scala:330)
>>>         at org.apache.predictionio.data.storage.Storage$.getPDataObject(Storage.scala:273)
>>>         at org.apache.predictionio.data.storage.Storage$.getPEvents(Storage.scala:394)
>>>         at org.apache.predictionio.tools.imprt.FileToEvents$$anonfun$main$1.apply(FileToEvents.scala:98)
>>>         at org.apache.predictionio.tools.imprt.FileToEvents$$anonfun$main$1.apply(FileToEvents.scala:68)
>>>         at scala.Option.map(Option.scala:145)
>>>         at org.apache.predictionio.tools.imprt.FileToEvents$.main(FileToEvents.scala:68)
>>>         at org.apache.predictionio.tools.imprt.FileToEvents.main(FileToEvents.scala)
>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>         at java.lang.reflect.Method.invoke(Method.java:498)
>>>         at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
>>>         at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
>>>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
>>>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
>>>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
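[Editor's note: pio-start-all waits a fixed 10 seconds ("Waiting 10 seconds for HBase to fully initialize...") and the import still fails, so on a slow machine that sleep may simply be too short. A hedged alternative, not part of PredictionIO, is to poll the ZooKeeper port until it actually accepts connections before running pio import. The function name `wait_for_port` and the 60-second deadline below are assumptions for illustration.]

```python
import socket
import time

def wait_for_port(host: str, port: int, deadline_s: float = 60.0) -> bool:
    """Poll host:port until a TCP connection succeeds or the deadline passes."""
    end = time.monotonic() + deadline_s
    while time.monotonic() < end:
        try:
            with socket.create_connection((host, port), timeout=2.0):
                return True  # something is listening; safe to proceed
        except OSError:
            time.sleep(1.0)  # not up yet; retry until the deadline
    return False

# Example (assumed defaults): wait_for_port("localhost", 2181)
# would gate `pio import` on HBase's embedded ZooKeeper being reachable.
```

This replaces a fixed sleep with a readiness check, so the import only runs once the port reported in the log (quorum=localhost:2181) is actually accepting connections.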