Yes. The output of `hadoop classpath` gives you the list of jars a Hadoop client needs to talk to a Hadoop cluster. As long as you put these on Elasticsearch's classpath, it should eliminate many common issues, like the wrong version of a jar getting loaded or the wrong Hadoop client version being used to talk to the cluster. Make sure you follow the classpath format when you append the output to ES_CLASSPATH.
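For example, a minimal sketch of what that could look like on the node running Elasticsearch (assuming the `hadoop` CLI is on the PATH and you start ES from its home directory — adjust paths for your install):

```shell
# `hadoop classpath` prints a colon-separated jar list, which is the
# same format ES_CLASSPATH expects, so it can be appended directly.
export ES_CLASSPATH="$ES_CLASSPATH:$(hadoop classpath)"

# Start Elasticsearch with the Hadoop client jars now visible to it.
bin/elasticsearch
```

This way the node picks up exactly the client jars your cluster's Hadoop distribution ships, instead of whatever version the plugin bundled.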
On Friday, November 21, 2014 3:00:11 PM UTC-8, Daniel Gligorov wrote:
>
> Hey Jinyuan,
>
> I'm having the same issue and want to solve it with the light version.
> How are you doing this part, adding `hadoop classpath` to ES_CLASSPATH?
>
> I tried to exec `hadoop classpath` and added the result to "export
> ES_CLASSPATH=<pasted output>" but am getting the same error. Is that what you meant?
>
> Thank you,
>
> On Monday, July 7, 2014 2:42:04 PM UTC-7, Jinyuan Zhou wrote:
>>
>> I am using Elasticsearch 1.2.1 and the CDH 4.6 quick start VM. My ES server
>> is installed on the same VM.
>> I have one successful scenario: I used the light version and added the output
>> of the command `hadoop classpath` to ES_CLASSPATH.
>>
>> But I encountered errors with the default version and the hadoop2 version.
>> Here are the details of the issues.
>> #1. I installed the plugin with this command:
>> bin/plugin --install elasticsearch/elasticsearch-repository-hdfs/2.0.0
>> and I sent the PUT request below:
>> url: http://localhost:9200/_snapshot/hdfs_repo
>> data: {
>>   "type": "hdfs",
>>   "settings": {
>>     "uri": "hdfs://localhost.localdomain:8020",
>>     "path": "/user/cloudera/es_snapshot"
>>   }
>> }
>>
>> I got this response:
>>
>> "error": "RepositoryException[[hdfs_repo] failed to create repository];
>> nested: CreationException[Guice creation errors:
>>
>> 1) Error injecting constructor,
>> org.elasticsearch.ElasticsearchGenerationException: Cannot create Hdfs
>> file-system for uri [hdfs://localhost.localdomain:8020]
>> at org.elasticsearch.repositories.hdfs.HdfsRepository.(Unknown Source)
>> while locating org.elasticsearch.repositories.hdfs.HdfsRepository
>> while locating org.elasticsearch.repositories.Repository
>>
>> 1 error]; nested: ElasticsearchGenerationException[Cannot create Hdfs
>> file-system for uri [hdfs://localhost.localdomain:8020]]; nested:
>> RemoteException[Server IPC version 7 cannot communicate with client version
>> 4]; ",
>> "status": 500
>>
>> I noticed: RemoteException: Server IPC version 7 cannot communicate with
>> client version 4
>>
>> #2. Then I tried the hadoop2 version. So I installed the plugin with this command:
>> bin/plugin --install
>> elasticsearch/elasticsearch-repository-hdfs/2.0.0-hadoop2
>>
>> I sent the same PUT request as above; this time I got an even stranger
>> exception:
>>
>> NoClassDefFoundError[org/apache/commons/cli/ParseException]
>> Here is the response:
>>
>> {
>> "error": "RepositoryException[[hdfs_repo] failed to create repository];
>> nested: CreationException[Guice creation errors:
>>
>> 1) Error injecting constructor, java.lang.NoClassDefFoundError:
>> org/apache/commons/cli/ParseException
>> at org.elasticsearch.repositories.hdfs.HdfsRepository.(Unknown Source)
>> while locating org.elasticsearch.repositories.hdfs.HdfsRepository
>> while locating org.elasticsearch.repositories.Repository
>>
>> 1 error]; nested:
>> NoClassDefFoundError[org/apache/commons/cli/ParseException]; nested:
>> ClassNotFoundException[org.apache.commons.cli.ParseException]; ",
>> "status": 500
>> }
>>
>> I wonder if anyone has had similar experiences. Note that the failed cases
>> are actually the more realistic deployment choices, because my Hadoop
>> cluster will most likely not be on the same node as my ES server.
>> Thanks,
>> Jack

--
You received this message because you are subscribed to the Google Groups "elasticsearch" group. To unsubscribe from this group and stop receiving emails from it, send an email to [email protected]. To view this discussion on the web visit https://groups.google.com/d/msgid/elasticsearch/e3bafea8-5fcc-4801-8c43-90056c7c05a4%40googlegroups.com. For more options, visit https://groups.google.com/d/optout.
