We had the same issue. We're also running CDH 4.6, which expects a different hadoop client.
We fixed this by grabbing the source, removing the `exclude module: "commons-cli"` line from repository-hdfs/build.gradle, setting `hadoop2Version = 2.0.0-cdh4.6.0` in gradle.properties (and setting the esVersion and luceneVersions properties for good measure), and building our own zip:

    cd repository-hdfs/
    ../gradlew -Pdistro=hadoopYarn distZip

You can install the result with:

    bin/plugin -u file:///<path>/elasticsearch-repository-hdfs-2.1.0.BUILD-SNAPSHOT-hadoop2.zip -i elasticsearch-repository-hdfs

Hope this helps.

-brent

On Monday, July 7, 2014 3:42:04 PM UTC-6, Jinyuan Zhou wrote:
>
> I am using Elasticsearch 1.2.1 and the CDH 4.6 quick-start VM. My ES
> server is installed on the same VM.
>
> I have one successful scenario: with the light version of the plugin, I
> added the output of `hadoop classpath` to ES_CLASSPATH and it worked.
>
> But I encountered errors with both the default version and the hadoop2
> version. Here are the details.
>
> #1. I installed the plugin with this command:
>
>     bin/plugin --install elasticsearch/elasticsearch-repository-hdfs/2.0.0
>
> and sent a PUT request to http://localhost:9200/_snapshot/hdfs_repo with
> this body:
>
>     {
>       "type": "hdfs",
>       "settings": {
>         "uri": "hdfs://localhost.localdomain:8020",
>         "path": "/user/cloudera/es_snapshot"
>       }
>     }
>
> I got this response:
>
>     {
>       "error": "RepositoryException[[hdfs_repo] failed to create repository];
>         nested: CreationException[Guice creation errors:
>         1) Error injecting constructor,
>         org.elasticsearch.ElasticsearchGenerationException: Cannot create
>         Hdfs file-system for uri [hdfs://localhost.localdomain:8020]
>           at org.elasticsearch.repositories.hdfs.HdfsRepository.(Unknown Source)
>           while locating org.elasticsearch.repositories.hdfs.HdfsRepository
>           while locating org.elasticsearch.repositories.Repository
>         1 error];
>         nested: ElasticsearchGenerationException[Cannot create Hdfs
>         file-system for uri [hdfs://localhost.localdomain:8020]];
>         nested: RemoteException[Server IPC version 7 cannot communicate
>         with client version 4]; ",
>       "status": 500
>     }
>
> I noticed: RemoteException: Server IPC version 7 cannot communicate with
> client version 4.
>
> #2. Then I tried the hadoop2 version, installing the plugin with:
>
>     bin/plugin --install elasticsearch/elasticsearch-repository-hdfs/2.0.0-hadoop2
>
> I sent the same PUT request as above, and this time I got an even
> stranger exception: NoClassDefFoundError[org/apache/commons/cli/ParseException].
> Here is the response:
>
>     {
>       "error": "RepositoryException[[hdfs_repo] failed to create repository];
>         nested: CreationException[Guice creation errors:
>         1) Error injecting constructor, java.lang.NoClassDefFoundError:
>         org/apache/commons/cli/ParseException
>           at org.elasticsearch.repositories.hdfs.HdfsRepository.(Unknown Source)
>           while locating org.elasticsearch.repositories.hdfs.HdfsRepository
>           while locating org.elasticsearch.repositories.Repository
>         1 error];
>         nested: NoClassDefFoundError[org/apache/commons/cli/ParseException];
>         nested: ClassNotFoundException[org.apache.commons.cli.ParseException]; ",
>       "status": 500
>     }
>
> I wonder if anyone has had similar experiences. Note that the failing
> cases are actually the more realistic deployment choices, because my
> Hadoop cluster will most likely not be on the same node as my ES server.
>
> Thanks,
> Jack

--
You received this message because you are subscribed to the Google Groups "elasticsearch" group. To unsubscribe from this group and stop receiving emails from it, send an email to [email protected]. To view this discussion on the web visit https://groups.google.com/d/msgid/elasticsearch/259f35c0-32f1-45fe-8b3c-0e4f69c30ca5%40googlegroups.com. For more options, visit https://groups.google.com/d/optout.
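[Editor's note] For readers who land here with the same "Server IPC version 7 cannot communicate with client version 4" mismatch: the working light-plugin setup Jack describes boils down to putting the CDH client jars on Elasticsearch's classpath before starting it. A rough sketch, assuming the `hadoop` CLI is on the PATH (the `/usr/lib/hadoop/*` fallback below is purely illustrative for machines without it):

    # Sketch: append the jars reported by `hadoop classpath` to
    # Elasticsearch's classpath so the node uses the matching HDFS client.
    # The fallback path after || is only an illustration, not a real default.
    HADOOP_CP="$(hadoop classpath 2>/dev/null || echo '/usr/lib/hadoop/*')"
    export ES_CLASSPATH="${ES_CLASSPATH:+$ES_CLASSPATH:}$HADOOP_CP"
    echo "$ES_CLASSPATH"

Start Elasticsearch from the same shell afterwards so the process inherits ES_CLASSPATH.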
