Re: how to integrate solr with HDFS HA

2013-08-25 Thread YouPeng Yang
Hi Greg
   Thanks for your response.
   It works.



2013/8/23 Greg Walters gwalt...@sherpaanalytics.com

 Finally something I can help with! I went through the same problems you're
 having a short while ago. Check out
 https://cwiki.apache.org/confluence/display/solr/Running+Solr+on+HDFS for
 most of the information you need and be sure to check the comments on the
 page as well.

 Here's an example from my working setup:

 **
   <directoryFactory name="DirectoryFactory" class="solr.HdfsDirectoryFactory">
     <bool name="solr.hdfs.blockcache.enabled">true</bool>
     <int name="solr.hdfs.blockcache.slab.count">1</int>
     <bool name="solr.hdfs.blockcache.direct.memory.allocation">true</bool>
     <int name="solr.hdfs.blockcache.blocksperbank">16384</int>
     <bool name="solr.hdfs.blockcache.read.enabled">true</bool>
     <bool name="solr.hdfs.blockcache.write.enabled">true</bool>
     <bool name="solr.hdfs.nrtcachingdirectory.enable">true</bool>
     <int name="solr.hdfs.nrtcachingdirectory.maxmergesizemb">16</int>
     <int name="solr.hdfs.nrtcachingdirectory.maxcachedmb">192</int>
     <str name="solr.hdfs.home">hdfs://nameservice1:8020/solr</str>
     <str name="solr.hdfs.confdir">/etc/hadoop/conf.cloudera.hdfs1</str>
   </directoryFactory>
 **
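 The example above targets a single NameNode address. For an HA cluster, the approach that seems to work is to point solr.hdfs.home at the logical nameservice (with no port) and to make sure solr.hdfs.confdir points at a Hadoop configuration directory whose hdfs-site.xml defines that nameservice. A minimal sketch of just the two properties that change, assuming the lklcluster nameservice from the exception below and a hypothetical /etc/hadoop/conf path:

 <!-- HA sketch: nameservice name taken from the exception below; confdir path is hypothetical -->
 <str name="solr.hdfs.home">hdfs://lklcluster/solr</str>
 <str name="solr.hdfs.confdir">/etc/hadoop/conf</str>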

 Thanks,
 Greg

 -Original Message-
 From: YouPeng Yang [mailto:yypvsxf19870...@gmail.com]
 Sent: Friday, August 23, 2013 1:16 AM
 To: solr-user@lucene.apache.org
 Subject: how to integrate solr with HDFS HA

 Hi all,
 I am trying to integrate Solr with HDFS HA. When I start the Solr server,
 an exception is thrown [1].
 I know this is because the hadoop.conf.Configuration in
 HdfsDirectoryFactory.java does not include the HA configuration.
 So I want to know: in Solr, is there any way to include my Hadoop HA
 configuration?



 [1]---
 Caused by: java.lang.IllegalArgumentException: java.net.UnknownHostException: lklcluster
     at org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:418)
     at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:164)
     at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:129)
     at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:415)
     at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:382)
     at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:123)
     at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2277)
     at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:86)
     at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2311)
     at org.apache.hadoop.fs.FileSystem$Cache.getUnique(FileSystem.java:2299)
     at org.apache.hadoop.fs.FileSystem.newInstance(FileSystem.java:364)
     at org.apache.solr.store.hdfs.HdfsDirectory.<init>(HdfsDirectory.java:59)
     at org.apache.solr.core.HdfsDirectoryFactory.create(HdfsDirectoryFactory.java:154)
     at org.apache.solr.core.CachingDirectoryFactory.get(CachingDirectoryFactory.java:350)
     at org.apache.solr.core.SolrCore.getNewIndexDir(SolrCore.java:256)
     at org.apache.solr.core.SolrCore.initIndex(SolrCore.java:469)
     at org.apache.solr.core.SolrCore.<init>(SolrCore.java:759)



RE: how to integrate solr with HDFS HA

2013-08-23 Thread Greg Walters
Finally something I can help with! I went through the same problems you're 
having a short while ago. Check out 
https://cwiki.apache.org/confluence/display/solr/Running+Solr+on+HDFS for most 
of the information you need and be sure to check the comments on the page as 
well.

Here's an example from my working setup:

**
  <directoryFactory name="DirectoryFactory" class="solr.HdfsDirectoryFactory">
    <bool name="solr.hdfs.blockcache.enabled">true</bool>
    <int name="solr.hdfs.blockcache.slab.count">1</int>
    <bool name="solr.hdfs.blockcache.direct.memory.allocation">true</bool>
    <int name="solr.hdfs.blockcache.blocksperbank">16384</int>
    <bool name="solr.hdfs.blockcache.read.enabled">true</bool>
    <bool name="solr.hdfs.blockcache.write.enabled">true</bool>
    <bool name="solr.hdfs.nrtcachingdirectory.enable">true</bool>
    <int name="solr.hdfs.nrtcachingdirectory.maxmergesizemb">16</int>
    <int name="solr.hdfs.nrtcachingdirectory.maxcachedmb">192</int>
    <str name="solr.hdfs.home">hdfs://nameservice1:8020/solr</str>
    <str name="solr.hdfs.confdir">/etc/hadoop/conf.cloudera.hdfs1</str>
  </directoryFactory>
**
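The confdir above is a Cloudera-managed Hadoop configuration directory. For an HA cluster, the hdfs-site.xml in that directory is what lets the HDFS client resolve the logical nameservice, so Solr can use a nameservice URI instead of a host:port. A minimal sketch of the HA-related properties, with hypothetical NameNode hostnames:

<!-- hdfs-site.xml HA sketch: hostnames are hypothetical placeholders -->
<property>
  <name>dfs.nameservices</name>
  <value>nameservice1</value>
</property>
<property>
  <name>dfs.ha.namenodes.nameservice1</name>
  <value>nn1,nn2</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.nameservice1.nn1</name>
  <value>namenode1.example.com:8020</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.nameservice1.nn2</name>
  <value>namenode2.example.com:8020</value>
</property>
<property>
  <name>dfs.client.failover.proxy.provider.nameservice1</name>
  <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
</property>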

Thanks,
Greg

-Original Message-
From: YouPeng Yang [mailto:yypvsxf19870...@gmail.com] 
Sent: Friday, August 23, 2013 1:16 AM
To: solr-user@lucene.apache.org
Subject: how to integrate solr with HDFS HA

Hi all,
I am trying to integrate Solr with HDFS HA. When I start the Solr server,
an exception is thrown [1].
I know this is because the hadoop.conf.Configuration in
HdfsDirectoryFactory.java does not include the HA configuration.
So I want to know: in Solr, is there any way to include my Hadoop HA
configuration?


[1]---
Caused by: java.lang.IllegalArgumentException: java.net.UnknownHostException: lklcluster
    at org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:418)
    at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:164)
    at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:129)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:415)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:382)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:123)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2277)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:86)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2311)
    at org.apache.hadoop.fs.FileSystem$Cache.getUnique(FileSystem.java:2299)
    at org.apache.hadoop.fs.FileSystem.newInstance(FileSystem.java:364)
    at org.apache.solr.store.hdfs.HdfsDirectory.<init>(HdfsDirectory.java:59)
    at org.apache.solr.core.HdfsDirectoryFactory.create(HdfsDirectoryFactory.java:154)
    at org.apache.solr.core.CachingDirectoryFactory.get(CachingDirectoryFactory.java:350)
    at org.apache.solr.core.SolrCore.getNewIndexDir(SolrCore.java:256)
    at org.apache.solr.core.SolrCore.initIndex(SolrCore.java:469)
    at org.apache.solr.core.SolrCore.<init>(SolrCore.java:759)