Hi,

I have been working on a MarkLogic tiered storage POC with HDFS as the storage 
layer for one of the tiers. I am trying to create a forest whose data directory 
is an HDFS directory.

I have one Hadoop cluster and one MarkLogic cluster. I downloaded the 
configuration files from Hadoop and copied them to the /usr/hadoop directory on 
the MarkLogic machine, and I have also downloaded the required jar files based 
on the documentation here:
https://docs.marklogic.com/guide/performance/disk-storage#id_27091

I have placed these in the /usr/hadoop directory as well, with the proper lib 
structure.
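For reference, the core-site.xml I copied over looks roughly like the fragment 
below. (The hostname and port here are placeholders, and fs.defaultFS is my 
assumption for a Hadoop 2.x client config; the real value must match the 
NameNode URI the forest will use.)

```xml
<!-- Hypothetical /usr/hadoop/core-site.xml; hostname and port are placeholders -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://namenode_hostname:8020</value>
  </property>
</configuration>
```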

I get the error below when I try to create the forest.

2015-03-12 19:17:20.087 Error: Automount Foresthadoop: SVC-HDFSNOT: HDFS not 
available for 'hdfs://{namenode_hostname}:8020/tmp': unknown error
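In case it helps, this is the kind of basic connectivity check I plan to run 
next from the MarkLogic host, to rule out network or firewall issues before 
digging further into MarkLogic itself (the hostname is a placeholder):

```shell
#!/usr/bin/env bash
# Quick TCP reachability check for the NameNode RPC port.
# Uses bash's /dev/tcp; returns 0 only if the port accepts a connection.
check_port() {
  local host=$1 port=$2
  timeout 5 bash -c "exec 3<>/dev/tcp/${host}/${port}" 2>/dev/null
}

# "namenode_hostname" is a placeholder for the actual NameNode host.
if check_port namenode_hostname 8020; then
  echo "NameNode port reachable"
else
  echo "NameNode port NOT reachable"
fi
```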

I tried changing the log level to finest in the group configuration, and I have 
also added trace events for the forest, but I am not able to get any additional 
details that would point me to the cause of the error.

Any help in this regard would be appreciated. Please also let me know if there 
are any other ways to connect to HDFS as a forest data directory.
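For completeness: would a Management REST API request (POST to 
/manage/v2/forests) with a payload along these lines be the right way to point 
a forest at HDFS? The forest name, host, and NameNode hostname below are 
placeholders, not my actual values.

```json
{
  "forest-name": "hadoop-tier-forest",
  "host": "marklogic-host-1",
  "data-directory": "hdfs://namenode_hostname:8020/tmp"
}
```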



Thanks,
Sudheer


_______________________________________________
General mailing list
[email protected]
http://developer.marklogic.com/mailman/listinfo/general
