Hi there,
      The previous problem has been solved by adding the directory of "fuse_dfs" to PATH.
      Then a new problem followed. When I run "fuse_dfs_wrapper.sh dfs://master:9000 /mnt/fuse -d", there are three errors in the logs, as shown below:

[root@master fuse-dfs]# ./fuse_dfs_wrapper.sh dfs://master:9000 /mnt/fuse -d
INFO /opt/hadoop-2.7.0-src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/fuse_options.c:164 Adding FUSE arg /mnt/fuse
INFO /opt/hadoop-2.7.0-src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/fuse_options.c:115 Ignoring option -d
FUSE library version: 2.9.4
nullpath_ok: 0
nopath: 0
utime_omit_ok: 0
unique: 1, opcode: INIT (26), nodeid: 0, insize: 56, pid: 0
INIT: 7.14
flags=0x0000f07b
max_readahead=0x00020000
INFO /opt/hadoop-2.7.0-src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/fuse_init.c:98 Mounting with options: [ protected=(NULL), nn_uri=hdfs://master:9000, nn_port=0, debug=0, read_only=0, initchecks=0, no_permissions=0, usetrash=0, entry_timeout=60, attribute_timeout=60, rdbuffer_size=10485760, direct_io=0 ]
loadFileSystems error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
hdfsConfGetInt(hadoop.fuse.timer.period): new Configuration error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
Unable to determine the configured value for hadoop.fuse.timer.period.
ERROR /opt/hadoop-2.7.0-src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/fuse_init.c:134 FATAL: dfs_init: fuseConnectInit failed with error -22!
ERROR /opt/hadoop-2.7.0-src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/fuse_init.c:34 LD_LIBRARY_PATH=/usr/lib/jdk/jre/lib/amd64/server:/usr/lib/hadoop/c++/Linux-amd64-64/lib:/usr/local/lib:/usr/lib:
ERROR /opt/hadoop-2.7.0-src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/fuse_init.c:35 CLASSPATH=:/usr/lib/phoenix:find:/usr/lib/hadoop:-name:*.jar


What is wrong with my fuse-dfs? Any ideas will be appreciated.
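
I notice the CLASSPATH printed at fuse_init.c:35 still contains the literal words "find", "-name" and "*.jar", so perhaps the find command that should expand the Hadoop jars was never actually executed. Below is only a rough sketch of setting the environment by hand before calling the wrapper; the /usr/lib/hadoop and /usr/lib/jdk locations are taken from the log above and the configuration directory is an assumption, so all paths may need adjusting:

  # Sketch only: expand the Hadoop jars into CLASSPATH explicitly instead of
  # relying on an unexpanded `find` command. HADOOP_HOME/JAVA_HOME are
  # assumptions based on the log above; adjust to the real installation.
  export JAVA_HOME=/usr/lib/jdk
  export HADOOP_HOME=/usr/lib/hadoop
  # LD_LIBRARY_PATH copied from the value printed at fuse_init.c:34.
  export LD_LIBRARY_PATH="/usr/lib/jdk/jre/lib/amd64/server:/usr/lib/hadoop/c++/Linux-amd64-64/lib:/usr/local/lib:/usr/lib"

  # Start with the directory holding core-site.xml/hdfs-site.xml (assumed),
  # then append every jar found under HADOOP_HOME, joined with ':'.
  CLASSPATH="$HADOOP_HOME/etc/hadoop"
  for jar in $(find "$HADOOP_HOME" -name '*.jar'); do
      CLASSPATH="$CLASSPATH:$jar"
  done
  export CLASSPATH

  ./fuse_dfs_wrapper.sh dfs://master:9000 /mnt/fuse -d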












Best wishes.


San.Luo
Celloud

Mailbox:[email protected]
Website:www.celloud.org
Address: 2109, Building 4, Zhubang 2000, Chao Yang Road, Chao Yang District, Beijing, China
Zip Code: 100020








On 2015-07-24 at 13:56, "罗辉" <[email protected]> wrote:


Hi Chris and all:
        I am also trying to access and operate on HDFS from a Perl application via fuse-dfs. How can I do this successfully?
        I installed hadoop-2.7.0-src and successfully ran "mvn clean package -Pnative -Drequire.fuse=true -DskipTests -Dmaven.javadoc.skip=true". However, running "fuse_dfs_wrapper.sh dfs://master:9000 /export/hdfs -d" failed with the error message "./fuse_dfs_wrapper.sh: line 54: fuse_dfs: command not found". I checked line 54 of fuse_dfs_wrapper.sh; it is "fuse_dfs $@".
        In the hadoop-2.7.0 version, there is no "fuse_dfs.sh" in the path "hadoop-2.7.0-src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs".
        Any ideas on how to solve this problem?
        Thanks.
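
(As noted in the follow-up at the top of this thread, this particular error went away after adding the directory containing the built fuse_dfs binary to PATH. A minimal sketch, assuming the source tree sits at /opt/hadoop-2.7.0-src as the log paths suggest; the exact directory the Maven build leaves the binary in can vary between builds:)

  # Sketch: locate the fuse_dfs binary produced by the Maven build and put
  # its directory on PATH before calling the wrapper script.
  FUSE_DFS_BIN=$(find /opt/hadoop-2.7.0-src -type f -name fuse_dfs | head -n 1)
  export PATH="$(dirname "$FUSE_DFS_BIN"):$PATH"

  cd /opt/hadoop-2.7.0-src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs
  ./fuse_dfs_wrapper.sh dfs://master:9000 /export/hdfs -d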













Best wishes.


San.Luo
Celloud










On 2015-07-23 at 03:18, "Chris Nauroth" <[email protected]> wrote:

The only fuse-dfs documentation I'm aware of is here:


https://github.com/apache/hadoop/tree/release-2.7.1/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/doc


(This is a link into the source for the recent 2.7.1 release.)


Unfortunately, this is somewhat outdated.  I can tell because the build command 
shows ant, but we've converted to Maven.  Running this command would build it:


mvn clean package -Pnative -Drequire.fuse=true -DskipTests 
-Dmaven.javadoc.skip=true


If you need more information on setting up a Hadoop build environment, see 
BUILDING.txt in the root of the project's source tree.


--Chris Nauroth




From: Caesar Samsi <[email protected]>
Date: Wednesday, July 22, 2015 at 11:20 AM
To: Chris Nauroth <[email protected]>, "[email protected]" 
<[email protected]>
Subject: RE: fuse-dfs



Hi Chris, and all,
 
Fuse-dfs is also of interest; would you have a getting-started link for it? The 
link I have is rather old, from version 0.x.
 
Thank you! Caesar.
 
From: Chris Nauroth [mailto:[email protected]]
Sent: Wednesday, July 22, 2015 11:56 AM
To: [email protected]
Subject: Re: hadoop-hdfs-fuse missing?


 
Hello Caesar,

 

Since this is a question specific to a vendor's distribution and how to consume 
their packaging, I recommend taking the question to the vendor's own forums.  
Questions about how to use fuse-dfs or build it from Apache source definitely 
would be a good fit for this list though.

 

Thank you!

 

--Chris Nauroth



 

From: Caesar Samsi <[email protected]>
Reply-To: "[email protected]" <[email protected]>
Date: Tuesday, July 21, 2015 at 5:10 PM
To: "[email protected]" <[email protected]>
Subject: hadoop-hdfs-fuse missing?

 

Hi,
 
I’m trying to install hadoop-hdfs-fuse package on a Ubuntu machine.
 
I’ve added the Cloudera repository: deb [arch=amd64] http://archive.cloudera.com/cm5/ubuntu/trusty/amd64/cm trusty-cm5 contrib
 
I’ve also run sudo apt-get update.
 
When I do sudo apt-get install hadoop-hdfs-fuse I get an “Unable to locate 
package” error.
 
Did I use the right repository? If not, what is the correct one?
 
Thank you, Caesar.
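 
(For reference, a generic way to check whether any configured repository actually publishes the package, before digging into vendor specifics; the package and repository names here are only the ones shown above:)

  # Refresh the package index, then ask apt which repositories (if any)
  # provide the package and which Cloudera entries are configured.
  sudo apt-get update
  apt-cache search hadoop-hdfs-fuse
  apt-cache policy hadoop-hdfs-fuse
  grep -r cloudera /etc/apt/sources.list /etc/apt/sources.list.d/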
 








