Hi all,
I fixed the previous issue, but now I am getting this:

Entry in /etc/fstab :
fuse_dfs#dfs://7071bcce81d9:54310 /home/jony/FreshHadoop/mnt fuse -oallow_other,rw,-ousetrash 0 0

$ sudo mount /home/jony/FreshHadoop/mnt
port=54310,server=7071bcce81d9
fuse-dfs didn't recognize /home/jony/FreshHadoop/mnt,-2
fuse-dfs ignoring option -oallow_other
fuse-dfs ignoring option -ousetrash
fuse-dfs ignoring option dev
fuse-dfs ignoring option suid
fuse: unknown option `-oallow_other'
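
(If fuse is reading the literal "-o" prefixes as option names, perhaps the options field should use plain option names instead; a sketch, not yet tested:)

fuse_dfs#dfs://7071bcce81d9:54310 /home/jony/FreshHadoop/mnt fuse allow_other,rw,usetrash 0 0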

I am able to mount with the "fuse_dfs_wrapper.sh" script. Any ideas?

Thanks
-----Original Message-----
From: Alexander Lorenz [mailto:wget.n...@googlemail.com] 
Sent: Wednesday, January 04, 2012 8:37 PM
To: hdfs-user@hadoop.apache.org
Subject: Re: Mounting HDFS

Hi Stuti,

Do a search for libhdfs.so* and also run ldd /path/to/fuse_dfs. It could be that
only a symlink is missing. With ldd you will see which libraries the binary
wants; if libhdfs.so.1 is not on the path, export the path where you found it.
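
(A sketch of the checks described above; the library paths shown are placeholders, not known locations on the box in question:)

$ find / -name 'libhdfs.so*' 2>/dev/null
$ ldd /sbin/fuse_dfs | grep -i hdfs
# if libhdfs shows up as "not found", either export the directory that holds it
$ export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/path/to/libhdfs/dir
# or add the missing symlink next to the real library
$ sudo ln -s /path/to/libhdfs.so.1 /path/to/libhdfs.so.0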

- Alex

Alexander Lorenz
http://mapredit.blogspot.com

On Jan 4, 2012, at 4:08 AM, Stuti Awasthi <stutiawas...@hcl.com> wrote:

> I have already exported it in the env. Output of the "export" command:
> 
> declare -x LD_LIBRARY_PATH="/usr/lib:/usr/local/lib:/home/jony/FreshHadoop/hadoop-0.20.2/build/libhdfs:/usr/lib/jvm/java-6-openjdk/jre/lib/i386/server/:/usr/lib/libfuse.so"
> 
> Stuti
> ________________________________________
> From: Harsh J [ha...@cloudera.com]
> Sent: Wednesday, January 04, 2012 5:34 PM
> To: hdfs-user@hadoop.apache.org
> Subject: Re: Mounting HDFS
> 
> Stuti,
> 
> Your env needs to carry this:
> 
> export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/path/to/dir/where/libhdfs/files/are/present
> 
> Otherwise the fuse_dfs binary won't be able to find and load it. The
> wrapper script does this as part of its setup, as you'll see if you read it.
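> 
> (A quick check that the loader can now resolve it, assuming fuse_dfs is the /sbin/fuse_dfs binary mentioned earlier in this thread; just an illustrative sketch:)
> 
> export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/build/libhdfs
> ldd /sbin/fuse_dfs | grep libhdfs    # should no longer report "not found"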
> 
> On Wed, Jan 4, 2012 at 5:29 PM, Stuti Awasthi <stutiawas...@hcl.com> wrote:
>> I'm able to mount using the command:
>> 
>> fuse_dfs_wrapper.sh dfs://<server>:<port> /export/hdfs
>> 
>> -----Original Message-----
>> From: Stuti Awasthi
>> Sent: Wednesday, January 04, 2012 5:24 PM
>> To: hdfs-user@hadoop.apache.org
>> Subject: RE: Mounting HDFS
>> 
>> Harsh,
>> 
>> Output of $ file `which fuse_dfs`:
>> 
>> /sbin/fuse_dfs: ELF 32-bit LSB executable, Intel 80386, version 1 (SYSV), dynamically linked (uses shared libs), for GNU/Linux 2.6.15, not stripped
>> 
>> Same output for $ file /sbin/fuse_dfs
>> 
>> Thanks
>> ________________________________________
>> From: Harsh J [ha...@cloudera.com]
>> Sent: Wednesday, January 04, 2012 5:18 PM
>> To: hdfs-user@hadoop.apache.org
>> Subject: Re: Mounting HDFS
>> 
>> Stuti,
>> 
>> My original command was "file `which fuse_dfs`", and not just the which 
>> command.
>> 
>> Can you run "file /sbin/fuse_dfs"? You need the utility called 'file'
>> available (it's usually present).
>> 
>> On 04-Jan-2012, at 5:08 PM, Stuti Awasthi wrote:
>> 
>>> Hi Harsh,
>>> 
>>> Currently I am using 32-bit Ubuntu 11.10, Hadoop 0.20.2
>>> 
>>> Output of : $ which fuse_dfs
>>> /sbin/fuse_dfs
>>> 
>>> I searched on the net and found this URL: http://wiki.apache.org/hadoop/MountableHDFS
>>> How can I get HDFS fuse deb or rpm packages? Thanks for pointing this out;
>>> can you please guide me further on this?
>>> 
>>> Thanks
>>> 
>>> -----Original Message-----
>>> From: Harsh J [mailto:ha...@cloudera.com]
>>> Sent: Wednesday, January 04, 2012 4:51 PM
>>> To: hdfs-user@hadoop.apache.org
>>> Subject: Re: Mounting HDFS
>>> 
>>> Stuti,
>>> 
>>> What's your platform: 32-bit or 64-bit? Which one have you built libhdfs
>>> for?
>>> 
>>> What's the output of the following?
>>> $ file `which fuse_dfs`
>>> 
>>> FWIW, the most hassle-free way to do these things today is to use the proper
>>> packages available for your platform, instead of compiling them yourself.
>>> Just a suggestion.
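>>> 
>>> (As one hedged illustration of what such packages look like: some third-party
>>> Hadoop distributions of that era shipped a fuse-dfs package; on a Debian/Ubuntu
>>> system the install would look roughly like the line below, though the exact
>>> package name is an assumption and depends on the distribution:)
>>> 
>>> sudo apt-get install hadoop-0.20-fuse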
>>> 
>>> On 04-Jan-2012, at 4:28 PM, Stuti Awasthi wrote:
>>> 
>>>> Hi All,
>>>> 
>>>> I am following http://wiki.apache.org/hadoop/MountableHDFS for mounting HDFS.
>>>> I have successfully followed the steps up to "Installing" and am able to mount
>>>> it properly. After that I tried the "Deploying" section and followed these steps:
>>>> 
>>>> 1. add the following to /etc/fstab
>>>> fuse_dfs#dfs://hadoop_server.foo.com:9000 /export/hdfs fuse -oallow_other,rw,-ousetrash 0 0
>>>> 
>>>> 2. added fuse_dfs to /sbin
>>>> 
>>>> 3. export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/build/libhdfs
>>>> 
>>>> 4. Mount using: mount /export/hdfs.
>>>> 
>>>> But I am getting this error:
>>>> fuse_dfs: error while loading shared libraries: libhdfs.so.0: cannot open shared object file: No such file or directory
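>>>> 
>>>> (For reference, a hedged way to check where libhdfs.so.0 actually ended up
>>>> after the build and whether fuse_dfs can resolve it; the build-tree path is
>>>> the one from step 3 and may differ on other setups:)
>>>> 
>>>> find $HADOOP_HOME/build -name 'libhdfs.so*'
>>>> ldd /sbin/fuse_dfs | grep libhdfs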
>>>> 
>>>> How can I fix this?
>>>> 
>>>> Thanks
>>>> 
>>> 
>> 
> 
> 
> 
> --
> Harsh J
