Hi Hemanth,

Thanks for your kind response and support. But what if I am using a third-party API that also uses java.io.File? I think there must be some way to use HDFS by default without changing the code!

Thanks,
Amit Kumar Verma
Verchaska Infotech Pvt. Ltd.



On 07/09/2010 03:18 PM, Hemanth Yamijala wrote:
Amit,

On Fri, Jul 9, 2010 at 3:00 PM, amit kumar verma <[email protected]> wrote:
Hi Hemanth,

Yeah, I have gone through the API documentation and there is no issue in accessing files from HDFS, but my concern is about an API that was already developed without Hadoop. What I mean is: I developed an application before I knew about Hadoop, but now I need to implement a grid environment, so I am looking at Hadoop.

So now the question is: how can I make the same code work with HDFS? Do I need to change my code and use the Hadoop API to access HDFS? If that is the case, the change will be major. Or is there any way the default java.io.File can be integrated with HDFS?

Did you get the issue?

Yes, I think I do. Unfortunately, AFAIK, there's no easy way out. If
your application previously used the Java I/O File APIs, it needs to
be migrated to the Hadoop FS API. If you are moving from a
non-distributed application to Hadoop for a reason (such as handling
scale), the investment will be well worth the effort, IMHO.
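That said, where a library you cannot modify insists on java.io.File, one stopgap is to stage the HDFS file onto local disk first and hand the local copy to the legacy code. This is only a sketch under assumptions: the paths are hypothetical, and the Configuration is assumed to pick up your cluster's core-site.xml from the classpath:

    import java.io.File;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class StageToLocal {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration(); // reads cluster settings from the classpath
        FileSystem fs = FileSystem.get(conf);

        // Hypothetical paths: copy the HDFS file down to local disk...
        Path hdfsPath = new Path("/data/input.dat");
        Path localPath = new Path("/tmp/input.dat");
        fs.copyToLocalFile(hdfsPath, localPath);

        // ...then hand the local copy to the java.io.File-based third-party API.
        File localFile = new File("/tmp/input.dat");
        System.out.println("Staged " + localFile.length() + " bytes for the legacy API");
      }
    }

Of course this doubles the I/O and only works for files that fit on local disk, so it is a bridge, not a substitute for migrating.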

Thanks,
Amit Kumar Verma
Verchaska Infotech Pvt. Ltd.



On 07/09/2010 02:47 PM, Hemanth Yamijala wrote:
Amit,

On Fri, Jul 9, 2010 at 2:39 PM, amit kumar verma <[email protected]> wrote:
Hi Hemanth,

The versions are the same, as I copied it to all the client machines.

I think I found a solution. As I read more about Hadoop and JNI, I learned
that I need to copy the JNI files to
HADOOP_INSTALLATION_DIR/lib/native/Linux-xxx-xxx. I thought my Linux machine
was Linux-i386-32, but then I found that the "org.apache.hadoop.util.PlatformName"
class tells you your machine type; mine is Linux-amd64-64, and as soon as I copied
the JNI files to that directory the errors stopped.
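(For anyone else hitting this: my understanding is that PlatformName just concatenates standard JVM system properties, so a minimal equivalent sketch is below; running the real class is the authoritative way to check.)

    // A sketch of how the platform string appears to be derived; my
    // understanding is that org.apache.hadoop.util.PlatformName combines
    // these same system properties. Run the real class to be sure.
    public class ShowPlatform {
      public static void main(String[] args) {
        String platform = System.getProperty("os.name") + "-"
            + System.getProperty("os.arch") + "-"
            + System.getProperty("sun.arch.data.model");
        System.out.println(platform); // e.g. Linux-amd64-64
      }
    }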

Though the full code is still not running, as I developed the application
using the java.io.File class, and I am still thinking about how to make
changes so that it can access HDFS! Do I need to change all my code with
respect to HDFS and rewrite it using the Hadoop FS API, or is there
another way?

To access files from HDFS, you should use the Hadoop FileSystem API.
Please take a look at the Javadoc and also a tutorial such as this:
http://developer.yahoo.com/hadoop/tutorial/module2.html#programmatically
for more information.
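As a quick taste of the API, here's a minimal sketch of reading a file from HDFS (the path is hypothetical, and the Configuration is assumed to find your cluster's settings on the classpath):

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataInputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsCat {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();   // picks up fs.default.name
        FileSystem fs = FileSystem.get(conf);       // handle to HDFS
        FSDataInputStream in = fs.open(new Path("/user/amit/input.txt")); // hypothetical path
        BufferedReader reader = new BufferedReader(new InputStreamReader(in));
        String line;
        while ((line = reader.readLine()) != null) {
          System.out.println(line);
        }
        reader.close();
      }
    }

The pattern is largely mechanical: FileSystem, Path and fs.open() replace File and FileReader, which is why the migration tends to be tedious rather than deep.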

It would be great if someone could advise on this.



Thanks,
Amit Kumar Verma
Verchaska Infotech Pvt. Ltd.



On 07/09/2010 02:04 PM, Hemanth Yamijala wrote:
Hi,

Possibly another silly question, but can you cross-check that the
versions of Hadoop on the client and the server are the same?

Thanks
hemanth

On Thu, Jul 8, 2010 at 10:57 PM, Allen Wittenauer <[email protected]> wrote:
On Jul 8, 2010, at 1:08 AM, amit kumar verma wrote:

     DistributedCache.addCacheFile("hdfs://*
     /192.168.0.153:50075*/libraries/mylib.so.1#mylib.so", conf);
Do you actually have asterisks in this?  If so, that's the problem.
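If the asterisks were just bold markers leaking in from a rich-text mail, a cleaned-up sketch of the call would look like the following. I've kept your original host and port, though whether 192.168.0.153:50075 is actually your NameNode's address depends on your fs.default.name; also note that addCacheFile takes a java.net.URI, not a String:

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.filecache.DistributedCache;

    public class CacheSetup {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // The fragment after '#' is the symlink name the task will see
        // in its working directory.
        DistributedCache.addCacheFile(
            new URI("hdfs://192.168.0.153:50075/libraries/mylib.so.1#mylib.so"), conf);
        // Symlink creation must be enabled for the '#mylib.so' link to appear.
        DistributedCache.createSymlink(conf);
      }
    }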

