Hello all,
I am running into a bit of trouble with a seemingly simple problem. I would
like to have a global variable, a byte array, that all of my map tasks have
access to. The best way I currently know of to do this is to have a file
sitting on the DFS and load it into each map task (note: the global variable
is very small, ~20 kB). My problem is that I can't seem to load any file from
the Hadoop DFS into my program via the API. I know that the
DistributedFileSystem class has to come into play, but for the life of me I
can't get it to work.
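
To make the goal concrete, here is roughly what I want to run in each map
task, if I can just figure out how to obtain a properly initialized FileSystem
handle for the DFS (the class name and path below are made up, not my real
code):

    // Sketch of the read I want to perform, assuming I can obtain
    // a working FileSystem handle ("fs") pointed at the DFS.
    import java.io.IOException;
    import org.apache.hadoop.fs.FSDataInputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class GlobalBytesLoader {
        // Load the small (~20 kB) global-variable file into a byte array.
        public static byte[] load(FileSystem fs, String dfsPath) throws IOException {
            Path p = new Path(dfsPath);
            byte[] data = new byte[(int) fs.getFileStatus(p).getLen()];
            FSDataInputStream in = fs.open(p);
            try {
                in.readFully(0, data);   // read the entire file into memory
            } finally {
                in.close();
            }
            return data;
        }
    }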

I noticed there is an initialize() method in the DistributedFileSystem class,
and I thought I would need to call that, but I'm unsure what the URI parameter
ought to be. I tried "localhost:50070", which stalled the system and
eventually threw a connection timeout error. I then tried calling
DistributedFileSystem.open() directly, but my program failed again, this time
with a NullPointerException. I'm assuming that stems from the fact that my DFS
object is not "initialized".
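
For reference, the failing attempts were roughly equivalent to the following
(a sketch from memory, not my exact code; the package name and file path are
approximate):

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.dfs.DistributedFileSystem;   // package may differ by release
    import org.apache.hadoop.fs.FSDataInputStream;
    import org.apache.hadoop.fs.Path;

    public class DfsAttempt {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            DistributedFileSystem dfs = new DistributedFileSystem();

            // Attempt 1: guessing at the URI. This stalled and eventually
            // failed with a connection timeout.
            dfs.initialize(new URI("localhost:50070"), conf);

            // Attempt 2: calling open() without a successful initialize().
            // This threw a NullPointerException.
            FSDataInputStream in = dfs.open(new Path("/some/path/on/dfs"));
        }
    }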

Does anyone have any information on how exactly one goes about
programmatically loading a file from the DFS? I would greatly appreciate any
help.

Cheers,
Sean M. Arietta
