We are running Hadoop with multiple users, and the DFS uses a shared
data directory as described in FAQ #13.  Is there a way to have Hadoop
use a different classpath per job?
 
Currently, if I start up the Hadoop instance with no script
modifications and then run a job with "bin/hadoop <classname>
<params>", I get a ClassNotFoundException.  Setting the
HADOOP_CLASSPATH variable to include my user's classes and libraries
doesn't seem to work.  The only way I can get it to work is to shut
down Hadoop and start it back up with my user's HADOOP_CLASSPATH set.
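 
For reference, here's roughly what I'm trying (the paths and class
name below are placeholders for my actual setup):
 
    # setting the variable before submitting the job doesn't help:
    export HADOOP_CLASSPATH=/home/xavier/myjob/classes:/home/xavier/myjob/lib/myjob.jar
    bin/hadoop com.example.MyJob <params>   # still throws ClassNotFoundException
 
    # the only sequence that works: restart the whole instance with
    # the variable set before startup
    bin/stop-all.sh
    export HADOOP_CLASSPATH=/home/xavier/myjob/classes:/home/xavier/myjob/lib/myjob.jar
    bin/start-all.sh
    bin/hadoop com.example.MyJob <params>   # now my classes are found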
 
If this has been asked and answered before, feel free to just point me
to the previous thread.
 
Thanks,
 
-Xavier