Phantom wrote:
When I submit jobs in Hadoop, how do the physical class files get distributed
to the nodes on which the Map/Reduce jobs run? Is some kind of dynamic
class loading used, or are the jar files copied to the machines where they
are needed?

The job's jar file is copied to the task nodes and unpacked in each task's working directory. The classpath of the task JVM includes all jars in the job jar's lib/ directory, its classes/ directory, and the top-level directory of the unpacked jar.
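
For example, a job driver normally just tells the framework which jar holds the job's classes, and Hadoop ships that jar to the task nodes for you. A minimal sketch using the classic org.apache.hadoop.mapred API (the class and job names are made up, and details vary between Hadoop versions):

    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.FileInputFormat;
    import org.apache.hadoop.mapred.FileOutputFormat;
    import org.apache.hadoop.mapred.JobClient;
    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.mapred.lib.IdentityMapper;
    import org.apache.hadoop.mapred.lib.IdentityReducer;

    public class PassThroughJob {
      public static void main(String[] args) throws Exception {
        // Passing this class to JobConf tells Hadoop which jar contains the
        // job's classes; the framework copies that jar out to the task nodes
        // and unpacks it in each task's working directory, as described above.
        JobConf conf = new JobConf(PassThroughJob.class);
        conf.setJobName("pass-through");

        // Built-in mapper/reducer, so this sketch is self-contained.
        conf.setMapperClass(IdentityMapper.class);
        conf.setReducerClass(IdentityReducer.class);

        // Default TextInputFormat produces LongWritable keys and Text values,
        // which the identity mapper/reducer pass straight through.
        conf.setOutputKeyClass(LongWritable.class);
        conf.setOutputValueClass(Text.class);

        FileInputFormat.setInputPaths(conf, new Path(args[0]));
        FileOutputFormat.setOutputPath(conf, new Path(args[1]));

        // Submit the job; distributing and unpacking the jar is Hadoop's job.
        JobClient.runJob(conf);
      }
    }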

Doug
