I thought this might be because Hadoop wants to pack everything (including the -files DFS cache files) into a single jar, so I removed the -files arguments I had,
but it still extracts the jar. This is rather confusing.

On Fri, Oct 24, 2014 at 11:51 AM, Yang <[email protected]> wrote:

> I just noticed that when I run "hadoop jar
> my-fat-jar-with-all-dependencies.jar", it unjars the job jar in
> /tmp/hadoop-username/hadoop-unjar-xxxx/ and extracts all the classes
> in there.
>
> The fat jar is pretty big, so it took up a lot of space (particularly
> inodes) and ran out of quota.
>
> I wonder why we have to unjar these classes on the **client node**?
> The jar won't even be accessed until it reaches the compute nodes,
> right?
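
Following up once more, for the archives: the unpacking is done by org.apache.hadoop.util.RunJar, which is the class the "hadoop jar" command actually runs. Below is a simplified, self-contained sketch of what I understand RunJar to do -- a paraphrase from reading the source, not a verbatim copy. The real class reads hadoop.tmp.dir from the Configuration rather than hard-coding /tmp/hadoop-${user.name}, and it registers a shutdown hook that deletes the work directory when the client JVM exits.

// Simplified sketch of what org.apache.hadoop.util.RunJar does when you
// type "hadoop jar my-fat-jar.jar" -- a paraphrase, not the Hadoop source.
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
import java.lang.reflect.Method;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;
import java.nio.file.StandardCopyOption;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Enumeration;
import java.util.List;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;

public class RunJarSketch {

    public static void main(String[] args) throws Exception {
        File jarFile = new File(args[0]);

        // hadoop.tmp.dir defaults to /tmp/hadoop-${user.name}; this is
        // where the hadoop-unjar-xxxx directories come from.
        File tmpDir = new File("/tmp/hadoop-" + System.getProperty("user.name"));
        tmpDir.mkdirs();
        File workDir = File.createTempFile("hadoop-unjar", "", tmpDir);
        workDir.delete();
        workDir.mkdirs();
        // (The real RunJar registers a shutdown hook to delete workDir.)

        unJar(jarFile, workDir); // extract every entry onto local disk

        // Build a classloader over the extracted tree so that classes/ and
        // any nested lib/*.jar dependencies become visible: a plain JVM
        // classloader cannot read a jar nested inside another jar, which
        // is why the fat jar gets expanded on the client at all.
        List<URL> classPath = new ArrayList<>();
        classPath.add(workDir.toURI().toURL());
        classPath.add(new File(workDir, "classes/").toURI().toURL());
        File[] libs = new File(workDir, "lib").listFiles();
        if (libs != null) {
            for (File lib : libs) {
                classPath.add(lib.toURI().toURL());
            }
        }
        ClassLoader loader = new URLClassLoader(classPath.toArray(new URL[0]));
        Thread.currentThread().setContextClassLoader(loader);

        // Run the jar's Main-Class (your driver) in this client-side JVM.
        String mainClassName;
        try (JarFile jar = new JarFile(jarFile)) {
            mainClassName =
                jar.getManifest().getMainAttributes().getValue("Main-Class");
        }
        Class<?> mainClass = Class.forName(mainClassName, true, loader);
        Method main = mainClass.getMethod("main", String[].class);
        main.invoke(null, (Object) Arrays.copyOfRange(args, 1, args.length));
    }

    // Minimal jar extraction, analogous to RunJar.unJar().
    private static void unJar(File jarFile, File toDir) throws IOException {
        try (JarFile jar = new JarFile(jarFile)) {
            Enumeration<JarEntry> entries = jar.entries();
            while (entries.hasMoreElements()) {
                JarEntry entry = entries.nextElement();
                File out = new File(toDir, entry.getName());
                if (entry.isDirectory()) {
                    out.mkdirs();
                } else {
                    out.getParentFile().mkdirs();
                    try (InputStream in = jar.getInputStream(entry)) {
                        Files.copy(in, out.toPath(),
                                   StandardCopyOption.REPLACE_EXISTING);
                    }
                }
            }
        }
    }
}

If this sketch is right, it answers the original question: the driver's main() runs in the client JVM, and its nested lib/*.jar dependencies are only loadable after extraction, so the unjar has to happen on the client. It also suggests a workaround for the quota problem: pointing hadoop.tmp.dir (e.g. in core-site.xml) at a filesystem with more space and inodes should move the hadoop-unjar-xxxx directories there.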
