On 04/04/2011 07:06 PM, Allen Wittenauer wrote:

On Apr 4, 2011, at 8:06 AM, Shuja Rehman wrote:

Hi All

I have created a MapReduce job and, to run it on the cluster, I have
bundled all the jars (hadoop, hbase, etc.) into a single jar, which increases the
overall file size. During development I need to copy this complete file again and
again, which is very time consuming. Is there any way that I can copy only the
program jar and not the lib files every time? I am using NetBeans to develop
the program.

Kindly let me know how to solve this issue.

        This was in the FAQ, but in a non-obvious place.  I've updated it to be 
more visible (hopefully):

http://wiki.apache.org/hadoop/FAQ#How_do_I_submit_extra_content_.28jars.2C_static_files.2C_etc.29_for_my_job_to_use_during_runtime.3F
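One route the FAQ describes is shipping dependency jars at submit time with the generic -libjars option instead of bundling them into the job jar. For -libjars to be picked up, the driver has to go through GenericOptionsParser, typically by implementing Tool. A minimal driver sketch (class name and job name are placeholders, not from the thread):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.conf.Configured;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.util.Tool;
    import org.apache.hadoop.util.ToolRunner;

    public class MyDriver extends Configured implements Tool {

      @Override
      public int run(String[] args) throws Exception {
        // getConf() already reflects generic options such as -libjars and -D
        Job job = new Job(getConf(), "my-job");
        job.setJarByClass(MyDriver.class);
        // ... set mapper, reducer, input/output paths here ...
        return job.waitForCompletion(true) ? 0 : 1;
      }

      public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new Configuration(), new MyDriver(), args));
      }
    }

With that in place the program jar stays small and the dependencies are listed on the command line, e.g. (paths are illustrative):

    hadoop jar myjob.jar MyDriver -libjars /local/path/hbase.jar,/local/path/lucene-core.jar input output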

Does the same apply to a jar containing libraries? Let's suppose I need lucene-core.jar to run my project. Can I put this jar into my job jar and have Hadoop "see" Lucene's classes? Or should I use the distributed cache?

MD
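
Both options in that question are workable: jars placed in a lib/ subdirectory inside the job jar are added to the task classpath when the job jar is unpacked, or the jar can be pushed through the DistributedCache so it doesn't have to be re-bundled on every build. A sketch of the DistributedCache route, assuming lucene-core.jar has already been copied to HDFS at the (hypothetical) path shown:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.filecache.DistributedCache;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.mapreduce.Job;

    public class SubmitWithLucene {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Adds the jar (which must already live on HDFS) to the classpath
        // of every map and reduce task. Must be done before the Job is created,
        // since the Job copies the Configuration at construction time.
        DistributedCache.addFileToClassPath(new Path("/libs/lucene-core.jar"), conf);

        Job job = new Job(conf, "lucene-job");
        job.setJarByClass(SubmitWithLucene.class);
        // ... rest of the job setup ...
        System.exit(job.waitForCompletion(true) ? 0 : 1);
      }
    }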
