<property>
  <name>keep.task.files.pattern</name>
  <value>.*_m_123456_0</value>
  <description>Keep all files from tasks whose task names match the given
  regular expression. Defaults to none.</description>
</property>
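For reference, a minimal sketch of setting this from a job driver with the 0.18-era mapred API. KeepTaskFilesExample is a hypothetical driver class, and this assumes JobConf's setKeepTaskFilesPattern() accessor for the key above; setting the raw key with conf.set() is the fallback.

import org.apache.hadoop.mapred.JobConf;

public class KeepTaskFilesExample {
    public static void main(String[] args) {
        // Hypothetical driver: keep the intermediate files of map task
        // 123456, attempt 0, in mapred.local.dir after the job finishes.
        JobConf conf = new JobConf(KeepTaskFilesExample.class);
        conf.setKeepTaskFilesPattern(".*_m_123456_0");
        // Equivalent raw form:
        // conf.set("keep.task.files.pattern", ".*_m_123456_0");
        // Related: conf.setKeepFailedTaskFiles(true) keeps files for
        // failed tasks only, without matching on a pattern.
    }
}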
On Sun, Nov 22, 2009 at 11:46 PM, Amogh Vasekar <am...@yahoo-inc.com> wrote:

> Hi,
> keep.task.files.pattern is what you need; as the name suggests, it is a
> pattern match on the intermediate outputs generated.
>
> With regard to copying map data to HDFS, your mapper's close() method
> should help you achieve this, but it might slow down your tasks.
>
> Amogh
>
>
> On 11/23/09 8:08 AM, "Jeff Zhang" <zjf...@gmail.com> wrote:
>
> Hi Jason,
>
> Which option disables the removal of intermediate data?
>
> Thank you
>
> Jeff Zhang
>
>
> On Mon, Nov 23, 2009 at 10:27 AM, Jason Venner <jason.had...@gmail.com> wrote:
>
> > You can manually write the map output to a new file; there are a number
> > of examples of opening a sequence file and writing to it on the web and
> > in the example code for various Hadoop books.
> >
> > You can also disable the removal of intermediate data, which will result
> > in potentially large amounts of data being left in the mapred.local.dir.
> >
> >
> > On Sun, Nov 22, 2009 at 3:56 PM, Gordon Linoff <glin...@gmail.com> wrote:
> >
> > > I am starting to learn Hadoop, using the Yahoo virtual machine with
> > > version 0.18.
> > >
> > > My question is rather simple. I would like to execute a map/reduce job.
> > > In addition to getting the results from the reduce, I would also like
> > > to save the intermediate results from the map in another HDFS file. Is
> > > this possible?
> > >
> > > --gordon
> >
> >
> > --
> > Pro Hadoop, a book to guide you from beginner to hadoop mastery,
> > http://www.amazon.com/dp/1430219424?tag=jewlerymall
> > www.prohadoopbook.com a community for Hadoop Professionals

--
Pro Hadoop, a book to guide you from beginner to hadoop mastery,
http://www.amazon.com/dp/1430219424?tag=jewlerymall
www.prohadoopbook.com a community for Hadoop Professionals
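Pulling Jason's and Amogh's suggestions together, here is a minimal sketch, against the 0.18-era mapred API, of a mapper that writes a copy of each output record to a SequenceFile on HDFS and closes it in close(). SavingMapper, the /tmp/map-side-output path, and the pass-through map logic are illustrative assumptions, not from the thread.

import java.io.IOException;

import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reporter;

public class SavingMapper extends MapReduceBase
        implements Mapper<LongWritable, Text, Text, Text> {

    private SequenceFile.Writer sideWriter;

    @Override
    public void configure(JobConf conf) {
        try {
            FileSystem fs = FileSystem.get(conf);
            // One side file per task attempt; /tmp/map-side-output is a
            // hypothetical directory, and mapred.task.id (the task
            // attempt id) keeps the file names unique.
            Path out = new Path("/tmp/map-side-output",
                    conf.get("mapred.task.id"));
            sideWriter = SequenceFile.createWriter(
                    fs, conf, out, Text.class, Text.class);
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public void map(LongWritable key, Text value,
            OutputCollector<Text, Text> output, Reporter reporter)
            throws IOException {
        // Placeholder map logic: pass the input line through unchanged.
        Text outKey = new Text("line");
        sideWriter.append(outKey, value); // copy kept on HDFS
        output.collect(outKey, value);    // normal map output
    }

    @Override
    public void close() throws IOException {
        sideWriter.close(); // flush the side file, per Amogh's suggestion
    }
}

Keying the side file name on mapred.task.id matters: speculative or retried task attempts would otherwise try to write the same HDFS file, so duplicate attempt files may appear and need deduplication afterwards.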