Just do:

$ echo -e "DIR\t/foo/bar/directory" > file
$ hadoop dfs -put file hfile

And you got yourself a file.
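
If you need to do the same thing from inside the program (as Jim describes below) rather than from the shell, something along these lines with the FileSystem API should work. This is an untested sketch: the class name, the "hfile" path, and taking the directory from args[0] are placeholders; the rest is the stock API.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class SeedFileWriter {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);            // the configured DFS

    Path seed = new Path("hfile");                   // same target as the -put above
    FSDataOutputStream out = fs.create(seed, true);  // overwrite if it exists

    // KeyValueTextInputFormat splits each line at the first tab:
    // key = "DIR", value = whatever path was passed in.
    out.writeBytes("DIR\t" + args[0] + "\n");
    out.close();
  }
}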

On 12/17/07 7:10 PM, "Jim the Standing Bear" <[EMAIL PROTECTED]> wrote:

> Hi Ted,
> 
> I guess I didn't make it clear enough.  I don't have a file to start
> with.  When I run the program, I pass in an argument.  The program,
> before doing its map/red jobs, is supposed to create a file on the
> DFS, and save whatever I just passed in.  And my trouble is, I am not
> sure how to create such a file so that both the key and value are
> plain Text, and they can subsequently be read by
> KeyValueTextInputFormat.
> 
> On Dec 17, 2007 10:07 PM, Ted Dunning <[EMAIL PROTECTED]> wrote:
>> 
>> 
>> I thought that was what your input file already was.  The
>> KeyValueTextInputFormat should read your input as-is.
>> 
>> When you write out your intermediate values, just make sure that you use
>> TextOutputFormat and put "DIR" as the key and the directory name as the
>> value (same with files).
>> 
>> 
>> 
>> On 12/17/07 6:46 PM, "Jim the Standing Bear" <[EMAIL PROTECTED]> wrote:
>> 
>>> With KeyValueTextInputFormat, the problem is not reading it - I know
>>> how to set the separator byte and all that... my problem is with
>>> creating the very first file - I simply don't know how.
>> 
>> 
> 
> 
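
As for the intermediate values Ted mentions in the quoted message above, a map function along these lines keeps both key and value as Text, so that TextOutputFormat writes them back out as key<TAB>value lines and KeyValueTextInputFormat can read them again on the next pass. Again only a sketch: the class name and the pass-through logic are placeholders, and the generic interfaces are how the old org.apache.hadoop.mapred API looks in more recent releases.

import java.io.IOException;

import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reporter;

public class ListingMapper extends MapReduceBase
    implements Mapper<Text, Text, Text, Text> {

  // With KeyValueTextInputFormat, key is the text before the first tab
  // ("DIR" or "FILE") and value is the rest of the line (the path).
  public void map(Text key, Text value,
                  OutputCollector<Text, Text> output, Reporter reporter)
      throws IOException {
    // ... list the directory, decide what to emit, etc. ...
    // Re-emit "DIR" as the key and the path as the value; TextOutputFormat
    // turns each pair back into a DIR<TAB>path line.
    output.collect(new Text("DIR"), new Text(value.toString()));
  }
}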
