I've got some temporary files on DFS that get used by the DistributedCache mechanism (they're zipped JRuby files). Once the job is done, they can be deleted. Is there a way to tell Hadoop that, so it cleans them up automatically? Right now I'm deleting them myself at the end of the job, but that code isn't guaranteed to execute.
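For reference, the end-of-job cleanup I have now looks roughly like the sketch below (paths and job setup are simplified placeholders, not my real code). The problem is the finally block never runs if the driver JVM is killed partway through.

    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.filecache.DistributedCache;
    import org.apache.hadoop.mapred.JobClient;
    import org.apache.hadoop.mapred.JobConf;

    public class JRubyJobDriver {
      public static void main(String[] args) throws Exception {
        JobConf conf = new JobConf(JRubyJobDriver.class);

        // Hypothetical DFS path for the zipped JRuby libraries.
        Path cachedZip = new Path("/tmp/jruby-libs.zip");
        DistributedCache.addCacheArchive(cachedZip.toUri(), conf);

        try {
          JobClient.runJob(conf);
        } finally {
          // Manual cleanup of the temporary file on DFS.
          // Not guaranteed to execute if the driver dies.
          FileSystem fs = FileSystem.get(conf);
          fs.delete(cachedZip, false);
        }
      }
    }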
--
James Moore | [EMAIL PROTECTED]
Ruby and Ruby on Rails consulting
blog.restphone.com
