Hi,

If the process running the pig script has not exited yet, lsof may help :)

The files held open by a process are visible under /proc/<pid>/fd/. Use lsof to
find the entry for the deleted script, then copy the still-open file from there
back to disk.
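
For example, something along these lines (the pid 12345, fd 4, and the script
name below are just placeholders, adjust them to what lsof actually shows):

    # look for processes that still have the deleted script open;
    # lsof marks such entries with "(deleted)"
    lsof | grep -i deleted | grep pig

    # say pid 12345 holds the script on file descriptor 4; the data is
    # still reachable through procfs while the process is alive
    ls -l /proc/12345/fd/

    # copy the still-open file descriptor back out to disk
    cp /proc/12345/fd/4 /tmp/recovered_script.pig

This only works while the process is still running; once it exits, the inode is
freed and the data is gone for good.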

God bless you.

From: feng jiang [mailto:[email protected]] 
Sent: Thursday, June 13, 2013 5:33 PM
To: [email protected]; [email protected]
Subject: Re: recovery accidently deleted pig script

 

The script is on the local file system; it's on a Linux box.

 

I totally agree we need version control for the source code. This is a good
example of why version control matters.

 

Thank you, Michael and Chris, for your input anyway.

 

 

On Thu, Jun 13, 2013 at 10:35 AM, Chris Embree <[email protected]> wrote:

This is not a Hadoop question (IMHO).

2 words:  Version Control

Did the advent of Hadoop somehow circumvent all IT convention?

Sorry folks, it's been a rough day.


On 6/12/13, Michael Segel <[email protected]> wrote:
> Where was the pig script? On HDFS?
>
> How often does your cluster clean up the trash?
>
> (Deleted stuff doesn't get cleaned up when the file is deleted... ) It's a
> configurable setting, so YMMV.
>
> On Jun 12, 2013, at 8:58 PM, feng jiang <[email protected]> wrote:
>
>> Hi everyone,
>>
>> We have a pig script scheduled to run every 4 hours. Someone accidentally
>> deleted the pig script (rm). Is there any way to recover it?
>>
>> I am guessing Hadoop copies the script to every node before running it, so
>> perhaps a copy still exists on one of the nodes.
>>
>>
>> Best regards,
>> Feng Jiang
>

-- 
Best regards,
Feng Jiang 
