I have logged a comment in
https://issues.apache.org/jira/browse/HADOOP-4829 which is related to the
IllegalStateException that I saw when Cache.remove()
tried to remove a shutdown hook while the JVM was shutting down.
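The underlying JVM behavior is easy to reproduce without Hadoop: once shutdown has begun, Runtime.addShutdownHook() (which FileSystem.Cache.get() calls internally to register its own hook) throws IllegalStateException. A minimal sketch — the class name HookDuringShutdown is mine, not from the thread:

```java
public class HookDuringShutdown {
    public static void main(String[] args) {
        // Register a hook that itself tries to register another hook,
        // mimicking what FileSystem.Cache.get() does when invoked from
        // inside a user shutdown hook.
        Runtime.getRuntime().addShutdownHook(new Thread() {
            @Override
            public void run() {
                try {
                    Runtime.getRuntime().addShutdownHook(new Thread());
                    // Should never get here; bail out with a nonzero code.
                    Runtime.getRuntime().halt(1);
                } catch (IllegalStateException e) {
                    // Prints: Caught: Shutdown in progress
                    System.out.println("Caught: " + e.getMessage());
                }
            }
        });
    }
}
```

The hook runs as the JVM exits, so the message appears after main() returns.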

Cheers

On Wed, Mar 10, 2010 at 11:00 AM, Todd Lipcon <[email protected]> wrote:

> Hi,
>
> The issue here is that Hadoop itself uses a shutdown hook to close all open
> filesystems when the JVM shuts down. Since JVM shutdown hooks don't have a
> specified order, you shouldn't access Hadoop filesystem objects from a
> shutdown hook.
>
> To get around this you can use the fs.automatic.close configuration
> variable
> (provided by this patch: https://issues.apache.org/jira/browse/HADOOP-4829)
> to disable the Hadoop shutdown hook. This patch is already applied in CDH2;
> otherwise you'll have to apply it manually.
>
> Note that if you disable the shutdown hook, you'll need to manually close
> the filesystems using FileSystem.closeAll.
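[Editor's note: Todd's suggestion might look roughly like the sketch below. This assumes the HADOOP-4829 patch is applied and a filesystem is reachable; the path and class name are illustrative only.]

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ManualClose {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // From HADOOP-4829: disable Hadoop's own shutdown hook so it
        // cannot race with user shutdown hooks (requires the patch, or CDH2).
        conf.setBoolean("fs.automatic.close", false);

        FileSystem fs = FileSystem.get(conf);
        Path tmp = new Path("/temp/hadoop-young"); // example path only
        if (fs.exists(tmp)) {
            fs.delete(tmp, true); // true = recursive delete
        }

        // With automatic close disabled, close cached filesystems explicitly
        // before the JVM exits.
        FileSystem.closeAll();
    }
}
```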
>
> Thanks
> -Todd
>
> On Tue, Mar 9, 2010 at 9:39 PM, Silllllence <[email protected]> wrote:
>
> >
> > Hi fellows
> > The code segment below adds a shutdown hook to the JVM, but I got a
> > strange exception:
> > java.lang.IllegalStateException: Shutdown in progress
> >        at
> > java.lang.ApplicationShutdownHooks.add(ApplicationShutdownHooks.java:39)
> >        at java.lang.Runtime.addShutdownHook(Runtime.java:192)
> >        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1387)
> >        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:191)
> >        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:95)
> >        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:180)
> >        at org.apache.hadoop.fs.Path.getFileSystem(Path.java:175)
> >        at young.Main$1.run(Main.java:21)
> > The Javadoc says this exception is thrown when the virtual machine is
> > already in the process of shutting down
> > (http://java.sun.com/j2se/1.5.0/docs/api/).
> > What does this mean? Why does this happen? How can I fix it?
> > I'd really appreciate it if you could try this code and help me figure
> > out what's going on here. Thank you!
> >
> >
> ---------------------------------------------------------------------------------------
> > import org.apache.hadoop.conf.Configuration;
> > import org.apache.hadoop.fs.FileSystem;
> > import org.apache.hadoop.fs.Path;
> > import org.apache.hadoop.mapred.JobConf;
> >
> > @SuppressWarnings("deprecation")
> > public class Main {
> >
> >     public static void main(String[] args) {
> >         Runtime.getRuntime().addShutdownHook(new Thread() {
> >             @Override
> >             public void run() {
> >                 Path path = new Path("/temp/hadoop-young");
> >                 System.out.println("Thread run : " + path);
> >                 Configuration conf = new JobConf();
> >                 FileSystem fs;
> >                 try {
> >                     fs = path.getFileSystem(conf);
> >                     if (fs.exists(path)) {
> >                         fs.delete(path);
> >                     }
> >                 } catch (Exception e) {
> >                     System.err.println(e.getMessage());
> >                     e.printStackTrace();
> >                 }
> >             }
> >         });
> >     }
> > }
> > --
> > View this message in context:
> >
> http://old.nabble.com/%28Strange%21%29getFileSystem-in-JVM-shutdown-hook-throws-shutdown-in-progress-exception-tp27845803p27845803.html
> > Sent from the Hadoop core-user mailing list archive at Nabble.com.
> >
> >
>
>
> --
> Todd Lipcon
> Software Engineer, Cloudera
>
