Hi,

Just figured out that if I want to perform a graceful shutdown of Spark
Streaming 1.4 (from master), Runtime.getRuntime().addShutdownHook no longer
works. In Spark 1.4, Spark Core registers its own shutdown hook via
Utils.addShutdownHook, and that hook gets called anyway, so the graceful
shutdown of Spark Streaming fails with an error like "SparkContext already
closed".
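
For context, what I had before was roughly the plain JVM hook below (jssc
being the JavaStreamingContext in my driver). Since Spark's own hook can close
the SparkContext first, the graceful stop inside it fails:

Runtime.getRuntime().addShutdownHook(new Thread() {
    @Override
    public void run() {
        // (true, true): also stop the SparkContext and wait for received data
        // to finish processing -- but in 1.4 the SparkContext may already be
        // closed by Spark's own shutdown hook by the time this thread runs.
        jssc.stop(true, true);
    }
});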

To work around this, I had to explicitly register my own hook through
Utils.addShutdownHook in the driver with a higher priority (say 150) than
Spark's shutdown priority of 50, so that it runs first, and in that hook I
call the StreamingContext stop method with the (false, true) parameters, i.e.
stop streaming gracefully without stopping the SparkContext (full snippet at
the end).

Just curious to know: is this how we are supposed to handle the shutdown hook
going forward?

Can't we make the streaming shutdown default to a graceful shutdown?

Also, the Java API for adding a shutdown hook in Utils looks very dirty, with
methods like this:

import org.apache.spark.util.Utils;
import scala.Function0;
import scala.runtime.BoxedUnit;

Utils.addShutdownHook(150, new Function0<BoxedUnit>() {

    // Generic apply() delegates to the Unit-specialized variant, so the hook
    // runs no matter which of the two gets invoked.
    @Override
    public BoxedUnit apply() {
        apply$mcV$sp();
        return BoxedUnit.UNIT;
    }

    @Override
    public byte apply$mcB$sp() {
        return 0;
    }

    @Override
    public char apply$mcC$sp() {
        return 0;
    }

    @Override
    public double apply$mcD$sp() {
        return 0;
    }

    @Override
    public float apply$mcF$sp() {
        return 0;
    }

    @Override
    public int apply$mcI$sp() {
        return 0;
    }

    @Override
    public long apply$mcJ$sp() {
        return 0;
    }

    @Override
    public short apply$mcS$sp() {
        return 0;
    }

    // The actual shutdown logic: stop streaming gracefully, leave the
    // SparkContext to Spark's own lower-priority hook.
    @Override
    public void apply$mcV$sp() {
        jsc.stop(false, true);
    }

    @Override
    public boolean apply$mcZ$sp() {
        return false;
    }
});
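
I believe the boilerplate can be cut down by extending
scala.runtime.AbstractFunction0 instead of implementing the Function0
interface directly, since the specialized apply$mc..$sp variants are then
inherited from the Scala runtime and only apply() has to be overridden.
Something along these lines (untested sketch, jsc again being the
JavaStreamingContext):

import scala.runtime.AbstractFunction0;

Utils.addShutdownHook(150, new AbstractFunction0<BoxedUnit>() {
    @Override
    public BoxedUnit apply() {
        // Stop streaming gracefully; leave the SparkContext to Spark's own hook.
        jsc.stop(false, true);
        return BoxedUnit.UNIT;
    }
});

It is still far from a plain Runnable, though.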
