I faced the same problem and decided to go with your third solution. I use 
Groovy as the scripting language; it has access to Java classes and 
therefore also to Flink constructs like Time.seconds(10). Below is an 
example of a pattern definition with Groovy:

private static Binding bind = new Binding();
private static GroovyShell gs = new GroovyShell(bind);

// "two" holds the user-supplied rule strings (a where-clause and a
// within-clause), both evaluated by the GroovyShell at runtime.
// The begin("start")/where(...) wrapper was lost in the original mail
// and is reconstructed here.
Pattern<BDS, ?> dynPattern = Pattern
        .<BDS>begin("start")
        .where(bds -> {
            bind.setVariable("bds", bds);
            Object ergPat = gs.evaluate(two.getWhere());
            return (ergPat instanceof Boolean) ? (Boolean) ergPat : false;
        })
        .within((gs.evaluate(two.getWithin()) instanceof Time)
                ? (Time) gs.evaluate(two.getWithin()) : null);

I don't know if it's the best way, but it works :)


-----Original Message-----
From: PedroMrChaves [mailto:pedro.mr.cha...@gmail.com] 
Sent: Wednesday, October 12, 2016 10:32
To: user@flink.apache.org
Subject: Re: What is the best way to load/add patterns dynamically (at runtime) 
with Flink?

I've been thinking of several options to solve this problem:

1. I can use Flink savepoints to save the application state, change 
the jar file, and submit a new job (the new jar file with the patterns 
added/changed). The problem in this case is handling the savepoints 
correctly, and because I must stop and start the job, the events that 
arrive in the meantime will be delayed.

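A sketch of what that savepoint-and-resubmit cycle would look like with the standard Flink CLI (the job id, savepoint directory, and jar name are placeholders):

```shell
# Cancel the running job, triggering a savepoint first
# (job id and target directory are placeholders).
flink cancel -s hdfs:///savepoints <jobId>

# Resubmit the updated jar (with the changed patterns),
# restoring state from the savepoint that was just written.
flink run -s hdfs:///savepoints/savepoint-xxxx updated-patterns.jar
```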
2. I can compile Java code at runtime using the Java Compiler API 
(javax.tools.JavaCompiler), though I don't know if this would be a viable solution.
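To make option 2 concrete, here is a minimal, hedged sketch of runtime compilation with the JDK's built-in compiler. The class name DynCondition and its filter method are made up for illustration; in a real setup the source string would be generated from a user-supplied rule, and running this requires a JDK (not a bare JRE):

```java
import javax.tools.JavaCompiler;
import javax.tools.ToolProvider;
import java.io.File;
import java.io.FileWriter;
import java.lang.reflect.Method;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;

public class RuntimeCompile {
    public static void main(String[] args) throws Exception {
        // Hypothetical condition class generated from a user-supplied rule string.
        String source =
            "public class DynCondition {\n" +
            "    public static boolean filter(double value) {\n" +
            "        return value > 10.0;\n" +
            "    }\n" +
            "}\n";

        // Write the source to a temp directory so the compiler can read it from disk.
        File dir = Files.createTempDirectory("dynrules").toFile();
        File src = new File(dir, "DynCondition.java");
        try (FileWriter w = new FileWriter(src)) {
            w.write(source);
        }

        // Compile with the system compiler; a non-zero result means compilation failed.
        JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
        int result = compiler.run(null, null, null, src.getPath());
        if (result != 0) throw new IllegalStateException("compilation failed");

        // Load the freshly compiled class and invoke the condition reflectively.
        try (URLClassLoader loader = new URLClassLoader(new URL[]{dir.toURI().toURL()})) {
            Class<?> cls = Class.forName("DynCondition", true, loader);
            Method filter = cls.getMethod("filter", double.class);
            System.out.println(filter.invoke(null, 42.0));
            System.out.println(filter.invoke(null, 3.0));
        }
    }
}
```

The open question for this approach in a Flink job would be class loading and serialization across the cluster, which the sketch above does not address.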

3. I can use a scripting language like you did, but I would lose the ability to 
use the native Flink library, which is available in Scala and Java.
