Thank you for the feedback.  I'm not hearing objections, so I'll go ahead
and draft a PR for ValueSupplier (possibly renamed to ValueProvider).

Cheers,
Sam

On Wed, Aug 10, 2016 at 3:50 PM, Sam McVeety <[email protected]> wrote:

> We can probably build a "real" case around the TextIO boilerplate -- say
> that a user wants to regularly run a Beam job with a different input path
> according to the day.  TextIO would be modified to support a dynamic value:
>
> TextIO.Read.withFilepattern(ValueSupplier<String>);
>
> ... and then the pipeline author would supply this via their own option:
>
> interface MyPipelineOptions extends PipelineOptions {
>   @Default.RuntimeValueSupplier("gs://bar")
>   RuntimeValueSupplier<String> getInputPath();
>   void setInputPath(RuntimeValueSupplier<String> value);
> }
>
> At this point, the same job graph could be reused with different values
> for --inputPath.
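
[Editorial sketch, not the proposed API: the class names and the `bind()`
hook below are illustrative stand-ins for however a runner would inject the
submitted value.] The idea above could look roughly like this: a static
supplier resolves at graph construction time, while a runtime supplier only
becomes readable once the runner binds the value passed via --inputPath:

```java
import java.io.Serializable;
import java.util.function.Supplier;

public class ValueSupplierSketch {

  // A value that may only become available at execution time.
  interface ValueSupplier<T> extends Serializable {
    T get();
    boolean isAccessible(); // false until the value has been bound
  }

  // Known at graph construction time.
  static class StaticValueSupplier<T> implements ValueSupplier<T> {
    private final T value;
    StaticValueSupplier(T value) { this.value = value; }
    public T get() { return value; }
    public boolean isAccessible() { return true; }
  }

  // Bound by the runner at execution time; bind() is a hypothetical
  // stand-in for the runner's injection mechanism.
  static class RuntimeValueSupplier<T> implements ValueSupplier<T> {
    private transient Supplier<T> binding;
    void bind(Supplier<T> binding) { this.binding = binding; }
    public T get() {
      if (!isAccessible()) {
        throw new IllegalStateException("Value not available until execution time");
      }
      return binding.get();
    }
    public boolean isAccessible() { return binding != null; }
  }

  public static void main(String[] args) {
    // Construction-time constant, e.g. the @Default value.
    ValueSupplier<String> fixed = new StaticValueSupplier<>("gs://bar");
    System.out.println(fixed.get()); // gs://bar

    // Placeholder baked into the job graph; bound at submission time.
    RuntimeValueSupplier<String> inputPath = new RuntimeValueSupplier<>();
    inputPath.bind(() -> "gs://my-bucket/2016-08-10/*");
    System.out.println(inputPath.get());
  }
}
```

Under this sketch, the same job graph carries only the RuntimeValueSupplier
placeholder, so rerunning with a different --inputPath would not require
re-invoking the SDK.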
>
>
> Cheers,
> Sam
>
> On Wed, Aug 10, 2016 at 12:17 PM, Ismaël Mejía <[email protected]> wrote:
>
>> +1, it sounds really nice; (4) is by far the most consistent with the
>> current Options implementation.
>> One extra thing: maybe it is a good idea to sketch a 'real' use case to
>> make the concepts/need more evident.
>>
>> Ismaël
>>
>> On Tue, Aug 9, 2016 at 8:49 PM, Sam McVeety <[email protected]>
>> wrote:
>>
>> > Thanks, Amit and JB.  Amit, to your question: the intention with
>> > availability to PTransforms is to provide the ValueProvider abstraction
>> > (which may be implemented on top of PipelineOptions) so that they do not
>> > take a dependency on PipelineOptions.
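
[Editorial sketch; `ReadFromPattern` and this single-method ValueProvider
shape are hypothetical.] The decoupling described above means a transform
holds only a provider, and whether that provider is backed by
PipelineOptions, a constant, or something else is invisible to it:

```java
import java.io.Serializable;

public class DecouplingSketch {

  // Hypothetical single-method provider; the real interface may differ.
  interface ValueProvider<T> extends Serializable {
    T get();
  }

  // A transform-like class that depends only on ValueProvider<String>,
  // never on PipelineOptions itself.
  static class ReadFromPattern {
    private final ValueProvider<String> filepattern;
    ReadFromPattern(ValueProvider<String> filepattern) {
      this.filepattern = filepattern;
    }
    String describe() {
      return "Read from " + filepattern.get();
    }
  }

  public static void main(String[] args) {
    // This provider could just as easily delegate to a PipelineOptions
    // getter; the transform cannot tell the difference.
    ValueProvider<String> pattern = () -> "gs://bar";
    System.out.println(new ReadFromPattern(pattern).describe());
  }
}
```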
>> >
>> > Cheers,
>> > Sam
>> >
>> > On Mon, Aug 8, 2016 at 12:26 AM, Jean-Baptiste Onofré <[email protected]>
>> > wrote:
>> >
>> > > +1
>> > >
>> > > Thanks Sam, it sounds interesting.
>> > >
>> > > Regards
>> > > JB
>> > >
>> > >
>> > > On 07/29/2016 09:14 PM, Sam McVeety wrote:
>> > >
>> > >> During the graph construction phase, the given SDK generates an
>> > >> initial execution graph for the program.  At execution time, this
>> > >> graph is executed, either locally or by a service.  Currently, Beam
>> > >> only supports parameterization at graph construction time.  Both
>> > >> Flink and Spark supply functionality that allows a pre-compiled job
>> > >> to be run, without SDK interaction, using updated runtime parameters.
>> > >>
>> > >> In its current incarnation, Dataflow can read values of
>> > >> PipelineOptions at job submission time, but this requires the
>> > >> presence of an SDK to properly encode these values into the job.  We
>> > >> would like to build a common layer into the Beam model so that these
>> > >> dynamic options can be properly provided to jobs.
>> > >>
>> > >> Please see
>> > >> https://docs.google.com/document/d/1I-iIgWDYasb7ZmXbGBHdok_IK1r1YAJ90JG5Fz0_28o/edit
>> > >> for the high-level model, and
>> > >> https://docs.google.com/document/d/17I7HeNQmiIfOJi0aI70tgGMMkOSgGi8ZUH-MOnFatZ8/edit
>> > >> for the specific API proposal.
>> > >>
>> > >> Cheers,
>> > >> Sam
>> > >>
>> > >>
>> > > --
>> > > Jean-Baptiste Onofré
>> > > [email protected]
>> > > http://blog.nanthrax.net
>> > > Talend - http://www.talend.com
>> > >
>> >
>>
>
>
