Re: Passing variables from dataset to workflow in Coordinator

2020-08-18 Thread Lars Francke
Hi Peter,

great, thanks for the answer. I was afraid that'd be the case.

Cheers,
Lars

On Mon, Aug 17, 2020 at 4:14 PM Peter Cseh wrote:

> Hey!
>
> Unfortunately I can't recall a way to do this more easily.
> But yeah, it would make sense to have a function for this.
>
> gp


Re: Passing variables from dataset to workflow in Coordinator

2020-08-17 Thread Peter Cseh
Hey!

Unfortunately I can't recall a way to do this more easily.
But yeah, it would make sense to have a function for this.

gp


-- 
*Peter Cseh* | Software Engineer, Cloudera Search
cloudera.com


Passing variables from dataset to workflow in Coordinator

2020-08-10 Thread Lars Francke
Hi,

I have a simple coordinator with a single dataset:

<dataset name="..." frequency="..." initial-instance="..." timezone="...">
  <uri-template>${hadoop_nameNode}/${coord_stagingFolder}/${YEAR}/${MONTH}/${DAY}</uri-template>
</dataset>

I also have a corresponding input-event:

<data-in name="foo_event" dataset="...">
  <instance>${coord:current(-1)}</instance>
</data-in>

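For context, the coordinator action then hands the resolved path to the workflow roughly like this (the app path and the inputDir property name are placeholders, not from my actual setup):

<!-- sketch: coordinator action passing the resolved input path to the workflow -->
<action>
  <workflow>
    <app-path>${hadoop_nameNode}/apps/my-workflow</app-path>
    <configuration>
      <property>
        <name>inputDir</name>
        <!-- resolves to .../YYYY/MM/DD for coord:current(-1) -->
        <value>${coord:dataIn('foo_event')}</value>
      </property>
    </configuration>
  </workflow>
</action>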
Now in my workflow I need to pass in the ${YEAR}, ${MONTH} and ${DAY} variables
as properties.
So far we've always used "fake" <data-out> elements and basically templated our
variables that way (i.e. data-out entries that don't actually correspond
to a folder anywhere).
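To illustrate the trick, the "fake" data-out setup looks roughly like this (the names here are made up, and the date_template dataset is never actually written to; its uri-template exists only so it renders the date):

<!-- a dataset that is never materialized; its uri-template is just a date pattern -->
<dataset name="date_template" frequency="${coord:days(1)}"
         initial-instance="2020-01-01T00:00Z" timezone="UTC">
  <uri-template>${YEAR}-${MONTH}-${DAY}</uri-template>
</dataset>

<output-events>
  <data-out name="fake_date" dataset="date_template">
    <instance>${coord:current(-1)}</instance>
  </data-out>
</output-events>

<!-- in the action: the "path" is really just the formatted date -->
<property>
  <name>processingDate</name>
  <value>${coord:dataOut('fake_date')}</value>
</property>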

What's the correct way of accessing the data we need?

We can also parse the output of ${coord:dataIn('foo_event')} but that seems
a bit redundant.
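Concretely, I imagine that parsing could be done with the replaceAll() EL function right in the coordinator action, if it is available in your Oozie version (just a sketch; the property names and the regex assume the uri-template above ends in YEAR/MONTH/DAY):

<property>
  <name>year</name>
  <!-- strip everything but the four-digit year from the resolved path -->
  <value>${replaceAll(coord:dataIn('foo_event'), '^.*/([0-9]{4})/[0-9]{2}/[0-9]{2}$', '$1')}</value>
</property>
<property>
  <name>month</name>
  <value>${replaceAll(coord:dataIn('foo_event'), '^.*/[0-9]{4}/([0-9]{2})/[0-9]{2}$', '$1')}</value>
</property>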
I hope I'm missing something.

Thank you for your help!

Cheers,
Lars