Ok, I've got "split" working, but that still leaves me with the dilemma: I
have to write multiple rows for "mylist" into the same table... Note that I
am working with a db (Phoenix Query Server) that doesn't support "batching"
of multiple records in one insert the way MySQL does. Any advice?
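For what it's worth, one workaround when the engine lacks a multi-row VALUES clause is to emit one parameterized UPSERT per array element and execute them in a loop against the same prepared statement. A minimal sketch in Python (the table and column names here, including "parent_id", are placeholders I invented, and the cursor call is left commented out rather than tied to any particular Phoenix client):

```python
# Sketch: turn the "mylist" array into one parameterized UPSERT per element,
# carrying the parent "id" into each child row so the child table keeps a
# reference back to the root record. Table/column names are hypothetical.
record = {
    "id": 0,
    "name": "Root",
    "mylist": [
        {"id": 10, "info": "2am-3am"},
        {"id": 11, "info": "3AM-4AM"},
        {"id": 12, "info": "4am-5am"},
    ],
}

SQL = "UPSERT INTO mylist_table (parent_id, id, info) VALUES (?, ?, ?)"

def to_upserts(rec):
    """One (sql, params) pair per child row, since multi-row VALUES isn't available."""
    return [(SQL, (rec["id"], item["id"], item["info"])) for item in rec["mylist"]]

for sql, params in to_upserts(record):
    # cursor.execute(sql, params)  # executed one row at a time over JDBC/Avatica
    print(params)
```

Executing the same statement text repeatedly at least lets the server reuse the prepared plan, even without true batching.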


On Wed, Sep 5, 2018 at 10:04 PM Matt Burgess <[email protected]> wrote:

> V,
>
> Perhaps ironically but for the greater good, NiFi (IMHO, with record
> processors) performs better when you don't touch the content but just
> describe what you want from it. So in your case you could send the
> content to two different branches, one could be a PutDatabaseRecord
> with a reader schema that ignores the "mylist" element, and the other
> could use Jolt / SplitRecord / ForkRecord to "hoist" the fields from
> the "mylist" element into a root array/RecordSet, then use
> PutDatabaseRecord to process it. Then you don't need a "split" or "get
> into attribute" or any other operation that isn't in line with your
> business logic.
>
> Regards,
> Matt
>
> On Fri, Aug 31, 2018 at 7:34 PM l vic <[email protected]> wrote:
> >
> > Hi,
> > I have a JSON record that contains an array "mylist":
> > {
> >     "id": 0,
> >     "name": "Root",
> >     "mylist": [{
> >         "id": 10,
> >         "info": "2am-3am"
> >     },
> >     {
> >         "id": 11,
> >         "info": "3AM-4AM"
> >     },
> >     {
> >         "id": 12,
> >         "info": "4am-5am"
> >     }]
> > }
> > I have to save the root data into one db table and the array into
> > another... Can someone recommend an approach to "splitting" the record
> > for 2 different database writers?
> > Thank you,
> > V
>
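The two branches Matt describes can be sketched as plain functions: one view of the record that drops "mylist" (what a reader schema ignoring that element would yield), and one that "hoists" each array element into a root-level record set, carrying the parent id along. This is only an illustration of the shape of the data at each branch, not NiFi code; the "parent_id" field name is my own assumption for the foreign key:

```python
# Branch 1: the root record with the "mylist" element ignored, as a
# PutDatabaseRecord reader schema without that field would see it.
def root_record(rec):
    return {k: v for k, v in rec.items() if k != "mylist"}

# Branch 2: the "hoist" - each "mylist" element lifted into a root-level
# record set, with the parent id attached ("parent_id" is an invented name).
def hoist_mylist(rec):
    return [
        {"parent_id": rec["id"], "id": item["id"], "info": item["info"]}
        for item in rec["mylist"]
    ]

sample = {
    "id": 0,
    "name": "Root",
    "mylist": [
        {"id": 10, "info": "2am-3am"},
        {"id": 11, "info": "3AM-4AM"},
        {"id": 12, "info": "4am-5am"},
    ],
}

print(root_record(sample))
for row in hoist_mylist(sample):
    print(row)
```

Each branch's output then feeds its own PutDatabaseRecord, so no content-level split is needed in the flow itself.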
