Ah, makes sense. Hopefully GetFile -> ValidateRecord -> SplitRecord ->
downstream will work for you then; let us know if you run into any
issues!
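
For readers following this thread: ValidateRecord checks each incoming record against the schema configured on its record reader, routing matching records to "valid" and non-matching ones to "invalid". A minimal sketch of an Avro schema you might set in the CSVReader's Schema Text property — the field names and types here are hypothetical, adjust them to your actual CSV columns:

```
{
  "type": "record",
  "name": "csv_row",
  "fields": [
    { "name": "id",     "type": "int" },
    { "name": "name",   "type": "string" },
    { "name": "amount", "type": ["null", "double"], "default": null }
  ]
}
```

With a schema like this, a row whose "id" column is not parseable as an int would be routed to the "invalid" relationship rather than failing the whole file.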

Thanks,
Matt
On Mon, Aug 13, 2018 at 9:13 AM saloni udani <[email protected]> wrote:
>
> Thanks Matt.
>
> ValidateRecord would be useful. I am using SplitRecord to create batches from 
> the large input files for processing efficiency.
>
> On Mon, Aug 13, 2018 at 6:02 PM, Matt Burgess <[email protected]> wrote:
>>
>> You can use ValidateRecord (with a CSVReader and JSONRecordSetWriter,
>> and another "invalid CSV Reader" for the invalid records) for that, then
>> SplitRecord if you need it. However, if you can describe your
>> downstream flow, perhaps we can help you avoid the need to split the
>> records at all (unless you are using a downstream processor that only
>> handles one record/JSON object at a time).
>>
>> Regards,
>> Matt
>>
>> On Mon, Aug 13, 2018 at 7:41 AM saloni udani <[email protected]> 
>> wrote:
>> >
>> > Hi
>> >
>> > I have a bunch of CSV files which I need to convert to JSON.
>> > My current flow is
>> >
>> > GetFile --> SplitRecord (CSVReader and JSONRecordSetWriter)
>> >
>> >
>> > The issue is that if the CSV contains an invalid record, the file gets 
>> > stuck in the queue. Is there a way to route the invalid CSV lines 
>> > encountered to the failure relationship? The documentation says that only 
>> > records which fail the CSV-to-JSON conversion are routed to failure, but 
>> > here I want even the invalid CSV records to be routed to failure.
>> >
>> >
>> > Thanks
>> > Saloni Udani
>
>
