Hello all,

I have a file on local disk with thousands of lines, each representing a
valid JSON object.
My flow is like this:

GetFile > SplitText > PartitionRecord (based on a key) > MergeRecord >
PutElasticsearchRecord
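To make this concrete, here is a made-up sample of the input file (one JSON object per line) and the kind of Avro schema a record reader would use for it; the field names are illustrative only, not my actual data:

```json
{"user": "a", "count": 1}
{"user": "b", "count": 2}
```

```json
{
  "type": "record",
  "name": "Event",
  "fields": [
    {"name": "user",  "type": "string"},
    {"name": "count", "type": "int"}
  ]
}
```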

This works well; however, PartitionRecord seems to be the bottleneck.

So I looked at using
GetFile > ConvertRecord > SplitRecord > PartitionRecord

But ConvertRecord seems to convert only the first line of the content from
GetFile.

Am I missing something?

The bottleneck could very well be a system resource issue, but still: what
is the best way to take a file with lines of JSON and convert them into
records? I assume it's through the record readers and writers, and that
each JSON object is then converted into a record based on the Avro schema
(in my case)?
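For what it's worth, the behavior I'm expecting — parse each line as JSON, then group the resulting records by a key — can be sketched in plain Python (field names here are made up, purely for illustration):

```python
import json
from collections import defaultdict

# Hypothetical NDJSON input: each element is one line of the source file.
lines = [
    '{"user": "a", "count": 1}',
    '{"user": "b", "count": 2}',
    '{"user": "a", "count": 3}',
]

# Parse every line into a record, then group by the partition key --
# conceptually what a record reader plus PartitionRecord would do.
partitions = defaultdict(list)
for line in lines:
    record = json.loads(line)
    partitions[record["user"]].append(record)

print({key: len(records) for key, records in partitions.items()})
```

That is, I expect every line to become a record, not just the first one.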
