If you can provide a sample CSV file (cleaned of proprietary data) and a
sample flow attached to a JIRA, it could help in the event nobody has seen
these issues.


On July 3, 2018 at 10:19:02, Jeremy Taylor ([email protected])
wrote:

Greetings,

My team and I have been using the SplitRecord processor to convert CSV
to JSON, and later ingest it, for about a year.  I’m noticing some things
that I feel worked a lot better before.  I just don’t know exactly when
“before” was, but let’s say at initial testing and development time my CSV
ingestion to JSON used to work a lot better. I have an Avro schema set up
that gives default values to facilitate error handling.  I am observing
some troubling issues.  We upgraded to NiFi 1.6.0 last month, so
I’m naturally suspicious of the upgrade, but do not see known issues that
relate to this.  Has anyone noticed these issues, or are some of these items
known issues? I’m dealing with probably over 40 CSV columns that I generate
for testing and then ingest.
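For reference, a minimal sketch of the kind of Avro schema with defaults described above (the field names and default values here are hypothetical, not taken from the actual flow):

```json
{
  "type": "record",
  "name": "SampleRecord",
  "fields": [
    {"name": "price",    "type": "string", "default": ""},
    {"name": "quantity", "type": "int",    "default": -1}
  ]
}
```

Since the default is only supposed to be used when a field is not found, consistently seeing "" or -1 in the output usually suggests the reader is failing to match the CSV column to the schema field (for example, a header-name or case mismatch, or a header-handling setting on the reader), rather than mis-converting the value itself.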

Observations:

   1. I have decimals that I’m trying to take in as strings.  Instead, we
   are seeing the empty string retrieved for those fields in the line-by-line
   CSV-to-JSON conversion phase.  The empty string is the default if a field
   is not found.
   2. I have integers with real values that come through as -1, the
   default, so the conversion phase is not working properly here either.
   3. I have strings with values that get dropped to the empty string.
   4. And I have a few integer fields that are correctly taken in as
   strings and match the right values in the CSV.
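Observations 1–3 all look like the schema default being substituted because the field lookup failed. A minimal sketch in plain Python (not NiFi; field names and defaults are hypothetical, and real record readers also coerce types, which this sketch does not) of how a header mismatch can silently produce defaults:

```python
import csv
import io
import json

# Avro-style (field name -> default) pairs; a field falls back to its
# default when the incoming record has no column of that exact name.
schema_defaults = {"price": "", "quantity": -1, "label": ""}

def convert(csv_text):
    """Map each CSV row onto the schema, using defaults for missing fields."""
    rows = csv.DictReader(io.StringIO(csv_text))
    return [
        {field: row.get(field, default)
         for field, default in schema_defaults.items()}
        for row in rows
    ]

# Header "Price" does not match schema field "price" (case mismatch),
# so that column silently falls back to the "" default, while the
# correctly named columns come through intact.
data = "Price,quantity,label\n19.99,5,widget\n"
print(json.dumps(convert(data)))
# -> [{"price": "", "quantity": "5", "label": "widget"}]
```

If the same mechanism is at work in the flow, checking that the CSV header names exactly match the schema field names (including case) would be a first step.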



Regards,



-- 

Jeremy H. Taylor

Software Developer

ACES, Incorporated

http://acesinc.net
