On Tue, Oct 6, 2015 at 7:17 AM, Bryan Bende wrote:
> Hi Russell,
>
> I understand what you are getting at... I don't think the current
> processors we have are designed to handle this bulk load scenario.
>
That's what I'm gathering so far myself, having been poring over the history of
related processors.
Hi Russell,
I understand what you are getting at... I don't think the current
processors we have are designed to handle this bulk load scenario.
The series of processors you outlined in your original email would likely
work, but I *think* it would suffer the same problem of producing a lot of
FlowFiles.
Really, what I'd like to do is this type of MySQL bread-and-butter task:
LOAD DATA INFILE
INTO TABLE
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 3 ROWS;
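(As an aside, the semantics of that statement map fairly directly onto Python's
stdlib csv module; this is a minimal sketch, with made-up sample data, of what
FIELDS TERMINATED BY ',' ENCLOSED BY '"' and IGNORE 3 ROWS do:)

```python
import csv
import io

# Made-up CSV content: three header lines to ignore, then quoted fields.
data = 'header 1\nheader 2\nheader 3\n"a",1\n"b",2\n'

# delimiter and quotechar mirror FIELDS TERMINATED BY ',' ENCLOSED BY '"'.
reader = csv.reader(io.StringIO(data), delimiter=',', quotechar='"')

# Slicing off the first three rows mirrors IGNORE 3 ROWS.
rows = list(reader)[3:]
# rows is now [['a', '1'], ['b', '2']]
```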
Russell
On Mon, Oct 5, 2015 at 2:09 PM, Russell Whitaker wrote:
> Bryan,
>
> Some of the CSV files are as small
Bryan,
Some of the CSV files are as small as 6 columns and a thousand lines
or so of entries;
some are many more columns and thousands of lines. I'm hoping to avoid
the necessity
of spawning a FlowFile per line; I'm hoping there's a NiFi
equivalent of the SQL DML
statement LOAD DATA INFILE.
Russell,
How big are these CSVs in terms of rows and columns?
If they aren't too big, another option could be to use SplitText +
ReplaceText to split the CSV into a FlowFile per line, and then convert
each line into SQL in ReplaceText. The downside is that this would create a
lot of FlowFiles.
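(The per-line conversion ReplaceText would do can be sketched outside NiFi;
the table name and the naive comma split here are made up for illustration,
and a real flow would need proper quoting and parameterized SQL:)

```python
def line_to_insert(line):
    """Turn one CSV line into one INSERT statement, roughly what a
    ReplaceText regex substitution would emit per FlowFile.
    NOTE: the naive split(',') breaks on embedded commas; a real flow
    would use a CSV-aware parser."""
    cols = [c.strip('"') for c in line.strip().split(',')]
    # Escape single quotes and wrap each value; 'my_table' is hypothetical.
    values = ", ".join("'%s'" % c.replace("'", "''") for c in cols)
    return "INSERT INTO my_table VALUES (%s);" % values

stmt = line_to_insert('"a",1')
# stmt is "INSERT INTO my_table VALUES ('a', '1');"
```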
Use case I'm attempting:
1.) ingest a CSV file with header lines;
2.) remove header lines (i.e. remove N lines at head);
3.) SQL INSERT each remaining line as a row in an existing MySQL table.
My thinking so far:
#1 is given (CSV fetched already);
#2 is simple, should be handled in the context of E
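(The three steps above can be sketched end-to-end outside NiFi; this uses
sqlite3 as a stand-in for MySQL, and the header count, sample data, and table
layout are all made-up assumptions:)

```python
import csv
import io
import sqlite3

# Step 1 is given: assume the CSV content has already been fetched.
# Hypothetical file with 2 header lines, then name,age rows.
N_HEADER = 2
raw = "report title\nname,age\nalice,30\nbob,25\n"

# Step 2: remove the N header lines.
rows = list(csv.reader(io.StringIO(raw)))[N_HEADER:]

# Step 3: INSERT each remaining line as a row. sqlite3 stands in for
# MySQL here; a real flow would go through PutSQL / a MySQL driver.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE people (name TEXT, age INTEGER)")
conn.executemany("INSERT INTO people VALUES (?, ?)", rows)
count = conn.execute("SELECT COUNT(*) FROM people").fetchone()[0]
# count is 2
```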