Chris,

Don't know if you've created them yet, but here are a few things you might
want to consider:

> 2. *StandardizeDate* - Reads a key/value pair from an attribute and loops
over the keys within the incoming data.  If it finds a match, it will
standardize the value of that key as ISO-8601.

We had to implement something fairly similar and chose to do it as a
LookupService so that it can operate on a record set. If you're working
with large volumes of data and need to standardize dates, pivoting to
the Record API would be a really good idea.
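
If it helps, once you're record-oriented the per-value logic ends up
pretty small. Here's a rough sketch of what the standardization step
boils down to in plain java.time (the class name and the input formats
below are placeholders, not the actual service we wrote, and you'd still
wrap this in your LookupService / record processor plumbing):

    import java.time.LocalDate;
    import java.time.LocalDateTime;
    import java.time.format.DateTimeFormatter;
    import java.time.format.DateTimeParseException;
    import java.util.List;

    public class DateStandardizer {

        // Candidate input formats; illustrative only, adjust to your data.
        private static final List<DateTimeFormatter> INPUT_FORMATS = List.of(
                DateTimeFormatter.ofPattern("MM/dd/yyyy HH:mm:ss"),
                DateTimeFormatter.ofPattern("MM/dd/yyyy"),
                DateTimeFormatter.ofPattern("yyyyMMdd"));

        // Returns the value rewritten as ISO-8601, or the original value
        // unchanged if none of the configured formats match.
        public static String toIso8601(String raw) {
            for (DateTimeFormatter fmt : INPUT_FORMATS) {
                try {
                    // Try a full date-time first...
                    return LocalDateTime.parse(raw, fmt)
                            .format(DateTimeFormatter.ISO_LOCAL_DATE_TIME);
                } catch (DateTimeParseException ignored) {
                    // ...then fall back to a bare date for date-only formats.
                }
                try {
                    return LocalDate.parse(raw, fmt)
                            .format(DateTimeFormatter.ISO_LOCAL_DATE);
                } catch (DateTimeParseException ignored) {
                    // no match, try the next format
                }
            }
            return raw;
        }
    }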

> *AvroBulkInsert* - We utilize the bulk insert functionality within MSSQL
to insert incoming Avro files.

Might want to look at PutDatabaseRecord, if you haven't already, and see
if it meets your use case:

https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-standard-nar/1.8.0/org.apache.nifi.processors.standard.PutDatabaseRecord/index.html
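
For the Avro-into-MSSQL case specifically, the setup would be roughly
along these lines (the reader and connection-pool services, schema, and
table names below are just placeholders for illustration):

    Record Reader                        -> an AvroReader controller service
    Database Connection Pooling Service  -> a DBCPConnectionPool pointed at MSSQL
    Statement Type                       -> INSERT
    Schema Name                          -> dbo        (placeholder)
    Table Name                           -> my_table   (placeholder)

As far as I know it uses batched prepared statements rather than MSSQL's
native bulk insert path, so it may or may not match your throughput, but
it's worth benchmarking before taking on a custom processor.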

Just wanted to throw those out there in case you hadn't considered them,
since a lot of us have similar use cases.


On Tue, Mar 26, 2019 at 10:11 AM Chris Lundeberg <[email protected]>
wrote:

> Hi all,
>
>  I hope this message finds everyone well. My company is starting to
> build a few custom solutions using NiFi for a few clients.  We want to
> be more involved in the NiFi community and start contributing back some
> of the work we have done. We have a few processors that we have created
> and pushed to open repos, but would like to try to get some of them
> built into the base NiFi distro, if possible.  We are doing a lot of
> research now to understand what that looks like, and I think we are
> ready to start picking up and creating Jira tickets.  My main question
> for this thread is about new processors: if we have several that we
> think could be a good addition, is there some kind of voting process
> that might help us understand which ones would actually be of value to
> the greater community, or is that just decided on a PR basis?  Some of
> the example processors that we have created / are creating are:
>
> 1. *EncryptValue* - Reads a list of values from an attribute and loops over
> the keys within the data.  As it finds matches, it hashes the value based
> on the hash type the user selects (we support all the normal ones).
> 2. *StandardizeDate* - Reads a key/value pair from an attribute and loops
> over the keys within the incoming data.  If it finds a match, it will
> standardize the value of that key as ISO-8601.
> 3. *AvroBulkInsert* - We utilize the bulk insert functionality within MSSQL
> to insert incoming Avro files.
> 4. *GetColumns* - A user selects the controller service and database type,
> and we fetch the columns from the database/schema.table provided and attach
> them as a comma-separated value on an attribute or the flowfile.
>
> Any advice/suggestions would be greatly appreciated. Thanks!
>
>
> Chris Lundeberg
> *Modern Data Engineer / Data Engineering Practice Lead*
> <https://1904labs.com/>
>
