I worked on a team that packaged NiFi for distribution so people could use
the flow as a service. To make that easy, they exported the flow.xml.gz file
and added it to a custom Docker image, which made deployment essentially a
lift-and-shift operation. Once you do something like that, you can have a
tool like Ansible talk to the NiFi installation to update variables and other
settings specific to each location.
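
For example, here is a minimal sketch of the kind of per-location variable
update an Ansible task could drive against NiFi's REST API. The host, process
group ID, and variable name are placeholders, authentication is omitted, and
the payload shape should be checked against your NiFi version's REST API docs:

import requests

NIFI = "https://nifi.example.internal:8443/nifi-api"   # hypothetical host
PG_ID = "target-process-group-id"                      # placeholder

def set_variable(name, value, verify="/path/to/ca.pem"):
    # Fetch the current registry so we can echo back the revision NiFi expects.
    reg = requests.get(f"{NIFI}/process-groups/{PG_ID}/variable-registry",
                       verify=verify).json()
    payload = {
        "processGroupRevision": reg["processGroupRevision"],
        "variableRegistry": {
            "processGroupId": PG_ID,
            "variables": [{"variable": {"name": name, "value": value}}],
        },
    }
    requests.put(f"{NIFI}/process-groups/{PG_ID}/variable-registry",
                 json=payload, verify=verify).raise_for_status()

set_variable("site.name", "plant-atlanta")   # example site-specific value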

On Tue, Oct 23, 2018 at 7:56 AM Martijn Dekkers <mart...@dekkers.org.uk>
wrote:

> FYI, we are just about to deploy NiFi to 100+ Windows machines where we
> have to collect data, do some small transforms, and immediately write to
> an S3-compatible storage device. We would eventually like to use MiNiFi,
> but have not yet had the time to look at integrating the AWS processors
> into MiNiFi.
>
> We are managing the deployment of the JVM, NiFi, the actual flow, as well
> as the NiFi config with SaltStack. The flow actions themselves are controlled
> centrally via the API.
>
> Very nice, and very do-able. Solved a couple of real problems for us.
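
A rough sketch of what "controlling flow actions centrally via the API" can
look like: starting or stopping everything in a process group. The host and
group ID are placeholders, and a secured cluster would also need an access
token or client certificate:

import requests

NIFI = "https://nifi.example.internal:8443/nifi-api"   # hypothetical host

def schedule_group(pg_id, running=True, verify="/path/to/ca.pem"):
    # Schedules or unschedules all components in the given process group.
    body = {"id": pg_id, "state": "RUNNING" if running else "STOPPED"}
    requests.put(f"{NIFI}/flow/process-groups/{pg_id}",
                 json=body, verify=verify).raise_for_status()

schedule_group("collection-group-id", running=True)   # placeholder ID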
>
> Martijn
>
>
> On Wed, 17 Oct 2018, at 16:12, Alan O'Regan wrote:
>
> Thanks for the reply, Chris,
>
>
>
> The one concern I have is having to have NiFi running at all of the
> client/source locations. That could be a lot of NiFi instances to manage!
>
> I have considered writing the new rows to a location (in JSON or CSV) so that
> they could either be pushed to a bucket the Azure NiFi can read from, or
> pulled by the Azure NiFi reaching out to those client locations.
>
> Is that a little crazy?
>
>
>
> Alan O’Regan
>
> *Solution Architect*
>
> O: 404 601 6000
>
> C: 404 216 9060
>
>
> www.soltech.net
>
> 950 East Paces Ferry Rd NE, Suite 2400, Atlanta GA 30326
>
>
>
>
> *From:* Chris Herrera <chris.herrer...@gmail.com>
> *Sent:* Saturday, October 13, 2018 4:15 PM
> *To:* users@nifi.apache.org
> *Subject:* Re: Potential Use Case
>
>
>
> This is very much a perfect use case for NiFi. Using site-to-site to link
> up MiNiFi or other NiFi instances is something I have done multiple times,
> in use cases almost exactly like yours. Just make sure that the NiFi
> instances collecting the data at your sites are able to talk to a
> NiFi deployed in Azure (either on IaaS or AKS).
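
A quick connectivity check along those lines, runnable from a collection site
against the Azure instance's site-to-site endpoint. The URL is a placeholder,
and a secured NiFi would also need a token or client certificate:

import requests

AZURE_NIFI = "https://azure-nifi.example.com:8443/nifi-api"   # hypothetical URL

resp = requests.get(f"{AZURE_NIFI}/site-to-site", verify="/path/to/ca.pem")
resp.raise_for_status()
controller = resp.json().get("controller", {})
print("site-to-site reachable; secure:", controller.get("siteToSiteSecure"))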
>
>
>
> Regards,
>
> Chris
>
>
> On Oct 13, 2018, at 2:31 PM, Alan O'Regan <alan.ore...@soltech.net> wrote:
>
> I have a potentially large number of distributed database instances that
> track inventory levels in a number of manufacturing plants.
> These plants are not on a single network. Each plant has a SQL Server
> database that tracks certain inventory levels periodically.
>
> These databases are written to periodically by a Programmable Logic
> Controller that inserts rows with inventory levels.
>
>
>
> I would like to be able to aggregate these levels across all plants into
> a single (think master) database instance hosted in Azure. (We can then
> report against this database.) I am new to NiFi, but I think this might be
> an ideal use case based on what I understand from the docs. Has
> anyone done anything similar to this?
>
>
>
> Thanks in Advance!
>
>
>
> Alan O’Regan
>
> *Solution Architect*
>
> O: 404 601 6000
>
> C: 404 216 9060
>
>
> www.soltech.net
>
> 950 East Paces Ferry Rd NE, Suite 2400, Atlanta GA 30326
>
>
>
>
>
>
