Yes and no. You can do the majority of the ingestion work from NiFi by
having NiFi write your records as Parquet files and upload them to S3,
then run a small Spark job to integrate them into your existing
Delta Lake. I've done a demo of that before (can't share the code) and
it was pretty easy to write. NiFi does most of the work of converting
and cleaning up the input; the Spark job just reads the Parquet
files and appends them to your Delta Lake.
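The Spark side of that pipeline can be sketched in a few lines of PySpark. This is a minimal illustration, not the demo code: the bucket names and table path are placeholders, and it assumes the delta-spark package is on the classpath and your S3 credentials are already configured.

```python
from pyspark.sql import SparkSession

# Delta-enabled session; these two config keys are the standard
# Delta Lake extension settings (requires the delta-spark package).
spark = (
    SparkSession.builder
    .appName("nifi-parquet-to-delta")
    .config("spark.sql.extensions",
            "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Read the Parquet files NiFi uploaded (path is a placeholder).
incoming = spark.read.parquet("s3a://my-bucket/incoming/")

# Append them to the existing Delta table (path is a placeholder).
(incoming.write
    .format("delta")
    .mode("append")
    .save("s3a://my-bucket/delta/my_table"))
```

You could schedule a job like this to run after each NiFi batch, or swap the batch read for Structured Streaming if you need lower latency.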

On Fri, Jan 7, 2022 at 1:41 AM Hao Wang <[email protected]> wrote:
>
> Dear NiFi devs :
>
> I'm new to NiFi, and I want to know if NiFi supports Data Lake or Streaming
> Platform?
>
> Bravo !
> Hao Wang