NiFi Versions: 1.2.0 and 1.5.0
Java Version: 1.8.0_162

It appears to me that when I add a PutHDFS processor to the canvas, the
Hadoop classes are loaded along with their dependencies (normal/desired
behavior).  If I then delete the PutHDFS processor, garbage collection
is not able to unload any of the classes that were loaded.  The same
thing happens with every PutHDFS processor that is added and then
deleted.  This results in a slow degradation of NiFi over time with the
way we use NiFi.
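
For what it's worth, here is roughly how we watch this happen.  It is
just a small probe, not anything official: it assumes NiFi's JVM was
started with remote JMX enabled on port 9010 (an example port), and
jstat -class <pid> shows the same trend.

import java.lang.management.ClassLoadingMXBean;
import java.lang.management.ManagementFactory;
import javax.management.MBeanServerConnection;
import javax.management.remote.JMXConnector;
import javax.management.remote.JMXConnectorFactory;
import javax.management.remote.JMXServiceURL;

public class NiFiClassCountProbe {
    public static void main(String[] args) throws Exception {
        // Assumes remote JMX was enabled on port 9010 (illustrative);
        // adjust the URL for your environment.
        JMXServiceURL url = new JMXServiceURL(
                "service:jmx:rmi:///jndi/rmi://localhost:9010/jmxrmi");
        try (JMXConnector connector = JMXConnectorFactory.connect(url)) {
            MBeanServerConnection connection = connector.getMBeanServerConnection();
            ClassLoadingMXBean classLoading = ManagementFactory.newPlatformMXBeanProxy(
                    connection, ManagementFactory.CLASS_LOADING_MXBEAN_NAME,
                    ClassLoadingMXBean.class);
            // Sample the loaded-class count over a minute.  If the PutHDFS
            // classes were unloadable, the loaded count would drop back down
            // after the processors are deleted and a full GC runs; we never
            // see that happen.
            for (int i = 0; i < 6; i++) {
                System.out.printf("loaded=%d unloaded=%d total=%d%n",
                        classLoading.getLoadedClassCount(),
                        classLoading.getUnloadedClassCount(),
                        classLoading.getTotalLoadedClassCount());
                Thread.sleep(10_000L);
            }
        }
    }
}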

Our usage of NiFi does not expose the NiFi interface to those who are
designing data flows.  We have a web interface that allows the user to
define the data they need, and our application builds the data flows for
them.  As we upgrade our application, we make changes that require us to
remove the old data flow and rebuild it with the new changes.  Over
time, we add and remove a lot of processors using the NiFi API, and some
of those processors are HDFS related.
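
For context, here is a rough sketch of the kind of REST calls our
application makes when it rebuilds a flow.  The host, processor id, and
revision numbers below are illustrative only; in practice they come from
earlier API responses.

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class FlowRebuildSketch {
    private static final String NIFI_API = "http://localhost:8080/nifi-api";

    public static void main(String[] args) throws Exception {
        // Hypothetical ids; in our application these are tracked from the
        // responses of earlier create calls.
        String processGroupId = "root";
        String processorId = "old-puthdfs-processor-id";

        // Remove the old PutHDFS processor.  The version parameter must
        // match the processor's current revision.
        HttpURLConnection delete = (HttpURLConnection) new URL(
                NIFI_API + "/processors/" + processorId + "?version=1").openConnection();
        delete.setRequestMethod("DELETE");
        System.out.println("DELETE status: " + delete.getResponseCode());

        // Re-create it with the new configuration (payload trimmed down to
        // the essentials for this example).
        String body = "{\"revision\":{\"version\":0},"
                + "\"component\":{\"type\":\"org.apache.nifi.processors.hadoop.PutHDFS\","
                + "\"name\":\"PutHDFS\"}}";
        HttpURLConnection post = (HttpURLConnection) new URL(
                NIFI_API + "/process-groups/" + processGroupId + "/processors").openConnection();
        post.setRequestMethod("POST");
        post.setRequestProperty("Content-Type", "application/json");
        post.setDoOutput(true);
        try (OutputStream out = post.getOutputStream()) {
            out.write(body.getBytes(StandardCharsets.UTF_8));
        }
        System.out.println("POST status: " + post.getResponseCode());
    }
}

Every upgrade cycle repeats this kind of delete-and-recreate, so the
number of PutHDFS instantiations grows steadily over the life of the
JVM.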

To me, there seems to be a problem with the Hadoop dependency loading
and the processor implementation that prevents the JVM from unloading
those classes when they are no longer needed.

Thanks,

Dann
