Hey folks,

I added PutHDFS to my workflow, pointed it at core-site.xml and
hdfs-site.xml, and it crashed. Restarting doesn't seem to help: the
browser says NiFi is loading, and then it fails to load.

How do I remove the PutHDFS processor without the web UI?
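
For what it's worth, the approach I'm considering (a guess on my part, since I haven't found docs on this) is to stop NiFi and hand-edit conf/flow.xml.gz to drop the PutHDFS <processor> element. The sketch below runs against a throwaway miniature flow.xml so it's safe to try anywhere; the NiFi home path, the exact element layout, and the GNU sed usage are all my assumptions:

```shell
set -eu
# Sketch only: work on a made-up miniature flow in a temp dir.
# On a real install the file is $NIFI_HOME/conf/flow.xml.gz (back it up!)
# and NiFi must be stopped first (bin/nifi.sh stop).
workdir=$(mktemp -d)
cat > "$workdir/flow.xml" <<'EOF'
<flowController>
  <rootGroup>
    <processor>
      <id>4cc42c80-e37f-4122-a6f2-0c3b61ffc00b</id>
      <class>org.apache.nifi.processors.hadoop.PutHDFS</class>
    </processor>
    <processor>
      <id>11111111-2222-3333-4444-555555555555</id>
      <class>org.apache.nifi.processors.standard.GetFile</class>
    </processor>
  </rootGroup>
</flowController>
EOF
gzip "$workdir/flow.xml"          # stand-in for conf/flow.xml.gz
gunzip -k "$workdir/flow.xml.gz"  # decompress, keep the .gz as a backup
# Delete the whole <processor>...</processor> block containing the
# offending id (GNU sed; a text editor may be less fragile):
sed -i '/<processor>/{:a;N;/<\/processor>/!ba;/4cc42c80-e37f-4122-a6f2-0c3b61ffc00b/d}' \
    "$workdir/flow.xml"
gzip -f "$workdir/flow.xml"       # re-compress so NiFi can load it
gunzip -c "$workdir/flow.xml.gz"  # show the surviving flow
```

Alternatively, I believe moving flow.xml.gz aside entirely would let NiFi start with an empty flow, if losing the rest of the graph is acceptable.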

Errors in the log are:

o.a.n.p.PersistentProvenanceRepository Failed to rollover Provenance
repository due to java.lang.OutOfMemoryError: PermGen space

o.apache.nifi.processors.hadoop.PutHDFS
[PutHDFS[id=4cc42c80-e37f-4122-a6f2-0c3b61ffc00b]] Failed to write to HDFS
due to {}
java.lang.OutOfMemoryError: PermGen space
    at java.lang.ClassLoader.defineClass1(Native Method) ~[na:1.7.0_67]
    at java.lang.ClassLoader.defineClass(ClassLoader.java:800) ~[na:1.7.0_67]
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142) ~[na:1.7.0_67]
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:449) ~[na:1.7.0_67]
    at java.net.URLClassLoader.access$100(URLClassLoader.java:71) ~[na:1.7.0_67]
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361) ~[na:1.7.0_67]
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355) ~[na:1.7.0_67]
    at java.security.AccessController.doPrivileged(Native Method) ~[na:1.7.0_67]
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354) ~[na:1.7.0_67]
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425) ~[na:1.7.0_67]
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358) ~[na:1.7.0_67]
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:755) ~[hadoop-hdfs-2.6.0.jar:na]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.7.0_67]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[na:1.7.0_67]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.7.0_67]
    at java.lang.reflect.Method.invoke(Method.java:606) ~[na:1.7.0_67]
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.0.jar:na]
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.0.jar:na]
    at com.sun.proxy.$Proxy156.getFileInfo(Unknown Source) ~[na:na]
    at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1988) ~[hadoop-hdfs-2.6.0.jar:na]
    at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1118) ~[hadoop-hdfs-2.6.0.jar:na]
    at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1114) ~[hadoop-hdfs-2.6.0.jar:na]
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) ~[hadoop-common-2.6.0.jar:na]
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1114) ~[hadoop-hdfs-2.6.0.jar:na]
    at org.apache.nifi.processors.hadoop.PutHDFS.onTrigger(PutHDFS.java:232) ~[nifi-hdfs-processors-0.2.1.jar:0.2.1]
    at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27) ~[nifi-api-0.2.1.jar:0.2.1]
    at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1077) ~[na:na]
    at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:127) ~[na:na]
    at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:49) ~[na:na]
    at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:119) ~[na:na]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) ~[na:1.7.0_67]
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:304) ~[na:1.7.0_67]
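
Side question: since I'm on Java 7 (1.7.0_67 per the trace) and both errors are PermGen exhaustion, would raising the PermGen limits in conf/bootstrap.conf be the real fix? I'm thinking of something like the following (the java.arg numbers are my guess; use whichever indices are free in your file, and note PermGen no longer exists on Java 8+):

```
# conf/bootstrap.conf -- these arg numbers are placeholders
java.arg.8=-XX:PermSize=128M
java.arg.9=-XX:MaxPermSize=256M
```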
