Shahzad,

Joe is correct in that we do not have anything that maps directly to this
data stream source.

As a means of getting the data into a NiFi flow, you could also consider the
ExecuteProcess processor:
https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi.processors.standard.ExecuteProcess/index.html
I have attached a template that uses an ExecuteProcess instance to perform
the functionality you are interested in.

From the template description:

This template makes use of ExecuteProcess to invoke curl and perform
batching on the streaming response provided by pushshift.io.  These are
batched into 1s intervals. This is inexact, and as a result, some results
may get truncated depending on time boundaries.  We perform a very naive
RouteOnContent to filter out those events without the data payload.
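To make the template's split-and-filter steps concrete, here is a small,
self-contained Java sketch (not part of the template) of what SplitContent
(byte sequence 0x0A0x0A) and RouteOnContent (regex matching "data:") do to
one 1s ExecuteProcess batch. The sample input string is hypothetical.

```java
import java.util.ArrayList;
import java.util.List;

public class EventBatchDemo {
    // Split one batch on blank lines (what SplitContent does with 0x0A0x0A)
    // and keep only the chunks that carry a data payload (what the naive
    // RouteOnContent does with its .*data:.* property).
    static List<String> completeEvents(String batch) {
        List<String> events = new ArrayList<>();
        for (String chunk : batch.split("\n\n")) {
            if (chunk.contains("data:")) {
                events.add(chunk.trim());
            }
        }
        return events;
    }

    public static void main(String[] args) {
        // Hypothetical 1s batch: two events plus a keepalive comment line.
        String batch = "data: {\"body\":\"hello\"}\n\n: keepalive\n\ndata: {\"body\":\"world\"}\n\n";
        System.out.println(completeEvents(batch));
    }
}
```

Note that an event sliced in half by the 1s batch boundary simply fails the
"data:" check in its truncated half, which is exactly how the template ends
up discarding boundary-straddling events.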


This gives a nice proof of concept for how you could interact with the data
source before diving into a custom processor.  As Joe mentioned, a custom
processor could handle the data format with awareness of event boundaries,
which would potentially obviate the need for the included SplitContent
processor.  The attached template will discard some events that fall on
those time batch boundaries.
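As a rough illustration of the boundary-aware handling such a custom
processor could do (a sketch, not NiFi API code): buffer the incoming bytes,
emit only complete blank-line-terminated events, and carry any partial tail
over to the next read, so nothing is lost at batch boundaries.

```java
import java.util.ArrayList;
import java.util.List;

public class EventBuffer {
    // Partial event carried over from the previous chunk.
    private final StringBuilder carry = new StringBuilder();

    // Feed one chunk as read from the stream; returns only the events that
    // are complete (terminated by a blank line). A trailing partial event
    // stays buffered until a later chunk completes it.
    public List<String> feed(String chunk) {
        carry.append(chunk);
        List<String> events = new ArrayList<>();
        int boundary;
        while ((boundary = carry.indexOf("\n\n")) >= 0) {
            events.add(carry.substring(0, boundary));
            carry.delete(0, boundary + 2);
        }
        return events;
    }
}
```

Inside a real processor each returned event would become the content of a
FlowFile; feeding "data: A" yields nothing, and feeding "\n\ndata: B\n\n"
afterwards yields both "data: A" and "data: B" intact.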

Let us know if you have any more questions, or if you are seeing a number of
"similar" data APIs that could potentially be supported generically from a
project standpoint.

--aldrin

On Mon, Jan 11, 2016 at 11:14 AM, Joe Percivall <
[email protected]> wrote:

> Hello Shahzad,
>
> Unfortunately the "stream" functionality of pushshift.io doesn't fit into
> any current NiFi processor. Processors work by having an "onTrigger" method
> that is used to create FlowFiles with each call. This works nicely for
> aspects of the pushshift.io API like "
> https://api.pushshift.io/reddit/search?q=Einstein&limit=100" where it
> returns a single "unit" of information with each HTTP request. If you are
> able to get the same information you need using the base "api" call for
> pushshift instead of "stream", that would work best.
>
> Otherwise, you may be able to create a custom processor around your Java
> code, although it may be pretty difficult. You would need to translate the
> stream into chunks of information that would be put into the contents of
> FlowFiles and routed to a relationship using session.transfer(). For more
> information on creating a custom processor, check out the developer guide:
> https://nifi.apache.org/developer-guide.html
>
> Does either of those help, or is a general processor that streams over
> HTTP necessary?
>
> Joe
> - - - - - -
> Joseph Percivall
> linkedin.com/in/Percivall
> e: [email protected]
>
>
>
> On Friday, January 8, 2016 11:16 AM, Shahzad K <[email protected]> wrote:
>
>
>
> Hi
>
> My name is Shahzad Karamat. I am trying to read some tweets from
> http://stream.pushshift.io/ into NiFi.
> I am using a Mac and can read the stream using curl -i '
> http://stream.pushshift.io/?subreddit=askreddit'
> I can get the stream into my terminal, and I have also developed a system
> to read it using Java code.
> The question is:
> For the strings I read from http://stream.pushshift.io/ using Java, how
> can I make FlowFiles from this stream and transfer them to a certain
> relationship?
>
> Regards
>
> Shahzad K
>
<?xml version="1.0" encoding="UTF-8" standalone="yes"?><template><description>This template makes use of ExecuteProcess to invoke curl and perform batching on the streaming response provided by pushshift.io.  These are batched into 1s intervals. This is inexact, and as a result, some results may get truncated depending on time boundaries.  We perform a very naive RouteOnContent to filter out those events without the data payload.</description><name>Retrieve Data from pushshift.io Streaming</name><snippet><connections><id>595462e7-e7ee-41eb-b77b-1aa11ebe8d2e</id><parentGroupId>fa08eb86-1e2f-4879-b0e8-96ad9add8cee</parentGroupId><backPressureDataSizeThreshold>0 MB</backPressureDataSizeThreshold><backPressureObjectThreshold>0</backPressureObjectThreshold><destination><groupId>fa08eb86-1e2f-4879-b0e8-96ad9add8cee</groupId><id>839f733d-cf04-4cd6-af6a-0a6c75cc4567</id><type>PROCESSOR</type></destination><flowFileExpiration>0 sec</flowFileExpiration><labelIndex>1</labelIndex><name></name><selectedRelationships>full.events</selectedRelationships><source><groupId>fa08eb86-1e2f-4879-b0e8-96ad9add8cee</groupId><id>975703db-4912-49e7-b8a1-d670e059a03b</id><type>PROCESSOR</type></source><zIndex>0</zIndex></connections><connections><id>99b5737c-398b-4551-af00-06a99ba9ce9d</id><parentGroupId>fa08eb86-1e2f-4879-b0e8-96ad9add8cee</parentGroupId><backPressureDataSizeThreshold>0 MB</backPressureDataSizeThreshold><backPressureObjectThreshold>0</backPressureObjectThreshold><destination><groupId>fa08eb86-1e2f-4879-b0e8-96ad9add8cee</groupId><id>975703db-4912-49e7-b8a1-d670e059a03b</id><type>PROCESSOR</type></destination><flowFileExpiration>0 
sec</flowFileExpiration><labelIndex>1</labelIndex><name></name><selectedRelationships>splits</selectedRelationships><source><groupId>fa08eb86-1e2f-4879-b0e8-96ad9add8cee</groupId><id>ae019332-fc08-4272-a136-c3889d29f6d4</id><type>PROCESSOR</type></source><zIndex>0</zIndex></connections><connections><id>a911c7e8-47ad-4a09-810b-2c3495056d03</id><parentGroupId>fa08eb86-1e2f-4879-b0e8-96ad9add8cee</parentGroupId><backPressureDataSizeThreshold>0 MB</backPressureDataSizeThreshold><backPressureObjectThreshold>0</backPressureObjectThreshold><destination><groupId>fa08eb86-1e2f-4879-b0e8-96ad9add8cee</groupId><id>ae019332-fc08-4272-a136-c3889d29f6d4</id><type>PROCESSOR</type></destination><flowFileExpiration>0 sec</flowFileExpiration><labelIndex>1</labelIndex><name></name><selectedRelationships>success</selectedRelationships><source><groupId>fa08eb86-1e2f-4879-b0e8-96ad9add8cee</groupId><id>034a76df-22a9-4052-be68-2d49425a290f</id><type>PROCESSOR</type></source><zIndex>0</zIndex></connections><connections><id>9fbb6b8d-6f8c-4aa7-b1f7-d64e781c97ea</id><parentGroupId>fa08eb86-1e2f-4879-b0e8-96ad9add8cee</parentGroupId><backPressureDataSizeThreshold>0 MB</backPressureDataSizeThreshold><backPressureObjectThreshold>0</backPressureObjectThreshold><destination><groupId>fa08eb86-1e2f-4879-b0e8-96ad9add8cee</groupId><id>a2608424-8eec-4a4c-ac7b-1c074ae8c800</id><type>PROCESSOR</type></destination><flowFileExpiration>0 sec</flowFileExpiration><labelIndex>0</labelIndex><name>Incomplete 
Events</name><selectedRelationships>unmatched</selectedRelationships><source><groupId>fa08eb86-1e2f-4879-b0e8-96ad9add8cee</groupId><id>975703db-4912-49e7-b8a1-d670e059a03b</id><type>PROCESSOR</type></source><zIndex>0</zIndex></connections><processors><id>975703db-4912-49e7-b8a1-d670e059a03b</id><parentGroupId>fa08eb86-1e2f-4879-b0e8-96ad9add8cee</parentGroupId><position><x>2002.9272939105067</x><y>578.2776955537918</y></position><config><bulletinLevel>WARN</bulletinLevel><comments></comments><concurrentlySchedulableTaskCount>1</concurrentlySchedulableTaskCount><defaultConcurrentTasks><entry><key>TIMER_DRIVEN</key><value>1</value></entry><entry><key>EVENT_DRIVEN</key><value>0</value></entry><entry><key>CRON_DRIVEN</key><value>1</value></entry></defaultConcurrentTasks><defaultSchedulingPeriod><entry><key>TIMER_DRIVEN</key><value>0 sec</value></entry><entry><key>CRON_DRIVEN</key><value>* * * * * ?</value></entry></defaultSchedulingPeriod><descriptors><entry><key>Match Requirement</key><value><allowableValues><displayName>content must match exactly</displayName><value>content must match exactly</value></allowableValues><allowableValues><displayName>content must contain match</displayName><value>content must contain match</value></allowableValues><defaultValue>content must match exactly</defaultValue><description>Specifies whether the entire content of the file must match the regular expression exactly, or if any part of the file (up to Content Buffer Size) can contain the regular expression in order to be considered a match</description><displayName>Match Requirement</displayName><dynamic>false</dynamic><name>Match Requirement</name><required>true</required><sensitive>false</sensitive><supportsEl>false</supportsEl></value></entry><entry><key>Character Set</key><value><defaultValue>UTF-8</defaultValue><description>The Character Set in which the file is encoded</description><displayName>Character Set</displayName><dynamic>false</dynamic><name>Character 
Set</name><required>true</required><sensitive>false</sensitive><supportsEl>false</supportsEl></value></entry><entry><key>Content Buffer Size</key><value><defaultValue>1 MB</defaultValue><description>Specifies the maximum amount of data to buffer in order to apply the regular expressions. If the size of the FlowFile exceeds this value, any amount of this value will be ignored</description><displayName>Content Buffer Size</displayName><dynamic>false</dynamic><name>Content Buffer Size</name><required>true</required><sensitive>false</sensitive><supportsEl>false</supportsEl></value></entry><entry><key>full.events</key><value><description></description><displayName>full.events</displayName><dynamic>true</dynamic><name>full.events</name><required>false</required><sensitive>false</sensitive><supportsEl>true</supportsEl></value></entry></descriptors><lossTolerant>false</lossTolerant><penaltyDuration>30 sec</penaltyDuration><properties><entry><key>Match Requirement</key><value>content must contain match</value></entry><entry><key>Character Set</key></entry><entry><key>Content Buffer Size</key></entry><entry><key>full.events</key><value>.*data:.*</value></entry></properties><runDurationMillis>0</runDurationMillis><schedulingPeriod>0 sec</schedulingPeriod><schedulingStrategy>TIMER_DRIVEN</schedulingStrategy><yieldDuration>1 sec</yieldDuration></config><name>Select Events where &quot;data&quot; exists</name><relationships><autoTerminate>false</autoTerminate><description></description><name>full.events</name></relationships><relationships><autoTerminate>false</autoTerminate><description>FlowFiles that do not match any of the user-supplied regular expressions will be routed to this 
relationship</description><name>unmatched</name></relationships><state>RUNNING</state><style/><supportsEventDriven>true</supportsEventDriven><supportsParallelProcessing>true</supportsParallelProcessing><type>org.apache.nifi.processors.standard.RouteOnContent</type></processors><processors><id>839f733d-cf04-4cd6-af6a-0a6c75cc4567</id><parentGroupId>fa08eb86-1e2f-4879-b0e8-96ad9add8cee</parentGroupId><position><x>1733.643333800328</x><y>764.0436983489815</y></position><config><bulletinLevel>INFO</bulletinLevel><comments></comments><concurrentlySchedulableTaskCount>1</concurrentlySchedulableTaskCount><defaultConcurrentTasks><entry><key>TIMER_DRIVEN</key><value>1</value></entry><entry><key>EVENT_DRIVEN</key><value>0</value></entry><entry><key>CRON_DRIVEN</key><value>1</value></entry></defaultConcurrentTasks><defaultSchedulingPeriod><entry><key>TIMER_DRIVEN</key><value>0 sec</value></entry><entry><key>CRON_DRIVEN</key><value>* * * * * ?</value></entry></defaultSchedulingPeriod><descriptors><entry><key>Log Level</key><value><allowableValues><displayName>trace</displayName><value>trace</value></allowableValues><allowableValues><displayName>debug</displayName><value>debug</value></allowableValues><allowableValues><displayName>info</displayName><value>info</value></allowableValues><allowableValues><displayName>warn</displayName><value>warn</value></allowableValues><allowableValues><displayName>error</displayName><value>error</value></allowableValues><defaultValue>info</defaultValue><description>The Log Level to use when logging the Attributes</description><displayName>Log Level</displayName><dynamic>false</dynamic><name>Log Level</name><required>true</required><sensitive>false</sensitive><supportsEl>false</supportsEl></value></entry><entry><key>Log 
Payload</key><value><allowableValues><displayName>true</displayName><value>true</value></allowableValues><allowableValues><displayName>false</displayName><value>false</value></allowableValues><defaultValue>false</defaultValue><description>If true, the FlowFile's payload will be logged, in addition to its attributes; otherwise, just the Attributes will be logged.</description><displayName>Log Payload</displayName><dynamic>false</dynamic><name>Log Payload</name><required>true</required><sensitive>false</sensitive><supportsEl>false</supportsEl></value></entry><entry><key>Attributes to Log</key><value><description>A comma-separated list of Attributes to Log. If not specified, all attributes will be logged.</description><displayName>Attributes to Log</displayName><dynamic>false</dynamic><name>Attributes to Log</name><required>false</required><sensitive>false</sensitive><supportsEl>false</supportsEl></value></entry><entry><key>Attributes to Ignore</key><value><description>A comma-separated list of Attributes to ignore. If not specified, no attributes will be ignored.</description><displayName>Attributes to Ignore</displayName><dynamic>false</dynamic><name>Attributes to Ignore</name><required>false</required><sensitive>false</sensitive><supportsEl>false</supportsEl></value></entry><entry><key>Log prefix</key><value><description>Log prefix appended to the log lines. 
It helps to distinguish the output of multiple LogAttribute processors.</description><displayName>Log prefix</displayName><dynamic>false</dynamic><name>Log prefix</name><required>false</required><sensitive>false</sensitive><supportsEl>true</supportsEl></value></entry></descriptors><lossTolerant>false</lossTolerant><penaltyDuration>30 sec</penaltyDuration><properties><entry><key>Log Level</key></entry><entry><key>Log Payload</key><value>true</value></entry><entry><key>Attributes to Log</key></entry><entry><key>Attributes to Ignore</key></entry><entry><key>Log prefix</key></entry></properties><runDurationMillis>0</runDurationMillis><schedulingPeriod>0 sec</schedulingPeriod><schedulingStrategy>TIMER_DRIVEN</schedulingStrategy><yieldDuration>1 sec</yieldDuration></config><name>LogAttributes and Content</name><relationships><autoTerminate>true</autoTerminate><description>All FlowFiles are routed to this relationship</description><name>success</name></relationships><state>RUNNING</state><style/><supportsEventDriven>true</supportsEventDriven><supportsParallelProcessing>true</supportsParallelProcessing><type>org.apache.nifi.processors.standard.LogAttribute</type></processors><processors><id>034a76df-22a9-4052-be68-2d49425a290f</id><parentGroupId>fa08eb86-1e2f-4879-b0e8-96ad9add8cee</parentGroupId><position><x>1997.1483063564872</x><y>239.77007046872853</y></position><config><bulletinLevel>WARN</bulletinLevel><comments></comments><concurrentlySchedulableTaskCount>1</concurrentlySchedulableTaskCount><defaultConcurrentTasks><entry><key>TIMER_DRIVEN</key><value>1</value></entry><entry><key>EVENT_DRIVEN</key><value>0</value></entry><entry><key>CRON_DRIVEN</key><value>1</value></entry></defaultConcurrentTasks><defaultSchedulingPeriod><entry><key>TIMER_DRIVEN</key><value>0 sec</value></entry><entry><key>CRON_DRIVEN</key><value>* * * * * ?</value></entry></defaultSchedulingPeriod><descriptors><entry><key>Command</key><value><description>Specifies the command to be executed; if 
just the name of an executable is provided, it must be in the user's environment PATH.</description><displayName>Command</displayName><dynamic>false</dynamic><name>Command</name><required>true</required><sensitive>false</sensitive><supportsEl>false</supportsEl></value></entry><entry><key>Command Arguments</key><value><description>The arguments to supply to the executable delimited by white space. White space can be escaped by enclosing it in double-quotes.</description><displayName>Command Arguments</displayName><dynamic>false</dynamic><name>Command Arguments</name><required>false</required><sensitive>false</sensitive><supportsEl>false</supportsEl></value></entry><entry><key>Batch Duration</key><value><description>If the process is expected to be long-running and produce textual output, a batch duration can be specified so that the output will be captured for this amount of time and a FlowFile will then be sent out with the results and a new FlowFile will be started, rather than waiting for the process to finish before sending out the results</description><displayName>Batch Duration</displayName><dynamic>false</dynamic><name>Batch Duration</name><required>false</required><sensitive>false</sensitive><supportsEl>false</supportsEl></value></entry><entry><key>Redirect Error Stream</key><value><allowableValues><displayName>true</displayName><value>true</value></allowableValues><allowableValues><displayName>false</displayName><value>false</value></allowableValues><defaultValue>false</defaultValue><description>If true will redirect any error stream output of the process to the output stream. 
This is particularly helpful for processes which write extensively to the error stream or for troubleshooting.</description><displayName>Redirect Error Stream</displayName><dynamic>false</dynamic><name>Redirect Error Stream</name><required>false</required><sensitive>false</sensitive><supportsEl>false</supportsEl></value></entry><entry><key>Argument Delimiter</key><value><defaultValue> </defaultValue><description>Delimiter to use to separate arguments for a command [default: space]. Must be a single character.</description><displayName>Argument Delimiter</displayName><dynamic>false</dynamic><name>Argument Delimiter</name><required>true</required><sensitive>false</sensitive><supportsEl>false</supportsEl></value></entry></descriptors><lossTolerant>false</lossTolerant><penaltyDuration>30 sec</penaltyDuration><properties><entry><key>Command</key><value>curl</value></entry><entry><key>Command Arguments</key><value>http://stream.pushshift.io/?subreddit=askreddit</value></entry><entry><key>Batch Duration</key><value>1s</value></entry><entry><key>Redirect Error Stream</key></entry><entry><key>Argument Delimiter</key></entry></properties><runDurationMillis>0</runDurationMillis><schedulingPeriod>0 sec</schedulingPeriod><schedulingStrategy>TIMER_DRIVEN</schedulingStrategy><yieldDuration>1 sec</yieldDuration></config><name>curl pushshift.io (1s batch)</name><relationships><autoTerminate>false</autoTerminate><description>All created FlowFiles are routed to this 
relationship</description><name>success</name></relationships><state>RUNNING</state><style/><supportsEventDriven>false</supportsEventDriven><supportsParallelProcessing>true</supportsParallelProcessing><type>org.apache.nifi.processors.standard.ExecuteProcess</type></processors><processors><id>ae019332-fc08-4272-a136-c3889d29f6d4</id><parentGroupId>fa08eb86-1e2f-4879-b0e8-96ad9add8cee</parentGroupId><position><x>2004.0138758487847</x><y>407.8823175586383</y></position><config><bulletinLevel>WARN</bulletinLevel><comments></comments><concurrentlySchedulableTaskCount>1</concurrentlySchedulableTaskCount><defaultConcurrentTasks><entry><key>TIMER_DRIVEN</key><value>1</value></entry><entry><key>EVENT_DRIVEN</key><value>0</value></entry><entry><key>CRON_DRIVEN</key><value>1</value></entry></defaultConcurrentTasks><defaultSchedulingPeriod><entry><key>TIMER_DRIVEN</key><value>0 sec</value></entry><entry><key>CRON_DRIVEN</key><value>* * * * * ?</value></entry></defaultSchedulingPeriod><descriptors><entry><key>Byte Sequence Format</key><value><allowableValues><description>The Byte Sequence will be interpreted as a hexadecimal representation of bytes</description><displayName>Hexadecimal</displayName><value>Hexadecimal</value></allowableValues><allowableValues><description>The Byte Sequence will be interpreted as UTF-8 Encoded text</description><displayName>Text</displayName><value>Text</value></allowableValues><defaultValue>Hexadecimal</defaultValue><description>Specifies how the &lt;Byte Sequence&gt; property should be interpreted</description><displayName>Byte Sequence Format</displayName><dynamic>false</dynamic><name>Byte Sequence Format</name><required>true</required><sensitive>false</sensitive><supportsEl>false</supportsEl></value></entry><entry><key>Byte Sequence</key><value><description>A representation of bytes to look for and upon which to split the source file into separate files</description><displayName>Byte Sequence</displayName><dynamic>false</dynamic><name>Byte 
Sequence</name><required>true</required><sensitive>false</sensitive><supportsEl>false</supportsEl></value></entry><entry><key>Keep Byte Sequence</key><value><allowableValues><displayName>true</displayName><value>true</value></allowableValues><allowableValues><displayName>false</displayName><value>false</value></allowableValues><defaultValue>false</defaultValue><description>Determines whether or not the Byte Sequence should be included with each Split</description><displayName>Keep Byte Sequence</displayName><dynamic>false</dynamic><name>Keep Byte Sequence</name><required>true</required><sensitive>false</sensitive><supportsEl>false</supportsEl></value></entry><entry><key>Byte Sequence Location</key><value><allowableValues><description>Keep the Byte Sequence at the end of the first split if &lt;Keep Byte Sequence&gt; is true</description><displayName>Trailing</displayName><value>Trailing</value></allowableValues><allowableValues><description>Keep the Byte Sequence at the beginning of the second split if &lt;Keep Byte Sequence&gt; is true</description><displayName>Leading</displayName><value>Leading</value></allowableValues><defaultValue>Trailing</defaultValue><description>If &lt;Keep Byte Sequence&gt; is set to true, specifies whether the byte sequence should be added to the end of the first split or the beginning of the second; if &lt;Keep Byte Sequence&gt; is false, this property is ignored.</description><displayName>Byte Sequence Location</displayName><dynamic>false</dynamic><name>Byte Sequence Location</name><required>true</required><sensitive>false</sensitive><supportsEl>false</supportsEl></value></entry></descriptors><lossTolerant>false</lossTolerant><penaltyDuration>30 sec</penaltyDuration><properties><entry><key>Byte Sequence Format</key><value>Hexadecimal</value></entry><entry><key>Byte Sequence</key><value>0A0A</value></entry><entry><key>Keep Byte Sequence</key></entry><entry><key>Byte Sequence 
Location</key></entry></properties><runDurationMillis>0</runDurationMillis><schedulingPeriod>0 sec</schedulingPeriod><schedulingStrategy>TIMER_DRIVEN</schedulingStrategy><yieldDuration>1 sec</yieldDuration></config><name>Split into Events (0x0A0x0A delimited)</name><relationships><autoTerminate>true</autoTerminate><description>The original file</description><name>original</name></relationships><relationships><autoTerminate>false</autoTerminate><description>All Splits will be routed to the splits relationship</description><name>splits</name></relationships><state>RUNNING</state><style/><supportsEventDriven>true</supportsEventDriven><supportsParallelProcessing>true</supportsParallelProcessing><type>org.apache.nifi.processors.standard.SplitContent</type></processors><processors><id>a2608424-8eec-4a4c-ac7b-1c074ae8c800</id><parentGroupId>fa08eb86-1e2f-4879-b0e8-96ad9add8cee</parentGroupId><position><x>2293.402855284703</x><y>759.824582138044</y></position><config><bulletinLevel>INFO</bulletinLevel><comments></comments><concurrentlySchedulableTaskCount>1</concurrentlySchedulableTaskCount><defaultConcurrentTasks><entry><key>TIMER_DRIVEN</key><value>1</value></entry><entry><key>EVENT_DRIVEN</key><value>0</value></entry><entry><key>CRON_DRIVEN</key><value>1</value></entry></defaultConcurrentTasks><defaultSchedulingPeriod><entry><key>TIMER_DRIVEN</key><value>0 sec</value></entry><entry><key>CRON_DRIVEN</key><value>* * * * * ?</value></entry></defaultSchedulingPeriod><descriptors><entry><key>Log 
Level</key><value><allowableValues><displayName>trace</displayName><value>trace</value></allowableValues><allowableValues><displayName>debug</displayName><value>debug</value></allowableValues><allowableValues><displayName>info</displayName><value>info</value></allowableValues><allowableValues><displayName>warn</displayName><value>warn</value></allowableValues><allowableValues><displayName>error</displayName><value>error</value></allowableValues><defaultValue>info</defaultValue><description>The Log Level to use when logging the Attributes</description><displayName>Log Level</displayName><dynamic>false</dynamic><name>Log Level</name><required>true</required><sensitive>false</sensitive><supportsEl>false</supportsEl></value></entry><entry><key>Log Payload</key><value><allowableValues><displayName>true</displayName><value>true</value></allowableValues><allowableValues><displayName>false</displayName><value>false</value></allowableValues><defaultValue>false</defaultValue><description>If true, the FlowFile's payload will be logged, in addition to its attributes; otherwise, just the Attributes will be logged.</description><displayName>Log Payload</displayName><dynamic>false</dynamic><name>Log Payload</name><required>true</required><sensitive>false</sensitive><supportsEl>false</supportsEl></value></entry><entry><key>Attributes to Log</key><value><description>A comma-separated list of Attributes to Log. If not specified, all attributes will be logged.</description><displayName>Attributes to Log</displayName><dynamic>false</dynamic><name>Attributes to Log</name><required>false</required><sensitive>false</sensitive><supportsEl>false</supportsEl></value></entry><entry><key>Attributes to Ignore</key><value><description>A comma-separated list of Attributes to ignore. 
If not specified, no attributes will be ignored.</description><displayName>Attributes to Ignore</displayName><dynamic>false</dynamic><name>Attributes to Ignore</name><required>false</required><sensitive>false</sensitive><supportsEl>false</supportsEl></value></entry><entry><key>Log prefix</key><value><description>Log prefix appended to the log lines. It helps to distinguish the output of multiple LogAttribute processors.</description><displayName>Log prefix</displayName><dynamic>false</dynamic><name>Log prefix</name><required>false</required><sensitive>false</sensitive><supportsEl>true</supportsEl></value></entry></descriptors><lossTolerant>false</lossTolerant><penaltyDuration>30 sec</penaltyDuration><properties><entry><key>Log Level</key></entry><entry><key>Log Payload</key><value>true</value></entry><entry><key>Attributes to Log</key></entry><entry><key>Attributes to Ignore</key></entry><entry><key>Log prefix</key></entry></properties><runDurationMillis>0</runDurationMillis><schedulingPeriod>0 sec</schedulingPeriod><schedulingStrategy>TIMER_DRIVEN</schedulingStrategy><yieldDuration>1 sec</yieldDuration></config><name>LogAttributes and Content</name><relationships><autoTerminate>true</autoTerminate><description>All FlowFiles are routed to this relationship</description><name>success</name></relationships><state>RUNNING</state><style/><supportsEventDriven>true</supportsEventDriven><supportsParallelProcessing>true</supportsParallelProcessing><type>org.apache.nifi.processors.standard.LogAttribute</type></processors></snippet><timestamp>01/11/2016 11:59:19 EST</timestamp></template>
