Author: buildbot
Date: Wed Oct  4 16:24:02 2017
New Revision: 1019094

Log:
Staging update by buildbot for flume

Added:
    websites/staging/flume/trunk/content/.doctrees/releases/1.8.0.doctree   (with props)
    websites/staging/flume/trunk/content/_sources/releases/1.8.0.txt
    websites/staging/flume/trunk/content/releases/1.8.0.html
Modified:
    websites/staging/flume/trunk/content/   (props changed)
    websites/staging/flume/trunk/content/.buildinfo
    websites/staging/flume/trunk/content/.doctrees/FlumeDeveloperGuide.doctree
    websites/staging/flume/trunk/content/.doctrees/FlumeUserGuide.doctree
    websites/staging/flume/trunk/content/.doctrees/download.doctree
    websites/staging/flume/trunk/content/.doctrees/environment.pickle
    websites/staging/flume/trunk/content/.doctrees/index.doctree
    websites/staging/flume/trunk/content/.doctrees/releases/index.doctree
    websites/staging/flume/trunk/content/FlumeDeveloperGuide.html
    websites/staging/flume/trunk/content/FlumeUserGuide.html
    websites/staging/flume/trunk/content/_sources/FlumeDeveloperGuide.txt
    websites/staging/flume/trunk/content/_sources/FlumeUserGuide.txt
    websites/staging/flume/trunk/content/_sources/download.txt
    websites/staging/flume/trunk/content/_sources/index.txt
    websites/staging/flume/trunk/content/_sources/releases/index.txt
    websites/staging/flume/trunk/content/documentation.html
    websites/staging/flume/trunk/content/download.html
    websites/staging/flume/trunk/content/getinvolved.html
    websites/staging/flume/trunk/content/index.html
    websites/staging/flume/trunk/content/license.html
    websites/staging/flume/trunk/content/mailinglists.html
    websites/staging/flume/trunk/content/project-reports.html
    websites/staging/flume/trunk/content/releases/1.0.0.html
    websites/staging/flume/trunk/content/releases/1.1.0.html
    websites/staging/flume/trunk/content/releases/1.2.0.html
    websites/staging/flume/trunk/content/releases/1.3.0.html
    websites/staging/flume/trunk/content/releases/1.3.1.html
    websites/staging/flume/trunk/content/releases/1.4.0.html
    websites/staging/flume/trunk/content/releases/1.5.0.1.html
    websites/staging/flume/trunk/content/releases/1.5.0.html
    websites/staging/flume/trunk/content/releases/1.5.2.html
    websites/staging/flume/trunk/content/releases/1.6.0.html
    websites/staging/flume/trunk/content/releases/1.7.0.html
    websites/staging/flume/trunk/content/releases/index.html
    websites/staging/flume/trunk/content/search.html
    websites/staging/flume/trunk/content/searchindex.js
    websites/staging/flume/trunk/content/source.html
    websites/staging/flume/trunk/content/team.html

Propchange: websites/staging/flume/trunk/content/
------------------------------------------------------------------------------
--- cms:source-revision (original)
+++ cms:source-revision Wed Oct  4 16:24:02 2017
@@ -1 +1 @@
-1765309
+1811107

Modified: websites/staging/flume/trunk/content/.buildinfo
==============================================================================
--- websites/staging/flume/trunk/content/.buildinfo (original)
+++ websites/staging/flume/trunk/content/.buildinfo Wed Oct  4 16:24:02 2017
@@ -1,4 +1,4 @@
 # Sphinx build info version 1
 # This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done.
-config: fd5a3d36bc29ea19cb318efba9d265bf
+config: 70bc5554a593f851e82b5766714b6957
 tags: fbb0d17656682115ca4d033fb2f83ba1

Modified: websites/staging/flume/trunk/content/.doctrees/FlumeDeveloperGuide.doctree
==============================================================================
Binary files - no diff available.

Modified: websites/staging/flume/trunk/content/.doctrees/FlumeUserGuide.doctree
==============================================================================
Binary files - no diff available.

Modified: websites/staging/flume/trunk/content/.doctrees/download.doctree
==============================================================================
Binary files - no diff available.

Modified: websites/staging/flume/trunk/content/.doctrees/environment.pickle
==============================================================================
Binary files - no diff available.

Modified: websites/staging/flume/trunk/content/.doctrees/index.doctree
==============================================================================
Binary files - no diff available.

Added: websites/staging/flume/trunk/content/.doctrees/releases/1.8.0.doctree
==============================================================================
Binary file - no diff available.

Propchange: websites/staging/flume/trunk/content/.doctrees/releases/1.8.0.doctree
------------------------------------------------------------------------------
    svn:mime-type = application/octet-stream

Modified: websites/staging/flume/trunk/content/.doctrees/releases/index.doctree
==============================================================================
Binary files - no diff available.

Modified: websites/staging/flume/trunk/content/FlumeDeveloperGuide.html
==============================================================================
--- websites/staging/flume/trunk/content/FlumeDeveloperGuide.html (original)
+++ websites/staging/flume/trunk/content/FlumeDeveloperGuide.html Wed Oct  4 16:24:02 2017
@@ -7,7 +7,7 @@
   <head>
     <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
     
-    <title>Flume 1.7.0 Developer Guide &mdash; Apache Flume</title>
+    <title>Flume 1.8.0 Developer Guide &mdash; Apache Flume</title>
     
     <link rel="stylesheet" href="_static/flume.css" type="text/css" />
     <link rel="stylesheet" href="_static/pygments.css" type="text/css" />
@@ -27,7 +27,7 @@
     <link rel="top" title="Apache Flume" href="index.html" />
     <link rel="up" title="Documentation" href="documentation.html" />
     <link rel="next" title="Releases" href="releases/index.html" />
-    <link rel="prev" title="Flume 1.7.0 User Guide" href="FlumeUserGuide.html" />
+    <link rel="prev" title="Flume 1.8.0 User Guide" href="FlumeUserGuide.html" />
   </head>
   <body>
 <div class="header">
@@ -59,8 +59,8 @@
         <div class="bodywrapper">
           <div class="body">
             
-  <div class="section" id="flume-1-7-0-developer-guide">
-<h1>Flume 1.7.0 Developer Guide<a class="headerlink" 
href="#flume-1-7-0-developer-guide" title="Permalink to this 
headline">¶</a></h1>
+  <div class="section" id="flume-1-8-0-developer-guide">
+<h1>Flume 1.8.0 Developer Guide<a class="headerlink" 
href="#flume-1-8-0-developer-guide" title="Permalink to this 
headline">¶</a></h1>
 <div class="section" id="introduction">
 <h2>Introduction<a class="headerlink" href="#introduction" title="Permalink to 
this headline">¶</a></h2>
 <div class="section" id="overview">
@@ -144,6 +144,19 @@ commands:</p>
 be in the path. You can download and install it by following the instructions
 <a class="reference external" href="https://developers.google.com/protocol-buffers/">here</a>.</p>
 </div>
+<div class="section" id="updating-protocol-buffer-version">
+<h4>Updating Protocol Buffer Version<a class="headerlink" 
href="#updating-protocol-buffer-version" title="Permalink to this 
headline">¶</a></h4>
+<p>The file channel depends on Protocol Buffer. When updating the version of Protocol Buffer
+used by Flume, it is necessary to regenerate the data access classes using the protoc compiler
+that ships with Protocol Buffer, as follows.</p>
+<ol class="arabic simple">
+<li>Install the desired version of Protocol Buffer on your local machine</li>
+<li>Update version of Protocol Buffer in pom.xml</li>
+<li>Generate new Protocol Buffer data access classes in Flume: <tt 
class="docutils literal"><span class="pre">cd</span> <span 
class="pre">flume-ng-channels/flume-file-channel;</span> <span 
class="pre">mvn</span> <span class="pre">-P</span> <span 
class="pre">compile-proto</span> <span class="pre">clean</span> <span 
class="pre">package</span> <span class="pre">-DskipTests</span></tt></li>
+<li>Add Apache license header to any of the generated files that are missing 
it</li>
+<li>Rebuild and test Flume:  <tt class="docutils literal"><span 
class="pre">cd</span> <span class="pre">../..;</span> <span 
class="pre">mvn</span> <span class="pre">clean</span> <span 
class="pre">install</span></tt></li>
+</ol>
+</div>
 </div>
 <div class="section" id="developing-custom-components">
 <h3>Developing custom components<a class="headerlink" 
href="#developing-custom-components" title="Permalink to this 
headline">¶</a></h3>
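The five regeneration steps added above can be sketched as one shell sequence. This is a hedged sketch, not part of the diff: it assumes a Flume source checkout as the working directory and that the protoc version matching the one set in pom.xml is already installed.

```shell
# Regenerate the file channel's Protocol Buffer data access classes
# (steps 3 and 5 of the list above; steps 1, 2 and 4 are manual).
cd flume-ng-channels/flume-file-channel
mvn -P compile-proto clean package -DskipTests
# Manually add the Apache license header to any generated file missing it,
# then rebuild and test Flume with the regenerated classes:
cd ../..
mvn clean install
```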
@@ -924,7 +937,7 @@ mechanism that captures the new data and
 
 <h3><a href="index.html">This Page</a></h3>
 <ul>
-<li><a class="reference internal" href="#">Flume 1.7.0 Developer Guide</a><ul>
+<li><a class="reference internal" href="#">Flume 1.8.0 Developer Guide</a><ul>
 <li><a class="reference internal" href="#introduction">Introduction</a><ul>
 <li><a class="reference internal" href="#overview">Overview</a></li>
 <li><a class="reference internal" href="#architecture">Architecture</a><ul>
@@ -935,6 +948,7 @@ mechanism that captures the new data and
 <li><a class="reference internal" href="#building-flume">Building Flume</a><ul>
 <li><a class="reference internal" href="#getting-the-source">Getting the 
source</a></li>
 <li><a class="reference internal" href="#compile-test-flume">Compile/test 
Flume</a></li>
+<li><a class="reference internal" 
href="#updating-protocol-buffer-version">Updating Protocol Buffer 
Version</a></li>
 </ul>
 </li>
 <li><a class="reference internal" 
href="#developing-custom-components">Developing custom components</a><ul>
@@ -965,7 +979,7 @@ mechanism that captures the new data and
       <div class="clearer"></div>
     </div>
 <div class="footer">
-    &copy; Copyright 2009-2012 The Apache Software Foundation. Apache Flume, 
Flume, Apache, the Apache feather logo, and the Apache Flume project logo are 
trademarks of The Apache Software Foundation..
+    &copy; Copyright 2009-2017 The Apache Software Foundation. Apache Flume, Flume, Apache, the Apache feather logo, and the Apache Flume project logo are trademarks of The Apache Software Foundation.
 </div>
   </body>
 </html>
\ No newline at end of file

Modified: websites/staging/flume/trunk/content/FlumeUserGuide.html
==============================================================================
--- websites/staging/flume/trunk/content/FlumeUserGuide.html (original)
+++ websites/staging/flume/trunk/content/FlumeUserGuide.html Wed Oct  4 16:24:02 2017
@@ -7,7 +7,7 @@
   <head>
     <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
     
-    <title>Flume 1.7.0 User Guide &mdash; Apache Flume</title>
+    <title>Flume 1.8.0 User Guide &mdash; Apache Flume</title>
     
     <link rel="stylesheet" href="_static/flume.css" type="text/css" />
     <link rel="stylesheet" href="_static/pygments.css" type="text/css" />
@@ -26,7 +26,7 @@
     <script type="text/javascript" src="_static/doctools.js"></script>
     <link rel="top" title="Apache Flume" href="index.html" />
     <link rel="up" title="Documentation" href="documentation.html" />
-    <link rel="next" title="Flume 1.7.0 Developer Guide" href="FlumeDeveloperGuide.html" />
+    <link rel="next" title="Flume 1.8.0 Developer Guide" href="FlumeDeveloperGuide.html" />
     <link rel="prev" title="Documentation" href="documentation.html" /> 
   </head>
   <body>
@@ -59,8 +59,8 @@
         <div class="bodywrapper">
           <div class="body">
             
-  <div class="section" id="flume-1-7-0-user-guide">
-<h1>Flume 1.7.0 User Guide<a class="headerlink" href="#flume-1-7-0-user-guide" 
title="Permalink to this headline">¶</a></h1>
+  <div class="section" id="flume-1-8-0-user-guide">
+<h1>Flume 1.8.0 User Guide<a class="headerlink" href="#flume-1-8-0-user-guide" 
title="Permalink to this headline">¶</a></h1>
 <div class="section" id="introduction">
 <h2>Introduction<a class="headerlink" href="#introduction" title="Permalink to 
this headline">¶</a></h2>
 <div class="section" id="overview">
@@ -84,7 +84,7 @@ in the latest architecture.</p>
 <div class="section" id="system-requirements">
 <h3>System Requirements<a class="headerlink" href="#system-requirements" 
title="Permalink to this headline">¶</a></h3>
 <ol class="arabic simple">
-<li>Java Runtime Environment - Java 1.7 or later</li>
+<li>Java Runtime Environment - Java 1.8 or later</li>
 <li>Memory - Sufficient memory for configurations used by sources, channels or 
sinks</li>
 <li>Disk Space - Sufficient disk space for configurations used by channels or 
sinks</li>
 <li>Directory Permissions - Read/Write permissions for directories used by 
agent</li>
@@ -248,6 +248,24 @@ OK</pre>
 </div>
 <p>Congratulations - you&#8217;ve successfully configured and deployed a Flume 
agent! Subsequent sections cover agent configuration in much more detail.</p>
 </div>
+<div class="section" id="using-environment-variables-in-configuration-files">
+<h4>Using environment variables in configuration files<a class="headerlink" 
href="#using-environment-variables-in-configuration-files" title="Permalink to 
this headline">¶</a></h4>
+<p>Flume has the ability to substitute environment variables in the 
configuration. For example:</p>
+<div class="highlight-none"><div class="highlight"><pre>a1.sources = r1
+a1.sources.r1.type = netcat
+a1.sources.r1.bind = 0.0.0.0
+a1.sources.r1.port = ${NC_PORT}
+a1.sources.r1.channels = c1
+</pre></div>
+</div>
+<p>NB: substitution currently works for values only, not for keys (i.e. only on the &#8220;right side&#8221; of the <cite>=</cite> sign of the config lines).</p>
+<p>This can be enabled via Java system properties on agent invocation by 
setting <cite>propertiesImplementation = 
org.apache.flume.node.EnvVarResolverProperties</cite>.</p>
+<dl class="docutils">
+<dt>For example:</dt>
+<dd>$ NC_PORT=44444 bin/flume-ng agent --conf conf --conf-file example.conf --name a1 -Dflume.root.logger=INFO,console -DpropertiesImplementation=org.apache.flume.node.EnvVarResolverProperties</dd>
+</dl>
+<p>Note that the above is just an example; environment variables can be configured in other ways, including being set in <cite>conf/flume-env.sh</cite>.</p>
+</div>
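The values-only substitution described above can be illustrated with a toy shell snippet. This is not Flume code, only a hedged sketch of the behavior: the left-hand side of <cite>=</cite> is left untouched while <cite>${NC_PORT}</cite> in the value is expanded, mirroring what EnvVarResolverProperties does at agent startup.

```shell
# Toy illustration of value-side-only substitution (not Flume internals).
NC_PORT=44444
line='a1.sources.r1.port = ${NC_PORT}'
key="${line%%=*}"       # everything before the first '=' (the key, kept as-is)
value="${line#*= }"     # everything after '= ' (the value, to be expanded)
expanded=$(eval "echo \"$value\"")   # expand ${NC_PORT} in the value only
echo "${key}= ${expanded}"           # prints: a1.sources.r1.port = 44444
```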
 <div class="section" id="logging-raw-data">
 <h4>Logging raw data<a class="headerlink" href="#logging-raw-data" 
title="Permalink to this headline">¶</a></h4>
 <p>Logging the raw stream of data flowing through the ingest pipeline is not 
desired behaviour in
@@ -367,10 +385,6 @@ source listening on that ports.</p>
 <p>There&#8217;s an exec source that executes a given command and consumes the output. A
 single &#8216;line&#8217; of output is text followed by a carriage return (&#8216;\r&#8217;) or a line
 feed (&#8216;\n&#8217;), or both together.</p>
-<div class="admonition note">
-<p class="first admonition-title">Note</p>
-<p class="last">Flume does not support tail as a source. One can wrap the tail 
command in an exec source to stream the file.</p>
-</div>
 </div>
 <div class="section" id="network-streams">
 <h4>Network streams<a class="headerlink" href="#network-streams" 
title="Permalink to this headline">¶</a></h4>
@@ -565,9 +579,9 @@ mounted for storage.</p>
 <span class="na">agent_foo.sinks.avro-forward-sink.channel</span> <span 
class="o">=</span> <span class="s">file-channel</span>
 
 <span class="c"># avro sink properties</span>
-<span class="na">agent_foo.sources.avro-forward-sink.type</span> <span 
class="o">=</span> <span class="s">avro</span>
-<span class="na">agent_foo.sources.avro-forward-sink.hostname</span> <span 
class="o">=</span> <span class="s">10.1.1.100</span>
-<span class="na">agent_foo.sources.avro-forward-sink.port</span> <span 
class="o">=</span> <span class="s">10000</span>
+<span class="na">agent_foo.sinks.avro-forward-sink.type</span> <span 
class="o">=</span> <span class="s">avro</span>
+<span class="na">agent_foo.sinks.avro-forward-sink.hostname</span> <span 
class="o">=</span> <span class="s">10.1.1.100</span>
+<span class="na">agent_foo.sinks.avro-forward-sink.port</span> <span 
class="o">=</span> <span class="s">10000</span>
 
 <span class="c"># configure other pieces</span>
 <span class="c">#...</span>
@@ -583,7 +597,7 @@ mounted for storage.</p>
 <span class="na">agent_foo.sources.avro-collection-source.channels</span> 
<span class="o">=</span> <span class="s">mem-channel</span>
 <span class="na">agent_foo.sinks.hdfs-sink.channel</span> <span 
class="o">=</span> <span class="s">mem-channel</span>
 
-<span class="c"># avro sink properties</span>
+<span class="c"># avro source properties</span>
 <span class="na">agent_foo.sources.avro-collection-source.type</span> <span 
class="o">=</span> <span class="s">avro</span>
 <span class="na">agent_foo.sources.avro-collection-source.bind</span> <span 
class="o">=</span> <span class="s">10.1.1.100</span>
 <span class="na">agent_foo.sources.avro-collection-source.port</span> <span 
class="o">=</span> <span class="s">10000</span>
@@ -1007,15 +1021,9 @@ never guarantee data has been received w
 asynchronous interface such as ExecSource! As an extension of this
 warning - and to be completely clear - there is absolutely zero guarantee
 of event delivery when using this source. For stronger reliability
-guarantees, consider the Spooling Directory Source or direct integration
+guarantees, consider the Spooling Directory Source, the Taildir Source, or direct integration
 with Flume via the SDK.</p>
 </div>
-<div class="admonition note">
-<p class="first admonition-title">Note</p>
-<p class="last">You can use ExecSource to emulate TailSource from Flume 0.9x 
(flume og).
-Just use unix command <tt class="docutils literal"><span 
class="pre">tail</span> <span class="pre">-F</span> <span 
class="pre">/full/path/to/your/file</span></tt>. Parameter
--F is better in this case than -f as it will also follow file rotation.</p>
-</div>
 <p>Example for agent named a1:</p>
 <div class="highlight-properties"><div class="highlight"><pre><span 
class="na">a1.sources</span> <span class="o">=</span> <span class="s">r1</span>
 <span class="na">a1.channels</span> <span class="o">=</span> <span 
class="s">c1</span>
@@ -1114,6 +1122,21 @@ via FLUME_CLASSPATH variable in flume-en
 <td>UTF-8</td>
 <td>Default converter only. Charset to use when converting JMS TextMessages to 
byte arrays.</td>
 </tr>
+<tr class="row-even"><td>createDurableSubscription</td>
+<td>false</td>
+<td>Whether to create a durable subscription. A durable subscription can only be used with
+destinationType topic. If true, &#8220;clientId&#8221; and &#8220;durableSubscriptionName&#8221;
+must be specified.</td>
+</tr>
+<tr class="row-odd"><td>clientId</td>
+<td>&#8211;</td>
+<td>JMS client identifier set on Connection right after it is created.
+Required for durable subscriptions.</td>
+</tr>
+<tr class="row-even"><td>durableSubscriptionName</td>
+<td>&#8211;</td>
+<td>Name used to identify the durable subscription. Required for durable 
subscriptions.</td>
+</tr>
 </tbody>
 </table>
 <div class="section" id="converter">
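Taken together, the three new properties above might appear in a JMS source definition like the following hypothetical fragment (agent and source names and identifier values are invented for illustration; destinationType is the existing JMS source property the description refers to):

```properties
# Hypothetical JMS source fragment: a durable subscription requires a topic
# destination plus both identifiers named in the table above.
a1.sources.jms1.destinationType = topic
a1.sources.jms1.createDurableSubscription = true
a1.sources.jms1.clientId = flume-client-1
a1.sources.jms1.durableSubscriptionName = flume-durable-sub
```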
@@ -1688,6 +1711,18 @@ true to read events as the Flume Avro bi
 on the KafkaSink or with the parseAsFlumeEvent property on the Kafka Channel 
this will preserve
 any Flume headers sent on the producing side.</td>
 </tr>
+<tr class="row-odd"><td>setTopicHeader</td>
+<td>true</td>
+<td>When set to true, stores the topic of the retrieved message into a header, 
defined by the
+<tt class="docutils literal"><span class="pre">topicHeader</span></tt> 
property.</td>
+</tr>
+<tr class="row-even"><td>topicHeader</td>
+<td>topic</td>
+<td>Defines the name of the header in which to store the name of the topic the 
message was received
+from, if the <tt class="docutils literal"><span 
class="pre">setTopicHeader</span></tt> property is set to <tt class="docutils 
literal"><span class="pre">true</span></tt>. Care should be taken if combining
+with the Kafka Sink <tt class="docutils literal"><span 
class="pre">topicHeader</span></tt> property so as to avoid sending the message 
back to the same
+topic in a loop.</td>
+</tr>
 <tr class="row-odd"><td>migrateZookeeperOffsets</td>
 <td>true</td>
 <td>When no Kafka stored offset is found, look up the offsets in Zookeeper and 
commit them to Kafka.
@@ -1793,19 +1828,19 @@ and the jira for tracking this issue:
 to learn about additional configuration settings for fine tuning for example 
any of the following:
 security provider, cipher suites, enabled protocols, truststore or keystore 
types.</p>
 <p>Example configuration with server side authentication and data 
encryption.</p>
-<div class="highlight-properties"><div class="highlight"><pre><span 
class="na">a1.channels.channel1.type</span> <span class="o">=</span> <span 
class="s">org.apache.flume.channel.kafka.KafkaChannel</span>
-<span class="na">a1.channels.channel1.kafka.bootstrap.servers</span> <span 
class="o">=</span> <span class="s">kafka-1:9093,kafka-2:9093,kafka-3:9093</span>
-<span class="na">a1.channels.channel1.kafka.topic</span> <span 
class="o">=</span> <span class="s">channel1</span>
-<span class="na">a1.channels.channel1.kafka.consumer.group.id</span> <span 
class="o">=</span> <span class="s">flume-consumer</span>
-<span class="na">a1.channels.channel1.kafka.consumer.security.protocol</span> 
<span class="o">=</span> <span class="s">SSL</span>
-<span 
class="na">a1.channels.channel1.kafka.consumer.ssl.truststore.location</span><span
 class="o">=</span><span class="s">/path/to/truststore.jks</span>
-<span 
class="na">a1.channels.channel1.kafka.consumer.ssl.truststore.password</span><span
 class="o">=</span><span class="s">&lt;password to access the 
truststore&gt;</span>
+<div class="highlight-properties"><div class="highlight"><pre><span 
class="na">a1.sources.source1.type</span> <span class="o">=</span> <span 
class="s">org.apache.flume.source.kafka.KafkaSource</span>
+<span class="na">a1.sources.source1.kafka.bootstrap.servers</span> <span 
class="o">=</span> <span class="s">kafka-1:9093,kafka-2:9093,kafka-3:9093</span>
+<span class="na">a1.sources.source1.kafka.topics</span> <span 
class="o">=</span> <span class="s">mytopic</span>
+<span class="na">a1.sources.source1.kafka.consumer.group.id</span> <span 
class="o">=</span> <span class="s">flume-consumer</span>
+<span class="na">a1.sources.source1.kafka.consumer.security.protocol</span> 
<span class="o">=</span> <span class="s">SSL</span>
+<span 
class="na">a1.sources.source1.kafka.consumer.ssl.truststore.location</span><span
 class="o">=</span><span class="s">/path/to/truststore.jks</span>
+<span 
class="na">a1.sources.source1.kafka.consumer.ssl.truststore.password</span><span
 class="o">=</span><span class="s">&lt;password to access the 
truststore&gt;</span>
 </pre></div>
 </div>
 <p>Note: By default the property <tt class="docutils literal"><span 
class="pre">ssl.endpoint.identification.algorithm</span></tt>
 is not defined, so hostname verification is not performed.
 In order to enable hostname verification, set the following properties</p>
-<div class="highlight-properties"><div class="highlight"><pre><span 
class="na">a1.channels.channel1.kafka.consumer.ssl.endpoint.identification.algorithm</span><span
 class="o">=</span><span class="s">HTTPS</span>
+<div class="highlight-properties"><div class="highlight"><pre><span 
class="na">a1.sources.source1.kafka.consumer.ssl.endpoint.identification.algorithm</span><span
 class="o">=</span><span class="s">HTTPS</span>
 </pre></div>
 </div>
 <p>Once enabled, clients will verify the server&#8217;s fully qualified domain 
name (FQDN)
@@ -1818,13 +1853,13 @@ against one of the following two fields:
 Each Flume agent has to have its client certificate which has to be trusted by 
Kafka brokers either
 individually or by their signature chain. Common example is to sign each 
client certificate by a single Root CA
 which in turn is trusted by Kafka brokers.</p>
-<div class="highlight-properties"><div class="highlight"><pre><span 
class="na">a1.channels.channel1.kafka.consumer.ssl.keystore.location</span><span
 class="o">=</span><span class="s">/path/to/client.keystore.jks</span>
-<span 
class="na">a1.channels.channel1.kafka.consumer.ssl.keystore.password</span><span
 class="o">=</span><span class="s">&lt;password to access the 
keystore&gt;</span>
+<div class="highlight-properties"><div class="highlight"><pre><span 
class="na">a1.sources.source1.kafka.consumer.ssl.keystore.location</span><span 
class="o">=</span><span class="s">/path/to/client.keystore.jks</span>
+<span 
class="na">a1.sources.source1.kafka.consumer.ssl.keystore.password</span><span 
class="o">=</span><span class="s">&lt;password to access the keystore&gt;</span>
 </pre></div>
 </div>
 <p>If keystore and key use different password protection then <tt 
class="docutils literal"><span class="pre">ssl.key.password</span></tt> 
property will
 provide the required additional secret for both consumer keystores:</p>
-<div class="highlight-properties"><div class="highlight"><pre><span 
class="na">a1.channels.channel1.kafka.consumer.ssl.key.password</span><span 
class="o">=</span><span class="s">&lt;password to access the key&gt;</span>
+<div class="highlight-properties"><div class="highlight"><pre><span 
class="na">a1.sources.source1.kafka.consumer.ssl.key.password</span><span 
class="o">=</span><span class="s">&lt;password to access the key&gt;</span>
 </pre></div>
 </div>
 <p><strong>Kerberos and Kafka Source:</strong></p>
@@ -1837,25 +1872,25 @@ for information on the JAAS file content
 </pre></div>
 </div>
 <p>Example secure configuration using SASL_PLAINTEXT:</p>
-<div class="highlight-properties"><div class="highlight"><pre><span 
class="na">a1.channels.channel1.type</span> <span class="o">=</span> <span 
class="s">org.apache.flume.channel.kafka.KafkaChannel</span>
-<span class="na">a1.channels.channel1.kafka.bootstrap.servers</span> <span 
class="o">=</span> <span class="s">kafka-1:9093,kafka-2:9093,kafka-3:9093</span>
-<span class="na">a1.channels.channel1.kafka.topic</span> <span 
class="o">=</span> <span class="s">channel1</span>
-<span class="na">a1.channels.channel1.kafka.consumer.group.id</span> <span 
class="o">=</span> <span class="s">flume-consumer</span>
-<span class="na">a1.channels.channel1.kafka.consumer.security.protocol</span> 
<span class="o">=</span> <span class="s">SASL_PLAINTEXT</span>
-<span class="na">a1.channels.channel1.kafka.consumer.sasl.mechanism</span> 
<span class="o">=</span> <span class="s">GSSAPI</span>
-<span 
class="na">a1.channels.channel1.kafka.consumer.sasl.kerberos.service.name</span>
 <span class="o">=</span> <span class="s">kafka</span>
+<div class="highlight-properties"><div class="highlight"><pre><span 
class="na">a1.sources.source1.type</span> <span class="o">=</span> <span 
class="s">org.apache.flume.source.kafka.KafkaSource</span>
+<span class="na">a1.sources.source1.kafka.bootstrap.servers</span> <span 
class="o">=</span> <span class="s">kafka-1:9093,kafka-2:9093,kafka-3:9093</span>
+<span class="na">a1.sources.source1.kafka.topics</span> <span 
class="o">=</span> <span class="s">mytopic</span>
+<span class="na">a1.sources.source1.kafka.consumer.group.id</span> <span 
class="o">=</span> <span class="s">flume-consumer</span>
+<span class="na">a1.sources.source1.kafka.consumer.security.protocol</span> 
<span class="o">=</span> <span class="s">SASL_PLAINTEXT</span>
+<span class="na">a1.sources.source1.kafka.consumer.sasl.mechanism</span> <span 
class="o">=</span> <span class="s">GSSAPI</span>
+<span 
class="na">a1.sources.source1.kafka.consumer.sasl.kerberos.service.name</span> 
<span class="o">=</span> <span class="s">kafka</span>
 </pre></div>
 </div>
 <p>Example secure configuration using SASL_SSL:</p>
-<div class="highlight-properties"><div class="highlight"><pre><span 
class="na">a1.channels.channel1.type</span> <span class="o">=</span> <span 
class="s">org.apache.flume.channel.kafka.KafkaChannel</span>
-<span class="na">a1.channels.channel1.kafka.bootstrap.servers</span> <span 
class="o">=</span> <span class="s">kafka-1:9093,kafka-2:9093,kafka-3:9093</span>
-<span class="na">a1.channels.channel1.kafka.topic</span> <span 
class="o">=</span> <span class="s">channel1</span>
-<span class="na">a1.channels.channel1.kafka.consumer.group.id</span> <span 
class="o">=</span> <span class="s">flume-consumer</span>
-<span class="na">a1.channels.channel1.kafka.consumer.security.protocol</span> 
<span class="o">=</span> <span class="s">SASL_SSL</span>
-<span class="na">a1.channels.channel1.kafka.consumer.sasl.mechanism</span> 
<span class="o">=</span> <span class="s">GSSAPI</span>
-<span 
class="na">a1.channels.channel1.kafka.consumer.sasl.kerberos.service.name</span>
 <span class="o">=</span> <span class="s">kafka</span>
-<span 
class="na">a1.channels.channel1.kafka.consumer.ssl.truststore.location</span><span
 class="o">=</span><span class="s">/path/to/truststore.jks</span>
-<span 
class="na">a1.channels.channel1.kafka.consumer.ssl.truststore.password</span><span
 class="o">=</span><span class="s">&lt;password to access the 
truststore&gt;</span>
+<div class="highlight-properties"><div class="highlight"><pre><span 
class="na">a1.sources.source1.type</span> <span class="o">=</span> <span 
class="s">org.apache.flume.source.kafka.KafkaSource</span>
+<span class="na">a1.sources.source1.kafka.bootstrap.servers</span> <span 
class="o">=</span> <span class="s">kafka-1:9093,kafka-2:9093,kafka-3:9093</span>
+<span class="na">a1.sources.source1.kafka.topics</span> <span 
class="o">=</span> <span class="s">mytopic</span>
+<span class="na">a1.sources.source1.kafka.consumer.group.id</span> <span 
class="o">=</span> <span class="s">flume-consumer</span>
+<span class="na">a1.sources.source1.kafka.consumer.security.protocol</span> 
<span class="o">=</span> <span class="s">SASL_SSL</span>
+<span class="na">a1.sources.source1.kafka.consumer.sasl.mechanism</span> <span 
class="o">=</span> <span class="s">GSSAPI</span>
+<span 
class="na">a1.sources.source1.kafka.consumer.sasl.kerberos.service.name</span> 
<span class="o">=</span> <span class="s">kafka</span>
+<span 
class="na">a1.sources.source1.kafka.consumer.ssl.truststore.location</span><span
 class="o">=</span><span class="s">/path/to/truststore.jks</span>
+<span 
class="na">a1.sources.source1.kafka.consumer.ssl.truststore.password</span><span
 class="o">=</span><span class="s">&lt;password to access the 
truststore&gt;</span>
 </pre></div>
 </div>
 <p>Sample JAAS file. For reference of its content please see client config 
sections of the desired authentication mechanism (GSSAPI/PLAIN)
@@ -1881,8 +1916,8 @@ Also please make sure that the operating
 </pre></div>
 </div>
 </div>
-<div class="section" id="netcat-source">
-<h4>NetCat Source<a class="headerlink" href="#netcat-source" title="Permalink 
to this headline">¶</a></h4>
+<div class="section" id="netcat-tcp-source">
+<h4>NetCat TCP Source<a class="headerlink" href="#netcat-tcp-source" 
title="Permalink to this headline">¶</a></h4>
 <p>A netcat-like source that listens on a given port and turns each line of 
text
 into an event. Acts like <tt class="docutils literal"><span 
class="pre">nc</span> <span class="pre">-k</span> <span class="pre">-l</span> 
<span class="pre">[host]</span> <span class="pre">[port]</span></tt>. In other 
words,
 it opens a specified port and listens for data. The expectation is that the
@@ -1954,16 +1989,85 @@ Flume event and sent via the connected c
 </pre></div>
 </div>
 </div>
+<div class="section" id="netcat-udp-source">
+<h4>NetCat UDP Source<a class="headerlink" href="#netcat-udp-source" 
title="Permalink to this headline">¶</a></h4>
+<p>Like the original Netcat (TCP) source, this source listens on a given
+port and turns each line of text into an event, which is sent via the connected channel.
+Acts like <tt class="docutils literal"><span class="pre">nc</span> <span class="pre">-u</span> <span class="pre">-k</span> <span class="pre">-l</span> <span class="pre">[host]</span> <span class="pre">[port]</span></tt>.</p>
+<p>Required properties are in <strong>bold</strong>.</p>
+<table border="1" class="docutils">
+<colgroup>
+<col width="24%" />
+<col width="14%" />
+<col width="63%" />
+</colgroup>
+<thead valign="bottom">
+<tr class="row-odd"><th class="head">Property Name</th>
+<th class="head">Default</th>
+<th class="head">Description</th>
+</tr>
+</thead>
+<tbody valign="top">
+<tr class="row-even"><td><strong>channels</strong></td>
+<td>&#8211;</td>
+<td>&nbsp;</td>
+</tr>
+<tr class="row-odd"><td><strong>type</strong></td>
+<td>&#8211;</td>
+<td>The component type name, needs to be <tt class="docutils literal"><span 
class="pre">netcatudp</span></tt></td>
+</tr>
+<tr class="row-even"><td><strong>bind</strong></td>
+<td>&#8211;</td>
+<td>Host name or IP address to bind to</td>
+</tr>
+<tr class="row-odd"><td><strong>port</strong></td>
+<td>&#8211;</td>
+<td>Port # to bind to</td>
+</tr>
+<tr class="row-even"><td>remoteAddressHeader</td>
+<td>&#8211;</td>
+<td>&nbsp;</td>
+</tr>
+<tr class="row-odd"><td>selector.type</td>
+<td>replicating</td>
+<td>replicating or multiplexing</td>
+</tr>
+<tr class="row-even"><td>selector.*</td>
+<td>&nbsp;</td>
+<td>Depends on the selector.type value</td>
+</tr>
+<tr class="row-odd"><td>interceptors</td>
+<td>&#8211;</td>
+<td>Space-separated list of interceptors</td>
+</tr>
+<tr class="row-even"><td>interceptors.*</td>
+<td>&nbsp;</td>
+<td>&nbsp;</td>
+</tr>
+</tbody>
+</table>
+<p>Example for agent named a1:</p>
+<div class="highlight-properties"><div class="highlight"><pre><span 
class="na">a1.sources</span> <span class="o">=</span> <span class="s">r1</span>
+<span class="na">a1.channels</span> <span class="o">=</span> <span 
class="s">c1</span>
+<span class="na">a1.sources.r1.type</span> <span class="o">=</span> <span 
class="s">netcatudp</span>
+<span class="na">a1.sources.r1.bind</span> <span class="o">=</span> <span 
class="s">0.0.0.0</span>
+<span class="na">a1.sources.r1.port</span> <span class="o">=</span> <span 
class="s">6666</span>
+<span class="na">a1.sources.r1.channels</span> <span class="o">=</span> <span 
class="s">c1</span>
+</pre></div>
+</div>
+</div>
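The UDP source above can be exercised with ``nc -u``, or with a short script. The following is a minimal sketch (not part of the Flume distribution) that sends newline-terminated lines as UDP datagrams, matching the example listener on port 6666:

```python
# Hypothetical test client for a NetCat UDP source. Each line is sent as one
# UDP datagram, as "nc -u" would do; the source turns each line into an event.
import socket

def send_lines(lines, host="localhost", port=6666):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        for line in lines:
            # The source splits on newlines, so terminate each line explicitly.
            sock.sendto((line + "\n").encode("utf-8"), (host, port))
    finally:
        sock.close()
```

Running ``send_lines(["hello flume"])`` against the example agent a1 should yield one event with body ``hello flume`` on channel c1.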
 <div class="section" id="sequence-generator-source">
 <h4>Sequence Generator Source<a class="headerlink" 
href="#sequence-generator-source" title="Permalink to this headline">¶</a></h4>
 <p>A simple sequence generator that continuously generates events with a 
counter that starts from 0,
 increments by 1 and stops at totalEvents. Retries when it can&#8217;t send 
events to the channel. Useful
-mainly for testing. Required properties are in <strong>bold</strong>.</p>
+mainly for testing. During retries it keeps the body of the retried messages 
the same as before, so
+that the number of unique events (after de-duplication at the destination) is 
expected to be
+equal to the specified <tt class="docutils literal"><span 
class="pre">totalEvents</span></tt>. Required properties are in 
<strong>bold</strong>.</p>
 <table border="1" class="docutils">
 <colgroup>
-<col width="19%" />
-<col width="21%" />
-<col width="60%" />
+<col width="16%" />
+<col width="18%" />
+<col width="66%" />
 </colgroup>
 <thead valign="bottom">
 <tr class="row-odd"><th class="head">Property Name</th>
@@ -1998,7 +2102,7 @@ mainly for testing. Required properties
 </tr>
 <tr class="row-even"><td>batchSize</td>
 <td>1</td>
-<td>&nbsp;</td>
+<td>Number of events to attempt to process per request loop.</td>
 </tr>
 <tr class="row-odd"><td>totalEvents</td>
 <td>Long.MAX_VALUE</td>
@@ -2838,9 +2942,9 @@ this automatically is to use the Timesta
 </div>
 <table border="1" class="docutils">
 <colgroup>
-<col width="12%" />
-<col width="6%" />
-<col width="82%" />
+<col width="9%" />
+<col width="5%" />
+<col width="86%" />
 </colgroup>
 <thead valign="bottom">
 <tr class="row-odd"><th class="head">Name</th>
@@ -2919,8 +3023,8 @@ this automatically is to use the Timesta
 <td>Specify minimum number of replicas per HDFS block. If not specified, it 
comes from the default Hadoop config in the classpath.</td>
 </tr>
 <tr class="row-even"><td>hdfs.writeFormat</td>
-<td>&#8211;</td>
-<td>Format for sequence file records. One of &#8220;Text&#8221; or 
&#8220;Writable&#8221; (the default).</td>
+<td>Writable</td>
+<td>Format for sequence file records. One of <tt class="docutils 
literal"><span class="pre">Text</span></tt> or <tt class="docutils 
literal"><span class="pre">Writable</span></tt>. Set to <tt class="docutils 
literal"><span class="pre">Text</span></tt> before creating data files with 
Flume, otherwise those files cannot be read by either Apache Impala 
(incubating) or Apache Hive.</td>
 </tr>
 <tr class="row-odd"><td>hdfs.callTimeout</td>
 <td>10000</td>
@@ -4195,9 +4299,9 @@ through various Flume sources. This curr
 <p>Required properties are marked in bold font.</p>
 <table border="1" class="docutils">
 <colgroup>
-<col width="18%" />
+<col width="17%" />
 <col width="10%" />
-<col width="72%" />
+<col width="73%" />
 </colgroup>
 <thead valign="bottom">
 <tr class="row-odd"><th class="head">Property Name</th>
@@ -4221,7 +4325,9 @@ The format is comma separated list of ho
 <td>The topic in Kafka to which the messages will be published. If this 
parameter is configured,
 messages will be published to this topic.
 If the event header contains a &#8220;topic&#8221; field, the event will be 
published to that topic
-overriding the topic configured here.</td>
+overriding the topic configured here.
+Arbitrary header substitution is supported, e.g. %{header} is replaced with the
value of the event header named &#8220;header&#8221;.
+(If using this substitution, it is recommended to set the
&#8220;auto.create.topics.enable&#8221; property of the Kafka broker to true.)</td>
 </tr>
 <tr class="row-odd"><td>flumeBatchSize</td>
 <td>100</td>
@@ -4254,6 +4360,15 @@ from the event header and send the messa
 value represents an invalid partition, an EventDeliveryException will be 
thrown. If the header value
 is present then this setting overrides <tt class="docutils literal"><span 
class="pre">defaultPartitionId</span></tt>.</td>
 </tr>
+<tr class="row-even"><td>allowTopicOverride</td>
+<td>true</td>
+<td>When set, the sink will allow a message to be produced into a topic 
specified by the <tt class="docutils literal"><span 
class="pre">topicHeader</span></tt> property (if provided).</td>
+</tr>
+<tr class="row-odd"><td>topicHeader</td>
+<td>topic</td>
+<td>When used in conjunction with <tt class="docutils literal"><span 
class="pre">allowTopicOverride</span></tt>, the sink will produce a message into the 
topic named by the value of the header that this property specifies.
+Care should be taken when using this in conjunction with the Kafka Source <tt 
class="docutils literal"><span class="pre">topicHeader</span></tt> property to 
avoid creating a loopback.</td>
+</tr>
 <tr class="row-even"><td>kafka.producer.security.protocol</td>
 <td>PLAINTEXT</td>
 <td>Set to SASL_PLAINTEXT, SASL_SSL or SSL if writing to Kafka using some 
level of security. See below for additional info on secure setup.</td>
@@ -4326,7 +4441,7 @@ argument.</p>
 <span class="na">a1.sinks.k1.kafka.flumeBatchSize</span> <span 
class="o">=</span> <span class="s">20</span>
 <span class="na">a1.sinks.k1.kafka.producer.acks</span> <span 
class="o">=</span> <span class="s">1</span>
 <span class="na">a1.sinks.k1.kafka.producer.linger.ms</span> <span 
class="o">=</span> <span class="s">1</span>
-<span class="na">a1.sinks.ki.kafka.producer.compression.type</span> <span 
class="o">=</span> <span class="s">snappy</span>
+<span class="na">a1.sinks.k1.kafka.producer.compression.type</span> <span 
class="o">=</span> <span class="s">snappy</span>
 </pre></div>
 </div>
 <p><strong>Security and Kafka Sink:</strong></p>
@@ -4352,18 +4467,18 @@ and the jira for tracking this issue:
 to learn about additional configuration settings for fine tuning for example 
any of the following:
 security provider, cipher suites, enabled protocols, truststore or keystore 
types.</p>
 <p>Example configuration with server side authentication and data 
encryption.</p>
-<div class="highlight-properties"><div class="highlight"><pre><span 
class="na">a1.channels.channel1.type</span> <span class="o">=</span> <span 
class="s">org.apache.flume.channel.kafka.KafkaChannel</span>
-<span class="na">a1.channels.channel1.kafka.bootstrap.servers</span> <span 
class="o">=</span> <span class="s">kafka-1:9093,kafka-2:9093,kafka-3:9093</span>
-<span class="na">a1.channels.channel1.kafka.topic</span> <span 
class="o">=</span> <span class="s">channel1</span>
-<span class="na">a1.channels.channel1.kafka.producer.security.protocol</span> 
<span class="o">=</span> <span class="s">SSL</span>
-<span 
class="na">a1.channels.channel1.kafka.producer.ssl.truststore.location</span> 
<span class="o">=</span> <span class="s">/path/to/truststore.jks</span>
-<span 
class="na">a1.channels.channel1.kafka.producer.ssl.truststore.password</span> 
<span class="o">=</span> <span class="s">&lt;password to access the 
truststore&gt;</span>
+<div class="highlight-properties"><div class="highlight"><pre><span 
class="na">a1.sinks.sink1.type</span> <span class="o">=</span> <span 
class="s">org.apache.flume.sink.kafka.KafkaSink</span>
+<span class="na">a1.sinks.sink1.kafka.bootstrap.servers</span> <span 
class="o">=</span> <span class="s">kafka-1:9093,kafka-2:9093,kafka-3:9093</span>
+<span class="na">a1.sinks.sink1.kafka.topic</span> <span class="o">=</span> 
<span class="s">mytopic</span>
+<span class="na">a1.sinks.sink1.kafka.producer.security.protocol</span> <span 
class="o">=</span> <span class="s">SSL</span>
+<span class="na">a1.sinks.sink1.kafka.producer.ssl.truststore.location</span> 
<span class="o">=</span> <span class="s">/path/to/truststore.jks</span>
+<span class="na">a1.sinks.sink1.kafka.producer.ssl.truststore.password</span> 
<span class="o">=</span> <span class="s">&lt;password to access the 
truststore&gt;</span>
 </pre></div>
 </div>
 <p>Note: By default the property <tt class="docutils literal"><span 
class="pre">ssl.endpoint.identification.algorithm</span></tt>
 is not defined, so hostname verification is not performed.
 In order to enable hostname verification, set the following properties</p>
-<div class="highlight-properties"><div class="highlight"><pre><span 
class="na">a1.channels.channel1.kafka.producer.ssl.endpoint.identification.algorithm</span>
 <span class="o">=</span> <span class="s">HTTPS</span>
+<div class="highlight-properties"><div class="highlight"><pre><span 
class="na">a1.sinks.sink1.kafka.producer.ssl.endpoint.identification.algorithm</span>
 <span class="o">=</span> <span class="s">HTTPS</span>
 </pre></div>
 </div>
 <p>Once enabled, clients will verify the server&#8217;s fully qualified domain 
name (FQDN)
@@ -4376,13 +4491,13 @@ against one of the following two fields:
 Each Flume agent has to have its client certificate which has to be trusted by 
Kafka brokers either
 individually or by their signature chain. Common example is to sign each 
client certificate by a single Root CA
 which in turn is trusted by Kafka brokers.</p>
-<div class="highlight-properties"><div class="highlight"><pre><span 
class="na">a1.channels.channel1.kafka.producer.ssl.keystore.location</span> 
<span class="o">=</span> <span class="s">/path/to/client.keystore.jks</span>
-<span 
class="na">a1.channels.channel1.kafka.producer.ssl.keystore.password</span> 
<span class="o">=</span> <span class="s">&lt;password to access the 
keystore&gt;</span>
+<div class="highlight-properties"><div class="highlight"><pre><span 
class="na">a1.sinks.sink1.kafka.producer.ssl.keystore.location</span> <span 
class="o">=</span> <span class="s">/path/to/client.keystore.jks</span>
+<span class="na">a1.sinks.sink1.kafka.producer.ssl.keystore.password</span> 
<span class="o">=</span> <span class="s">&lt;password to access the 
keystore&gt;</span>
 </pre></div>
 </div>
 <p>If keystore and key use different password protection then <tt 
class="docutils literal"><span class="pre">ssl.key.password</span></tt> 
property will
 provide the required additional secret for producer keystore:</p>
-<div class="highlight-properties"><div class="highlight"><pre><span 
class="na">a1.channels.channel1.kafka.producer.ssl.key.password</span> <span 
class="o">=</span> <span class="s">&lt;password to access the key&gt;</span>
+<div class="highlight-properties"><div class="highlight"><pre><span 
class="na">a1.sinks.sink1.kafka.producer.ssl.key.password</span> <span 
class="o">=</span> <span class="s">&lt;password to access the key&gt;</span>
 </pre></div>
 </div>
 <p><strong>Kerberos and Kafka Sink:</strong></p>
@@ -4395,23 +4510,23 @@ for information on the JAAS file content
 </pre></div>
 </div>
 <p>Example secure configuration using SASL_PLAINTEXT:</p>
-<div class="highlight-properties"><div class="highlight"><pre><span 
class="na">a1.channels.channel1.type</span> <span class="o">=</span> <span 
class="s">org.apache.flume.channel.kafka.KafkaChannel</span>
-<span class="na">a1.channels.channel1.kafka.bootstrap.servers</span> <span 
class="o">=</span> <span class="s">kafka-1:9093,kafka-2:9093,kafka-3:9093</span>
-<span class="na">a1.channels.channel1.kafka.topic</span> <span 
class="o">=</span> <span class="s">channel1</span>
-<span class="na">a1.channels.channel1.kafka.producer.security.protocol</span> 
<span class="o">=</span> <span class="s">SASL_PLAINTEXT</span>
-<span class="na">a1.channels.channel1.kafka.producer.sasl.mechanism</span> 
<span class="o">=</span> <span class="s">GSSAPI</span>
-<span 
class="na">a1.channels.channel1.kafka.producer.sasl.kerberos.service.name</span>
 <span class="o">=</span> <span class="s">kafka</span>
+<div class="highlight-properties"><div class="highlight"><pre><span 
class="na">a1.sinks.sink1.type</span> <span class="o">=</span> <span 
class="s">org.apache.flume.sink.kafka.KafkaSink</span>
+<span class="na">a1.sinks.sink1.kafka.bootstrap.servers</span> <span 
class="o">=</span> <span class="s">kafka-1:9093,kafka-2:9093,kafka-3:9093</span>
+<span class="na">a1.sinks.sink1.kafka.topic</span> <span class="o">=</span> 
<span class="s">mytopic</span>
+<span class="na">a1.sinks.sink1.kafka.producer.security.protocol</span> <span 
class="o">=</span> <span class="s">SASL_PLAINTEXT</span>
+<span class="na">a1.sinks.sink1.kafka.producer.sasl.mechanism</span> <span 
class="o">=</span> <span class="s">GSSAPI</span>
+<span 
class="na">a1.sinks.sink1.kafka.producer.sasl.kerberos.service.name</span> 
<span class="o">=</span> <span class="s">kafka</span>
 </pre></div>
 </div>
 <p>Example secure configuration using SASL_SSL:</p>
-<div class="highlight-properties"><div class="highlight"><pre><span 
class="na">a1.channels.channel1.type</span> <span class="o">=</span> <span 
class="s">org.apache.flume.channel.kafka.KafkaChannel</span>
-<span class="na">a1.channels.channel1.kafka.bootstrap.servers</span> <span 
class="o">=</span> <span class="s">kafka-1:9093,kafka-2:9093,kafka-3:9093</span>
-<span class="na">a1.channels.channel1.kafka.topic</span> <span 
class="o">=</span> <span class="s">channel1</span>
-<span class="na">a1.channels.channel1.kafka.producer.security.protocol</span> 
<span class="o">=</span> <span class="s">SASL_SSL</span>
-<span class="na">a1.channels.channel1.kafka.producer.sasl.mechanism</span> 
<span class="o">=</span> <span class="s">GSSAPI</span>
-<span 
class="na">a1.channels.channel1.kafka.producer.sasl.kerberos.service.name</span>
 <span class="o">=</span> <span class="s">kafka</span>
-<span 
class="na">a1.channels.channel1.kafka.producer.ssl.truststore.location</span> 
<span class="o">=</span> <span class="s">/path/to/truststore.jks</span>
-<span 
class="na">a1.channels.channel1.kafka.producer.ssl.truststore.password</span> 
<span class="o">=</span> <span class="s">&lt;password to access the 
truststore&gt;</span>
+<div class="highlight-properties"><div class="highlight"><pre><span 
class="na">a1.sinks.sink1.type</span> <span class="o">=</span> <span 
class="s">org.apache.flume.sink.kafka.KafkaSink</span>
+<span class="na">a1.sinks.sink1.kafka.bootstrap.servers</span> <span 
class="o">=</span> <span class="s">kafka-1:9093,kafka-2:9093,kafka-3:9093</span>
+<span class="na">a1.sinks.sink1.kafka.topic</span> <span class="o">=</span> 
<span class="s">mytopic</span>
+<span class="na">a1.sinks.sink1.kafka.producer.security.protocol</span> <span 
class="o">=</span> <span class="s">SASL_SSL</span>
+<span class="na">a1.sinks.sink1.kafka.producer.sasl.mechanism</span> <span 
class="o">=</span> <span class="s">GSSAPI</span>
+<span 
class="na">a1.sinks.sink1.kafka.producer.sasl.kerberos.service.name</span> 
<span class="o">=</span> <span class="s">kafka</span>
+<span class="na">a1.sinks.sink1.kafka.producer.ssl.truststore.location</span> 
<span class="o">=</span> <span class="s">/path/to/truststore.jks</span>
+<span class="na">a1.sinks.sink1.kafka.producer.ssl.truststore.password</span> 
<span class="o">=</span> <span class="s">&lt;password to access the 
truststore&gt;</span>
 </pre></div>
 </div>
 <p>Sample JAAS file. For details of its content, please see the client config 
sections of the desired authentication mechanism (GSSAPI/PLAIN)
@@ -4428,6 +4543,114 @@ that the operating system user of the Fl
 </pre></div>
 </div>
 </div>
+<div class="section" id="http-sink">
+<h4>HTTP Sink<a class="headerlink" href="#http-sink" title="Permalink to this 
headline">¶</a></h4>
+<p>This sink takes events from the channel and sends them to a remote service
+using an HTTP POST request. The event content is sent as the POST body.</p>
+<p>Error handling behaviour of this sink depends on the HTTP response returned
+by the target server. The sink backoff/ready status is configurable, as is the
+transaction commit/rollback result and whether the event contributes to the
+successful event drain count.</p>
+<p>Any malformed HTTP response returned by the server, where the status code is
+not readable, will result in a backoff signal, and the event will not be consumed
+from the channel.</p>
+<p>Required properties are in <strong>bold</strong>.</p>
+<table border="1" class="docutils">
+<colgroup>
+<col width="18%" />
+<col width="12%" />
+<col width="70%" />
+</colgroup>
+<thead valign="bottom">
+<tr class="row-odd"><th class="head">Property Name</th>
+<th class="head">Default</th>
+<th class="head">Description</th>
+</tr>
+</thead>
+<tbody valign="top">
+<tr class="row-even"><td><strong>channel</strong></td>
+<td>&#8211;</td>
+<td>&nbsp;</td>
+</tr>
+<tr class="row-odd"><td><strong>type</strong></td>
+<td>&#8211;</td>
+<td>The component type name, needs to be <tt class="docutils literal"><span 
class="pre">http</span></tt>.</td>
+</tr>
+<tr class="row-even"><td><strong>endpoint</strong></td>
+<td>&#8211;</td>
+<td>The fully qualified URL endpoint to POST to</td>
+</tr>
+<tr class="row-odd"><td>connectTimeout</td>
+<td>5000</td>
+<td>The socket connection timeout in milliseconds</td>
+</tr>
+<tr class="row-even"><td>requestTimeout</td>
+<td>5000</td>
+<td>The maximum request processing time in milliseconds</td>
+</tr>
+<tr class="row-odd"><td>contentTypeHeader</td>
+<td>text/plain</td>
+<td>The HTTP Content-Type header</td>
+</tr>
+<tr class="row-even"><td>acceptHeader</td>
+<td>text/plain</td>
+<td>The HTTP Accept header value</td>
+</tr>
+<tr class="row-odd"><td>defaultBackoff</td>
+<td>true</td>
+<td>Whether to backoff by default on receiving all HTTP status codes</td>
+</tr>
+<tr class="row-even"><td>defaultRollback</td>
+<td>true</td>
+<td>Whether to rollback by default on receiving all HTTP status codes</td>
+</tr>
+<tr class="row-odd"><td>defaultIncrementMetrics</td>
+<td>false</td>
+<td>Whether to increment metrics by default on receiving all HTTP status 
codes</td>
+</tr>
+<tr class="row-even"><td>backoff.CODE</td>
+<td>&#8211;</td>
+<td>Configures a specific backoff for an individual (e.g. 200) code or a group 
(e.g. 2XX) of codes</td>
+</tr>
+<tr class="row-odd"><td>rollback.CODE</td>
+<td>&#8211;</td>
+<td>Configures a specific rollback for an individual (e.g. 200) code or a 
group (e.g. 2XX) of codes</td>
+</tr>
+<tr class="row-even"><td>incrementMetrics.CODE</td>
+<td>&#8211;</td>
+<td>Configures a specific metrics increment for an individual (e.g. 200) code 
or a group (e.g. 2XX) of codes</td>
+</tr>
+</tbody>
+</table>
+<p>Note that the most specific HTTP status code match is used for the backoff,
+rollback and incrementMetrics configuration options. If there are configuration
+values for both 2XX and 200 status codes, then 200 HTTP codes will use the 200
+value, and all other HTTP codes in the 201-299 range will use the 2XX 
value.</p>
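The most-specific-match rule above can be sketched as a small resolution function. This is illustrative only, under the assumption that per-code settings are keyed by strings like ``200`` and ``2XX``; it is not Flume's actual implementation:

```python
# Illustrative only: resolve a backoff/rollback/incrementMetrics flag for an
# HTTP status code, preferring the exact code over its group over the default.
def resolve(status_code, per_code, default):
    exact = str(status_code)          # e.g. "200"
    group = exact[0] + "XX"           # e.g. "2XX"
    if exact in per_code:
        return per_code[exact]
    if group in per_code:
        return per_code[group]
    return default

cfg = {"2XX": True, "200": False}
resolve(200, cfg, True)   # exact 200 entry wins -> False
resolve(204, cfg, True)   # falls back to the 2XX group -> True
resolve(404, cfg, True)   # no entry, so the default applies -> True
```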
+<p>Any empty or null events are consumed without any request being made to the
+HTTP endpoint.</p>
+<p>Example for agent named a1:</p>
+<div class="highlight-properties"><div class="highlight"><pre><span 
class="na">a1.channels</span> <span class="o">=</span> <span class="s">c1</span>
+<span class="na">a1.sinks</span> <span class="o">=</span> <span 
class="s">k1</span>
+<span class="na">a1.sinks.k1.type</span> <span class="o">=</span> <span 
class="s">http</span>
+<span class="na">a1.sinks.k1.channel</span> <span class="o">=</span> <span 
class="s">c1</span>
+<span class="na">a1.sinks.k1.endpoint</span> <span class="o">=</span> <span 
class="s">http://localhost:8080/someuri</span>
+<span class="na">a1.sinks.k1.connectTimeout</span> <span class="o">=</span> 
<span class="s">2000</span>
+<span class="na">a1.sinks.k1.requestTimeout</span> <span class="o">=</span> 
<span class="s">2000</span>
+<span class="na">a1.sinks.k1.acceptHeader</span> <span class="o">=</span> 
<span class="s">application/json</span>
+<span class="na">a1.sinks.k1.contentTypeHeader</span> <span class="o">=</span> 
<span class="s">application/json</span>
+<span class="na">a1.sinks.k1.defaultBackoff</span> <span class="o">=</span> 
<span class="s">true</span>
+<span class="na">a1.sinks.k1.defaultRollback</span> <span class="o">=</span> 
<span class="s">true</span>
+<span class="na">a1.sinks.k1.defaultIncrementMetrics</span> <span 
class="o">=</span> <span class="s">false</span>
+<span class="na">a1.sinks.k1.backoff.4XX</span> <span class="o">=</span> <span 
class="s">false</span>
+<span class="na">a1.sinks.k1.rollback.4XX</span> <span class="o">=</span> 
<span class="s">false</span>
+<span class="na">a1.sinks.k1.incrementMetrics.4XX</span> <span 
class="o">=</span> <span class="s">true</span>
+<span class="na">a1.sinks.k1.backoff.200</span> <span class="o">=</span> <span 
class="s">false</span>
+<span class="na">a1.sinks.k1.rollback.200</span> <span class="o">=</span> 
<span class="s">false</span>
+<span class="na">a1.sinks.k1.incrementMetrics.200</span> <span 
class="o">=</span> <span class="s">true</span>
+</pre></div>
+</div>
+</div>
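For local experimentation with the example configuration above, a minimal receiving service can be stood up with the Python standard library. This is a sketch, not part of Flume; the handler reads each POSTed event body and answers 200, so under the example's 200 settings the sink neither backs off nor rolls back:

```python
# Hypothetical target service for the HTTP Sink: each POST carries one event
# body; replying 200 signals success per the sink's 200/2XX configuration.
from http.server import BaseHTTPRequestHandler, HTTPServer

class EventHandler(BaseHTTPRequestHandler):
    last_body = None  # stored here only so the sketch is easy to inspect

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        EventHandler.last_body = self.rfile.read(length)  # the event body
        self.send_response(200)
        self.end_headers()

    def log_message(self, fmt, *args):
        pass  # keep the sketch quiet

def run(host="localhost", port=8080):
    # Port 8080 matches the example endpoint http://localhost:8080/someuri
    HTTPServer((host, port), EventHandler).serve_forever()
```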
 <div class="section" id="custom-sink">
 <h4>Custom Sink<a class="headerlink" href="#custom-sink" title="Permalink to 
this headline">¶</a></h4>
 <p>A custom sink is your own implementation of the Sink interface. A custom
@@ -5697,13 +5920,13 @@ the HostInterceptor.</p>
 <div class="section" id="timestamp-interceptor">
 <h4>Timestamp Interceptor<a class="headerlink" href="#timestamp-interceptor" 
title="Permalink to this headline">¶</a></h4>
 <p>This interceptor inserts into the event headers the time in millis at 
which it processes the event. This interceptor
-inserts a header with key <tt class="docutils literal"><span 
class="pre">timestamp</span></tt> whose value is the relevant timestamp. This 
interceptor
-can preserve an existing timestamp if it is already present in the 
configuration.</p>
+inserts a header with key <tt class="docutils literal"><span 
class="pre">timestamp</span></tt> (or as specified by the <tt class="docutils 
literal"><span class="pre">header</span></tt> property) whose value is the 
relevant timestamp.
+This interceptor can preserve an existing timestamp if it is already present 
in the event headers.</p>
 <table border="1" class="docutils">
 <colgroup>
-<col width="17%" />
-<col width="7%" />
-<col width="76%" />
+<col width="16%" />
+<col width="9%" />
+<col width="74%" />
 </colgroup>
 <thead valign="bottom">
 <tr class="row-odd"><th class="head">Property Name</th>
@@ -5716,7 +5939,11 @@ can preserve an existing timestamp if it
 <td>&#8211;</td>
 <td>The component type name, has to be <tt class="docutils literal"><span 
class="pre">timestamp</span></tt> or the FQCN</td>
 </tr>
-<tr class="row-odd"><td>preserveExisting</td>
+<tr class="row-odd"><td>header</td>
+<td>timestamp</td>
+<td>The name of the header in which to place the generated timestamp.</td>
+</tr>
+<tr class="row-even"><td>preserveExisting</td>
 <td>false</td>
 <td>If the timestamp already exists, should it be preserved - true or 
false</td>
 </tr>
@@ -5772,7 +5999,6 @@ with key <tt class="docutils literal"><s
 <span class="na">a1.channels</span> <span class="o">=</span> <span 
class="s">c1</span>
 <span class="na">a1.sources.r1.interceptors</span> <span class="o">=</span> 
<span class="s">i1</span>
 <span class="na">a1.sources.r1.interceptors.i1.type</span> <span 
class="o">=</span> <span class="s">host</span>
-<span class="na">a1.sources.r1.interceptors.i1.hostHeader</span> <span 
class="o">=</span> <span class="s">hostname</span>
 </pre></div>
 </div>
 </div>
@@ -5824,6 +6050,46 @@ multiple static interceptors each defini
 </pre></div>
 </div>
 </div>
+<div class="section" id="remove-header-interceptor">
+<h4>Remove Header Interceptor<a class="headerlink" 
href="#remove-header-interceptor" title="Permalink to this headline">¶</a></h4>
+<p>This interceptor manipulates Flume event headers by removing one or many 
headers. It can remove a statically defined header, headers matching a regular 
expression, or headers in a list. If none of these is defined, or if no header 
matches the criteria, the Flume events are not modified.</p>
+<p>Note that if only one header needs to be removed, specifying it by name 
provides a performance benefit over the other two methods.</p>
+<table border="1" class="docutils">
+<colgroup>
+<col width="11%" />
+<col width="6%" />
+<col width="84%" />
+</colgroup>
+<thead valign="bottom">
+<tr class="row-odd"><th class="head">Property Name</th>
+<th class="head">Default</th>
+<th class="head">Description</th>
+</tr>
+</thead>
+<tbody valign="top">
+<tr class="row-even"><td><strong>type</strong></td>
+<td>&#8211;</td>
+<td>The component type name has to be <tt class="docutils literal"><span 
class="pre">remove_header</span></tt></td>
+</tr>
+<tr class="row-odd"><td>withName</td>
+<td>&#8211;</td>
+<td>Name of the header to remove</td>
+</tr>
+<tr class="row-even"><td>fromList</td>
+<td>&#8211;</td>
+<td>List of headers to remove, separated with the separator specified by <tt 
class="docutils literal"><span class="pre">fromListSeparator</span></tt></td>
+</tr>
+<tr class="row-odd"><td>fromListSeparator</td>
+<td>\s*,\s*</td>
+<td>Regular expression used to separate multiple header names in the list 
specified by <tt class="docutils literal"><span 
class="pre">fromList</span></tt>. Default is a comma surrounded by any number 
of whitespace characters</td>
+</tr>
+<tr class="row-even"><td>matching</td>
+<td>&#8211;</td>
+<td>All the headers which names match this regular expression are removed</td>
+</tr>
+</tbody>
+</table>
+</div>
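The three removal criteria above combine as follows; this illustrative sketch (in Python, not Flume's Java implementation) mirrors that logic, with the default fromListSeparator of ``\s*,\s*``:

```python
# Illustrative sketch of the Remove Header Interceptor rules: a header is
# dropped if it equals withName, appears in fromList (split on the separator
# regex), or matches the 'matching' regular expression.
import re

def remove_headers(headers, with_name=None, from_list=None,
                   from_list_separator=r"\s*,\s*", matching=None):
    names = set()
    if with_name is not None:
        names.add(with_name)
    if from_list is not None:
        names.update(re.split(from_list_separator, from_list))
    pattern = re.compile(matching) if matching is not None else None
    # If no criterion is configured, every header survives unchanged.
    return {k: v for k, v in headers.items()
            if k not in names and not (pattern and pattern.fullmatch(k))}
```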
 <div class="section" id="uuid-interceptor">
 <h4>UUID Interceptor<a class="headerlink" href="#uuid-interceptor" 
title="Permalink to this headline">¶</a></h4>
 <p>This interceptor sets a universally unique identifier on all events that 
are intercepted. An example UUID is <tt class="docutils literal"><span 
class="pre">b5755073-77a9-43c1-8fad-b7a586fc1b97</span></tt>, which represents 
a 128-bit value.</p>
@@ -6119,7 +6385,7 @@ polling rather than terminating.</p>
 <h2>Log4J Appender<a class="headerlink" href="#log4j-appender" 
title="Permalink to this headline">¶</a></h2>
 <p>Appends Log4j events to a flume agent&#8217;s avro source. A client using 
this
 appender must have the flume-ng-sdk in the classpath (e.g.,
-flume-ng-sdk-1.8.0-SNAPSHOT.jar).
+flume-ng-sdk-1.8.0.jar).
 Required properties are in <strong>bold</strong>.</p>
 <table border="1" class="docutils">
 <colgroup>
@@ -6199,7 +6465,7 @@ then the schema will be included as a Fl
 <h2>Load Balancing Log4J Appender<a class="headerlink" 
href="#load-balancing-log4j-appender" title="Permalink to this 
headline">¶</a></h2>
 <p>Appends Log4j events to a list of flume agents&#8217; avro sources. A client 
using this
 appender must have the flume-ng-sdk in the classpath (e.g.,
-flume-ng-sdk-1.8.0-SNAPSHOT.jar). This appender supports a round-robin and 
random
+flume-ng-sdk-1.8.0.jar). This appender supports a round-robin and random
 scheme for performing the load balancing. It also supports a configurable 
backoff
 timeout so that down agents are removed temporarily from the set of hosts.
 Required properties are in <strong>bold</strong>.</p>
@@ -7028,7 +7294,7 @@ can be leveraged to move the Flume agent
 
 <h3><a href="index.html">This Page</a></h3>
 <ul>
-<li><a class="reference internal" href="#">Flume 1.7.0 User Guide</a><ul>
+<li><a class="reference internal" href="#">Flume 1.8.0 User Guide</a><ul>
 <li><a class="reference internal" href="#introduction">Introduction</a><ul>
 <li><a class="reference internal" href="#overview">Overview</a></li>
 <li><a class="reference internal" href="#system-requirements">System 
Requirements</a></li>
@@ -7047,6 +7313,7 @@ can be leveraged to move the Flume agent
 <li><a class="reference internal" href="#wiring-the-pieces-together">Wiring 
the pieces together</a></li>
 <li><a class="reference internal" href="#starting-an-agent">Starting an 
agent</a></li>
 <li><a class="reference internal" href="#a-simple-example">A simple 
example</a></li>
+<li><a class="reference internal" 
href="#using-environment-variables-in-configuration-files">Using environment 
variables in configuration files</a></li>
 <li><a class="reference internal" href="#logging-raw-data">Logging raw 
data</a></li>
 <li><a class="reference internal" 
href="#zookeeper-based-configuration">Zookeeper based Configuration</a></li>
 <li><a class="reference internal" 
href="#installing-third-party-plugins">Installing third-party plugins</a><ul>
@@ -7093,7 +7360,8 @@ can be leveraged to move the Flume agent
 <li><a class="reference internal" href="#taildir-source">Taildir 
Source</a></li>
 <li><a class="reference internal" 
href="#twitter-1-firehose-source-experimental">Twitter 1% firehose Source 
(experimental)</a></li>
 <li><a class="reference internal" href="#kafka-source">Kafka Source</a></li>
-<li><a class="reference internal" href="#netcat-source">NetCat Source</a></li>
+<li><a class="reference internal" href="#netcat-tcp-source">NetCat TCP 
Source</a></li>
+<li><a class="reference internal" href="#netcat-udp-source">NetCat UDP 
Source</a></li>
 <li><a class="reference internal" href="#sequence-generator-source">Sequence 
Generator Source</a></li>
 <li><a class="reference internal" href="#syslog-sources">Syslog Sources</a><ul>
 <li><a class="reference internal" href="#syslog-tcp-source">Syslog TCP 
Source</a></li>
@@ -7134,6 +7402,7 @@ can be leveraged to move the Flume agent
 <li><a class="reference internal" 
href="#elasticsearchsink">ElasticSearchSink</a></li>
 <li><a class="reference internal" href="#kite-dataset-sink">Kite Dataset 
Sink</a></li>
 <li><a class="reference internal" href="#kafka-sink">Kafka Sink</a></li>
+<li><a class="reference internal" href="#http-sink">HTTP Sink</a></li>
 <li><a class="reference internal" href="#custom-sink">Custom Sink</a></li>
 </ul>
 </li>
@@ -7170,6 +7439,7 @@ can be leveraged to move the Flume agent
 <li><a class="reference internal" href="#timestamp-interceptor">Timestamp 
Interceptor</a></li>
 <li><a class="reference internal" href="#host-interceptor">Host 
Interceptor</a></li>
 <li><a class="reference internal" href="#static-interceptor">Static 
Interceptor</a></li>
+<li><a class="reference internal" href="#remove-header-interceptor">Remove 
Header Interceptor</a></li>
 <li><a class="reference internal" href="#uuid-interceptor">UUID 
Interceptor</a></li>
 <li><a class="reference internal" href="#morphline-interceptor">Morphline 
Interceptor</a></li>
 <li><a class="reference internal" 
href="#search-and-replace-interceptor">Search and Replace Interceptor</a></li>
@@ -7231,7 +7501,7 @@ can be leveraged to move the Flume agent
       <div class="clearer"></div>
     </div>
 <div class="footer">
-    &copy; Copyright 2009-2012 The Apache Software Foundation. Apache Flume, 
Flume, Apache, the Apache feather logo, and the Apache Flume project logo are 
trademarks of The Apache Software Foundation..
+    &copy; Copyright 2009-2017 The Apache Software Foundation. Apache Flume, 
Flume, Apache, the Apache feather logo, and the Apache Flume project logo are 
trademarks of The Apache Software Foundation.
 </div>
   </body>
 </html>
\ No newline at end of file

Modified: websites/staging/flume/trunk/content/_sources/FlumeDeveloperGuide.txt
==============================================================================
--- websites/staging/flume/trunk/content/_sources/FlumeDeveloperGuide.txt 
(original)
+++ websites/staging/flume/trunk/content/_sources/FlumeDeveloperGuide.txt Wed 
Oct  4 16:24:02 2017
@@ -15,7 +15,7 @@
 
 
 ======================================
-Flume 1.7.0 Developer Guide
+Flume 1.8.0 Developer Guide
 ======================================
 
 Introduction
@@ -114,6 +114,18 @@ Please note that Flume builds requires t
 be in the path. You can download and install it by following the instructions
 `here <https://developers.google.com/protocol-buffers/>`_.
 
+Updating Protocol Buffer Version
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+The file channel has a dependency on Protocol Buffer. When updating the version of 
Protocol Buffer
+used by Flume, it is necessary to regenerate the data access classes using the 
protoc compiler
+that is part of Protocol Buffer, as follows.
+
+#. Install the desired version of Protocol Buffer on your local machine
+#. Update version of Protocol Buffer in pom.xml
+#. Generate new Protocol Buffer data access classes in Flume: ``cd 
flume-ng-channels/flume-file-channel; mvn -P compile-proto clean package 
-DskipTests``
+#. Add Apache license header to any of the generated files that are missing it
+#. Rebuild and test Flume:  ``cd ../..; mvn clean install``
+
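The numbered steps above can be sketched as a shell script. This is a dry-run sketch, not part of the committed guide: the `run` helper only prints each command so the sequence can be reviewed before executing it for real; the paths and the `compile-proto` Maven profile are taken verbatim from the steps, while the manual steps (editing pom.xml, adding license headers) remain comments.

```shell
#!/bin/sh
# Dry-run sketch of the Protocol Buffer regeneration workflow described
# in the "Updating Protocol Buffer Version" section above. Assumes a
# Flume source checkout with the desired protoc already on the PATH.
PLAN=""
run() { PLAN="${PLAN}+ $*
"; echo "+ $*"; }

# 1. Verify the locally installed protoc version.
run protoc --version

# 2. Update the Protocol Buffer version in pom.xml (done by hand).

# 3. Regenerate the file channel's data access classes.
run cd flume-ng-channels/flume-file-channel
run mvn -P compile-proto clean package -DskipTests

# 4. Add the Apache license header to any generated file missing it
#    (done by hand).

# 5. Rebuild and test Flume.
run cd ../..
run mvn clean install
```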
 Developing custom components
 ----------------------------
 
@@ -573,23 +585,23 @@ Required properties are in **bold**.
 Property Name          Default           Description
 =====================  ================  
======================================================================
 source.type            embedded          The only available source is the 
embedded source.
-**channel.type**       --                Either ``memory`` or ``file`` which 
correspond
+**channel.type**       --                Either ``memory`` or ``file`` which 
correspond 
                                         to MemoryChannel and FileChannel 
respectively.
 channel.*              --                Configuration options for the channel 
type requested,
                                         see MemoryChannel or FileChannel user 
guide for an exhaustive list.
 **sinks**              --                List of sink names
-**sink.type**          --                Property name must match a name in 
the list of sinks.
+**sink.type**          --                Property name must match a name in 
the list of sinks. 
                                         Value must be ``avro``
-sink.*                 --                Configuration options for the sink.
+sink.*                 --                Configuration options for the sink. 
                                         See AvroSink user guide for an 
exhaustive list,
                                         however note AvroSink requires at 
least hostname and port.
 **processor.type**     --                Either ``failover`` or 
``load_balance`` which correspond
                                         to FailoverSinksProcessor and 
LoadBalancingSinkProcessor respectively.
 processor.*            --                Configuration options for the sink 
processor selected.
-                                        See FailoverSinksProcessor and 
LoadBalancingSinkProcessor
+                                        See FailoverSinksProcessor and 
LoadBalancingSinkProcessor 
                                         user guide for an exhaustive list.
 source.interceptors    --                Space-separated list of interceptors
-source.interceptors.*  --                Configuration options for individual 
interceptors
+source.interceptors.*  --                Configuration options for individual 
interceptors 
                                         specified in the source.interceptors 
property
 =====================  ================  
======================================================================
 

