Github user mjsax commented on a diff in the pull request:
https://github.com/apache/flink/pull/884#discussion_r34166660
--- Diff: docs/apis/storm_compatibility.md ---
@@ -0,0 +1,155 @@
+---
+title: "Storm Compatibility"
+is_beta: true
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements. See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership. The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License. You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied. See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+[Flink streaming](streaming_guide.html) is compatible with Apache Storm interfaces and therefore allows
+reusing code that was implemented for Storm.
+
+You can:
+
+- execute a whole Storm `Topology` in Flink.
+- use Storm `Spout`/`Bolt` as source/operator in Flink streaming programs.
+
+This document shows how to use existing Storm code with Flink.
+
+* This will be replaced by the TOC
+{:toc}
+
+### Project Configuration
+
+Support for Storm is contained in the `flink-storm-compatibility-core` Maven module.
+The code resides in the `org.apache.flink.stormcompatibility` package.
+
+Add the following dependency to your `pom.xml` if you want to execute Storm code in Flink.
+
+~~~xml
+<dependency>
+  <groupId>org.apache.flink</groupId>
+  <artifactId>flink-storm-compatibility-core</artifactId>
+  <version>{{site.version}}</version>
+</dependency>
+~~~
+
+**Please note**: `flink-storm-compatibility-core` is not part of the provided binary Flink distribution. Thus, you need to include the `flink-storm-compatibility-core` classes (and their dependencies) in the program jar that is submitted to Flink's JobManager.
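+
+One way to bundle these classes into your program jar is the Maven Shade Plugin. The following is a minimal sketch; the plugin version and any filter or relocation settings are omitted and need to be adapted to your project:
+
+~~~xml
+<plugin>
+  <groupId>org.apache.maven.plugins</groupId>
+  <artifactId>maven-shade-plugin</artifactId>
+  <executions>
+    <execution>
+      <!-- build a fat jar during "mvn package" that contains all compile/runtime dependencies -->
+      <phase>package</phase>
+      <goals>
+        <goal>shade</goal>
+      </goals>
+    </execution>
+  </executions>
+</plugin>
+~~~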
+
+### Execute Storm Topologies
+
+Flink provides a Storm-compatible API (`org.apache.flink.stormcompatibility.api`) that offers replacements for the following classes:
+
+- `TopologyBuilder` replaced by `FlinkTopologyBuilder`
+- `StormSubmitter` replaced by `FlinkSubmitter`
+- `NimbusClient` and `Client` replaced by `FlinkClient`
+- `LocalCluster` replaced by `FlinkLocalCluster`
+
+In order to submit a Storm topology to Flink, it is sufficient to replace the used Storm classes with their Flink replacements in the Storm client code that assembles the topology.
+If a topology is executed on a remote cluster, the parameters `nimbus.host` and `nimbus.thrift.port` are used as `jobmanager.rpc.address` and `jobmanager.rpc.port`, respectively.
+If a parameter is not specified, the value is taken from `flink-conf.yaml`.
+
+<div class="codetabs" markdown="1">
+<div data-lang="java" markdown="1">
+~~~java
+FlinkTopologyBuilder builder = new FlinkTopologyBuilder(); // replaces: TopologyBuilder builder = new TopologyBuilder();
+
+builder.setSpout("source", new StormFileSpout(inputFilePath));
+builder.setBolt("tokenizer", new
StormBoltTokenizer()).shuffleGrouping("source");
+builder.setBolt("counter", new
StormBoltCounter()).fieldsGroupign("tokenizer", new Fields("word"));
--- End diff --
fixed. :)