This is an automated email from the ASF dual-hosted git repository.

git-site-role pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/beam.git


The following commit(s) were added to refs/heads/asf-site by this push:
     new e6de0fe  Publishing website 2020/03/03 22:26:42 at commit abfe08b
e6de0fe is described below

commit e6de0fe9422252c18592069f6785beb2d1b76a48
Author: jenkins <[email protected]>
AuthorDate: Tue Mar 3 22:26:42 2020 +0000

    Publishing website 2020/03/03 22:26:42 at commit abfe08b
---
 website/generated-content/documentation/runners/flink/index.html | 2 +-
 website/generated-content/documentation/runners/spark/index.html | 7 +++----
 2 files changed, 4 insertions(+), 5 deletions(-)

diff --git a/website/generated-content/documentation/runners/flink/index.html b/website/generated-content/documentation/runners/flink/index.html
index 9f5533d..0ee13ae 100644
--- a/website/generated-content/documentation/runners/flink/index.html
+++ b/website/generated-content/documentation/runners/flink/index.html
@@ -534,7 +534,7 @@ To run on a separate <a href="https://ci.apache.org/projects/flink/flink-docs-re
 
 <p><span class="language-py">3. Submit the pipeline as above.
 Note however that <code class="highlighter-rouge">environment_type=LOOPBACK</code> is only intended for local testing.
-See <a href="/roadmap/portability/#sdk-harness-config">here</a> for details.
+See <a href="/documentation/runtime/sdk-harness-config/">here</a> for details.
 </span></p>
 
 <p><span class="language-py">Steps 2 and 3 can be automated in Python by using the <code class="highlighter-rouge">FlinkRunner</code>,
diff --git a/website/generated-content/documentation/runners/spark/index.html b/website/generated-content/documentation/runners/spark/index.html
index 895b297..b8eac5b 100644
--- a/website/generated-content/documentation/runners/spark/index.html
+++ b/website/generated-content/documentation/runners/spark/index.html
@@ -438,12 +438,11 @@ For more details on the different deployment modes see: <a href="http://spark.ap
 <p><span class="language-py">2. Start JobService that will connect with the Spark master: <code class="highlighter-rouge">./gradlew :runners:spark:job-server:runShadow -PsparkMasterUrl=spark://localhost:7077</code>.</span></p>
 
 <p><span class="language-py">3. Submit the pipeline as above.
-Note however that <code class="highlighter-rouge">environment_type=LOOPBACK</code> is only intended for local testing.
-See <a href="/roadmap/portability/#sdk-harness-config">here</a> for details.</span></p>
+Note however that <code class="highlighter-rouge">environment_type=LOOPBACK</code> is only intended for local testing.</span></p>
 
 <p><span class="language-py">
-(Note that, depending on your cluster setup, you may need to change the <code class="highlighter-rouge">environment_type</code> option.
-See <a href="/roadmap/portability/#sdk-harness-config">here</a> for details.)
+Depending on your cluster setup, you may need to change the <code class="highlighter-rouge">environment_type</code> option.
+See <a href="/documentation/runtime/sdk-harness-config/">here</a> for details.
 </span></p>
 
 <h2 id="pipeline-options-for-the-spark-runner">Pipeline options for the Spark Runner</h2>
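For context, the documentation lines touched above describe a two-step submission flow. A minimal sketch of that flow is below; the job server command is taken verbatim from the page being edited, while the pipeline script name is a hypothetical placeholder and the job endpoint assumes the Beam job server's default port (8099):

```shell
# Step 2 (from the edited docs): start the Spark job server from a Beam
# source checkout, pointing it at a running Spark master.
./gradlew :runners:spark:job-server:runShadow -PsparkMasterUrl=spark://localhost:7077

# Step 3: submit a Python pipeline against the job server.
# "my_pipeline.py" is illustrative; environment_type=LOOPBACK is only
# intended for local testing, as the updated page notes.
python my_pipeline.py \
  --runner=PortableRunner \
  --job_endpoint=localhost:8099 \
  --environment_type=LOOPBACK
```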
