This is an automated email from the ASF dual-hosted git repository.
git-site-role pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/beam.git
The following commit(s) were added to refs/heads/asf-site by this push:
new 1c05c5c Publishing website 2021/07/30 06:03:09 at commit 7c2b4b0
1c05c5c is described below
commit 1c05c5cbc26a79dee6e0c3ca3bafca356ea9dff0
Author: jenkins <[email protected]>
AuthorDate: Fri Jul 30 06:03:10 2021 +0000
Publishing website 2021/07/30 06:03:09 at commit 7c2b4b0
---
website/generated-content/documentation/index.xml | 14 ++++++--------
.../documentation/io/built-in/hadoop/index.html | 16 ++++++++--------
website/generated-content/sitemap.xml | 2 +-
3 files changed, 15 insertions(+), 17 deletions(-)
diff --git a/website/generated-content/documentation/index.xml
b/website/generated-content/documentation/index.xml
index 8690ebb..9c1230c 100644
--- a/website/generated-content/documentation/index.xml
+++ b/website/generated-content/documentation/index.xml
@@ -1047,7 +1047,7 @@ limitations under the License.
<li><code>key.class</code> - The <code>Key</code> class
returned by the <code>InputFormat</code> in
<code>mapreduce.job.inputformat.class</code>.</li>
<li><code>value.class</code> - The <code>Value</code> class
returned by the <code>InputFormat</code> in
<code>mapreduce.job.inputformat.class</code>.</li>
</ul>
-<p>For example:
+<p>For example:</p>
<div class='language-java snippet'>
<div class="notebook-skip code-snippet">
<a class="copy" type="button" data-bs-toggle="tooltip"
data-bs-placement="bottom" title="Copy to clipboard">
@@ -1061,7 +1061,6 @@ limitations under the License.
<span class="n">myHadoopConfiguration</span><span
class="o">.</span><span class="na">setClass</span><span
class="o">(</span><span
class="s">&#34;value.class&#34;</span><span class="o">,</span>
<span class="n">InputFormatValueClass</span><span
class="o">,</span> <span class="n">Object</span><span
class="o">.</span><span class="na">class</span><span
class="o">);</span></code></pre></div>
</div>
</div>
-</p>
<div class='language-py snippet'>
<div class="notebook-skip code-snippet">
<a class="copy" type="button" data-bs-toggle="tooltip"
data-bs-placement="bottom" title="Copy to clipboard">
@@ -1071,7 +1070,8 @@ limitations under the License.
</div>
</div>
<p>You will need to check if the <code>Key</code> and
<code>Value</code> classes output by the <code>InputFormat</code>
have a Beam <code>Coder</code> available. If not, you can use
<code>withKeyTranslation</code> or
<code>withValueTranslation</code> to specify a method transforming
instances of those classes into another class that is supported by a Beam
<code>Coder</code>. These settings are optional and you don&rsquo;t
need to specify t [...]
-<p>For example:
+<p>For example:</p>
+<p>
<div class='language-java snippet'>
<div class="notebook-skip code-snippet">
<a class="copy" type="button" data-bs-toggle="tooltip"
data-bs-placement="bottom" title="Copy to clipboard">
@@ -1091,7 +1091,6 @@ limitations under the License.
<span class="o">};</span></code></pre></div>
</div>
</div>
-</p>
<div class='language-py snippet'>
<div class="notebook-skip code-snippet">
<a class="copy" type="button" data-bs-toggle="tooltip"
data-bs-placement="bottom" title="Copy to clipboard">
@@ -1100,6 +1099,7 @@ limitations under the License.
<div class="highlight"><pre class="chroma"><code class="language-py"
data-lang="py"> <span class="c1"># The Beam SDK for Python does not support
Hadoop Input/Output Format IO.</span></code></pre></div>
</div>
</div>
+</p>
<h4 id="read-data-only-with-hadoop-configuration">Read data only with
Hadoop configuration.</h4>
<div class='language-java snippet'>
<div class="notebook-skip code-snippet">
@@ -1399,7 +1399,7 @@ The below example uses one such available wrapper API -
<a href="https://gith
Reading from a table snapshot bypasses the HBase region servers, instead
reading HBase data files directly from the filesystem.
This is useful for cases such as reading historical data or offloading of work
from the HBase cluster.
There are scenarios when this may prove faster than accessing content through
the region servers using the <code>HBaseIO</code>.</p>
-<p>A table snapshot can be taken using the HBase shell or programmatically:
+<p>A table snapshot can be taken using the HBase shell or
programmatically:</p>
<div class='language-java snippet'>
<div class="notebook-skip code-snippet">
<a class="copy" type="button" data-bs-toggle="tooltip"
data-bs-placement="bottom" title="Copy to clipboard">
@@ -1416,7 +1416,6 @@ There are scenarios when this may prove faster than
accessing content through th
<span class="o">}</span></code></pre></div>
</div>
</div>
-</p>
<div class='language-py snippet'>
<div class="notebook-skip code-snippet">
<a class="copy" type="button" data-bs-toggle="tooltip"
data-bs-placement="bottom" title="Copy to clipboard">
@@ -1491,7 +1490,7 @@ There are scenarios when this may prove faster than
accessing content through th
<li><code>mapreduce.job.partitioner.class</code> - Hadoop partitioner
class which will be used to distribute records among partitions. This
property is not required for
<code>Write.PartitionedWriterBuilder#withoutPartitioning()</code>
write.</li>
</ul>
<p><em>Note</em>: All mentioned values have appropriate constants.
E.g.: <code>HadoopFormatIO.OUTPUT_FORMAT_CLASS_ATTR</code>.</p>
-<p>For example:
+<p>For example:</p>
<div class='language-java snippet'>
<div class="notebook-skip code-snippet">
<a class="copy" type="button" data-bs-toggle="tooltip"
data-bs-placement="bottom" title="Copy to clipboard">
@@ -1510,7 +1509,6 @@ There are scenarios when this may prove faster than
accessing content through th
<span class="n">myHadoopConfiguration</span><span
class="o">.</span><span class="na">setInt</span><span
class="o">(</span><span
class="s">&#34;mapreduce.job.reduces&#34;</span><span
class="o">,</span> <span class="n">2</span><span
class="o">);</span></code></pre></div>
</div>
</div>
-</p>
<div class='language-py snippet'>
<div class="notebook-skip code-snippet">
<a class="copy" type="button" data-bs-toggle="tooltip"
data-bs-placement="bottom" title="Copy to clipboard">
diff --git
a/website/generated-content/documentation/io/built-in/hadoop/index.html
b/website/generated-content/documentation/io/built-in/hadoop/index.html
index 789e2a5..a150e0d 100644
--- a/website/generated-content/documentation/io/built-in/hadoop/index.html
+++ b/website/generated-content/documentation/io/built-in/hadoop/index.html
@@ -18,12 +18,12 @@
function addPlaceholder(){$('input:text').attr('placeholder',"What are you
looking for?");}
function endSearch(){var
search=document.querySelector(".searchBar");search.classList.add("disappear");var
icons=document.querySelector("#iconsBar");icons.classList.remove("disappear");}
function blockScroll(){$("body").toggleClass("fixedPosition");}
-function openMenu(){addPlaceholder();blockScroll();}</script><div
class="clearfix container-main-content"><div class="section-nav closed"
data-offset-top=90 data-offset-bottom=500><span class="section-nav-back
glyphicon glyphicon-menu-left"></span><nav><ul class=section-nav-list
data-section-nav><li><span
class=section-nav-list-main-title>Documentation</span></li><li><a
href=/documentation>Using the Documentation</a></li><li
class=section-nav-item--collapsible><span class=section-nav-lis [...]
+function openMenu(){addPlaceholder();blockScroll();}</script><div
class="clearfix container-main-content"><div class="section-nav closed"
data-offset-top=90 data-offset-bottom=500><span class="section-nav-back
glyphicon glyphicon-menu-left"></span><nav><ul class=section-nav-list
data-section-nav><li><span
class=section-nav-list-main-title>Documentation</span></li><li><a
href=/documentation>Using the Documentation</a></li><li
class=section-nav-item--collapsible><span class=section-nav-lis [...]
<span class=c1>// Set Hadoop InputFormat, key and value class in configuration
</span><span class=c1></span><span class=n>myHadoopConfiguration</span><span
class=o>.</span><span class=na>setClass</span><span class=o>(</span><span
class=s>"mapreduce.job.inputformat.class"</span><span class=o>,</span>
<span class=n>InputFormatClass</span><span class=o>,</span>
<span class=n>InputFormat</span><span class=o>.</span><span
class=na>class</span><span class=o>);</span>
<span class=n>myHadoopConfiguration</span><span class=o>.</span><span
class=na>setClass</span><span class=o>(</span><span
class=s>"key.class"</span><span class=o>,</span> <span
class=n>InputFormatKeyClass</span><span class=o>,</span> <span
class=n>Object</span><span class=o>.</span><span class=na>class</span><span
class=o>);</span>
-<span class=n>myHadoopConfiguration</span><span class=o>.</span><span
class=na>setClass</span><span class=o>(</span><span
class=s>"value.class"</span><span class=o>,</span> <span
class=n>InputFormatValueClass</span><span class=o>,</span> <span
class=n>Object</span><span class=o>.</span><span class=na>class</span><span
class=o>);</span></code></pre></div></div></div></p><div class="language-py
snippet"><div class="notebook-skip code-snippet"><a class=copy type=button
data-bs-toggl [...]
+<span class=n>myHadoopConfiguration</span><span class=o>.</span><span
class=na>setClass</span><span class=o>(</span><span
class=s>"value.class"</span><span class=o>,</span> <span
class=n>InputFormatValueClass</span><span class=o>,</span> <span
class=n>Object</span><span class=o>.</span><span class=na>class</span><span
class=o>);</span></code></pre></div></div></div><div class="language-py
snippet"><div class="notebook-skip code-snippet"><a class=copy type=button
data-bs-toggle=to [...]
<span class=k>new</span> <span class=n>SimpleFunction</span><span
class=o><</span><span class=n>InputFormatKeyClass</span><span
class=o>,</span> <span class=n>MyKeyClass</span><span class=o>>()</span>
<span class=o>{</span>
<span class=kd>public</span> <span class=n>MyKeyClass</span> <span
class=nf>apply</span><span class=o>(</span><span
class=n>InputFormatKeyClass</span> <span class=n>input</span><span
class=o>)</span> <span class=o>{</span>
<span class=c1>// ...logic to transform InputFormatKeyClass to MyKeyClass
@@ -34,7 +34,7 @@ function
openMenu(){addPlaceholder();blockScroll();}</script><div class="clearfi
<span class=kd>public</span> <span class=n>MyValueClass</span> <span
class=nf>apply</span><span class=o>(</span><span
class=n>InputFormatValueClass</span> <span class=n>input</span><span
class=o>)</span> <span class=o>{</span>
<span class=c1>// ...logic to transform InputFormatValueClass to MyValueClass
</span><span class=c1></span> <span class=o>}</span>
-<span class=o>};</span></code></pre></div></div></div></p><div
class="language-py snippet"><div class="notebook-skip code-snippet"><a
class=copy type=button data-bs-toggle=tooltip data-bs-placement=bottom
title="Copy to clipboard"><img src=/images/copy-icon.svg></a><div
class=highlight><pre class=chroma><code class=language-py data-lang=py> <span
class=c1># The Beam SDK for Python does not support Hadoop Input/Output Format
IO.</span></code></pre></div></div></div><h4 id=read-data-only- [...]
+<span class=o>};</span></code></pre></div></div></div><div class="language-py
snippet"><div class="notebook-skip code-snippet"><a class=copy type=button
data-bs-toggle=tooltip data-bs-placement=bottom title="Copy to clipboard"><img
src=/images/copy-icon.svg></a><div class=highlight><pre class=chroma><code
class=language-py data-lang=py> <span class=c1># The Beam SDK for Python does
not support Hadoop Input/Output Format
IO.</span></code></pre></div></div></div></p><h4 id=read-data-only- [...]
<span class=n>HadoopFormatIO</span><span class=o>.<</span><span
class=n>InputFormatKeyClass</span><span class=o>,</span> <span
class=n>InputFormatKeyClass</span><span class=o>></span><span
class=n>read</span><span class=o>()</span>
<span class=o>.</span><span class=na>withConfiguration</span><span
class=o>(</span><span class=n>myHadoopConfiguration</span><span
class=o>);</span></code></pre></div></div></div><div class="language-py
snippet"><div class="notebook-skip code-snippet"><a class=copy type=button
data-bs-toggle=tooltip data-bs-placement=bottom title="Copy to clipboard"><img
src=/images/copy-icon.svg></a><div class=highlight><pre class=chroma><code
class=language-py data-lang=py> <span class=c1># The Beam [...]
<span class=n>HadoopFormatIO</span><span class=o>.<</span><span
class=n>MyKeyClass</span><span class=o>,</span> <span
class=n>InputFormatKeyClass</span><span class=o>></span><span
class=n>read</span><span class=o>()</span>
@@ -102,7 +102,7 @@ The below example uses one such available wrapper API - <a
href=https://github.c
<span class=o>.</span><span class=na>withConfiguration</span><span
class=o>(</span><span class=n>dynamoDBConf</span><span
class=o>);</span></code></pre></div></div></div><div class="language-py
snippet"><div class="notebook-skip code-snippet"><a class=copy type=button
data-bs-toggle=tooltip data-bs-placement=bottom title="Copy to clipboard"><img
src=/images/copy-icon.svg></a><div class=highlight><pre class=chroma><code
class=language-py data-lang=py> <span class=c1># The Beam SDK for [...]
Reading from a table snapshot bypasses the HBase region servers, instead
reading HBase data files directly from the filesystem.
This is useful for cases such as reading historical data or offloading of work
from the HBase cluster.
-There are scenarios when this may prove faster than accessing content through
the region servers using the <code>HBaseIO</code>.</p><p>A table snapshot can
be taken using the HBase shell or programmatically:<div class="language-java
snippet"><div class="notebook-skip code-snippet"><a class=copy type=button
data-bs-toggle=tooltip data-bs-placement=bottom title="Copy to clipboard"><img
src=/images/copy-icon.svg></a><div class=highlight><pre class=chroma><code
class=language-java data-lang= [...]
+There are scenarios when this may prove faster than accessing content through
the region servers using the <code>HBaseIO</code>.</p><p>A table snapshot can
be taken using the HBase shell or programmatically:</p><div
class="language-java snippet"><div class="notebook-skip code-snippet"><a
class=copy type=button data-bs-toggle=tooltip data-bs-placement=bottom
title="Copy to clipboard"><img src=/images/copy-icon.svg></a><div
class=highlight><pre class=chroma><code class=language-java data-l [...]
<span class=n>Connection</span> <span class=n>connection</span> <span
class=o>=</span> <span class=n>ConnectionFactory</span><span
class=o>.</span><span class=na>createConnection</span><span
class=o>(</span><span class=n>hbaseConf</span><span class=o>);</span>
<span class=n>Admin</span> <span class=n>admin</span> <span
class=o>=</span> <span class=n>connection</span><span class=o>.</span><span
class=na>getAdmin</span><span class=o>()</span>
<span class=o>)</span> <span class=o>{</span>
@@ -110,7 +110,7 @@ There are scenarios when this may prove faster than
accessing content through th
<span class=s>"my_snapshot"</span><span class=o>,</span>
<span class=n>TableName</span><span class=o>.</span><span
class=na>valueOf</span><span class=o>(</span><span
class=s>"my_table"</span><span class=o>),</span>
<span class=n>HBaseProtos</span><span class=o>.</span><span
class=na>SnapshotDescription</span><span class=o>.</span><span
class=na>Type</span><span class=o>.</span><span class=na>FLUSH</span><span
class=o>);</span>
-<span class=o>}</span></code></pre></div></div></div></p><div
class="language-py snippet"><div class="notebook-skip code-snippet"><a
class=copy type=button data-bs-toggle=tooltip data-bs-placement=bottom
title="Copy to clipboard"><img src=/images/copy-icon.svg></a><div
class=highlight><pre class=chroma><code class=language-py data-lang=py> <span
class=c1># The Beam SDK for Python does not support Hadoop Input/Output Format
IO.</span></code></pre></div></div></div><p>A <code>TableSnapsho [...]
+<span class=o>}</span></code></pre></div></div></div><div class="language-py
snippet"><div class="notebook-skip code-snippet"><a class=copy type=button
data-bs-toggle=tooltip data-bs-placement=bottom title="Copy to clipboard"><img
src=/images/copy-icon.svg></a><div class=highlight><pre class=chroma><code
class=language-py data-lang=py> <span class=c1># The Beam SDK for Python does
not support Hadoop Input/Output Format
IO.</span></code></pre></div></div></div><p>A <code>TableSnapshotInp [...]
</span><span class=c1></span><span class=n>Scan</span> <span
class=n>scan</span> <span class=o>=</span> <span class=k>new</span> <span
class=n>Scan</span><span class=o>();</span>
<span class=n>scan</span><span class=o>.</span><span
class=na>setCaching</span><span class=o>(</span><span class=n>1000</span><span
class=o>);</span>
<span class=n>scan</span><span class=o>.</span><span
class=na>setBatch</span><span class=o>(</span><span class=n>1000</span><span
class=o>);</span>
@@ -133,7 +133,7 @@ There are scenarios when this may prove faster than
accessing content through th
<span class=n>hbaseConf</span> <span class=o>=</span> <span
class=n>job</span><span class=o>.</span><span
class=na>getConfiguration</span><span class=o>();</span> <span
class=o>//</span> <span class=n>extract</span> <span class=n>the</span> <span
class=n>modified</span> <span
class=n>clone</span></code></pre></div></div></div><div class="language-py
snippet"><div class="notebook-skip code-snippet"><a class=copy type=button
data-bs-toggle=tooltip data-bs-placement=bottom title="Copy to cl [...]
<span class=n>p</span><span class=o>.</span><span class=na>apply</span><span
class=o>(</span><span class=s>"read"</span><span class=o>,</span>
<span class=n>HadoopFormatIO</span><span class=o>.<</span><span
class=n>ImmutableBytesWritable</span><span class=o>,</span> <span
class=n>Result</span><span class=o>></span><span class=n>read</span><span
class=o>()</span>
- <span class=o>.</span><span class=na>withConfiguration</span><span
class=o>(</span><span class=n>hbaseConf</span><span
class=o>);</span></code></pre></div></div></div><div class="language-py
snippet"><div class="notebook-skip code-snippet"><a class=copy type=button
data-bs-toggle=tooltip data-bs-placement=bottom title="Copy to clipboard"><img
src=/images/copy-icon.svg></a><div class=highlight><pre class=chroma><code
class=language-py data-lang=py> <span class=c1># The Beam SDK for Pyt [...]
+ <span class=o>.</span><span class=na>withConfiguration</span><span
class=o>(</span><span class=n>hbaseConf</span><span
class=o>);</span></code></pre></div></div></div><div class="language-py
snippet"><div class="notebook-skip code-snippet"><a class=copy type=button
data-bs-toggle=tooltip data-bs-placement=bottom title="Copy to clipboard"><img
src=/images/copy-icon.svg></a><div class=highlight><pre class=chroma><code
class=language-py data-lang=py> <span class=c1># The Beam SDK for Pyt [...]
<span class=c1>// Set Hadoop OutputFormat, key and value class in configuration
</span><span class=c1></span><span class=n>myHadoopConfiguration</span><span
class=o>.</span><span class=na>setClass</span><span class=o>(</span><span
class=s>"mapreduce.job.outputformat.class"</span><span class=o>,</span>
<span class=n>MyDbOutputFormatClass</span><span class=o>,</span> <span
class=n>OutputFormat</span><span class=o>.</span><span
class=na>class</span><span class=o>);</span>
@@ -143,7 +143,7 @@ There are scenarios when this may prove faster than
accessing content through th
<span class=n>MyDbOutputFormatValueClass</span><span class=o>,</span> <span
class=n>Object</span><span class=o>.</span><span class=na>class</span><span
class=o>);</span>
<span class=n>myHadoopConfiguration</span><span class=o>.</span><span
class=na>setClass</span><span class=o>(</span><span
class=s>"mapreduce.job.partitioner.class"</span><span class=o>,</span>
<span class=n>MyPartitionerClass</span><span class=o>,</span> <span
class=n>Object</span><span class=o>.</span><span class=na>class</span><span
class=o>);</span>
-<span class=n>myHadoopConfiguration</span><span class=o>.</span><span
class=na>setInt</span><span class=o>(</span><span
class=s>"mapreduce.job.reduces"</span><span class=o>,</span> <span
class=n>2</span><span class=o>);</span></code></pre></div></div></div></p><div
class="language-py snippet"><div class="notebook-skip code-snippet"><a
class=copy type=button data-bs-toggle=tooltip data-bs-placement=bottom
title="Copy to clipboard"><img src=/images/copy-icon.svg></a><div class=high
[...]
+<span class=n>myHadoopConfiguration</span><span class=o>.</span><span
class=na>setInt</span><span class=o>(</span><span
class=s>"mapreduce.job.reduces"</span><span class=o>,</span> <span
class=n>2</span><span class=o>);</span></code></pre></div></div></div><div
class="language-py snippet"><div class="notebook-skip code-snippet"><a
class=copy type=button data-bs-toggle=tooltip data-bs-placement=bottom
title="Copy to clipboard"><img src=/images/copy-icon.svg></a><div
class=highligh [...]
</span><span class=c1></span><span class=n>PCollection</span><span
class=o><</span><span class=n>KV</span><span class=o><</span><span
class=n>Text</span><span class=o>,</span> <span
class=n>LongWritable</span><span class=o>>></span> <span
class=n>boundedWordsCount</span> <span class=o>=</span> <span class=o>...</span>
<span class=c1>// Hadoop configuration for write
@@ -168,7 +168,7 @@ There are scenarios when this may prove faster than
accessing content through th
<span class=s>"writeStream"</span><span class=o>,</span>
<span class=n>HadoopFormatIO</span><span class=o>.<</span><span
class=n>Text</span><span class=o>,</span> <span
class=n>LongWritable</span><span class=o>></span><span
class=n>write</span><span class=o>()</span>
<span class=o>.</span><span
class=na>withConfigurationTransform</span><span class=o>(</span><span
class=n>configTransform</span><span class=o>)</span>
- <span class=o>.</span><span
class=na>withExternalSynchronization</span><span class=o>(</span><span
class=k>new</span> <span class=n>HDFSSynchronization</span><span
class=o>(</span><span class=n>locksDirPath</span><span
class=o>)));</span></code></pre></div></div></div><div class="language-py
snippet"><div class="notebook-skip code-snippet"><a class=copy type=button
data-bs-toggle=tooltip data-bs-placement=bottom title="Copy to clipboard"><img
src=/images/copy-icon.svg></a><div clas [...]
+ <span class=o>.</span><span
class=na>withExternalSynchronization</span><span class=o>(</span><span
class=k>new</span> <span class=n>HDFSSynchronization</span><span
class=o>(</span><span class=n>locksDirPath</span><span
class=o>)));</span></code></pre></div></div></div><div class="language-py
snippet"><div class="notebook-skip code-snippet"><a class=copy type=button
data-bs-toggle=tooltip data-bs-placement=bottom title="Copy to clipboard"><img
src=/images/copy-icon.svg></a><div clas [...]
<a href=http://www.apache.org>The Apache Software Foundation</a>
| <a href=/privacy_policy>Privacy Policy</a>
| <a href=/feed.xml>RSS Feed</a><br><br>Apache Beam, Apache, Beam, the Beam
logo, and the Apache feather logo are either registered trademarks or
trademarks of The Apache Software Foundation. All other products or name brands
are trademarks of their respective holders, including The Apache Software
Foundation.</div></div></div></div></footer></body></html>
\ No newline at end of file
diff --git a/website/generated-content/sitemap.xml
b/website/generated-content/sitemap.xml
index f5d773e..81a6308 100644
--- a/website/generated-content/sitemap.xml
+++ b/website/generated-content/sitemap.xml
@@ -1 +1 @@
-<?xml version="1.0" encoding="utf-8" standalone="yes"?><urlset
xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
xmlns:xhtml="http://www.w3.org/1999/xhtml"><url><loc>/blog/beam-2.31.0/</loc><lastmod>2021-06-22T18:45:24-07:00</lastmod></url><url><loc>/categories/blog/</loc><lastmod>2021-07-01T15:48:01-07:00</lastmod></url><url><loc>/blog/</loc><lastmod>2021-07-01T15:48:01-07:00</lastmod></url><url><loc>/categories/</loc><lastmod>2021-07-01T15:48:01-07:00</lastmod></url><url><loc>/blog/b
[...]
\ No newline at end of file
+<?xml version="1.0" encoding="utf-8" standalone="yes"?><urlset
xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
xmlns:xhtml="http://www.w3.org/1999/xhtml"><url><loc>/blog/beam-2.31.0/</loc><lastmod>2021-06-22T18:45:24-07:00</lastmod></url><url><loc>/categories/blog/</loc><lastmod>2021-07-01T15:48:01-07:00</lastmod></url><url><loc>/blog/</loc><lastmod>2021-07-01T15:48:01-07:00</lastmod></url><url><loc>/categories/</loc><lastmod>2021-07-01T15:48:01-07:00</lastmod></url><url><loc>/blog/b
[...]
\ No newline at end of file