This is an automated email from the ASF dual-hosted git repository.

git-site-role pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/beam.git


The following commit(s) were added to refs/heads/asf-site by this push:
     new 6dd0b19  Publishing website 2020/12/11 18:03:17 at commit 77683be
6dd0b19 is described below

commit 6dd0b195e69d4746a56aab117f09f02451828286
Author: jenkins <[email protected]>
AuthorDate: Fri Dec 11 18:03:17 2020 +0000

    Publishing website 2020/12/11 18:03:17 at commit 77683be
---
 website/generated-content/documentation/index.xml                   | 6 ++++--
 .../documentation/io/built-in/snowflake/index.html                  | 6 ++++--
 website/generated-content/sitemap.xml                               | 2 +-
 3 files changed, 9 insertions(+), 5 deletions(-)

diff --git a/website/generated-content/documentation/index.xml 
b/website/generated-content/documentation/index.xml
index 1945012..9431171 100644
--- a/website/generated-content/documentation/index.xml
+++ b/website/generated-content/documentation/index.xml
@@ -1974,7 +1974,7 @@ SnowflakeIO uses COPY statements behind the scenes to 
write (using &lt;a href="h
 &lt;p>&lt;code>.toTable()&lt;/code>&lt;/p>
 &lt;ul>
 &lt;li>Accepts the target Snowflake table name.&lt;/li>
-&lt;li>Example: &lt;code>.toTable(&amp;quot;MY_TABLE)&lt;/code>&lt;/li>
+&lt;li>Example: 
&lt;code>.toTable(&amp;quot;MY_TABLE&amp;quot;)&lt;/code>&lt;/li>
 &lt;/ul>
 &lt;/li>
 &lt;li>
@@ -2032,7 +2032,9 @@ AS COPY INTO stream_table from 
@streamstage;&lt;/code>&lt;/pre>
 &lt;/li>
 &lt;/ul>
 &lt;p>&lt;strong>Note&lt;/strong>:&lt;/p>
-&lt;p>SnowflakeIO uses COPY statements behind the scenes to write (using &lt;a 
href="https://docs.snowflake.net/manuals/sql-reference/sql/copy-into-table.html";>COPY
 to table&lt;/a>). StagingBucketName will be used to save CSV files which will 
end up in Snowflake. Those CSV files will be saved under the 
“stagingBucketName” path.&lt;/p>
+&lt;p>As mentioned before, SnowflakeIO uses &lt;a 
href="https://docs.snowflake.com/en/user-guide/data-load-snowpipe.html";>SnowPipe
 REST calls&lt;/a>
+behind the scenes when writing from unbounded sources. StagingBucketName will 
be used to save CSV files, which will end up in Snowflake.
+SnowflakeIO will not delete the created CSV files from the 
“stagingBucketName” path, either during or after streaming.&lt;/p>
 &lt;p>&lt;strong>Optional&lt;/strong> for streaming:&lt;/p>
 &lt;ul>
 &lt;li>
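The note added in this hunk says SnowflakeIO leaves its staged CSV files under the “stagingBucketName” path both during and after streaming, so any cleanup is the user's responsibility. A minimal, hypothetical cleanup sketch in plain Java, using a local directory as a stand-in for the staging bucket (the file layout, age threshold, and method names below are assumptions for illustration, not part of SnowflakeIO):

```java
import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.attribute.FileTime;
import java.time.Duration;
import java.time.Instant;

public class StagingCleanup {
    /** Deletes CSV files directly under stagingPath older than maxAge; returns how many were removed. */
    public static int deleteOldCsvFiles(Path stagingPath, Duration maxAge) throws IOException {
        Instant cutoff = Instant.now().minus(maxAge);
        int deleted = 0;
        try (DirectoryStream<Path> files = Files.newDirectoryStream(stagingPath, "*.csv")) {
            for (Path file : files) {
                Instant modified = Files.getLastModifiedTime(file).toInstant();
                if (modified.isBefore(cutoff)) {
                    Files.delete(file);
                    deleted++;
                }
            }
        }
        return deleted;
    }

    public static void main(String[] args) throws IOException {
        Path staging = Files.createTempDirectory("staging");
        Path stale = Files.createFile(staging.resolve("run1.csv"));
        // Backdate one file so it falls past the retention threshold.
        Files.setLastModifiedTime(stale, FileTime.from(Instant.now().minusSeconds(3600)));
        Files.createFile(staging.resolve("fresh.csv"));
        int removed = deleteOldCsvFiles(staging, Duration.ofMinutes(30));
        System.out.println("removed=" + removed); // prints removed=1
    }
}
```

For a real GCS or S3 staging bucket the same idea would go through the cloud provider's storage client rather than `java.nio.file`.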
diff --git 
a/website/generated-content/documentation/io/built-in/snowflake/index.html 
b/website/generated-content/documentation/io/built-in/snowflake/index.html
index 60fa404..7433519 100644
--- a/website/generated-content/documentation/io/built-in/snowflake/index.html
+++ b/website/generated-content/documentation/io/built-in/snowflake/index.html
@@ -93,7 +93,7 @@ SnowflakeIO uses COPY statements behind the scenes to write 
(using <a href=https
       <span class=o>.</span><span class=na>withFlushTimeLimit</span><span 
class=o>(</span><span class=n>Duration</span><span class=o>.</span><span 
class=na>millis</span><span class=o>(</span><span class=n>time</span><span 
class=o>))</span>
       <span class=o>.</span><span class=na>withFlushRowLimit</span><span 
class=o>(</span><span class=n>rowsNumber</span><span class=o>)</span>
       <span class=o>.</span><span class=na>withShardsNumber</span><span 
class=o>(</span><span class=n>shardsNumber</span><span class=o>)</span>
-<span class=o>)</span></code></pre></div></div></p><h4 
id=parameters>Parameters</h4><p><strong>Required</strong> for 
streaming:</p><ul><li><p><code>.withDataSourceConfiguration()</code></p><ul><li>Accepts
 a DatasourceConfiguration 
object.</li></ul></li><li><p><code>.toTable()</code></p><ul><li>Accepts the 
target Snowflake table name.</li><li>Example: 
<code>.toTable("MY_TABLE)</code></li></ul></li><li><p><code>.withStagingBucketName()</code></p><ul><li>Accepts
 a cloud bucket path ended wi [...]
+<span class=o>)</span></code></pre></div></div></p><h4 
id=parameters>Parameters</h4><p><strong>Required</strong> for 
streaming:</p><ul><li><p><code>.withDataSourceConfiguration()</code></p><ul><li>Accepts
 a DatasourceConfiguration 
object.</li></ul></li><li><p><code>.toTable()</code></p><ul><li>Accepts the 
target Snowflake table name.</li><li>Example: 
<code>.toTable("MY_TABLE")</code></li></ul></li><li><p><code>.withStagingBucketName()</code></p><ul><li>Accepts
 a cloud bucket path ended w [...]
 TYPE = EXTERNAL_STAGE
 STORAGE_PROVIDER = GCS
 ENABLED = TRUE
@@ -104,7 +104,9 @@ ENABLED = TRUE
 STORAGE_AWS_ROLE_ARN = &#39;&lt;ARN ROLE NAME&gt;&#39;
 STORAGE_ALLOWED_LOCATIONS = 
(&#39;s3://bucket/&#39;)</code></pre>Then:<pre><code>.withStorageIntegrationName(test_integration)</code></pre></li></ul></li><li><p><code>.withSnowPipe()</code></p><ul><li><p>Accepts
 the target SnowPipe name. <code>.withSnowPipe()</code> accepts the exact name 
of snowpipe.
 Example:<pre><code>CREATE OR REPLACE PIPE test_database.public.test_gcs_pipe
-AS COPY INTO stream_table from 
@streamstage;</code></pre></p></li><li><p>Then:<pre><code>.withSnowPipe(test_gcs_pipe)</code></pre></p></li></ul></li></ul><p><strong>Note</strong>:
 this is important to provide <strong>schema</strong> and 
<strong>database</strong> 
names.</p><ul><li><code>.withUserDataMapper()</code><ul><li>Accepts the <a 
href=https://beam.apache.org/documentation/io/built-in/snowflake/#userdatamapper-function>UserDataMapper</a>
 function that will map a user&rsquo;s PCollec [...]
+AS COPY INTO stream_table from 
@streamstage;</code></pre></p></li><li><p>Then:<pre><code>.withSnowPipe(test_gcs_pipe)</code></pre></p></li></ul></li></ul><p><strong>Note</strong>:
 it is important to provide <strong>schema</strong> and 
<strong>database</strong> 
names.</p><ul><li><code>.withUserDataMapper()</code><ul><li>Accepts the <a 
href=https://beam.apache.org/documentation/io/built-in/snowflake/#userdatamapper-function>UserDataMapper</a>
 function that will map a user&rsquo;s PCollec [...]
+behind the scenes when writing from unbounded sources. StagingBucketName will 
be used to save CSV files, which will end up in Snowflake.
+SnowflakeIO will not delete the created CSV files from the 
“stagingBucketName” path, either during or after 
streaming.</p><p><strong>Optional</strong> for 
streaming:</p><ul><li><p><code>.withFlushTimeLimit()</code></p><ul><li>Default 
value: 30 seconds</li><li>Accepts Duration objects with the specified time 
after each the streaming write will be repeated</li><li>Example: 
<code>.withFlushTimeLimit(Duration.millis(180000))</code></li></ul></li><li><p><code>.withFlushRowLi
 [...]
     <span class=k>return</span> <span class=o>(</span><span 
class=n>SnowflakeIO</span><span class=o>.</span><span 
class=na>UserDataMapper</span><span class=o>&lt;</span><span 
class=n>Long</span><span class=o>&gt;)</span> <span class=n>recordLine</span> 
<span class=o>-&gt;</span> <span class=k>new</span> <span 
class=n>String</span><span class=o>[]</span> <span class=o>{</span><span 
class=n>recordLine</span><span class=o>.</span><span 
class=na>toString</span><span class=o>()};</span>
 <span class=o>}</span></code></pre></div></div></p><h3 
id=additional-write-options>Additional write options</h3><h4 
id=transformation-query>Transformation query</h4><p>The 
<code>.withQueryTransformation()</code> option for the <code>write()</code> 
operation accepts a SQL query as a String value, which will be performed while 
transferring data staged in CSV files directly to the target Snowflake table. 
For information about the transformation SQL syntax, see the <a 
href=https://docs.snowfl [...]
 <span class=n>data</span><span class=o>.</span><span 
class=na>apply</span><span class=o>(</span>
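The highlighted snippet in this file's diff defines a `SnowflakeIO.UserDataMapper<Long>` lambda that turns each element into a one-column CSV row (`recordLine -> new String[] {recordLine.toString()}`). A self-contained sketch of the same idea, where the `UserDataMapper` interface below is a simplified stand-in for Beam's actual class (Beam's version returns `Object[]` and lives on `SnowflakeIO`):

```java
import java.util.Arrays;

public class MapperDemo {
    /** Simplified stand-in for SnowflakeIO.UserDataMapper: one element in, one CSV row (as columns) out. */
    @FunctionalInterface
    public interface UserDataMapper<T> {
        String[] mapRow(T element);
    }

    /** Mirrors the mapper in the diff: a single column holding the record's string form. */
    public static UserDataMapper<Long> getCsvMapper() {
        return recordLine -> new String[] {recordLine.toString()};
    }

    public static void main(String[] args) {
        UserDataMapper<Long> mapper = getCsvMapper();
        System.out.println(Arrays.toString(mapper.mapRow(42L))); // prints [42]
    }
}
```

In a real pipeline this function is what `.withUserDataMapper()` applies to every element of the PCollection before the rows are staged as CSV.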
diff --git a/website/generated-content/sitemap.xml 
b/website/generated-content/sitemap.xml
index 7278dbf..11cc0e9 100644
--- a/website/generated-content/sitemap.xml
+++ b/website/generated-content/sitemap.xml
@@ -1 +1 @@
-<?xml version="1.0" encoding="utf-8" standalone="yes"?><urlset 
xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"; 
xmlns:xhtml="http://www.w3.org/1999/xhtml";><url><loc>/blog/beam-2.25.0/</loc><lastmod>2020-10-29T14:08:19-07:00</lastmod></url><url><loc>/categories/blog/</loc><lastmod>2020-10-29T14:08:19-07:00</lastmod></url><url><loc>/blog/</loc><lastmod>2020-10-29T14:08:19-07:00</lastmod></url><url><loc>/categories/</loc><lastmod>2020-10-29T14:08:19-07:00</lastmod></url><url><loc>/blog/b
 [...]
\ No newline at end of file
+<?xml version="1.0" encoding="utf-8" standalone="yes"?><urlset 
xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"; 
xmlns:xhtml="http://www.w3.org/1999/xhtml";><url><loc>/blog/beam-2.25.0/</loc><lastmod>2020-10-29T14:08:19-07:00</lastmod></url><url><loc>/categories/blog/</loc><lastmod>2020-10-29T14:08:19-07:00</lastmod></url><url><loc>/blog/</loc><lastmod>2020-10-29T14:08:19-07:00</lastmod></url><url><loc>/categories/</loc><lastmod>2020-10-29T14:08:19-07:00</lastmod></url><url><loc>/blog/b
 [...]
\ No newline at end of file
