This is an automated email from the ASF dual-hosted git repository.

git-site-role pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/drill-site.git


The following commit(s) were added to refs/heads/asf-site by this push:
     new 1feb969  Automatic Site Publish by Buildbot
1feb969 is described below

commit 1feb9698eaab76ac02e5d17af5f84ca99905c420
Author: buildbot <[email protected]>
AuthorDate: Tue Sep 7 10:01:18 2021 +0000

    Automatic Site Publish by Buildbot
---
 output/data/index.html                             |   6 +
 .../index.html                                     | 186 ++-------------------
 output/docs/s3-storage-plugin/index.html           |   2 +-
 output/feed.xml                                    |   4 +-
 output/zh/data/index.html                          |   6 +
 .../index.html                                     | 186 ++-------------------
 output/zh/docs/s3-storage-plugin/index.html        |   2 +-
 output/zh/feed.xml                                 |   4 +-
 8 files changed, 44 insertions(+), 352 deletions(-)

diff --git a/output/data/index.html b/output/data/index.html
index 621a2e8..ce1ddf6 100644
--- a/output/data/index.html
+++ b/output/data/index.html
@@ -414,6 +414,12 @@
     "relative_path": 
"_docs/en/connect-a-data-source/plugins/110-s3-storage-plugin.md"
 },
 {
+    "url": "/docs/oci-os-storage-plugin/",
+    "title": "OCI OS Storage Plugin",
+    "parent": "Connect a Data Source",
+    "relative_path": 
"_docs/en/connect-a-data-source/plugins/111-OCI-OS-storage-plugin.md"
+},
+{
     "url": "/docs/opentsdb-storage-plugin/",
     "title": "OpenTSDB Storage Plugin",
     "parent": "Connect a Data Source",
diff --git a/output/docs/s3-storage-plugin/index.html b/output/docs/oci-os-storage-plugin/index.html
similarity index 83%
copy from output/docs/s3-storage-plugin/index.html
copy to output/docs/oci-os-storage-plugin/index.html
index 03ff01e..639293b 100644
--- a/output/docs/s3-storage-plugin/index.html
+++ b/output/docs/oci-os-storage-plugin/index.html
@@ -7,7 +7,7 @@
 <meta name=viewport content="width=device-width, initial-scale=1">
 
 
-<title>S3 Storage Plugin - Apache Drill</title>
+<title>OCI OS Storage Plugin - Apache Drill</title>
 
 <link 
href="https://maxcdn.bootstrapcdn.com/font-awesome/4.3.0/css/font-awesome.min.css";
 rel="stylesheet" type="text/css"/>
 <link href='https://fonts.googleapis.com/css?family=PT+Sans' rel='stylesheet' 
type='text/css'/>
@@ -44,11 +44,11 @@
        <ul>
                
                <li>
-                       <a style="font-weight: bold;" 
href="/docs/s3-storage-plugin/" >en</a>
+                       <a style="font-weight: bold;" 
href="/docs/oci-os-storage-plugin/" >en</a>
                </li>
                
                <li>
-                       <a  href="/zh/docs/s3-storage-plugin/" >zh</a>
+                       <a  href="/zh/docs/oci-os-storage-plugin/" >zh</a>
                </li>
                
        </ul>
@@ -450,8 +450,8 @@
         
       
         
-          <li class="toctree-l1 current_section "><a href="javascript: 
void(0);">Connect a Data Source</a></li>
-          <ul class="current_section">
+          <li class="toctree-l1"><a href="javascript: void(0);">Connect a Data 
Source</a></li>
+          <ul style="display: none">
           
             
               <li class="toctree-l2"><a class="reference internal" 
href="/docs/connect-a-data-source-introduction/">Connect a Data Source 
Introduction</a></li>
@@ -501,7 +501,7 @@
             
           
             
-              <li class="toctree-l2 current"><a class="reference internal" 
href="/docs/s3-storage-plugin/">S3 Storage Plugin</a></li>
+              <li class="toctree-l2"><a class="reference internal" 
href="/docs/s3-storage-plugin/">S3 Storage Plugin</a></li>
             
           
             
@@ -1394,9 +1394,7 @@
   <li><a href="/docs/">Docs</a></li>
  
   
-    <li><a href="/docs/connect-a-data-source/">Connect a Data Source</a></li>
-  
-  <li>S3 Storage Plugin</li>
+  <li>OCI OS Storage Plugin</li>
 </nav>
 
 
@@ -1404,11 +1402,11 @@
   <div class="main-content">
 
     
-      <a class="edit-link" 
href="https://github.com/apache/drill/blob/gh-pages/_docs/en/connect-a-data-source/plugins/110-s3-storage-plugin.md";
 target="_blank"><i class="fa fa-pencil-square-o"></i></a>
+      <a class="edit-link" 
href="https://github.com/apache/drill/blob/gh-pages/_docs/en/connect-a-data-source/plugins/111-OCI-OS-storage-plugin.md";
 target="_blank"><i class="fa fa-pencil-square-o"></i></a>
     
 
     <div class="int_title left">
-      <h1>S3 Storage Plugin</h1>
+      <h1>OCI OS Storage Plugin</h1>
 
     </div>
 
@@ -1418,172 +1416,14 @@
 
     <div class="int_text" align="left">
       
-        <p>Drill works with data stored in the cloud. With a few simple steps, 
you can configure the S3 storage plugin for Drill and be off to the races 
running queries.</p>
-
-<p>Drill has the ability to query files stored on Amazon’s S3 cloud storage 
using the HDFS s3a library. The HDFS s3a library adds support for files larger 
than 5 gigabytes (these were unsupported using the older HDFS s3n library).</p>
-
-<p>To connect Drill to S3:</p>
-
-<ul>
-  <li>Provide your AWS credentials.</li>
-  <li>Configure the S3 storage plugin with an S3 bucket name.</li>
-</ul>
-
-<p>For additional information, refer to the <a 
href="https://hadoop.apache.org/docs/stable/hadoop-aws/tools/hadoop-aws/index.html";>HDFS
 S3 documentation</a>.</p>
-
-<p><strong>Note:</strong> Drill does not use HDFS 3.x, therefore Drill does 
not support AWS temporary credentials, as described in the s3a 
documentation.</p>
-
-<h2 id="providing-aws-credentials">Providing AWS Credentials</h2>
-
-<p>Your environment determines where you provide your AWS credentials. You 
define your AWS credentials:</p>
-
-<ul>
-  <li>In the S3 storage plugin configuration:
-    <ul>
-      <li><a 
href="/docs/s3-storage-plugin/#using-an-external-provider-for-credentials">You 
can point to an encrypted file in an external provider.</a> (Drill 1.15 and 
later)</li>
-      <li><a 
href="/docs/s3-storage-plugin/#adding-credentials-directly-to-the-s3-plugin">You
 can put your access and secret keys directly in the storage plugin 
configuration.</a> Note that this method is the least secure, but sufficient 
for use on a single machine, such as a laptop.</li>
-    </ul>
-  </li>
-  <li>In a non-Hadoop environment, you can use the Drill-specific 
core-site.xml file to provide the AWS credentials.</li>
-</ul>
-
-<h3 id="defining-access-keys-in-the-s3-storage-plugin">Defining Access Keys in 
the S3 Storage Plugin</h3>
-
-<p>Refer to <a 
href="/docs/s3-storage-plugin/#configuring-the-s3-storage-plugin">Configuring 
the S3 Storage Plugin</a>.</p>
-
-<h3 id="defining-access-keys-in-the-drill-core-sitexml-file">Defining Access 
Keys in the Drill core-site.xml File</h3>
-
-<p>To configure the access keys in Drill’s core-site.xml file, navigate to the 
<code class="language-plaintext highlighter-rouge">$DRILL_HOME/conf</code> or 
<code class="language-plaintext highlighter-rouge">$DRILL_SITE</code> 
directory, and rename the <code class="language-plaintext 
highlighter-rouge">core-site-example.xml</code> file to <code 
class="language-plaintext highlighter-rouge">core-site.xml</code>. Replace the 
text <code class="language-plaintext highlighter-rouge">ENTER_YOUR [...]
-
-<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre 
class="highlight"><code>   &lt;configuration&gt;
-       &lt;property&gt;
-           &lt;name&gt;fs.s3a.access.key&lt;/name&gt;
-           &lt;value&gt;ACCESS-KEY&lt;/value&gt;
-       &lt;/property&gt;
-       &lt;property&gt;
-           &lt;name&gt;fs.s3a.secret.key&lt;/name&gt;
-           &lt;value&gt;SECRET-KEY&lt;/value&gt;
-       &lt;/property&gt;
-       &lt;property&gt;
-           &lt;name&gt;fs.s3a.endpoint&lt;/name&gt;
-           &lt;value&gt;s3.REGION.amazonaws.com&lt;/value&gt;
-       &lt;/property&gt;
-   &lt;/configuration&gt;  
-</code></pre></div></div>
-
-<h3 id="configuring-drill-to-use-aws-iam-roles-for-accessing-s3">Configuring 
Drill to use AWS IAM Roles for Accessing S3</h3>
-
-<p>If you use IAM roles/instance profiles, to access data in s3, use the 
following settings in your core-site.xml. Do not specify the secret key or 
access key properties. For example:</p>
-
-<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre 
class="highlight"><code>   &lt;configuration&gt;
-       &lt;property&gt;
-           &lt;name&gt;fs.s3a.aws.credentials.provider&lt;/name&gt;
-           
&lt;value&gt;com.amazonaws.auth.InstanceProfileCredentialsProvider&lt;/value&gt;
-       &lt;/property&gt;
-   &lt;/configuration&gt;    
-</code></pre></div></div>
-
-<h2 id="configuring-the-s3-storage-plugin">Configuring the S3 Storage 
Plugin</h2>
-
-<p>The <strong>Storage</strong> page in the Drill Web UI provides an S3 
storage plugin that you configure to connect Drill to the S3 distributed file 
system registered in <code class="language-plaintext 
highlighter-rouge">core-site.xml</code>. If you did not define your AWS 
credentials in the <code class="language-plaintext 
highlighter-rouge">core-site.xml</code> file, you can define them in the 
storage plugin configuration. You can define the credentials directly in the S3 
storage plugi [...]
-
-<p>To configure the S3 storage plugin, log in to the Drill Web UI at <code 
class="language-plaintext 
highlighter-rouge">http://&lt;drill-hostname&gt;:8047</code>. The <code 
class="language-plaintext highlighter-rouge">drill-hostname</code> is a node on 
which Drill is running. Go to the <strong>Storage</strong> page and click 
<strong>Update</strong> next to the S3 storage plugin option.</p>
-
-<p><strong>Note:</strong> The <code class="language-plaintext 
highlighter-rouge">"config"</code> block in the S3 storage plugin configuration 
contains properties to define your AWS credentials. Do not include the <code 
class="language-plaintext highlighter-rouge">"config"</code> block in your S3 
storage plugin configuration if you defined your AWS credentials in the <code 
class="language-plaintext highlighter-rouge">core-site.xml</code> file.</p>
-
-<p>Configure the S3 storage plugin configuration to use an external provider 
for credentials or directly add the credentials in the configuration itself, as 
described in the following sections. Click <strong>Update</strong> to save the 
configuration when done.</p>
-
-<h3 id="using-an-external-provider-for-credentials">Using an External Provider 
for Credentials</h3>
-<p>Starting in Drill 1.15, the S3 storage plugin supports the <a 
href="https://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/CredentialProviderAPI.html";>Hadoop
 Credential Provider API</a>, which allows you to store secret keys and other 
sensitive data in an encrypted file in an external provider versus storing them 
in plain text in a configuration file or directly in the storage plugin 
configuration.</p>
-
-<p>When you configure the S3 storage plugin to use an external provider, Drill 
first checks the external provider for the keys. If the keys are not available 
via the provider, or the provider is not configured, Drill can fall back to 
using the plain text data in the <code class="language-plaintext 
highlighter-rouge">core-site.xml</code> file or S3 storage plugin 
configuration.</p>
-
-<p>For fallback to work, you must include the <code class="language-plaintext 
highlighter-rouge">hadoop.security.credential.clear-text-fallback</code> 
property in the S3 storage plugin configuration, with the property set to 
‘true’.</p>
-
-<p>For subsequent connections, if you want Drill to connect using different 
credentials, you can include the <code class="language-plaintext 
highlighter-rouge">fs.s3a.impl.disable.cache</code> property in the  
configuration. See <a 
href="/docs/s3-storage-plugin/#reconnecting-to-an-s3-bucket-using-different-credentials">Reconnecting
 to an S3 Bucket Using Different Credentials</a> for more information.</p>
-
-<p><strong>Configuring the S3 Plugin to use an External Provider</strong><br />
-Add the bucket name and the <code class="language-plaintext 
highlighter-rouge">hadoop.security.credential.provider.path</code> property to 
the S3 storage plugin configuration. The <code class="language-plaintext 
highlighter-rouge">hadoop.security.credential.provider.path</code> property 
should point to a file that contains your encrypted passwords. Optionally, 
include the <code class="language-plaintext 
highlighter-rouge">hadoop.security.credential.clear-text-fallback</code> 
property for [...]
-
-<p>The following example shows an S3 storage plugin configuration with the S3 
bucket, <code class="language-plaintext 
highlighter-rouge">hadoop.security.credential.provider.path</code>, and <code 
class="language-plaintext highlighter-rouge">fs.s3a.impl.disable.cache 
properties</code> set:</p>
-
-<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre 
class="highlight"><code>{
-       "type":
-"file",
-  "connection": "s3a://bucket-name/",
-  "config": {
-    "hadoop.security.credential.provider.path":"jceks://file/tmp/s3.jceks",
-    "fs.s3a.impl.disable.cache":"true",
-    ...
-    },
-  "workspaces": {
-    ...
-  }  
-</code></pre></div></div>
-
-<h3 id="adding-credentials-directly-to-the-s3-plugin">Adding Credentials 
Directly to the S3 Plugin</h3>
-<p>You can add your AWS credentials directly to the S3 configuration, though 
this method is the least secure, but sufficient for use on a single machine, 
such as a laptop. Include the S3 bucket name, the AWS access keys, and the S3 
endpoint in the configuration.</p>
-
-<p>Optionally, for subsequent connections, if you want Drill to connect using 
different credentials, you can include the <code class="language-plaintext 
highlighter-rouge">fs.s3a.impl.disable.cache</code> property in the  
configuration. See <a 
href="/docs/s3-storage-plugin/#reconnecting-to-an-s3-bucket-using-different-credentials">Reconnecting
 to an S3 Bucket Using Different Credentials</a> for more information.</p>
-
-<p>The following example shows an S3 storage plugin configuration with the S3 
bucket, access key properties, and <code class="language-plaintext 
highlighter-rouge">fs.s3a.impl.disable.cache</code> property:</p>
-
-<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre 
class="highlight"><code>{
-"type": "file",
-"enabled": true,
-"connection": "s3a://bucket-name/",
-"config": {
-       "fs.s3a.access.key": "&lt;key&gt;",
-       "fs.s3a.secret.key": "&lt;key&gt;",
-       "fs.s3a.endpoint": "s3.us-west-1.amazonaws.com",
-    "fs.s3a.impl.disable.cache":"true"
-},
-"workspaces": {...
-       },  
-</code></pre></div></div>
-
-<h3 id="reconnecting-to-an-s3-bucket-using-different-credentials">Reconnecting 
to an S3 Bucket Using Different Credentials</h3>
-<p>Whether you store credentials in the S3 storage plugin configuration 
directly or in an external provider, you can reconnect to an existing S3 bucket 
using different credentials when you include the <code 
class="language-plaintext highlighter-rouge">fs.s3a.impl.disable.cache</code> 
property in the S3 storage plugin configuration. The <code 
class="language-plaintext highlighter-rouge">fs.s3a.impl.disable.cache</code> 
property disables the S3 file system cache when set to ‘true’. If <cod [...]
-
-<p>The following example S3 storage plugin configuration includes the 
fs.s3a.impl.disable.cache property:</p>
-
-<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre 
class="highlight"><code>{
- "type":
-"file",
-  "connection": "s3a://bucket-name/",
-  "config": {
-    "hadoop.security.credential.provider.path":"jceks://file/tmp/s3.jceks",
-    "fs.s3a.impl.disable.cache":"true",
-    ...
-    },
-  "workspaces": {
-    ...
-  }
-</code></pre></div></div>
-
-<h2 id="quering-parquet-format-files-on-s3">Quering Parquet Format Files On 
S3</h2>
-
-<p>Drill uses the Hadoop distributed file system (HDFS) for reading S3 input 
files, which ultimately uses the Apache HttpClient. The HttpClient has a 
default limit of four simultaneous requests, and it puts the subsequent S3 
requests in the queue. A Drill query with large number of columns or a Select * 
query, on Parquet formatted files ends up issuing many S3 requests and can fail 
with ConnectionPoolTimeoutException.</p>
-
-<p>Fortunately, as a part of S3a implementation in Hadoop 2.7.1, HttpClient’s 
required limit parameter is extracted out in a config and can be raised to 
avoid ConnectionPoolTimeoutException. This is how you can set this parameter in 
core-site.xml:</p>
-
-<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre 
class="highlight"><code>   &lt;configuration&gt;
-     ...
-     
-     &lt;property&gt;
-       &lt;name&gt;fs.s3a.connection.maximum&lt;/name&gt;
-       &lt;value&gt;100&lt;/value&gt;
-     &lt;/property&gt;
-   
-   &lt;/configuration&gt;
-</code></pre></div></div>
-
-
+        <ul>
+        
+      </ul>
     
       
         <div class="doc-nav">
   
-  <span class="previous-toc"><a href="/docs/mapr-db-format/">← MapR-DB 
Format</a></span><span class="next-toc"><a 
href="/docs/opentsdb-storage-plugin/">OpenTSDB Storage Plugin →</a></span>
+  <span class="previous-toc"><a href="">← </a></span><span class="next-toc"><a 
href=""> →</a></span>
 </div>
 
     
diff --git a/output/docs/s3-storage-plugin/index.html b/output/docs/s3-storage-plugin/index.html
index 03ff01e..3278d6a 100644
--- a/output/docs/s3-storage-plugin/index.html
+++ b/output/docs/s3-storage-plugin/index.html
@@ -1431,7 +1431,7 @@
 
 <p>For additional information, refer to the <a 
href="https://hadoop.apache.org/docs/stable/hadoop-aws/tools/hadoop-aws/index.html";>HDFS
 S3 documentation</a>.</p>
 
-<p><strong>Note:</strong> Drill does not use HDFS 3.x, therefore Drill does 
not support AWS temporary credentials, as described in the s3a 
documentation.</p>
+<p><strong>Note:</strong> Drill started using HDFS 3.0, but support of AWS 
temporary credentials (as described in the s3a documentation) wasn’t verified 
yet.</p>
 
 <h2 id="providing-aws-credentials">Providing AWS Credentials</h2>
 
diff --git a/output/feed.xml b/output/feed.xml
index bbef5f0..76f0418 100644
--- a/output/feed.xml
+++ b/output/feed.xml
@@ -6,8 +6,8 @@
 </description>
     <link>/</link>
     <atom:link href="/feed.xml" rel="self" type="application/rss+xml"/>
-    <pubDate>Tue, 07 Sep 2021 05:33:34 +0000</pubDate>
-    <lastBuildDate>Tue, 07 Sep 2021 05:33:34 +0000</lastBuildDate>
+    <pubDate>Tue, 07 Sep 2021 09:58:24 +0000</pubDate>
+    <lastBuildDate>Tue, 07 Sep 2021 09:58:24 +0000</lastBuildDate>
     <generator>Jekyll v3.9.1</generator>
     
       <item>
diff --git a/output/zh/data/index.html b/output/zh/data/index.html
index 4d8af94..36ef070 100644
--- a/output/zh/data/index.html
+++ b/output/zh/data/index.html
@@ -414,6 +414,12 @@
     "relative_path": 
"_docs/en/connect-a-data-source/plugins/110-s3-storage-plugin.md"
 },
 {
+    "url": "/docs/oci-os-storage-plugin/",
+    "title": "OCI OS Storage Plugin",
+    "parent": "Connect a Data Source",
+    "relative_path": 
"_docs/en/connect-a-data-source/plugins/111-OCI-OS-storage-plugin.md"
+},
+{
     "url": "/docs/opentsdb-storage-plugin/",
     "title": "OpenTSDB Storage Plugin",
     "parent": "Connect a Data Source",
diff --git a/output/zh/docs/s3-storage-plugin/index.html b/output/zh/docs/oci-os-storage-plugin/index.html
similarity index 83%
copy from output/zh/docs/s3-storage-plugin/index.html
copy to output/zh/docs/oci-os-storage-plugin/index.html
index 917ae3b..1cd4797 100644
--- a/output/zh/docs/s3-storage-plugin/index.html
+++ b/output/zh/docs/oci-os-storage-plugin/index.html
@@ -7,7 +7,7 @@
 <meta name=viewport content="width=device-width, initial-scale=1">
 
 
-<title>S3 Storage Plugin - Apache Drill</title>
+<title>OCI OS Storage Plugin - Apache Drill</title>
 
 <link 
href="https://maxcdn.bootstrapcdn.com/font-awesome/4.3.0/css/font-awesome.min.css";
 rel="stylesheet" type="text/css"/>
 <link href='https://fonts.googleapis.com/css?family=PT+Sans' rel='stylesheet' 
type='text/css'/>
@@ -44,11 +44,11 @@
        <ul>
                
                <li>
-                       <a  href="/docs/s3-storage-plugin/" >en</a>
+                       <a  href="/docs/oci-os-storage-plugin/" >en</a>
                </li>
                
                <li>
-                       <a style="font-weight: bold;" 
href="/zh/docs/s3-storage-plugin/" >zh</a>
+                       <a style="font-weight: bold;" 
href="/zh/docs/oci-os-storage-plugin/" >zh</a>
                </li>
                
        </ul>
@@ -450,8 +450,8 @@
         
       
         
-          <li class="toctree-l1 current_section "><a href="javascript: 
void(0);">Connect a Data Source</a></li>
-          <ul class="current_section">
+          <li class="toctree-l1"><a href="javascript: void(0);">Connect a Data 
Source</a></li>
+          <ul style="display: none">
           
             
               <li class="toctree-l2"><a class="reference internal" 
href="/zh/docs/connect-a-data-source-introduction/">Connect a Data Source 
Introduction</a></li>
@@ -501,7 +501,7 @@
             
           
             
-              <li class="toctree-l2 current"><a class="reference internal" 
href="/zh/docs/s3-storage-plugin/">S3 Storage Plugin</a></li>
+              <li class="toctree-l2"><a class="reference internal" 
href="/zh/docs/s3-storage-plugin/">S3 Storage Plugin</a></li>
             
           
             
@@ -1394,9 +1394,7 @@
   <li><a href="/zh/docs/">Docs</a></li>
  
   
-    <li><a href="/zh/docs/connect-a-data-source/">Connect a Data 
Source</a></li>
-  
-  <li>S3 Storage Plugin</li>
+  <li>OCI OS Storage Plugin</li>
 </nav>
 
 
@@ -1404,11 +1402,11 @@
   <div class="main-content">
 
     
-      <a class="edit-link" 
href="https://github.com/apache/drill/blob/gh-pages/_docs/en/connect-a-data-source/plugins/110-s3-storage-plugin.md";
 target="_blank"><i class="fa fa-pencil-square-o"></i></a>
+      <a class="edit-link" 
href="https://github.com/apache/drill/blob/gh-pages/_docs/en/connect-a-data-source/plugins/111-OCI-OS-storage-plugin.md";
 target="_blank"><i class="fa fa-pencil-square-o"></i></a>
     
 
     <div class="int_title left">
-      <h1>S3 Storage Plugin</h1>
+      <h1>OCI OS Storage Plugin</h1>
 
     </div>
 
@@ -1418,172 +1416,14 @@
 
     <div class="int_text" align="left">
       
-        <p>Drill works with data stored in the cloud. With a few simple steps, 
you can configure the S3 storage plugin for Drill and be off to the races 
running queries.</p>
-
-<p>Drill has the ability to query files stored on Amazon’s S3 cloud storage 
using the HDFS s3a library. The HDFS s3a library adds support for files larger 
than 5 gigabytes (these were unsupported using the older HDFS s3n library).</p>
-
-<p>To connect Drill to S3:</p>
-
-<ul>
-  <li>Provide your AWS credentials.</li>
-  <li>Configure the S3 storage plugin with an S3 bucket name.</li>
-</ul>
-
-<p>For additional information, refer to the <a 
href="https://hadoop.apache.org/docs/stable/hadoop-aws/tools/hadoop-aws/index.html";>HDFS
 S3 documentation</a>.</p>
-
-<p><strong>Note:</strong> Drill does not use HDFS 3.x, therefore Drill does 
not support AWS temporary credentials, as described in the s3a 
documentation.</p>
-
-<h2 id="providing-aws-credentials">Providing AWS Credentials</h2>
-
-<p>Your environment determines where you provide your AWS credentials. You 
define your AWS credentials:</p>
-
-<ul>
-  <li>In the S3 storage plugin configuration:
-    <ul>
-      <li><a 
href="/zh/docs/s3-storage-plugin/#using-an-external-provider-for-credentials">You
 can point to an encrypted file in an external provider.</a> (Drill 1.15 and 
later)</li>
-      <li><a 
href="/zh/docs/s3-storage-plugin/#adding-credentials-directly-to-the-s3-plugin">You
 can put your access and secret keys directly in the storage plugin 
configuration.</a> Note that this method is the least secure, but sufficient 
for use on a single machine, such as a laptop.</li>
-    </ul>
-  </li>
-  <li>In a non-Hadoop environment, you can use the Drill-specific 
core-site.xml file to provide the AWS credentials.</li>
-</ul>
-
-<h3 id="defining-access-keys-in-the-s3-storage-plugin">Defining Access Keys in 
the S3 Storage Plugin</h3>
-
-<p>Refer to <a 
href="/zh/docs/s3-storage-plugin/#configuring-the-s3-storage-plugin">Configuring
 the S3 Storage Plugin</a>.</p>
-
-<h3 id="defining-access-keys-in-the-drill-core-sitexml-file">Defining Access 
Keys in the Drill core-site.xml File</h3>
-
-<p>To configure the access keys in Drill’s core-site.xml file, navigate to the 
<code class="language-plaintext highlighter-rouge">$DRILL_HOME/conf</code> or 
<code class="language-plaintext highlighter-rouge">$DRILL_SITE</code> 
directory, and rename the <code class="language-plaintext 
highlighter-rouge">core-site-example.xml</code> file to <code 
class="language-plaintext highlighter-rouge">core-site.xml</code>. Replace the 
text <code class="language-plaintext highlighter-rouge">ENTER_YOUR [...]
-
-<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre 
class="highlight"><code>   &lt;configuration&gt;
-       &lt;property&gt;
-           &lt;name&gt;fs.s3a.access.key&lt;/name&gt;
-           &lt;value&gt;ACCESS-KEY&lt;/value&gt;
-       &lt;/property&gt;
-       &lt;property&gt;
-           &lt;name&gt;fs.s3a.secret.key&lt;/name&gt;
-           &lt;value&gt;SECRET-KEY&lt;/value&gt;
-       &lt;/property&gt;
-       &lt;property&gt;
-           &lt;name&gt;fs.s3a.endpoint&lt;/name&gt;
-           &lt;value&gt;s3.REGION.amazonaws.com&lt;/value&gt;
-       &lt;/property&gt;
-   &lt;/configuration&gt;  
-</code></pre></div></div>
-
-<h3 id="configuring-drill-to-use-aws-iam-roles-for-accessing-s3">Configuring 
Drill to use AWS IAM Roles for Accessing S3</h3>
-
-<p>If you use IAM roles/instance profiles, to access data in s3, use the 
following settings in your core-site.xml. Do not specify the secret key or 
access key properties. For example:</p>
-
-<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre 
class="highlight"><code>   &lt;configuration&gt;
-       &lt;property&gt;
-           &lt;name&gt;fs.s3a.aws.credentials.provider&lt;/name&gt;
-           
&lt;value&gt;com.amazonaws.auth.InstanceProfileCredentialsProvider&lt;/value&gt;
-       &lt;/property&gt;
-   &lt;/configuration&gt;    
-</code></pre></div></div>
-
-<h2 id="configuring-the-s3-storage-plugin">Configuring the S3 Storage 
Plugin</h2>
-
-<p>The <strong>Storage</strong> page in the Drill Web UI provides an S3 
storage plugin that you configure to connect Drill to the S3 distributed file 
system registered in <code class="language-plaintext 
highlighter-rouge">core-site.xml</code>. If you did not define your AWS 
credentials in the <code class="language-plaintext 
highlighter-rouge">core-site.xml</code> file, you can define them in the 
storage plugin configuration. You can define the credentials directly in the S3 
storage plugi [...]
-
-<p>To configure the S3 storage plugin, log in to the Drill Web UI at <code 
class="language-plaintext 
highlighter-rouge">http://&lt;drill-hostname&gt;:8047</code>. The <code 
class="language-plaintext highlighter-rouge">drill-hostname</code> is a node on 
which Drill is running. Go to the <strong>Storage</strong> page and click 
<strong>Update</strong> next to the S3 storage plugin option.</p>
-
-<p><strong>Note:</strong> The <code class="language-plaintext 
highlighter-rouge">"config"</code> block in the S3 storage plugin configuration 
contains properties to define your AWS credentials. Do not include the <code 
class="language-plaintext highlighter-rouge">"config"</code> block in your S3 
storage plugin configuration if you defined your AWS credentials in the <code 
class="language-plaintext highlighter-rouge">core-site.xml</code> file.</p>
-
-<p>Configure the S3 storage plugin configuration to use an external provider 
for credentials or directly add the credentials in the configuration itself, as 
described in the following sections. Click <strong>Update</strong> to save the 
configuration when done.</p>
-
-<h3 id="using-an-external-provider-for-credentials">Using an External Provider 
for Credentials</h3>
-<p>Starting in Drill 1.15, the S3 storage plugin supports the <a 
href="https://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/CredentialProviderAPI.html";>Hadoop
 Credential Provider API</a>, which allows you to store secret keys and other 
sensitive data in an encrypted file in an external provider versus storing them 
in plain text in a configuration file or directly in the storage plugin 
configuration.</p>
-
-<p>When you configure the S3 storage plugin to use an external provider, Drill 
first checks the external provider for the keys. If the keys are not available 
via the provider, or the provider is not configured, Drill can fall back to 
using the plain text data in the <code class="language-plaintext 
highlighter-rouge">core-site.xml</code> file or S3 storage plugin 
configuration.</p>
-
-<p>For fallback to work, you must include the <code class="language-plaintext 
highlighter-rouge">hadoop.security.credential.clear-text-fallback</code> 
property in the S3 storage plugin configuration, with the property set to 
‘true’.</p>
-
-<p>For subsequent connections, if you want Drill to connect using different 
credentials, you can include the <code class="language-plaintext 
highlighter-rouge">fs.s3a.impl.disable.cache</code> property in the  
configuration. See <a 
href="/zh/docs/s3-storage-plugin/#reconnecting-to-an-s3-bucket-using-different-credentials">Reconnecting
 to an S3 Bucket Using Different Credentials</a> for more information.</p>
-
-<p><strong>Configuring the S3 Plugin to use an External Provider</strong><br />
-Add the bucket name and the <code class="language-plaintext 
highlighter-rouge">hadoop.security.credential.provider.path</code> property to 
the S3 storage plugin configuration. The <code class="language-plaintext 
highlighter-rouge">hadoop.security.credential.provider.path</code> property 
should point to a file that contains your encrypted passwords. Optionally, 
include the <code class="language-plaintext 
highlighter-rouge">hadoop.security.credential.clear-text-fallback</code> 
property for [...]
-
-<p>The following example shows an S3 storage plugin configuration with the S3 
bucket, <code class="language-plaintext 
highlighter-rouge">hadoop.security.credential.provider.path</code>, and <code 
class="language-plaintext highlighter-rouge">fs.s3a.impl.disable.cache 
properties</code> set:</p>
-
-<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre 
class="highlight"><code>{
-       "type":
-"file",
-  "connection": "s3a://bucket-name/",
-  "config": {
-    "hadoop.security.credential.provider.path":"jceks://file/tmp/s3.jceks",
-    "fs.s3a.impl.disable.cache":"true",
-    ...
-    },
-  "workspaces": {
-    ...
-  }  
-</code></pre></div></div>
-
-<h3 id="adding-credentials-directly-to-the-s3-plugin">Adding Credentials 
Directly to the S3 Plugin</h3>
-<p>You can add your AWS credentials directly to the S3 configuration, though 
this method is the least secure, but sufficient for use on a single machine, 
such as a laptop. Include the S3 bucket name, the AWS access keys, and the S3 
endpoint in the configuration.</p>
-
-<p>Optionally, for subsequent connections, if you want Drill to connect using 
different credentials, you can include the <code class="language-plaintext 
highlighter-rouge">fs.s3a.impl.disable.cache</code> property in the  
configuration. See <a 
href="/zh/docs/s3-storage-plugin/#reconnecting-to-an-s3-bucket-using-different-credentials">Reconnecting
 to an S3 Bucket Using Different Credentials</a> for more information.</p>
-
-<p>The following example shows an S3 storage plugin configuration with the S3 
bucket, access key properties, and <code class="language-plaintext 
highlighter-rouge">fs.s3a.impl.disable.cache</code> property:</p>
-
-<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre 
class="highlight"><code>{
-"type": "file",
-"enabled": true,
-"connection": "s3a://bucket-name/",
-"config": {
-       "fs.s3a.access.key": "&lt;key&gt;",
-       "fs.s3a.secret.key": "&lt;key&gt;",
-       "fs.s3a.endpoint": "s3.us-west-1.amazonaws.com",
-    "fs.s3a.impl.disable.cache":"true"
-},
-"workspaces": {...
-       },  
-</code></pre></div></div>
-
-<h3 id="reconnecting-to-an-s3-bucket-using-different-credentials">Reconnecting 
to an S3 Bucket Using Different Credentials</h3>
-<p>Whether you store credentials in the S3 storage plugin configuration 
directly or in an external provider, you can reconnect to an existing S3 bucket 
using different credentials when you include the <code 
class="language-plaintext highlighter-rouge">fs.s3a.impl.disable.cache</code> 
property in the S3 storage plugin configuration. The <code 
class="language-plaintext highlighter-rouge">fs.s3a.impl.disable.cache</code> 
property disables the S3 file system cache when set to ‘true’. If <cod [...]
-
-<p>The following example S3 storage plugin configuration includes the 
fs.s3a.impl.disable.cache property:</p>
-
-<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre 
class="highlight"><code>{
- "type":
-"file",
-  "connection": "s3a://bucket-name/",
-  "config": {
-    "hadoop.security.credential.provider.path":"jceks://file/tmp/s3.jceks",
-    "fs.s3a.impl.disable.cache":"true",
-    ...
-    },
-  "workspaces": {
-    ...
-  }
-</code></pre></div></div>
-
-<h2 id="quering-parquet-format-files-on-s3">Quering Parquet Format Files On 
S3</h2>
-
-<p>Drill uses the Hadoop distributed file system (HDFS) for reading S3 input 
files, which ultimately uses the Apache HttpClient. The HttpClient has a 
default limit of four simultaneous requests, and it puts the subsequent S3 
requests in the queue. A Drill query with large number of columns or a Select * 
query, on Parquet formatted files ends up issuing many S3 requests and can fail 
with ConnectionPoolTimeoutException.</p>
-
-<p>Fortunately, as a part of S3a implementation in Hadoop 2.7.1, HttpClient’s 
required limit parameter is extracted out in a config and can be raised to 
avoid ConnectionPoolTimeoutException. This is how you can set this parameter in 
core-site.xml:</p>
-
-<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre 
class="highlight"><code>   &lt;configuration&gt;
-     ...
-     
-     &lt;property&gt;
-       &lt;name&gt;fs.s3a.connection.maximum&lt;/name&gt;
-       &lt;value&gt;100&lt;/value&gt;
-     &lt;/property&gt;
-   
-   &lt;/configuration&gt;
-</code></pre></div></div>
-
-
+        <ul>
+        
+      </ul>
     
       
         <div class="doc-nav">
   
-  <span class="previous-toc"><a href="/zh/docs/mapr-db-format/">← MapR-DB 
Format</a></span><span class="next-toc"><a 
href="/zh/docs/opentsdb-storage-plugin/">OpenTSDB Storage Plugin →</a></span>
+  <span class="previous-toc"><a href="">← </a></span><span class="next-toc"><a 
href=""> →</a></span>
 </div>
 
     
diff --git a/output/zh/docs/s3-storage-plugin/index.html b/output/zh/docs/s3-storage-plugin/index.html
index 917ae3b..5e8dd24 100644
--- a/output/zh/docs/s3-storage-plugin/index.html
+++ b/output/zh/docs/s3-storage-plugin/index.html
@@ -1431,7 +1431,7 @@
 
 <p>For additional information, refer to the <a 
href="https://hadoop.apache.org/docs/stable/hadoop-aws/tools/hadoop-aws/index.html";>HDFS
 S3 documentation</a>.</p>
 
-<p><strong>Note:</strong> Drill does not use HDFS 3.x, therefore Drill does 
not support AWS temporary credentials, as described in the s3a 
documentation.</p>
+<p><strong>Note:</strong> Drill started using HDFS 3.0, but support of AWS 
temporary credentials (as described in the s3a documentation) wasn’t verified 
yet.</p>
 
 <h2 id="providing-aws-credentials">Providing AWS Credentials</h2>
 
diff --git a/output/zh/feed.xml b/output/zh/feed.xml
index eb1c091..577e3ac 100644
--- a/output/zh/feed.xml
+++ b/output/zh/feed.xml
@@ -6,8 +6,8 @@
 </description>
     <link>/</link>
     <atom:link href="/zh/feed.xml" rel="self" type="application/rss+xml"/>
-    <pubDate>Tue, 07 Sep 2021 05:33:34 +0000</pubDate>
-    <lastBuildDate>Tue, 07 Sep 2021 05:33:34 +0000</lastBuildDate>
+    <pubDate>Tue, 07 Sep 2021 09:58:24 +0000</pubDate>
+    <lastBuildDate>Tue, 07 Sep 2021 09:58:24 +0000</lastBuildDate>
     <generator>Jekyll v3.9.1</generator>
     
       <item>