This is an automated email from the ASF dual-hosted git repository.

srowen pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/spark-website.git


The following commit(s) were added to refs/heads/asf-site by this push:
     new b5e249e  Fix the release note of Spark 3.0.0
b5e249e is described below

commit b5e249e6cc86dca728fc0ed7c8b86949eb2c92f9
Author: Xiao Li <gatorsm...@gmail.com>
AuthorDate: Sat Jun 20 08:41:07 2020 -0500

    Fix the release note of Spark 3.0.0
    
    This PR is to fix the release note of Spark 3.0.0. Update the md file to reflect the changes made in HTML.
    
    Author: Xiao Li <gatorsm...@gmail.com>
    
    Closes #274 from gatorsmile/updateReleaseMessage3.0.
---
 releases/_posts/2020-06-18-spark-release-3-0-0.md | 10 +++++-----
 site/releases/spark-release-3-0-0.html            | 10 +++++-----
 2 files changed, 10 insertions(+), 10 deletions(-)

diff --git a/releases/_posts/2020-06-18-spark-release-3-0-0.md b/releases/_posts/2020-06-18-spark-release-3-0-0.md
index 96033c5..2957049 100644
--- a/releases/_posts/2020-06-18-spark-release-3-0-0.md
+++ b/releases/_posts/2020-06-18-spark-release-3-0-0.md
@@ -315,8 +315,8 @@ Please read the [migration guide](https://spark.apache.org/docs/3.0.0/sparkr-mig
 
 ### Known Issues
 
-  - Streaming queries with `dropDuplicates` operator may not be able to restart with the checkpoint written by Spark 2.x. ([SPARK-31990](https://issues.apache.org/jira/browse/SPARK-31990))
-  - In Web UI, the job list page may hang for more than 40 seconds. ([SPARK-31967](https://issues.apache.org/jira/browse/SPARK-31967))
+  - Streaming queries with `dropDuplicates` operator may not be able to restart with the checkpoint written by Spark 2.x. This will be fixed in Spark 3.0.1. ([SPARK-31990](https://issues.apache.org/jira/browse/SPARK-31990))
+  - In Web UI, the job list page may hang for more than 40 seconds. This will be fixed in Spark 3.0.1. ([SPARK-31967](https://issues.apache.org/jira/browse/SPARK-31967))
  - Set `io.netty.tryReflectionSetAccessible` for Arrow on JDK9+ ([SPARK-29923](https://issues.apache.org/jira/browse/SPARK-29923))
  - With AWS SDK upgrade to 1.11.655, we strongly encourage the users that use S3N file system (open-source NativeS3FileSystem that is based on jets3t library) on Hadoop 2.7.3 to upgrade to use AWS Signature V4 and set the bucket endpoint or migrate to S3A (“s3a://” prefix) - jets3t library uses AWS v2 by default and s3.amazonaws.com as an endpoint. Otherwise, the 403 Forbidden error may be thrown in the following cases:
    - If a user accesses an S3 path that contains “+” characters and uses the legacy S3N file system, e.g. s3n://bucket/path/+file.
@@ -324,9 +324,9 @@ Please read the [migration guide](https://spark.apache.org/docs/3.0.0/sparkr-mig
 
    Note that if you use S3AFileSystem, e.g. (“s3a://bucket/path”) to access S3 in S3Select or SQS connectors, then everything will work as expected. ([SPARK-30968](https://issues.apache.org/jira/browse/SPARK-30968))
 
-  - Parsing day of year using pattern letter 'D' returns the wrong result if the year field is missing. This can happen in SQL functions like `to_timestamp` which parses datetime string to datetime values using a pattern string. ([SPARK-31939](https://issues.apache.org/jira/browse/SPARK-31939))
-  - Join/Window/Aggregate inside subqueries may lead to wrong results if the keys have values -0.0 and 0.0. ([SPARK-31958](https://issues.apache.org/jira/browse/SPARK-31958))
-  - A window query may fail with ambiguous self-join error unexpectedly. ([SPARK-31956](https://issues.apache.org/jira/browse/SPARK-31956))
+  - Parsing day of year using pattern letter 'D' returns the wrong result if the year field is missing. This can happen in SQL functions like `to_timestamp` which parses datetime string to datetime values using a pattern string. This will be fixed in Spark 3.0.1. ([SPARK-31939](https://issues.apache.org/jira/browse/SPARK-31939))
+  - Join/Window/Aggregate inside subqueries may lead to wrong results if the keys have values -0.0 and 0.0. This will be fixed in Spark 3.0.1. ([SPARK-31958](https://issues.apache.org/jira/browse/SPARK-31958))
+  - A window query may fail with ambiguous self-join error unexpectedly. This will be fixed in Spark 3.0.1. ([SPARK-31956](https://issues.apache.org/jira/browse/SPARK-31956))
 
 
 ### Credits
diff --git a/site/releases/spark-release-3-0-0.html b/site/releases/spark-release-3-0-0.html
index 5a6a667..e289370 100644
--- a/site/releases/spark-release-3-0-0.html
+++ b/site/releases/spark-release-3-0-0.html
@@ -577,8 +577,8 @@
 <h3 id="known-issues">Known Issues</h3>
 
 <ul>
-  <li>Streaming queries with <code class="highlighter-rouge">dropDuplicates</code> operator may not be able to restart with the checkpoint written by Spark 2.x. This will be fixed in the next release 3.0.1. (<a href="https://issues.apache.org/jira/browse/SPARK-31990">SPARK-31990</a>)</li>
-  <li>In Web UI, the job list page may hang for more than 40 seconds. This will be fixed in the next release 3.0.1. (<a href="https://issues.apache.org/jira/browse/SPARK-31967">SPARK-31967</a>)</li>
+  <li>Streaming queries with <code class="highlighter-rouge">dropDuplicates</code> operator may not be able to restart with the checkpoint written by Spark 2.x. This will be fixed in Spark 3.0.1. (<a href="https://issues.apache.org/jira/browse/SPARK-31990">SPARK-31990</a>)</li>
+  <li>In Web UI, the job list page may hang for more than 40 seconds. This will be fixed in Spark 3.0.1. (<a href="https://issues.apache.org/jira/browse/SPARK-31967">SPARK-31967</a>)</li>
  <li>Set <code class="highlighter-rouge">io.netty.tryReflectionSetAccessible</code> for Arrow on JDK9+ (<a href="https://issues.apache.org/jira/browse/SPARK-29923">SPARK-29923</a>)</li>
  <li>With AWS SDK upgrade to 1.11.655, we strongly encourage the users that use S3N file system (open-source NativeS3FileSystem that is based on jets3t library) on Hadoop 2.7.3 to upgrade to use AWS Signature V4 and set the bucket endpoint or migrate to S3A (“s3a://” prefix) - jets3t library uses AWS v2 by default and s3.amazonaws.com as an endpoint. Otherwise, the 403 Forbidden error may be thrown in the following cases:
     <ul>
@@ -588,9 +588,9 @@
 
    <p>Note that if you use S3AFileSystem, e.g. (“s3a://bucket/path”) to access S3 in S3Select or SQS connectors, then everything will work as expected. (<a href="https://issues.apache.org/jira/browse/SPARK-30968">SPARK-30968</a>)</p>
  </li>
-  <li>Parsing day of year using pattern letter &#8216;D&#8217; returns the wrong result if the year field is missing. This can happen in SQL functions like <code class="highlighter-rouge">to_timestamp</code> which parses datetime string to datetime values using a pattern string. This will be fixed in the next release 3.0.1. (<a href="https://issues.apache.org/jira/browse/SPARK-31939">SPARK-31939</a>)</li>
-  <li>Join/Window/Aggregate inside subqueries may lead to wrong results if the keys have values -0.0 and 0.0. This will be fixed in the next release 3.0.1. (<a href="https://issues.apache.org/jira/browse/SPARK-31958">SPARK-31958</a>)</li>
-  <li>A window query may fail with ambiguous self-join error unexpectedly. This will be fixed in the next release 3.0.1. (<a href="https://issues.apache.org/jira/browse/SPARK-31956">SPARK-31956</a>)</li>
+  <li>Parsing day of year using pattern letter &#8216;D&#8217; returns the wrong result if the year field is missing. This can happen in SQL functions like <code class="highlighter-rouge">to_timestamp</code> which parses datetime string to datetime values using a pattern string. This will be fixed in Spark 3.0.1. (<a href="https://issues.apache.org/jira/browse/SPARK-31939">SPARK-31939</a>)</li>
+  <li>Join/Window/Aggregate inside subqueries may lead to wrong results if the keys have values -0.0 and 0.0. This will be fixed in Spark 3.0.1. (<a href="https://issues.apache.org/jira/browse/SPARK-31958">SPARK-31958</a>)</li>
+  <li>A window query may fail with ambiguous self-join error unexpectedly. This will be fixed in Spark 3.0.1. (<a href="https://issues.apache.org/jira/browse/SPARK-31956">SPARK-31956</a>)</li>
 </ul>
 
 <h3 id="credits">Credits</h3>

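As background on the SPARK-31958 item in the known issues above: IEEE-754 defines -0.0 and 0.0 as equal under comparison, but the two values have distinct bit patterns, so any keying layer that hashes or compares raw bits can split them into separate join/group keys. A minimal sketch in plain Python (not Spark's actual implementation) of the underlying pitfall and the usual normalization:

```python
import struct

def bits(x: float) -> int:
    """Return the raw IEEE-754 bit pattern of a double."""
    return struct.unpack("<Q", struct.pack("<d", x))[0]

def normalize(x: float) -> float:
    """Collapse -0.0 to +0.0 so both hash and compare as one key.

    IEEE-754 round-to-nearest: (-0.0) + 0.0 == +0.0.
    """
    return x + 0.0

# Equal under comparison...
assert 0.0 == -0.0
# ...but distinct at the bit level, which is why an engine must
# normalize -0.0 before using it as a join/group key.
assert bits(0.0) != bits(-0.0)
assert bits(normalize(-0.0)) == bits(0.0)
```

This only illustrates the float semantics behind the bug report; the actual fix in Spark lives in its key normalization for join/window/aggregate operators.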

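The SPARK-29923 item above says to set `io.netty.tryReflectionSetAccessible` when using Arrow on JDK 9+. That property is a JVM system property, so it is typically passed through Spark's extra JVM options; a hypothetical `spark-defaults.conf` fragment (the exact placement depends on your deployment):

```properties
# Allow Netty's reflective access to direct buffers on JDK 9+,
# required by the Arrow-based paths (SPARK-29923).
spark.driver.extraJavaOptions    -Dio.netty.tryReflectionSetAccessible=true
spark.executor.extraJavaOptions  -Dio.netty.tryReflectionSetAccessible=true
```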