This is an automated email from the ASF dual-hosted git repository.

wenchen pushed a commit to branch branch-3.1
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.1 by this push:
     new fa50fa1  [SPARK-33641][SQL][DOC][FOLLOW-UP] Add migration guide for CHAR VARCHAR types
fa50fa1 is described below

commit fa50fa1bc17c1904121b35bad381063d9c7c8e70
Author: Kent Yao <yaooq...@hotmail.com>
AuthorDate: Wed Dec 9 06:44:10 2020 +0000

    [SPARK-33641][SQL][DOC][FOLLOW-UP] Add migration guide for CHAR VARCHAR types
    
    ### What changes were proposed in this pull request?
    
    Add migration guide for CHAR VARCHAR types
    
    ### Why are the changes needed?
    
    To document the CHAR/VARCHAR behavior change for users migrating to Spark 3.1.
    
    ### Does this PR introduce _any_ user-facing change?
    
    Documentation change only.
    
    ### How was this patch tested?
    
    Passing CI.
    
    Closes #30654 from yaooqinn/SPARK-33641-F.
    
    Authored-by: Kent Yao <yaooq...@hotmail.com>
    Signed-off-by: Wenchen Fan <wenc...@databricks.com>
    (cherry picked from commit c88eddac3bf860d04bba91fc913f8b2069a94153)
    Signed-off-by: Wenchen Fan <wenc...@databricks.com>
---
 docs/sql-migration-guide.md | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/docs/sql-migration-guide.md b/docs/sql-migration-guide.md
index 2c86e7a..2bc04a0 100644
--- a/docs/sql-migration-guide.md
+++ b/docs/sql-migration-guide.md
@@ -54,6 +54,8 @@ license: |
   
   - In Spark 3.1, creating or altering a view will capture runtime SQL configs and store them as view properties. These configs will be applied during the parsing and analysis phases of the view resolution. To restore the behavior before Spark 3.1, you can set `spark.sql.legacy.useCurrentConfigsForView` to `true`.
 
+  - Since Spark 3.1, CHAR/CHARACTER and VARCHAR types are supported in the table schema. Table scan/insertion will respect the char/varchar semantic. If char/varchar is used in places other than table schema, an exception will be thrown (CAST is an exception that simply treats char/varchar as string like before). To restore the behavior before Spark 3.1, which treats them as STRING types and ignores a length parameter, e.g. `CHAR(4)`, you can set `spark.sql.legacy.charVarcharAsString` to [...]
+
 ## Upgrading from Spark SQL 3.0 to 3.0.1
 
  - In Spark 3.0, JSON datasource and JSON function `schema_of_json` infer TimestampType from string values if they match to the pattern defined by the JSON option `timestampFormat`. Since version 3.0.1, the timestamp type inference is disabled by default. Set the JSON option `inferTimestamp` to `true` to enable such type inference.
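To make the semantic change in the diff above concrete: "respecting the char/varchar semantic" means a CHAR(n) column pads short values to exactly n characters, and both CHAR(n) and VARCHAR(n) reject over-length input on insertion (whereas the legacy behavior treated both as plain STRING and ignored the length). The following is a minimal standalone Python sketch of that behavior, not Spark code — the helper names are illustrative, and the exact trimming/overflow rules in Spark may differ in edge cases:

```python
def insert_char(value: str, n: int) -> str:
    """Sketch of CHAR(n) insertion semantics: reject values whose
    meaningful length exceeds n, otherwise right-pad with spaces to n."""
    if len(value.rstrip()) > n:
        raise ValueError(f"input of length {len(value)} exceeds CHAR({n})")
    return value.rstrip().ljust(n)

def insert_varchar(value: str, n: int) -> str:
    """Sketch of VARCHAR(n) insertion semantics: reject over-length
    values, store shorter values unchanged (no padding)."""
    if len(value) > n:
        raise ValueError(f"input of length {len(value)} exceeds VARCHAR({n})")
    return value
```

Under the legacy flag (`spark.sql.legacy.charVarcharAsString`), both types would instead behave like an unconstrained STRING: no padding and no length check.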


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
