This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new ca916258b991 [SPARK-47953][DOCS] MsSQLServer: Document Mapping Spark SQL Data Types to Microsoft SQL Server
ca916258b991 is described below

commit ca916258b9916452aa2f377608e6be8df65550e5
Author: Kent Yao <y...@apache.org>
AuthorDate: Tue Apr 23 07:41:04 2024 -0700

    [SPARK-47953][DOCS] MsSQLServer: Document Mapping Spark SQL Data Types to Microsoft SQL Server
    
    ### What changes were proposed in this pull request?
    
    This PR adds documentation mapping Spark SQL data types to Microsoft SQL Server data types.
    
    ### Why are the changes needed?
    
    Documentation improvement.

    ### Does this PR introduce _any_ user-facing change?
    
    no
    
    ### How was this patch tested?
    
    doc build
    
![image](https://github.com/apache/spark/assets/8326978/7220d96a-c5ca-4780-9fc5-f93c99f91c10)
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    no
    
    Closes #46177 from yaooqinn/SPARK-47953.
    
    Authored-by: Kent Yao <y...@apache.org>
    Signed-off-by: Dongjoon Hyun <dh...@apple.com>
---
 docs/sql-data-sources-jdbc.md | 106 ++++++++++++++++++++++++++++++++++++++++++
 1 file changed, 106 insertions(+)
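For context, the conversions documented in the diff below are applied when a DataFrame is written through Spark's built-in JDBC data source. The sketch below shows the shape of such a write; the host, database, table, and credentials are illustrative placeholders, not values from this commit:

```python
# Minimal sketch of a DataFrame write through Spark's built-in JDBC
# source, the path that applies the type conversions documented below.
# All connection details are hypothetical placeholders.
jdbc_url = "jdbc:sqlserver://example-host:1433;databaseName=testdb"
options = {
    "url": jdbc_url,
    "dbtable": "dbo.people",  # hypothetical target table
    "user": "app_user",       # placeholder credentials
    "password": "secret",
}

# Against a live SQL Server, assuming `df` is an existing DataFrame:
# df.write.format("jdbc").options(**options).mode("append").save()
```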

diff --git a/docs/sql-data-sources-jdbc.md b/docs/sql-data-sources-jdbc.md
index 51c0886430a3..734ed43f912a 100644
--- a/docs/sql-data-sources-jdbc.md
+++ b/docs/sql-data-sources-jdbc.md
@@ -1630,3 +1630,109 @@ as the activated JDBC Driver.
     </tr>
   </tbody>
 </table>
+
+### Mapping Spark SQL Data Types to Microsoft SQL Server
+
+The table below describes the data type conversions from Spark SQL data types to Microsoft SQL Server data types,
+when creating, altering, or writing data to a Microsoft SQL Server table using the built-in JDBC data source with
+mssql-jdbc as the activated JDBC Driver.
+
+<table>
+  <thead>
+    <tr>
+      <th><b>Spark SQL Data Type</b></th>
+      <th><b>SQL Server Data Type</b></th>
+      <th><b>Remarks</b></th>
+    </tr>
+  </thead>
+  <tbody>
+    <tr>
+      <td>BooleanType</td>
+      <td>bit</td>
+      <td></td>
+    </tr>
+    <tr>
+      <td>ByteType</td>
+      <td>smallint</td>
+      <td>Supported since Spark 4.0.0, previous versions throw errors</td>
+    </tr>
+    <tr>
+      <td>ShortType</td>
+      <td>smallint</td>
+      <td></td>
+    </tr>
+    <tr>
+      <td>IntegerType</td>
+      <td>int</td>
+      <td></td>
+    </tr>
+    <tr>
+      <td>LongType</td>
+      <td>bigint</td>
+      <td></td>
+    </tr>
+    <tr>
+      <td>FloatType</td>
+      <td>real</td>
+      <td></td>
+    </tr>
+    <tr>
+      <td>DoubleType</td>
+      <td>double precision</td>
+      <td></td>
+    </tr>
+    <tr>
+      <td>DecimalType(p, s)</td>
+      <td>decimal(p,s)</td>
+      <td></td>
+    </tr>
+    <tr>
+      <td>DateType</td>
+      <td>date</td>
+      <td></td>
+    </tr>
+    <tr>
+      <td>TimestampType</td>
+      <td>datetime</td>
+      <td></td>
+    </tr>
+    <tr>
+      <td>TimestampNTZType</td>
+      <td>datetime</td>
+      <td></td>
+    </tr>
+    <tr>
+      <td>StringType</td>
+      <td>nvarchar(max)</td>
+      <td></td>
+    </tr>
+    <tr>
+      <td>BinaryType</td>
+      <td>varbinary(max)</td>
+      <td></td>
+    </tr>
+    <tr>
+      <td>CharType(n)</td>
+      <td>char(n)</td>
+      <td></td>
+    </tr>
+    <tr>
+      <td>VarcharType(n)</td>
+      <td>varchar(n)</td>
+      <td></td>
+    </tr>
+  </tbody>
+</table>
+
+The Spark Catalyst data types below have no suitable SQL Server counterparts and are not supported.
+
+- DayTimeIntervalType
+- YearMonthIntervalType
+- CalendarIntervalType
+- ArrayType
+- MapType
+- StructType
+- UserDefinedType
+- NullType
+- ObjectType
+- VariantType
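As a quick sanity check on the table above, the fixed-width mappings can be restated as a plain lookup. The dictionary below is a hypothetical summary of the documented table, not Spark code; the parameterized types (DecimalType, CharType, VarcharType) are omitted for brevity:

```python
# Hypothetical restatement of the documented Spark SQL -> SQL Server
# type mapping; unsupported Catalyst types are deliberately absent.
SPARK_TO_MSSQL = {
    "BooleanType": "bit",
    "ByteType": "smallint",  # supported since Spark 4.0.0
    "ShortType": "smallint",
    "IntegerType": "int",
    "LongType": "bigint",
    "FloatType": "real",
    "DoubleType": "double precision",
    "DateType": "date",
    "TimestampType": "datetime",
    "TimestampNTZType": "datetime",
    "StringType": "nvarchar(max)",
    "BinaryType": "varbinary(max)",
}

def mssql_type(spark_type: str) -> str:
    """Return the documented SQL Server type, or raise KeyError if unsupported."""
    return SPARK_TO_MSSQL[spark_type]
```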


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
