[ 
https://issues.apache.org/jira/browse/SPARK-44638?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

ASF GitHub Bot reassigned SPARK-44638:
--------------------------------------

    Assignee:     (was: Apache Spark)

> Unable to read from JDBC data sources when using custom schema containing 
> varchar
> ---------------------------------------------------------------------------------
>
>                 Key: SPARK-44638
>                 URL: https://issues.apache.org/jira/browse/SPARK-44638
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.1.0, 3.2.4, 3.3.2, 3.4.1
>            Reporter: Michael Said
>            Priority: Critical
>              Labels: pull-request-available
>
> When querying data from a JDBC database with a custom schema containing 
> varchar, I get this error:
> {code:java}
> [23/07/14 06:12:19 WARN TaskSetManager: Lost task 0.0 in stage 1.0 (TID 1) ( 
> executor 1): java.sql.SQLException: Unsupported type varchar(100) at 
> org.apache.spark.sql.errors.QueryExecutionErrors$.unsupportedJdbcTypeError(QueryExecutionErrors.scala:818)
>  23/07/14 06:12:21 INFO TaskSetManager: Lost task 0.1 in stage 1.0 (TID 2) on 
> , executor 0: java.sql.SQLException (Unsupported type varchar(100)){code}
> Code example: 
> {code:python}
> CUSTOM_SCHEMA = "ID Integer, NAME VARCHAR(100)"
> df = (spark.read.format("jdbc")
>     .option("url", "jdbc:oracle:thin:@0.0.0.0:1521:db")
>     .option("driver", "oracle.jdbc.OracleDriver")
>     .option("dbtable", "table")
>     .option("customSchema", CUSTOM_SCHEMA)
>     .option("user", "user")
>     .option("password", "password")
>     .load())
> df.show()
> {code}
> I tried setting {{spark.sql.legacy.charVarcharAsString = true}} to restore 
> the pre-3.1 behavior, but it does not help.
> The issue occurs in 3.1.0 and later versions. I believe it was introduced by 
> https://issues.apache.org/jira/browse/SPARK-33480
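A possible workaround, assuming the JDBC reader only rejects char/varchar types appearing in {{customSchema}} (STRING columns read fine), is to declare those columns as STRING instead. A small hypothetical helper can rewrite the schema string before passing it to the reader:

```python
import re

def varchar_to_string(schema: str) -> str:
    """Rewrite VARCHAR(n)/CHAR(n) column types in a Spark customSchema
    string to STRING, which the JDBC reader accepts."""
    return re.sub(r"(?i)\b(?:var)?char\s*\(\s*\d+\s*\)", "STRING", schema)

# Hypothetical usage with the schema from the report:
custom_schema = varchar_to_string("ID Integer, NAME VARCHAR(100)")
# custom_schema == "ID Integer, NAME STRING"
```

The rewritten string can then be passed as the {{customSchema}} option in place of the original; Spark still truncates nothing on read, since varchar length limits are not enforced by the JDBC source anyway.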



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
