Gankun Luo created SPARK-7359:
---------------------------------

             Summary: DataFrame.saveAsParquetFile throws: Unsupported datatype DecimalType
                 Key: SPARK-7359
                 URL: https://issues.apache.org/jira/browse/SPARK-7359
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 1.3.1
         Environment: GitHub master
            Reporter: Gankun Luo


Prepare the JDBC table and data:
{quote}
CREATE TABLE demo (
  CUST_ID bigint(20) NOT NULL,
  SHOULD_PAY decimal(14,2) DEFAULT '0.00'
);
INSERT INTO demo VALUES (11804028169448, 2000.00);
INSERT INTO demo VALUES (11004005614720, 1234.00);
{quote}

Then read the table via JDBC and save it as a Parquet file:
{quote}
val df = sqlContext.jdbc("jdbc:mysql://hadoop000:3306/test?user=root&password=root", "demo")
// df: org.apache.spark.sql.DataFrame = [CUST_ID: bigint, SHOULD_PAY: decimal(10,0)]
df.saveAsParquetFile("hdfs://hadoop000:8020/data/df_parquet")
{quote}
There are two problems:
1. The inferred schema reports precision 10 and scale 0, which does not match the JDBC table's column type decimal(14,2):
{quote}
val df = sqlContext.jdbc(...)
df: org.apache.spark.sql.DataFrame = [CUST_ID: bigint, SHOULD_PAY: decimal(10,0)]
{quote}
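This presumably happens because the JDBC relation maps the DECIMAL column to Spark's unlimited DecimalType, which merely prints as decimal(10,0). A quick check (a sketch against the 1.3 API; the lookup of a StructField by name on StructType is assumed) is to inspect the Catalyst type directly:
{quote}
// If the column really carries the unlimited DecimalType, this prints
// "DecimalType()" -- the same type named in the exception below.
println(df.schema("SHOULD_PAY").dataType)
{quote}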

2. saveAsParquetFile throws an exception:
{quote}
java.lang.RuntimeException: Unsupported datatype DecimalType()
        at scala.sys.package$.error(package.scala:27)
        at org.apache.spark.sql.parquet.ParquetTypesConverter$$anonfun$fromDataType$2.apply(ParquetTypes.scala:372)
        at org.apache.spark.sql.parquet.ParquetTypesConverter$$anonfun$fromDataType$2.apply(ParquetTypes.scala:316)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.sql.parquet.ParquetTypesConverter$.fromDataType(ParquetTypes.scala:315)
        at org.apache.spark.sql.parquet.ParquetTypesConverter$$anonfun$4.apply(ParquetTypes.scala:396)
        at org.apache.spark.sql.parquet.ParquetTypesConverter$$anonfun$4.apply(ParquetTypes.scala:395)
{quote}
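A possible workaround until this is fixed (a sketch only, assuming the real precision and scale are known from the source table, here decimal(14,2)) is to cast the column to a fixed-precision decimal before writing, since the Parquet converter does handle fixed-precision decimals:
{quote}
import org.apache.spark.sql.types.DecimalType

// Re-select the columns, casting the unlimited decimal back to the
// precision/scale declared in MySQL so the Parquet converter sees a
// fixed-precision DecimalType instead of DecimalType().
val fixed = df.select(
  df("CUST_ID"),
  df("SHOULD_PAY").cast(DecimalType(14, 2)).as("SHOULD_PAY"))
fixed.saveAsParquetFile("hdfs://hadoop000:8020/data/df_parquet")
{quote}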



