Hyukjin Kwon created SPARK-17665:
------------------------------------

             Summary: SparkR supports options in other types consistently with other APIs
                 Key: SPARK-17665
                 URL: https://issues.apache.org/jira/browse/SPARK-17665
             Project: Spark
          Issue Type: Improvement
            Reporter: Hyukjin Kwon
            Priority: Minor


Currently, SparkR only supports strings as option values in some APIs such as 
`read.df`/`write.df`.

It'd be great if they supported other types, consistently with the 
Python/Scala/Java/SQL APIs:

- Python supports all types but converts them to strings
- Scala/Java/SQL support Long/Boolean/String/Double
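The Python behavior mentioned above (accepting any option value and converting it to a string before handing it to the JVM) can be sketched roughly as follows. This is a hypothetical helper mirroring what PySpark does internally, not SparkR code; the exact conversion rules are an assumption:

```python
def to_str(value):
    """Convert an option value to the string form the JVM side expects.

    Booleans are lowercased ("true"/"false") so the Scala parsers
    accept them; None passes through unchanged; everything else uses
    its plain str() form.
    """
    if value is None:
        return value
    if isinstance(value, bool):
        return str(value).lower()
    return str(value)

# A boolean option becomes the string "false", a float becomes "0.5"
options = {"inferSchema": False, "samplingRatio": 0.5}
converted = {k: to_str(v) for k, v in options.items()}
```

If SparkR applied an analogous conversion before invoking the JVM, the ClassCastException shown below would not occur.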

Currently, 
{code}
> read.df("text.json", "csv", inferSchema=FALSE)
{code}

throws an exception as below:

{code}
Error in value[[3L]](cond) :
  Error in invokeJava(isStatic = TRUE, className, methodName, ...): 
java.lang.ClassCastException: java.lang.Boolean cannot be cast to 
java.lang.String
        at 
org.apache.spark.sql.internal.SessionState$$anonfun$newHadoopConfWithOptions$1.apply(SessionState.scala:59)
        at 
org.apache.spark.sql.internal.SessionState$$anonfun$newHadoopConfWithOptions$1.apply(SessionState.scala:59)
        at scala.collection.immutable.Map$Map3.foreach(Map.scala:161)
        at 
org.apache.spark.sql.internal.SessionState.newHadoopConfWithOptions(SessionState.scala:59)
        at 
org.apache.spark.sql.execution.datasources.PartitioningAwareFileCatalog.<init>(PartitioningAwareFileCatalog.scala:45)
        at 
org.apache.spark.sql.execution.datasources.ListingFileCatalog.<init>(ListingFileCatalog.scala:45)
        at 
org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:401)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:149)
        at org.apache.spark.sql.DataFrameReader.lo
{code}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
