[ https://issues.apache.org/jira/browse/SPARK-13261?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Michael Armbrust resolved SPARK-13261.
--------------------------------------
    Resolution: Fixed
    Fix Version/s: 2.0.0

Issue resolved by pull request 11147
[https://github.com/apache/spark/pull/11147]

> Expose maxCharactersPerColumn as a user configurable option
> -----------------------------------------------------------
>
>                 Key: SPARK-13261
>                 URL: https://issues.apache.org/jira/browse/SPARK-13261
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>            Reporter: Hossein Falaki
>             Fix For: 2.0.0
>
>
> We use the Univocity parser in Spark's CSV data source. The parser has a
> fairly small default limit on the maximum number of characters per column.
> Spark's CSV data source raises that limit, but the value is not exposed to
> users, and there are still use cases where it is too small. We should expose
> it as an option; I suggest "maxCharsPerColumn" as the option name.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
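For context, the failure mode the issue describes can be sketched in plain Python with the standard-library csv module. The function `parse_csv` and its `max_chars_per_column` parameter below are hypothetical illustrations, not Spark's or Univocity's API; in Spark 2.0+ the exposed setting is the CSV read option `maxCharsPerColumn` (e.g. `spark.read.option("maxCharsPerColumn", "1000000").csv(path)`).

```python
# Hypothetical sketch of a per-column character limit in a CSV parser,
# illustrating why a hard-coded limit needs to be user configurable.
import csv
import io

def parse_csv(text, max_chars_per_column=4096):
    """Parse CSV text, rejecting any field longer than the column limit."""
    rows = []
    for row in csv.reader(io.StringIO(text)):
        for field in row:
            if len(field) > max_chars_per_column:
                raise ValueError(
                    f"field length {len(field)} exceeds "
                    f"max_chars_per_column={max_chars_per_column}")
        rows.append(row)
    return rows

# A small fixed limit rejects rows with wide columns...
try:
    parse_csv("a," + "x" * 10, max_chars_per_column=5)
except ValueError as e:
    print("rejected:", e)

# ...while a user-raised limit (the knob this issue exposes) accepts them.
print(parse_csv("a," + "x" * 10, max_chars_per_column=100))
```

With the limit hard-coded, the only fix for oversized columns is a code change; exposing it as a parser option moves that decision to the user, which is what the pull request above does for Spark's CSV data source.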