[ https://issues.apache.org/jira/browse/SPARK-28471?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16899633#comment-16899633 ]
Maxim Gekk edited comment on SPARK-28471 at 8/4/19 1:53 PM:
------------------------------------------------------------
[~yumwang] There are two equivalent ways to output negative years (the BC era):
# with a '-' sign, or
# by appending the suffix `BC`.

The proposed PR outputs `-` for negative years. I have fixed this only in Spark's own formatters. In your example, you use the standard formatter of *java.sql.Date*. For example, if you enable the Java 8 API:
{code:java}
scala> spark.conf.set("spark.sql.datetime.java8API.enabled", true)

scala> val df = spark.sql("select make_date(-44, 3, 15)")
df: org.apache.spark.sql.DataFrame = [make_date(-44, 3, 15): date]

scala> val date = df.collect.head.getAs[java.time.LocalDate](0)
date: java.time.LocalDate = -0044-03-15
{code}

> Formatting dates with negative years
> ------------------------------------
>
>                 Key: SPARK-28471
>                 URL: https://issues.apache.org/jira/browse/SPARK-28471
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Maxim Gekk
>            Assignee: Maxim Gekk
>            Priority: Minor
>             Fix For: 3.0.0
>
>
> While converting dates with negative years to strings, Spark skips the era
> sub-field by default. That can confuse users, since years from the BC era are
> mirrored into the current era.
> For example:
> {code}
> spark-sql> select make_date(-44, 3, 15);
> 0045-03-15
> {code}
> Even though negative years are outside the range supported by the DATE type, it would be
> nice to indicate the era for such dates.
> PostgreSQL outputs the era for such inputs:
> {code}
> # select make_date(-44, 3, 15);
>    make_date
> ---------------
>  0044-03-15 BC
> (1 row)
> {code}

--
This message was sent by Atlassian JIRA
(v7.6.14#76016)
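[Editor's note] The two rendering styles discussed in the comment (a leading '-' sign vs. a `BC` era suffix) can be reproduced with plain `java.time`, outside of Spark. A minimal sketch — note that `java.time` uses the proleptic ISO calendar, which has a year zero, so proleptic year -44 is year-of-era 45 BC (whereas PostgreSQL, as shown above, renders make_date(-44, ...) as 44 BC):

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;
import java.util.Locale;

public class NegativeYearDemo {
    public static void main(String[] args) {
        // Proleptic ISO year -44 (the year of Caesar's assassination input above)
        LocalDate date = LocalDate.of(-44, 3, 15);

        // Style 1: the default ISO formatter signs negative years
        System.out.println(date);  // -0044-03-15

        // Style 2: a year-of-era pattern with an era field ('G') prints BC/AD;
        // 'y' is year-of-era, so proleptic -44 renders as 0045 BC
        DateTimeFormatter eraFmt =
            DateTimeFormatter.ofPattern("yyyy-MM-dd GG", Locale.ENGLISH);
        System.out.println(date.format(eraFmt));  // 0045-03-15 BC
    }
}
```

To get a sign even with pattern-based formatting, 'u' (proleptic year) can be used instead of 'y' (year-of-era).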