[ https://issues.apache.org/jira/browse/SPARK-20412?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-20412:
------------------------------------
Assignee: Apache Spark
> NullPointerException in places expecting non-optional partitionSpec.
> --------------------------------------------------------------------
>
> Key: SPARK-20412
> URL: https://issues.apache.org/jira/browse/SPARK-20412
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.1.0, 2.2.0
> Reporter: Juliusz Sompolski
> Assignee: Apache Spark
>
> A number of commands, e.g. {{SHOW PARTITIONS}}, expect a partition
> specification without empty values.
> Running {{SHOW PARTITIONS tbl (colStr='foo', colInt)}} then fails with a
> {{NullPointerException}} in an unfriendly way:
> {code}
> java.lang.NullPointerException
> at org.apache.spark.sql.catalyst.catalog.SessionCatalog$$anonfun$org$apache$spark$sql$catalyst$catalog$SessionCatalog$$requireNonEmptyValueInPartitionSpec$1$$anonfun$apply$1.apply(SessionCatalog.scala:927)
> at org.apache.spark.sql.catalyst.catalog.SessionCatalog$$anonfun$org$apache$spark$sql$catalyst$catalog$SessionCatalog$$requireNonEmptyValueInPartitionSpec$1$$anonfun$apply$1.apply(SessionCatalog.scala:927)
> at scala.collection.Iterator$class.exists(Iterator.scala:919)
> at scala.collection.AbstractIterator.exists(Iterator.scala:1336)
> at scala.collection.IterableLike$class.exists(IterableLike.scala:77)
> at scala.collection.AbstractIterable.exists(Iterable.scala:54)
> at org.apache.spark.sql.catalyst.catalog.SessionCatalog$$anonfun$org$apache$spark$sql$catalyst$catalog$SessionCatalog$$requireNonEmptyValueInPartitionSpec$1.apply(SessionCatalog.scala:927)
> at org.apache.spark.sql.catalyst.catalog.SessionCatalog$$anonfun$org$apache$spark$sql$catalyst$catalog$SessionCatalog$$requireNonEmptyValueInPartitionSpec$1.apply(SessionCatalog.scala:926)
> at scala.collection.immutable.List.foreach(List.scala:381)
> at org.apache.spark.sql.catalyst.catalog.SessionCatalog.org$apache$spark$sql$catalyst$catalog$SessionCatalog$$requireNonEmptyValueInPartitionSpec(SessionCatalog.scala:926)
> at org.apache.spark.sql.catalyst.catalog.SessionCatalog$$anonfun$listPartitionNames$1.apply(SessionCatalog.scala:882)
> at org.apache.spark.sql.catalyst.catalog.SessionCatalog$$anonfun$listPartitionNames$1.apply(SessionCatalog.scala:880)
> at scala.Option.foreach(Option.scala:257)
> at org.apache.spark.sql.catalyst.catalog.SessionCatalog.listPartitionNames(SessionCatalog.scala:880)
> at org.apache.spark.sql.execution.command.ShowPartitionsCommand.run(tables.scala:817)
> {code}
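> A minimal reproduction sketch, assuming a partitioned data source table; the
> DDL below is illustrative and not from the original report (only the
> value-less {{colInt}} in the partition spec matters, and the {{PARTITION}}
> keyword is spelled out here):
> {code}
> // Hypothetical setup: any partitioned table with colStr/colInt columns works.
> spark.sql(
>   "CREATE TABLE tbl (id INT, colStr STRING, colInt INT) " +
>   "USING parquet PARTITIONED BY (colStr, colInt)")
> // colInt is listed without a value, so the parsed spec maps it to null,
> // which later hits requireNonEmptyValueInPartitionSpec with an NPE.
> spark.sql("SHOW PARTITIONS tbl PARTITION (colStr='foo', colInt)").show()
> {code}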
> In the stack trace above, {{requireNonEmptyValueInPartitionSpec}} does not
> expect a {{null}} value in the spec.
> It seems that {{visitNonOptionalPartitionSpec}} could throw a
> {{ParseException}} instead of putting {{null}} into the spec, but I'm not sure
> whether that has implications for other commands that use the non-optional
> partitionSpec.
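> A rough sketch of that direction, with simplified stand-ins for the parser
> types (the real method works on ANTLR parse contexts, so this is only an
> assumption about the shape of the change, not the actual Spark code):
> {code}
> // Sketch only: reject value-less partition columns at parse time instead of
> // emitting null into the spec. ParseException here is a simplified stand-in.
> class ParseException(message: String) extends Exception(message)
>
> object NonOptionalPartitionSpecSketch {
>   // A parsed spec may carry None for keys written without a value,
>   // e.g. (colStr='foo', colInt).
>   def visitNonOptionalPartitionSpec(
>       parsed: Map[String, Option[String]]): Map[String, String] = {
>     parsed.map {
>       case (key, Some(value)) => key -> value
>       case (key, None) =>
>         // Friendly error instead of a later NullPointerException in
>         // SessionCatalog.requireNonEmptyValueInPartitionSpec.
>         throw new ParseException(s"Found an empty partition key '$key'.")
>     }
>   }
>
>   def main(args: Array[String]): Unit = {
>     val spec = Map("colStr" -> Some("foo"), "colInt" -> None)
>     try {
>       visitNonOptionalPartitionSpec(spec)
>     } catch {
>       case e: ParseException => println(s"Rejected at parse time: ${e.getMessage}")
>     }
>   }
> }
> {code}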