Repository: spark
Updated Branches:
refs/heads/master 348ddfd20 -> 32acfa78c
Improve implicitNotFound message for Encoder
The `implicitNotFound` message for `Encoder` doesn't mention the name of
the type for which it can't find an encoder, nor does it reveal that
`Encoder` is the relevant type class.
The new message adds this type-specific detail while keeping the general
guidance about which types are supported.
## What changes were proposed in this pull request?
Augment the existing message to mention that an `Encoder` is what's being
looked for, and to name the type the encoder is for.
For example instead of:
```
Unable to find encoder for type stored in a Dataset. Primitive types (Int,
String, etc) and Product types (case classes) are supported by importing
spark.implicits._ Support for serializing other types will be added in future
releases.
```
return this message:
```
Unable to find encoder for type Exception. An implicit Encoder[Exception] is
needed to store Exception instances in a Dataset. Primitive types (Int, String,
etc) and Product types (case classes) are supported by importing
spark.implicits._ Support for serializing other types will be added in future
releases.
```
## How was this patch tested?
It was tested manually in the Scala REPL, since triggering this in a test would
cause a compilation error.
```
scala> implicitly[Encoder[Exception]]
<console>:51: error: Unable to find encoder for type Exception. An implicit
Encoder[Exception] is needed to store Exception instances in a Dataset.
Primitive types (Int, String, etc) and Product types (case classes) are
supported by importing spark.implicits._ Support for serializing other types
will be added in future releases.
implicitly[Encoder[Exception]]
^
```
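For context, `scala.annotation.implicitNotFound` supports `${T}` placeholders that the compiler substitutes with the concrete type argument when implicit resolution fails, which is the mechanism this patch relies on. A minimal sketch using a hypothetical `Codec` type class (not part of Spark):

```scala
import scala.annotation.implicitNotFound

// Hypothetical type class illustrating ${T} interpolation: when no implicit
// instance is found, the compiler substitutes ${T} with the requested type
// in the error message, just as the patched Encoder annotation does.
@implicitNotFound("Unable to find a Codec for type ${T}. An implicit Codec[${T}] is required.")
trait Codec[T] {
  def name: String
}

object Codec {
  // The only instance in implicit scope: Codec[Int]
  implicit val intCodec: Codec[Int] = new Codec[Int] { def name = "Int" }
}

object Demo extends App {
  // Resolves successfully via Codec.intCodec
  println(implicitly[Codec[Int]].name)

  // Uncommenting the next line would fail to compile with:
  //   "Unable to find a Codec for type String. An implicit Codec[String] is required."
  // implicitly[Codec[String]]
}
```

As in the patch, the message is only observable at compile time, which is also why the Spark change was verified manually in the REPL rather than in a test.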
Author: Cody Allen <[email protected]>
Closes #20869 from ceedubs/encoder-implicit-msg.
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/32acfa78
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/32acfa78
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/32acfa78
Branch: refs/heads/master
Commit: 32acfa78c60465efc03ae01e022614ad91345b1c
Parents: 348ddfd
Author: Cody Allen <[email protected]>
Authored: Sat May 12 14:35:40 2018 -0500
Committer: Sean Owen <[email protected]>
Committed: Sat May 12 14:35:40 2018 -0500
----------------------------------------------------------------------
.../src/main/scala/org/apache/spark/sql/Encoder.scala | 8 ++++----
1 file changed, 4 insertions(+), 4 deletions(-)
----------------------------------------------------------------------
http://git-wip-us.apache.org/repos/asf/spark/blob/32acfa78/sql/catalyst/src/main/scala/org/apache/spark/sql/Encoder.scala
----------------------------------------------------------------------
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/Encoder.scala
b/sql/catalyst/src/main/scala/org/apache/spark/sql/Encoder.scala
index ccdb6bc..7b02317 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/Encoder.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/Encoder.scala
@@ -68,10 +68,10 @@ import org.apache.spark.sql.types._
*/
@Experimental
@InterfaceStability.Evolving
-@implicitNotFound("Unable to find encoder for type stored in a Dataset. Primitive types " +
-  "(Int, String, etc) and Product types (case classes) are supported by importing " +
-  "spark.implicits._ Support for serializing other types will be added in future " +
-  "releases.")
+@implicitNotFound("Unable to find encoder for type ${T}. An implicit Encoder[${T}] is needed to " +
+  "store ${T} instances in a Dataset. Primitive types (Int, String, etc) and Product types (case " +
+  "classes) are supported by importing spark.implicits._ Support for serializing other types " +
+  "will be added in future releases.")
trait Encoder[T] extends Serializable {
/** Returns the schema of encoding this type of object as a Row. */