Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/14419#discussion_r72914197
--- Diff: sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetCompatibilityTest.scala ---
@@ -119,8 +119,18 @@ private[sql] object ParquetCompatibilityTest {
metadata: Map[String, String],
recordWriters: (RecordConsumer => Unit)*): Unit = {
val messageType = MessageTypeParser.parseMessageType(schema)
- val writeSupport = new DirectWriteSupport(messageType, metadata)
- val parquetWriter = new ParquetWriter[RecordConsumer => Unit](new Path(path), writeSupport)
+ val testWriteSupport = new DirectWriteSupport(messageType, metadata)
+ case class ParquetWriterBuilder() extends
--- End diff --
One thing I am a bit wondering about, though, is why this has to be a `case class`
instead of just a `class`. It seems self-contained, so it does not appear to need
to be serializable, and there is no pattern matching on it.
Just curious.
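To illustrate the point of the question: a minimal sketch (with hypothetical names, not taken from the PR) of what the compiler generates for a `case class` that a plain `class` does not get, namely structural `equals`/`hashCode`, `apply`/`unapply` for pattern matching, and `Serializable`:

```scala
// Hypothetical example contrasting `class` vs `case class`; names are illustrative only.
class PlainBuilder(val path: String)   // plain class: no generated extras
case class CaseBuilder(path: String)   // compiler adds equals/hashCode, apply/unapply, Serializable

object CaseClassDemo {
  def main(args: Array[String]): Unit = {
    // Plain classes compare by reference; case classes compare structurally:
    println(new PlainBuilder("a") == new PlainBuilder("a")) // false
    println(CaseBuilder("a") == CaseBuilder("a"))           // true

    // The generated unapply enables pattern matching:
    CaseBuilder("a") match {
      case CaseBuilder(p) => println(s"matched: $p")        // matched: a
    }

    // Case classes extend Serializable automatically:
    println(CaseBuilder("a").isInstanceOf[Serializable])    // true
  }
}
```

So if none of these features are used, a plain `class` would indeed suffice, which is what the comment is asking.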
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at [email protected] or file a JIRA ticket
with INFRA.
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]