dbtsai commented on a change in pull request #27728: [SPARK-25556][SPARK-17636][SPARK-31026][SPARK-31060][SQL][test-hive1.2] Nested Column Predicate Pushdown for Parquet
URL: https://github.com/apache/spark/pull/27728#discussion_r397601474
 
 

 ##########
 File path: sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetTest.scala
 ##########
 @@ -62,13 +62,21 @@ private[sql] trait ParquetTest extends FileBasedDataSourceTest {
       (data: Seq[T])
       (f: String => Unit): Unit = withDataSourceFile(data)(f)
 
+  protected def toDF[T <: Product: ClassTag: TypeTag](data: Seq[T]): DataFrame = {
+    spark.createDataFrame(data)
+  }
+
   /**
-   * Writes `data` to a Parquet file and reads it back as a [[DataFrame]],
+   * Writes `df` dataframe to a Parquet file and reads it back as a [[DataFrame]],
   * which is then passed to `f`. The Parquet file will be deleted after `f` returns.
    */
-  protected def withParquetDataFrame[T <: Product: ClassTag: TypeTag]
-      (data: Seq[T], testVectorized: Boolean = true)
-      (f: DataFrame => Unit): Unit = withDataSourceDataFrame(data, testVectorized)(f)
+  protected def withParquetDataFrame(df: DataFrame, testVectorized: Boolean = true)
 
 Review comment:
   It's because the original test framework takes a `Seq[T]`, which is very hard to manipulate programmatically to create different flavors of nested data as new test cases. See
   https://github.com/apache/spark/pull/27728/files#diff-43b427b8b0b4b9d8dd7e4367c0526f83R128

   By taking a `DataFrame` instead, it's much easier to create new nested data based on single-level data for tests.
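
   To illustrate (a hypothetical sketch, not code from this PR): once the helper accepts a `DataFrame`, a single flat dataset can be fanned out into nested variants with `struct`, which is awkward when the helper only accepts a `Seq[T]`. The helper name `nestedVariants` and the column names `a`/`b` below are made up for illustration; `toDF` and `withParquetDataFrame` refer to the methods in the diff above.

   ```scala
   import org.apache.spark.sql.DataFrame
   import org.apache.spark.sql.functions.struct

   // Hypothetical helper: derive nested variants from a flat DataFrame.
   // Each variant wraps the same columns one level deeper, so one flat
   // dataset can drive pushdown tests for predicates like `s.a` or `t.s.b`.
   def nestedVariants(df: DataFrame): Seq[DataFrame] = Seq(
     df,                                                 // flat: a, b
     df.select(struct("a", "b").as("s")),                // nested: s.a, s.b
     df.select(struct(struct("a", "b").as("s")).as("t")) // deeper: t.s.a, t.s.b
   )

   // Sketch of usage with the new DataFrame-based helper:
   // nestedVariants(toDF(data)).foreach { df =>
   //   withParquetDataFrame(df) { loaded => /* assert filters are pushed down */ }
   // }
   ```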
