srowen commented on a change in pull request #26826: 
[SPARK-30195][SQL][CORE][ML] Change some function, import definitions to work 
with stricter compiler in Scala 2.13
URL: https://github.com/apache/spark/pull/26826#discussion_r355827382
 
 

 ##########
 File path: 
sql/core/src/test/scala/org/apache/spark/sql/execution/columnar/compression/IntegralDeltaSuite.scala
 ##########
 @@ -173,9 +173,7 @@ class IntegralDeltaSuite extends SparkFunSuite {
     }
 
     test(s"$scheme: long random series") {
 -      // Have to workaround with `Any` since no `ClassTag[I#JvmType]` available here.
-      val input = Array.fill[Any](10000)(makeRandomValue(columnType))
-      skeleton(input.map(_.asInstanceOf[I#InternalType]))
+      skeleton(Seq.fill[I#InternalType](10000)(makeRandomValue(columnType)))
 
 Review comment:
   This arguably belongs more with my earlier "Array -> Seq" change; I missed it there, but it concerns using correct types. The problem arose because this test used a type for which no ClassTag is available, and that causes a few issues, notably when constructing Arrays. Switching to Seq (along with a few similar workarounds) avoids the `Any` array and the cast, which makes it cleaner.
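
   For context only (not part of the diff), a minimal sketch of the ClassTag issue described above; all names here are illustrative. `Array.fill[T]` needs an implicit `ClassTag[T]` so the runtime element class of the JVM array is known, and no such ClassTag exists for an abstract type member like `I#InternalType`, while `Seq.fill[T]` carries no such constraint.

```scala
import scala.reflect.ClassTag

object ClassTagSketch {
  // Array.fill must know the element's runtime class to allocate a JVM array,
  // so it requires an implicit ClassTag for the element type.
  def fillArray[T: ClassTag](n: Int)(elem: => T): Array[T] = Array.fill(n)(elem)

  // Seq.fill builds a generic collection of boxed elements, so no ClassTag is
  // needed; it works even for types with no ClassTag in scope.
  def fillSeq[T](n: Int)(elem: => T): Seq[T] = Seq.fill(n)(elem)

  def main(args: Array[String]): Unit = {
    println(fillSeq(3)("x"))           // fine for any T
    println(fillArray(3)(42).toSeq)    // fine here only because ClassTag[Int] exists
    // fillArray would not compile for a type like I#InternalType, where no
    // ClassTag is available -- hence the old Array[Any] + cast workaround,
    // now replaced by Seq.fill.
  }
}
```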

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]