karolchmist commented on pull request #28545:
URL: https://github.com/apache/spark/pull/28545#issuecomment-681962135
Thanks @srowen, that worked better, but now the build fails in the `spark-sql` module. I can try to fix it if no one else is already working on it...
```
...
[INFO] --- scala-maven-plugin:4.3.0:compile (scala-compile-first) @ spark-sql_2.13 ---
[INFO] Using incremental compilation using Mixed compile order
[INFO] Compiler bridge file: /home/karol/.sbt/1.0/zinc/org.scala-sbt/org.scala-sbt-compiler-bridge_2.13-1.3.1-bin_2.13.3__52.0-1.3.1_20191012T045515.jar
[INFO] Compiling 473 Scala sources and 59 Java sources to /home/karol/workspace/open-source/spark/sql/core/target/scala-2.13/classes ...
[ERROR] [Error] /home/karol/workspace/open-source/spark/sql/core/src/main/scala/org/apache/spark/sql/DataFrameReader.scala:121: value += is not a member of org.apache.spark.sql.catalyst.util.CaseInsensitiveMap[String]
Expression does not convert to assignment because:
  type mismatch;
   found   : scala.collection.immutable.Map[String,String]
   required: org.apache.spark.sql.catalyst.util.CaseInsensitiveMap[String]
  expansion: this.extraOptions = this.extraOptions.+(key.$minus$greater(value))
[ERROR] [Error] /home/karol/workspace/open-source/spark/sql/core/src/main/scala/org/apache/spark/sql/DataFrameWriter.scala:132: value += is not a member of org.apache.spark.sql.catalyst.util.CaseInsensitiveMap[String]
Expression does not convert to assignment because:
  type mismatch;
   found   : scala.collection.immutable.Map[String,String]
   required: org.apache.spark.sql.catalyst.util.CaseInsensitiveMap[String]
  expansion: this.extraOptions = this.extraOptions.+(key.$minus$greater(value))
[ERROR] [Error] /home/karol/workspace/open-source/spark/sql/core/src/main/scala/org/apache/spark/sql/DataFrameWriter.scala:294: value += is not a member of org.apache.spark.sql.catalyst.util.CaseInsensitiveMap[String]
Expression does not convert to assignment because:
  type mismatch;
   found   : scala.collection.immutable.Map[String,String]
   required: org.apache.spark.sql.catalyst.util.CaseInsensitiveMap[String]
  expansion: this.extraOptions = this.extraOptions.+("path".$minus$greater(path))
Error occurred in an application involving default arguments.
[ERROR] [Error] /home/karol/workspace/open-source/spark/sql/core/src/main/scala/org/apache/spark/sql/DataFrameWriter.scala:317: type mismatch;
 found   : Iterable[(String, String)]
 required: java.util.Map[String,String]
Error occurred in an application involving default arguments.
[INFO] [Info] : Iterable[(String, String)] <: java.util.Map[String,String]?
[INFO] [Info] : false
[ERROR] [Error] /home/karol/workspace/open-source/spark/sql/core/src/main/scala/org/apache/spark/sql/DataFrameWriter.scala:412: value += is not a member of org.apache.spark.sql.catalyst.util.CaseInsensitiveMap[String]
Expression does not convert to assignment because:
  type mismatch;
   found   : scala.collection.immutable.Map[String,String]
   required: org.apache.spark.sql.catalyst.util.CaseInsensitiveMap[String]
  expansion: DataFrameWriter.this.extraOptions = DataFrameWriter.this.extraOptions.+(DataSourceUtils.PARTITIONING_COLUMNS_KEY.$minus$greater(DataSourceUtils.encodePartitioningColumns(columns)))
[ERROR] [Error] /home/karol/workspace/open-source/spark/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/orc/OrcFiltersBase.scala:85: type mismatch;
 found   : scala.collection.MapView[String,OrcFiltersBase.this.OrcPrimitiveField]
 required: Map[String,OrcFiltersBase.this.OrcPrimitiveField]
[ERROR] [Error] /home/karol/workspace/open-source/spark/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/FileDataSourceV2.scala:64: type mismatch;
 found   : Iterable[(String, String)]
 required: java.util.Map[String,String]
[INFO] [Info] : Iterable[(String, String)] <: java.util.Map[String,String]?
[INFO] [Info] : false
[ERROR] 7 errors found
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for Spark Project Parent POM 3.1.0-SNAPSHOT:
[INFO]
[INFO] Spark Project Parent POM ........................... SUCCESS [  3.160 s]
[INFO] Spark Project Tags ................................. SUCCESS [  4.082 s]
[INFO] Spark Project Sketch ............................... SUCCESS [  2.523 s]
[INFO] Spark Project Local DB ............................. SUCCESS [  3.184 s]
[INFO] Spark Project Networking ........................... SUCCESS [  5.766 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [  5.206 s]
[INFO] Spark Project Unsafe ............................... SUCCESS [  3.028 s]
[INFO] Spark Project Launcher ............................. SUCCESS [  3.517 s]
[INFO] Spark Project Core ................................. SUCCESS [02:02 min]
[INFO] Spark Project ML Local Library ..................... SUCCESS [ 23.898 s]
[INFO] Spark Project GraphX ............................... SUCCESS [ 30.828 s]
[INFO] Spark Project Streaming ............................ SUCCESS [ 56.213 s]
[INFO] Spark Project Catalyst ............................. SUCCESS [02:44 min]
[INFO] Spark Project SQL .................................. FAILURE [ 24.530 s]
[INFO] Spark Project ML Library ........................... SKIPPED
[INFO] Spark Project REPL ................................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 07:33 min
[INFO] Finished at: 2020-08-27T15:42:43+02:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:4.3.0:compile (scala-compile-first) on project spark-sql_2.13: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:4.3.0:compile failed.: CompileFailed -> [Help 1]
```
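
For context on the `+=` failures: Scala only accepts `extraOptions += (key -> value)` on a `var` if it can desugar it to `extraOptions = extraOptions + (key -> value)` (that's the `expansion:` shown in the errors). On 2.13, `+` on an immutable `Map` subtype goes through the inherited `updated`, whose signature returns a plain `Map[String, String]`, so the reassignment back into a `var` typed `CaseInsensitiveMap[String]` no longer typechecks. Here is a minimal sketch of the failure and one possible workaround, using a hypothetical `LowerKeyMap` stand-in rather than the real `CaseInsensitiveMap`:

```scala
// Scala 2.13 sketch of the "+= is not a member" errors above.
// LowerKeyMap is a hypothetical stand-in for CaseInsensitiveMap.
final class LowerKeyMap[T](underlying: Map[String, T]) extends Map[String, T] {
  private val norm = underlying.map { case (k, v) => k.toLowerCase -> v }
  def get(key: String): Option[T] = norm.get(key.toLowerCase)
  def iterator: Iterator[(String, T)] = norm.iterator
  def removed(key: String): Map[String, T] = new LowerKeyMap(norm - key.toLowerCase)
  // In 2.13, `+` delegates to `updated`, whose inherited signature returns
  // Map[String, V1] rather than LowerKeyMap[V1] -- the root of the errors.
  def updated[V1 >: T](key: String, value: V1): Map[String, V1] =
    new LowerKeyMap[V1](norm.updated(key.toLowerCase, value))
}

object PlusEqualsDemo extends App {
  var extraOptions = new LowerKeyMap(Map("Path" -> "/tmp/in"))

  // Does NOT compile on 2.13: it desugars to
  //   extraOptions = extraOptions + ("mode" -> "append")
  // and `+` returns Map[String, String], not LowerKeyMap[String].
  // extraOptions += ("mode" -> "append")

  // One workaround: re-wrap the plain Map result so the var keeps its type.
  extraOptions = new LowerKeyMap(extraOptions + ("mode" -> "append"))
  println(extraOptions.get("PATH")) // Some(/tmp/in): lookups ignore case
}
```

Re-wrapping is just the smallest change that typechecks; the real fix might instead belong inside `CaseInsensitiveMap` itself (making its own `+`/`updated` return the subtype), which I haven't verified.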
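The remaining errors look like the standard 2.13 collection changes: `mapValues` now yields a lazy `scala.collection.MapView` (the `OrcFiltersBase.scala:85` error), and the `DataFrameWriter.scala:317` / `FileDataSourceV2.scala:64` errors suggest some intermediate result is now `Iterable[(String, String)]` where a `java.util.Map` is expected. A hedged sketch of the generic remedies, not necessarily the exact lines Spark needs:

```scala
import scala.jdk.CollectionConverters._

object CollectionMigrationDemo extends App {
  val fields = Map("a.b" -> 1, "c.d" -> 2)

  // 2.13: mapValues returns a lazy MapView; materialize with .toMap
  // wherever a strict Map is required.
  val view: scala.collection.MapView[String, Int] = fields.view.mapValues(_ * 10)
  val strict: Map[String, Int] = view.toMap

  // A generic Iterable[(String, String)] no longer converts to java.util.Map
  // directly; go through a strict Map first, then .asJava.
  val pairs: Iterable[(String, String)] = List("k" -> "v")
  val javaMap: java.util.Map[String, String] = pairs.toMap.asJava

  println(strict)
  println(javaMap)
}
```

If that diagnosis is right, adding `.toMap` (and `.asJava` from `scala.jdk.CollectionConverters`) at those call sites should clear them.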