[ https://issues.apache.org/jira/browse/SPARK-29925?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Yuming Wang resolved SPARK-29925.
---------------------------------
    Resolution: Invalid

> Maven Build fails with Hadoop Version 3.2.0
> -------------------------------------------
>
>                 Key: SPARK-29925
>                 URL: https://issues.apache.org/jira/browse/SPARK-29925
>             Project: Spark
>          Issue Type: Bug
>          Components: Build
>    Affects Versions: 3.1.0
>         Environment: The build was tested in two environments: Debian 10
> with OpenJDK 11 and Scala 2.12, and Debian 9.1 with OpenJDK 8 and Scala
> 2.12. The same error occurred in both. Both environments ran Linux kernel
> 4.19 as VirtualBox VMs on a MacBook.
>            Reporter: Douglas Colkitt
>            Priority: Minor
>
> The build fails in the Spark Core module when Maven is run with Hadoop
> version 3.2.0 specified. The build command is:
> {code:java}
> ./build/mvn -DskipTests -Dhadoop.version=3.2.0 package
> {code}
> The build error output is:
> {code:java}
> [INFO]
> [INFO] --- scala-maven-plugin:4.2.0:testCompile (scala-test-compile-first) @ spark-core_2.12 ---
> [INFO] Using incremental compilation using Mixed compile order
> [INFO] Compiling 262 Scala sources and 27 Java sources to /usr/local/src/spark/core/target/scala-2.12/test-classes ...
> [ERROR] [Error] /usr/local/src/spark/core/src/test/scala/org/apache/spark/util/PropertiesCloneBenchmark.scala:23: object lang is not a member of package org.apache.commons
> [ERROR] [Error] /usr/local/src/spark/core/src/test/scala/org/apache/spark/util/PropertiesCloneBenchmark.scala:49: not found: value SerializationUtils
> [ERROR] two errors found
> {code}
> The problem does _not_ occur when building without a Hadoop version
> specified, i.e. when running:
> {code:java}
> ./build/mvn -DskipTests package
> {code}

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
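Both compile errors point at {{PropertiesCloneBenchmark.scala}} failing to resolve {{SerializationUtils}} from the pre-3.x {{org.apache.commons.lang}} package, which suggests that artifact reaches the test classpath only transitively under the default Hadoop profile. As a minimal, hypothetical sketch of what that utility call performs (assuming the benchmark deep-clones a {{java.util.Properties}} via serialization), the same round-trip can be written with JDK classes alone:

```scala
import java.io.{ByteArrayInputStream, ByteArrayOutputStream, ObjectInputStream, ObjectOutputStream}
import java.util.Properties

// Stand-in for commons-lang's SerializationUtils.clone: write the object
// to an in-memory byte stream, then read back an independent deep copy.
def deepClone(props: Properties): Properties = {
  val buffer = new ByteArrayOutputStream()
  val out = new ObjectOutputStream(buffer)
  out.writeObject(props)
  out.close()
  val in = new ObjectInputStream(new ByteArrayInputStream(buffer.toByteArray))
  in.readObject().asInstanceOf[Properties]
}

val original = new Properties()
original.setProperty("spark.master", "local[2]")

val copy = deepClone(original)
copy.setProperty("spark.master", "yarn")

// Mutating the copy does not affect the original.
println(original.getProperty("spark.master")) // prints local[2]
```

This avoids any dependency on commons-lang, so it compiles regardless of which Hadoop version's transitive dependencies are on the classpath.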