Hi, Spark community. We are trying to upgrade our application from Spark 2.3.1 to 2.4.3, and came across a weird problem.
We are using Gradle for dependency management. Spark depends on twitter-chill, which depends on kryo-shaded. All our paths to kryo-shaded go through the Twitter chill artifacts, and all of them request version 4.0.2. Gradle, however, in its infinite wisdom, pulls in version 3.0.3 instead, and that won't even compile. dependencyInsight reports the following:

com.esotericsoftware:kryo-shaded:3.0.3 (selected by rule)
   variant "runtime" [
      org.gradle.status = release (not requested)
      Requested attributes not found in the selected variant:
         org.gradle.usage = java-api
   ]

com.esotericsoftware:kryo-shaded:4.0.2 -> 3.0.3
+--- com.twitter:chill-java:0.9.3
|    +--- org.apache.spark:spark-core_2.11:2.4.3
|    |    +--- compileClasspath
|    |    +--- org.apache.spark:spark-mllib_2.11:2.4.3
|    |    |    \--- compileClasspath
|    |    +--- org.apache.spark:spark-sql_2.11:2.4.3
|    |    |    +--- compileClasspath
|    |    |    \--- org.apache.spark:spark-mllib_2.11:2.4.3 (*)
|    |    +--- org.apache.spark:spark-catalyst_2.11:2.4.3
|    |    |    \--- org.apache.spark:spark-sql_2.11:2.4.3 (*)
|    |    +--- org.apache.spark:spark-streaming_2.11:2.4.3
|    |    |    \--- org.apache.spark:spark-mllib_2.11:2.4.3 (*)
|    |    \--- org.apache.spark:spark-graphx_2.11:2.4.3
|    |         \--- org.apache.spark:spark-mllib_2.11:2.4.3 (*)
|    \--- com.twitter:chill_2.11:0.9.3
|         +--- org.apache.spark:spark-core_2.11:2.4.3 (*)
|         \--- org.apache.spark:spark-unsafe_2.11:2.4.3
|              +--- org.apache.spark:spark-catalyst_2.11:2.4.3 (*)
|              \--- org.apache.spark:spark-core_2.11:2.4.3 (*)
\--- com.twitter:chill_2.11:0.9.3 (*)

I presume this means that Gradle can't find the property/value org.gradle.usage=java-api in kryo-shaded version 4.0.2, but can in 3.0.3? Does anyone know why this might occur? I see no reference to org.gradle.usage in either our build files or Spark's, so (assuming I even understand the problem correctly) I have no idea where this requirement is coming from.
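For what it's worth, a diagnostic along these lines (Groovy DSL; the task name is arbitrary and I'm only sketching it here, not quoting our build file) should print the attributes a configuration requests, which is where I'd expect org.gradle.usage to show up if anything in the build set it:

```groovy
// Sketch of a diagnostic task (name is made up): print every attribute
// that the compileClasspath configuration requests during resolution.
task printRequestedAttributes {
    doLast {
        def attrs = configurations.compileClasspath.attributes
        attrs.keySet().each { attr ->
            println "${attr.name} = ${attrs.getAttribute(attr)}"
        }
    }
}
```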
We can work around the problem by setting the kryo-shaded version explicitly, but that means we would have to keep updating it by hand as we upgrade in the future, which of course is not ideal. I realize this is likely (though not certainly) a Gradle problem rather than a Spark one, but I'm hoping someone else here has encountered this before? Thanks in advance, -Nathan Kronenfeld