GitHub user JoshRosen commented on the pull request:

    https://github.com/apache/spark/pull/5727#issuecomment-96816752
  
    It looks like the SQL module failed 13 MiMa checks, although all of them appear to be in test code or internal APIs (so we can just double-check, then add the proper excludes / annotations):
    
    ```
    [info] spark-sql: found 13 potential binary incompatibilities (filtered 101)
    [error]  * method checkAnalysis()org.apache.spark.sql.catalyst.analysis.CheckAnalysis in class org.apache.spark.sql.SQLContext does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.SQLContext.checkAnalysis")
    [error]  * method children()scala.collection.immutable.Nil# in class org.apache.spark.sql.execution.ExecutedCommand has now a different result type; was: scala.collection.immutable.Nil#, is now: scala.collection.Seq
    [error]    filter with: ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.sql.execution.ExecutedCommand.children")
    [error]  * class org.apache.spark.sql.execution.AddExchange does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.sql.execution.AddExchange")
    [error]  * method children()scala.collection.immutable.Nil# in class org.apache.spark.sql.execution.LogicalLocalTable has now a different result type; was: scala.collection.immutable.Nil#, is now: scala.collection.Seq
    [error]    filter with: ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.sql.execution.LogicalLocalTable.children")
    [error]  * method newInstance()org.apache.spark.sql.catalyst.analysis.MultiInstanceRelation in class org.apache.spark.sql.execution.LogicalLocalTable has now a different result type; was: org.apache.spark.sql.catalyst.analysis.MultiInstanceRelation, is now: org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
    [error]    filter with: ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.sql.execution.LogicalLocalTable.newInstance")
    [error]  * method children()scala.collection.immutable.Nil# in class org.apache.spark.sql.execution.PhysicalRDD has now a different result type; was: scala.collection.immutable.Nil#, is now: scala.collection.Seq
    [error]    filter with: ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.sql.execution.PhysicalRDD.children")
    [error]  * method children()scala.collection.immutable.Nil# in class org.apache.spark.sql.execution.LocalTableScan has now a different result type; was: scala.collection.immutable.Nil#, is now: scala.collection.Seq
    [error]    filter with: ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.sql.execution.LocalTableScan.children")
    [error]  * object org.apache.spark.sql.execution.AddExchange does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.sql.execution.AddExchange$")
    [error]  * method children()scala.collection.immutable.Nil# in class org.apache.spark.sql.execution.LogicalRDD has now a different result type; was: scala.collection.immutable.Nil#, is now: scala.collection.Seq
    [error]    filter with: ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.sql.execution.LogicalRDD.children")
    [error]  * method newInstance()org.apache.spark.sql.catalyst.analysis.MultiInstanceRelation in class org.apache.spark.sql.execution.LogicalRDD has now a different result type; was: org.apache.spark.sql.catalyst.analysis.MultiInstanceRelation, is now: org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
    [error]    filter with: ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.sql.execution.LogicalRDD.newInstance")
    [error]  * class org.apache.spark.sql.parquet.ParquetTestData does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.sql.parquet.ParquetTestData")
    [error]  * object org.apache.spark.sql.parquet.ParquetTestData does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.sql.parquet.ParquetTestData$")
    [error]  * class org.apache.spark.sql.parquet.TestGroupWriteSupport does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.sql.parquet.TestGroupWriteSupport")
    ```
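    For reference, here is a minimal sketch of how those filters could be registered in Spark's `project/MimaExcludes.scala`. Only the `ProblemFilters` expressions are taken verbatim from the log above; the version guard and surrounding structure are assumptions on my part:
    
    ```scala
    // Hypothetical excerpt from project/MimaExcludes.scala; the "1.4" guard
    // is assumed from the 1.3.0 comparison baseline seen in this build.
    import com.typesafe.tools.mima.core._
    
    object MimaExcludes {
      def excludes(version: String): Seq[ProblemFilter] = version match {
        case v if v.startsWith("1.4") =>
          Seq(
            // Internal SQLContext helper that was removed
            ProblemFilters.exclude[MissingMethodProblem](
              "org.apache.spark.sql.SQLContext.checkAnalysis"),
            // Internal planner rule removed (class plus companion object)
            ProblemFilters.exclude[MissingClassProblem](
              "org.apache.spark.sql.execution.AddExchange"),
            ProblemFilters.exclude[MissingClassProblem](
              "org.apache.spark.sql.execution.AddExchange$"),
            // Test-only Parquet helpers removed
            ProblemFilters.exclude[MissingClassProblem](
              "org.apache.spark.sql.parquet.ParquetTestData"),
            ProblemFilters.exclude[MissingClassProblem](
              "org.apache.spark.sql.parquet.ParquetTestData$"),
            ProblemFilters.exclude[MissingClassProblem](
              "org.apache.spark.sql.parquet.TestGroupWriteSupport"),
            // children()/newInstance() result types widened on internal plan nodes
            ProblemFilters.exclude[IncompatibleResultTypeProblem](
              "org.apache.spark.sql.execution.ExecutedCommand.children"),
            ProblemFilters.exclude[IncompatibleResultTypeProblem](
              "org.apache.spark.sql.execution.PhysicalRDD.children"),
            ProblemFilters.exclude[IncompatibleResultTypeProblem](
              "org.apache.spark.sql.execution.LocalTableScan.children"),
            ProblemFilters.exclude[IncompatibleResultTypeProblem](
              "org.apache.spark.sql.execution.LogicalRDD.children"),
            ProblemFilters.exclude[IncompatibleResultTypeProblem](
              "org.apache.spark.sql.execution.LogicalRDD.newInstance"),
            ProblemFilters.exclude[IncompatibleResultTypeProblem](
              "org.apache.spark.sql.execution.LogicalLocalTable.children"),
            ProblemFilters.exclude[IncompatibleResultTypeProblem](
              "org.apache.spark.sql.execution.LogicalLocalTable.newInstance")
          )
        case _ => Seq.empty
      }
    }
    ```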
    
    `launcher` failed its checks because it couldn't resolve a spark-launcher JAR from Maven (presumably because the module is new, so no 1.3.0 artifact was ever published):
    
    ```
    [info] spark-mllib: found 0 potential binary incompatibilities (filtered 242)
    sbt.ResolveException: unresolved dependency: org.apache.spark#spark-launcher_2.10;1.3.0: not found
        at sbt.IvyActions$.sbt$IvyActions$$resolve(IvyActions.scala:278)
        at sbt.IvyActions$$anonfun$updateEither$1.apply(IvyActions.scala:175)
        at sbt.IvyActions$$anonfun$updateEither$1.apply(IvyActions.scala:157)
        at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:151)
        at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:151)
        at sbt.IvySbt$$anonfun$withIvy$1.apply(Ivy.scala:128)
        at sbt.IvySbt.sbt$IvySbt$$action$1(Ivy.scala:56)
        at sbt.IvySbt$$anon$4.call(Ivy.scala:64)
        at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:93)
        at xsbt.boot.Locks$GlobalLock.xsbt$boot$Locks$GlobalLock$$withChannelRetries$1(Locks.scala:78)
        at xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock$1.apply(Locks.scala:97)
        at xsbt.boot.Using$.withResource(Using.scala:10)
        at xsbt.boot.Using$.apply(Using.scala:9)
        at xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:58)
        at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:48)
        at xsbt.boot.Locks$.apply0(Locks.scala:31)
        at xsbt.boot.Locks$.apply(Locks.scala:28)
        at sbt.IvySbt.withDefaultLogger(Ivy.scala:64)
        at sbt.IvySbt.withIvy(Ivy.scala:123)
        at sbt.IvySbt.withIvy(Ivy.scala:120)
        at sbt.IvySbt$Module.withModule(Ivy.scala:151)
        at sbt.IvyActions$.updateEither(IvyActions.scala:157)
        at sbt.IvyActions$.update(IvyActions.scala:145)
        at com.typesafe.tools.mima.plugin.SbtMima$.getPreviousArtifact(SbtMima.scala:75)
        at com.typesafe.tools.mima.plugin.MimaPlugin$$anonfun$mimaDefaultSettings$4$$anonfun$apply$2.apply(MimaPlugin.scala:30)
        at com.typesafe.tools.mima.plugin.MimaPlugin$$anonfun$mimaDefaultSettings$4$$anonfun$apply$2.apply(MimaPlugin.scala:30)
        at scala.Option.map(Option.scala:145)
        at com.typesafe.tools.mima.plugin.MimaPlugin$$anonfun$mimaDefaultSettings$4.apply(MimaPlugin.scala:30)
        at com.typesafe.tools.mima.plugin.MimaPlugin$$anonfun$mimaDefaultSettings$4.apply(MimaPlugin.scala:29)
        at scala.Function3$$anonfun$tupled$1.apply(Function3.scala:35)
        at scala.Function3$$anonfun$tupled$1.apply(Function3.scala:34)
        at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
        at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
        at sbt.std.Transform$$anon$4.work(System.scala:63)
        at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
        at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
        at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
        at sbt.Execute.work(Execute.scala:235)
        at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
        at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
        at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
        at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
    java.lang.RuntimeException: spark-sql: Binary compatibility check failed!
        at scala.sys.package$.error(package.scala:27)
        at com.typesafe.tools.mima.plugin.SbtMima$.reportErrors(SbtMima.scala:64)
        at com.typesafe.tools.mima.plugin.MimaPlugin$$anonfun$mimaReportSettings$3.apply(MimaPlugin.scala:23)
        at com.typesafe.tools.mima.plugin.MimaPlugin$$anonfun$mimaReportSettings$3.apply(MimaPlugin.scala:23)
        at scala.Function5$$anonfun$tupled$1.apply(Function5.scala:35)
        at scala.Function5$$anonfun$tupled$1.apply(Function5.scala:34)
        at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
        at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
        at sbt.std.Transform$$anon$4.work(System.scala:63)
        at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
        at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
        at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
        at sbt.Execute.work(Execute.scala:235)
        at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
        at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
        at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
        at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
    [error] (launcher/*:mimaPreviousClassfiles) sbt.ResolveException: unresolved dependency: org.apache.spark#spark-launcher_2.10;1.3.0: not found
    ```
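    If we need to unblock that check, one option (just a sketch; the key name is inferred from `mimaPreviousClassfiles` in the trace above, not taken from Spark's build) is to give brand-new modules an empty MiMa baseline so the plugin doesn't try to resolve a release that was never published:
    
    ```scala
    // Hypothetical sbt setting for the launcher module: it has no previously
    // released artifact, so point MiMa at no baseline instead of letting it
    // fail to resolve spark-launcher_2.10:1.3.0. The key name is an
    // assumption; newer sbt-mima-plugin releases spell this
    // `mimaPreviousArtifacts := Set.empty`.
    mimaPreviousArtifact := None
    ```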

