[ 
https://issues.apache.org/jira/browse/SPARK-15651?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Stavros Kontopoulos updated SPARK-15651:
----------------------------------------
    Description: 
https://ci.typesafe.com/job/ghprb-spark-multi-conf/label=mesos-spark-docker,scala_version=2.10/186/console

[info] spark-core: found 1 potential binary incompatibilities (filtered 737)
[error]  * class org.apache.spark.rdd.SqlNewHadoopRDD#NewHadoopMapPartitionsWithSplitRDD does not have a correspondent in new version
[error]    filter with: ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.rdd.SqlNewHadoopRDD$NewHadoopMapPartitionsWithSplitRDD")

[info] spark-sql: found 2 potential binary incompatibilities (filtered 371)
[error]  * method leafFiles()scala.collection.mutable.Map in class org.apache.spark.sql.sources.HadoopFsRelation#FileStatusCache has now a different result type; was: scala.collection.mutable.Map, is now: scala.collection.mutable.LinkedHashMap
[error]    filter with: ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.sql.sources.HadoopFsRelation#FileStatusCache.leafFiles")
[error]  * method leafFiles_=(scala.collection.mutable.Map)Unit in class org.apache.spark.sql.sources.HadoopFsRelation#FileStatusCache's type has changed; was (scala.collection.mutable.Map)Unit, is now: (scala.collection.mutable.LinkedHashMap)Unit
[error]    filter with: ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.sql.sources.HadoopFsRelation#FileStatusCache.leafFiles_=")

I am curious why I get this error. Are the changes between 1.5 and 1.6 in those classes normally excluded from the MiMa check?
I just ran the run_test script locally and checked the contents of these files in the current directory:
.generated-mima-class-excludes
.generated-mima-member-excludes

They seem valid compared to a local run on my PC.
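For reference, the filters that MiMa suggests in the output above would normally be registered in the build's exclusion list (Spark keeps such entries in project/MimaExcludes.scala). A minimal sketch of what that registration could look like, using the standard sbt-mima-plugin ProblemFilters API, with the entries copied verbatim from the log; this is an illustration of the mechanism, not the actual fix that was applied:

```scala
// Sketch: registering the MiMa-suggested exclusions in the build definition.
// Requires the sbt-mima-plugin classes on the build classpath.
import com.typesafe.tools.mima.core._

object SuggestedMimaExcludes {
  // Each entry silences one of the reported incompatibilities.
  val excludes: Seq[ProblemFilter] = Seq(
    ProblemFilters.exclude[MissingClassProblem](
      "org.apache.spark.rdd.SqlNewHadoopRDD$NewHadoopMapPartitionsWithSplitRDD"),
    ProblemFilters.exclude[IncompatibleResultTypeProblem](
      "org.apache.spark.sql.sources.HadoopFsRelation#FileStatusCache.leafFiles"),
    ProblemFilters.exclude[IncompatibleMethTypeProblem](
      "org.apache.spark.sql.sources.HadoopFsRelation#FileStatusCache.leafFiles_=")
  )
}
```

Note that these hand-written build filters are separate from the auto-generated .generated-mima-class-excludes / .generated-mima-member-excludes files mentioned below, which is relevant to the question of why the classes were not excluded.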



> mima seems to fail for some excluded classes
> --------------------------------------------
>
>                 Key: SPARK-15651
>                 URL: https://issues.apache.org/jira/browse/SPARK-15651
>             Project: Spark
>          Issue Type: Bug
>          Components: Build
>    Affects Versions: 1.6.0
>            Reporter: Stavros Kontopoulos
>



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
