GitHub user yhuai commented on the pull request:

    https://github.com/apache/spark/pull/11443#issuecomment-195009022
  
    ```
    [info] spark-mllib: found 3 potential binary incompatibilities (filtered 
184)
    [error]  * method this(org.apache.spark.sql.DataFrame)Unit in class 
org.apache.spark.mllib.evaluation.MultilabelMetrics does not have a 
correspondent with same parameter signature among 
(org.apache.spark.rdd.RDD)Unit, (org.apache.spark.sql.Dataset)Unit
    [error]    filter with: 
ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.mllib.evaluation.MultilabelMetrics.this")
    [error]  * method predictions()org.apache.spark.sql.DataFrame in interface 
org.apache.spark.ml.classification.LogisticRegressionSummary has now a 
different result type; was: org.apache.spark.sql.DataFrame, is now: 
org.apache.spark.sql.Dataset
    [error]    filter with: 
ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.ml.classification.LogisticRegressionSummary.predictions")
    [error]  * abstract method predictions()org.apache.spark.sql.Dataset in 
interface org.apache.spark.ml.classification.LogisticRegressionSummary does not 
have a correspondent in old version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.ml.classification.LogisticRegressionSummary.predictions")
    [info] spark-sql: found 171 potential binary incompatibilities (filtered 
801)
    [error]  * class org.apache.spark.sql.DataFrame is declared final in new 
version
    [error]    filter with: 
ProblemFilters.exclude[FinalClassProblem]("org.apache.spark.sql.DataFrame")
    [error]  * deprecated method toSchemaRDD()org.apache.spark.sql.DataFrame in 
class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.toSchemaRDD")
    [error]  * method 
selectExpr(scala.collection.Seq)org.apache.spark.sql.DataFrame in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.selectExpr")
    [error]  * method 
selectExpr(Array[java.lang.String])org.apache.spark.sql.DataFrame in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.selectExpr")
    [error]  * method limit(Int)org.apache.spark.sql.DataFrame in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.limit")
    [error]  * method 
queryExecution()org.apache.spark.sql.execution.QueryExecution in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.queryExecution")
    [error]  * method 
sortWithinPartitions(scala.collection.Seq)org.apache.spark.sql.DataFrame in 
class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.sortWithinPartitions")
    [error]  * method 
sortWithinPartitions(java.lang.String,scala.collection.Seq)org.apache.spark.sql.DataFrame
 in class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.sortWithinPartitions")
    [error]  * method 
sortWithinPartitions(Array[org.apache.spark.sql.Column])org.apache.spark.sql.DataFrame
 in class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.sortWithinPartitions")
    [error]  * method 
sortWithinPartitions(java.lang.String,Array[java.lang.String])org.apache.spark.sql.DataFrame
 in class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.sortWithinPartitions")
    [error]  * method count()Long in class org.apache.spark.sql.DataFrame does 
not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.count")
    [error]  * method dtypes()Array[scala.Tuple2] in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.dtypes")
    [error]  * method 
flatMap(scala.Function1,scala.reflect.ClassTag)org.apache.spark.rdd.RDD in 
class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.flatMap")
    [error]  * method filter(java.lang.String)org.apache.spark.sql.DataFrame in 
class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.filter")
    [error]  * method 
filter(org.apache.spark.sql.Column)org.apache.spark.sql.DataFrame in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.filter")
    [error]  * method unpersist()org.apache.spark.sql.DataFrame in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.unpersist")
    [error]  * method unpersist(Boolean)org.apache.spark.sql.DataFrame in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.unpersist")
    [error]  * method numericColumns()scala.collection.Seq in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.numericColumns")
    [error]  * method toJavaRDD()org.apache.spark.api.java.JavaRDD in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.toJavaRDD")
    [error]  * method toDF(scala.collection.Seq)org.apache.spark.sql.DataFrame 
in class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.toDF")
    [error]  * method toDF()org.apache.spark.sql.DataFrame in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.toDF")
    [error]  * method 
toDF(Array[java.lang.String])org.apache.spark.sql.DataFrame in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.toDF")
    [error]  * method distinct()org.apache.spark.sql.DataFrame in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.distinct")
    [error]  * synthetic method showString$default$2()Boolean in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.showString$default$2")
    [error]  * method transform(scala.Function1)org.apache.spark.sql.DataFrame 
in class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.transform")
    [error]  * method sample(Boolean,Double)org.apache.spark.sql.DataFrame in 
class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.sample")
    [error]  * method sample(Boolean,Double,Long)org.apache.spark.sql.DataFrame 
in class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.sample")
    [error]  * method na()org.apache.spark.sql.DataFrameNaFunctions in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.na")
    [error]  * method collect()Array[org.apache.spark.sql.Row] in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.collect")
    [error]  * method take(Int)Array[org.apache.spark.sql.Row] in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.take")
    [error]  * synthetic method 
org$apache$spark$sql$DataFrame$$execute$1()Array[org.apache.spark.sql.Row] in 
class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.org$apache$spark$sql$DataFrame$$execute$1")
    [error]  * method alias(scala.Symbol)org.apache.spark.sql.DataFrame in 
class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.alias")
    [error]  * method alias(java.lang.String)org.apache.spark.sql.DataFrame in 
class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.alias")
    [error]  * method 
describe(scala.collection.Seq)org.apache.spark.sql.DataFrame in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.describe")
    [error]  * method 
describe(Array[java.lang.String])org.apache.spark.sql.DataFrame in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.describe")
    [error]  * method 
randomSplit(scala.collection.immutable.List,Long)Array[org.apache.spark.sql.DataFrame]
 in class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.randomSplit")
    [error]  * method 
randomSplit(Array[Double])Array[org.apache.spark.sql.DataFrame] in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.randomSplit")
    [error]  * method 
randomSplit(Array[Double],Long)Array[org.apache.spark.sql.DataFrame] in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.randomSplit")
    [error]  * method rdd()org.apache.spark.rdd.RDD in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.rdd")
    [error]  * method isLocal()Boolean in class org.apache.spark.sql.DataFrame 
does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.isLocal")
    [error]  * method columns()Array[java.lang.String] in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.columns")
    [error]  * deprecated method insertInto(java.lang.String)Unit in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.insertInto")
    [error]  * deprecated method insertInto(java.lang.String,Boolean)Unit in 
class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.insertInto")
    [error]  * method 
join(org.apache.spark.sql.DataFrame,org.apache.spark.sql.Column,java.lang.String)org.apache.spark.sql.DataFrame
 in class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.join")
    [error]  * method 
join(org.apache.spark.sql.DataFrame,org.apache.spark.sql.Column)org.apache.spark.sql.DataFrame
 in class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.join")
    [error]  * method 
join(org.apache.spark.sql.DataFrame,scala.collection.Seq,java.lang.String)org.apache.spark.sql.DataFrame
 in class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.join")
    [error]  * method 
join(org.apache.spark.sql.DataFrame,scala.collection.Seq)org.apache.spark.sql.DataFrame
 in class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.join")
    [error]  * method 
join(org.apache.spark.sql.DataFrame,java.lang.String)org.apache.spark.sql.DataFrame
 in class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.join")
    [error]  * method 
join(org.apache.spark.sql.DataFrame)org.apache.spark.sql.DataFrame in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.join")
    [error]  * method 
persist(org.apache.spark.storage.StorageLevel)org.apache.spark.sql.DataFrame in 
class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.persist")
    [error]  * method persist()org.apache.spark.sql.DataFrame in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.persist")
    [error]  * method cache()org.apache.spark.sql.DataFrame in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.cache")
    [error]  * method 
cube(java.lang.String,scala.collection.Seq)org.apache.spark.sql.GroupedData in 
class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.cube")
    [error]  * method 
cube(scala.collection.Seq)org.apache.spark.sql.GroupedData in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.cube")
    [error]  * method 
cube(java.lang.String,Array[java.lang.String])org.apache.spark.sql.GroupedData 
in class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.cube")
    [error]  * method 
cube(Array[org.apache.spark.sql.Column])org.apache.spark.sql.GroupedData in 
class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.cube")
    [error]  * deprecated method saveAsParquetFile(java.lang.String)Unit in 
class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.saveAsParquetFile")
    [error]  * method 
drop(org.apache.spark.sql.Column)org.apache.spark.sql.DataFrame in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.drop")
    [error]  * method drop(java.lang.String)org.apache.spark.sql.DataFrame in 
class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.drop")
    [error]  * method sort(scala.collection.Seq)org.apache.spark.sql.DataFrame 
in class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.sort")
    [error]  * method 
sort(java.lang.String,scala.collection.Seq)org.apache.spark.sql.DataFrame in 
class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.sort")
    [error]  * method 
sort(Array[org.apache.spark.sql.Column])org.apache.spark.sql.DataFrame in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.sort")
    [error]  * method 
sort(java.lang.String,Array[java.lang.String])org.apache.spark.sql.DataFrame in 
class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.sort")
    [error]  * method 
groupBy(java.lang.String,scala.collection.Seq)org.apache.spark.sql.GroupedData 
in class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.groupBy")
    [error]  * method 
groupBy(scala.collection.Seq)org.apache.spark.sql.GroupedData in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.groupBy")
    [error]  * method 
groupBy(java.lang.String,Array[java.lang.String])org.apache.spark.sql.GroupedData
 in class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.groupBy")
    [error]  * method 
groupBy(Array[org.apache.spark.sql.Column])org.apache.spark.sql.GroupedData in 
class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.groupBy")
    [error]  * method 
rollup(java.lang.String,scala.collection.Seq)org.apache.spark.sql.GroupedData 
in class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.rollup")
    [error]  * method 
rollup(scala.collection.Seq)org.apache.spark.sql.GroupedData in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.rollup")
    [error]  * method 
rollup(java.lang.String,Array[java.lang.String])org.apache.spark.sql.GroupedData
 in class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.rollup")
    [error]  * method 
rollup(Array[org.apache.spark.sql.Column])org.apache.spark.sql.GroupedData in 
class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.rollup")
    [error]  * deprecated method 
saveAsTable(java.lang.String,java.lang.String,org.apache.spark.sql.SaveMode,scala.collection.immutable.Map)Unit
 in class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.saveAsTable")
    [error]  * deprecated method 
saveAsTable(java.lang.String,java.lang.String,org.apache.spark.sql.SaveMode,java.util.Map)Unit
 in class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.saveAsTable")
    [error]  * deprecated method 
saveAsTable(java.lang.String,java.lang.String,org.apache.spark.sql.SaveMode)Unit
 in class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.saveAsTable")
    [error]  * deprecated method 
saveAsTable(java.lang.String,java.lang.String)Unit in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.saveAsTable")
    [error]  * deprecated method 
saveAsTable(java.lang.String,org.apache.spark.sql.SaveMode)Unit in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.saveAsTable")
    [error]  * deprecated method saveAsTable(java.lang.String)Unit in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.saveAsTable")
    [error]  * method coalesce(Int)org.apache.spark.sql.DataFrame in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.coalesce")
    [error]  * method showString(Int,Boolean)java.lang.String in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.showString")
    [error]  * method 
logicalPlan()org.apache.spark.sql.catalyst.plans.logical.LogicalPlan in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.logicalPlan")
    [error]  * method sqlContext()org.apache.spark.sql.SQLContext in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.sqlContext")
    [error]  * method explain()Unit in class org.apache.spark.sql.DataFrame 
does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.explain")
    [error]  * method explain(Boolean)Unit in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.explain")
    [error]  * method 
intersect(org.apache.spark.sql.DataFrame)org.apache.spark.sql.DataFrame in 
class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.intersect")
    [error]  * method withNewExecutionId(scala.Function0)java.lang.Object in 
class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.withNewExecutionId")
    [error]  * method col(java.lang.String)org.apache.spark.sql.Column in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.col")
    [error]  * method 
withColumn(java.lang.String,org.apache.spark.sql.Column,org.apache.spark.sql.types.Metadata)org.apache.spark.sql.DataFrame
 in class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.withColumn")
    [error]  * method 
withColumn(java.lang.String,org.apache.spark.sql.Column)org.apache.spark.sql.DataFrame
 in class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.withColumn")
    [error]  * synthetic method 
org$apache$spark$sql$DataFrame$$collect(Boolean)Array[org.apache.spark.sql.Row] 
in class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.org$apache$spark$sql$DataFrame$$collect")
    [error]  * method 
unionAll(org.apache.spark.sql.DataFrame)org.apache.spark.sql.DataFrame in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.unionAll")
    [error]  * method 
withColumnRenamed(java.lang.String,java.lang.String)org.apache.spark.sql.DataFrame
 in class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.withColumnRenamed")
    [error]  * deprecated method 
insertIntoJDBC(java.lang.String,java.lang.String,Boolean)Unit in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.insertIntoJDBC")
    [error]  * method write()org.apache.spark.sql.DataFrameWriter in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.write")
    [error]  * method 
select(java.lang.String,scala.collection.Seq)org.apache.spark.sql.DataFrame in 
class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.select")
    [error]  * method 
select(scala.collection.Seq)org.apache.spark.sql.DataFrame in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.select")
    [error]  * method 
select(java.lang.String,Array[java.lang.String])org.apache.spark.sql.DataFrame 
in class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.select")
    [error]  * method 
select(Array[org.apache.spark.sql.Column])org.apache.spark.sql.DataFrame in 
class org.apache.spark.sql.DataFrame does not have a correspondent in new 
version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.select")
    [error]  * method collectAsList()java.util.List in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.collectAsList")
    [error]  * method registerTempTable(java.lang.String)Unit in class 
org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: 
ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.registerTempTable")
    [error]  * method collectToPython()Int in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.collectToPython")
    [error]  * method orderBy(scala.collection.Seq)org.apache.spark.sql.DataFrame in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.orderBy")
    [error]  * method orderBy(java.lang.String,scala.collection.Seq)org.apache.spark.sql.DataFrame in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.orderBy")
    [error]  * method orderBy(Array[org.apache.spark.sql.Column])org.apache.spark.sql.DataFrame in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.orderBy")
    [error]  * method orderBy(java.lang.String,Array[java.lang.String])org.apache.spark.sql.DataFrame in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.orderBy")
    [error]  * method explode(java.lang.String,java.lang.String,scala.Function1,scala.reflect.api.TypeTags#TypeTag)org.apache.spark.sql.DataFrame in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.explode")
    [error]  * method explode(scala.collection.Seq,scala.Function1,scala.reflect.api.TypeTags#TypeTag)org.apache.spark.sql.DataFrame in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.explode")
    [error]  * synthetic method org$apache$spark$sql$DataFrame$$withPlan(scala.Function0)org.apache.spark.sql.DataFrame in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.org$apache$spark$sql$DataFrame$$withPlan")
    [error]  * method javaRDD()org.apache.spark.api.java.JavaRDD in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.javaRDD")
    [error]  * method mapPartitions(scala.Function1,scala.reflect.ClassTag)org.apache.spark.rdd.RDD in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.mapPartitions")
    [error]  * method takeAsList(Int)java.util.List in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.takeAsList")
    [error]  * method foreachPartition(scala.Function1)Unit in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.foreachPartition")
    [error]  * method apply(java.lang.String)org.apache.spark.sql.Column in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.apply")
    [error]  * method where(java.lang.String)org.apache.spark.sql.DataFrame in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.where")
    [error]  * method where(org.apache.spark.sql.Column)org.apache.spark.sql.DataFrame in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.where")
    [error]  * deprecated method save(java.lang.String,org.apache.spark.sql.SaveMode,scala.collection.immutable.Map)Unit in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.save")
    [error]  * deprecated method save(java.lang.String,org.apache.spark.sql.SaveMode,java.util.Map)Unit in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.save")
    [error]  * deprecated method save(java.lang.String,java.lang.String,org.apache.spark.sql.SaveMode)Unit in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.save")
    [error]  * deprecated method save(java.lang.String,java.lang.String)Unit in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.save")
    [error]  * deprecated method save(java.lang.String,org.apache.spark.sql.SaveMode)Unit in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.save")
    [error]  * deprecated method save(java.lang.String)Unit in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.save")
    [error]  * synthetic method org$apache$spark$sql$DataFrame$$rowFunction$1(org.apache.spark.sql.Row,scala.Function1,org.apache.spark.sql.types.DataType)scala.collection.TraversableOnce in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.org$apache$spark$sql$DataFrame$$rowFunction$1")
    [error]  * deprecated method createJDBCTable(java.lang.String,java.lang.String,Boolean)Unit in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.createJDBCTable")
    [error]  * method repartition(scala.collection.Seq)org.apache.spark.sql.DataFrame in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.repartition")
    [error]  * method repartition(Int,scala.collection.Seq)org.apache.spark.sql.DataFrame in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.repartition")
    [error]  * method repartition(Int)org.apache.spark.sql.DataFrame in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.repartition")
    [error]  * method repartition(Array[org.apache.spark.sql.Column])org.apache.spark.sql.DataFrame in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.repartition")
    [error]  * method repartition(Int,Array[org.apache.spark.sql.Column])org.apache.spark.sql.DataFrame in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.repartition")
    [error]  * method dropDuplicates(Array[java.lang.String])org.apache.spark.sql.DataFrame in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.dropDuplicates")
    [error]  * method dropDuplicates(scala.collection.Seq)org.apache.spark.sql.DataFrame in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.dropDuplicates")
    [error]  * method dropDuplicates()org.apache.spark.sql.DataFrame in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.dropDuplicates")
    [error]  * method javaToPython()org.apache.spark.api.java.JavaRDD in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.javaToPython")
    [error]  * method agg(org.apache.spark.sql.Column,scala.collection.Seq)org.apache.spark.sql.DataFrame in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.agg")
    [error]  * method agg(java.util.Map)org.apache.spark.sql.DataFrame in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.agg")
    [error]  * method agg(scala.collection.immutable.Map)org.apache.spark.sql.DataFrame in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.agg")
    [error]  * method agg(scala.Tuple2,scala.collection.Seq)org.apache.spark.sql.DataFrame in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.agg")
    [error]  * method agg(org.apache.spark.sql.Column,Array[org.apache.spark.sql.Column])org.apache.spark.sql.DataFrame in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.agg")
    [error]  * method show(Int,Boolean)Unit in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.show")
    [error]  * method show(Boolean)Unit in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.show")
    [error]  * method show()Unit in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.show")
    [error]  * method show(Int)Unit in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.show")
    [error]  * method except(org.apache.spark.sql.DataFrame)org.apache.spark.sql.DataFrame in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.except")
    [error]  * method schema()org.apache.spark.sql.types.StructType in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.schema")
    [error]  * method toJSON()org.apache.spark.rdd.RDD in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.toJSON")
    [error]  * method foreach(scala.Function1)Unit in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.foreach")
    [error]  * method resolve(java.lang.String)org.apache.spark.sql.catalyst.expressions.NamedExpression in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.resolve")
    [error]  * method toString()java.lang.String in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.toString")
    [error]  * method first()org.apache.spark.sql.Row in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.first")
    [error]  * method this(org.apache.spark.sql.SQLContext,org.apache.spark.sql.catalyst.plans.logical.LogicalPlan)Unit in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.this")
    [error]  * method this(org.apache.spark.sql.SQLContext,org.apache.spark.sql.execution.QueryExecution)Unit in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.this")
    [error]  * method inputFiles()Array[java.lang.String] in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.inputFiles")
    [error]  * method map(scala.Function1,scala.reflect.ClassTag)org.apache.spark.rdd.RDD in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.map")
    [error]  * method as(scala.Symbol)org.apache.spark.sql.DataFrame in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.as")
    [error]  * method as(java.lang.String)org.apache.spark.sql.DataFrame in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.as")
    [error]  * method as(org.apache.spark.sql.Encoder)org.apache.spark.sql.Dataset in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.as")
    [error]  * method head()org.apache.spark.sql.Row in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.head")
    [error]  * method head(Int)Array[org.apache.spark.sql.Row] in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.head")
    [error]  * method stat()org.apache.spark.sql.DataFrameStatFunctions in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.stat")
    [error]  * method printSchema()Unit in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
    [error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.printSchema")
    [error]  * method tables(java.lang.String)org.apache.spark.sql.DataFrame in class org.apache.spark.sql.SQLContext has now a different result type; was: org.apache.spark.sql.DataFrame, is now: org.apache.spark.sql.Dataset
    [error]    filter with: ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.sql.SQLContext.tables")
    [error]  * method tables()org.apache.spark.sql.DataFrame in class org.apache.spark.sql.SQLContext has now a different result type; was: org.apache.spark.sql.DataFrame, is now: org.apache.spark.sql.Dataset
    [error]    filter with: ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.sql.SQLContext.tables")
    [error]  * method sql(java.lang.String)org.apache.spark.sql.DataFrame in class org.apache.spark.sql.SQLContext has now a different result type; was: org.apache.spark.sql.DataFrame, is now: org.apache.spark.sql.Dataset
    [error]    filter with: ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.sql.SQLContext.sql")
    [error]  * method baseRelationToDataFrame(org.apache.spark.sql.sources.BaseRelation)org.apache.spark.sql.DataFrame in class org.apache.spark.sql.SQLContext has now a different result type; was: org.apache.spark.sql.DataFrame, is now: org.apache.spark.sql.Dataset
    [error]    filter with: ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.sql.SQLContext.baseRelationToDataFrame")
    [error]  * method table(java.lang.String)org.apache.spark.sql.DataFrame in class org.apache.spark.sql.SQLContext has now a different result type; was: org.apache.spark.sql.DataFrame, is now: org.apache.spark.sql.Dataset
    [error]    filter with: ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.sql.SQLContext.table")
    [error]  * method apply(org.apache.spark.sql.DataFrame)org.apache.spark.sql.DataFrameHolder in object org.apache.spark.sql.DataFrameHolder does not have a correspondent with same parameter signature among (java.lang.Object)java.lang.Object, (org.apache.spark.sql.Dataset)org.apache.spark.sql.DataFrameHolder
    [error]    filter with: ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.sql.DataFrameHolder.apply")
    [error]  * method toDF(scala.collection.Seq)org.apache.spark.sql.DataFrame in class org.apache.spark.sql.DataFrameHolder has now a different result type; was: org.apache.spark.sql.DataFrame, is now: org.apache.spark.sql.Dataset
    [error]    filter with: ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.sql.DataFrameHolder.toDF")
    [error]  * method toDF()org.apache.spark.sql.DataFrame in class org.apache.spark.sql.DataFrameHolder has now a different result type; was: org.apache.spark.sql.DataFrame, is now: org.apache.spark.sql.Dataset
    [error]    filter with: ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.sql.DataFrameHolder.toDF")
    [error]  * method copy(org.apache.spark.sql.DataFrame)org.apache.spark.sql.DataFrameHolder in class org.apache.spark.sql.DataFrameHolder's type has changed; was (org.apache.spark.sql.DataFrame)org.apache.spark.sql.DataFrameHolder, is now: (org.apache.spark.sql.Dataset)org.apache.spark.sql.DataFrameHolder
    [error]    filter with: ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.sql.DataFrameHolder.copy")
    [error]  * synthetic method copy$default$1()org.apache.spark.sql.DataFrame in class org.apache.spark.sql.DataFrameHolder has now a different result type; was: org.apache.spark.sql.DataFrame, is now: org.apache.spark.sql.Dataset
    [error]    filter with: ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.sql.DataFrameHolder.copy$default$1")
    [error]  * synthetic method df$1()org.apache.spark.sql.DataFrame in class org.apache.spark.sql.DataFrameHolder has now a different result type; was: org.apache.spark.sql.DataFrame, is now: org.apache.spark.sql.Dataset
    [error]    filter with: ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.sql.DataFrameHolder.df$1")
    [error]  * method this(org.apache.spark.sql.DataFrame)Unit in class org.apache.spark.sql.DataFrameHolder's type has changed; was (org.apache.spark.sql.DataFrame)Unit, is now: (org.apache.spark.sql.Dataset)Unit
    [error]    filter with: ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.sql.DataFrameHolder.this")
    [error]  * method apply(org.apache.spark.sql.SQLContext,org.apache.spark.sql.catalyst.plans.logical.LogicalPlan)org.apache.spark.sql.DataFrame in object org.apache.spark.sql.DataFrame has now a different result type; was: org.apache.spark.sql.DataFrame, is now: org.apache.spark.sql.Dataset
    [error]    filter with: ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.sql.DataFrame.apply")
    ```
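    For context, each suggested `ProblemFilters.exclude[...]` line is meant to be appended to the build's MiMa exclusion list (in Spark that list lives in `project/MimaExcludes.scala`; the object and val names in this sketch are illustrative, not the file's actual contents):

    ```scala
    // Sketch only: how the filters reported above would be registered.
    // Requires the mima-core library on the sbt build classpath.
    import com.typesafe.tools.mima.core._

    object MimaExcludes {
      // DataFrame became a type alias for Dataset[Row], so every public
      // signature mentioning DataFrame changes at the bytecode level and
      // each reported problem needs an explicit exclusion, for example:
      val excludes = Seq(
        ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.show"),
        ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.sql.SQLContext.sql"),
        ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.sql.DataFrameHolder.this")
      )
    }
    ```

    The filter strings are copied verbatim from the log; `MissingMethodProblem` covers removed signatures, while `IncompatibleResultTypeProblem` and `IncompatibleMethTypeProblem` cover the DataFrame-to-Dataset type changes.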

