Repository: spark
Updated Branches:
  refs/heads/branch-1.0 2fe6b183e -> bfb09c6b8
Use scala deprecation instead of java.

This gets rid of a warning when compiling core (since we were depending on a
deprecated interface with a non-deprecated function). I also tested with javac,
and this does the right thing when compiling java code.

Author: Michael Armbrust <[email protected]>

Closes #452 from marmbrus/scalaDeprecation and squashes the following commits:

f628b4d [Michael Armbrust] Use scala deprecation instead of java.

(cherry picked from commit 5d0f58b2eb8e48a95c4ab34bc89f7251d093f301)
Signed-off-by: Matei Zaharia <[email protected]>

Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/bfb09c6b
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/bfb09c6b
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/bfb09c6b

Branch: refs/heads/branch-1.0
Commit: bfb09c6b8358c96a01d01af5f5f608b75a7be3e6
Parents: 2fe6b18
Author: Michael Armbrust <[email protected]>
Authored: Sat Apr 19 15:06:04 2014 -0700
Committer: Matei Zaharia <[email protected]>
Committed: Sat Apr 19 15:06:18 2014 -0700

----------------------------------------------------------------------
 .../main/scala/org/apache/spark/api/java/JavaSparkContext.scala | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/bfb09c6b/core/src/main/scala/org/apache/spark/api/java/JavaSparkContext.scala
----------------------------------------------------------------------
diff --git a/core/src/main/scala/org/apache/spark/api/java/JavaSparkContext.scala b/core/src/main/scala/org/apache/spark/api/java/JavaSparkContext.scala
index cf30523..bda9272 100644
--- a/core/src/main/scala/org/apache/spark/api/java/JavaSparkContext.scala
+++ b/core/src/main/scala/org/apache/spark/api/java/JavaSparkContext.scala
@@ -114,7 +114,7 @@ class JavaSparkContext(val sc: SparkContext) extends JavaSparkContextVarargsWork
    * @deprecated As of Spark 1.0.0, defaultMinSplits is deprecated, use
    *             {@link #defaultMinPartitions()} instead
    */
-  @Deprecated
+  @deprecated("use defaultMinPartitions", "1.0.0")
   def defaultMinSplits: java.lang.Integer = sc.defaultMinSplits
 
   /** Default min number of partitions for Hadoop RDDs when not given by user */
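For context (not part of the commit): Scala's `@deprecated` annotation takes a message and a since-version, and scalac also emits the JVM-level `java.lang.Deprecated` marker into the bytecode, which is why the commit message can note that javac "does the right thing" when Java code calls the method. A minimal sketch below, using a hypothetical stand-in class rather than `JavaSparkContext` itself, shows that the Java-side marker is observable via reflection (`@Deprecated` has runtime retention):

```java
import java.lang.reflect.Method;

// Hypothetical stand-in for a Java-facing API like JavaSparkContext.
public class DeprecationDemo {

    /** @deprecated As of 1.0.0, use {@link #defaultMinPartitions()} instead. */
    @Deprecated
    public static int defaultMinSplits() {
        return defaultMinPartitions();
    }

    public static int defaultMinPartitions() {
        return 2;
    }

    public static void main(String[] args) throws Exception {
        // javac warns at call sites of @Deprecated methods; the marker is
        // also queryable at runtime because @Deprecated is RUNTIME-retained.
        Method m = DeprecationDemo.class.getMethod("defaultMinSplits");
        System.out.println(m.isAnnotationPresent(Deprecated.class)); // prints "true"
    }
}
```

The advantage of the Scala annotation over the Java one, per the commit message, is on the Scala side: scalac treats `@deprecated` specially when the caller is itself part of the deprecated API surface, so depending on the deprecated member from within core no longer produces a compile warning.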
