[https://issues.apache.org/jira/browse/SPARK-2549?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14068329#comment-14068329]
Prashant Sharma edited comment on SPARK-2549 at 7/21/14 9:16 AM:
-----------------------------------------------------------------
Ironically for this snippet:
{noformat}
class A {
  def b {
    def c = {
      println()
    }
  }
}
{noformat}
javap gave me:
{noformat}
Warning: Binary file A contains org.apache.spark.tools.A
Classfile /home/prashant/work/spark/tools/target/scala-2.10/classes/org/apache/spark/tools/A.class
Last modified 21 Jul, 2014; size 753 bytes
MD5 checksum 0fe728ed5816503deeaaafce412f6ffd
Compiled from "A.scala"
public class org.apache.spark.tools.A
SourceFile: "A.scala"
RuntimeVisibleAnnotations:
0: #6(#7=s#8)
ScalaSig: length = 0x3
05 00 00
minor version: 0
major version: 50
flags: ACC_PUBLIC, ACC_SUPER
Constant pool:
#1 = Utf8 org/apache/spark/tools/A
#2 = Class #1 // org/apache/spark/tools/A
#3 = Utf8 java/lang/Object
#4 = Class #3 // java/lang/Object
#5 = Utf8 A.scala
#6 = Utf8 Lscala/reflect/ScalaSignature;
#7 = Utf8 bytes
#8 = Utf8 u1A!\t\tI\t)Ao\8mg*QABgB
'o!\ta!9bG\",'\"A=xmaCAq!\"AM\r\1\nEq!AB!osJ+g\rCA#=S:LGO+AaA!)C3\t!-F!\ti1$\t!QK\5u
#9 = Utf8 b
#10 = Utf8 ()V
#11 = Utf8 this
#12 = Utf8 Lorg/apache/spark/tools/A;
#13 = Utf8 c$1
#14 = Utf8 scala/Predef$
#15 = Class #14 // scala/Predef$
#16 = Utf8 MODULE$
#17 = Utf8 Lscala/Predef$;
#18 = NameAndType #16:#17 // MODULE$:Lscala/Predef$;
#19 = Fieldref #15.#18 // scala/Predef$.MODULE$:Lscala/Predef$;
#20 = Utf8 println
#21 = NameAndType #20:#10 // println:()V
#22 = Methodref #15.#21 // scala/Predef$.println:()V
#23 = Utf8 <init>
#24 = NameAndType #23:#10 // "<init>":()V
#25 = Methodref #4.#24 // java/lang/Object."<init>":()V
#26 = Utf8 Code
#27 = Utf8 LocalVariableTable
#28 = Utf8 LineNumberTable
#29 = Utf8 SourceFile
#30 = Utf8 RuntimeVisibleAnnotations
#31 = Utf8 ScalaSig
{
public void b();
flags: ACC_PUBLIC
Code:
stack=0, locals=1, args_size=1
0: return
LocalVariableTable:
Start Length Slot Name Signature
0 1 0 this Lorg/apache/spark/tools/A;
LineNumberTable:
line 24: 0
private final void c$1();
flags: ACC_PRIVATE, ACC_FINAL
Code:
stack=1, locals=1, args_size=1
0: getstatic #19 // Field scala/Predef$.MODULE$:Lscala/Predef$;
3: invokevirtual #22 // Method scala/Predef$.println:()V
6: return
LocalVariableTable:
Start Length Slot Name Signature
0 7 0 this Lorg/apache/spark/tools/A;
LineNumberTable:
line 22: 0
public org.apache.spark.tools.A();
flags: ACC_PUBLIC
Code:
stack=1, locals=1, args_size=1
0: aload_0
1: invokespecial #25 // Method java/lang/Object."<init>":()V
4: return
LocalVariableTable:
Start Length Slot Name Signature
0 5 0 this Lorg/apache/spark/tools/A;
LineNumberTable:
line 19: 0
}
{noformat}
Here def c was compiled to c$1, and since it is private it should not bother our
MiMa checker (I hope).
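That private flag is the crux: a binary-compatibility checker only needs to care about members that external code can link against. A minimal sketch of that visibility test (a hypothetical helper for illustration, not MiMa's actual code), using the access flags shown in the dump above:

```scala
import java.lang.reflect.Modifier

object VisibilityCheck {
  // A member can affect binary compatibility only if outside code
  // can link against it, i.e. it is public or protected.
  def isCheckable(accessFlags: Int): Boolean =
    Modifier.isPublic(accessFlags) || Modifier.isProtected(accessFlags)

  def main(args: Array[String]): Unit = {
    val accPublic  = 0x0001 // b() above: flags: ACC_PUBLIC
    val accPrivate = 0x0002 // c$1 above: flags: ACC_PRIVATE, ACC_FINAL
    val accFinal   = 0x0010
    println(isCheckable(accPublic))             // prints true
    println(isCheckable(accPrivate | accFinal)) // prints false
  }
}
```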
However if the snippet is:
{noformat}
class A {
  def d(x: () => Int) = x()
  def b {
    def c() = 1
    d(c)
  }
}
{noformat}
javap then generates the name as:
{noformat}
public final int org$apache$spark$tools$A$$c$1();
{noformat}
So the point, I guess, is that whenever an inner function is passed to another
function that may live outside the enclosing class, the inner function has to
remain visible there, so it is promoted to public under a mangled name
containing $$.
I think excluding all functions with $$ in their name should therefore do it.
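If the mangled-name observation holds, the exclusion rule reduces to a substring test on the method name. A rough sketch of that predicate (assumed helper names for illustration; the actual change to Spark's MiMa tooling may look different):

```scala
object InnerFunctionExclude {
  // Local defs that must stay callable across class boundaries are
  // lifted to public methods whose mangled name embeds the owner,
  // e.g. org$apache$spark$tools$A$$c$1 -- the "$$" is the marker.
  def isLiftedInnerFunction(methodName: String): Boolean =
    methodName.contains("$$")

  def main(args: Array[String]): Unit = {
    println(isLiftedInnerFunction("org$apache$spark$tools$A$$c$1")) // prints true
    println(isLiftedInnerFunction("map"))                           // prints false
  }
}
```

A purely name-based filter can over-exclude (a user could, in principle, define a method with $$ in its name), but for compiler-generated members the marker appears reliable.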
> Functions defined inside of other functions trigger failures
> ------------------------------------------------------------
>
> Key: SPARK-2549
> URL: https://issues.apache.org/jira/browse/SPARK-2549
> Project: Spark
> Issue Type: Sub-task
> Components: Build
> Reporter: Patrick Wendell
> Assignee: Prashant Sharma
> Fix For: 1.1.0
>
>
> If we have a function declaration inside of another function, it still
> triggers MiMa failures. We should look at how that is implemented in byte
> code and just always exclude functions like that.
> {code}
> def a() = {
>   /* Changing b() should not trigger failures, but it does. */
>   def b() = {}
> }
> {code}
> I dug into the byte code for inner functions a bit more. I noticed that they
> tend to use `$$` before the function name.
> There is more information on that string sequence here:
> https://github.com/scala/scala/blob/2.10.x/src/reflect/scala/reflect/internal/StdNames.scala#L286
> I did a cursory look and it appears that symbol is mostly (exclusively?) used
> for anonymous or inner functions:
> {code}
> # in RDD package classes
> $ ls *.class | xargs -I {} javap {} |grep "\\$\\$"
> public final java.lang.Object org$apache$spark$rdd$PairRDDFunctions$$createZero$1(scala.reflect.ClassTag, byte[], scala.runtime.ObjectRef, scala.runtime.VolatileByteRef);
> public final java.lang.Object org$apache$spark$rdd$PairRDDFunctions$$createZero$2(byte[], scala.runtime.ObjectRef, scala.runtime.VolatileByteRef);
> public final scala.collection.Iterator org$apache$spark$rdd$PairRDDFunctions$$reducePartition$1(scala.collection.Iterator, scala.Function2);
> public final java.util.HashMap org$apache$spark$rdd$PairRDDFunctions$$mergeMaps$1(java.util.HashMap, java.util.HashMap, scala.Function2);
> ...
> public final class org.apache.spark.rdd.AsyncRDDActions$$anonfun$countAsync$1 extends scala.runtime.AbstractFunction0$mcJ$sp implements scala.Serializable {
> public org.apache.spark.rdd.AsyncRDDActions$$anonfun$countAsync$1(org.apache.spark.rdd.AsyncRDDActions<T>);
> public final class org.apache.spark.rdd.AsyncRDDActions$$anonfun$countAsync$2 extends scala.runtime.AbstractFunction2$mcVIJ$sp implements scala.Serializable {
> public org.apache.spark.rdd.AsyncRDDActions$$anonfun$countAsync$2(org.apache.spark.rdd.AsyncRDDActions<T>);
> {code}
--
This message was sent by Atlassian JIRA
(v6.2#6252)