Github user retronym commented on the issue:
https://github.com/apache/spark/pull/19675
Another, semi-nuclear option is to use an unsupported `LambdaMetafactory` system property to dump the generated class files to disk.
```
⚡ mkdir /tmp/dump
~/code/compiler-benchmark/target/profile-basic on master*
⚡ scala -J-Djdk.internal.lambda.dumpProxyClasses=/tmp/dump
Welcome to Scala 2.12.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_162).
Type in expressions for evaluation. Or try :help.
scala> :paste -raw
// Entering paste mode (ctrl-D to finish)
package p1
class C { val x = "foo"; def test1 = () => this.x; def test2 = () => C.this; def test3 = () => "" }
// Exiting paste mode, now interpreting.
scala> new p1.C().test1
res1: () => String = p1.C$$Lambda$1071/1940783703@6fc1020a
scala> :quit
⚡ find /tmp/dump/p1
/tmp/dump/p1
/tmp/dump/p1/C$$Lambda$1071.class
```
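The dumped class can then be inspected with standard JDK tools, e.g. disassembled with `javap` to confirm what the lambda actually captured; note that the `$1071` counter in the class name is assigned at runtime and will differ between runs, so substitute whatever file `find` reports. The captured values (here, the enclosing `C` instance for `test1`/`test2`) should appear as fields of the proxy class.
```
⚡ javap -p -c '/tmp/dump/p1/C$$Lambda$1071.class'
```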