[ https://issues.apache.org/jira/browse/SPARK-16845?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15664866#comment-15664866 ]

Barry Becker commented on SPARK-16845:
--------------------------------------

I am encountering a similar exception in Spark 1.6.3 when applying ML to a
DataFrame with 204 columns.
The code used is from the spark-MDLP library, but I don't yet have a
reproducible case for you.
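In the meantime, a minimal sketch along these lines might exercise the same SpecificOrdering codegen path. This is hypothetical and untested against 1.6.3; the column count, local master, and column names are assumptions for illustration, not taken from the MDLP code:

```scala
// Hypothetical repro sketch (not from spark-MDLP): ordering a very wide
// DataFrame forces codegen of a single compare() over all columns, which
// is where the 64 KB JVM method-size limit can be hit.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.functions.{col, lit}

object WideOrderingRepro {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setMaster("local[*]").setAppName("wide-ordering-repro"))
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._

    val numCols = 400  // width reported in the original issue
    val base = sc.parallelize(1 to 100).toDF("c0")
    // Build a wide DataFrame one column at a time (slow but simple).
    val wide = (1 to numCols).foldLeft(base) { (df, i) =>
      df.withColumn(s"c$i", lit(i.toDouble))
    }
    // Sorting on every column generates one SpecificOrdering.compare()
    // spanning all columns; with enough columns its bytecode may exceed
    // 64 KB and raise the JaninoRuntimeException below.
    wide.orderBy(wide.columns.map(col): _*).count()
  }
}
```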
{code}
        at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.org$apache$spark$sql$catalyst$expressions$codegen$CodeGenerator$$doCompile(CodeGenerator.scala:602)
        at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:622)
        at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:619)
        at org.spark-project.guava.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3599)
        at org.spark-project.guava.cache.LocalCache$Segment.loadSync(LocalCache.java:2379)
        at org.spark-project.guava.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342)
        ... 39 more
Caused by: org.codehaus.janino.JaninoRuntimeException: Code of method "(Lorg/apache/spark/sql/catalyst/InternalRow;Lorg/apache/spark/sql/catalyst/InternalRow;)I" of class "org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificOrdering" grows beyond 64 KB
        at org.codehaus.janino.CodeContext.makeSpace(CodeContext.java:941)
        at org.codehaus.janino.CodeContext.write(CodeContext.java:836)
        at org.codehaus.janino.UnitCompiler.writeByte(UnitCompiler.java:10235)
        at org.codehaus.janino.UnitCompiler.invoke(UnitCompiler.java:10048)
        at org.codehaus.janino.UnitCompiler.compileGet2(UnitCompiler.java:3986)
        at org.codehaus.janino.UnitCompiler.access$6900(UnitCompiler.java:185)
        at org.codehaus.janino.UnitCompiler$10.visitMethodInvocation(UnitCompiler.java:3263)
        at org.codehaus.janino.Java$MethodInvocation.accept(Java.java:3974)
        at org.codehaus.janino.UnitCompiler.compileGet(UnitCompiler.java:3290)
        at org.codehaus.janino.UnitCompiler.compileGet2(UnitCompiler.java:3868)
        at org.codehaus.janino.UnitCompiler.access$8600(UnitCompiler.java:185)
        at org.codehaus.janino.UnitCompiler$10.visitParenthesizedExpression(UnitCompiler.java:3286)
        at org.codehaus.janino.Java$ParenthesizedExpression.accept(Java.java:3830)
        at org.codehaus.janino.UnitCompiler.compileGet(UnitCompiler.java:3290)
        at org.codehaus.janino.UnitCompiler.compileGetValue(UnitCompiler.java:4368)
        at org.codehaus.janino.UnitCompiler.compileGet2(UnitCompiler.java:3571)
        at org.codehaus.janino.UnitCompiler.access$6600(UnitCompiler.java:185)
        at org.codehaus.janino.UnitCompiler$10.visitConditionalExpression(UnitCompiler.java:3260)
{code}

> org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificOrdering" 
> grows beyond 64 KB
> ---------------------------------------------------------------------------------------------
>
>                 Key: SPARK-16845
>                 URL: https://issues.apache.org/jira/browse/SPARK-16845
>             Project: Spark
>          Issue Type: Bug
>          Components: Java API, ML, MLlib
>    Affects Versions: 2.0.0
>            Reporter: hejie
>         Attachments: error.txt.zip
>
>
> I have a wide table (400 columns). When I try fitting the train data on all
> columns, the fatal error occurs.
>       ... 46 more
> Caused by: org.codehaus.janino.JaninoRuntimeException: Code of method "(Lorg/apache/spark/sql/catalyst/InternalRow;Lorg/apache/spark/sql/catalyst/InternalRow;)I" of class "org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificOrdering" grows beyond 64 KB
>       at org.codehaus.janino.CodeContext.makeSpace(CodeContext.java:941)
>       at org.codehaus.janino.CodeContext.write(CodeContext.java:854)



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
