[ 
https://issues.apache.org/jira/browse/SPARK-16845?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15614100#comment-15614100
 ] 

Don Drake commented on SPARK-16845:
-----------------------------------

I'm struggling to create a simple reproducible test case. 

I'm curious, though: if I compile my .jar with sbt against Spark 2.0.1 but run 
it (via spark-submit) on your compiled Spark 2.1.0-SNAPSHOT branch, would you 
expect it to work? 

When I use your compiled branch of Spark 2.1.0-SNAPSHOT and run a spark-shell, 
the test cases provided in this JIRA pass, but my code still fails.

Also, the compiler error says a single generated method "grows beyond 64k", 
while the total generated output is over 400k of source code. I'll try to 
attach the exact error message and the generated Java code.
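For reference, a minimal sketch of the kind of wide-column sort that tends to trigger generation of a large SpecificOrdering compare method (this is a hypothetical illustration, not the failing job; it assumes a Spark 2.x build on the classpath, and all names are made up):

```scala
// Hypothetical sketch: sorting on many columns makes Spark generate one
// SpecificOrdering.compare() method covering every sort key; with enough
// columns that single method can exceed the JVM's 64 KB bytecode limit.
// Assumes a Spark 2.x build on the classpath; column names are invented.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

object WideSortSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("WideSortSketch")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Build a single-row DataFrame with ~400 numeric columns.
    val numCols = 400
    val df = Seq((1 to numCols).map(_.toDouble)).toDF("arr")
      .select((0 until numCols).map(i => col("arr")(i).as(s"c$i")): _*)

    // Ordering by all columns forces codegen of one large compare() method.
    df.orderBy((0 until numCols).map(i => col(s"c$i")): _*).count()

    spark.stop()
  }
}
```

Whether this exact snippet reproduces the failure likely depends on column count and types; the quoted report involves 400 columns.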


> org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificOrdering" 
> grows beyond 64 KB
> ---------------------------------------------------------------------------------------------
>
>                 Key: SPARK-16845
>                 URL: https://issues.apache.org/jira/browse/SPARK-16845
>             Project: Spark
>          Issue Type: Bug
>          Components: Java API, ML, MLlib
>    Affects Versions: 2.0.0
>            Reporter: hejie
>
> I have a wide table (400 columns); when I try fitting the training data on 
> all columns, the fatal error occurs. 
>       ... 46 more
> Caused by: org.codehaus.janino.JaninoRuntimeException: Code of method 
> "(Lorg/apache/spark/sql/catalyst/InternalRow;Lorg/apache/spark/sql/catalyst/InternalRow;)I"
>  of class 
> "org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificOrdering" 
> grows beyond 64 KB
>       at org.codehaus.janino.CodeContext.makeSpace(CodeContext.java:941)
>       at org.codehaus.janino.CodeContext.write(CodeContext.java:854)



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
