[
https://issues.apache.org/jira/browse/SPARK-22226?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16197111#comment-16197111
]
Marco Gaido commented on SPARK-22226:
-------------------------------------
I am not sure what the currently open PR is going to address: in its current
state it doesn't solve the problem I am facing, which is the one I'd like to
address with the PR I have prepared.
I think there are many issues with code generation, and many things in the
current implementation that limit scalability in the number of columns, so
there are likely cases that need to be handled differently.
Anyway, since that PR is not ready yet, I cannot say what it will and won't
address.
The only thing I can say is that, as you can see from my branch
(https://github.com/mgaido91/spark/commits/SPARK-22226), I am doing something
completely different from what is done in the open PR.
> Code generation fails for dataframes with 10000 columns
> -------------------------------------------------------
>
> Key: SPARK-22226
> URL: https://issues.apache.org/jira/browse/SPARK-22226
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.2.0
> Reporter: Marco Gaido
>
> Code generation for very wide datasets can fail because the JVM constant
> pool limit is reached.
> This can happen for several reasons. One of them is that we currently split
> the definitions of the generated methods among several {{NestedClass}}
> instances, but all these methods are invoked from the main class. Since each
> method invocation adds entries to the main class's constant pool, this limits
> the number of columns and, for very wide datasets, leads to:
> {noformat}
> org.codehaus.janino.JaninoRuntimeException: Constant pool for class
> org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificMutableProjection
> has grown past JVM limit of 0xFFFF
> {noformat}
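> As a rough illustration (not part of the original report), the sketch below shows
> how such a wide projection might be built; the column count, the exact failing
> operator, and whether the 0xFFFF limit is actually hit depend on the Spark version:
> {noformat}
> // Hypothetical repro sketch, assuming a local SparkSession; names and the
> // column count are illustrative only.
> import org.apache.spark.sql.SparkSession
> import org.apache.spark.sql.functions.{col, lit}
>
> object WideProjectionRepro {
>   def main(args: Array[String]): Unit = {
>     val spark = SparkSession.builder()
>       .appName("SPARK-22226-repro")
>       .master("local[*]")
>       .getOrCreate()
>
>     val numCols = 10000
>     // Build a single-row DataFrame with 10000 literal columns.
>     val wide = spark.range(1)
>       .select((0 until numCols).map(i => lit(i).as(s"c$i")): _*)
>
>     // Projecting an expression over every column produces one generated
>     // projection class whose many method invocations all add entries to
>     // the outer class's constant pool.
>     wide.select(wide.columns.map(c => (col(c) + 1).as(c)): _*).collect()
>
>     spark.stop()
>   }
> }
> {noformat}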