[
https://issues.apache.org/jira/browse/SPARK-22500?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16306197#comment-16306197
]
Łukasz Żukowski commented on SPARK-22500:
-----------------------------------------
Hmm,
the code below still hits the 64KB exception.
Maybe it can be done differently, but I don't know how to apply the cast to the root struct.
import org.apache.spark.sql.{Row, SparkSession}
import org.apache.spark.sql.catalyst.encoders.RowEncoder
import org.apache.spark.sql.functions._
import org.apache.spark.sql.types.{IntegerType, StructField, StructType}

val spark = SparkSession.builder().appName("name").master("local[*]").getOrCreate()

// Schema with 100 integer fields: f1 .. f100
val names = (1 to 100).map(i => "f" + i)
val largeStruct = StructType(names.map(f => StructField(f, IntegerType, nullable = true)))
implicit def rowEnc = RowEncoder(largeStruct)

// One row holding 100 integer values
val values = Seq(Row.fromSeq(1 to 100))
val rdd = spark.sparkContext.parallelize(values)
val df = spark.createDataFrame(rdd, largeStruct)

// Pack all columns into a struct, cast it to the target schema, then unpack it again
val casted = df.select(struct("*").cast(largeStruct).as("_toUnpack")).select("_toUnpack.*")
casted.show()
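For reference, a possible per-column variant (untested sketch; it reuses df and largeStruct from above and only the public Column.cast API) that avoids building a single Cast expression over the whole struct - I don't know whether this actually stays under the codegen limit:

// Untested sketch: cast each field individually instead of casting the root struct,
// so no single generated Cast covers all 100 columns.
val castedPerColumn = df.select(
  largeStruct.fields.map(f => col(f.name).cast(f.dataType).as(f.name)): _*
)
castedPerColumn.show()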
> 64KB JVM bytecode limit problem with cast
> -----------------------------------------
>
> Key: SPARK-22500
> URL: https://issues.apache.org/jira/browse/SPARK-22500
> Project: Spark
> Issue Type: Sub-task
> Components: SQL
> Affects Versions: 2.2.0
> Reporter: Kazuaki Ishizaki
> Assignee: Kazuaki Ishizaki
> Fix For: 2.2.1, 2.3.0
>
>
> {{Cast}} can throw an exception due to the 64KB JVM bytecode limit when it is
> used with many struct fields