Github user lokm01 commented on the issue:
https://github.com/apache/spark/pull/21215
@maropu That would work if you had Scala case classes for all the types. In
our case, we're working on a generic framework where we only have Spark
schemas (and I'd rather not generate case classes at runtime).
Can you suggest an existing way to do this using Spark's DataType, please?
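To illustrate what I mean by working only with Spark schemas, here's a minimal sketch. The schema, column names and sample data are invented purely for illustration, and it assumes a Spark 2.x build where from_json accepts a DataType directly:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.from_json
import org.apache.spark.sql.types._

object SchemaDrivenParse {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("schema-driven-parse")
      .getOrCreate()
    import spark.implicits._

    // Schema built purely from Spark's DataType API -- no case classes involved.
    // Field names and types are hypothetical.
    val schema: DataType = StructType(Seq(
      StructField("id", LongType),
      StructField("attributes", MapType(StringType, StringType))
    ))

    val df = Seq("""{"id": 1, "attributes": {"colour": "red"}}""").toDF("json")

    // The parse is driven entirely by the runtime schema rather than a
    // compile-time case class / encoder.
    val parsed = df.select(from_json($"json", schema).alias("parsed"))
    parsed.printSchema()
    parsed.show(truncate = false)

    spark.stop()
  }
}
```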