Yes, I think that's because it's needed for interoperability between
Scala and Java.
If it returned a Scala BigDecimal, Java code couldn't handle it.
If you want a Scala BigDecimal, you need to convert it back yourself.
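To illustrate the conversion, here is a minimal sketch (plain JVM, no Spark needed, the `fromSpark` value just stands in for what a Dataset read would return): `scala.math.BigDecimal` is a thin wrapper over `java.math.BigDecimal`, so wrapping the Java value is cheap.

```scala
// What Spark hands back when you read a decimal column: java.math.BigDecimal.
val fromSpark: java.math.BigDecimal = new java.math.BigDecimal("12.34")

// scala.math.BigDecimal wraps java.math.BigDecimal, so converting
// back is just a wrap via the apply factory.
val asScala: scala.math.BigDecimal = scala.math.BigDecimal(fromSpark)

// .underlying() (or .bigDecimal) recovers the wrapped Java value.
assert(asScala.underlying() == fromSpark)
```

So round-tripping costs a wrapper allocation, not a copy of the digits.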
Bests,
Takeshi
On Wed, Feb 17, 2021 at 9:48 PM Ivan Petrov wrote:
> Hi, I'm using the Spark Scala Dataset API to write Spark SQL jobs.
> I've noticed that a Spark Dataset accepts a Scala BigDecimal as the value,
> but it always returns java.math.BigDecimal when you read it back.
> Is it by design?
> Should I use java.math.BigDecimal everywhere instead?
> Is there any performance penalty?
I got a similar question recently, so I had to dig up some history I had
missed. If I understand correctly, the class was "intentionally" removed in
Spark 3 because it refers to the "kafka 0.8" module, which isn't guaranteed
to work with recent Kafka versions. And it looks like there was another decision to not