Re: Spark SQL Dataset and BigDecimal

2021-02-17 Thread Takeshi Yamamuro
Yea, I think that's because it's needed for interoperability between Scala and Java. If it returned a Scala decimal, Java code could not handle it. If you want a Scala decimal, you need to convert it yourself. Bests, Takeshi
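The conversion Takeshi mentions is a one-liner, since scala.math.BigDecimal is just a wrapper around java.math.BigDecimal. A minimal sketch (the `fromSpark` value stands in for what `ds.collect()` would hand back; the helper name `toScala` is my own):

```scala
import java.math.{BigDecimal => JBigDecimal}

object DecimalConversion {
  // Wrap the java.math.BigDecimal that Spark returns
  // back into a scala.math.BigDecimal.
  def toScala(jd: JBigDecimal): BigDecimal = BigDecimal(jd)

  def main(args: Array[String]): Unit = {
    // Stand-in for a value read back from a Dataset
    val fromSpark = new JBigDecimal("12.34")

    val scalaDec = toScala(fromSpark)

    // Scala arithmetic operators now work as usual
    assert(scalaDec * 2 == BigDecimal("24.68"))
    println(scalaDec * 2) // prints 24.68
  }
}
```

The wrap is cheap (no copy of the underlying unscaled value), so doing it at the edge of your Spark code rather than using java.math.BigDecimal everywhere is usually fine.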

Spark SQL Dataset and BigDecimal

2021-02-17 Thread Ivan Petrov
Hi, I'm using the Spark Scala Dataset API to write Spark SQL jobs. I've noticed that a Spark Dataset accepts a Scala BigDecimal as a value, but it always returns java.math.BigDecimal when you read it back. Is this by design? Should I use java.math.BigDecimal everywhere instead? Is there any performance

Re: KafkaUtils module not found on spark 3 pyspark

2021-02-17 Thread Jungtaek Lim
I got a similar question recently, so I had to dig through some history I had missed. If I understand correctly, the class was "intentionally" removed in Spark 3, because it belongs to the "kafka 0.8" module, which isn't guaranteed to work with recent Kafka versions. And it looks like there was another decision to not