+dev list
Hi Dirceu,
Whether throwing an exception or returning null is better depends on your use
case. If you are debugging and want to find bugs in your program, you might
prefer throwing an exception. However, if you are running on a large
real-world dataset (i.e. the data is dirty), you might prefer getting nulls
back instead of failing the whole job on the first bad record.
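For example (a rough sketch; the DataFrame and column names below are made
up), you can cast and then split the good rows from the dirty ones instead of
failing the whole job:

import org.apache.spark.sql.functions.col
import org.apache.spark.sql.types.DecimalType

// `df` is a hypothetical DataFrame with a string column "amount".
// Values that do not fit DecimalType(10, 2) become null instead of throwing,
// so the job keeps running and the bad records can be inspected separately.
val parsed = df.withColumn("amount_dec", col("amount").cast(DecimalType(10, 2)))
val good = parsed.filter(col("amount_dec").isNotNull)
val bad  = parsed.filter(col("amount_dec").isNull)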
Hi Yin, I posted it here because I think it's a bug.
So it will return null, and I can get a NullPointerException, as I was
getting. Is this really the expected behavior? I've never seen something
return null in the other Scala tools I've used.
Regards,
2015-09-14 18:54 GMT-03:00 Yin Huai :
> btw, move
A scale of 10 means that there are 10 digits to the right of the decimal
point. If you also have a precision of 10, every digit is fractional, so the
range of your data will be [0, 1), and casting "10.5" to DecimalType(10, 10)
will return null, which is expected.
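You can see this in the spark-shell (just a quick sketch, using the
sqlContext that the 1.5 shell provides):

import org.apache.spark.sql.functions.lit
import org.apache.spark.sql.types.DecimalType

// Precision 10 with scale 10 leaves no integer digits, so 10.5 overflows
// and the cast yields null rather than an exception.
sqlContext.range(1).select(lit("10.5").cast(DecimalType(10, 10)).as("d")).show()
// With a smaller scale there is room for the integer part and the cast succeeds:
sqlContext.range(1).select(lit("10.5").cast(DecimalType(10, 1)).as("d")).show()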
On Mon, Sep 14, 2015 at 1:42 PM, Dirceu Semighini Filho <dirceu.semig
Hi all,
I'm moving from Spark 1.4 to 1.5, and one of my tests is failing.
It seems that there were some changes in
org.apache.spark.sql.types.DecimalType.
This ugly code is a small sample that reproduces the error; don't use it in
your project.
test("spark test") {
val file =
context.sparkConte
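(The message is cut off here. The snippet below is only a guess at what a
minimal reproduction might look like, assuming the test read strings from a
file and cast them to a DecimalType; the path, schema, and precision/scale
are invented for illustration, not the original code.)

import org.apache.spark.sql.Row
import org.apache.spark.sql.functions.col
import org.apache.spark.sql.types.{DecimalType, StringType, StructField, StructType}

test("spark test") {
  // Hypothetical: `context` is the test's SQLContext.
  val file = context.sparkContext.textFile("numbers.txt") // invented path
  val rows = file.map(Row(_))
  val schema = StructType(Seq(StructField("n", StringType)))
  val df = context.createDataFrame(rows, schema)
    .select(col("n").cast(DecimalType(10, 10)).as("d"))
  // In 1.5 an overflowing cast returns null, so getDecimal(0) can be null
  // and the call below throws the NullPointerException described above.
  df.collect().foreach(r => println(r.getDecimal(0).toPlainString))
}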