Hi Yiannis,
Are you able to post your full stack trace? It might be helpful.
I recall one similar incident where Phoenix was doing some casting to
BigDecimal, so as a workaround I ran a foreach() on my RDD and attempted
the same cast call, and lo and behold I had a NaN record in there.
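Roughly, the pattern looked like the sketch below. It is plain Scala for illustration: `records` and the `toDouble` call are stand-ins for the real RDD and the Phoenix cast, and the names are made up.

```scala
// Sketch of the workaround: run the same conversion over every record and
// collect the ones that either fail to convert or come back as NaN.
// `records` is a stand-in for the real RDD contents.
val records = Seq("1.5", "2.0", "NaN", "oops")

val bad = records.zipWithIndex.flatMap { case (raw, idx) =>
  try {
    val d = raw.toDouble
    if (d.isNaN) Some(idx -> raw) else None // NaN parses fine but is still suspect
  } catch {
    case _: NumberFormatException => Some(idx -> raw) // the cast itself blew up
  }
}

bad.foreach { case (idx, raw) => println(s"record $idx is bad: '$raw'") }
```

On a real RDD you would run the same body inside `rdd.filter` or `rdd.foreach` rather than on a local Seq.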
Josh

Hi Josh,
thanks for your reply.
The reason I cannot follow your suggestion is that I am already
casting the value to Double at an earlier point in my code:
var value = PDouble.INSTANCE.getCodec.decodeDouble(cell.getValueArray,
cell.getValueOffset, SortOrder.getDefault)
So I should be getting…
Hi Yiannis,
I've found the best solution to this is generally just to add logging
around that area. For example, you could add a try/catch (or Scala's Try) and
check whether an exception has been thrown, then log it somewhere.
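Something along these lines, say (a sketch in plain Scala; `parse` is a made-up stand-in for the actual Phoenix decode call, and on a real RDD you would do the same inside a map/foreach):

```scala
import scala.util.{Try, Success, Failure}

// Stand-in for the real decode; throws NumberFormatException on bad input.
def parse(raw: String): Double = raw.toDouble

val rows = Seq("3.14", "42", "not-a-number")

// Wrap the decode in Try so one bad row doesn't kill the job,
// and record which row it was.
val failures = rows.zipWithIndex.flatMap { case (raw, idx) =>
  Try(parse(raw)) match {
    case Success(_) => None // row decoded fine
    case Failure(ex) =>
      // in a real job, use your logger instead of println
      println(s"row $idx failed to decode ('$raw'): ${ex.getClass.getSimpleName}")
      Some(idx)
  }
}
```

That should at least tell you which row (and which raw value) is triggering the exception.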
As a wild guess, if you're dealing with a Double datatype and getting a
NumberFormatException…
Hi there,
I am using phoenix-spark to insert multiple entries into a Phoenix table.
I get the following errors:
..Exception while committing to database..
..Caused by: java.lang.NumberFormatException..
I couldn't find in the logs which row was causing the issue.
Is it possible to extract…