First of all, thank you for your replies.
I was previously doing this via a plain JDBC connection and it worked
without problems. Then I liked the idea that Spark SQL could take care of
opening and closing the connection.
I also tried with single quotes, since that was my first guess, but it
didn't work.
Is new a reserved word in MySQL?
On Thu, Apr 30, 2015 at 2:41 PM, Francesco Bigarella
francesco.bigare...@gmail.com wrote:
Do you know how I can check that? I googled a bit but couldn't find a clear
explanation of it. I also tried to use explain(), but it doesn't really
help.
I still find it unusual that I have this issue only with the equality
operator and not with the others.
Thank you,
F
On Wed, Apr 29, 2015 at 3:03
I think you need to put new in single quotes. My guess is that the query
showing up in the DB is something like
...where status=new
In that case MySQL assumes new is a column name.
What you need is the form below:
...where status='new'
So you need to add the quotes accordingly.
Easiest way
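To illustrate the quoting described above, here is a minimal sketch (plain Python, not Spark's actual JDBC code; quote_literal is a hypothetical helper) of why the literal must be wrapped in single quotes before the predicate reaches MySQL:

```python
# Sketch: why the WHERE clause needs quotes around string literals.
# Without them, MySQL parses `new` as a column name, not a value.

def quote_literal(value):
    # Hypothetical helper: escape embedded single quotes by doubling
    # them, then wrap the value in single quotes (MySQL syntax).
    return "'" + value.replace("'", "''") + "'"

bad = "WHERE status = " + "new"                  # MySQL: unknown column `new`
good = "WHERE status = " + quote_literal("new")  # comparison against a string

print(good)  # → WHERE status = 'new'
```

The same reasoning explains why only equality seemed broken: a comparison like status > 5 against a numeric literal parses fine unquoted, while an unquoted string literal is always read as an identifier.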
Hi all,
I was testing the DataFrame filter functionality and I found what I think
is a strange behaviour.
My DataFrame testDF, obtained by loading a MySQL table via JDBC, has the
following schema:
root
 |-- id: long (nullable = false)
 |-- title: string (nullable = true)
 |-- value: string
Looks like your DF is based on a MySQL DB accessed via JDBC, and the error
is thrown by MySQL. Can you see what SQL is finally getting fired in MySQL?
Spark is pushing down the predicate to MySQL, so it's not a Spark problem
per se.
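One way to check that (a suggestion, assuming you have the SUPER privilege on the MySQL server) is to turn on MySQL's general query log while the Spark job runs:

```sql
-- Temporarily capture every statement the server receives, including
-- the SQL that Spark's JDBC source fires (requires SUPER privilege).
SET GLOBAL general_log_file = '/tmp/mysql-general.log';
SET GLOBAL general_log = 'ON';
-- ... run the Spark job, inspect /tmp/mysql-general.log, then:
SET GLOBAL general_log = 'OFF';
```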
On Wed, Apr 29, 2015 at 9:56 PM, Francesco Bigarella