Hi Spark Community,

Could you please help with the question below, which I also posted on Stack Overflow:

https://stackoverflow.com/questions/73086256/spark-sql-query-filter-behavior-with-special-characters


I am using the Spark SQL query below; however, it does not return any records
unless I escape the "$" character in the filter condition.

My table has rows with the value "$AG72i$GE" in column col1.

The query below returns no results:

select * from mytable where col1 = '$AG72i$GE'

while the query below returns results as expected:

select * from mytable where col1 = '\$AG72i\$GE'

Is this expected behaviour in Spark? I searched the documentation but could
not find any mention of needing to escape the "$" character in an equality
filter.
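
In case it helps with reproducing, a minimal sketch of the setup looks
roughly like this (my real table is larger and loaded from existing data,
but col1 is a string column):

-- minimal illustrative table; the real one is loaded from existing data
create table mytable (col1 string) using parquet;
insert into mytable values ('$AG72i$GE');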


Also, how do I handle joins between two tables when the join columns contain
data like this (sketched below)?
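
Roughly, the join I have in mind looks like this (table and column names
here are illustrative, not my real schema):

-- both a.col1 and b.col1 contain values such as '$AG72i$GE'
select a.*, b.*
from table_a a
join table_b b
  on a.col1 = b.col1

Do I need to escape the "$" in the column values before joining, or is there
a way to make the equality comparison treat "$" literally?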


Regards

Prashanth
