This is a user-list question, not a dev-list question. Moving this conversation
to the user list and BCC-ing the dev list.
Also, this statement
> We are not validating against table or column existence.
is not correct. When you call spark.sql(…), Spark will look up the table
references and fail if they do not exist.
Yes, you can validate the syntax of a PySpark SQL query without
connecting to an actual dataset or running the query on a cluster.
PySpark's SQL parser can be used to check syntax without executing the
query.
Hello,
Is there a way to validate PySpark SQL for syntax errors only? I
cannot connect to the actual dataset to perform this validation. Any
help would be appreciated.
Thanks
Ram