Hi, I encountered a weird problem in Spark SQL.
I use sbt/sbt hive/console to enter the shell,
and I am testing Catalyst's filter pushdown.
scala> val queryPlan = sql("select value from (select key, value from src) a where a.key = 86")
scala> queryPlan.baseLogicalPlan
res0:
Hi,

queryPlan.baseLogicalPlan is not the plan used for execution. The
baseLogicalPlan of a SchemaRDD (queryPlan in your case) is just the parsed
plan; the parsed plan is then analyzed, then optimized, and finally a
physical plan is created from it. The plan that shows up right after you
execute val queryPlan = sql(...) is that parsed plan.
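To make the pipeline concrete, here is a sketch you can try in the same sbt/sbt hive/console shell. It assumes the Spark 1.x SchemaRDD API, where each planning stage is exposed as a field of queryExecution (field names may differ in other versions):

```scala
// Inside sbt/sbt hive/console; `sql` returns a SchemaRDD (Spark 1.x API).
val queryPlan = sql("select value from (select key, value from src) a where a.key = 86")

// Each planning stage is exposed on queryExecution:
queryPlan.queryExecution.logical        // parsed plan (what baseLogicalPlan returns)
queryPlan.queryExecution.analyzed       // after analysis (attributes resolved against src)
queryPlan.queryExecution.optimizedPlan  // after optimization; pushed-down filters show up here
queryPlan.queryExecution.executedPlan   // the physical plan that actually runs
```

Comparing logical with optimizedPlan is the easiest way to see whether the Filter on a.key was pushed below the inner projection.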
I used queryPlan.queryExecution.analyzed to get the logical plan, and it
works. What you explained is very helpful.
Thank you very much.