Hi,

I am using Spark SQL to transform data. My source is Oracle; in general, I
extract multiple tables, join them, and then apply further transformations
in Spark.

Is there any way to push the join operator down to Oracle via Spark SQL,
instead of fetching the tables and joining them in Spark? I could not find
any option for this optimization at
https://spark.apache.org/docs/1.6.0/sql-programming-guide.html#jdbc-to-other-databases.

I am currently using spark-1.6 version.
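In case it helps: as far as I know, the JDBC data source in Spark 1.6 only pushes down column pruning and simple filter predicates, not joins. A common workaround is to embed the join as an Oracle subquery in the `dbtable` option, so Oracle executes the join and Spark only reads the result. A minimal sketch, assuming a Spark 1.6 `sqlContext`, an Oracle JDBC driver on the classpath, and placeholder connection details and table/column names (ORDERS, CUSTOMERS, etc.):

```scala
// Sketch: push a join down to Oracle by passing an aliased subquery
// as "dbtable". Spark wraps it as SELECT * FROM (<subquery>) ..., so
// Oracle performs the join; Spark just fetches the joined rows.
// URL, credentials, and table/column names below are placeholders.
val joined = sqlContext.read
  .format("jdbc")
  .option("url", "jdbc:oracle:thin:@//dbhost:1521/ORCL")
  .option("driver", "oracle.jdbc.OracleDriver")
  .option("dbtable",
    """(SELECT o.order_id, o.amount, c.customer_name
      |   FROM orders o
      |   JOIN customers c ON o.customer_id = c.customer_id) t""".stripMargin)
  .option("user", "scott")
  .option("password", "tiger")
  .load()
```

Note the subquery must be given an alias (`t` above); Oracle does not accept `AS` for table aliases. Filters you apply on `joined` afterwards can still be pushed down into the outer query.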

Thanks & Regards,
B Anil Kumar.
