JNSimba commented on PR #335:
URL: 
https://github.com/apache/doris-spark-connector/pull/335#issuecomment-3208884403

   > > Are you using the DataFrame API? DataFrame itself has filter 
operations, so in theory the filter can also be pushed down. Could you provide 
an example?
   > 
   > ```
   >     val sourceDf = sparkSession.read
   >       .format(SOURCE_FORMAT)
   >       .options(sourceConfigs)
   >       .option(SOURCE_TABLE_KEY, dbTableName)
   >       .load()
   > 
   >     val tableName = dbTableName.split("\\.").last
   >     sourceDf.write
   >       .format(FORMAT)
   >       .option("endpoint", ENDPOINT)
   >       .option("username", USERNAME)
   >       .option("password", PASSWORD)
   >       .option("workspace", WORKSPACE)
   >       .option("virtualCluster", VIRTUAL_CLUSTER)
   >       .option("schema", SCHEMA)
   >       .option("table", tableName)
   >       .option("access_mode", ACCESS_MODE)
   >       .mode("append")
   >       .save()
   > ```
   > 
   > @JNSimba FYI
   
   You can try it; Spark will automatically push the filter down:
   ```
   val session = SparkSession.builder().master("local[1]").getOrCreate()
   val dorisSparkDF = session.read.format("doris")
     .option("doris.table.identifier", "test.student")
     .option("doris.fenodes", "127.0.0.1:8030")
     .option("user", "root")
     .option("password", "")
     .load()
     .filter("age > 20")

   dorisSparkDF.show()
   ```
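   To confirm that the pushdown actually took effect, you can inspect the physical plan. This is a hedged sketch: the exact `PushedFilters` rendering depends on the Spark and connector versions in use.
   ```
   // Print the parsed, analyzed, optimized, and physical plans.
   dorisSparkDF.explain(true)

   // If the predicate was pushed down, the scan node in the physical
   // plan typically lists it, e.g. as a PushedFilters entry like
   // [GreaterThan(age,20)]; if it was not pushed, the filter instead
   // appears as a separate Filter operator above the scan.
   ```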


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
