They're in core/
under org.apache.spark.sql.execution.datasources.jdbc.connection
I don't quite understand the question; it's an abstraction over lots of concrete
implementations, just simple software design here.
You can implement your own provider too, I suppose.
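For what it's worth, a custom provider is just a subclass of
org.apache.spark.sql.jdbc.JdbcConnectionProvider that Spark discovers via
Java's ServiceLoader. A rough sketch, assuming Spark 3.1+ on the classpath
(the "mydb" URL scheme and provider name here are made up for illustration):

```scala
import java.sql.{Connection, Driver}
import java.util.Properties

import org.apache.spark.sql.jdbc.JdbcConnectionProvider

// Hypothetical provider for a fictional "jdbc:mydb:" driver.
class MyDbConnectionProvider extends JdbcConnectionProvider {

  override val name: String = "mydb"

  // Claim only connections whose URL targets the fictional driver.
  override def canHandle(driver: Driver, options: Map[String, String]): Boolean =
    options.get("url").exists(_.startsWith("jdbc:mydb:"))

  override def getConnection(driver: Driver, options: Map[String, String]): Connection = {
    val props = new Properties()
    options.foreach { case (k, v) => props.setProperty(k, v) }
    // Custom auth logic (e.g. fetching a short-lived token) would go here,
    // before handing the properties to the driver.
    driver.connect(options("url"), props)
  }

  // Return true only if getConnection mutates JVM-global security state
  // (e.g. a JAAS/Kerberos login), so Spark can serialize access to it.
  override def modifiesSecurityContext(driver: Driver, options: Map[String, String]): Boolean =
    false
}
```

You'd then register the class by listing its fully qualified name in a
META-INF/services/org.apache.spark.sql.jdbc.JdbcConnectionProvider resource
file. The main use case is custom authentication (Kerberos, token-based
logins, etc.) that plain user/password JDBC options can't express.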

On Thu, Jan 6, 2022 at 8:22 AM Artemis User <[email protected]> wrote:

> The only example I saw in the Spark distribution was
> ExampleJdbcConnectionProvider file in the examples directory.  It basically
> just wraps the abstract class with overriding methods.  I guess my question
> was: since Spark embeds the JDBC APIs in the DataFrame reader and writer,
> why is such a provider API still needed?  Are there any use cases for using
> the provider API instead of the DataFrame reader/writer when dealing with
> JDBC?  Thanks!
>
> On 1/6/22 9:09 AM, Sean Owen wrote:
>
> There are 8 concrete implementations of it? OracleConnectionProvider, etc
>
> On Wed, Jan 5, 2022 at 9:26 PM Artemis User <[email protected]>
> wrote:
>
>> Could someone provide some insight/examples on the usage of this API?
>>
>> https://spark.apache.org/docs/latest/api/scala/org/apache/spark/sql/jdbc/JdbcConnectionProvider.html
>>
>> Why is it needed, since this is an abstract class and there isn't any
>> concrete implementation of it?  Thanks a lot in advance.
>>
>> -- ND
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: [email protected]
>>
>>
>
