We expected it would be hard to understand all the aspects at first,
so we wrote up an explanation. Please see the following README, which
hopefully answers most of your questions:
https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/jdbc/README.md
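For a quick impression before diving into the README: a minimal custom
provider only needs to override a handful of methods. The sketch below
is hypothetical (the class name, the "jdbc:mydb:" URL prefix, and the
credential comment are made up for illustration):

  import java.sql.{Connection, Driver}
  import org.apache.spark.sql.jdbc.JdbcConnectionProvider

  // Hypothetical provider for a made-up "mydb" database.
  class MyConnectionProvider extends JdbcConnectionProvider {
    // Name by which this provider can be selected via the
    // "connectionProvider" JDBC option.
    override val name: String = "my-provider"

    // Claim only our own URLs so the built-in providers keep
    // handling everything else.
    override def canHandle(driver: Driver, options: Map[String, String]): Boolean =
      options.get("url").exists(_.startsWith("jdbc:mydb:"))

    override def getConnection(driver: Driver, options: Map[String, String]): Connection = {
      val props = new java.util.Properties()
      options.foreach { case (k, v) => props.put(k, v) }
      // A real provider might fetch a short-lived credential here
      // instead of relying on a static password in the options.
      driver.connect(options("url"), props)
    }

    // Return true only if getConnection mutates JVM-global security
    // state (e.g. performs a Kerberos login), so that Spark can
    // serialize such calls.
    override def modifiesSecurityContext(driver: Driver, options: Map[String, String]): Boolean =
      false
  }

Providers are discovered through Java's ServiceLoader, so the jar also
needs a META-INF/services/org.apache.spark.sql.jdbc.JdbcConnectionProvider
file listing the implementation class.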


On Thu, Jan 6, 2022 at 3:31 PM Sean Owen <sro...@gmail.com> wrote:

> They're in core/,
> under org.apache.spark.sql.execution.datasources.jdbc.connection.
> I don't quite understand the question; it's an abstraction over lots
> of concrete implementations, just simple software design here.
> You can implement your own provider too, I suppose.
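>
> For example, assuming a provider registered under the hypothetical
> name "my-provider" (and a made-up URL), it can be selected per read
> with the "connectionProvider" JDBC option:
>
>   val df = spark.read
>     .format("jdbc")
>     .option("url", "jdbc:mydb://host:5432/sales")
>     .option("dbtable", "public.orders")
>     .option("connectionProvider", "my-provider")
>     .load()
>
> Without that option, Spark chooses among the registered providers
> whose canHandle returns true for the connection.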
>
> On Thu, Jan 6, 2022 at 8:22 AM Artemis User <arte...@dtechspace.com>
> wrote:
>
>> The only example I saw in the Spark distribution was the
>> ExampleJdbcConnectionProvider file in the examples directory.  It
>> basically just extends the abstract class, overriding its methods.  I
>> guess my question was: since Spark embeds the JDBC APIs in the
>> DataFrame reader and writer, why is such a provider API still needed?
>> Are there any use cases for using the provider API instead of the
>> DataFrame reader/writer when dealing with JDBC?  Thanks!
>>
>> On 1/6/22 9:09 AM, Sean Owen wrote:
>>
There are 8 concrete implementations of it? OracleConnectionProvider,
etc.
>>
>> On Wed, Jan 5, 2022 at 9:26 PM Artemis User <arte...@dtechspace.com>
>> wrote:
>>
>>> Could someone provide some insight/examples on the usage of this API?
>>>
>>> https://spark.apache.org/docs/latest/api/scala/org/apache/spark/sql/jdbc/JdbcConnectionProvider.html
>>>
>>> Why is it needed, since this is an abstract class and there isn't
>>> any concrete implementation of it?  Thanks a lot in advance.
>>>
>>> -- ND
>>>
>>
