Re: Jdbc Hook in Spark Batch Application

2020-12-25 Thread Mich Talebzadeh
If I understand correctly, you can store the JDBC connection properties in a configuration file and refer to them in the code of your Scala/Python module. Example:

# oracle variables
driverName = "oracle.jdbc.OracleDriver"
_username = "user"
_password = ".."
_dbschema = "schema"
_dbtable = "table"
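
A minimal sketch of that approach in Scala, assuming a db.properties file readable from the driver; the file path, property keys, and connection values below are illustrative, not a fixed convention:

import java.io.FileInputStream
import java.util.Properties

import org.apache.spark.sql.SparkSession

object JdbcFromConfig {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("jdbc-from-config").getOrCreate()

    // Load connection settings from an external file (hypothetical path and keys)
    // so credentials and the table name are not hard-coded in the job itself.
    val conf = new Properties()
    conf.load(new FileInputStream("/etc/spark/db.properties"))

    val jdbcProps = new Properties()
    jdbcProps.setProperty("user", conf.getProperty("username"))
    jdbcProps.setProperty("password", conf.getProperty("password"))
    jdbcProps.setProperty("driver", conf.getProperty("driver"))   // e.g. oracle.jdbc.OracleDriver

    // Read the configured table through the standard JDBC data source.
    val df = spark.read.jdbc(
      conf.getProperty("url"),       // e.g. jdbc:oracle:thin:@//host:1521/service
      conf.getProperty("dbtable"),   // e.g. schema.table
      jdbcProps)

    df.show(5)
    spark.stop()
  }
}

The properties file can then be swapped per environment without rebuilding or editing the job code.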

Re: Jdbc Hook in Spark Batch Application

2020-12-25 Thread Gabor Somogyi
AFAIK there is no other way. In the latest release a JDBC connection provider API was added, but it also needs some code modification. BTW, even if there were a hook API, code changes would be needed too, right?
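
The provider API referred to here is, I believe, the org.apache.spark.sql.jdbc.JdbcConnectionProvider developer API that landed in Spark 3.1. A rough sketch of a provider that records the options Spark passes down before opening the connection; the member names follow the 3.1 API as I recall it and may differ in other versions, and the class still has to be packaged and registered through META-INF/services, so some build changes are unavoidable:

import java.sql.{Connection, Driver}
import java.util.Properties

import org.apache.spark.sql.jdbc.JdbcConnectionProvider

// Sketch only: logs every JDBC option Spark hands to the provider, then
// opens the connection with the vendor driver. Register it by shipping a
// META-INF/services/org.apache.spark.sql.jdbc.JdbcConnectionProvider file
// containing this class name; the built-in "basic" provider may also need
// to be disabled (spark.sql.sources.disabledJdbcConnProviderList) so that
// exactly one provider matches.
class SniffingConnectionProvider extends JdbcConnectionProvider {
  override val name: String = "sniffing"

  override def canHandle(driver: Driver, options: Map[String, String]): Boolean =
    options.getOrElse("url", "").startsWith("jdbc:")

  override def getConnection(driver: Driver, options: Map[String, String]): Connection = {
    // Connection info is visible here without touching the job code itself.
    println(s"JDBC connection requested: ${options - "password"}")
    val props = new Properties()
    options.foreach { case (k, v) => props.setProperty(k, v) }
    driver.connect(options("url"), props)
  }
}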

Re: Jdbc Hook in Spark Batch Application

2020-12-24 Thread lec ssmi
Thanks. But there is a problem: the classes referenced in the code would need to be modified, and I want to try not to change the existing code.

Re: Jdbc Hook in Spark Batch Application

2020-12-24 Thread Gabor Somogyi
One can wrap the JDBC driver, and that way everything can be sniffed.
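
A minimal sketch of that wrapping idea in Scala, assuming the Oracle driver underneath (the delegate class name and the println are illustrative): the wrapper implements java.sql.Driver, records the properties it receives, and delegates everything else, so the job only has to point its "driver" option at the wrapper class.

import java.sql.{Connection, Driver, DriverPropertyInfo}
import java.util.Properties
import java.util.logging.Logger

// Sketch only: a pass-through JDBC driver that records the connection
// properties before handing off to the real driver.
class SniffingDriver extends Driver {
  // Hypothetical delegate; replace with the actual vendor driver class.
  private val delegate: Driver =
    Class.forName("oracle.jdbc.OracleDriver")
      .getDeclaredConstructor().newInstance().asInstanceOf[Driver]

  override def connect(url: String, info: Properties): Connection = {
    // Sniff the connection info here (mask secrets before logging in real use).
    println(s"JDBC connect to $url with properties $info")
    delegate.connect(url, info)
  }

  override def acceptsURL(url: String): Boolean = delegate.acceptsURL(url)
  override def getPropertyInfo(url: String, info: Properties): Array[DriverPropertyInfo] =
    delegate.getPropertyInfo(url, info)
  override def getMajorVersion: Int = delegate.getMajorVersion
  override def getMinorVersion: Int = delegate.getMinorVersion
  override def jdbcCompliant(): Boolean = delegate.jdbcCompliant()
  override def getParentLogger: Logger = delegate.getParentLogger
}

In the Spark job the only change would be setting the "driver" option to the wrapper's class name, which is what the later replies note cannot be avoided entirely.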

Jdbc Hook in Spark Batch Application

2020-12-23 Thread lec ssmi
Hi:
guys, I have some Spark programs that perform database connection operations. I want to acquire the connection information, such as the JDBC connection properties, but without being too intrusive to the code. Any good ideas? Can a Java agent do it?