Github user sureshthalamati commented on the pull request:

    https://github.com/apache/spark/pull/8374#issuecomment-144155787
  
    The new contract defined in the JdbcDialect class is that subclasses can
override either one of the getJDBCType methods, or both. Both methods
therefore have to be consulted to check whether the dialect specifies a
different mapping. I think it makes sense to try the overload that takes the
metadata first and, if it returns nothing, fall back to the one without the
metadata. If we address this, the existing custom dialects I mentioned in my
previous comment should also continue to work fine.
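
    To make the contract concrete, here is a rough sketch of a custom dialect
under the proposed API. ExampleDialect, the jdbc:example URL, and the
maxlength metadata key are made up for illustration, and the two-argument
overload is the one this PR adds (assumed to default to None in JdbcDialect),
not something in the released API:

        import java.sql.Types
        import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcType}
        import org.apache.spark.sql.types._

        // Hypothetical dialect that only overrides the metadata-aware overload
        // proposed in this PR. A dialect that only overrides the legacy
        // one-argument getJDBCType keeps working, because the caller falls
        // back to it whenever this overload returns None.
        case object ExampleDialect extends JdbcDialect {
          override def canHandle(url: String): Boolean =
            url.startsWith("jdbc:example")

          override def getJDBCType(dt: DataType, md: Metadata): Option[JdbcType] =
            dt match {
              // e.g. honor a user-supplied max length carried in the column metadata
              case StringType if md.contains("maxlength") =>
                Some(JdbcType(s"VARCHAR(${md.getLong("maxlength")})", Types.VARCHAR))
              case _ => None // defer to the one-argument overload / default mapping
            }
        }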
     
    I think the code in JdbcUtils.scala that calls getJDBCType() has to be
changed to something like the following:
    
        val typ: String =
          dialect.getJDBCType(field.dataType, field.metadata)
            .map(_.databaseTypeDefinition)
            .orElse(dialect.getJDBCType(field.dataType).map(_.databaseTypeDefinition))
            .getOrElse(field.dataType match { … })
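
    The fallback order there is plain Option.orElse semantics; a small
standalone check, with hypothetical type strings standing in for the two
dialect lookups:

        // Stand-ins for the two dialect lookups; either may return None.
        val withMetadata: Option[String] = Some("VARCHAR(32)") // metadata-aware overload
        val withoutMetadata: Option[String] = Some("TEXT")     // legacy overload

        // The metadata-aware mapping wins when present; the legacy mapping is
        // tried next; the built-in match is the final getOrElse fallback.
        val typ = withMetadata.orElse(withoutMetadata).getOrElse("CLOB")
        assert(typ == "VARCHAR(32)")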
    
    I hope that helps. Thank you for working on this issue.


