Github user marmbrus commented on a diff in the pull request:

    https://github.com/apache/spark/pull/8982#discussion_r41650090
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/jdbc/JdbcDialects.scala ---
    @@ -287,3 +288,30 @@ case object MsSqlServerDialect extends JdbcDialect {
         case _ => None
       }
     }
    +
    +/**
    + * :: DeveloperApi ::
    + * Default Apache Derby dialect, mapping real on read
    + * and string/byte/short/boolean/decimal on write.
    + */
    +@DeveloperApi
    +case object DerbyDialect extends JdbcDialect {
    +  override def canHandle(url: String): Boolean = url.startsWith("jdbc:derby")
    +  override def getCatalystType(
    +      sqlType: Int, typeName: String, size: Int, md: MetadataBuilder): Option[DataType] = {
    +    if (sqlType == Types.REAL) Some(FloatType) else None
    --- End diff --
    
    In this case (where you are clearly not passing null) they are equivalent.
    Option will return None in the case of null, which is why I prefer using
    that method of construction.
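
    For reference, a minimal standalone sketch (not part of the original
    thread) of the distinction being discussed: Option.apply returns None
    when handed null, while Some wraps whatever value it is given, null
    included.

        // Illustrative only; this object is hypothetical, not Spark code.
        object OptionVsSome {
          def main(args: Array[String]): Unit = {
            val maybeNull: String = null
            println(Option(maybeNull)) // None: Option.apply guards against null
            println(Some(maybeNull))   // Some(null): Some wraps unconditionally
            println(Option("derby"))   // Some(derby): same as Some for non-null values
          }
        }

    For a value that can never be null, such as the FloatType singleton in
    the diff above, the two constructions produce the same result.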
    On Oct 9, 2015 9:28 AM, "Rick Hillegas" <[email protected]> wrote:
    
    > In sql/core/src/main/scala/org/apache/spark/sql/jdbc/JdbcDialects.scala
    > <https://github.com/apache/spark/pull/8982#discussion_r41649575>:
    >
    > > @@ -287,3 +288,30 @@ case object MsSqlServerDialect extends JdbcDialect {
    > >      case _ => None
    > >    }
    > >  }
    > > +
    > > +/**
    > > + * :: DeveloperApi ::
    > > + * Default Apache Derby dialect, mapping real on read
    > > + * and string/byte/short/boolean/decimal on write.
    > > + */
    > > +@DeveloperApi
    > > +case object DerbyDialect extends JdbcDialect {
    > > +  override def canHandle(url: String): Boolean = url.startsWith("jdbc:derby")
    > > +  override def getCatalystType(
    > > +      sqlType: Int, typeName: String, size: Int, md: MetadataBuilder): Option[DataType] = {
    > > +    if (sqlType == Types.REAL) Some(FloatType) else None
    >
    > I have changed the original code so that it matches the pattern followed
    > by the other getCatalystType() overloads: The code returns a
    > Some(FloatType) rather than an Option(FloatType). I am new to programming
    > in Scala, but the former expression is the pattern which I see all over the
    > code. However, Reynold recommended the latter expression. If
    > Option(FloatType) is the correct value to return, then we should probably
    > change all of the other getCatalystType() overloads to return Option(...)
    > rather than Some(...). I would appreciate your guidance on this point.
    > Thanks.
    >
    > —
    > Reply to this email directly or view it on GitHub
    > <https://github.com/apache/spark/pull/8982/files#r41649575>.
    >


