[ https://issues.apache.org/jira/browse/SPARK-10857?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14967930#comment-14967930 ]

Rick Hillegas commented on SPARK-10857:
---------------------------------------

Hi Reynold,

Yes, this would fail if the table were a query. But I'm confused about which code 
path would admit a query in place of a table name. Right now, 
JdbcUtils.tableExists() is the only place I see that calls 
JdbcDialect.getTableExistsQuery(). JdbcUtils.tableExists(), in turn, is only 
called by DataFrameWriter.jdbc(). That method then falls through to issue a 
DROP TABLE or a CREATE TABLE statement, which would bomb if the table were a 
query. Please help me understand what current or future code path I'm missing.
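
For reference, a minimal sketch of that call path as I read the 1.5 code 
(paraphrased from memory, not a verbatim copy of the Spark source):

    import java.sql.Connection
    import scala.util.Try
    import org.apache.spark.sql.jdbc.JdbcDialects

    object TableExistsSketch {
      // Roughly what JdbcUtils.tableExists() does: probe the table with the
      // dialect's query and treat any failure as "table does not exist".
      def tableExists(conn: Connection, url: String, table: String): Boolean = {
        val dialect = JdbcDialects.get(url)
        // The default dialect builds the probe query by string interpolation,
        // something like s"SELECT * FROM $table WHERE 1=0", so whatever the
        // caller passes as "table" lands unescaped in the SQL text.
        Try(conn.prepareStatement(dialect.getTableExistsQuery(table)).executeQuery()).isSuccess
      }
    }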

Thanks,
-Rick


> SQL injection bug in JdbcDialect.getTableExistsQuery()
> ------------------------------------------------------
>
>                 Key: SPARK-10857
>                 URL: https://issues.apache.org/jira/browse/SPARK-10857
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.5.0
>            Reporter: Rick Hillegas
>            Priority: Minor
>
> All of the implementations of this method involve constructing a query by 
> concatenating boilerplate text with a user-supplied name. This looks like a 
> SQL injection bug to me.
> A better solution would be to call java.sql.DatabaseMetaData.getTables() to 
> implement this method, using the catalog and schema which are available from 
> Connection.getCatalog() and Connection.getSchema(). This would not work on 
> Java 6 because Connection.getSchema() was introduced in Java 7. However, the 
> solution would work for more modern JVMs. Limiting the vulnerability to 
> obsolete JVMs would at least be an improvement over the current situation. 
> Java 6 has been end-of-lifed and is not an appropriate platform for users who 
> are concerned about security.
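
For illustration, here is a minimal sketch of the DatabaseMetaData.getTables() 
approach described in the issue summary above. The object and method names are 
hypothetical, not Spark code, and Connection.getSchema() requires a JDBC 4.1 
driver on Java 7 or later:

    import java.sql.Connection

    object MetaDataTableCheck {
      // Hypothetical helper: look the table up through JDBC metadata instead
      // of concatenating the name into a SQL string.
      def tableExistsViaMetaData(conn: Connection, table: String): Boolean = {
        val rs = conn.getMetaData.getTables(
          conn.getCatalog,   // current catalog
          conn.getSchema,    // current schema (JDBC 4.1, so Java 7+)
          table,             // passed as the tableNamePattern argument, not pasted into SQL
          Array("TABLE"))
        try rs.next() finally rs.close()
      }
    }

Note that getTables() treats the name as a LIKE pattern ('%' and '_' are 
wildcards), and identifier case and quoting rules vary by database, so this is 
a sketch of the idea rather than a drop-in replacement.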


