GitHub user JoshRosen opened a pull request:

    https://github.com/apache/spark/pull/8403

    [SPARK-10195] [SQL] Data sources Filter should not expose internal types

    Spark SQL's data sources API exposes Catalyst's internal types through its
Filter interfaces. This is a problem because types like UTF8String are not
stable developer APIs and should not be exposed to third parties.
    
    This issue caused incompatibilities when upgrading our `spark-redshift`
library to work against Spark 1.5.0. To avoid such issues in the future, we
should expose only public types through these Filter objects. This patch
accomplishes that by using CatalystTypeConverters to add the appropriate
conversions.
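
    The idea behind the fix can be sketched as follows. Note this is an
illustration only, not the patch itself: `UTF8String` and `EqualTo` below are
simplified stand-ins for Spark's real `org.apache.spark.unsafe.types.UTF8String`
and `org.apache.spark.sql.sources.EqualTo`, and `convertToScala` is a
hypothetical helper mirroring what CatalystTypeConverters does.

```java
import java.nio.charset.StandardCharsets;

// Stand-in for Spark's internal string type (not a stable developer API).
class UTF8String {
    private final byte[] bytes;
    UTF8String(byte[] bytes) { this.bytes = bytes; }
    @Override public String toString() {
        return new String(bytes, StandardCharsets.UTF_8);
    }
}

// Stand-in for a public data sources Filter; third-party sources pattern-match
// on its value, so it must hold only stable public types.
class EqualTo {
    final String attribute;
    final Object value;
    EqualTo(String attribute, Object value) {
        this.attribute = attribute;
        this.value = value;
    }
}

public class FilterConversionSketch {
    // Hypothetical converter mirroring the CatalystTypeConverters idea:
    // map internal representations back to public Java/Scala types before
    // they are placed inside a Filter handed to third-party data sources.
    static Object convertToScala(Object internal) {
        if (internal instanceof UTF8String) {
            return internal.toString();  // internal string -> java.lang.String
        }
        return internal;                 // primitives etc. pass through as-is
    }

    public static void main(String[] args) {
        Object raw = new UTF8String("josh".getBytes(StandardCharsets.UTF_8));
        EqualTo filter = new EqualTo("name", convertToScala(raw));
        // The filter now carries a stable public type, not an internal one.
        System.out.println(filter.value.getClass().getName());
    }
}
```

    Without the conversion step, `filter.value` would be the internal
`UTF8String`, and a library compiled against the public API would fail when it
expected a plain `java.lang.String`.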

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/JoshRosen/spark datasources-internal-vs-external-types

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/8403.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #8403
    
----
commit 37e9b95bca3307350cadcc7b8b99a0747567fe21
Author: Josh Rosen <[email protected]>
Date:   2015-08-24T21:52:52Z

    Add failing regression test.

commit 6af0a451bfb750ef1c83acb0dd3ac3a0feb06696
Author: Josh Rosen <[email protected]>
Date:   2015-08-24T22:07:22Z

    Convert types back to Scala when constructing filters

----


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at [email protected] or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
