GitHub user JoshRosen opened a pull request:

    https://github.com/apache/spark/pull/7176

    [SPARK-8777] Add random data generator test utilities to Spark SQL

    This commit adds a set of random data generation utilities to Spark SQL,
    for use in its own unit tests.
    
    - `RandomDataGenerator.forType(DataType)` returns an `Option[() => Any]`
      that, if defined, contains a function for generating random values for
      the given DataType. The random values use the external representation
      of the given DataType (for example, for DateType we return
      `java.sql.Date` instances instead of longs). A standalone sketch of
      this pattern appears below.
    - `DataTypeTestUtilities` defines some convenience fields for looping
      over instances of data types. For example, `numericTypes` holds
      `DataType` instances for all supported numeric types. These constants
      will help us raise the level of abstraction in our tests; for example,
      it is now very easy to write a test that is parameterized by all common
      data types.
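
    To make the description above concrete, here is a minimal, self-contained
    Scala sketch of the `Option[() => Any]` pattern. It is not the code from
    this patch: the `DataType` case objects, `forType`, and `numericTypes`
    below are simplified, hypothetical stand-ins for the utilities the commit
    adds, and the real signatures and packages may differ.

        import scala.util.Random

        // Self-contained sketch only; simplified stand-ins, not the actual
        // Spark SQL classes added by this patch.
        object RandomDataGeneratorSketch {

          // Minimal stand-ins for Spark SQL's DataType hierarchy.
          sealed trait DataType
          case object IntegerType extends DataType
          case object DoubleType  extends DataType
          case object DateType    extends DataType
          case object BinaryType  extends DataType // no generator in this sketch

          // Hypothetical stand-in for RandomDataGenerator.forType(DataType):
          // if a generator is defined for the type, return a function that
          // produces random values in the type's external representation.
          def forType(dataType: DataType,
                      rand: Random = new Random()): Option[() => Any] =
            dataType match {
              case IntegerType => Some(() => rand.nextInt())
              case DoubleType  => Some(() => rand.nextDouble())
              case DateType    =>
                // External representation: java.sql.Date rather than a raw
                // long (a random day within ~100 years of the Unix epoch).
                Some(() => new java.sql.Date(rand.nextInt(36500).toLong * 86400000L))
              case _           => None
            }

          // Stand-in for the convenience fields described for
          // DataTypeTestUtilities.
          val numericTypes: Seq[DataType] = Seq(IntegerType, DoubleType)

          def main(args: Array[String]): Unit = {
            (numericTypes :+ DateType :+ BinaryType).foreach { dt =>
              forType(dt) match {
                case Some(generate) => println(s"$dt -> ${Seq.fill(3)(generate())}")
                case None           => println(s"$dt -> no generator defined")
              }
            }
          }
        }

    Running the sketch prints a few sampled values for each type and a
    "no generator defined" line for types without one, which is roughly how a
    test parameterized over `numericTypes` would consume the real utility.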

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/JoshRosen/spark sql-random-data-generators

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/7176.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #7176
    
----
commit d2b4a4a9a2139b1a6c2be5d1f1aa3d98a6c9ed99
Author: Josh Rosen <[email protected]>
Date:   2015-07-02T03:18:05Z

    Add random data generator test utilities to Spark SQL.

----

