Do the functions have non-standard syntax? If so, they will probably
have to go in Babel. If not, they could go into core (and
SqlLibraryOperators).

Please log a JIRA case for your work. The functions will need to be
specified at some point, and the earlier the better.

The SqlOperatorTest format is a bit better than other formats: it
allows you to test validation (e.g. the number and types of arguments)
and evaluation in one place.

In CALCITE-4885 I introduced fluent test fixtures, so it is now
possible to write tests in the SqlOperatorTest style anywhere in the
code; you just need to call 'Fixtures.forOperators(true)'. If your
functions need to be in Babel (because of their syntax; see above),
maybe it's time to add a BabelOperatorTest.
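To make that concrete, here is a minimal sketch of what such a test
might look like. The function name 'myfun', the error-message pattern,
and the expected result are all illustrative; the fixture and check
methods follow the SqlOperatorTest style, but verify the exact API
against your Calcite version:

```java
// Hypothetical test sketch using the fluent fixture from CALCITE-4885.
// 'myfun' and the expected values are placeholders, not a real function.
import org.apache.calcite.sql.test.SqlOperatorFixture;
import org.apache.calcite.test.Fixtures;
import org.junit.jupiter.api.Test;

class MyOperatorTest {
  @Test void testMyFunction() {
    // 'true' requests the "safe" fixture variant, per the email above.
    SqlOperatorFixture f = Fixtures.forOperators(true);

    // Validation: a call with the wrong number of arguments should
    // fail to validate (the ^...^ carets mark the offending token).
    f.checkFails("^myfun()^", "Invalid number of arguments.*", false);

    // Evaluation: check the result value and type in one place.
    f.checkScalar("myfun(2, 3)", 5, "INTEGER NOT NULL");
  }
}
```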

For BigQuery compatibility, we have been adding queries to
big-query.iq (see CALCITE-5269 [1]). That's a pattern you could follow
for Spark compatibility.
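For reference, an entry in such a Quidem script pairs a query with its
expected output, followed by an '!ok' directive. A sketch of what a
hypothetical spark.iq entry might look like (the query is illustrative;
see big-query.iq for the real conventions):

```
# Hypothetical spark.iq entry (Quidem script format).
SELECT 1 + 1;
+--------+
| EXPR$0 |
+--------+
| 2      |
+--------+
(1 row)

!ok
```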

Julian

[1] 
https://github.com/apache/calcite/commit/a505b25eacc473c6ec0ef8abd40c1ccae86297b6


On Wed, Sep 28, 2022 at 8:53 AM James Scicluna
<[email protected]> wrote:
>
> Dear Apache Calcite Team,
>
> I’m planning to add several function definitions for Spark and Hive in 
> SqlLibraryOperators for the Babel SQL parser to use. To test these, would it 
> be enough to just add unit tests in BabelTest (as opposed to SqlOperatorTest) 
> using checkSqlResult?
>
> Thanks,
>
> James
