mbutrovich opened a new pull request, #1697:
URL: https://github.com/apache/datafusion-comet/pull/1697
## Which issue does this PR close?
Closes #.
## Rationale for this change
Spark SQL's two-argument `decode` converts a binary column to a string using
one of the encodings 'US-ASCII', 'ISO-8859-1', 'UTF-8', 'UTF-16BE',
'UTF-16LE', or 'UTF-16':
https://github.com/apache/spark/blob/ef336ad51ba62e64a77896b9da12791051fa92cb/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/stringExpressions.scala#L3028.
Comet currently has no support for this expression; this PR adds 'UTF-8'
support.
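The contract described above can be sketched in plain Python. This is an
illustrative model only, not Spark's or Comet's implementation; the mapping
table and function name are assumptions based on the charset list above.

```python
# Illustrative sketch of decode(expr, charset): a binary value plus one of
# the six supported charset names yields a string. The Python codec names
# are an assumed mapping, not taken from Spark's source.
SUPPORTED_CHARSETS = {
    "US-ASCII": "ascii",
    "ISO-8859-1": "latin-1",
    "UTF-8": "utf-8",
    "UTF-16BE": "utf-16-be",
    "UTF-16LE": "utf-16-le",
    "UTF-16": "utf-16",
}

def decode(value: bytes, charset: str) -> str:
    """Decode a binary value using one of the six supported charsets."""
    codec = SUPPORTED_CHARSETS.get(charset.upper())
    if codec is None:
        raise ValueError(f"unsupported charset: {charset}")
    return value.decode(codec)
```

This PR only translates the 'UTF-8' branch of this table; the other five
encodings fall back to Spark.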
## What changes are included in this PR?
Convert a `StringDecode` expression to a `Cast`, but only for the 'UTF-8'
case. This scenario maps nicely to an Arrow cast with the safe option, which
replaces invalid values with nulls:
https://github.com/apache/arrow-rs/blob/07093a49eface9be9208dd427b810abba8d0a755/arrow-cast/src/cast/string.rs#L342
Other encodings are currently unsupported; however, 'US-ASCII' might be easy
to convert to the `ascii` scalar function. I need to confirm that the
semantics match.
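The safe-cast behavior this PR relies on can be modeled in a few lines of
Python. This is a minimal sketch of the semantics, not Arrow's code: invalid
UTF-8 input becomes null rather than raising an error. The function name is
hypothetical.

```python
from typing import Optional

def safe_utf8_decode(value: Optional[bytes]) -> Optional[str]:
    """Mimic an Arrow binary->string cast with safe=true: invalid UTF-8
    sequences produce null (None) instead of an error."""
    if value is None:
        return None
    try:
        return value.decode("utf-8")
    except UnicodeDecodeError:
        return None
```

Valid UTF-8 round-trips unchanged, while a byte sequence such as `b"\xff"`
(never valid in UTF-8) yields null, matching the safe-cast behavior linked
above.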
## How are these changes tested?
New fuzz test and existing Spark SQL tests.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]