srielau commented on code in PR #49210:
URL: https://github.com/apache/spark/pull/49210#discussion_r1896216403
##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/randomExpressions.scala:
##########
@@ -330,7 +330,8 @@ object Uniform {
group = "string_funcs")
case class RandStr(
    length: Expression, override val seedExpression: Expression, hideSeed: Boolean)
-  extends ExpressionWithRandomSeed with BinaryLike[Expression] with Nondeterministic {
+  extends ExpressionWithRandomSeed with BinaryLike[Expression] with Nondeterministic
+  with ExpectsInputTypes {
Review Comment:
This is a really interesting question.
At a minimum I think any function should accept the widest possible type of
the type family it expects.
E.g. there is nothing wrong with passing a DECIMAL to a function accepting
an integral numeric, or a TINYINT where a BIGINT is expected.
The exception is if we expect that later on we may want to overload the
function and have it behave differently.
E.g. imagine we have ONLY +(BIGINT, BIGINT). We would NOT want DECIMAL to
cast to BIGINT, because later we will want +(DECIMAL, DECIMAL) etc.
Cross-casting, e.g. from a STRING to a number or a number to a STRING, is
much more a matter of taste (duck-casting).
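For concreteness, roughly what I have in mind (a hedged sketch; the concrete
types are my assumption, not a prescription for this PR). Note that in
Catalyst, plain `ExpectsInputTypes` only type-checks, while
`ImplicitCastInputTypes` (which extends it) additionally lets the
`ImplicitTypeCasts` coercion rule insert the widening casts:

```scala
// Sketch only; IntegerType/LongType are assumed, not what this PR must use.
// With ImplicitCastInputTypes mixed in, the analyzer casts inputs toward the
// declared types, so e.g. a TINYINT length or an INT seed would widen here
// instead of failing resolution.
import org.apache.spark.sql.types.{AbstractDataType, IntegerType, LongType}

// inside RandStr, mixing in ImplicitCastInputTypes:
override def inputTypes: Seq[AbstractDataType] = Seq(IntegerType, LongType)
```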