[ https://issues.apache.org/jira/browse/SPARK-23901?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16437401#comment-16437401 ]
Marco Gaido edited comment on SPARK-23901 at 4/16/18 12:19 PM:
---------------------------------------------------------------

Actually I am facing some issues in the implementation. I have a couple of questions:

1 - In the mask function, Hive accepts only constant values as parameters (other than the main string to mask). Shall we enforce this in Spark too?

2 - Although the documentation says these methods accept string parameters, they actually accept essentially any type, and each type is handled differently. Shall we reproduce the same Hive behavior, or shall we support only String?

3 - Moreover, and this is connected with point 2, Hive accepts many more parameters than the ones listed in the documentation. Shall we support those too?
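For context on the semantics under discussion, the Hive documentation describes mask() as replacing uppercase letters with "X", lowercase letters with "x", and digits with "n" by default, with optional parameters overriding those replacement characters. A minimal Python sketch of that documented default behavior (not Spark's or Hive's actual implementation; the helper names and keyword parameters here are illustrative only):

```python
def mask(s, upper="X", lower="x", digit="n"):
    """Sketch of Hive-style mask(): replace uppercase letters,
    lowercase letters, and digits with fixed replacement characters.
    Defaults mirror Hive's documented defaults (X, x, n)."""
    out = []
    for ch in s:
        if ch.isupper():
            out.append(upper)
        elif ch.islower():
            out.append(lower)
        elif ch.isdigit():
            out.append(digit)
        else:
            out.append(ch)  # punctuation and other characters pass through
    return "".join(out)


def mask_first_n(s, n=4, **kw):
    """Sketch of mask_first_n(): mask only the first n characters."""
    return mask(s[:n], **kw) + s[n:]
```

For example, mask("abcd-EFGH-8765") would yield "xxxx-XXXX-nnnn" under these rules. The open questions above (constant-only parameters, non-string argument handling, extra undocumented parameters) are exactly the points where a Spark port could diverge from this simple sketch.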
> Data Masking Functions
> ----------------------
>
>                 Key: SPARK-23901
>                 URL: https://issues.apache.org/jira/browse/SPARK-23901
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 2.3.0
>            Reporter: Xiao Li
>            Priority: Major
>
> - mask()
> - mask_first_n()
> - mask_last_n()
> - mask_hash()
> - mask_show_first_n()
> - mask_show_last_n()
>
> Reference:
> [1] https://cwiki.apache.org/confluence/display/Hive/LanguageManual+UDF#LanguageManualUDF-DataMaskingFunctions
> [2] https://issues.apache.org/jira/browse/HIVE-13568

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)