[jira] [Comment Edited] (SPARK-21653) Complement SQL expression document
[ https://issues.apache.org/jira/browse/SPARK-21653?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16174514#comment-16174514 ]

Liang-Chi Hsieh edited comment on SPARK-21653 at 9/21/17 9:55 AM:
------------------------------------------------------------------

I just went through all the SQL expressions. It looks like all of them now have an expression description including usage and examples, except for CASE WHEN, unless I missed any. I created a trivial ticket for CASE WHEN.

was (Author: viirya):
I just went through all the SQL expressions. It looks like all of them now have an expression description, except for CASE WHEN, unless I missed any. I created a trivial ticket for CASE WHEN.

> Complement SQL expression document
> ----------------------------------
>
>                 Key: SPARK-21653
>                 URL: https://issues.apache.org/jira/browse/SPARK-21653
>             Project: Spark
>          Issue Type: Umbrella
>          Components: SQL
>    Affects Versions: 2.2.0
>            Reporter: Liang-Chi Hsieh
>
> We have {{ExpressionDescription}} for SQL expressions. The expression
> description gives an expression's usage, arguments, and examples. Users
> can learn how to use these expressions via the {{DESCRIBE}} command in SQL:
> {code}
> spark-sql> DESCRIBE FUNCTION EXTENDED In;
> Function: in
> Class: org.apache.spark.sql.catalyst.expressions.In
> Usage: expr1 in(expr2, expr3, ...) - Returns true if `expr` equals to any valN.
> Extended Usage:
>     No example/argument for in.
> {code}
> Not all SQL expressions have a complete description yet. For example, in the
> above case there is no example for the function {{in}}. This task will
> complement the expression descriptions.

--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
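The {{DESCRIBE FUNCTION EXTENDED}} output quoted above shows {{in}} with no example. As a hedged illustration of the kind of structured description the ticket asks for, here is a minimal, self-contained sketch: {{ExpressionDoc}} is a hypothetical stand-in defined locally, not Catalyst's real {{ExpressionDescription}} annotation, and the example usage/examples strings are assumptions modeled on the quoted output, not copied from Spark source.

```python
# Hypothetical, simplified stand-in for Catalyst's ExpressionDescription
# (the real one is an annotation in org.apache.spark.sql.catalyst.expressions;
# its exact fields may differ). Defined locally so the sketch runs on its own.
from dataclasses import dataclass


@dataclass
class ExpressionDoc:
    usage: str      # one-line signature and behavior summary
    examples: str   # runnable SQL snippets with expected results

# What a filled-in description for `in` might carry (values assumed for
# illustration, modeled on the DESCRIBE output quoted above).
in_doc = ExpressionDoc(
    usage="expr1 in(expr2, expr3, ...) - Returns true if `expr1` equals any exprN.",
    examples=(
        "Examples:\n"
        "  > SELECT 1 in(1, 2, 3);\n"
        "   true"
    ),
)

print(in_doc.usage)
print(in_doc.examples)
```

With both fields populated, a {{DESCRIBE FUNCTION EXTENDED}} call would have something to render under "Extended Usage" instead of "No example/argument for in."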
[jira] [Comment Edited] (SPARK-21653) Complement SQL expression document
[ https://issues.apache.org/jira/browse/SPARK-21653?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16116324#comment-16116324 ]

Hyukjin Kwon edited comment on SPARK-21653 at 8/7/17 11:37 AM:
---------------------------------------------------------------

Yes, there was some discussion in my PR about adding arguments, IIRC, around correctness and adding tests accordingly. I am still in favor of describing arguments, as long as they look mostly correct in general and the examples produce the expected results, because this is information that did not exist before, and we now generate documentation for the SQL built-in functions. I am willing to push this forward if there are no strong objections now.

was (Author: hyukjin.kwon):
Yes, there was some discussion about adding arguments in my PR. I am still in favor of describing arguments, as long as they look mostly correct in general and the examples produce the expected results, because this is information that did not exist before, and we now generate documentation for the SQL built-in functions. I am willing to push this forward if there are no strong objections now.