[GitHub] spark pull request #21318: [minor] Update docs for functions.scala to make i...

2018-07-27 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/spark/pull/21318


---




[GitHub] spark pull request #21318: [minor] Update docs for functions.scala to make i...

2018-07-27 Thread rxin
Github user rxin commented on a diff in the pull request:

https://github.com/apache/spark/pull/21318#discussion_r20582
  
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/functions.scala ---
@@ -39,7 +39,21 @@ import org.apache.spark.util.Utils
 
 
 /**
- * Functions available for DataFrame operations.
+ * Commonly used functions available for DataFrame operations. Using functions defined here provides
+ * a little bit more compile-time safety to make sure the function exists.
+ *
+ * Spark also includes more built-in functions that are less common and are not defined here.
+ * You can still access them (and all the functions defined here) using the [[functions.expr()]] API
+ * and calling them through a SQL expression string. You can find the entire list of functions for
+ * the latest version of Spark at [[https://spark.apache.org/docs/latest/api/sql/index.html]].
--- End diff --

It's just a lot of work and I'm sure we will forget to update it ... so I'm pointing to the latest.



---




[GitHub] spark pull request #21318: [minor] Update docs for functions.scala to make i...

2018-05-14 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/spark/pull/21318#discussion_r187857692
  
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/functions.scala ---
@@ -39,7 +39,21 @@ import org.apache.spark.util.Utils
 
 
 /**
- * Functions available for DataFrame operations.
+ * Commonly used functions available for DataFrame operations. Using functions defined here provides
+ * a little bit more compile-time safety to make sure the function exists.
+ *
+ * Spark also includes more built-in functions that are less common and are not defined here.
+ * You can still access them (and all the functions defined here) using the [[functions.expr()]] API
+ * and calling them through a SQL expression string. You can find the entire list of functions for
+ * the latest version of Spark at [[https://spark.apache.org/docs/latest/api/sql/index.html]].
+ *
+ * As an example, `isnan` is a function that is defined here. You can use `isnan(col("myCol"))`
+ * to invoke the isnan function. This way the programming language's compiler ensures isnan exists
+ * and is of the proper form. You can also use `expr("isnan(myCol)")` function to invoke the same
+ * function. In this case, Spark itself will ensure isnan exists when it analyzes the query.
--- End diff --

nit: `isnan`
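
For reference, a minimal runnable sketch of the two invocation styles described in the quoted Scaladoc; the sample DataFrame, the column name `myCol`, and the app name are made up for illustration:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, expr, isnan}

object IsNanExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("isnan-example")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Example data: one NaN value and two ordinary doubles.
    val df = Seq(Double.NaN, 1.0, 2.5).toDF("myCol")

    // Typed API: the Scala compiler verifies that `isnan` exists and takes a Column.
    df.select(isnan(col("myCol"))).show()

    // SQL expression string: Spark's analyzer resolves `isnan` when it analyzes the query.
    df.select(expr("isnan(myCol)")).show()

    spark.stop()
  }
}
```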


---




[GitHub] spark pull request #21318: [minor] Update docs for functions.scala to make i...

2018-05-13 Thread HyukjinKwon
Github user HyukjinKwon commented on a diff in the pull request:

https://github.com/apache/spark/pull/21318#discussion_r187843889
  
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/functions.scala ---
@@ -39,7 +39,21 @@ import org.apache.spark.util.Utils
 
 
 /**
- * Functions available for DataFrame operations.
+ * Commonly used functions available for DataFrame operations. Using functions defined here provides
+ * a little bit more compile-time safety to make sure the function exists.
+ *
+ * Spark also includes more built-in functions that are less common and are not defined here.
+ * You can still access them (and all the functions defined here) using the [[functions.expr()]] API
+ * and calling them through a SQL expression string. You can find the entire list of functions for
+ * the latest version of Spark at [[https://spark.apache.org/docs/latest/api/sql/index.html]].
+ *
+ * As an example, `isnan` is a function that is defined here. You can use `isnan(col("myCol"))`
+ * to invoke the isnan function. This way the programming language's compiler ensures isnan exists
--- End diff --

nit: `isnan` -> `` `isnan` ``


---




[GitHub] spark pull request #21318: [minor] Update docs for functions.scala to make i...

2018-05-13 Thread HyukjinKwon
Github user HyukjinKwon commented on a diff in the pull request:

https://github.com/apache/spark/pull/21318#discussion_r187843283
  
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/functions.scala ---
@@ -39,7 +39,21 @@ import org.apache.spark.util.Utils
 
 
 /**
- * Functions available for DataFrame operations.
+ * Commonly used functions available for DataFrame operations. Using functions defined here provides
--- End diff --

Maybe I am caring about this too much, but I hope we don't end up arguing over whether a given function is common or not ...


---




[GitHub] spark pull request #21318: [minor] Update docs for functions.scala to make i...

2018-05-13 Thread HyukjinKwon
Github user HyukjinKwon commented on a diff in the pull request:

https://github.com/apache/spark/pull/21318#discussion_r187843125
  
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/functions.scala ---
@@ -39,7 +39,21 @@ import org.apache.spark.util.Utils
 
 
 /**
- * Functions available for DataFrame operations.
+ * Commonly used functions available for DataFrame operations. Using functions defined here provides
+ * a little bit more compile-time safety to make sure the function exists.
+ *
+ * Spark also includes more built-in functions that are less common and are not defined here.
+ * You can still access them (and all the functions defined here) using the [[functions.expr()]] API
+ * and calling them through a SQL expression string. You can find the entire list of functions for
+ * the latest version of Spark at [[https://spark.apache.org/docs/latest/api/sql/index.html]].
--- End diff --

@rxin, it's rather a nit, but shouldn't we have to update the link for each release, since it always points to the latest?


---




[GitHub] spark pull request #21318: [minor] Update docs for functions.scala to make i...

2018-05-13 Thread HyukjinKwon
Github user HyukjinKwon commented on a diff in the pull request:

https://github.com/apache/spark/pull/21318#discussion_r187842050
  
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/functions.scala ---
@@ -39,7 +39,21 @@ import org.apache.spark.util.Utils
 
 
 /**
- * Functions available for DataFrame operations.
+ * Commonly used functions available for DataFrame operations. Using functions defined here provides
+ * a little bit more compile-time safety to make sure the function exists.
+ *
+ * Spark also includes more built-in functions that are less common and are not defined here.
+ * You can still access them (and all the functions defined here) using the [[functions.expr()]] API
--- End diff --

```
[error] /home/jenkins/workspace/SparkPullRequestBuilder/sql/core/target/java/org/apache/spark/sql/functions.java:7: error: unexpected text
[error]  * You can still access them (and all the functions defined here) using the {@link functions.expr()} API
[error]                                                                              ^
[error] /home/jenkins/workspace/SparkPullRequestBuilder/sql/core/target/java/org/apache/spark/sql/functions.java:9: error: unexpected text
[error]  * the latest version of Spark at {@link https://spark.apache.org/docs/latest/api/sql/index.html}.
[error]                                   ^
```

It seems both links are the problem in the Javadoc build. Shall we just use `` `functions.expr()` `` and leave `https://spark.apache.org/docs/latest/api/sql/index.html` as a plain URL, without `[[...]]`?
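
For illustration, a sketch of how the quoted Scaladoc header could look along the lines of this suggestion, with `functions.expr()` in backticks and the URL left bare so that the generated Java sources under `target/java/` (shown in the error above) no longer contain `{@link ...}` tags. The wording is taken from the quoted diff; the markup change is only an assumption about the eventual fix, not the merged text:

```scala
/**
 * Commonly used functions available for DataFrame operations. Using functions defined here provides
 * a little bit more compile-time safety to make sure the function exists.
 *
 * Spark also includes more built-in functions that are less common and are not defined here.
 * You can still access them (and all the functions defined here) using the `functions.expr()` API
 * and calling them through a SQL expression string. You can find the entire list of functions for
 * the latest version of Spark at https://spark.apache.org/docs/latest/api/sql/index.html.
 */
```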


---




[GitHub] spark pull request #21318: [minor] Update docs for functions.scala to make i...

2018-05-13 Thread rxin
GitHub user rxin opened a pull request:

https://github.com/apache/spark/pull/21318

[minor] Update docs for functions.scala to make it clear not all the built-in functions are defined there

The title summarizes the change.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/rxin/spark functions

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/21318.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #21318


commit 83c191fbbe82bf49c81a860f4f1ebde7a4076f00
Author: Reynold Xin 
Date:   2018-05-14T05:15:56Z

[minor] Update docs for functions.scala




---
