[GitHub] spark pull request #17477: [SPARK-18692][BUILD][DOCS] Test Java 8 unidoc bui...

2017-04-12 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/spark/pull/17477


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request #17477: [SPARK-18692][BUILD][DOCS] Test Java 8 unidoc bui...

2017-04-12 Thread HyukjinKwon
Github user HyukjinKwon commented on a diff in the pull request:

https://github.com/apache/spark/pull/17477#discussion_r111087424
  
--- Diff: core/src/main/scala/org/apache/spark/rpc/RpcEndpoint.scala ---
@@ -35,7 +35,7 @@ private[spark] trait RpcEnvFactory {
  *
  * The life-cycle of an endpoint is:
  *
- * constructor -> onStart -> receive* -> onStop
+ * {@code constructor -> onStart -> receive* -> onStop}
--- End diff --

After this, it produces the documentation as below (manually tested)

**Scaladoc**

![2017-04-12 5 08 09](https://cloud.githubusercontent.com/assets/6477701/24947668/a9cabad0-1fa2-11e7-84a9-d08ba0eba621.png)

**Javadoc**

![2017-04-12 5 07 58](https://cloud.githubusercontent.com/assets/6477701/24947667/a9c8628a-1fa2-11e7-9e9c-40b51daa43f0.png)

This also does not seem to be exposed in the API documentation anyway.
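As an editor's aside (a minimal sketch, not part of the original thread; the class name is hypothetical): Java 8's doclint parses Javadoc comments as HTML, so a bare `->` arrow can be rejected as malformed markup, while wrapping the text in `{@code ...}` side-steps the check.

```java
/**
 * A hypothetical endpoint illustrating the fix discussed above.
 *
 * The life-cycle of an endpoint is:
 *
 * {@code constructor -> onStart -> receive* -> onStop}
 *
 * Without the wrapper, JDK 8's doclint can reject the bare arrow as
 * malformed HTML and fail the javadoc build.
 */
public class Endpoint {
    public static void main(String[] args) {
        // Placeholder entry point; the point of this file is the comment above.
        System.out.println("lifecycle documented");
    }
}
```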





[GitHub] spark pull request #17477: [SPARK-18692][BUILD][DOCS] Test Java 8 unidoc bui...

2017-04-12 Thread HyukjinKwon
Github user HyukjinKwon commented on a diff in the pull request:

https://github.com/apache/spark/pull/17477#discussion_r111086919
  
--- Diff: sql/hive-thriftserver/src/main/java/org/apache/hive/service/auth/HttpAuthUtils.java ---
@@ -89,7 +89,7 @@ public static String getKerberosServiceTicket(String principal, String host,
* @param clientUserName Client User name.
* @return An unsigned cookie token generated from input parameters.
* The final cookie generated is of the following format :
-   * cu=<username>&rn=<randomNumber>&s=<cookieSignature>
+   * {@code cu=<username>&rn=<randomNumber>&s=<cookieSignature>}
--- End diff --

This is Java code, so `@code` should be fine. This also does not seem to be exposed in the documentation anyway.





[GitHub] spark pull request #17477: [SPARK-18692][BUILD][DOCS] Test Java 8 unidoc bui...

2017-04-12 Thread HyukjinKwon
Github user HyukjinKwon commented on a diff in the pull request:

https://github.com/apache/spark/pull/17477#discussion_r111086738
  
--- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala ---
@@ -296,7 +296,7 @@ trait MesosSchedulerUtils extends Logging {
 
   /**
* Parses the attributes constraints provided to spark and build a matching data struct:
-   *  Map[<attribute-name>, Set[values-to-match]]
+   *  {@literal Map[<attribute-name>, Set[values-to-match]}
--- End diff --

Same issue as https://github.com/apache/spark/pull/17477/files#r111086455.

- `@code`

  ![2017-04-12 4 54 57](https://cloud.githubusercontent.com/assets/6477701/24947571/4d299986-1fa2-11e7-8443-ebca9ce0e5bc.png)

- `@literal`

  ![2017-04-12 4 55 19](https://cloud.githubusercontent.com/assets/6477701/24947572/4d2e4490-1fa2-11e7-9dbb-e2f879df9f5a.png)







[GitHub] spark pull request #17477: [SPARK-18692][BUILD][DOCS] Test Java 8 unidoc bui...

2017-04-12 Thread HyukjinKwon
Github user HyukjinKwon commented on a diff in the pull request:

https://github.com/apache/spark/pull/17477#discussion_r111086455
  
--- Diff: core/src/main/scala/org/apache/spark/scheduler/TaskSchedulerImpl.scala ---
@@ -704,12 +704,12 @@ private[spark] object TaskSchedulerImpl {
* Used to balance containers across hosts.
*
* Accepts a map of hosts to resource offers for that host, and returns a prioritized list of
-   * resource offers representing the order in which the offers should be used.  The resource
+   * resource offers representing the order in which the offers should be used. The resource
* offers are ordered such that we'll allocate one container on each host before allocating a
* second container on any host, and so on, in order to reduce the damage if a host fails.
*
-   * For example, given <h1, [o1, o2, o3]>, <h2, [o4]>, <h3, [o5, o6]>, returns
-   * [o1, o5, o4, 02, o6, o3]
+   * For example, given {@literal <h1, [o1, o2, o3]>}, {@literal <h2, [o4]>} and
+   * {@literal <h3, [o5, o6]>}, returns {@literal [o1, o5, o4, o2, o6, o3]}.
--- End diff --

It seems we can't use `@code` here if there are tag-like tokens such as `<h3, ...>` (it seems the `< A...>` case, with a space after `<`, looks fine). I ran some tests with the comments below:

```
 * For example, given {@code < h1, [o1, o2, o3] >}, {@code < h2, [o4]>} and {@code <h3, [o5, o6]>},
 * returns {@code [o1, o5, o4, o2, o6, o3]}.
 *
 * For example, given
 *
 * {@code <h1, [o1, o2, o3]>},
 *
 * {@code <h2, [o4]>},
 *
 * returns {@code [o1, o5, o4, o2, o6, o3]}.
```

**Scaladoc**

![2017-04-12 4 34 04](https://cloud.githubusercontent.com/assets/6477701/24947422/b02fc452-1fa1-11e7-90cb-55079edf6acb.png)


**Javadoc**

![2017-04-12 4 34 38](https://cloud.githubusercontent.com/assets/6477701/24947418/a9235f3e-1fa1-11e7-8aab-d7c41279a67a.png)

If we use `@literal`, it seems fine.

**Scaladoc**

![2017-04-12 4 46 54](https://cloud.githubusercontent.com/assets/6477701/24947467/e6ded6fa-1fa1-11e7-9cd7-a24c0b3778da.png)


**Javadoc**

![2017-04-12 4 46 43](https://cloud.githubusercontent.com/assets/6477701/24947470/e95768fc-1fa1-11e7-8bd4-1983208a87a6.png)

This does not seem to be exposed in the API documentation anyway.
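To summarize the distinction tested above (an editor's sketch with hypothetical comments, not from the thread): `{@literal ...}` escapes HTML-significant characters like `<` and `>` and renders in normal font, so doclint never sees a tag, while `{@code ...}` renders in code font; in the toolchain tested here, tag-like content (e.g. `<h3` with no space) still appeared to break `{@code}`.

```java
/**
 * Hypothetical examples of the two inline tags compared above.
 *
 * Renders in normal font, angle brackets escaped, safe for doclint:
 * {@literal <h1, [o1, o2, o3]>}
 *
 * Renders in code font; safe here because nothing in the content
 * resembles an HTML tag:
 * {@code [o1, o5, o4, o2, o6, o3]}
 */
public class InlineTagExamples {
    public static void main(String[] args) {
        // Placeholder entry point; the comparison lives in the comment above.
        System.out.println("literal vs code");
    }
}
```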





[GitHub] spark pull request #17477: [SPARK-18692][BUILD][DOCS] Test Java 8 unidoc bui...

2017-04-11 Thread JoshRosen
Github user JoshRosen commented on a diff in the pull request:

https://github.com/apache/spark/pull/17477#discussion_r111059960
  
--- Diff: mllib/src/main/scala/org/apache/spark/ml/classification/Classifier.scala ---
@@ -74,7 +74,7 @@ abstract class Classifier[
* and features (`Vector`).
* @param numClasses  Number of classes label can take.  Labels must be integers in the range
*[0, numClasses).
-   * @throws SparkException  if any label is not an integer >= 0
+   * @note Throws `SparkException` if any label is not an integer is greater than or equal to 0
--- End diff --

`is not a nonnegative integer`? 
http://mathworld.wolfram.com/NonnegativeInteger.html





[GitHub] spark pull request #17477: [SPARK-18692][BUILD][DOCS] Test Java 8 unidoc bui...

2017-04-11 Thread JoshRosen
Github user JoshRosen commented on a diff in the pull request:

https://github.com/apache/spark/pull/17477#discussion_r111059991
  
--- Diff: mllib/src/main/scala/org/apache/spark/ml/classification/Classifier.scala ---
@@ -74,7 +74,7 @@ abstract class Classifier[
* and features (`Vector`).
* @param numClasses  Number of classes label can take.  Labels must be integers in the range
*[0, numClasses).
-   * @throws SparkException  if any label is not an integer >= 0
+   * @note Throws `SparkException` if any label is not an integer is greater than or equal to 0
--- End diff --

Or `is a non-integer or is negative`?





[GitHub] spark pull request #17477: [SPARK-18692][BUILD][DOCS] Test Java 8 unidoc bui...

2017-04-11 Thread JoshRosen
Github user JoshRosen commented on a diff in the pull request:

https://github.com/apache/spark/pull/17477#discussion_r111059834
  
--- Diff: core/src/main/scala/org/apache/spark/scheduler/TaskSchedulerImpl.scala ---
@@ -704,12 +704,12 @@ private[spark] object TaskSchedulerImpl {
* Used to balance containers across hosts.
*
* Accepts a map of hosts to resource offers for that host, and returns a prioritized list of
-   * resource offers representing the order in which the offers should be used.  The resource
+   * resource offers representing the order in which the offers should be used. The resource
* offers are ordered such that we'll allocate one container on each host before allocating a
* second container on any host, and so on, in order to reduce the damage if a host fails.
*
-   * For example, given <h1, [o1, o2, o3]>, <h2, [o4]>, <h3, [o5, o6]>, returns
-   * [o1, o5, o4, 02, o6, o3]
+   * For example, given a map consisting of h1 to [o1, o2, o3], h2 to [o4] and h3 to [o5, o6],
+   * returns a list, [o1, o5, o4, o2, o6, o3].
--- End diff --

Can we also wrap this in code or otherwise escape it or use a different 
symbol?

```
{h1: [o1, o2, o3], h2: [o4], ...}
```

is clearer.





[GitHub] spark pull request #17477: [SPARK-18692][BUILD][DOCS] Test Java 8 unidoc bui...

2017-04-11 Thread JoshRosen
Github user JoshRosen commented on a diff in the pull request:

https://github.com/apache/spark/pull/17477#discussion_r111059727
  
--- Diff: core/src/main/scala/org/apache/spark/rpc/RpcEndpoint.scala ---
@@ -33,9 +33,9 @@ private[spark] trait RpcEnvFactory {
  *
  * It is guaranteed that `onStart`, `receive` and `onStop` will be called in sequence.
  *
- * The life-cycle of an endpoint is:
+ * The life-cycle of an endpoint is as below in an order:
--- End diff --

Can we just wrap this block as code? The rewording is confusing and doesn't 
read as clearly to me.





[GitHub] spark pull request #17477: [SPARK-18692][BUILD][DOCS] Test Java 8 unidoc bui...

2017-03-30 Thread HyukjinKwon
Github user HyukjinKwon commented on a diff in the pull request:

https://github.com/apache/spark/pull/17477#discussion_r108882566
  
--- Diff: core/src/main/scala/org/apache/spark/scheduler/TaskSchedulerImpl.scala ---
@@ -704,12 +704,12 @@ private[spark] object TaskSchedulerImpl {
* Used to balance containers across hosts.
*
* Accepts a map of hosts to resource offers for that host, and returns a prioritized list of
-   * resource offers representing the order in which the offers should be used.  The resource
+   * resource offers representing the order in which the offers should be used. The resource
* offers are ordered such that we'll allocate one container on each host before allocating a
* second container on any host, and so on, in order to reduce the damage if a host fails.
*
-   * For example, given <h1, [o1, o2, o3]>, <h2, [o4]>, <h3, [o5, o6]>, returns
-   * [o1, o5, o4, 02, o6, o3]
+   * For example, given a map consisting of h1 to [o1, o2, o3], h2 to [o4] and h3 to [o5, o6],
+   * returns a list, [o1, o5, o4, o2, o6, o3].
--- End diff --

There look to be a few typos here: `02` -> `o2` and `h1` -> `h3`.





[GitHub] spark pull request #17477: [SPARK-18692][BUILD][DOCS] Test Java 8 unidoc bui...

2017-03-30 Thread HyukjinKwon
Github user HyukjinKwon commented on a diff in the pull request:

https://github.com/apache/spark/pull/17477#discussion_r108857749
  
--- Diff: core/src/main/scala/org/apache/spark/scheduler/TaskSchedulerImpl.scala ---
@@ -704,12 +704,12 @@ private[spark] object TaskSchedulerImpl {
* Used to balance containers across hosts.
*
* Accepts a map of hosts to resource offers for that host, and returns a prioritized list of
-   * resource offers representing the order in which the offers should be used.  The resource
+   * resource offers representing the order in which the offers should be used. The resource
* offers are ordered such that we'll allocate one container on each host before allocating a
* second container on any host, and so on, in order to reduce the damage if a host fails.
*
-   * For example, given <h1, [o1, o2, o3]>, <h2, [o4]>, <h3, [o5, o6]>, returns
-   * [o1, o5, o4, 02, o6, o3]
+   * For example, given maps from h1 to [o1, o2, o3], from h2 to [o4] and from h3 to [o5, o6],
+   * returns a list, [o1, o5, o4, o2, o6, o3].
--- End diff --

There look to be a few typos here: `02` -> `o2` and `h1` -> `h3`.





[GitHub] spark pull request #17477: [SPARK-18692][BUILD][DOCS] Test Java 8 unidoc bui...

2017-03-30 Thread HyukjinKwon
Github user HyukjinKwon commented on a diff in the pull request:

https://github.com/apache/spark/pull/17477#discussion_r108853043
  
--- Diff: mllib/src/main/scala/org/apache/spark/ml/classification/Classifier.scala ---
@@ -74,7 +74,7 @@ abstract class Classifier[
* and features (`Vector`).
* @param numClasses  Number of classes label can take.  Labels must be integers in the range
*[0, numClasses).
-   * @throws SparkException  if any label is not an integer >= 0
+   * @note Throws `SparkException` if any label is not an integer is greater than or equal to 0
--- End diff --

This case throws an error as below:

```
[error] .../spark/mllib/target/java/org/apache/spark/ml/classification/Classifier.java:28: error: reference not found
[error]    * @throws SparkException  if any label is not an integer >= 0
[error]      ^
```
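For context (an editor's sketch, hypothetical class and method names, not from the thread): javadoc tries to resolve the exception type named in a `@throws` tag, so an unresolvable type on the javadoc classpath yields exactly this `reference not found` error under JDK 8 doclint, whereas prose (or a plain tag) avoids the lookup entirely.

```java
public class ClassifierDoc {
    /**
     * Hypothetical method illustrating the doclint error above.
     *
     * @throws RuntimeException if any label is not a nonnegative integer;
     *         a resolvable type like RuntimeException is fine, whereas an
     *         unresolvable one (e.g. SparkException when the dependency is
     *         missing from the javadoc classpath) triggers the
     *         "reference not found" error quoted above
     */
    public static void fit(int label) {
        if (label < 0) {
            throw new RuntimeException("label must be a nonnegative integer");
        }
    }

    public static void main(String[] args) {
        fit(3); // a valid label; no exception is thrown
        System.out.println("fit ok");
    }
}
```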





[GitHub] spark pull request #17477: [SPARK-18692][BUILD][DOCS] Test Java 8 unidoc bui...

2017-03-29 Thread HyukjinKwon
Github user HyukjinKwon commented on a diff in the pull request:

https://github.com/apache/spark/pull/17477#discussion_r108852891
  
--- Diff: mllib/src/test/scala/org/apache/spark/ml/PipelineSuite.scala ---
@@ -230,7 +230,9 @@ class PipelineSuite extends SparkFunSuite with MLlibTestSparkContext with Defaul
 }
 
 
-/** Used to test [[Pipeline]] with [[MLWritable]] stages */
+/**
+ * Used to test [[Pipeline]] with `MLWritable` stages
--- End diff --

We should avoid inline comments when there are code blocks (`` ` ... ` ``). See https://github.com/apache/spark/pull/16050





[GitHub] spark pull request #17477: [SPARK-18692][BUILD][DOCS] Test Java 8 unidoc bui...

2017-03-29 Thread HyukjinKwon
GitHub user HyukjinKwon opened a pull request:

https://github.com/apache/spark/pull/17477

[SPARK-18692][BUILD][DOCS] Test Java 8 unidoc build on Jenkins

## What changes were proposed in this pull request?

This PR proposes to run Spark unidoc to test the Javadoc 8 build, as Javadoc 8 is easily re-broken.

There are several problems with it:

- It introduces a little extra time to run the tests. In my case, it took 1.5 minutes more (`Elapsed :[94.8746569157]`). How it was tested is described in "How was this patch tested?".

- > One problem that I noticed was that Unidoc appeared to be processing 
test sources: if we can find a way to exclude those from being processed in the 
first place then that might significantly speed things up.

  (see  @joshrosen's 
[comment](https://issues.apache.org/jira/browse/SPARK-18692?focusedCommentId=15947627&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-15947627))

To complete this automated build, this PR also fixes existing Javadoc breaks and the ones introduced by test code as described above.

These fixes are similar to instances previously fixed. Please refer to https://github.com/apache/spark/pull/15999 and https://github.com/apache/spark/pull/16013

Note that this only fixes **errors**, not **warnings**. Please see my observation in https://github.com/apache/spark/pull/17389#issuecomment-288438704 about spurious errors among the warnings.

## How was this patch tested?

Manually, via `jekyll build` for the documentation build tests. Also tested by running `dev/run-tests.py`.

This was tested via manually adding `time.time()` as below:

```diff
 profiles_and_goals = build_profiles + sbt_goals

 print("[info] Building Spark unidoc (w/Hive 1.2.1) using SBT with 
these arguments: ",
   " ".join(profiles_and_goals))

+import time
+st = time.time()
 exec_sbt(profiles_and_goals)
+print("Elapsed :[%s]" % str(time.time() - st))
```

produces

```
...

Building Unidoc API Documentation

...
[info] Main Java API documentation successful.
...
Elapsed :[94.8746569157]
...
...

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/HyukjinKwon/spark SPARK-18692

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/17477.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #17477


commit 7ddb6eb11ed17c87355826db5bf2512b785042f5
Author: hyukjinkwon 
Date:   2017-03-30T03:55:09Z

Test Java 8 unidoc build on Jenkins



