IGNITE-7643: broken javadoc.

this closes #3480


Project: http://git-wip-us.apache.org/repos/asf/ignite/repo
Commit: http://git-wip-us.apache.org/repos/asf/ignite/commit/7b37f648
Tree: http://git-wip-us.apache.org/repos/asf/ignite/tree/7b37f648
Diff: http://git-wip-us.apache.org/repos/asf/ignite/diff/7b37f648

Branch: refs/heads/ignite-7485-2
Commit: 7b37f6481112e09da231a13c926d67002437c36f
Parents: b50aa5e
Author: YuriBabak <y.ch...@gmail.com>
Authored: Wed Feb 7 12:48:06 2018 +0300
Committer: Yury Babak <yba...@gridgain.com>
Committed: Wed Feb 7 12:48:06 2018 +0300

----------------------------------------------------------------------
 .../ml/dataset/AlgorithmSpecificDatasetExample.java     | 12 +++++++-----
 1 file changed, 7 insertions(+), 5 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/ignite/blob/7b37f648/examples/src/main/java/org/apache/ignite/examples/ml/dataset/AlgorithmSpecificDatasetExample.java
----------------------------------------------------------------------
diff --git a/examples/src/main/java/org/apache/ignite/examples/ml/dataset/AlgorithmSpecificDatasetExample.java b/examples/src/main/java/org/apache/ignite/examples/ml/dataset/AlgorithmSpecificDatasetExample.java
index 98f85cd..b693dfa 100644
--- a/examples/src/main/java/org/apache/ignite/examples/ml/dataset/AlgorithmSpecificDatasetExample.java
+++ b/examples/src/main/java/org/apache/ignite/examples/ml/dataset/AlgorithmSpecificDatasetExample.java
@@ -42,11 +42,13 @@ import org.apache.ignite.ml.dataset.primitive.data.SimpleLabeledDatasetData;
  * {@link DatasetWrapper}) in a sequential manner.
  *
  * In this example we need to implement gradient descent. This is iterative method that involves calculation of gradient
- * on every step. In according with the common idea we defines {@link AlgorithmSpecificDataset} - extended version
- * of {@code Dataset} with {@code gradient} method. As result our gradient descent method looks like a simple loop where
- * every iteration includes call of the {@code gradient} method. In the example we want to keep iteration number as well
- * for logging. Iteration number cannot be recovered from the {@code upstream} data and we need to keep it in the custom
- * partition {@code context} which is represented by {@link AlgorithmSpecificPartitionContext} class.
+ * on every step. In accordance with the common idea, we define
+ * {@link AlgorithmSpecificDatasetExample.AlgorithmSpecificDataset}, an extended version of {@code Dataset} with a
+ * {@code gradient} method. As a result, our gradient descent method looks like a simple loop where every iteration
+ * includes a call of the {@code gradient} method. In the example we also want to keep the iteration number for logging.
+ * The iteration number cannot be recovered from the {@code upstream} data, so we need to keep it in the custom
+ * partition {@code context}, which is represented by the
+ * {@link AlgorithmSpecificDatasetExample.AlgorithmSpecificPartitionContext} class.
  */
 public class AlgorithmSpecificDatasetExample {
     /** Run example. */

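The javadoc above describes a pattern: wrap a plain labeled dataset in an algorithm-specific class that adds a {@code gradient} method, so gradient descent itself collapses into a simple loop, with an iteration counter kept alongside the data. The standalone sketch below illustrates that pattern only; it is not the Ignite API, and the class names (GradientSketch, SimpleLabeledData, AlgorithmSpecificDataset) are simplified stand-ins for the distributed Dataset/DatasetWrapper machinery in the actual example.

```java
/**
 * Illustrative sketch (NOT the Ignite API): an algorithm-specific "dataset"
 * exposing a gradient() method, so gradient descent is just a loop.
 */
public class GradientSketch {
    /** Plain data holder standing in for a partition of labeled vectors (1-D for brevity). */
    static class SimpleLabeledData {
        final double[] x; // features
        final double[] y; // labels
        SimpleLabeledData(double[] x, double[] y) { this.x = x; this.y = y; }
    }

    /** Wrapper that adds the algorithm-specific gradient() method on top of the data. */
    static class AlgorithmSpecificDataset {
        private final SimpleLabeledData data;
        private int iteration; // kept here, like the partition context in the example

        AlgorithmSpecificDataset(SimpleLabeledData data) { this.data = data; }

        /** Gradient of mean squared error for the model y = w * x. */
        double gradient(double w) {
            iteration++;
            double g = 0.0;
            for (int i = 0; i < data.x.length; i++)
                g += 2.0 * (w * data.x[i] - data.y[i]) * data.x[i];
            return g / data.x.length;
        }

        int iteration() { return iteration; }
    }

    public static void main(String[] args) {
        SimpleLabeledData data = new SimpleLabeledData(
            new double[] {1, 2, 3, 4}, new double[] {2, 4, 6, 8}); // y = 2x
        AlgorithmSpecificDataset dataset = new AlgorithmSpecificDataset(data);

        double w = 0.0, lr = 0.05;
        for (int i = 0; i < 200; i++)       // the "simple loop" from the javadoc
            w -= lr * dataset.gradient(w);  // w converges toward 2.0

        System.out.println("w = " + w + " after " + dataset.iteration() + " iterations");
    }
}
```

In the real example, the gradient computation is distributed across partitions and the iteration counter lives in an AlgorithmSpecificPartitionContext rather than in a field, but the shape of the training loop is the same.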