Build failed in Jenkins: flink-snapshot-deployment #103

2016-05-23 Thread Apache Jenkins Server
See 

--
[...truncated 712 lines...]
[INFO] --- maven-surefire-plugin:2.19.1:test (integration-tests) @ 
flink-shaded-curator-test ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ 
flink-shaded-curator-test ---
[INFO] Installing 

 to 
/home/jenkins/.m2/repository/org/apache/flink/flink-shaded-curator-test/1.1-SNAPSHOT/flink-shaded-curator-test-1.1-SNAPSHOT.jar
[INFO] Installing 

 to 
/home/jenkins/.m2/repository/org/apache/flink/flink-shaded-curator-test/1.1-SNAPSHOT/flink-shaded-curator-test-1.1-SNAPSHOT.pom
[INFO] Installing 

 to 
/home/jenkins/.m2/repository/org/apache/flink/flink-shaded-curator-test/1.1-SNAPSHOT/flink-shaded-curator-test-1.1-SNAPSHOT-tests.jar
[INFO] 
[INFO] --- maven-deploy-plugin:2.8.1:deploy (default-deploy) @ 
flink-shaded-curator-test ---
[INFO] Downloading: 
https://repository.apache.org/content/repositories/snapshots/org/apache/flink/flink-shaded-curator-test/1.1-SNAPSHOT/maven-metadata.xml
[INFO] Downloaded: 
https://repository.apache.org/content/repositories/snapshots/org/apache/flink/flink-shaded-curator-test/1.1-SNAPSHOT/maven-metadata.xml
 (1003 B at 4.9 KB/sec)
[INFO] Uploading: 
https://repository.apache.org/content/repositories/snapshots/org/apache/flink/flink-shaded-curator-test/1.1-SNAPSHOT/flink-shaded-curator-test-1.1-20160524.033434-232.jar
[INFO] Uploaded: 
https://repository.apache.org/content/repositories/snapshots/org/apache/flink/flink-shaded-curator-test/1.1-SNAPSHOT/flink-shaded-curator-test-1.1-20160524.033434-232.jar
 (44 KB at 56.0 KB/sec)
[INFO] Uploading: 
https://repository.apache.org/content/repositories/snapshots/org/apache/flink/flink-shaded-curator-test/1.1-SNAPSHOT/flink-shaded-curator-test-1.1-20160524.033434-232.pom
[INFO] Uploaded: 
https://repository.apache.org/content/repositories/snapshots/org/apache/flink/flink-shaded-curator-test/1.1-SNAPSHOT/flink-shaded-curator-test-1.1-20160524.033434-232.pom
 (5 KB at 22.5 KB/sec)
[INFO] Downloading: 
https://repository.apache.org/content/repositories/snapshots/org/apache/flink/flink-shaded-curator-test/maven-metadata.xml
[INFO] Downloaded: 
https://repository.apache.org/content/repositories/snapshots/org/apache/flink/flink-shaded-curator-test/maven-metadata.xml
 (581 B at 1.5 KB/sec)
[INFO] Uploading: 
https://repository.apache.org/content/repositories/snapshots/org/apache/flink/flink-shaded-curator-test/1.1-SNAPSHOT/maven-metadata.xml
[INFO] Uploaded: 
https://repository.apache.org/content/repositories/snapshots/org/apache/flink/flink-shaded-curator-test/1.1-SNAPSHOT/maven-metadata.xml
 (1003 B at 6.6 KB/sec)
[INFO] Uploading: 
https://repository.apache.org/content/repositories/snapshots/org/apache/flink/flink-shaded-curator-test/maven-metadata.xml
[INFO] Uploaded: 
https://repository.apache.org/content/repositories/snapshots/org/apache/flink/flink-shaded-curator-test/maven-metadata.xml
 (581 B at 3.9 KB/sec)
[INFO] Uploading: 
https://repository.apache.org/content/repositories/snapshots/org/apache/flink/flink-shaded-curator-test/1.1-SNAPSHOT/flink-shaded-curator-test-1.1-20160524.033434-232-tests.jar
[INFO] Uploaded: 
https://repository.apache.org/content/repositories/snapshots/org/apache/flink/flink-shaded-curator-test/1.1-SNAPSHOT/flink-shaded-curator-test-1.1-20160524.033434-232-tests.jar
 (22 B at 0.1 KB/sec)
[INFO] Uploading: 
https://repository.apache.org/content/repositories/snapshots/org/apache/flink/flink-shaded-curator-test/1.1-SNAPSHOT/maven-metadata.xml
[INFO] Uploaded: 
https://repository.apache.org/content/repositories/snapshots/org/apache/flink/flink-shaded-curator-test/1.1-SNAPSHOT/maven-metadata.xml
 (1003 B at 5.9 KB/sec)
[INFO] 
[INFO] 
[INFO] Building flink-core 1.1-SNAPSHOT
[INFO] 
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.16:check (validate) @ flink-core ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (enforce-maven) @ flink-core ---
[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ flink-core ---
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ 
flink-core ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory 

buildbot success on flink-docs-release-0.9

2016-05-23 Thread buildbot
The Buildbot has detected a restored build on builder flink-docs-release-0.9 
while building . Full details are available at:
https://ci.apache.org/builders/flink-docs-release-0.9/builds/331

Buildbot URL: https://ci.apache.org/

Buildslave for this Build: orcus_ubuntu

Build Reason: The Nightly scheduler named 'flink-nightly-docs-release-0.9' 
triggered this build
Build Source Stamp: [branch release-0.9] HEAD
Blamelist: 

Build succeeded!

Sincerely,
 -The Buildbot





buildbot success on flink-docs-release-0.10

2016-05-23 Thread buildbot
The Buildbot has detected a restored build on builder flink-docs-release-0.10 
while building . Full details are available at:
https://ci.apache.org/builders/flink-docs-release-0.10/builds/216

Buildbot URL: https://ci.apache.org/

Buildslave for this Build: orcus_ubuntu

Build Reason: The Nightly scheduler named 'flink-nightly-docs-release-0.10' 
triggered this build
Build Source Stamp: [branch release-0.10] HEAD
Blamelist: 

Build succeeded!

Sincerely,
 -The Buildbot





buildbot success on flink-docs-master

2016-05-23 Thread buildbot
The Buildbot has detected a restored build on builder flink-docs-master while 
building . Full details are available at:
https://ci.apache.org/builders/flink-docs-master/builds/339

Buildbot URL: https://ci.apache.org/

Buildslave for this Build: orcus_ubuntu

Build Reason: The Nightly scheduler named 'flink-nightly-docs-master' triggered 
this build
Build Source Stamp: [branch master] HEAD
Blamelist: 

Build succeeded!

Sincerely,
 -The Buildbot





buildbot success on flink-docs-release-1.0

2016-05-23 Thread buildbot
The Buildbot has detected a restored build on builder flink-docs-release-1.0 
while building . Full details are available at:
https://ci.apache.org/builders/flink-docs-release-1.0/builds/90

Buildbot URL: https://ci.apache.org/

Buildslave for this Build: orcus_ubuntu

Build Reason: The Nightly scheduler named 'flink-nightly-docs-release-1.0' 
triggered this build
Build Source Stamp: [branch release-1.0] HEAD
Blamelist: 

Build succeeded!

Sincerely,
 -The Buildbot





buildbot failure on flink-docs-release-0.10

2016-05-23 Thread buildbot
The Buildbot has detected a new failure on builder flink-docs-release-0.10 
while building . Full details are available at:
https://ci.apache.org/builders/flink-docs-release-0.10/builds/215

Buildbot URL: https://ci.apache.org/

Buildslave for this Build: orcus_ubuntu

Build Reason: The Nightly scheduler named 'flink-nightly-docs-release-0.10' 
triggered this build
Build Source Stamp: [branch release-0.10] HEAD
Blamelist: 

BUILD FAILED: failed

Sincerely,
 -The Buildbot





flink git commit: [hotfix] [tableAPI] Moved tests to correct package.

2016-05-23 Thread fhueske
Repository: flink
Updated Branches:
  refs/heads/master 885fbaf66 -> 343d05a40


[hotfix] [tableAPI] Moved tests to correct package.


Project: http://git-wip-us.apache.org/repos/asf/flink/repo
Commit: http://git-wip-us.apache.org/repos/asf/flink/commit/343d05a4
Tree: http://git-wip-us.apache.org/repos/asf/flink/tree/343d05a4
Diff: http://git-wip-us.apache.org/repos/asf/flink/diff/343d05a4

Branch: refs/heads/master
Commit: 343d05a4067dd154b45e58b6bfce0ae6a4ebd5f4
Parents: 885fbaf
Author: Fabian Hueske 
Authored: Sun May 22 19:32:01 2016 +0200
Committer: Fabian Hueske 
Committed: Mon May 23 18:44:02 2016 +0200

--
 .../scala/expression/ScalarFunctionsTest.scala  | 490 ---
 .../expression/utils/ExpressionEvaluator.scala  | 119 -
 .../api/scala/typeutils/RowComparatorTest.scala | 136 -
 .../api/scala/typeutils/RowSerializerTest.scala | 194 
 .../table/expressions/ScalarFunctionsTest.scala | 490 +++
 .../expressions/utils/ExpressionEvaluator.scala | 119 +
 .../api/table/typeutils/RowComparatorTest.scala | 136 +
 .../api/table/typeutils/RowSerializerTest.scala | 194 
 8 files changed, 939 insertions(+), 939 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/flink/blob/343d05a4/flink-libraries/flink-table/src/test/scala/org/apache/flink/api/scala/expression/ScalarFunctionsTest.scala
--
diff --git 
a/flink-libraries/flink-table/src/test/scala/org/apache/flink/api/scala/expression/ScalarFunctionsTest.scala
 
b/flink-libraries/flink-table/src/test/scala/org/apache/flink/api/scala/expression/ScalarFunctionsTest.scala
deleted file mode 100644
index 8d1cfa2..000
--- 
a/flink-libraries/flink-table/src/test/scala/org/apache/flink/api/scala/expression/ScalarFunctionsTest.scala
+++ /dev/null
@@ -1,490 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.flink.api.scala.expression
-
-import org.apache.flink.api.common.typeinfo.BasicTypeInfo._
-import org.apache.flink.api.common.typeinfo.TypeInformation
-import org.apache.flink.api.scala.expression.utils.ExpressionEvaluator
-import org.apache.flink.api.scala.table._
-import org.apache.flink.api.table.Row
-import org.apache.flink.api.table.expressions.{Expression, ExpressionParser}
-import org.apache.flink.api.table.typeutils.RowTypeInfo
-import org.junit.Assert.assertEquals
-import org.junit.Test
-
-class ScalarFunctionsTest {
-
-  // 
--
-  // String functions
-  // 
--
-
-  @Test
-  def testSubstring(): Unit = {
-testFunction(
-  'f0.substring(2),
-  "f0.substring(2)",
-  "SUBSTRING(f0, 2)",
-  "his is a test String.")
-
-testFunction(
-  'f0.substring(2, 5),
-  "f0.substring(2, 5)",
-  "SUBSTRING(f0, 2, 5)",
-  "his i")
-
-testFunction(
-  'f0.substring(1, 'f7),
-  "f0.substring(1, f7)",
-  "SUBSTRING(f0, 1, f7)",
-  "Thi")
-  }
-
-  @Test
-  def testTrim(): Unit = {
-testFunction(
-  'f8.trim(),
-  "f8.trim()",
-  "TRIM(f8)",
-  "This is a test String.")
-
-testFunction(
-  'f8.trim(removeLeading = true, removeTrailing = true, " "),
-  "trim(f8)",
-  "TRIM(f8)",
-  "This is a test String.")
-
-testFunction(
-  'f8.trim(removeLeading = false, removeTrailing = true, " "),
-  "f8.trim(TRAILING, ' ')",
-  "TRIM(TRAILING FROM f8)",
-  " This is a test String.")
-
-testFunction(
-  'f0.trim(removeLeading = true, removeTrailing = true, "."),
-  "trim(BOTH, '.', f0)",
-  "TRIM(BOTH '.' FROM f0)",
-  "This is a test String")
-  }
-
-  @Test
-  def testCharLength(): Unit = {
-testFunction(
-  'f0.charLength(),
-  "f0.charLength()",
-  "CHAR_LENGTH(f0)",
-  "22")
-
-testFunction(
-  'f0.charLength(),
-  "charLength(f0)",
-  

flink git commit: [docs] Fix outdated default value for akka.ask.timeout

2016-05-23 Thread uce
Repository: flink
Updated Branches:
  refs/heads/master 9cc629662 -> 885fbaf66


[docs] Fix outdated default value for akka.ask.timeout


Project: http://git-wip-us.apache.org/repos/asf/flink/repo
Commit: http://git-wip-us.apache.org/repos/asf/flink/commit/885fbaf6
Tree: http://git-wip-us.apache.org/repos/asf/flink/tree/885fbaf6
Diff: http://git-wip-us.apache.org/repos/asf/flink/diff/885fbaf6

Branch: refs/heads/master
Commit: 885fbaf66d8078a779595502ed4e7a68fd9faf7d
Parents: 9cc6296
Author: Ufuk Celebi 
Authored: Mon May 23 17:25:57 2016 +0200
Committer: Ufuk Celebi 
Committed: Mon May 23 17:28:33 2016 +0200

--
 docs/setup/config.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/flink/blob/885fbaf6/docs/setup/config.md
--
diff --git a/docs/setup/config.md b/docs/setup/config.md
index 14e9d21..b4d0242 100644
--- a/docs/setup/config.md
+++ b/docs/setup/config.md
@@ -175,7 +175,7 @@ The following parameters configure Flink's JobManager and 
TaskManagers.
 
 ### Distributed Coordination (via Akka)
 
-- `akka.ask.timeout`: Timeout used for all futures and blocking Akka calls. If 
Flink fails due to timeouts then you should try to increase this value. 
Timeouts can be caused by slow machines or a congested network. The timeout 
value requires a time-unit specifier (ms/s/min/h/d) (DEFAULT: **100 s**).
+- `akka.ask.timeout`: Timeout used for all futures and blocking Akka calls. If 
Flink fails due to timeouts then you should try to increase this value. 
Timeouts can be caused by slow machines or a congested network. The timeout 
value requires a time-unit specifier (ms/s/min/h/d) (DEFAULT: **10 s**).
 - `akka.lookup.timeout`: Timeout used for the lookup of the JobManager. The 
timeout value has to contain a time-unit specifier (ms/s/min/h/d) (DEFAULT: 
**10 s**).
 - `akka.framesize`: Maximum size of messages which are sent between the 
JobManager and the TaskManagers. If Flink fails because messages exceed this 
limit, then you should increase it. The message size requires a size-unit 
specifier (DEFAULT: **10485760b**).
 - `akka.watch.heartbeat.interval`: Heartbeat interval for Akka's DeathWatch 
mechanism to detect dead TaskManagers. If TaskManagers are wrongly marked dead 
because of lost or delayed heartbeat messages, then you should increase this 
value. A thorough description of Akka's DeathWatch can be found 
[here](http://doc.akka.io/docs/akka/snapshot/scala/remoting.html#failure-detector)
 (DEFAULT: **akka.ask.timeout/10**).
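
As a hedged illustration of the corrected docs text, the options discussed above could appear in `flink-conf.yaml` roughly as follows. This is a sketch only; the values simply restate the defaults quoted in the updated documentation, not recommendations:

```yaml
# Distributed coordination (via Akka) -- values mirror the documented defaults
akka.ask.timeout: 10 s        # corrected default (the docs previously said "100 s")
akka.lookup.timeout: 10 s     # timeout for the JobManager lookup
akka.framesize: 10485760b     # max message size between JobManager and TaskManagers
```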



[4/4] flink git commit: [FLINK-3632] [tableAPI] Clean up TableAPI exceptions.

2016-05-23 Thread fhueske
[FLINK-3632] [tableAPI] Clean up TableAPI exceptions.

This closes #2015


Project: http://git-wip-us.apache.org/repos/asf/flink/repo
Commit: http://git-wip-us.apache.org/repos/asf/flink/commit/9cc62966
Tree: http://git-wip-us.apache.org/repos/asf/flink/tree/9cc62966
Diff: http://git-wip-us.apache.org/repos/asf/flink/diff/9cc62966

Branch: refs/heads/master
Commit: 9cc629662a34bd9cc6310556a321dd5144a60439
Parents: 53949d1
Author: Yijie Shen 
Authored: Wed May 18 23:57:42 2016 +0800
Committer: Fabian Hueske 
Committed: Mon May 23 14:06:06 2016 +0200

--
 .../api/scala/table/TableConversions.scala  |  2 +-
 .../flink/api/table/TableEnvironment.scala  | 29 +---
 .../api/table/plan/logical/operators.scala  |  3 +-
 .../table/plan/nodes/dataset/DataSetRel.scala   |  4 +--
 .../api/table/plan/rules/FlinkRuleSets.scala|  2 +-
 .../api/table/plan/schema/FlinkTable.scala  | 12 
 .../api/table/sources/CsvTableSource.scala  | 10 +++
 .../apache/flink/api/table/trees/TreeNode.scala |  5 ++--
 .../flink/api/table/typeutils/RowTypeInfo.scala |  6 ++--
 .../api/java/batch/table/FromDataSetITCase.java | 11 
 .../api/scala/batch/table/ToTableITCase.scala   | 12 
 .../api/scala/stream/table/SelectITCase.scala   | 12 
 12 files changed, 54 insertions(+), 54 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/flink/blob/9cc62966/flink-libraries/flink-table/src/main/scala/org/apache/flink/api/scala/table/TableConversions.scala
--
diff --git 
a/flink-libraries/flink-table/src/main/scala/org/apache/flink/api/scala/table/TableConversions.scala
 
b/flink-libraries/flink-table/src/main/scala/org/apache/flink/api/scala/table/TableConversions.scala
index 1fdcbc5..720dac0 100644
--- 
a/flink-libraries/flink-table/src/main/scala/org/apache/flink/api/scala/table/TableConversions.scala
+++ 
b/flink-libraries/flink-table/src/main/scala/org/apache/flink/api/scala/table/TableConversions.scala
@@ -41,7 +41,7 @@ class TableConversions(table: Table) {
 tEnv.toDataSet(table)
   case _ =>
 throw new TableException(
-  "Only tables that orginate from Scala DataSets can be converted to 
Scala DataSets.")
+  "Only tables that originate from Scala DataSets can be converted to 
Scala DataSets.")
 }
   }
 

http://git-wip-us.apache.org/repos/asf/flink/blob/9cc62966/flink-libraries/flink-table/src/main/scala/org/apache/flink/api/table/TableEnvironment.scala
--
diff --git 
a/flink-libraries/flink-table/src/main/scala/org/apache/flink/api/table/TableEnvironment.scala
 
b/flink-libraries/flink-table/src/main/scala/org/apache/flink/api/table/TableEnvironment.scala
index 8aa9e10..1c592f9 100644
--- 
a/flink-libraries/flink-table/src/main/scala/org/apache/flink/api/table/TableEnvironment.scala
+++ 
b/flink-libraries/flink-table/src/main/scala/org/apache/flink/api/table/TableEnvironment.scala
@@ -236,8 +236,7 @@ abstract class TableEnvironment(val config: TableConfig) {
   case c: CaseClassTypeInfo[A] => c.getFieldNames
   case p: PojoTypeInfo[A] => p.getFieldNames
   case tpe =>
-throw new IllegalArgumentException(
-  s"Type $tpe requires explicit field naming.")
+throw new TableException(s"Type $tpe lacks explicit field naming")
 }
 val fieldIndexes = fieldNames.indices.toArray
 (fieldNames, fieldIndexes)
@@ -259,12 +258,11 @@ abstract class TableEnvironment(val config: TableConfig) {
 val indexedNames: Array[(Int, String)] = inputType match {
   case a: AtomicType[A] =>
 if (exprs.length != 1) {
-  throw new IllegalArgumentException("Atomic type may can only have a 
single field.")
+  throw new TableException("Table of atomic type can only have a 
single field.")
 }
 exprs.map {
   case UnresolvedFieldReference(name) => (0, name)
-  case _ => throw new IllegalArgumentException(
-"Field reference expression expected.")
+  case _ => throw new TableException("Field reference expression 
expected.")
 }
   case t: TupleTypeInfo[A] =>
 exprs.zipWithIndex.map {
@@ -272,11 +270,11 @@ abstract class TableEnvironment(val config: TableConfig) {
   case (Alias(UnresolvedFieldReference(origName), name), _) =>
 val idx = t.getFieldIndex(origName)
 if (idx < 0) {
-  throw new IllegalArgumentException(s"$origName is not a field of 
type $t")
+  throw new TableException(s"$origName is not a field of type $t")
 }
 (idx, name)
-  case _ => throw new IllegalArgumentException(
-"Field reference 

[2/4] flink git commit: [hotfix] [tableAPI] Throw helpful exception for unsupported ORDER BY features.

2016-05-23 Thread fhueske
[hotfix] [tableAPI] Throw helpful exception for unsupported ORDER BY features.


Project: http://git-wip-us.apache.org/repos/asf/flink/repo
Commit: http://git-wip-us.apache.org/repos/asf/flink/commit/905ac6ed
Tree: http://git-wip-us.apache.org/repos/asf/flink/tree/905ac6ed
Diff: http://git-wip-us.apache.org/repos/asf/flink/diff/905ac6ed

Branch: refs/heads/master
Commit: 905ac6edacefac7bed780c86211df6950a2470d3
Parents: 173d24d
Author: Fabian Hueske 
Authored: Mon May 23 14:03:16 2016 +0200
Committer: Fabian Hueske 
Committed: Mon May 23 14:03:16 2016 +0200

--
 .../plan/rules/dataSet/DataSetSortRule.scala| 10 
 .../flink/api/scala/batch/sql/SortITCase.scala  | 25 
 2 files changed, 35 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/flink/blob/905ac6ed/flink-libraries/flink-table/src/main/scala/org/apache/flink/api/table/plan/rules/dataSet/DataSetSortRule.scala
--
diff --git 
a/flink-libraries/flink-table/src/main/scala/org/apache/flink/api/table/plan/rules/dataSet/DataSetSortRule.scala
 
b/flink-libraries/flink-table/src/main/scala/org/apache/flink/api/table/plan/rules/dataSet/DataSetSortRule.scala
index b7f70e3..b26d1de 100644
--- 
a/flink-libraries/flink-table/src/main/scala/org/apache/flink/api/table/plan/rules/dataSet/DataSetSortRule.scala
+++ 
b/flink-libraries/flink-table/src/main/scala/org/apache/flink/api/table/plan/rules/dataSet/DataSetSortRule.scala
@@ -23,6 +23,7 @@ import org.apache.calcite.rel.RelNode
 import org.apache.calcite.rel.convert.ConverterRule
 import org.apache.calcite.rel.core.JoinRelType
 import org.apache.calcite.rel.logical.{LogicalJoin, LogicalSort}
+import org.apache.flink.api.table.TableException
 import org.apache.flink.api.table.plan.nodes.dataset.{DataSetConvention, 
DataSetSort}
 
 class DataSetSortRule
@@ -37,6 +38,15 @@ class DataSetSortRule
 */
   override def matches(call: RelOptRuleCall): Boolean = {
 val sort = call.rel(0).asInstanceOf[LogicalSort]
+
+if (sort.offset != null) {
+  throw new TableException("ORDER BY OFFSET is currently not supported.")
+}
+
+if (sort.fetch != null) {
+  throw new TableException("ORDER BY FETCH is currently not supported.")
+}
+
 sort.offset == null && sort.fetch == null
   }
 

http://git-wip-us.apache.org/repos/asf/flink/blob/905ac6ed/flink-libraries/flink-table/src/test/scala/org/apache/flink/api/scala/batch/sql/SortITCase.scala
--
diff --git 
a/flink-libraries/flink-table/src/test/scala/org/apache/flink/api/scala/batch/sql/SortITCase.scala
 
b/flink-libraries/flink-table/src/test/scala/org/apache/flink/api/scala/batch/sql/SortITCase.scala
index 0dea0b6a..7206be7 100644
--- 
a/flink-libraries/flink-table/src/test/scala/org/apache/flink/api/scala/batch/sql/SortITCase.scala
+++ 
b/flink-libraries/flink-table/src/test/scala/org/apache/flink/api/scala/batch/sql/SortITCase.scala
@@ -25,6 +25,7 @@ import org.apache.flink.api.scala.batch.utils.SortTestUtils._
 import org.apache.flink.api.scala.util.CollectionDataSets
 import org.apache.flink.api.scala.table._
 import org.apache.flink.api.scala._
+import org.apache.flink.api.table.plan.PlanGenException
 import org.apache.flink.api.table.{Row, TableEnvironment}
 import org.apache.flink.test.util.MultipleProgramsTestBase.TestExecutionMode
 import org.apache.flink.test.util.TestBaseUtils
@@ -60,4 +61,28 @@ class SortITCase(
 TestBaseUtils.compareOrderedResultAsText(result.asJava, expected)
   }
 
+  @Test(expected = classOf[PlanGenException])
+  def testOrderByOffset(): Unit = {
+val env = ExecutionEnvironment.getExecutionEnvironment
+val tEnv = TableEnvironment.getTableEnvironment(env, config)
+
+val sqlQuery = "SELECT * FROM MyTable ORDER BY _1 OFFSET 2 ROWS"
+
+val ds = CollectionDataSets.get3TupleDataSet(env)
+tEnv.registerDataSet("MyTable", ds)
+tEnv.sql(sqlQuery).toDataSet[Row]
+  }
+
+  @Test(expected = classOf[PlanGenException])
+  def testOrderByFirst(): Unit = {
+val env = ExecutionEnvironment.getExecutionEnvironment
+val tEnv = TableEnvironment.getTableEnvironment(env, config)
+
+val sqlQuery = "SELECT * FROM MyTable ORDER BY _1 FETCH NEXT 2 ROWS ONLY"
+
+val ds = CollectionDataSets.get3TupleDataSet(env)
+tEnv.registerDataSet("MyTable", ds)
+tEnv.sql(sqlQuery).toDataSet[Row]
+  }
+
 }



[1/2] flink git commit: [FLINK-3927][yarn] make container id consistent across Hadoop versions

2016-05-23 Thread mxm
Repository: flink
Updated Branches:
  refs/heads/master 707606ac4 -> 5fdf39b1f


[FLINK-3927][yarn] make container id consistent across Hadoop versions

- introduce a unique container id independent of the Hadoop version
- improve printing of exceptions during registration
- minor improvements to the Yarn ResourceManager code

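The idea behind the bullet points above can be illustrated with a small self-contained sketch. The names (`WorkerResourceId`, `generate()`) are illustrative, not Flink's actual API: a worker is identified by an opaque string that the framework generates itself, so equality and logging no longer depend on how a particular Hadoop version renders YARN container IDs. The null check mirrors the `Preconditions.checkNotNull` added to `ResourceID` in this commit.

```java
import java.io.Serializable;
import java.util.Objects;
import java.util.UUID;

/**
 * Illustrative sketch (not Flink's real ResourceID): an opaque,
 * Hadoop-version-independent worker identifier.
 */
public final class WorkerResourceId implements Serializable {

    private final String resourceId;

    public WorkerResourceId(String resourceId) {
        // mirrors the null check the commit adds to ResourceID's constructor
        this.resourceId = Objects.requireNonNull(resourceId, "resource id must not be null");
    }

    /** Generates a fresh unique id, independent of the cluster framework. */
    public static WorkerResourceId generate() {
        return new WorkerResourceId(UUID.randomUUID().toString());
    }

    /** Returns the id as a plain string, usable in logs across Hadoop versions. */
    public String getResourceIdString() {
        return resourceId;
    }

    @Override
    public boolean equals(Object o) {
        return o instanceof WorkerResourceId
                && ((WorkerResourceId) o).resourceId.equals(resourceId);
    }

    @Override
    public int hashCode() {
        return resourceId.hashCode();
    }
}
```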
This closes #2013


Project: http://git-wip-us.apache.org/repos/asf/flink/repo
Commit: http://git-wip-us.apache.org/repos/asf/flink/commit/017106e1
Tree: http://git-wip-us.apache.org/repos/asf/flink/tree/017106e1
Diff: http://git-wip-us.apache.org/repos/asf/flink/diff/017106e1

Branch: refs/heads/master
Commit: 017106e140f3c17ebaaa0507e1dcbbc445c8f0ac
Parents: 707606a
Author: Maximilian Michels 
Authored: Fri May 20 14:29:12 2016 +0200
Committer: Maximilian Michels 
Committed: Mon May 23 12:36:25 2016 +0200

--
 .../clusterframework/FlinkResourceManager.java  |  4 ++--
 .../clusterframework/types/ResourceID.java  | 14 +++
 .../flink/runtime/jobmanager/JobManager.scala   |  4 ++--
 .../flink/yarn/RegisteredYarnWorkerNode.java| 13 +-
 .../flink/yarn/YarnContainerInLaunch.java   | 16 +
 .../flink/yarn/YarnFlinkResourceManager.java| 25 +---
 .../flink/yarn/YarnTaskManagerRunner.java   |  5 ++--
 7 files changed, 51 insertions(+), 30 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/flink/blob/017106e1/flink-runtime/src/main/java/org/apache/flink/runtime/clusterframework/FlinkResourceManager.java
--
diff --git 
a/flink-runtime/src/main/java/org/apache/flink/runtime/clusterframework/FlinkResourceManager.java
 
b/flink-runtime/src/main/java/org/apache/flink/runtime/clusterframework/FlinkResourceManager.java
index a5c354c..8766e15 100644
--- 
a/flink-runtime/src/main/java/org/apache/flink/runtime/clusterframework/FlinkResourceManager.java
+++ 
b/flink-runtime/src/main/java/org/apache/flink/runtime/clusterframework/FlinkResourceManager.java
@@ -353,7 +353,7 @@ public abstract class FlinkResourceManager extend
ResourceID resourceID = msg.resourceId();
try {
Preconditions.checkNotNull(resourceID);
-   WorkerType newWorker = 
workerRegistered(msg.resourceId());
+   WorkerType newWorker = workerRegistered(resourceID);
WorkerType oldWorker = 
registeredWorkers.put(resourceID, newWorker);
if (oldWorker != null) {
LOG.warn("Worker {} had been registered 
before.", resourceID);
@@ -363,7 +363,7 @@ public abstract class FlinkResourceManager extend
self());
} catch (Exception e) {
// This may happen on duplicate task manager 
registration message to the job manager
-   LOG.warn("TaskManager resource registration failed for 
{}", resourceID);
+   LOG.warn("TaskManager resource registration failed for 
{}", resourceID, e);
 
// tell the JobManager about the failure
String eStr = ExceptionUtils.stringifyException(e);

http://git-wip-us.apache.org/repos/asf/flink/blob/017106e1/flink-runtime/src/main/java/org/apache/flink/runtime/clusterframework/types/ResourceID.java
--
diff --git 
a/flink-runtime/src/main/java/org/apache/flink/runtime/clusterframework/types/ResourceID.java
 
b/flink-runtime/src/main/java/org/apache/flink/runtime/clusterframework/types/ResourceID.java
index 8e48244..e599456 100644
--- 
a/flink-runtime/src/main/java/org/apache/flink/runtime/clusterframework/types/ResourceID.java
+++ 
b/flink-runtime/src/main/java/org/apache/flink/runtime/clusterframework/types/ResourceID.java
@@ -19,6 +19,7 @@
 package org.apache.flink.runtime.clusterframework.types;
 
 import org.apache.flink.util.AbstractID;
+import org.apache.flink.util.Preconditions;
 
 import java.io.Serializable;
 
@@ -32,10 +33,15 @@ public class ResourceID implements Serializable {
private final String resourceId;
 
public ResourceID(String resourceId) {
+   Preconditions.checkNotNull(resourceId, "ResourceID must not be 
null");
this.resourceId = resourceId;
}
 
-   public String getResourceId() {
+   /**
+* Gets the Resource Id as string
+* @return Stringified version of the ResourceID
+*/
+   public final String getResourceIdString() {
return resourceId;
}
 
@@ -48,10 +54,10 @@ public class ResourceID implements Serializable {
}
 
@Override
-   public boolean equals(Object o) {
+   public final 

[2/2] flink git commit: [FLINK-3953][maven] rename unit-tests execution to default-test

2016-05-23 Thread mxm
[FLINK-3953][maven] rename unit-tests execution to default-test

After 38698c0b101cbb48f8c10adf4060983ac07e2f4b, there are now two
executions defined for the Surefire plugin: unit-tests and
integration-tests. In addition, there is an implicit default execution
called default-test. This causes the unit tests to be executed
twice. Renaming unit-tests to default-test prevents the duplicate
execution.
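
The resulting Surefire setup can be sketched as follows. This is an illustrative fragment, not the full Flink `pom.xml`; the one-line diff below shows the actual change. Reusing the implicit execution id `default-test` overrides Maven's built-in test run instead of adding a second one:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <executions>
    <execution>
      <id>default-test</id> <!-- was: unit-tests; overrides the implicit execution -->
      <phase>test</phase>
      <goals>
        <goal>test</goal>
      </goals>
    </execution>
    <execution>
      <id>integration-tests</id>
      <phase>integration-test</phase>
      <goals>
        <goal>test</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```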

This closes #2019


Project: http://git-wip-us.apache.org/repos/asf/flink/repo
Commit: http://git-wip-us.apache.org/repos/asf/flink/commit/5fdf39b1
Tree: http://git-wip-us.apache.org/repos/asf/flink/tree/5fdf39b1
Diff: http://git-wip-us.apache.org/repos/asf/flink/diff/5fdf39b1

Branch: refs/heads/master
Commit: 5fdf39b1fec032f5816cb188334c129ff9186415
Parents: 017106e
Author: Maximilian Michels 
Authored: Mon May 23 12:06:25 2016 +0200
Committer: Maximilian Michels 
Committed: Mon May 23 12:36:56 2016 +0200

--
 pom.xml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/flink/blob/5fdf39b1/pom.xml
--
diff --git a/pom.xml b/pom.xml
index 784aa40..22515d0 100644
--- a/pom.xml
+++ b/pom.xml
@@ -923,7 +923,7 @@ under the License.



-   unit-tests
+   default-test
test

test