[GitHub] spark pull request #22637: [SPARK-25408] Move to more idiomatic Java8

2018-10-08 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/spark/pull/22637


---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request #22637: [SPARK-25408] Move to more idiomatic Java8

2018-10-06 Thread Fokko
Github user Fokko commented on a diff in the pull request:

https://github.com/apache/spark/pull/22637#discussion_r223196616
  
--- Diff: sql/hive-thriftserver/src/main/java/org/apache/hive/service/cli/CLIService.java ---
@@ -154,6 +154,7 @@ public synchronized void start() {
   throw new ServiceException("Unable to connect to MetaStore!", e);
 }
 finally {
+  // IMetaStoreClient is not AutoCloseable, closing it manually
--- End diff --

Done!


---




[GitHub] spark pull request #22637: [SPARK-25408] Move to more idiomatic Java8

2018-10-06 Thread srowen
Github user srowen commented on a diff in the pull request:

https://github.com/apache/spark/pull/22637#discussion_r223188294
  
--- Diff: sql/catalyst/src/test/java/org/apache/spark/sql/catalyst/expressions/RowBasedKeyValueBatchSuite.java ---
@@ -152,31 +151,24 @@ public void emptyBatch() throws Exception {
        // Expected exception; do nothing.
      }
      Assert.assertFalse(batch.rowIterator().next());
-    } finally {
-      batch.close();
    }
  }

  @Test
-  public void batchType() throws Exception {
-    RowBasedKeyValueBatch batch1 = RowBasedKeyValueBatch.allocate(keySchema,
-        valueSchema, taskMemoryManager, DEFAULT_CAPACITY);
-    RowBasedKeyValueBatch batch2 = RowBasedKeyValueBatch.allocate(fixedKeySchema,
-        valueSchema, taskMemoryManager, DEFAULT_CAPACITY);
-    try {
+  public void batchType() {
+    try (RowBasedKeyValueBatch batch1 = RowBasedKeyValueBatch.allocate(keySchema,
+        valueSchema, taskMemoryManager, DEFAULT_CAPACITY);
+         RowBasedKeyValueBatch batch2 = RowBasedKeyValueBatch.allocate(fixedKeySchema,
--- End diff --

It's hard to know the best way to wrap these things, but here you have the 
continuation line indented *less* than the one that starts the statement. I 
think we can't do that. I'd indent both continuations 2 spaces from their 
preceding line ideally. Here and elsewhere where the try statement has to wrap.
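A compilable sketch of the wrapping style being suggested, with each continuation line indented two spaces past the line it continues, so no continuation sits further left than the `try` that opens the statement. `Resource` and its `allocate` method are stand-ins for Spark's `RowBasedKeyValueBatch.allocate`, not the real API:

```java
// Stand-in for an allocatable AutoCloseable resource, only so the
// wrapping style can be shown in compilable form.
class Resource implements AutoCloseable {
  static Resource allocate(String keySchema, String valueSchema,
      int capacity) {
    return new Resource();
  }
  @Override public void close() {}
}

public class WrapStyle {
  static boolean allocated() {
    // Continuations indented 2 spaces from their preceding line;
    // neither resource declaration out-dents the `try` keyword.
    try (Resource batch1 = Resource.allocate("keySchema",
          "valueSchema", 16);
        Resource batch2 = Resource.allocate("fixedKeySchema",
          "valueSchema", 16)) {
      return batch1 != null && batch2 != null;
    }
  }

  public static void main(String[] args) {
    System.out.println(allocated());  // prints "true"
  }
}
```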


---




[GitHub] spark pull request #22637: [SPARK-25408] Move to more idiomatic Java8

2018-10-06 Thread srowen
Github user srowen commented on a diff in the pull request:

https://github.com/apache/spark/pull/22637#discussion_r223188308
  
--- Diff: sql/hive-thriftserver/src/main/java/org/apache/hive/service/cli/CLIService.java ---
@@ -154,6 +154,7 @@ public synchronized void start() {
   throw new ServiceException("Unable to connect to MetaStore!", e);
 }
 finally {
+  // IMetaStoreClient is not AutoCloseable, closing it manually
--- End diff --

this should still be reverted


---




[GitHub] spark pull request #22637: [SPARK-25408] Move to more idiomatic Java8

2018-10-06 Thread Fokko
Github user Fokko commented on a diff in the pull request:

https://github.com/apache/spark/pull/22637#discussion_r223187949
  
--- Diff: sql/catalyst/src/test/java/org/apache/spark/sql/catalyst/expressions/RowBasedKeyValueBatchSuite.java ---
@@ -152,31 +151,27 @@ public void emptyBatch() throws Exception {
        // Expected exception; do nothing.
      }
      Assert.assertFalse(batch.rowIterator().next());
-    } finally {
-      batch.close();
    }
  }

  @Test
-  public void batchType() throws Exception {
-    RowBasedKeyValueBatch batch1 = RowBasedKeyValueBatch.allocate(keySchema,
-        valueSchema, taskMemoryManager, DEFAULT_CAPACITY);
-    RowBasedKeyValueBatch batch2 = RowBasedKeyValueBatch.allocate(fixedKeySchema,
-        valueSchema, taskMemoryManager, DEFAULT_CAPACITY);
-    try {
-      Assert.assertEquals(batch1.getClass(), VariableLengthRowBasedKeyValueBatch.class);
-      Assert.assertEquals(batch2.getClass(), FixedLengthRowBasedKeyValueBatch.class);
-    } finally {
-      batch1.close();
-      batch2.close();
+  public void batchType() {
+    try (RowBasedKeyValueBatch batch1 = RowBasedKeyValueBatch.allocate(keySchema,
+        valueSchema, taskMemoryManager, DEFAULT_CAPACITY);
--- End diff --

My mistake. thanks!


---




[GitHub] spark pull request #22637: [SPARK-25408] Move to more idiomatic Java8

2018-10-06 Thread srowen
Github user srowen commented on a diff in the pull request:

https://github.com/apache/spark/pull/22637#discussion_r223187058
  
--- Diff: sql/catalyst/src/test/java/org/apache/spark/sql/catalyst/expressions/RowBasedKeyValueBatchSuite.java ---
@@ -152,31 +151,27 @@ public void emptyBatch() throws Exception {
        // Expected exception; do nothing.
      }
      Assert.assertFalse(batch.rowIterator().next());
-    } finally {
-      batch.close();
    }
  }

  @Test
-  public void batchType() throws Exception {
-    RowBasedKeyValueBatch batch1 = RowBasedKeyValueBatch.allocate(keySchema,
-        valueSchema, taskMemoryManager, DEFAULT_CAPACITY);
-    RowBasedKeyValueBatch batch2 = RowBasedKeyValueBatch.allocate(fixedKeySchema,
-        valueSchema, taskMemoryManager, DEFAULT_CAPACITY);
-    try {
-      Assert.assertEquals(batch1.getClass(), VariableLengthRowBasedKeyValueBatch.class);
-      Assert.assertEquals(batch2.getClass(), FixedLengthRowBasedKeyValueBatch.class);
-    } finally {
-      batch1.close();
-      batch2.close();
+  public void batchType() {
+    try (RowBasedKeyValueBatch batch1 = RowBasedKeyValueBatch.allocate(keySchema,
+        valueSchema, taskMemoryManager, DEFAULT_CAPACITY);
--- End diff --

There's a compile error here. But note you can have multiple Closeables 
declared in one try block anyway, so these can be merged.
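A minimal, self-contained illustration of the point above: two `AutoCloseable` resources can share one `try` header, separated by a semicolon, and are closed automatically in reverse declaration order. `Batch` below is a stand-in for `RowBasedKeyValueBatch`, which implements `AutoCloseable`:

```java
import java.util.ArrayList;
import java.util.List;

// Stand-in resource that records the order in which instances are closed.
class Batch implements AutoCloseable {
  static final List<String> closeOrder = new ArrayList<>();
  private final String name;

  Batch(String name) { this.name = name; }

  @Override public void close() { closeOrder.add(name); }
}

public class MergedTry {
  static List<String> run() {
    Batch.closeOrder.clear();
    // Both resources merged into a single try block; close() is called
    // on each when the block exits, last-declared first.
    try (Batch batch1 = new Batch("batch1");
        Batch batch2 = new Batch("batch2")) {
      // use batch1 and batch2 here
    }
    return Batch.closeOrder;
  }

  public static void main(String[] args) {
    System.out.println(run());  // prints "[batch2, batch1]"
  }
}
```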


---




[GitHub] spark pull request #22637: [SPARK-25408] Move to more idiomatic Java8

2018-10-05 Thread dongjoon-hyun
Github user dongjoon-hyun commented on a diff in the pull request:

https://github.com/apache/spark/pull/22637#discussion_r223102153
  
--- Diff: common/network-shuffle/src/test/java/org/apache/spark/network/shuffle/ExternalShuffleSecuritySuite.java ---
@@ -96,14 +96,16 @@ private void validate(String appId, String secretKey, boolean encrypt)
        ImmutableMap.of("spark.authenticate.enableSaslEncryption", "true")));
    }

-    ExternalShuffleClient client =
-      new ExternalShuffleClient(testConf, new TestSecretKeyHolder(appId, secretKey), true, 5000);
-    client.init(appId);
-    // Registration either succeeds or throws an exception.
-    client.registerWithShuffleServer(TestUtils.getLocalHost(), server.getPort(), "exec0",
-      new ExecutorShuffleInfo(new String[0], 0,
-        "org.apache.spark.shuffle.sort.SortShuffleManager"));
-    client.close();
+    try (ExternalShuffleClient client =
+        new ExternalShuffleClient(
+            testConf, new TestSecretKeyHolder(appId, secretKey), true, 5000)) {
--- End diff --

Indentation? It follows 4-space continuation indentation, which is valid Java style, but it stands out in this file.


---




[GitHub] spark pull request #22637: [SPARK-25408] Move to more idiomatic Java8

2018-10-05 Thread dongjoon-hyun
Github user dongjoon-hyun commented on a diff in the pull request:

https://github.com/apache/spark/pull/22637#discussion_r223101514
  
--- Diff: common/network-shuffle/src/test/java/org/apache/spark/network/shuffle/ExternalShuffleBlockResolverSuite.java ---
@@ -98,19 +98,19 @@ public void testSortShuffleBlocks() throws IOException {
    resolver.registerExecutor("app0", "exec0",
      dataContext.createExecutorInfo(SORT_MANAGER));

-    InputStream block0Stream =
-      resolver.getBlockData("app0", "exec0", 0, 0, 0).createInputStream();
-    String block0 = CharStreams.toString(
-        new InputStreamReader(block0Stream, StandardCharsets.UTF_8));
-    block0Stream.close();
-    assertEquals(sortBlock0, block0);
-
-    InputStream block1Stream =
-      resolver.getBlockData("app0", "exec0", 0, 0, 1).createInputStream();
-    String block1 = CharStreams.toString(
-        new InputStreamReader(block1Stream, StandardCharsets.UTF_8));
-    block1Stream.close();
-    assertEquals(sortBlock1, block1);
+    try (InputStream block0Stream = resolver.getBlockData(
+        "app0", "exec0", 0, 0, 0).createInputStream()) {
+      String block0 =
+          CharStreams.toString(new InputStreamReader(block0Stream, StandardCharsets.UTF_8));
--- End diff --

Indentation?


---




[GitHub] spark pull request #22637: [SPARK-25408] Move to more idiomatic Java8

2018-10-04 Thread Fokko
Github user Fokko commented on a diff in the pull request:

https://github.com/apache/spark/pull/22637#discussion_r222893178
  
--- Diff: common/network-shuffle/src/test/java/org/apache/spark/network/shuffle/ExternalShuffleIntegrationSuite.java ---
@@ -133,37 +133,38 @@ private FetchResult fetchBlocks(

    final Semaphore requestsRemaining = new Semaphore(0);

-    ExternalShuffleClient client = new ExternalShuffleClient(clientConf, null, false, 5000);
-    client.init(APP_ID);
-    client.fetchBlocks(TestUtils.getLocalHost(), port, execId, blockIds,
-      new BlockFetchingListener() {
-        @Override
-        public void onBlockFetchSuccess(String blockId, ManagedBuffer data) {
-          synchronized (this) {
-            if (!res.successBlocks.contains(blockId) && !res.failedBlocks.contains(blockId)) {
-              data.retain();
-              res.successBlocks.add(blockId);
-              res.buffers.add(data);
-              requestsRemaining.release();
+    try (ExternalShuffleClient client = new ExternalShuffleClient(clientConf, null, false, 5000)) {
+      client.init(APP_ID);
+      client.fetchBlocks(TestUtils.getLocalHost(), port, execId, blockIds,
+        new BlockFetchingListener() {
+          @Override
+          public void onBlockFetchSuccess(String blockId, ManagedBuffer data) {
+            synchronized (this) {
+              if (!res.successBlocks.contains(blockId) && !res.failedBlocks.contains(blockId)) {
+                data.retain();
+                res.successBlocks.add(blockId);
+                res.buffers.add(data);
+                requestsRemaining.release();
+              }
            }
          }
-        }
-
-        @Override
-        public void onBlockFetchFailure(String blockId, Throwable exception) {
-          synchronized (this) {
-            if (!res.successBlocks.contains(blockId) && !res.failedBlocks.contains(blockId)) {
-              res.failedBlocks.add(blockId);
-              requestsRemaining.release();
+
+          @Override
+          public void onBlockFetchFailure(String blockId, Throwable exception) {
+            synchronized (this) {
+              if (!res.successBlocks.contains(blockId) && !res.failedBlocks.contains(blockId)) {
+                res.failedBlocks.add(blockId);
+                requestsRemaining.release();
+              }
              }
            }
-        }
-      }, null);
+        }, null
+      );
--- End diff --

Done


---




[GitHub] spark pull request #22637: [SPARK-25408] Move to more idiomatic Java8

2018-10-04 Thread srowen
Github user srowen commented on a diff in the pull request:

https://github.com/apache/spark/pull/22637#discussion_r222892548
  
--- Diff: common/network-shuffle/src/test/java/org/apache/spark/network/shuffle/ExternalShuffleIntegrationSuite.java ---
@@ -133,37 +133,38 @@ private FetchResult fetchBlocks(

    final Semaphore requestsRemaining = new Semaphore(0);

-    ExternalShuffleClient client = new ExternalShuffleClient(clientConf, null, false, 5000);
-    client.init(APP_ID);
-    client.fetchBlocks(TestUtils.getLocalHost(), port, execId, blockIds,
-      new BlockFetchingListener() {
-        @Override
-        public void onBlockFetchSuccess(String blockId, ManagedBuffer data) {
-          synchronized (this) {
-            if (!res.successBlocks.contains(blockId) && !res.failedBlocks.contains(blockId)) {
-              data.retain();
-              res.successBlocks.add(blockId);
-              res.buffers.add(data);
-              requestsRemaining.release();
+    try (ExternalShuffleClient client = new ExternalShuffleClient(clientConf, null, false, 5000)) {
+      client.init(APP_ID);
+      client.fetchBlocks(TestUtils.getLocalHost(), port, execId, blockIds,
+        new BlockFetchingListener() {
+          @Override
+          public void onBlockFetchSuccess(String blockId, ManagedBuffer data) {
+            synchronized (this) {
+              if (!res.successBlocks.contains(blockId) && !res.failedBlocks.contains(blockId)) {
+                data.retain();
+                res.successBlocks.add(blockId);
+                res.buffers.add(data);
+                requestsRemaining.release();
+              }
            }
          }
-        }
-
-        @Override
-        public void onBlockFetchFailure(String blockId, Throwable exception) {
-          synchronized (this) {
-            if (!res.successBlocks.contains(blockId) && !res.failedBlocks.contains(blockId)) {
-              res.failedBlocks.add(blockId);
-              requestsRemaining.release();
+
+          @Override
+          public void onBlockFetchFailure(String blockId, Throwable exception) {
+            synchronized (this) {
+              if (!res.successBlocks.contains(blockId) && !res.failedBlocks.contains(blockId)) {
+                res.failedBlocks.add(blockId);
+                requestsRemaining.release();
+              }
              }
            }
-        }
-      }, null);
+        }, null
+      );
--- End diff --

Nit: move this onto the previous line


---




[GitHub] spark pull request #22637: [SPARK-25408] Move to more idiomatic Java8

2018-10-04 Thread srowen
Github user srowen commented on a diff in the pull request:

https://github.com/apache/spark/pull/22637#discussion_r222892759
  
--- Diff: sql/hive-thriftserver/src/main/java/org/apache/hive/service/cli/CLIService.java ---
@@ -154,6 +154,7 @@ public synchronized void start() {
   throw new ServiceException("Unable to connect to MetaStore!", e);
 }
 finally {
+  // IMetaStoreClient is not AutoCloseable, closing it manually
--- End diff --

I wouldn't bother with this. This is a copy of some Hive code anyway
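For context, a sketch of the pattern the comment was documenting: a class that offers a `close()` method but does not implement `AutoCloseable` cannot appear in a try-with-resources header, so it has to be closed manually in a `finally` block. `MetaStoreClient` below is a hypothetical stand-in for Hive's `IMetaStoreClient`, not the real interface:

```java
// Hypothetical stand-in: has close(), but is NOT AutoCloseable.
class MetaStoreClient {
  boolean closed = false;

  void getDelegationToken() { /* talk to the metastore */ }

  void close() { closed = true; }
}

public class ManualClose {
  static MetaStoreClient connect() {
    MetaStoreClient client = new MetaStoreClient();
    try {
      client.getDelegationToken();
    } finally {
      // Closed manually; `try (MetaStoreClient c = ...)` would not
      // compile because the type is not an AutoCloseable.
      client.close();
    }
    return client;
  }

  public static void main(String[] args) {
    System.out.println(connect().closed);  // prints "true"
  }
}
```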


---




[GitHub] spark pull request #22637: Spark 25408

2018-10-04 Thread Fokko
GitHub user Fokko opened a pull request:

https://github.com/apache/spark/pull/22637

Spark 25408

## What changes were proposed in this pull request?

(Please fill in changes proposed in this fix)

## How was this patch tested?

(Please explain how this patch was tested. E.g. unit tests, integration 
tests, manual tests)
(If this patch involves UI changes, please attach a screenshot; otherwise, 
remove this)

Please review http://spark.apache.org/contributing.html before opening a 
pull request.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/Fokko/spark SPARK-25408

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/22637.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #22637


commit d8d7c890f3d6914e47fc613f19415121809f4115
Author: Fokko Driesprong 
Date:   2018-09-11T20:37:33Z

[SPARK-25408] Move to more idiomatic Java8

Use features from Java 8 such as:
- try-with-resources blocks
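A minimal sketch of that Java 8 feature, not tied to any specific file in the PR: resources declared in the `try` header are closed automatically when the block exits, even if an exception is thrown, replacing explicit `close()` calls in `finally` blocks:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.Reader;
import java.nio.charset.StandardCharsets;

public class TryWithResources {
  // Reads a byte array as UTF-8; both the stream and the reader are
  // closed automatically by the try-with-resources statement.
  static String readAll(byte[] bytes) throws IOException {
    try (InputStream in = new ByteArrayInputStream(bytes);
        Reader reader = new InputStreamReader(in, StandardCharsets.UTF_8)) {
      StringBuilder sb = new StringBuilder();
      char[] buf = new char[1024];
      int n;
      while ((n = reader.read(buf)) != -1) {
        sb.append(buf, 0, n);
      }
      return sb.toString();
    }
  }

  public static void main(String[] args) throws IOException {
    // prints "hello"
    System.out.println(readAll("hello".getBytes(StandardCharsets.UTF_8)));
  }
}
```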

commit 99fd3a5af2c4fddc752b5b9c8725c4b68b2c9dbc
Author: Fokko Driesprong 
Date:   2018-10-05T04:58:37Z

[SPARK-25408] Move to more idiomatic Java8

Fix the non-AutoCloseable operations




---
