rdblue commented on a change in pull request #1161:
URL: https://github.com/apache/iceberg/pull/1161#discussion_r449655771



##########
File path: spark2/src/test/java/org/apache/iceberg/actions/TestRemoveOrphanFilesAction.java
##########
@@ -542,4 +545,41 @@ public void testRemoveOrphanFilesWithRelativeFilePath() throws IOException, Inte
     Assert.assertEquals("Action should find 1 file", invalidFiles, result);
     Assert.assertTrue("Invalid file should be present", fs.exists(new Path(invalidFiles.get(0))));
   }
+
+  @Test
+  public void testRemoveOrphanFilesWithHadoopCatalog() throws InterruptedException {
+    HadoopCatalog catalog = new HadoopCatalog(new Configuration(), tableLocation);
+    String namespaceName = "testDb";
+    String tableName = "testTb";
+
+    Namespace namespace = Namespace.of(namespaceName);
+    TableIdentifier tableIdentifier = TableIdentifier.of(namespace, tableName);
+    Table table = catalog.createTable(tableIdentifier, SCHEMA, PartitionSpec.unpartitioned(), Maps.newHashMap());
+
+    List<ThreeColumnRecord> records = Lists.newArrayList(
+            new ThreeColumnRecord(1, "AAAAAAAAAA", "AAAA")
+    );
+    Dataset<Row> df = spark.createDataFrame(records, ThreeColumnRecord.class).coalesce(1);
+
+    String tableFileSystemPath = tableLocation + "/" + namespaceName + "/" + tableName;
+    df.select("c1", "c2", "c3")
+            .write()

Review comment:
       Nit: Indentation is off. Continuation lines should be 2 indents (4 spaces) from the indent of the line being continued, `df.select(...)`.
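
       For reference, the continuation would look like this with 4-space indentation relative to `df.select(...)`. This is an illustrative fragment only; the chained calls after `.write()` are assumptions, since the diff is truncated at that point:

```java
// Continuation lines indented 4 spaces (2 indents) past the start of df.select(...).
// The .format()/.mode()/.save() calls below are hypothetical — the reviewed diff
// ends at .write(), so only the indentation pattern is the point here.
df.select("c1", "c2", "c3")
    .write()
    .format("iceberg")
    .mode("append")
    .save(tableFileSystemPath);
```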




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


