[ 
https://issues.apache.org/jira/browse/GOBBLIN-1709?focusedWorklogId=809585&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-809585
 ]

ASF GitHub Bot logged work on GOBBLIN-1709:
-------------------------------------------

                Author: ASF GitHub Bot
            Created on: 16/Sep/22 16:40
            Start Date: 16/Sep/22 16:40
    Worklog Time Spent: 10m 
      Work Description: meethngala commented on code in PR #3560:
URL: https://github.com/apache/gobblin/pull/3560#discussion_r973206997


##########
gobblin-data-management/src/main/java/org/apache/gobblin/data/management/copy/iceberg/IcebergDatasetFinder.java:
##########
@@ -0,0 +1,97 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.gobblin.data.management.copy.iceberg;
+
+import java.io.IOException;
+import java.util.ArrayList;
+import java.util.Iterator;
+import java.util.List;
+import java.util.Properties;
+import lombok.AllArgsConstructor;
+import lombok.extern.slf4j.Slf4j;
+import org.apache.commons.lang.StringUtils;
+import org.apache.gobblin.dataset.DatasetConstants;
+import org.apache.gobblin.dataset.IterableDatasetFinder;
+import org.apache.gobblin.util.HadoopUtils;
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.fs.FileSystem;
+import org.apache.hadoop.fs.Path;
+
+/**
+ * Finds {@link IcebergDataset}s. Looks for tables in a database using a {@link IcebergCatalog},
+ * and creates an {@link IcebergDataset} for each one.
+ */
+@Slf4j
+@AllArgsConstructor
+public class IcebergDatasetFinder implements IterableDatasetFinder<IcebergDataset> {
+
+  public static final String ICEBERG_DATASET_PREFIX = DatasetConstants.PLATFORM_ICEBERG + ".dataset";
+  public static final String ICEBERG_HIVE_CATALOG_METASTORE_URI_KEY = ICEBERG_DATASET_PREFIX + ".hive.metastore.uri";
+  public static final String ICEBERG_DB_NAME = ICEBERG_DATASET_PREFIX + ".database.name";
+  public static final String ICEBERG_TABLE_NAME = ICEBERG_DATASET_PREFIX + ".table.name";
+
+  private String dbName;
+  private String tblName;
+  private final Properties properties;
+  protected final FileSystem fs;
+
+  /**
+   * Finds all {@link IcebergDataset}s in the file system using the Iceberg Catalog.
+   * @return List of {@link IcebergDataset}s in the file system.
+   * @throws IOException if dataset discovery fails
+   */
+  @Override
+  public List<IcebergDataset> findDatasets() throws IOException {
+    List<IcebergDataset> matchingDatasets = new ArrayList<>();
+    /*
+     * Both the Iceberg database name and table name are mandatory in the current implementation.
+     * Later we may explore supporting datasets similar to Hive.
+     */
+    if (StringUtils.isBlank(properties.getProperty(ICEBERG_DB_NAME)) || StringUtils.isBlank(properties.getProperty(ICEBERG_TABLE_NAME))) {
+      throw new IllegalArgumentException(String.format("Iceberg database name: {%s} or Iceberg table name: {%s} is missing",
+          ICEBERG_DB_NAME, ICEBERG_TABLE_NAME));
+    }
+    this.dbName = properties.getProperty(ICEBERG_DB_NAME);
+    this.tblName = properties.getProperty(ICEBERG_TABLE_NAME);
+
+    Configuration configuration = HadoopUtils.getConfFromProperties(properties);
+
+    IcebergCatalog icebergCatalog = IcebergCatalogFactory.create(configuration);
+    /* Each Iceberg dataset maps to an Iceberg table.
+     * TODO: The user-provided database and table names need to be pre-checked and verified against the existence of a valid Iceberg table.
+     */
+    matchingDatasets.add(createIcebergDataset(dbName, tblName, icebergCatalog, properties, fs));
+    log.info("Found {} matching datasets: {}", matchingDatasets.size(), matchingDatasets);

Review Comment:
   done!
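For context, the `findDatasets()` logic in the diff above hinges on property-based validation: both the database name and the table name are mandatory, so a blank value for either should be rejected. A minimal, self-contained sketch of that check follows. Note the assumptions: the literal `iceberg.dataset` prefix stands in for `DatasetConstants.PLATFORM_ICEBERG + ".dataset"`, and a local `isBlank` helper replicates commons-lang `StringUtils.isBlank` to avoid the dependency.

```java
import java.util.Properties;

public class IcebergFinderConfigSketch {
  // Key names mirroring the constants in the diff; the "iceberg" platform
  // prefix is an assumption based on DatasetConstants.PLATFORM_ICEBERG.
  static final String ICEBERG_DATASET_PREFIX = "iceberg.dataset";
  static final String ICEBERG_DB_NAME = ICEBERG_DATASET_PREFIX + ".database.name";
  static final String ICEBERG_TABLE_NAME = ICEBERG_DATASET_PREFIX + ".table.name";

  // Stand-in for StringUtils.isBlank: true for null, empty, or whitespace-only.
  static boolean isBlank(String s) {
    return s == null || s.trim().isEmpty();
  }

  // Mirrors the mandatory-field validation in findDatasets(): a blank
  // database name OR a blank table name is rejected.
  static void validate(Properties props) {
    if (isBlank(props.getProperty(ICEBERG_DB_NAME)) || isBlank(props.getProperty(ICEBERG_TABLE_NAME))) {
      throw new IllegalArgumentException(String.format(
          "Iceberg database name: {%s} or Iceberg table name: {%s} is missing",
          ICEBERG_DB_NAME, ICEBERG_TABLE_NAME));
    }
  }

  public static void main(String[] args) {
    Properties props = new Properties();
    props.setProperty(ICEBERG_DB_NAME, "db1");
    props.setProperty(ICEBERG_TABLE_NAME, "tbl1");
    validate(props); // both names present: accepted

    Properties missingTable = new Properties();
    missingTable.setProperty(ICEBERG_DB_NAME, "db1");
    boolean rejected = false;
    try {
      validate(missingTable);
    } catch (IllegalArgumentException expected) {
      rejected = true;
    }
    System.out.println("missing table rejected: " + rejected);
  }
}
```

The point of the sketch is the direction of the check: `isBlank(...) || isBlank(...)` throws only when a required name is absent, whereas an `isNotBlank` check in the same position would reject every valid configuration.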





Issue Time Tracking
-------------------

    Worklog Id:     (was: 809585)
    Time Spent: 6h 40m  (was: 6.5h)

> Create work units for Hive Catalog based Iceberg Datasets to support Distcp 
> for Iceberg
> ---------------------------------------------------------------------------------------
>
>                 Key: GOBBLIN-1709
>                 URL: https://issues.apache.org/jira/browse/GOBBLIN-1709
>             Project: Apache Gobblin
>          Issue Type: New Feature
>          Components: distcp-ng
>            Reporter: Meeth Gala
>            Assignee: Issac Buenrostro
>            Priority: Major
>          Time Spent: 6h 40m
>  Remaining Estimate: 0h
>
> We want to support Distcp for Iceberg based datasets. 
> As a pilot, we are starting with Hive Catalog and will expand the 
> functionality to cover all Iceberg based datasets.
>  



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
