nsivabalan commented on a change in pull request #1858:
URL: https://github.com/apache/hudi/pull/1858#discussion_r466989376



##########
File path: hudi-cli/src/main/java/org/apache/hudi/cli/commands/SparkMain.java
##########
@@ -329,9 +341,34 @@ private static int deleteSavepoint(JavaSparkContext jsc, String savepointTime, S
     }
   }
 
+  /**
+   * Upgrade or downgrade a hoodie table.
+   * @param jsc instance of {@link JavaSparkContext} to use.
+   * @param basePath base path of the dataset.
+   * @param toVersion version to which to upgrade or downgrade.
+   * @return 0 on success, else -1.
+   * @throws Exception if the upgrade or downgrade fails.
+   */
+  protected static int upgradeOrDowngradeHoodieDataset(JavaSparkContext jsc, String basePath, String toVersion) throws Exception {
+    HoodieWriteConfig config = getWriteConfig(basePath);
+    HoodieTableMetaClient metaClient = ClientUtils.createMetaClient(jsc.hadoopConfiguration(), config, false);
+    try {
+      UpgradeDowngradeUtil.doUpgradeOrDowngrade(metaClient, HoodieTableVersion.valueOf(toVersion), config, jsc, null);

Review comment:
       I am not sure "migrate" is the right terminology to use here. Isn't "migrate" usually used for moving from one system to another? This is more of a version upgrade or downgrade within the same system (Hudi), right?
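For context, the same-system version switch that the diff above performs (stepping a table's on-disk version toward a target) can be sketched as a minimal, self-contained dispatcher. All class and method names below are hypothetical illustrations of the pattern, not Hudi's actual APIs:

```java
// Minimal sketch of a same-system version upgrade/downgrade dispatcher,
// mirroring the shape of upgradeOrDowngradeHoodieDataset in the diff above.
// All names here are hypothetical, not Hudi's real classes.
public class VersionMigrationSketch {

  enum TableVersion { ZERO, ONE, TWO }

  /** Moves the current version one step at a time toward the target version. */
  static TableVersion upgradeOrDowngrade(TableVersion current, TableVersion target) {
    while (current != target) {
      if (current.ordinal() < target.ordinal()) {
        // one upgrade step
        current = TableVersion.values()[current.ordinal() + 1];
      } else {
        // one downgrade step
        current = TableVersion.values()[current.ordinal() - 1];
      }
      // a real implementation would rewrite table metadata/properties per step here
    }
    return current;
  }

  public static void main(String[] args) {
    System.out.println(upgradeOrDowngrade(TableVersion.ZERO, TableVersion.TWO)); // prints TWO
    System.out.println(upgradeOrDowngrade(TableVersion.TWO, TableVersion.ZERO)); // prints ZERO
  }
}
```

Stepping one version at a time (rather than jumping straight to the target) keeps each transition small and reversible, which is why "upgrade/downgrade" fits better than "migrate" here.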




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]
