pratyakshsharma commented on a change in pull request #2073:
URL: https://github.com/apache/hudi/pull/2073#discussion_r490165732
##########
File path: docs/_posts/2020-08-22-ingest-multiple-tables-using-hudi.md
##########
@@ -0,0 +1,101 @@
+---
+title: "Ingest multiple tables using Hudi"
+excerpt: "Ingesting multiple tables using Hudi in a single go is now possible. This blog gives a detailed explanation of how to achieve this using `HoodieMultiTableDeltaStreamer.java`"
+author: pratyakshsharma
+category: blog
+---
+
+When building a change data capture pipeline for existing or newly created relational databases, one of the most common problems one faces is simplifying the onboarding process for multiple tables. Ingesting multiple tables into Hudi datasets in a single go is now possible using the `HoodieMultiTableDeltaStreamer` class, which is a wrapper on top of the more popular `HoodieDeltaStreamer` class. Currently, `HoodieMultiTableDeltaStreamer` supports only the COPY_ON_WRITE storage type, and tables are ingested sequentially.
+
+This blog will guide you through configuring and running
`HoodieMultiTableDeltaStreamer`.
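For orientation, a typical launch looks roughly like the sketch below. The class name comes from the Hudi utilities module; the jar name, paths, and source class are placeholders, and the exact set of flags should be verified against the Hudi documentation for your version:

```sh
spark-submit \
  --class org.apache.hudi.utilities.deltastreamer.HoodieMultiTableDeltaStreamer \
  hudi-utilities-bundle.jar \
  --props file://<path-to>/common.properties \
  --config-folder file://<path-to-config-folder> \
  --base-path-prefix file://<base-path-prefix> \
  --table-type COPY_ON_WRITE \
  --source-class org.apache.hudi.utilities.sources.JsonKafkaSource
```

The `--config-folder` and `--base-path-prefix` options correspond to the configuration concepts described in the next section.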
+
+### Configuration
+
+ - `HoodieMultiTableDeltaStreamer` expects users to maintain table-wise overridden properties in separate files under a dedicated config folder. Common properties can be configured via a common properties file as well.
+ - By default, Hudi datasets are created under the path `<base-path-prefix>/<database_name>/<name_of_table_to_be_ingested>`.
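To make the layout above concrete, consider a hypothetical config folder containing a `common.properties` file plus one override file per table (e.g. `db1_table1_config.properties`). The property keys below are illustrative of how tables and their config files are wired together; check the Hudi documentation for the exact keys supported by your version:

```properties
# common.properties (illustrative keys, shared by all tables)
hoodie.deltastreamer.ingestion.tablesToBeIngested=db1.table1,db1.table2
hoodie.deltastreamer.ingestion.db1.table1.configFile=<config-folder>/db1_table1_config.properties
hoodie.deltastreamer.ingestion.db1.table2.configFile=<config-folder>/db1_table2_config.properties
```

With this in place, the datasets would land under `<base-path-prefix>/db1/table1` and `<base-path-prefix>/db1/table2`, following the default path convention described above.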
Review comment:
Doing that.
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]