[
https://issues.apache.org/jira/browse/HUDI-4018?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17532554#comment-17532554
]
sivabalan narayanan commented on HUDI-4018:
-------------------------------------------
# Simple sanity test (1 insert, 1 upsert, 1 update (spark-sql), 1 delete; validate after each step) to assist in catching any bundling issues or basic regressions.
# Non-core write operations: insert overwrite table, insert overwrite, delete
partitions.
# Immutable data: pure bulk_insert and pure insert workloads.
# Long-running tests (multiple batches of inserts, upserts, deletes, and
validation), at least 100 commits.
## We will make the cleaner and archival configs aggressive (< 10 commits
retained) so that both get exercised often during these tests.
# Savepoint and restore tests.
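
The sanity cycle in item 1 could be sketched in Spark SQL roughly as below; the table name, schema, and values are illustrative, not part of any actual yaml:

```sql
-- Hypothetical sanity cycle for a Hudi table via spark-sql.
CREATE TABLE hudi_sanity (
  id INT,
  name STRING,
  price DOUBLE,
  ts BIGINT
) USING hudi
TBLPROPERTIES (type = 'cow', primaryKey = 'id', preCombineField = 'ts');

INSERT INTO hudi_sanity VALUES (1, 'a', 10.0, 1000);  -- insert
INSERT INTO hudi_sanity VALUES (1, 'a', 12.0, 1001);  -- upsert (same key)
UPDATE hudi_sanity SET price = 15.0 WHERE id = 1;     -- update via spark-sql
DELETE FROM hudi_sanity WHERE id = 1;                 -- delete
SELECT COUNT(*) FROM hudi_sanity;                     -- validate
```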
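
For the aggressive cleaner/archival settings in item 4, one possible shape (the specific values here are assumptions for the test run, not recommendations; note hoodie.keep.min.commits must stay greater than hoodie.cleaner.commits.retained):

```sql
-- Illustrative aggressive retention settings for a spark-sql session.
SET hoodie.cleaner.commits.retained = 4;  -- default is 10
SET hoodie.keep.min.commits = 5;          -- default is 20
SET hoodie.keep.max.commits = 8;          -- default is 30
```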
> Prepare minimal set of yamls to be tested against any write mode and against
> any query engine
> ---------------------------------------------------------------------------------------------
>
> Key: HUDI-4018
> URL: https://issues.apache.org/jira/browse/HUDI-4018
> Project: Apache Hudi
> Issue Type: Improvement
> Components: tests-ci
> Reporter: sivabalan narayanan
> Assignee: sivabalan narayanan
> Priority: Major
> Labels: pull-request-available
> Fix For: 0.12.0
>
> Original Estimate: 8h
> Remaining Estimate: 8h
>
> Prepare a minimal set of 5 to 8 yamls that can be used against any write
> mode, any query engine, and all table types.
>
> For example, let's say we come up with 6 yamls covering all cases.
> The same set should work for all possible combinations from the categories
> below.
>
> Table type:
> COW/MOR
> Metadata:
> enable/disable
> Dataset type:
> partitioned/non-partitioned
> Write mode:
> delta streamer, spark datasource, spark sql, spark streaming sink
>
> Query engine:
> spark datasource, hive, presto, trino
>
--
This message was sent by Atlassian Jira
(v8.20.7#820007)