luoyuxia commented on code in PR #1924:
URL: https://github.com/apache/fluss/pull/1924#discussion_r2503520718
##########
website/docs/quickstart/flink-lake.md:
##########
@@ -253,26 +352,26 @@ CREATE TABLE enriched_orders (
`cust_mktsegment` STRING,
`nation_name` STRING,
PRIMARY KEY (`order_key`) NOT ENFORCED
+) WITH (
+ 'table.datalake.enabled' = 'true',
+ 'table.datalake.freshness' = '30s'
);
```
-## Streaming into Fluss
-
-First, run the following SQL to sync data from source tables to Fluss tables:
+Next, perform streaming data writing into the **datalake-enabled** table, `datalake_enriched_orders`:
Review Comment:
From
```
Next, perform streaming data writing into the **datalake-enabled**
```
to
```
INSERT INTO datalake_enriched_orders
```
we can reuse the same content, but remember to use
```
INSERT INTO datalake_enriched_orders
SELECT o.order_key,
o.cust_key,
o.total_price,
o.order_date,
o.order_priority,
o.clerk,
c.name,
c.phone,
c.acctbal,
c.mktsegment,
n.name
FROM (
SELECT *, PROCTIME() as ptime
FROM `default_catalog`.`default_database`.source_order
) o
LEFT JOIN fluss_customer FOR SYSTEM_TIME AS OF o.ptime AS c
ON o.cust_key = c.cust_key
LEFT JOIN fluss_nation FOR SYSTEM_TIME AS OF o.ptime AS n
ON c.nation_key = n.nation_key;
```
##########
website/docs/quickstart/flink-lake.md:
##########
@@ -23,6 +21,132 @@ We encourage you to use a recent version of Docker and [Compose v2](https://docs
### Starting required components
+<Tabs groupId="lake-tabs">
+ <TabItem value="paimon" label="Paimon" default>
+
+We will use `docker compose` to spin up the required components for this tutorial.
+
+1. Create a working directory for this guide.
+
+```shell
+mkdir fluss-quickstart-flink
Review Comment:
```suggestion
mkdir fluss-quickstart-flink-paimon
```
##########
website/docs/quickstart/flink-lake.md:
##########
@@ -518,12 +623,5 @@ docker compose exec taskmanager tree /tmp/iceberg/fluss
```
The files adhere to Iceberg's standard format, enabling seamless querying with other engines such as [Spark](https://iceberg.apache.org/docs/latest/spark-queries/) and [Trino](https://trino.io/docs/current/connector/iceberg.html).
-## Clean up
-After finishing the tutorial, run `exit` to exit Flink SQL CLI Container and then run
-```shell
-docker compose down -v
-```
-to stop all containers.
-
-## Learn more
-Now that you're up and running with Fluss and Flink with Iceberg, check out the [Apache Flink Engine](engine-flink/getting-started.md) docs to learn more features with Flink or [this guide](/maintenance/observability/quickstart.md) to learn how to set up an observability stack for Fluss and Flink.
\ No newline at end of file
+ </TabItem>
+</Tabs>
Review Comment:
We still need the clean-up section; please restore it after the closing `</Tabs>`.
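For reference, a sketch of what the restored section could look like, reusing the wording from the deleted lines above (placement after the closing `</Tabs>` is an assumption):

````markdown
## Clean up
After finishing the tutorial, run `exit` to exit Flink SQL CLI Container and then run
```shell
docker compose down -v
```
to stop all containers.

## Learn more
Now that you're up and running with Fluss and Flink, check out the [Apache Flink Engine](engine-flink/getting-started.md) docs to learn more features with Flink.
````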
##########
website/docs/quickstart/flink-lake.md:
##########
@@ -437,8 +447,103 @@ LEFT JOIN fluss_nation FOR SYSTEM_TIME AS OF o.ptime AS n
ON c.nation_key = n.nation_key;
```
+ </TabItem>
+</Tabs>
+
### Real-Time Analytics on Fluss datalake-enabled Tables
Review Comment:
There is not much difference between the two tabs in this step; could they be merged into one, keeping tabs only for the step that views the files?
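A hypothetical sketch of the merged layout, assuming only the file-viewing step differs between the two lake formats (section bodies are placeholders):

````markdown
### Real-Time Analytics on Fluss datalake-enabled Tables

(shared content for both formats goes here, outside any tab)

<Tabs groupId="lake-tabs">
  <TabItem value="paimon" label="Paimon" default>
    (Paimon-specific step: view the files)
  </TabItem>
  <TabItem value="iceberg" label="Iceberg">
    (Iceberg-specific step: view the files)
  </TabItem>
</Tabs>
````

Using the same `groupId="lake-tabs"` as the earlier tabs keeps the reader's format choice synchronized across the page.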
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]