This is an automated email from the ASF dual-hosted git repository.
kfaraz pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/druid.git
The following commit(s) were added to refs/heads/master by this push:
new 5fee4e1960a docs: fix syntax (#17859)
5fee4e1960a is described below
commit 5fee4e1960a214731616c1aab1f3de657e22e5f3
Author: 317brian <[email protected]>
AuthorDate: Tue Apr 1 20:01:31 2025 -0700
docs: fix syntax (#17859)
* docs: fix syntax
* fix node version
* fix docusaurus detected ones
* spelling file
* Update docs/querying/sql-functions.md
---------
Co-authored-by: Victoria Lim <[email protected]>
---
.github/workflows/static-checks.yml | 2 +-
docs/querying/sql-functions.md | 11 +++++++----
docs/tutorials/tutorial-extern.md | 2 +-
docs/tutorials/tutorial-rollup.md | 4 ++--
docs/tutorials/tutorial-sketches-theta.md | 5 ++---
docs/tutorials/tutorial-transform.md | 8 ++++----
website/.spelling | 2 ++
7 files changed, 19 insertions(+), 15 deletions(-)
diff --git a/.github/workflows/static-checks.yml b/.github/workflows/static-checks.yml
index 6a149a46a6d..ee93ad2f581 100644
--- a/.github/workflows/static-checks.yml
+++ b/.github/workflows/static-checks.yml
@@ -160,7 +160,7 @@ jobs:
- name: setup node
uses: actions/setup-node@v3
with:
- node-version: 16.17.0
+ node-version: 18.0.0
- name: docs
run: |
diff --git a/docs/querying/sql-functions.md b/docs/querying/sql-functions.md
index 2eebcc5b2de..05e1faafd68 100644
--- a/docs/querying/sql-functions.md
+++ b/docs/querying/sql-functions.md
@@ -1333,9 +1333,10 @@ Computes a [Bloom filter](../development/extensions-core/bloom-filter.md) from v
`numEntries` specifies the maximum number of distinct values before the false positive rate increases.
* **Function type:** Aggregation
-<details><summary>Example</summary>
+<details>
+<summary>Example</summary>
-The following example returns a Base64-encoded Bloom filter representing the set of devices ,`agent_category`, used in Albania:
+The following example returns a Base64-encoded Bloom filter representing the set of devices, `agent_category`, used in Albania:
```sql
SELECT "country",
@@ -1362,7 +1363,8 @@ Returns true if an expression is contained in a Base64-encoded [Bloom filter](..
* **Syntax:** `BLOOM_FILTER_TEST(expr, <STRING>)`
* **Function type:** Scalar, other
-<details><summary>Example</summary>
+<details>
+<summary>Example</summary>
The following example returns `true` when a device type, `agent_category`, exists in the Bloom filter representing the set of devices used in Albania:
@@ -1912,7 +1914,8 @@ To enable support for a complex data type, load the [corresponding extension](..
* **Syntax:** `DECODE_BASE64_COMPLEX(dataType, expr)`
* **Function type:** Scalar, other
-<details><summary>Example</summary>
+<details>
+<summary>Example</summary>
The following example returns a Theta sketch complex type from a Base64-encoded string representation of the sketch:
diff --git a/docs/tutorials/tutorial-extern.md b/docs/tutorials/tutorial-extern.md
index 7ec77f82d17..a58ed13a67f 100644
--- a/docs/tutorials/tutorial-extern.md
+++ b/docs/tutorials/tutorial-extern.md
@@ -203,4 +203,4 @@ Druid supports Amazon S3 or Google Cloud Storage (GCS) as cloud storage destinat
See the following topics for more information:
* [Export to a destination](../multi-stage-query/reference.md#extern-to-export-to-a-destination) for a reference of the EXTERN.
-* [SQL-based ingestion security](../multi-stage-query/security.md/#permissions-for-durable-storage) for cloud permission requirements for MSQ.
+* [SQL-based ingestion security](../multi-stage-query/security.md#permissions-for-durable-storage) for cloud permission requirements for MSQ.
diff --git a/docs/tutorials/tutorial-rollup.md b/docs/tutorials/tutorial-rollup.md
index 464197d551c..12a7e2a900a 100644
--- a/docs/tutorials/tutorial-rollup.md
+++ b/docs/tutorials/tutorial-rollup.md
@@ -30,7 +30,7 @@ This tutorial demonstrates how to apply rollup during ingestion and highlights i
## Prerequisites
-Before proceeding, download Druid as described in [Quickstart (local)](index.md) and have it running on your local machine. You don't need to load any data into the Druid cluster.
+Before proceeding, download Druid as described in [Quickstart (local)](./index.md) and have it running on your local machine. You don't need to load any data into the Druid cluster.
You should be familiar with data querying in Druid. If you haven't already, go through the [Query data](../tutorials/tutorial-query.md) tutorial first.
@@ -52,7 +52,7 @@ The data contains packet and byte counts from a source IP address to a destinati
{"timestamp":"2018-01-02T21:35:45Z","srcIP":"7.7.7.7", "dstIP":"8.8.8.8","packets":12,"bytes":2818}
```
-Load the sample dataset using the [`INSERT INTO`](../multi-stage-query/reference.md/#insert) statement and the [`EXTERN`](../multi-stage-query/reference.md/#extern-function) function to ingest the data inline. In the [Druid web console](../operations/web-console.md), go to the **Query** view and run the following query:
+Load the sample dataset using the [`INSERT INTO`](../multi-stage-query/reference.md#insert) statement and the [`EXTERN`](../multi-stage-query/reference.md#extern-function) function to ingest the data inline. In the [Druid web console](../operations/web-console.md), go to the **Query** view and run the following query:
```sql
INSERT INTO "rollup_tutorial"
diff --git a/docs/tutorials/tutorial-sketches-theta.md b/docs/tutorials/tutorial-sketches-theta.md
index b5261776a02..681db68b88a 100644
--- a/docs/tutorials/tutorial-sketches-theta.md
+++ b/docs/tutorials/tutorial-sketches-theta.md
@@ -60,7 +60,7 @@ In this tutorial, you will learn how to do the following:
## Prerequisites
-Before proceeding, download Druid as described in the [single-machine quickstart](index.md) and have it running on your local machine. You don't need to load any data into the Druid cluster.
+Before proceeding, download Druid as described in the [single-machine quickstart](./index.md) and have it running on your local machine. You don't need to load any data into the Druid cluster.
It's helpful to have finished [Tutorial: Loading a file](../tutorials/tutorial-batch.md) and [Tutorial: Querying data](../tutorials/tutorial-query.md).
@@ -97,8 +97,7 @@ date,uid,show,episode
## Ingest data using Theta sketches
-Load the sample dataset using the [`INSERT INTO`](../multi-stage-query/reference.md/#insert) statement and the [`EXTERN`](../multi-stage-query/reference.md/#extern-function) function to ingest the sample data inline. In the [Druid web console](../operations/web-console.md), go to the **Query** view and run the following query:
-
+Load the sample dataset using the [`INSERT INTO`](../multi-stage-query/reference.md#insert) statement and the [`EXTERN`](../multi-stage-query/reference.md#extern-function) function to ingest the sample data inline. In the [Druid web console](../operations/web-console.md), go to the **Query** view and run the following query:
```sql
INSERT INTO "ts_tutorial"
diff --git a/docs/tutorials/tutorial-transform.md b/docs/tutorials/tutorial-transform.md
index 3c9815b82bb..930dc5bad32 100644
--- a/docs/tutorials/tutorial-transform.md
+++ b/docs/tutorials/tutorial-transform.md
@@ -28,7 +28,7 @@ This tutorial demonstrates how to transform input data during ingestion.
## Prerequisite
-Before proceeding, download Apache Druid® as described in [Quickstart (local)](index.md) and have it running on your local machine. You don't need to load any data into the Druid cluster.
+Before proceeding, download Apache Druid® as described in [Quickstart (local)](./index.md) and have it running on your local machine. You don't need to load any data into the Druid cluster.
You should be familiar with data querying in Druid. If you haven't already, go through the [Query data](../tutorials/tutorial-query.md) tutorial first.
@@ -45,7 +45,7 @@ For this tutorial, you use the following sample data:
## Transform data during ingestion
-Load the sample dataset using the [`INSERT INTO`](../multi-stage-query/reference.md/#insert) statement and the [`EXTERN`](../multi-stage-query/reference.md/#extern-function) function to ingest the data inline. In the [Druid web console](../operations/web-console.md), go to the **Query** view and run the following query:
+Load the sample dataset using the [`INSERT INTO`](../multi-stage-query/reference.md#insert) statement and the [`EXTERN`](../multi-stage-query/reference.md#extern-function) function to ingest the data inline. In the [Druid web console](../operations/web-console.md), go to the **Query** view and run the following query:
```sql
INSERT INTO "transform_tutorial"
@@ -65,7 +65,7 @@ PARTITIONED BY DAY
```
In the `SELECT` clause, you specify the following transformations:
-* `animal`: prepends "super-" to the values in the `animal` column using the [`TEXTCAT`](../querying/sql-functions.md/#textcat) function. Note that it only ingests the transformed data.
+* `animal`: prepends "super-" to the values in the `animal` column using the [`TEXTCAT`](../querying/sql-functions.md#textcat) function. Note that it only ingests the transformed data.
* `triple-number`: multiplies the `number` column by three and stores the results in a column named `triple-number`. Note that the query ingests both the original and the transformed data.
Additionally, the `WHERE` clause applies the following three OR operators so that the query only ingests the rows where at least one of the following conditions is `true`:
@@ -99,5 +99,5 @@ Notice how the "lion" row is missing, and how the other three rows that were ing
See the following topics for more information:
* [All functions](../querying/sql-functions.md) for a list of functions that can be used to transform data.
-* [Transform spec reference](../ingestion/ingestion-spec.md/#transformspec) to learn more about transforms in JSON-based batch ingestion.
+* [Transform spec reference](../ingestion/ingestion-spec.md#transformspec) to learn more about transforms in JSON-based batch ingestion.
* [WHERE clause](../querying/sql.md#where) to learn how to specify filters in Druid SQL.
\ No newline at end of file
diff --git a/website/.spelling b/website/.spelling
index 8df218d3351..2c84ee43b9c 100644
--- a/website/.spelling
+++ b/website/.spelling
@@ -18,6 +18,8 @@
# global dictionary is at the start, file overrides afterwards
# one word per line, to define a file override use ' - filename'
# where filename is relative to this configuration file
+BrowserOnly
+docusaurus
1M
100MiB
32-bit
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]