This is an automated email from the ASF dual-hosted git repository.
adutra pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/polaris.git
The following commit(s) were added to refs/heads/main by this push:
new 3c3fc95e docs: adapt regtests README.md to Quarkus (#846)
3c3fc95e is described below
commit 3c3fc95e26018f28e82d578be9bf72e7151c4dba
Author: Alexandre Dutra <[email protected]>
AuthorDate: Fri Jan 24 10:33:41 2025 +0100
docs: adapt regtests README.md to Quarkus (#846)
---
regtests/README.md | 108 ++++++++++++++++++++++++++++++++++++++++++-----------
1 file changed, 86 insertions(+), 22 deletions(-)
diff --git a/regtests/README.md b/regtests/README.md
index d0844f43..90a0f75a 100644
--- a/regtests/README.md
+++ b/regtests/README.md
@@ -21,31 +21,78 @@
# End-to-end regression tests
-## Run Tests With Docker Compose
+Regression tests are run either in Docker, using docker-compose to orchestrate the tests, or
+locally.
+
+## Prerequisites
-Tests can be run with docker-compose by executing
+It is recommended to clean the `regtests/output` directory before running tests. This can be done by
+running:
-```bash
-docker compose up --build --exit-code-from regtest
+```shell
+rm -rf ./regtests/output && mkdir -p ./regtests/output && chmod -R 777 ./regtests/output
```
-This is the flow used in CI and should be done locally before pushing to github to ensure that no environmental
-factors contribute to the outcome of the tests.
+## Run Tests With Docker Compose
+
+Tests can be run with docker-compose using the provided `./regtests/docker-compose.yml` file, as
+follows:
+
+```shell
+./gradlew :polaris-quarkus-server:assemble -Dquarkus.container-image.build=true
+docker compose -f ./regtests/docker-compose.yml up --build --exit-code-from regtest
+```
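+
+If a run leaves the containers behind, for example after interrupting the tests, you can tear the
+docker-compose group down with the standard compose command (a suggested cleanup step, not part of
+the CI flow):
+
+```shell
+docker compose -f ./regtests/docker-compose.yml down
+```
+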
-## Run all tests
+In this setup, a Polaris container will be started in a docker-compose group, using the image
+previously built by the Gradle build. Then another container, running a Spark SQL shell, will
+execute the tests. The exit code of the run will be the exit code of the Spark container.
-Polaris REST server must be running on localhost:8181 before running tests.
+This is the flow used in CI and should be done locally before pushing to GitHub to ensure that no
+environmental factors contribute to the outcome of the tests.
-Running test harness will automatically run idempotent setup script.
+**Important**: if you are also using Minikube, for example to test the Helm chart, you may need to
+_unset_ the Docker environment that was pointing to the Minikube Docker daemon; otherwise the image
+will be built by the Minikube Docker daemon and will not be available to the local Docker daemon.
+This can be done by running, _before_ building the image and running the tests:
+```shell
+eval $(minikube -p minikube docker-env --unset)
```
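+
+To double-check which daemon the image ended up in, you can list the images known to the local
+Docker daemon; a quick sanity check, assuming the image name contains `polaris` (the exact name
+depends on the Quarkus container-image settings):
+
+```shell
+docker images | grep -i polaris
+```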
-./run.sh
+
+## Run Tests Locally
+
+Regression tests can be run locally as well, using the test harness.
+
+In this setup, a Polaris server must be running on localhost:8181 before running tests. The simplest
+way to do this is to run the Polaris server in a separate terminal window:
+
+```shell
+./gradlew polarisServerRun \
+ '-Dpolaris.authentication.authenticator.type=test' \
+ '-Dpolaris.authentication.token-service.type=test' \
+ '-Dpolaris.features.defaults."SUPPORTED_CATALOG_STORAGE_TYPES"=["FILE","S3","GCS","AZURE"]' \
+ '-Dpolaris.realm-context.realms=default-realm,realm1' \
+ '-Dquarkus.otel.sdk.disabled=true'
```
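+
+To verify that the server is up before running the tests, you can probe the catalog API; a rough
+reachability check (the endpoint below is an assumption, and an HTTP 4xx response without
+credentials still indicates that the server is listening):
+
+```shell
+curl -s -o /dev/null -w '%{http_code}\n' http://localhost:8181/api/catalog/v1/config
+```
+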
-## Run in VERBOSE mode with test stdout printing to console
+Note: the regression tests expect Polaris to run with certain options, e.g. with support for `FILE`
+storage and with realms `default-realm,realm1`; if you run the above command, this will be the case.
+If you run Polaris in a different way, make sure that Polaris is configured appropriately.
+Running the test harness will automatically run the idempotent setup script. From the root of the
+project, just run:
+
+```shell
+env POLARIS_HOST=localhost ./regtests/run.sh
```
-VERBOSE=1 ./run.sh t_spark_sql/src/spark_sql_basic.sh
+
+To run the tests in verbose mode, with test stdout printing to the console, set the `VERBOSE`
+environment variable to `1`; you can also choose to run only a subset of tests by specifying
+individual test scripts as arguments to `run.sh`. For example, to run only the `t_spark_sql` tests
+in verbose mode:
+
+```shell
+VERBOSE=1 POLARIS_HOST=localhost ./regtests/run.sh t_spark_sql/src/spark_sql_basic.sh
```
## Run with Cloud resources
@@ -78,12 +125,14 @@ into the `credentials` folder. Then specify the name of the file in your .env fi
path, as `/tmp/credentials` is the folder on the container where the credentials file will be mounted.
-## Experiment with failed test
+## Fixing a failed test due to incorrect expected output
-```
-rm t_hello_world/ref/hello_world.sh.ref
-./run.sh
-```
+If a test fails due to incorrect expected output, the test harness will generate a script to help
+you compare the actual output with the expected output. The script will be located in the `output`
+directory, and will have the same name as the test, with the extension `.fixdiffs.sh`.
+
+For example, if the test `t_hello_world` fails, the script to compare the actual and expected output
+will be located at `output/t_hello_world/hello_world.sh.fixdiffs.sh`:
```
Tue Apr 23 06:32:23 UTC 2024: Running all tests
@@ -96,26 +145,41 @@ Tue Apr 23 06:32:32 UTC 2024: Test run concluded for t_spark_sql:spark_sql_basic
Tue Apr 23 06:32:32 UTC 2024: Test SUCCEEDED: t_spark_sql:spark_sql_basic.sh
```
-Simply run the specified fixdiffs file to run `meld` and fix the ref file.
+Simply execute the specified `fixdiffs.sh` file, which will in turn run `meld` and fix the ref file:
```
/tmp/polaris-regtests/t_hello_world/hello_world.sh.fixdiffs.sh
```
+Then commit the changes to the ref file.
+
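+If `meld` is not available on your machine, you can reconcile the files by hand; a minimal sketch,
+assuming the layout shown above (both paths are illustrative and may differ on your setup):
+
+```shell
+# inspect the differences, then overwrite the checked-in ref file with the actual output
+diff /tmp/polaris-regtests/t_hello_world/hello_world.sh.out t_hello_world/ref/hello_world.sh.ref
+cp /tmp/polaris-regtests/t_hello_world/hello_world.sh.out t_hello_world/ref/hello_world.sh.ref
+```
+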
## Run a spark-sql interactive shell
-With in-memory standalong Polaris running:
+With a Polaris server running in "dev" mode (see above), you can run an interactive spark-sql
+shell to test against it. From the root of the project:
+```shell
+POLARIS_HOST=localhost ./regtests/run_spark_sql.sh
```
-./run_spark_sql.sh
+
+Some SQL commands that you can try:
+
+```sql
+create database db1;
+show databases;
+create table db1.table1 (id int, name string);
+insert into db1.table1 values (1, 'a');
+select * from db1.table1;
```
+Other commands are available in the `regtests/t_spark_sql/src` directory.
+
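+To see what else the suite exercises, list the test sources (directory name taken from the text
+above):
+
+```shell
+ls regtests/t_spark_sql/src
+```
+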
## Python Tests
Python tests are based on `pytest`. They rely on a python Polaris client, which is generated from the openapi spec.
The client can be generated using two commands:
-```bash
+```shell
# generate the management api client
$ docker run --rm \
-v ${PWD}:/local openapitools/openapi-generator-cli generate \
@@ -134,7 +198,7 @@ $ docker run --rm \
Tests rely on Python 3.8 or higher. `pyenv` can be used to install a current version and mapped to the local directory
by using
-```bash
+```shell
pyenv install 3.8
pyenv local 3.8
```
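+
+Once the client is generated and a suitable Python is active, the tests themselves can be run with
+`pytest`; a sketch, assuming the generated client and its tests live under `regtests/client/python`
+(this path is an assumption and may differ):
+
+```shell
+cd regtests/client/python && python -m pytest test/
+```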