This is an automated email from the ASF dual-hosted git repository.

gerlowskija pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/solr-sandbox.git


The following commit(s) were added to refs/heads/main by this push:
     new 22f41cd  Add automation for running gatling nightly (#124)
22f41cd is described below

commit 22f41cd192b3212ffa2e29cc375fad8d71532dc2
Author: Jason Gerlowski <[email protected]>
AuthorDate: Fri Oct 10 08:38:24 2025 -0400

    Add automation for running gatling nightly (#124)
    
    Primary entrypoint is `run-benchmark-on-commits.sh`, which takes a
    list of commits (or a "branch", "startCommit" and "endCommit")
    and iterates over them, running Gatling benchmarks on each and storing
    the results in a data directory.
    
    Also added is a `periodic-benchmark.sh` script that is a little more suitable
    to run in a cronjob.  It works by recording some state about previous runs,
    allowing the current run to "remember" where previous runs left off and
    benchmark all commits since that point.
    
    A Jenkins integration or something else would obviously be nicer as a long
    term solution.  That's probably the "right" way to do this sort of recurring
    automation.
    
    But providing a simple path for folks to get up and running that doesn't
    involve installing and running Jenkins seems like a nice 80/20 tradeoff
    that lowers the barrier to entry a good deal.
    
    ---------
    
    Co-authored-by: Eric Pugh <[email protected]>
---
 scripts/gatling/README.md                   |  51 +++++++++++
 scripts/gatling/lib/env-state.sh            | 136 ++++++++++++++++++++++++++++
 scripts/gatling/lib/gatling.sh              |  20 ++++
 scripts/gatling/lib/git.sh                  |  99 ++++++++++++++++++++
 scripts/gatling/lib/solr.sh                 |  69 ++++++++++++++
 scripts/gatling/periodic-benchmark.sh       |  83 +++++++++++++++++
 scripts/gatling/run-benchmark-on-commits.sh | 102 +++++++++++++++++++++
 7 files changed, 560 insertions(+)

diff --git a/scripts/gatling/README.md b/scripts/gatling/README.md
new file mode 100644
index 0000000..80271ad
--- /dev/null
+++ b/scripts/gatling/README.md
@@ -0,0 +1,51 @@
+# Gatling Benchmark Scripts
+
+This directory contains automation scripts for running Gatling benchmarks (located in the `gatling-simulations` module) on a periodic basis.
+
+Each script "run" is intended to (1) identify the commits that are new since the previous run, and then, for each such commit, (2) build a Solr distribution, (3) start Solr, and (4) run a set of Gatling benchmarks against the running Solr node.
+
+## "State" and Output
+
+The scripts in this directory can be run manually, but are primarily intended to be invoked via a cronjob or in some other automated fashion.
+To this end the scripts are "stateful", so that they can remember what commits are "new" since the last run.
+By default, all state is stored in the `$HOME/.solr-benchmarks` directory.
+(This can be overridden by exporting a `BENCH_STATE_ROOT` env var pointing to an alternate location.)
+This directory is used to store repository checkouts, a "state" text file which remembers which commits have been benchmarked, and benchmark results themselves (located in the `results` subdirectory).
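+For illustration, a (hypothetical) state directory might look like the layout below; the exact contents depend on which tags and commits have been benchmarked.
+```
+~/.solr-benchmarks/
+├── checkouts/solr/          # git clone of apache/solr, reused across runs
+├── last-run.txt             # one "<tag> <last-benchmarked-commit>" line per tag
+└── results/<tag>/<commit>/  # Gatling report output for each benchmarked commit
+```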
+
+Scripts store all Gatling output for each "run", which primarily consists of an HTML "report" web page and any JS and other assets that it requires.
+(We don't yet have a good way to aggregate these reports over time and show trendlines or other longitudinal data, but this is in progress.)
+
+## Customization
+
+The scripts support several environment variables that can be used to customize different aspects of the automation:
+
+- `START_SOLR_OPTS` - a string containing any arguments to pass as part of the `bin/solr start` command used to start Solr.  Particularly useful for specifying the Solr heap size and adjacent settings.  (See the example after this list.)
+- `GATLING_SIMULATION_LIST` - (TODO: unimplemented) by default the scripts will run all available Gatling simulations, but if a more targeted subset of simulations is desired, users can specify it as a comma-delimited list here.
+- `ALERT_COMMAND` - (TODO: unimplemented) an optional CLI command to run to alert users of any warnings or errors (e.g. a Solr commit that fails to build).  The command will be passed a single argument containing the text of the notification.  Alerting mechanisms can be relatively simple (e.g. using `echo` to write to a known location on disk, or `mail` to send email) or more complex (e.g. CLI integrations with PagerDuty or other notification providers like "Pushover").
+
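+For illustration, an invocation that sets these variables might look like the following (the alert command is a hypothetical example, and the two `TODO` variables above are not yet honored by the scripts):
+```
+export START_SOLR_OPTS="-m 8g"
+export GATLING_SIMULATION_LIST="index.IndexWikipediaBatchesSimulation"   # TODO: not yet implemented
+export ALERT_COMMAND="mail -s 'benchmark alert' [email protected]"        # hypothetical; TODO: not yet implemented
+./scripts/gatling/run-benchmark-on-commits.sh -c df0fd5
+```
+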
+## Manual Invocation Examples
+
+Runs all benchmarks on the three specified commits.
+(Assumes the commits are on Solr's `main` branch, since `-b` is not provided.)
+```
+./scripts/gatling/run-benchmark-on-commits.sh -c df0fd5,9d1ecc,7f454a
+```
+
+Runs all benchmarks on all `branch_9x` commits between `88e363` and `b56ffc` (inclusive on both sides).
+```
+./scripts/gatling/run-benchmark-on-commits.sh -b branch_9x -e b56ffc -s 88e363
+```
+
+## Automated (i.e. cronjob) Invocation Examples
+
+The commands above are useful for running benchmarks on a static set (or range) of commits, but for long-term performance monitoring it is more useful to trigger the scripts to run periodically on any "new" commits since the last run.
+
+This requires either a CI system like Jenkins, or a cronjob to run the script at a regular interval.
+A full CI system will provide a more robust feature set for triggering, tracking, and debugging benchmark runs.
+However, for simplicity this directory also includes a cronjob-ready script (`periodic-benchmark.sh`) that can be used to run benchmarks periodically without the effort of setting up a complex system like Jenkins.
+
+The cronjob below will run all benchmarks nightly on each `main` commit that is "new" since the previous night's run.
+Script logs will be stored in the `/tmp` directory for review in case of error.
+```
+0 2 * * * cd /path/to/source/checkout/solr-sandbox ; export OUTFILE_NAME="/tmp/`date '+\%Y-\%m-\%d'`-benchmark-run.log" ; ./scripts/gatling/periodic-benchmark.sh -b main -t java21_on_main &> $OUTFILE_NAME
+```
diff --git a/scripts/gatling/lib/env-state.sh b/scripts/gatling/lib/env-state.sh
new file mode 100644
index 0000000..07a0b63
--- /dev/null
+++ b/scripts/gatling/lib/env-state.sh
@@ -0,0 +1,136 @@
+#!/bin/bash
+
+# This library contains utilities for setting up and reading various
+# state bits needed to persist from one run of the bench scripts to the
+# next
+
+
+BENCH_LIB_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )"
+
+BENCH_STATE_ROOT="${BENCH_STATE_ROOT:-$HOME/.solr-benchmarks}"
+BENCH_RESULT_DIR="$BENCH_STATE_ROOT/results"
+BENCH_LAST_RUN_STATE="$BENCH_STATE_ROOT/last-run.txt"
+export BENCH_CHECKOUT_DIR="$BENCH_STATE_ROOT/checkouts"
+export BENCH_SOLR_CHECKOUT_DIR="$BENCH_CHECKOUT_DIR/solr"
+
+# Creates the directories and data necessary for running the benchmark
+# utilities.  Depending on the host machine, may need to be run with sudo or
+# elevated permissions.
+# TODO - make this (and leaf functions) more idempotent
+function env_state_bootstrap() {
+  mkdir -p $BENCH_STATE_ROOT
+
+  mkdir -p $BENCH_CHECKOUT_DIR
+  mkdir -p $BENCH_RESULT_DIR
+  source $BENCH_LIB_DIR/solr.sh
+
+  solr_checkout_source_in $BENCH_CHECKOUT_DIR
+}
+
+# Wipes all env-state from this host
+function env_state_wipe() {
+  rm -rf $BENCH_STATE_ROOT
+}
+
+#########################################################
+##################
+# "Last Run IDs"
+##################
+# Currently these are stored in a flat file with each line formatted as:
+#   ^<tag> <last-run-id>$
+# Neither the tag nor the last-run-id may contain spaces, and tags that could
+# be mistaken for a run-id (typically a date or commit hash) should be avoided.
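+#
+# For example, a (hypothetical) state file might contain lines like:
+#   java21_on_main 22f41cd192b3212ffa2e29cc375fad8d71532dc2
+#   bats-tests deadbeef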
+#
+# In the future perhaps these should be stored with more structure, but this
+# is sufficient currently
+#########################################################
+
+# Reads a state identifier (usually a commit hash) associated with a particular tag.
+# Typically this is used to identify the last commit covered by the previous run.
+#
+# Tags and last-run-IDs must not contain whitespace.
+#
+#   Usage: last_hash=$(env_state_read_last_run_id bats-tests)
+function env_state_read_last_run_id() {
+  if [[ -z $1 ]]; then
+    >&2 echo "Required argument 'tag' was not provided"
+    return 1
+  fi
+
+  local tag=$1
+
+  if [[ ! -f $BENCH_LAST_RUN_STATE ]]; then
+    touch $BENCH_LAST_RUN_STATE
+  fi
+
+  local tag_line="$(cat $BENCH_LAST_RUN_STATE | grep $tag)"
+  if [[ -z "$tag_line" ]]; then
+    return
+  fi
+  echo "$tag_line" | cut -d " " -f 2
+}
+
+# Writes a state identifier (usually a commit hash) associated with a particular tag.
+# Typically this is used to identify the last commit covered by the previous run.
+#
+# Tags and last-run-IDs must not contain whitespace.
+#
+#   Usage: env_state_write_last_run_id "bats-tests" deadbeef
+function env_state_write_last_run_id() {
+  if [[ -z $1 ]]; then
+    >&2 echo "Required argument 'tag' was not provided"
+    return 1
+  fi
+  if [[ -z $2 ]]; then
+    >&2 echo "Required argument 'last_run_id' was not provided"
+    return 2
+  fi
+
+  local tag=$1
+  local last_run_id=$2
+  local tmp_file="${BENCH_LAST_RUN_STATE}.tmp"
+
+  if [[ ! -f $BENCH_LAST_RUN_STATE ]]; then
+    touch $BENCH_LAST_RUN_STATE
+  fi
+
+  # If the tag is new grep will return non-zero, so the '|| true' keeps 'set -e' happy
+  (cat $BENCH_LAST_RUN_STATE | grep -v $tag > $tmp_file) || true
+  echo "$tag $last_run_id" >> $tmp_file
+  mv $tmp_file $BENCH_LAST_RUN_STATE
+}
+
+#########################################################
+##################
+# "Benchmark Results"
+##################
+# Currently these are stored within the "state" root, in a subdirectory of
+# the form: `$BENCH_STATE_ROOT/results/<tag>/<commit>`
+#
+# "Tag" values are typically a branch name, but may be any user identifier
+# without spaces or special characters (e.g. mainJava21).  We may need
+# something more elaborate to accommodate multiple variables going forward, but
+# let's see how this does as a starting point.
+#
+# Assumes the cwd is positioned at the root of the solr-sandbox checkout.
+#
+#   Usage: env_state_store_gatling_result <tag> <commit>
+#########################################################
+function env_state_store_gatling_result() {
+  local tag="${1:-}"
+  local commit="${2:-}"
+
+  if [[ -z $tag ]]; then
+    >&2 echo "Required argument 'tag' was not provided"
+    exit 1
+  fi
+  if [[ -z $commit ]]; then
+    >&2 echo "Required argument 'commit' was not provided"
+    exit 1
+  fi
+
+  local dest_dir="${BENCH_RESULT_DIR}/${tag}/${commit}"
+  mkdir -p $dest_dir
+
+  mv gatling-simulations/build/reports/gatling $dest_dir
+}
diff --git a/scripts/gatling/lib/gatling.sh b/scripts/gatling/lib/gatling.sh
new file mode 100644
index 0000000..57212b2
--- /dev/null
+++ b/scripts/gatling/lib/gatling.sh
@@ -0,0 +1,20 @@
+#!/bin/bash
+
+BENCHMARK_DATA_BASE_URL="https://nightlies.apache.org/solr/benchmark-data"
+WIKI_SOLR_FILE="solr-wiki-batches-5k-1k.tar.gz"
+
+# Assumes running from the root of Solr-sandbox
+function gatling_download_wiki_data() {
+  local batches_path=".gatling/batches"
+  mkdir -p $batches_path
+
+  if [ "$(ls -A $batches_path | wc -l)" -eq "0" ]; then
+    pushd $batches_path
+      wget ${BENCHMARK_DATA_BASE_URL}/wiki/${WIKI_SOLR_FILE}
+      tar -xvf ${WIKI_SOLR_FILE}
+      rm $WIKI_SOLR_FILE
+    popd
+  else
+    >&2 echo "Wiki data already downloaded; skipping..."
+  fi
+}
diff --git a/scripts/gatling/lib/git.sh b/scripts/gatling/lib/git.sh
new file mode 100644
index 0000000..8067a74
--- /dev/null
+++ b/scripts/gatling/lib/git.sh
@@ -0,0 +1,99 @@
+#!/bin/bash
+
+# Prints (to stdout) the hash of the latest commit on the specified
+# branch.  (The current branch is assumed if no branch name is
+# specified)
+#   Usage: commit_hash=$(git_echo_latest_commit_on_branch branch_9x)
+#          commit_hash=$(git_echo_latest_commit_on_branch)
+function git_echo_latest_commit_on_branch() {
+  local branch_id="${1:-}"
+  git log $branch_id -n 1 --format=%H
+}
+
+# Runs 'git clean -xfd' to ensure no files are left behind from a
+# previous branch.  'gradle.properties' is explicitly not removed so 
+# that users may specify custom debug or other settings for gradle
+# there
+#   Usage: git_clean
+function git_clean() {
+  git clean -xfd -e gradle.properties
+}
+
+function git_checkout() {
+  local commit_id="${1}"
+  git checkout $commit_id
+}
+
+# Ensures a source checkout is up to date via 'git fetch', optionally
+# doing a hard reset on a specified branch.  Assumes that the cwd is 
+# placed somewhere inside the desired git repository
+#   Usage: git_update_checkout branch_9x
+function git_update_checkout() {
+  git fetch
+  if [[ -n ${1:-} ]]; then
+    local branch_name=$1
+    local branch_name_stash="$(git branch --show-current)"
+
+    git checkout $branch_name
+    git reset --hard origin/$branch_name
+    git checkout $branch_name_stash
+  fi
+}
+
+# Prints (to stdout), one per line, the commits since a specified
+# commit-hash (inclusive).  A commit hash or branch name can also be
+# specified for the upper bound; 'HEAD' will be used if not specified.
+#
+# Commit hashes are output in oldest-to-newest order.
+#   Usage: commit_list="$(git_list_commits_since deadbeef)"
+#          commit_list="$(git_list_commits_since deadbeef branch_9x)"
+function git_list_commits_since() {
+  if [[ -z ${1:-} ]]; then
+    >&2 echo "Required argument 'commit_hash' was not provided"
+    return 1
+  fi
+
+  local commit_hash="$1"
+  local upper_bound="${2:-HEAD}"
+
+  echo $commit_hash
+  git rev-list --reverse ${commit_hash}..${upper_bound}
+}
+
+# Prints (to stdout), the hash of the commit 'N' places back on the
+# specified branch.  Expects the CWD to already be stationed within the
+# git repository, though the repository needn't be on the specified
+# branch.  This function will re-position the repository to the original
+# branch before returning to the caller.
+#   Usage: several_back_hash=$(git_get_previous_commit_hash main 5)
+function git_get_previous_commit_hash() {
+  if [[ -z ${1:-} ]]; then
+    >&2 echo "Required argument 'branch' was not provided"
+    return 1
+  fi
+
+  if [[ -z ${2:-} ]]; then
+    >&2 echo "Required argument 'num_commits' was not provided"
+    return 1
+  fi
+
+  local branch="$1"
+  local num_commits="$2"
+
+  git_update_checkout $branch &> /dev/null
+  local branch_name_stash="$(git branch --show-current)"
+  git checkout $branch &> /dev/null
+    local commit_hash="$(git rev-parse HEAD~${num_commits})"
+  git checkout $branch_name_stash &> /dev/null
+
+  echo $commit_hash
+}
+
+# Prints (to stdout) the numeric count of a specified commit-hash from
+# the root of the branch where it resides.  'HEAD' is used if no commit-
+# hash is specified.
+#   Usage: commit_num=$(git_echo_commit_count deadbeef)
+function git_echo_commit_count() {
+  local commit_id="${1:-HEAD}"
+  git rev-list --count $commit_id
+}
diff --git a/scripts/gatling/lib/solr.sh b/scripts/gatling/lib/solr.sh
new file mode 100644
index 0000000..5c7e3b2
--- /dev/null
+++ b/scripts/gatling/lib/solr.sh
@@ -0,0 +1,69 @@
+#!/bin/bash
+
+function solr_checkout_source_in() {
+  if [[ -z $1 ]]; then
+    >&2 echo "Required argument 'location' was not provided"
+    return 1
+  fi
+
+  local location=$1
+  if [[ -d $location/solr ]]; then
+    # The checkout already exists I guess?
+    return 0
+  fi
+
+  pushd $location
+    git clone https://github.com/apache/solr.git
+  popd
+}
+
+# Returns with an exit status indicating whether Solr is (or isn't)
+# running at the specified port.  Callers provide the port as a required
+# argument.  Assumes that the working directory is positioned so that
+# 'bin/solr' can be invoked.
+#   Usage: solr_is_running 8983
+function solr_is_running() {
+  if [[ -z ${1:-} ]]; then
+    >&2 echo "Required argument 'port' was not provided"
+    return 1
+  fi
+
+  local port=$1
+  bin/solr assert --started "http://localhost:$port/solr"
+  return $?
+}
+
+function solr_kill_all() {
+  ps -ef | grep solr | grep java | awk {'print $2'} | xargs -r kill -9
+}
+
+# Start Solr (with custom opts pulled from START_SOLR_OPTS)
+# Assumes cwd of the Solr package root
+function solr_start() {
+  # TODO The "OR" done here to handle recent Solr changes in its "cloud-mode"
+  # syntax is unfortunate, figure out a better way to handle these 9.x/10.x
+  # differences
+  bin/solr start ${START_SOLR_OPTS:-} || bin/solr start -c ${START_SOLR_OPTS:-}
+}
+
+##############################
+# Solr build utilities (i.e. working with the Gradle build)
+##############################
+
+# Assumes caller is in the root of the Solr checkout directory
+function solr_build_package() {
+  ./gradlew clean assemble -Pproduction=true
+}
+
+# Returns the path (relative to the Solr checkout root) of the package created using 'gradle assemble'
+# (Assumes running in the Solr project root)
+function solr_get_package_directory() {
+  local solr_dist_name="$(ls -l solr/packaging/build/ | awk {'print $9'} | grep solr | grep -v slim)"
+
+  if [[ -z "$solr_dist_name" ]]; then
+    >&2 echo "No solr package exists; cannot get directory name"
+    return 1
+  fi
+
+  echo "solr/packaging/build/$solr_dist_name"
+}
diff --git a/scripts/gatling/periodic-benchmark.sh b/scripts/gatling/periodic-benchmark.sh
new file mode 100755
index 0000000..1e9689c
--- /dev/null
+++ b/scripts/gatling/periodic-benchmark.sh
@@ -0,0 +1,83 @@
+#!/bin/bash -x
+
+set -eu
+
+SCRIPT_DIR=$( cd -- "$( dirname -- "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )
+LIB_DIR="$SCRIPT_DIR/lib"
+SANDBOX_CHECKOUT_ROOT="$SCRIPT_DIR/../../"
+source $LIB_DIR/env-state.sh
+source $LIB_DIR/git.sh
+source $LIB_DIR/solr.sh
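+
+# Usage (sketch; see README.md in this directory for details):
+#   ./scripts/gatling/periodic-benchmark.sh -t <tag> [-b <branch>] [-l <last-benchmarked-hash>]
+# '-t' is required; '-b' defaults to 'main'.  '-l' is required on the first run for a
+# given tag, and on later runs is read from the state file recorded by previous runs.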
+
+
+#####################################
+# "Main" - arg parsing and validation
+TAG=""
+BRANCH="main"
+PREVIOUS_END_HASH=""
+
+if [ $# -gt 0 ]; then
+  while [ $# -gt 0 ]; do
+    case "${1:-}" in
+        -t|--tag)
+            TAG=${2}
+            shift 2
+        ;;
+        -b|--branch)
+            BRANCH=${2}
+            shift 2
+        ;;
+        -l|--last-hash)
+            PREVIOUS_END_HASH=${2}
+            shift 2
+        ;;
+        *)
+            shift
+        ;;
+    esac
+  done
+fi
+
+if [[ -z $TAG ]]; then
+  >&2 echo "Required argument 'tag' was not provided"
+  exit 1
+fi
+
+# If no user-provided last-hash, attempt to read it from "state file"
+if [[ -z "$PREVIOUS_END_HASH" ]]; then
+  PREVIOUS_END_HASH=$(env_state_read_last_run_id "$TAG")
+fi
+
+# If we still don't have a last-hash, exit
+if [[ -z "$PREVIOUS_END_HASH" ]]; then
+  >&2 echo "'-l' is required for the initial run of each tag; exiting..."
+  exit 1
+fi
+
+##########################################################
+# Figure out the 'start' and 'end' points for benchmarking
+pushd $BENCH_SOLR_CHECKOUT_DIR
+  git_update_checkout $BRANCH
+  git_clean
+
+  # 'git_list_commits_since' is inclusive (i.e. it returns the arg in output),
+  # so that must be excluded to get the first commit *after* that point
+  START_COMMIT=$(git_list_commits_since $PREVIOUS_END_HASH | grep -v $PREVIOUS_END_HASH | head -n 1)
+  END_COMMIT=$(git_echo_latest_commit_on_branch $BRANCH)
+popd
+
+if [[ "$END_COMMIT" == "$PREVIOUS_END_HASH" ]]; then
+  >&2 echo "No new commits since the last cronjob run processed $PREVIOUS_END_HASH; exiting..."
+  exit 0
+fi
+
+if [[ -z "$START_COMMIT" ]]; then
+  >&2 echo "No new commits since the last cronjob run processed $PREVIOUS_END_HASH; exiting..."
+  exit 0
+fi
+
+############################################################################
+# Run benchmarking on all commits since the previous cronjob run, and update
+# the "last run" state only if successful
+$SCRIPT_DIR/run-benchmark-on-commits.sh -b $BRANCH -s $START_COMMIT -e $END_COMMIT
+env_state_write_last_run_id "$TAG" $END_COMMIT
diff --git a/scripts/gatling/run-benchmark-on-commits.sh b/scripts/gatling/run-benchmark-on-commits.sh
new file mode 100755
index 0000000..1ad76e5
--- /dev/null
+++ b/scripts/gatling/run-benchmark-on-commits.sh
@@ -0,0 +1,102 @@
+#!/bin/bash -x
+
+set -eu
+
+SCRIPT_DIR=$( cd -- "$( dirname -- "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )
+LIB_DIR="$SCRIPT_DIR/lib"
+SANDBOX_CHECKOUT_ROOT="$SCRIPT_DIR/../../"
+source $LIB_DIR/env-state.sh
+source $LIB_DIR/git.sh
+source $LIB_DIR/solr.sh
+source $LIB_DIR/gatling.sh
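+
+# Usage (sketch; see README.md in this directory for details):
+#   ./scripts/gatling/run-benchmark-on-commits.sh -c <comma-delimited-commits> [-b <branch>]
+#   ./scripts/gatling/run-benchmark-on-commits.sh -s <start-hash> [-e <end-hash>] [-b <branch>]
+# '-s' and '-e' are inclusive; '-b' defaults to 'main' and '-e' defaults to the branch HEAD.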
+
+###########################
+# "Main" - begin arg parsing
+COMMIT_HASHES=""
+START_HASH=""
+END_HASH=""
+BRANCH="main"
+
+if [ $# -gt 0 ]; then
+  while [ $# -gt 0 ]; do
+    case "${1:-}" in
+        # Inclusive
+        -s|--start-hash)
+            START_HASH=${2}
+            shift 2
+        ;;
+        # Inclusive
+        -e|--end-hash)
+            END_HASH=${2}
+            shift 2
+        ;;
+        -b|--branch)
+            BRANCH=${2}
+            shift 2
+        ;;
+        -c|--commit-hashes)
+            COMMIT_HASHES=${2}
+            shift 2
+        ;;
+        *)
+            shift
+        ;;
+    esac
+  done
+fi
+
+###############################################
+# Identify the branch and the commits to run on
+env_state_bootstrap
+
+pushd $BENCH_SOLR_CHECKOUT_DIR
+  if [[ -n "${COMMIT_HASHES}" ]]; then
+    COMMIT_HASHES="$(echo "$COMMIT_HASHES" |  sed 's/,/\n/g')"
+  else # [[ -z "${COMMIT_HASHES}" ]]; then
+    if [[ -z "${START_HASH}" ]]; then
+      >&2 echo "Either '-c' or '-s'/'-e' argument must be provided"
+      exit 1
+    else
+      git_update_checkout $BRANCH
+      COMMIT_HASHES="$(git_list_commits_since $START_HASH $END_HASH)"
+    fi
+  fi
+popd
+
+####################################################
+# Download any benchmark-data needed for simulations
+pushd $SANDBOX_CHECKOUT_ROOT
+  gatling_download_wiki_data
+popd
+
+######################################################
+# Iterate over commits, building and benchmarking each
+pushd $BENCH_SOLR_CHECKOUT_DIR
+  for commit in $(echo "$COMMIT_HASHES") ; do
+    echo "Processing commit: [$commit]"
+
+    # Build and start Solr
+    git_checkout "$commit"
+    solr_kill_all
+    solr_build_package
+    package_dir="$(solr_get_package_directory)"
+    pushd $package_dir
+      export START_SOLR_OPTS="${START_SOLR_OPTS:-} -m 4g "
+      solr_start
+      if ! solr_is_running "8983" ; then
+        >&2 echo "Unable to start Solr; please check logs. Exiting..."
+        exit 1
+      fi
+    popd
+
+    # Run the benchmark(s) and store the result
+    #   (currently just wiki-indexing)
+    pushd $SANDBOX_CHECKOUT_ROOT 
+      ./scripts/gatling/setup_wikipedia_tests.sh
+      ./gradlew gatlingRun --simulation index.IndexWikipediaBatchesSimulation
+      env_state_store_gatling_result $BRANCH $commit
+    popd
+    solr_kill_all
+
+  done
+popd
