[
https://issues.apache.org/jira/browse/BEAM-5240?focusedWorklogId=147318&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-147318
]
ASF GitHub Bot logged work on BEAM-5240:
----------------------------------------
Author: ASF GitHub Bot
Created on: 24/Sep/18 21:24
Start Date: 24/Sep/18 21:24
Worklog Time Spent: 10m
Work Description: pabloem closed pull request #6482: Revert "[BEAM-5240] Add metrics dashboard deployment script and logic…
URL: https://github.com/apache/beam/pull/6482
This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:
diff --git a/.test-infra/metrics/README.md b/.test-infra/metrics/README.md
deleted file mode 100644
index 70bfb04ddc8..00000000000
--- a/.test-infra/metrics/README.md
+++ /dev/null
@@ -1,105 +0,0 @@
-# BeamMonitoring
-This folder contains files required to spin-up metrics dashboard for Beam.
-
-## Utilized technologies
-* [Grafana](https://grafana.com) as dashboarding engine.
-* PostgreSQL as underlying DB.
-
-The approach is to fetch data from the corresponding systems
-(Jenkins/Jira/GithubArchives/etc.), load it into PostgreSQL, and query it
-from Grafana.
-
-## Local setup
-
-Install docker and docker-compose:
-* install docker
-  * https://docs.docker.com/install/#supported-platforms
-* install docker-compose
-  * https://docs.docker.com/compose/install/#install-compose
-
-```sh
-# Remove old docker
-sudo apt-get remove docker docker-engine docker.io
-
-# Install docker
-sudo apt-get update
-sudo apt-get install \
- apt-transport-https \
- ca-certificates \
- curl \
- gnupg2 \
- software-properties-common
-curl -fsSL https://download.docker.com/linux/debian/gpg | sudo apt-key add -
-sudo apt-key fingerprint 0EBFCD88
-sudo add-apt-repository \
- "deb [arch=amd64] https://download.docker.com/linux/debian \
- $(lsb_release -cs) \
- stable"
-sudo apt-get update
-sudo apt-get install docker-ce
-
-# Install docker-compose
-sudo curl -L https://github.com/docker/compose/releases/download/1.22.0/docker-compose-$(uname -s)-$(uname -m) -o /usr/local/bin/docker-compose
-sudo chmod +x /usr/local/bin/docker-compose
-
-# start docker service if it is not running already
-sudo service docker start
-```
-
-## Kubernetes setup
-
-1. Configure gcloud & kubectl
- * https://cloud.google.com/kubernetes-engine/docs/quickstart
-2. Configure PostgreSQL
-  a. https://pantheon.corp.google.com/sql/instances?project=apache-beam-testing
-  b. Follow this guide to configure the connection from Kubernetes to PostgreSQL: https://cloud.google.com/sql/docs/postgres/connect-kubernetes-engine
-3. Add secrets for Grafana
-  a. `kubectl create secret generic grafana-admin-pwd --from-literal=grafana_admin_password=<pwd>`
-4. create persistent volume claims:
-```sh
-kubectl create -f beam-grafana-etcdata-persistentvolumeclaim.yaml
-kubectl create -f beam-grafana-libdata-persistentvolumeclaim.yaml
-kubectl create -f beam-grafana-logdata-persistentvolumeclaim.yaml
-```
-5. Build and publish sync containers
-```sh
-cd sync/jenkins
-docker build -t gcr.io/${PROJECT_ID}/beammetricssyncjenkins:v1 .
-docker push gcr.io/${PROJECT_ID}/beammetricssyncjenkins:v1
-```
-6. Create deployment `kubectl create -f beamgrafana-deploy.yaml`
-
-## Kubernetes update
-https://kubernetes.io/docs/concepts/workloads/controllers/deployment/
-
-1. Build and publish sync containers
-```sh
-cd sync/jenkins
-docker build -t gcr.io/${PROJECT_ID}/beammetricssyncjenkins:v1 .
-docker push gcr.io/${PROJECT_ID}/beammetricssyncjenkins:v1
-```
-1. Update image for container: `kubectl set image deployment/beamgrafana container=<new_image_name>`
-
-
-## Useful Kubernetes commands and hints
-```sh
-# Get pods
-kubectl get pods
-
-# Get detailed status
-kubectl describe pod <pod_name>
-
-# Get logs
-kubectl logs <pod_name> -c <container_name>
-
-# Set kubectl logging level: -v [1..10]
-https://github.com/kubernetes/kubernetes/issues/35054
-```
-
-## Useful docker commands and hints
-* Connect from one container to another
- * `curl <containername>:<port>`
-* Remove all containers/images/volumes
-```sh
-sudo docker rm $(sudo docker ps -a -q)
-sudo docker rmi $(sudo docker images -q)
-sudo docker volume prune
-```
diff --git
a/.test-infra/metrics/beam-grafana-etcdata-persistentvolumeclaim.yaml
b/.test-infra/metrics/beam-grafana-etcdata-persistentvolumeclaim.yaml
deleted file mode 100644
index 42f27eda811..00000000000
--- a/.test-infra/metrics/beam-grafana-etcdata-persistentvolumeclaim.yaml
+++ /dev/null
@@ -1,14 +0,0 @@
-apiVersion: v1
-kind: PersistentVolumeClaim
-metadata:
-  creationTimestamp: null
-  labels:
-    io.kompose.service: beam-grafana-etcdata
-  name: beam-grafana-etcdata
-spec:
-  accessModes:
-  - ReadWriteOnce
-  resources:
-    requests:
-      storage: 100Mi
-status: {}
diff --git
a/.test-infra/metrics/beam-grafana-libdata-persistentvolumeclaim.yaml
b/.test-infra/metrics/beam-grafana-libdata-persistentvolumeclaim.yaml
deleted file mode 100644
index 121362d0169..00000000000
--- a/.test-infra/metrics/beam-grafana-libdata-persistentvolumeclaim.yaml
+++ /dev/null
@@ -1,14 +0,0 @@
-apiVersion: v1
-kind: PersistentVolumeClaim
-metadata:
-  creationTimestamp: null
-  labels:
-    io.kompose.service: beam-grafana-libdata
-  name: beam-grafana-libdata
-spec:
-  accessModes:
-  - ReadWriteOnce
-  resources:
-    requests:
-      storage: 100Mi
-status: {}
diff --git
a/.test-infra/metrics/beam-grafana-logdata-persistentvolumeclaim.yaml
b/.test-infra/metrics/beam-grafana-logdata-persistentvolumeclaim.yaml
deleted file mode 100644
index b4780d697f8..00000000000
--- a/.test-infra/metrics/beam-grafana-logdata-persistentvolumeclaim.yaml
+++ /dev/null
@@ -1,14 +0,0 @@
-apiVersion: v1
-kind: PersistentVolumeClaim
-metadata:
-  creationTimestamp: null
-  labels:
-    io.kompose.service: beam-grafana-logdata
-  name: beam-grafana-logdata
-spec:
-  accessModes:
-  - ReadWriteOnce
-  resources:
-    requests:
-      storage: 100Mi
-status: {}
diff --git
a/.test-infra/metrics/beam-postgresql-data-persistentvolumeclaim.yaml
b/.test-infra/metrics/beam-postgresql-data-persistentvolumeclaim.yaml
deleted file mode 100644
index 2079b49a4ab..00000000000
--- a/.test-infra/metrics/beam-postgresql-data-persistentvolumeclaim.yaml
+++ /dev/null
@@ -1,14 +0,0 @@
-apiVersion: v1
-kind: PersistentVolumeClaim
-metadata:
-  creationTimestamp: null
-  labels:
-    io.kompose.service: beam-postgresql-data
-  name: beam-postgresql-data
-spec:
-  accessModes:
-  - ReadWriteOnce
-  resources:
-    requests:
-      storage: 100Mi
-status: {}
diff --git a/.test-infra/metrics/beamgrafana-deploy.yaml
b/.test-infra/metrics/beamgrafana-deploy.yaml
deleted file mode 100644
index 70391767e73..00000000000
--- a/.test-infra/metrics/beamgrafana-deploy.yaml
+++ /dev/null
@@ -1,97 +0,0 @@
-apiVersion: extensions/v1beta1
-kind: Deployment
-metadata:
-  name: beamgrafana
-  labels:
-    app: beamgrafana
-spec:
-  strategy:
-    type: Recreate
-  template:
-    metadata:
-      labels:
-        app: grafana
-    spec:
-      securityContext:
-        fsGroup: 1000
-      containers:
-      - name: beammetricssyncjenkins
-        image: gcr.io/apache-beam-testing/beammetricssyncjenkins:v15
-        env:
-        - name: JENSYNC_HOST
-          value: 127.0.0.1
-        - name: JENSYNC_PORT
-          value: "5432"
-        - name: JENSYNC_DBNAME
-          value: beammetrics
-        - name: JENSYNC_DBUSERNAME
-          valueFrom:
-            secretKeyRef:
-              name: beammetrics-psql-db-credentials
-              key: username
-        - name: JENSYNC_DBPWD
-          valueFrom:
-            secretKeyRef:
-              name: beammetrics-psql-db-credentials
-              key: password
-      - name: cloudsql-proxy
-        image: gcr.io/cloudsql-docker/gce-proxy:1.11
-        command: ["/cloud_sql_proxy",
-                  "-instances=apache-beam-testing:us-west2:beammetrics=tcp:5432"]
-        env:
-        - name: GOOGLE_APPLICATION_CREDENTIALS
-          value: /secrets/cloudsql/config.json
-        volumeMounts:
-        - name: beammetrics-psql-credentials
-          mountPath: /secrets/cloudsql
-          readOnly: true
-      - name: beamgrafana
-        image: grafana/grafana
-        securityContext:
-          runAsUser: 0
-        env:
-        - name: GF_AUTH_ANONYMOUS_ENABLED
-          value: "true"
-        - name: GF_AUTH_ANONYMOUS_ORG_NAME
-          value: Beam
-        - name: GF_INSTALL_PLUGINS
-          value: vonage-status-panel
-        - name: GF_SECURITY_ADMIN_PASSWORD
-          valueFrom:
-            secretKeyRef:
-              name: grafana-admin-pwd
-              key: grafana_admin_password
-        - name: PSQL_DB_USER
-          valueFrom:
-            secretKeyRef:
-              name: beammetrics-psql-db-credentials
-              key: username
-        - name: DB_PASSWORD
-          valueFrom:
-            secretKeyRef:
-              name: beammetrics-psql-db-credentials
-              key: password
-        ports:
-        - containerPort: 3000
-        resources: {}
-        volumeMounts:
-        - mountPath: /var/lib/grafana
-          name: beam-grafana-libdata
-        - mountPath: /etc/grafana
-          name: beam-grafana-etcdata
-        - mountPath: /var/log/grafana
-          name: beam-grafana-logdata
-      volumes:
-      - name: beammetrics-psql-credentials
-        secret:
-          secretName: beammetrics-psql-credentials
-      - name: beam-grafana-libdata
-        persistentVolumeClaim:
-          claimName: beam-grafana-libdata
-      - name: beam-grafana-etcdata
-        persistentVolumeClaim:
-          claimName: beam-grafana-etcdata
-      - name: beam-grafana-logdata
-        persistentVolumeClaim:
-          claimName: beam-grafana-logdata
-
diff --git a/.test-infra/metrics/dashboards/dashboard.bak
b/.test-infra/metrics/dashboards/dashboard.bak
deleted file mode 100644
index 2af9e18f4bf..00000000000
--- a/.test-infra/metrics/dashboards/dashboard.bak
+++ /dev/null
@@ -1,297 +0,0 @@
-{
- "annotations": {
- "list": [
- {
- "builtIn": 1,
- "datasource": "-- Grafana --",
- "enable": true,
- "hide": true,
- "iconColor": "rgba(0, 211, 255, 1)",
- "limit": 100,
- "name": "Annotations & Alerts",
- "showIn": 0,
- "type": "dashboard"
- }
- ]
- },
- "editable": true,
- "gnetId": null,
- "graphTooltip": 0,
- "id": 1,
- "links": [],
- "panels": [
- {
- "aliasColors": {},
- "bars": false,
- "dashLength": 10,
- "dashes": false,
- "datasource": "BeamPSQL",
- "fill": 0,
- "gridPos": {
- "h": 9,
- "w": 12,
- "x": 0,
- "y": 0
- },
- "id": 6,
- "legend": {
- "avg": false,
- "current": false,
- "max": false,
- "min": false,
- "show": true,
- "total": false,
- "values": false
- },
- "lines": true,
- "linewidth": 1,
- "links": [],
- "nullPointMode": "null",
- "percentage": false,
- "pointradius": 2,
- "points": true,
- "renderer": "flot",
- "seriesOverrides": [],
- "spaceLength": 10,
- "stack": false,
- "steppedLine": false,
- "targets": [
- {
- "alias": "",
- "format": "time_series",
- "rawSql": "SELECT\n DATE_TRUNC('day', build_timestamp) as time,\n avg(\n case \n when build_result = 'SUCCESS' then 1\n else 0\n end) as value,\n job_name\nFROM\n jenkins_builds\nWHERE\n (build_timestamp BETWEEN $__timeFrom() AND $__timeTo())\n AND (job_name LIKE 'beam_PostCommit_%')\n AND NOT (job_name like '%_PR')\nGROUP BY\n time, job_name\norder BY\n time\n",
- "refId": "A"
- }
- ],
- "thresholds": [
- {
- "colorMode": "critical",
- "fill": false,
- "line": true,
- "op": "lt",
- "value": 0.85,
- "yaxis": "left"
- }
- ],
- "timeFrom": "14d",
- "timeShift": null,
- "title": "Greenness per day (in %)",
- "tooltip": {
- "shared": true,
- "sort": 0,
- "value_type": "individual"
- },
- "type": "graph",
- "xaxis": {
- "buckets": null,
- "mode": "time",
- "name": null,
- "show": true,
- "values": []
- },
- "yaxes": [
- {
- "format": "percentunit",
- "label": "",
- "logBase": 1,
- "max": null,
- "min": null,
- "show": true
- },
- {
- "format": "short",
- "label": null,
- "logBase": 1,
- "max": null,
- "min": null,
- "show": false
- }
- ],
- "yaxis": {
- "align": false,
- "alignLevel": null
- }
- },
- {
- "aliasColors": {},
- "bars": false,
- "dashLength": 10,
- "dashes": false,
- "datasource": "BeamPSQL",
- "fill": 1,
- "gridPos": {
- "h": 9,
- "w": 12,
- "x": 12,
- "y": 0
- },
- "id": 5,
- "legend": {
- "avg": false,
- "current": false,
- "max": false,
- "min": false,
- "show": true,
- "total": false,
- "values": false
- },
- "lines": true,
- "linewidth": 1,
- "links": [],
- "nullPointMode": "null",
- "percentage": false,
- "pointradius": 5,
- "points": false,
- "renderer": "flot",
- "seriesOverrides": [],
- "spaceLength": 10,
- "stack": false,
- "steppedLine": false,
- "targets": [
- {
- "alias": "",
- "format": "time_series",
- "rawSql": "SELECT\n build_timestamp as time,\n build_duration as value,\n job_name\nFROM\n jenkins_builds\nWHERE\n (build_timestamp BETWEEN $__timeFrom() AND $__timeTo())\n AND (job_name LIKE 'beam_PostCommit_%')\n AND NOT (job_name LIKE '%_PR')\nORDER BY\n time\n ",
- "refId": "A"
- }
- ],
- "thresholds": [],
- "timeFrom": null,
- "timeShift": null,
- "title": "Job duration",
- "tooltip": {
- "shared": true,
- "sort": 0,
- "value_type": "individual"
- },
- "type": "graph",
- "xaxis": {
- "buckets": null,
- "mode": "time",
- "name": null,
- "show": true,
- "values": []
- },
- "yaxes": [
- {
- "format": "ms",
- "label": "",
- "logBase": 1,
- "max": null,
- "min": null,
- "show": true
- },
- {
- "format": "short",
- "label": null,
- "logBase": 1,
- "max": null,
- "min": null,
- "show": false
- }
- ],
- "yaxis": {
- "align": false,
- "alignLevel": null
- }
- },
- {
- "columns": [],
- "datasource": "BeamPSQL",
- "fontSize": "100%",
- "gridPos": {
- "h": 5,
- "w": 24,
- "x": 0,
- "y": 9
- },
- "hideTimeOverride": false,
- "id": 8,
- "links": [],
- "pageSize": null,
- "scroll": true,
- "showHeader": true,
- "sort": {
- "col": 0,
- "desc": true
- },
- "styles": [
- {
- "alias": "Time",
- "dateFormat": "YYYY-MM-DD HH:mm:ss",
- "link": false,
- "pattern": "Time",
- "type": "date"
- },
- {
- "alias": "Build Url",
- "colorMode": null,
- "colors": [
- "rgba(245, 54, 54, 0.9)",
- "rgba(237, 129, 40, 0.89)",
- "rgba(50, 172, 45, 0.97)"
- ],
- "dateFormat": "YYYY-MM-DD HH:mm:ss",
- "decimals": 2,
- "link": true,
- "linkUrl": "${__cell}",
- "mappingType": 1,
- "pattern": "build_url",
- "thresholds": [],
- "type": "number",
- "unit": "short"
- }
- ],
- "targets": [
- {
- "alias": "",
- "format": "table",
- "rawSql": "SELECT \n build_timestamp,\n job_name,\n build_url\nFROM jenkins_builds\nWHERE \n (build_timestamp BETWEEN $__timeFrom() AND $__timeTo())\n AND (job_name LIKE 'beam_PostCommit_%')\n AND NOT (job_name LIKE '%_PR')\n AND NOT (build_result = 'SUCCESS')\nORDER BY \n build_timestamp",
- "refId": "A"
- }
- ],
- "timeShift": null,
- "title": "Failed builds",
- "transform": "table",
- "type": "table"
- }
- ],
- "refresh": false,
- "schemaVersion": 16,
- "style": "dark",
- "tags": [],
- "templating": {
- "list": []
- },
- "time": {
- "from": "now-24h",
- "to": "now"
- },
- "timepicker": {
- "hidden": false,
- "refresh_intervals": [
- "1m",
- "5m",
- "15m",
- "30m",
- "1h",
- "2h",
- "1d"
- ],
- "time_options": [
- "5m",
- "15m",
- "1h",
- "6h",
- "12h",
- "24h",
- "2d",
- "7d",
- "30d"
- ]
- },
- "timezone": "",
- "title": "Post-commit jobs",
- "uid": "D81lW0pmk",
- "version": 3
-}
\ No newline at end of file
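The "Greenness per day" panel's rawSql above averages a per-build SUCCESS indicator, grouped by day and job. The same aggregation can be sketched in plain Python without a database; only the column names (job_name, build_timestamp, build_result) come from the jenkins_builds schema, while the helper and sample rows are illustrative:

```python
from collections import defaultdict
from datetime import datetime

def greenness_per_day(builds):
    """Average per-day success rate per job, mirroring the panel's SQL.

    `builds` is an iterable of (job_name, build_timestamp, build_result)
    tuples; SUCCESS maps to 1, anything else to 0.
    """
    buckets = defaultdict(list)
    for job_name, ts, result in builds:
        # DATE_TRUNC('day', build_timestamp) becomes ts.date()
        buckets[(ts.date(), job_name)].append(1 if result == 'SUCCESS' else 0)
    return {key: sum(vals) / len(vals) for key, vals in buckets.items()}

sample = [
    ('beam_PostCommit_Java', datetime(2018, 9, 1, 3), 'SUCCESS'),
    ('beam_PostCommit_Java', datetime(2018, 9, 1, 9), 'FAILURE'),
    ('beam_PostCommit_Py', datetime(2018, 9, 1, 4), 'SUCCESS'),
]
print(greenness_per_day(sample))
```

With the sample rows, the Java job averages 0.5 for the day and the Py job 1.0, which is the value the panel compares against its 0.85 threshold.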
diff --git a/.test-infra/metrics/docker-compose.yml
b/.test-infra/metrics/docker-compose.yml
deleted file mode 100644
index 97d5fe0c12a..00000000000
--- a/.test-infra/metrics/docker-compose.yml
+++ /dev/null
@@ -1,33 +0,0 @@
-version: '3'
-services:
-  postgresql:
-    image: postgres
-    ports:
-      - "5432:5432"
-    container_name: beampostgresql
-    volumes:
-      - beam-postgresql-data:/var/lib/postgresql/data
-    environment:
-      - POSTGRES_USER=admin
-      - POSTGRES_PASSWORD=<PGPasswordHere>
-      - POSTGRES_DB=beam_metrics
-  grafana:
-    image: grafana/grafana
-    ports:
-      - "3000:3000"
-    container_name: beamgrafana
-    volumes:
-      - beam-grafana-libdata:/var/lib/grafana
-      - beam-grafana-etcdata:/etc/grafana
-      - beam-grafana-logdata:/var/log/grafana
-    environment:
-      - GF_SECURITY_ADMIN_PASSWORD=<GrafanaPasswordHere>
-      - GF_AUTH_ANONYMOUS_ENABLED=true
-      - GF_AUTH_ANONYMOUS_ORG_NAME=Beam
-      - GF_INSTALL_PLUGINS=vonage-status-panel
-volumes:
-  beam-postgresql-data:
-  beam-grafana-libdata:
-  beam-grafana-etcdata:
-  beam-grafana-logdata:
-
diff --git a/.test-infra/metrics/sync/jenkins/Dockerfile
b/.test-infra/metrics/sync/jenkins/Dockerfile
deleted file mode 100644
index 205d3f7858b..00000000000
--- a/.test-infra/metrics/sync/jenkins/Dockerfile
+++ /dev/null
@@ -1,10 +0,0 @@
-FROM python:3
-
-WORKDIR /usr/src/app
-
-COPY . .
-
-RUN pip install --no-cache-dir -r requirements.txt
-
-
-CMD python ./syncjenkins.py
diff --git a/.test-infra/metrics/sync/jenkins/README.md
b/.test-infra/metrics/sync/jenkins/README.md
deleted file mode 100644
index f310077e5ab..00000000000
--- a/.test-infra/metrics/sync/jenkins/README.md
+++ /dev/null
@@ -1,3 +0,0 @@
-# Running script locally
-1. Build container
-2. `docker run -it --rm --name sync -v "$PWD":/usr/src/myapp -w /usr/src/myapp -e "JENSYNC_PORT=5432" -e "JENSYNC_DBNAME=beam_metrics" -e "JENSYNC_DBUSERNAME=admin" -e "JENSYNC_DBPWD=<password>" syncjenkins python syncjenkins.py`
diff --git a/.test-infra/metrics/sync/jenkins/requirements.txt
b/.test-infra/metrics/sync/jenkins/requirements.txt
deleted file mode 100644
index 46d28f42328..00000000000
--- a/.test-infra/metrics/sync/jenkins/requirements.txt
+++ /dev/null
@@ -1,4 +0,0 @@
-requests
-psycopg2-binary
-
-
diff --git a/.test-infra/metrics/sync/jenkins/syncjenkins.py
b/.test-infra/metrics/sync/jenkins/syncjenkins.py
deleted file mode 100644
index 7d5d1d2360c..00000000000
--- a/.test-infra/metrics/sync/jenkins/syncjenkins.py
+++ /dev/null
@@ -1,210 +0,0 @@
-#
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements. See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License. You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-
-# Queries Jenkins to collect build metrics and put them into PostgreSQL.
-import os
-import psycopg2
-import re
-import requests
-import socket
-import sys
-import time
-
-from datetime import datetime, timedelta
-from xml.etree import ElementTree
-
-# Keeping this as reference for localhost debug
-# Fetching docker host machine ip for testing purposes.
-# Actual host should be used for production.
-# import subprocess
-# cmd_out = subprocess.check_output(["ip", "route", "show"]).decode("utf-8")
-# host = cmd_out.split(" ")[2]
-
-host = os.environ['JENSYNC_HOST']
-port = os.environ['JENSYNC_PORT']
-dbname = os.environ['JENSYNC_DBNAME']
-dbusername = os.environ['JENSYNC_DBUSERNAME']
-dbpassword = os.environ['JENSYNC_DBPWD']
-
-jenkinsBuildsTableName = 'jenkins_builds'
-
-jenkinsJobsCreateTableQuery = f"""
-create table {jenkinsBuildsTableName} (
-job_name varchar NOT NULL,
-build_id integer NOT NULL,
-build_url varchar,
-build_result varchar,
-build_timestamp TIMESTAMP,
-build_builtOn varchar,
-build_duration integer,
-build_estimatedDuration integer,
-build_fullDisplayName varchar,
-timing_blockedDurationMillis integer,
-timing_buildableDurationMillis integer,
-timing_buildingDurationMillis integer,
-timing_executingTimeMillis integer,
-timing_queuingDurationMillis integer,
-timing_totalDurationMillis integer,
-timing_waitingDurationMillis integer,
-primary key(job_name, build_id)
-)
-"""
-
-def fetchJobs():
-    url = ('https://builds.apache.org/view/A-D/view/Beam/api/json'
-           '?tree=jobs[name,url,lastCompletedBuild[id]]&depth=1')
-    r = requests.get(url)
-    jobs = r.json()[u'jobs']
-    result = map(lambda x: (x['name'],
-                            int(x['lastCompletedBuild']['id'])
-                            if x['lastCompletedBuild'] is not None
-                            else -1, x['url']), jobs)
-    return result
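fetchJobs() flattens each entry of the Jenkins jobs payload into a (name, last build id, url) tuple, defaulting the id to -1 for jobs with no completed build. The same transformation applied to a made-up sample payload (the job names and URLs here are illustrative, not a real builds.apache.org response):

```python
# Sample of the shape requested by tree=jobs[name,url,lastCompletedBuild[id]]
sample_jobs = [
    {'name': 'beam_PostCommit_Java',
     'url': 'https://builds.apache.org/job/beam_PostCommit_Java/',
     'lastCompletedBuild': {'id': '123'}},
    {'name': 'beam_BrandNewJob',
     'url': 'https://builds.apache.org/job/beam_BrandNewJob/',
     'lastCompletedBuild': None},  # job that has never completed a build
]

# Same mapping as fetchJobs(), written as a list comprehension
flattened = [(x['name'],
              int(x['lastCompletedBuild']['id'])
              if x['lastCompletedBuild'] is not None else -1,
              x['url'])
             for x in sample_jobs]
print(flattened)
```

A job without a completed build yields -1, which later compares as older than any real build id.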
-
-def initConnection():
-    conn = psycopg2.connect(f"dbname='{dbname}' user='{dbusername}' host='{host}'"
-                            f" port='{port}' password='{dbpassword}'")
-    return conn
-
-def tableExists(cursor, tableName):
-    cursor.execute(f"select * from information_schema.tables"
-                   f" where table_name='{tableName}';")
-    return bool(cursor.rowcount)
-
-
-def initDbTablesIfNeeded():
-    connection = initConnection()
-    cursor = connection.cursor()
-
-    buildsTableExists = tableExists(cursor, jenkinsBuildsTableName)
-    print('Builds table exists', buildsTableExists)
-    if not buildsTableExists:
-        cursor.execute(jenkinsJobsCreateTableQuery)
-        if not bool(cursor.rowcount):
-            raise Exception(f"Failed to create table {jenkinsBuildsTableName}")
-
-    cursor.close()
-    connection.commit()
-
-    connection.close()
-
-
-def fetchSyncedJobsBuildVersions(cursor):
-    fetchQuery = f'''
-    select job_name, max(build_id)
-    from {jenkinsBuildsTableName}
-    group by job_name
-    '''
-
-    cursor.execute(fetchQuery)
-    return dict(cursor.fetchall())
-
-
-def fetchBuildsForJob(jobUrl):
-    durFields = ('blockedDurationMillis,buildableDurationMillis,'
-                 'buildingDurationMillis,executingTimeMillis,queuingDurationMillis,'
-                 'totalDurationMillis,waitingDurationMillis')
-    fields = (f'result,timestamp,id,url,builtOn,building,duration,'
-              f'estimatedDuration,fullDisplayName,actions[{durFields}]')
-    url = f'{jobUrl}api/json?depth=1&tree=builds[{fields}]'
-    r = requests.get(url)
-    return r.json()[u'builds']
-
-
-def buildRowValuesArray(jobName, build):
-    timings = next((x
-                    for x in build[u'actions']
-                    if (u'_class' in x)
-                    and (x[u'_class'] == u'jenkins.metrics.impl.TimeInQueueAction')),
-                   None)
-    values = [jobName,
-              int(build[u'id']),
-              build[u'url'],
-              build[u'result'],
-              datetime.fromtimestamp(build[u'timestamp'] / 1000),
-              build[u'builtOn'],
-              build[u'duration'],
-              build[u'estimatedDuration'],
-              build[u'fullDisplayName'],
-              timings[u'blockedDurationMillis'] if timings is not None else -1,
-              timings[u'buildableDurationMillis'] if timings is not None else -1,
-              timings[u'buildingDurationMillis'] if timings is not None else -1,
-              timings[u'executingTimeMillis'] if timings is not None else -1,
-              timings[u'queuingDurationMillis'] if timings is not None else -1,
-              timings[u'totalDurationMillis'] if timings is not None else -1,
-              timings[u'waitingDurationMillis'] if timings is not None else -1]
-    return values
-
-
-def insertRow(cursor, rowValues):
-    cursor.execute(f'insert into {jenkinsBuildsTableName} values (%s, %s, %s, %s,'
-                   '%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)', rowValues)
-
-
-def fetchNewData():
-    connection = initConnection()
-    cursor = connection.cursor()
-    syncedJobs = fetchSyncedJobsBuildVersions(cursor)
-    cursor.close()
-    connection.close()
-
-    newJobs = fetchJobs()
-
-    for newJobName, newJobLastBuildId, newJobUrl in newJobs:
-        syncedJobId = syncedJobs[newJobName] if newJobName in syncedJobs else -1
-        if newJobLastBuildId > syncedJobId:
-            builds = fetchBuildsForJob(newJobUrl)
-            builds = [x for x in builds if int(x[u'id']) > syncedJobId]
-
-            connection = initConnection()
-            cursor = connection.cursor()
-
-            for build in builds:
-                if build[u'building']:
-                    continue
-                rowValues = buildRowValuesArray(newJobName, build)
-                print("inserting", newJobName, build[u'id'])
-                insertRow(cursor, rowValues)
-
-            cursor.close()
-            connection.commit()
-            connection.close()  # For some reason .commit() doesn't push data
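fetchNewData() inserts only builds whose id exceeds the last synced id for that job, and skips builds still in flight. That filter, isolated into a standalone helper with made-up data (the helper name and sample ids are illustrative, not part of the original script):

```python
def builds_to_insert(syncedJobs, jobName, builds):
    """Keep builds newer than the last synced id, excluding running builds."""
    syncedJobId = syncedJobs[jobName] if jobName in syncedJobs else -1
    return [b for b in builds
            if int(b[u'id']) > syncedJobId and not b[u'building']]

synced = {'beam_PostCommit_Java': 100}   # last build id already in the DB
pending = [{u'id': '99', u'building': False},   # older than 100 -> skipped
           {u'id': '101', u'building': True},   # still running -> skipped
           {u'id': '102', u'building': False}]  # new and finished -> kept
to_insert = builds_to_insert(synced, 'beam_PostCommit_Java', pending)
print([b[u'id'] for b in to_insert])
```

For a job the sync has never seen, the synced id defaults to -1, so every finished build qualifies on the first pass.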
-
-def probeJenkinsIsUp():
-    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
-    result = sock.connect_ex(('builds.apache.org', 443))
-    return result == 0
-
-
-################################################################################
-if __name__ == '__main__':
-    print("Started.")
-
-    print("Checking if DB needs to be initialized.")
-    sys.stdout.flush()
-    initDbTablesIfNeeded()
-
-    print("Start jobs fetching loop.")
-    sys.stdout.flush()
-
-    while True:
-        if not probeJenkinsIsUp():
-            print("Jenkins is unavailable, skipping fetching data.")
-        else:
-            fetchNewData()
-            print("Fetched data.")
-        print("Sleeping for 5 min.")
-        sys.stdout.flush()
-        time.sleep(5 * 60)
-
-    print('Done.')
-
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
Issue Time Tracking
-------------------
Worklog Id: (was: 147318)
Time Spent: 3.5h (was: 3h 20m)
> Create post-commit tests dashboard
> ----------------------------------
>
> Key: BEAM-5240
> URL: https://issues.apache.org/jira/browse/BEAM-5240
> Project: Beam
> Issue Type: Sub-task
> Components: testing
> Reporter: Mikhail Gryzykhin
> Assignee: Mikhail Gryzykhin
> Priority: Major
> Time Spent: 3.5h
> Remaining Estimate: 0h
>
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)