Author: hboutemy
Date: Sun May 3 13:04:51 2015
New Revision: 1677396
URL: http://svn.apache.org/r1677396
Log:
improved documentation of scripts
Modified:
comdev/projects.apache.org/STRUCTURE.txt
comdev/projects.apache.org/scripts/README.txt
Modified: comdev/projects.apache.org/STRUCTURE.txt
URL: http://svn.apache.org/viewvc/comdev/projects.apache.org/STRUCTURE.txt?rev=1677396&r1=1677395&r2=1677396&view=diff
==============================================================================
--- comdev/projects.apache.org/STRUCTURE.txt (original)
+++ comdev/projects.apache.org/STRUCTURE.txt Sun May 3 13:04:51 2015
@@ -31,5 +31,5 @@ Stuff to run manually when needed:
Webserver required:
To test the site locally, a webserver is required or you'll get
"Cross origin requests are only supported for HTTP" errors.
-An easy setup is: run "python -m SimpleHTTPServer 8888" from site directory
-to have site available at http://localhost:8888/
+An easy setup for development: run "python -m SimpleHTTPServer 8888" from the
+site directory to make the site available at http://localhost:8888/
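The one-liner above uses Python 2's SimpleHTTPServer module; under Python 3 that module was renamed to http.server. A minimal sketch of both variants (the /tmp demo directory is hypothetical, used only to make the example self-contained):

```shell
# Serve a directory over HTTP for local testing; demo directory is hypothetical.
mkdir -p /tmp/site-demo
echo '<html>ok</html>' > /tmp/site-demo/index.html
cd /tmp/site-demo
# Python 2: python -m SimpleHTTPServer 8888
# Python 3 (module renamed to http.server):
python3 -m http.server 8888 &
pid=$!
sleep 1
curl -s http://localhost:8888/index.html    # fetches the served file
kill $pid
```

Either variant avoids the "Cross origin requests are only supported for HTTP" errors that appear when the site is opened via file:// URLs.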
Modified: comdev/projects.apache.org/scripts/README.txt
URL: http://svn.apache.org/viewvc/comdev/projects.apache.org/scripts/README.txt?rev=1677396&r1=1677395&r2=1677396&view=diff
==============================================================================
--- comdev/projects.apache.org/scripts/README.txt (original)
+++ comdev/projects.apache.org/scripts/README.txt Sun May 3 13:04:51 2015
@@ -1,15 +1,53 @@
-This directory contains scripts for both importing and updating data from
+This directory contains Python 3 scripts for both importing and updating data from
various sources:
-- parsechairs.py: Fetches current VPs from the foundation website. To be run as
- a cron job.
-- parsecommittees.py: Parses committee-info.txt and its TLP information
+1. updating data (cronjobs)
+
+- countaccounts.py: Extracts monthly statistics on Unix accounts created from LDAP
+ in: foundation/accounts.json + ldapsearch
+ out: foundation/accounts.json
+
+- parsechairs.py: Fetches current VPs from the foundation website.
+ in: http://www.apache.org/foundation/
+ out: foundation/chairs.json
+
- parsecommitters.py: Fetches and parses the committer (LDAP) list via
- people.apache.org. To be cronned
-- parsepmcs.py: imports PMC data from the old project.apache.org site. No need
- to run that more than once.
+ people.apache.org.
+ in: http://people.apache.org/committer-index.html
+ out: foundation/people.json + foundation/committers.json
+  List of committers with references to projects (people.json) and PMCs with
+  corresponding committers (committers.json)
+
- podlings.py: Reads podlings.xml from the incubator site and creates a JSON
- with timeline data, as well as as podling project information.
+ with timeline data, as well as current podling project information.
+ in: http://incubator.apache.org/podlings.xml
+ out: foundation/evolution.json + foundation/podlings.json
+  Monthly statistics on podlings (evolution.json) and the current list of
+  podlings (podlings.json)
+
+- parsereleases.py
+ in: http://www.apache.org/dist/
+ out: foundation/releases.json
+
+
+2. importing data (import)
+
+- parsecommittees.py: Parses committee-info.txt and its TLP information
+ in: foundation/chairs.json + committee-info.txt
+ out: foundation/committees.json
+
+- parsepmcs.py: imports PMC data from the old project.apache.org site. No need
+ to run that more than once?
+  in: https://svn.apache.org/repos/asf/infrastructure/site-tools/trunk/projects/data_files/*.rdf
+ out: foundation/pmcs.json
+
- rdfparse.py: Parses existing RDF (DOAP) files from the old projects.a.o and
  turns them into JSON objects.
+  in: https://svn.apache.org/repos/asf/infrastructure/site-tools/trunk/projects/files.xml
+ projects' DOAP files
+ out: projects/*.json + foundation/projects.json
+
+- addpmc.py
+ in: foundation/pmcs.json + foundation/committees.json + params
+ out: foundation/pmcs.json + foundation/committees.json
+  List of PMCs with site URL (pmcs.json) and monthly list of new committees
+  (committees.json)
+
+Note: cycles.json is not imported from anywhere.
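The scripts in section 1 are meant to run as cron jobs. A hypothetical crontab sketch (the checkout path, Python invocation, and schedule are assumptions for illustration, not the live infrastructure configuration):

```
# Hypothetical crontab for the section 1 update scripts (path and times assumed)
# m  h dom mon dow  command
0  4  *  *  *  cd /var/www/projects.apache.org && python3 scripts/parsechairs.py
5  4  *  *  *  cd /var/www/projects.apache.org && python3 scripts/parsecommitters.py
10 4  *  *  *  cd /var/www/projects.apache.org && python3 scripts/podlings.py
15 4  *  *  *  cd /var/www/projects.apache.org && python3 scripts/parsereleases.py
0  5  1  *  *  cd /var/www/projects.apache.org && python3 scripts/countaccounts.py
```

countaccounts.py is scheduled on the 1st of each month since it produces monthly statistics; the other four refresh data daily.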