Hi all,
On 19.12.2012 18:30, Zieris, Franz wrote:
Dear Thomas and Christoph,
As far as we know, GUI tests with STF are only run nightly on the master
branch. All other JUnit tests are run on any patch set submitted to code
review. This might result in Gerrit patch sets passing regular JUnit tests
and therefore being marked as verified by Jenkins CI although they break
STF test cases.
Even worse: the JUnit tests are executed for every new patch version (Job
"Saros-Gerrit"), but the eventual submission of a patch is not tested
immediately. Even if a patch can be merged without a syntactical conflict,
it may introduce a defect on a different level and thus break both the STF
and the JUnit tests.
Thankfully, the JUnit tests are executed every hour on the master branch (Job
"Saros"), and there is also the nightly STF job ("Test_Saros_STF").
The two STF regressions run at 4 a.m. and 6 a.m. ... I would not call that
nightly :P
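Just to illustrate the current schedule: the jobs mentioned above would roughly
correspond to the following "Build periodically" (cron) entries in Jenkins.
This is only a sketch; the actual trigger configuration may differ, and
"Saros-Gerrit" is triggered by Gerrit events rather than by cron.
# Job "Saros": hourly JUnit run on the master branch
0 * * * *
# Job "Test_Saros_STF": the STF regressions at 4 a.m. and 6 a.m.
0 4,6 * * *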
Goal: Analyze schedule and results of automated test runs
I like your Goal, but consider the following background information:
During the last semester break we conducted a project ("SWTP" -
Softwaretechnik-Projekt, i.e. software engineering project). The participants
of this project worked with a clone of the normal Saros repository. We played
around with a different approach using feature branches (the idea came from a
blog post [1]):
* The master branch is closed to direct commits; everyone works on feature
branches instead.
* New patch versions are verified by Jenkins in the usual way (JUnit tests)
* As soon as a patch is finally submitted to a feature branch, a Jenkins
integration job will try to merge the feature branch (containing this new
commit) back into the master branch. This is done by the following steps:
** Merge on a textual level (normal "git merge"). If it fails due to merge
conflicts, the integration fails, and the master and the feature branch will diverge.
Otherwise:
** The source code is compiled. If it fails, the integration fails. Otherwise:
** The JUnit tests are executed. If they fail, the integration fails. Otherwise:
** The re-integration succeeded. The resulting merge commit is pushed back to
the official repository.
* The divergence of master and feature branches poses no immediate problem:
every subsequent commit on the feature branch may resolve the issue.
* Main benefit: developers working on different topics do not heavily block each
other. It's up to the developers when they want to "fix" a "broken merge": they
can do it immediately to prevent further divergence of the branches, or they can
add more commits to the feature branch before reintegrating it into the master.
The (currently inactive) integration job is named "Saros-SWTP-Integration". In case
you wonder why this approach has not been adopted in the actual Saros workflow: it's
pretty simple, the current version of Jenkins' Git plugin does not provide the
"--no-ff" option for merges. Therefore fast-forward merges may occur, which make the
Git history really hard to read because there are no easy-to-spot swimlanes.
For the near future I see two possibilities to cope with that: (1) provide a
patch for Jenkins' Git plugin to fix this problem once and for all; (2)
circumvent the pre-build merge feature of Jenkins' Git plugin by re-implementing
its functionality in a shell script embedded in the Jenkins job. The human
resources required for either of these solutions are still lacking, but not
for long :)
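To give you an idea of option (2), here is a rough sketch of what such a shell
step could look like. This is hypothetical: the branch name parameter, the
remote name "origin" and the Ant targets "build" and "test" are placeholders,
not our actual setup.

#!/bin/sh
# hypothetical re-implementation of the pre-build merge, following the steps above
FEATURE_BRANCH="$1"
git fetch origin || exit 1
git checkout origin/master || exit 1
# merge on a textual level, with an explicit merge commit (the missing "--no-ff")
git merge --no-ff "origin/${FEATURE_BRANCH}" || exit 1 # merge conflict -> integration fails
ant build || exit 1                                    # compile error -> integration fails
ant test || exit 1                                     # JUnit failure -> integration fails
# the re-integration succeeded: push the merge commit back to the official repository
git push origin HEAD:master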
You may want to reconsider your Questions with this background information
in mind.
For rebuilding the Jenkins setup on your own machine I'm not the best contact
person.
@Stefan: Could you take over this topic?
The STF Jenkins jobs only call shell scripts, so yes, you can execute STF
test cases from the shell.
Downside -> a lot of setup:
Currently the STF regressions do not run on the master branch. They do
not run on any branch at all; they just fetch several build artifacts.
So here is what the STF regression does:
1. Takes a bunch of build artifacts (Saros, Whiteboard, Nebula) and
instruments them with Cobertura (instrumentation may be skipped)
####################################################################
# fetch ant file for junit
cp ${JENKINS_HOME}/scripts/test/saros_stf_test.xml .
# extract sources for cobertura coverage generation
mkdir src
cd src
"${JAVA_HOME}/bin/jar" xfv ../de.fu_berlin.inf.dpp.source*.jar
"${JAVA_HOME}/bin/jar" xfv ../de.fu_berlin.inf.nebula.source*.jar
cd ..
# be safe, extract every third party library from all plugin files,
# although the STF test cases MUST NOT NEED THEM!
mkdir lib
"${JAVA_HOME}/bin/jar" xfv de.fu_berlin.inf.dpp_*.jar lib
#INSTRUMENT
mkdir instr
cp de.fu_berlin.inf* instr
"${JENKINS_HOME}/tools/cobertura/cobertura-instrument.sh" \
--basedir "${WORKSPACE}/instr" \
--includeClasses 'de\.fu_berlin\.inf\..*' \
--excludeClasses '.*Test\$.*' \
--excludeClasses '.*\.Test.*' \
--excludeClasses '.*\.test\..*' \
--excludeClasses '.*TestSuite.*' \
--excludeClasses '.*\.stf\..*' \
--excludeClasses '.*Test' \
de.fu_berlin.inf*
2. Create or use a config file
####################################################################
echo "ALICE_JID = *@saros-con.imp.fu-berlin.de/Saros" > stf_config
echo "ALICE_PASSWORD = *" >> stf_config
echo "ALICE_HOST = 192.168.66.129" >> stf_config
echo "ALICE_PORT = 12345" >> stf_config
# add as many testers as you need (currently only 4 are supported:
# ALICE, BOB, CARL, DAVE)
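For a second tester the same four entries are appended, e.g. (the host and
the port below are just example values):
echo "BOB_JID = *@saros-con.imp.fu-berlin.de/Saros" >> stf_config
echo "BOB_PASSWORD = *" >> stf_config
echo "BOB_HOST = 192.168.66.130" >> stf_config
echo "BOB_PORT = 12346" >> stf_config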
3. The next step is to pass the config file and SSH key, along with some
other options, to a script that generates a run script with utility functions
####################################################################
#GENERATE RUN SCRIPT
"${JENKINS_HOME}/scripts/stf/stf_script_gen" \
--cobertura "${JENKINS_HOME}/tools/cobertura" \
"${JENKINS_HOME}/ssh_keys/saros-build" stf_config \
instr/de.fu_berlin.inf* | tee stf_regression.sh
4. Run the regression
####################################################################
. ./stf_regression.sh
set +e
deploy
stop_vnc_server
start_vnc_server
start_remote_bots
sleep 30
echo "STARTING REGRESSION: TIMEOUT IS 60 MINUTES"
"${JENKINS_HOME}/tools/ant/bin/ant" -Dsrc.dir=src -Dlib.dir=lib
"-Declipse.dir=${ECLIPSE_HOME}" -Djunit.dir=junit
"-Dsaros.plugin.dir=${WORKSPACE}"
"-Dstf.client.config.files=${WORKSPACE}/stf_config" -lib
${JENKINS_HOME}/tools/junit -f saros_stf_test.xml &
ANT_PID=$!
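# poll 120 times with a 30 second delay, i.e. a 60 minute timeout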
wait_until_timeout 120 30 $ANT_PID
TIMEOUT=$?
if [ $TIMEOUT -eq 0 ]
then
"${JAVA_HOME}/bin/java" -cp "${ECLIPSE_HOME}/plugins/*:lib/*:*"
"-Dde.fu_berlin.inf.dpp.stf.client.configuration.files=${WORKSPACE}/stf_config"
de.fu_berlin.inf.dpp.stf.client.ShutdownRemoteEclipse
# WAIT FOR CODE COVERAGE CALCULATION IN THE REMOTE ECLIPSES
sleep 120
fi
stop_remote_bots KILL
kill_ssh_connections
stop_vnc_server
fetch_code_coverage
fetch_screen_shots
exit $TIMEOUT
################################################
5. Generate Coverage
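####################################################################
# merge the per-tester coverage_*.ser files fetched by fetch_code_coverage
# and generate an XML report against the sources extracted in step 1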
"${JENKINS_HOME}/tools/cobertura/cobertura-merge.sh" coverage*
"${JENKINS_HOME}/tools/cobertura/cobertura-report.sh" --format xml
--destination "${WORKSPACE}" "${WORKSPACE}/src"
Prerequisite:
Fully configured Eclipse instances (SVN etc.)
So what do the utility functions do?
First the instrumented plugin files, along with some scripts, are copied
to the remote servers.
Then an X11 (VNC) server is started for every Eclipse instance on those
remote servers.
Finally the Eclipse instances are started (we do not use Equinox to
install the DPP feature; instead we just copy the plugin files into the
plugin folder, which currently works well).
Then Ant is started, which runs the regression.
As Eclipse on Unix has a problem with kill signals (the process is just
killed instead of shutting down cleanly, which results in the loss of
the code coverage data), we have a small class that shuts down Eclipse
via SWTBot.
Last but not least, everything is killed.
I attached the other scripts, as one of them is not that short.
BR,
Stefan
Regarding your second Goal "Improve code quality": please note that code quality is
not a "monolithic" property; it is rather complex and multi-faceted. You may want to
look into [2] (reachable via our university's proxy), where the following eight
quality facets are listed:
* Functionality/Capability
* Performance
* Reliability
* Usability
* Maintainability
* Transparency
* Portability
* Testability
Unfortunately these facets are correlated, e.g. performance is hard to improve while
keeping the other ones at the same "level".
Furthermore, your Questions are rather simple to answer:
- Is it possible to improve code quality and reduce code complexity?
- Is it possible to improve the maintainability of the source code?
Possible? Yes :)! So: What do you *really* want to know?
Best Regards,
Franz
[1]
http://twasink.net/2011/09/20/git-feature-branches-and-jenkins-or-how-i-learned-to-stop-worrying-about-broken-builds/
[2] http://link.springer.com/book/10.1007/978-3-540-76323-9
-----Original Message-----
From: Christoph Viebig [mailto:li...@christoph-viebig.de]
Sent: Wednesday, December 19, 2012 5:33 PM
To: dpp-devel@lists.sourceforge.net
Subject: [DPP-Devel] GQM project
Dear Saros Community,
in our university course about software processes at Freie Universität
Berlin we are now working on Goal Question Metrics (GQM).
As part of our exercise course we have to elaborate a study on GQM in an
open source project, which we would like to conduct within the Saros
project. Therefore we have written down some ideas.
GQM is a software metric model: it aims to measure software processes and
products with metrics in order to allow their comparison and evaluation.
As a result, either changes can be suggested or the current procedure can
be confirmed.
Our first idea regards the continuous integration service. As far as we
know, GUI tests with STF are only run nightly on the master branch. All
other JUnit tests are run on any patch set submitted to code review.
This might result in Gerrit patch sets passing regular JUnit tests and
therefore being marked as verified by Jenkins CI although they break STF
test cases. Hence the first goal is to analyze the schedule and results
of automated test runs.
Goal: Analyze schedule and results of automated test runs
Object of study: Unit- and STF-Test-Results of all (accepted) patch sets
Focus: Number of broken test cases in all (accepted) patch sets
Stakeholder: Saros developers
Questions to answer for this goal:
- Is the current schedule of automated test runs adequate?
- For which patch sets or branches and at which time are test runs
useful?
Our second goal targets Saros code quality and is a very general one.
Goal: Improve code quality
Object of study: Saros source code
Focus: Code complexity
Stakeholder: Saros developers
Questions to answer for this goal:
- Is it possible to improve code quality and reduce code complexity?
- Is it possible to improve the maintainability of the source code?
As this is still a draft, we would like to ask whether you have comments
on the proposed goals and questions. We have not been familiar with Saros
for long, so some of our goals or questions might not be very useful to
ask. Please tell us! Are there any that would be more important to the
project?
To further investigate possible metrics to answer our questions regarding
CI test workloads, we have set up an instance of Jenkins. We are now at
the point where we have to configure Jenkins to run JUnit and STF tests
on specific branches of our Saros repository.
What actions are needed to configure this? Can you give us the
configuration you use? Would it be possible to run STF tests without
Jenkins, i.e. from the shell as well?
Thank you very much in advance!
Best regards
Thomas Benndorf and Christoph Viebig
<project name="stf.saros.test" basedir="." default="test">
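<!-- this Ant file (fetched as saros_stf_test.xml) runs every **/stf/test/**/*Test class from the extracted sources against the Saros plugin jars, the Eclipse plugins and the extracted libraries; XML JUnit reports are written to ${junit.dir} -->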
<target name="test">
<mkdir dir="${junit.dir}" />
<mkdir dir="${src.dir}" />
<junit printsummary="yes" reloading="false">
<sysproperty key="de.fu_berlin.inf.dpp.stf.client.configuration.files" value="${stf.client.config.files}" />
<classpath>
<fileset dir="${saros.plugin.dir}">
<include name="de.fu_berlin.inf.dpp_*.jar" />
</fileset>
<fileset dir="${eclipse.dir}/plugins">
<include name="*.jar" />
</fileset>
<fileset dir="${lib.dir}" />
</classpath>
<formatter type="xml" />
<batchtest todir="${junit.dir}">
<fileset dir="${src.dir}">
<include name="**/stf/test/**/*Test.java" />
<exclude name="**/stf/test/stf/**" />
</fileset>
</batchtest>
</junit>
</target>
</project>
#! /bin/sh
# params: DISPLAY, USER (e.g. Alice, Bob), RMI_PORT
set -x
DISPLAY=$1
USER=$2
RMI_PORT=$3
JAVA=`/usr/bin/which java`
if [ ! -x $JAVA ]; then
echo "error, no java executable found"
exit 1
fi
HOSTIP=`/sbin/ifconfig eth1 | grep "inet addr" | cut -d ":" -f 2 | cut -d " " -f 1`
ECLIPSE_DIR="${HOME}/eclipse"
ECLIPSE_PLUGIN_DIR="${ECLIPSE_DIR}/plugins"
WORKSPACE="${HOME}/workspace_${USER}"
PLUGIN_ID_PREFIX="de.fu_berlin.inf.dpp"
SAROS_PLUGIN_DIR="${HOME}/plugins"
# determine (versioned) filename of plugin (_ suppresses .source versions)
SAROS_PLUGIN_FILENAME=`ls -1 $SAROS_PLUGIN_DIR | grep "${PLUGIN_ID_PREFIX}_[0-9]*"`
if [ -z $SAROS_PLUGIN_FILENAME ]; then
echo "cannot find Saros plugin in $SAROS_PLUGIN_DIR"
exit 1
fi
echo "deleting workspace: ${WORKSPACE}"
rm -rf "${WORKSPACE}"
if [ ! -e "${SAROS_PLUGIN_DIR}/.lock" ]; then
touch "${SAROS_PLUGIN_DIR}/.lock"
echo "deleting old plugin(s)"
rm -f "${ECLIPSE_PLUGIN_DIR}/de.fu_berlin.inf"*
echo "installing plugins"
cp "${SAROS_PLUGIN_DIR}/de.fu_berlin.inf"* "${ECLIPSE_PLUGIN_DIR}"
fi
mkdir -p "${WORKSPACE}"
mkdir -p "${WORKSPACE}/.metadata/.plugins/org.eclipse.core.runtime/.settings"
# enable auto close of eclipse
echo EXIT_PROMPT_ON_CLOSE_LAST_WINDOW=false > \
"${WORKSPACE}/.metadata/.plugins/org.eclipse.core.runtime/.settings/org.eclipse.ui.ide.prefs"
# keep SVN quiet
echo ask_user_for_usage_report_preference=false > \
"${WORKSPACE}/.metadata/.plugins/org.eclipse.core.runtime/.settings/org.tigris.subversion.subclipse.tools.usage.prefs"
echo "grant{\\npermission java.security.AllPermission;\\n};" >
"${WORKSPACE}/stf.policy"
# get path to equinox jar inside eclipse home folder
CLASSPATH=$(find "${ECLIPSE_PLUGIN_DIR}" -name "org.eclipse.equinox.launcher_*.jar" | sort | tail -1);
CLASSPATH="${CLASSPATH}:${HOME}/cobertura/cobertura.jar"
LD_LIBRARY_PATH=/usr/lib/jni:${LD_LIBRARY_PATH}
export LD_LIBRARY_PATH
export DISPLAY
export CLASSPATH
echo "starting Eclipse for user ${USER}"
$JAVA -version
# be sure to set -Dosgi.parentClassloader=app otherwise instrumented classes
# would throw a class not found exception
$JAVA \
-XX:MaxPermSize=192m -Xms384m -Xmx512m -ea \
-Djava.rmi.server.codebase="file:${SAROS_PLUGIN_FILENAME}" \
-Djava.security.manager \
-Djava.security.policy="file:${WORKSPACE}/stf.policy" \
-Djava.rmi.server.hostname="${HOSTIP}" \
-Dde.fu_berlin.inf.dpp.testmode="${RMI_PORT}" \
-Dde.fu_berlin.inf.dpp.sleepTime=200 \
-Dorg.eclipse.swtbot.keyboard.strategy=org.eclipse.swtbot.swt.finder.keyboard.MockKeyboardStrategy \
-Dorg.eclipse.swtbot.keyboard.layout=de.fu_berlin.inf.dpp.stf.server.bot.default \
-Dfile.encoding=UTF-8 \
-Dnet.sourceforge.cobertura.datafile="${WORKSPACE}/coverage_${USER}.ser" \
-Dosgi.parentClassloader=app \
org.eclipse.equinox.launcher.Main \
-name "eclipse_${USER}" \
-clean \
-consoleLog \
-data "${WORKSPACE}"
#!/usr/bin/lua
--[[
This script depends on some hard-coded paths in the script start_eclipse.sh
Lua manual can be found at: http://www.lua.org/manual/5.1/
Author: Stefan Rossbach <rossbach@inf>
Currently the firewall does not work
]]
local host_to_user_mapping =
{
["192.168.66.129"] = "saros-eclipse",
["192.168.66.130"] = "saros-eclipse"
}
local supported_emulators =
{
["none"] = true,
["802.11a"] = true,
["802.11b"] = true,
["802.11g"] = true,
["adsl"] = true,
["sdsl"] = true,
["highpacketloss"] = true
}
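-- builds a shell line that executes 'command' on the remote host via ssh, using the given private key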
local function to_remote_ssh_command(user, host, sshkey, command)
return "ssh -n -i " .. sshkey .. " " .. user .."@" .. host .. " \"" ..
command .. "\""
end
local function to_scp_command(user, host, sshkey, from, to)
return "scp -o BatchMode=yes -i " .. sshkey .. " " .. user .."@" .. host ..
":\"" .. from .. "\"" .. " \"" .. to .. "\""
end
local function e_print(t)
io.stderr:write(t)
io.stderr:write("\n")
io.stderr:flush()
end
local function usage()
e_print("Usage: stf_script_gen [options] ssh_privkey config_file
plugin_files")
e_print("")
e_print(" --cobertura <cobertura_dir> directory to cobertura if code
coverage is desired, plugin file must be already instrumented")
-- theses options are not implemented yet
e_print(" --sniffer Start a sniffer on the gateway. You
find the file /tmp/packet.pcap on the gateway")
e_print(" --emulator <name> Use given emulator configuration")
os.exit(1)
end
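-- reads the <TESTER>_HOST and <TESTER>_PORT entries of the STF config file into a table keyed by tester name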
local function parse_config_file(config)
local client, host, port
local t = {}
for line in io.lines(config) do
client, host = line:match("%s-([^_]+)_HOST%s-=%s-([%w%p]+).-")
if client and host then
if not t[client] then t[client] = {} end
t[client].host = host;
end
client, port = line:match("%s-([^_]+)_PORT%s-=%s-([%d]+).-")
if client and port then
if not t[client] then t[client] = {} end
t[client].port = port;
end
end
return t
end
local function process_args()
local t, i, j = {}, 1, 0
t.plugins = {}
while i <= table.getn(arg) do
local a = arg[i]
if a == "--sniffer" then
t.sniffer, i = true , i + 1
elseif a=="--emulator" then
t.emulator, i = arg[i+1], i + 2
elseif a =="--cobertura" then
t.cobertura, i = arg[i+1], i + 2
elseif a:sub(1, 2) ~= "--" then
if j == 0 then
t.sshkey = arg[i]
elseif j == 1 then
t.config = arg[i]
else
table.insert(t.plugins, arg[i])
end
i, j = i + 1, j + 1
else
i = i + 1
end
end
return t
end
if table.getn(arg) == 0 or arg[1] == "--help" then usage() end
local args = process_args()
if args.emulator and (not supported_emulators[args.emulator]) then
e_print("error, unknown emulator type: " .. args.emulator) os.exit(1) end
if not (args.sshkey and args.config) or #args.plugins == 0 then usage() end
local config = parse_config_file(args.config)
local hosts = {}
for k, v in pairs(config) do
if v.port == nil or v.host == nil then e_print("error in config file, host or port is missing for: " .. k) os.exit(1) end
hosts[v.host] = host_to_user_mapping[v.host]
if hosts[v.host] == nil then e_print("error, the host " .. v.host .. " is not known") os.exit(1) end
end
---------------------------------------------------------------------
print("#!/usr/bin/sh")
---------------------------------------------------------------------
print("function wait_until_timeout ()")
print("{")
print("POLL=$1")
print("DELAY=$2")
print("PID_TO_WAIT_FOR=$3")
print("while [ $POLL -gt 0 ]")
print("do")
print("sleep $DELAY")
print("POLL=$(( $POLL - 1 ))")
print("kill -0 $PID_TO_WAIT_FOR > /dev/null 2>&1")
print("STATUS=$?")
print("if [ $STATUS -ne 0 ]")
print("then")
print("return 0")
print("fi")
print("done")
print("return 1")
print("}")
print("function stop_vnc_server ()")
print("{")
print("echo \"#############################################\"")
print("echo \"# STOP VNC SERVERS #\"")
print("echo \"#############################################\"")
for host, user in pairs(hosts) do
for i = 1, 9 do
print(to_remote_ssh_command(user, host, args.sshkey, "vncserver -kill :" .. i))
print(to_remote_ssh_command(user, host, args.sshkey, "rm -rf /tmp/.X" .. i .. "-lock"))
print(to_remote_ssh_command(user, host, args.sshkey, "rm -rf /tmp/.X11-unix/X" .. i))
end
end
print("}")
---------------------------------------------------------------------
print("function start_vnc_server ()")
print("{")
print("echo \"#############################################\"")
print("echo \"# START VNC SERVERS #\"")
print("echo \"#############################################\"")
for host, user in pairs(hosts) do
for _, net in pairs(config) do
if host == net.host then print(to_remote_ssh_command(user, host,
args.sshkey, "vncserver")) end
end
end
print("}")
---------------------------------------------------------------------
print("function start_sniffer ()")
print("{")
print("echo \"#############################################\"")
print("echo \"# START SNIFFER #\"")
print("echo \"#############################################\"")
print("}")
---------------------------------------------------------------------
print("function configure_emulator ()")
print("{")
print("echo \"#############################################\"")
print("echo \"# CONFIGURE EMULATOR #\"")
print("echo \"#############################################\"")
print("}")
---------------------------------------------------------------------
print("function deploy ()")
print("{")
print("echo \"#############################################\"")
print("echo \"# DEPLOY #\"")
print("echo \"#############################################\"")
for host, user in pairs(hosts) do
local stf_scripts = os.getenv("JENKINS_HOME") .. "/scripts/stf"
print(to_remote_ssh_command(user, host, args.sshkey, "rm -rf /home/" .. user
.. "/plugins"))
print(to_remote_ssh_command(user, host, args.sshkey, "rm -rf /home/" .. user
.. "/bin"))
print(to_remote_ssh_command(user, host, args.sshkey, "mkdir -p /home/" ..
user .. "/plugins"))
print(to_remote_ssh_command(user, host, args.sshkey, "mkdir -p /home/" ..
user .. "/bin"))
for _, plugin in ipairs(args.plugins) do print("rsync -v -e 'ssh -i " ..
args.sshkey .. "'" .. " -a " .. plugin .. " " .. user .. "@" .. host ..
":plugins/") end
print("rsync -v -e 'ssh -i " .. args.sshkey .. "'" .. " -a " .. stf_scripts
.. "/start_eclipse.sh" .. " " .. user .. "@" .. host .. ":bin/")
print("rsync -v -e 'ssh -i " .. args.sshkey .. "'" .. " -a " .. stf_scripts
.. "/stop_eclipse.sh" .. " " .. user .. "@" .. host .. ":bin/")
if args.cobertura then
print(to_remote_ssh_command(user, host, args.sshkey, "rm -rf /home/" ..
user .. "/cobertura"))
print(to_remote_ssh_command(user, host, args.sshkey, "mkdir -p /home/" ..
user .. "/cobertura"))
print("rsync -v -e 'ssh -i " .. args.sshkey .. "'" .. " -a " ..
args.cobertura .. "/ " .. user .. "@" .. host .. ":cobertura/")
end
end
print("}")
---------------------------------------------------------------------
print("function start_remote_bots ()")
print("{")
print("echo \"#############################################\"")
print("echo \"# START REMOTE BOTS #\"")
print("echo \"#############################################\"")
local display_numbers = {}
for saros_user, net in pairs(config) do
display_numbers[net.host] = 1
end
for saros_user, net in pairs(config) do
local current_display_number = display_numbers[net.host]
print(to_remote_ssh_command(host_to_user_mapping[net.host], net.host,
args.sshkey, "./bin/start_eclipse.sh :" .. current_display_number .. " " ..
saros_user .. " " .. net.port) .. " > " .. saros_user:lower() .. ".log 2>&1
&");
print("echo $! >> ssh.pids")
print("# SLEEP TO AVOID HIGH LOAD ON THE MACHINES AND ECLIPSE INITIALIZATION
ERRORS")
print("sleep 30")
display_numbers[net.host] = current_display_number + 1
end
print("}")
---------------------------------------------------------------------
print("function stop_remote_bots ()")
print("{")
print("echo \"#############################################\"")
print("echo \"# STOP REMOTE BOTS #\"")
print("echo \"#############################################\"")
for saros_user, net in pairs(config) do
print(to_remote_ssh_command(host_to_user_mapping[net.host], net.host,
args.sshkey, "./bin/stop_eclipse.sh " .. "${1}" .. " " .. saros_user .. " " ..
net.port));
end
print("}")
---------------------------------------------------------------------
print("function kill_ssh_connections ()")
print("{")
print("cat ssh.pids | while read LINE")
print("do")
print("kill -SIGKILL $LINE")
print("done")
print("}")
---------------------------------------------------------------------
print("function fetch_code_coverage ()")
print("{")
print("echo \"#############################################\"")
print("echo \"# FETCH CODE COVERAGE #\"")
print("echo \"#############################################\"")
if args.cobertura then
for saros_user, net in pairs(config) do
print(to_scp_command(host_to_user_mapping[net.host], net.host,
args.sshkey, "/home/" .. host_to_user_mapping[net.host] .. "/workspace_" ..
saros_user .. "/coverage_" .. saros_user .. ".ser", "coverage_" .. saros_user
.. ".ser"))
end
end
print("}")
---------------------------------------------------------------------
print("function fetch_screen_shots ()")
print("{")
print("echo \"#############################################\"")
print("echo \"# FETCH SCREENSHOTS #\"")
print("echo \"#############################################\"")
if args.cobertura then
for saros_user, net in pairs(config) do
print(to_remote_ssh_command(host_to_user_mapping[net.host], net.host,
args.sshkey, "cd /home/" .. host_to_user_mapping[net.host] .. "/workspace_" ..
saros_user .. "/.metadata && tar czvf " .. saros_user .. "screen_shots.tar.gz
saros_screenshots"))
print(to_scp_command(host_to_user_mapping[net.host], net.host,
args.sshkey, "/home/" .. host_to_user_mapping[net.host] .. "/workspace_" ..
saros_user .. "/.metadata/" .. saros_user .. "screen_shots.tar.gz", saros_user
.. "screen_shots.tar.gz"))
print("tar xzvf " .. saros_user .. "screen_shots.tar.gz")
end
end
print("}")
---------------------------------------------------------------------
print("echo \"#############################################\"")
print("echo \"# this script was auto generated by #\"")
print("echo \"# #\"")
print("echo \"# stf_script_gen #\"")
print("echo \"# #\"")
print("echo \"#############################################\"")
#! /bin/sh
set -x
# params KILL_MODE USER(e.g: Alice, Bob), PORT
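# find the PID of the Eclipse instance belonging to this tester by matching
# its testmode port and workspace directory in the process list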
PID=`ps aux | grep -v grep | grep testmode=${3}.*workspace_${2} | sed 's/ \+/ /g' | cut -d " " -f 2`
kill -${1} ${PID}