[Blueprint servercloud-s-juju-contributor-onramp] Juju Contributor Onramp

2013-05-10 Thread Mark Mims
Blueprint changed by Mark Mims:

Whiteboard changed:
  UDS 1303 Pad Data:
  Pad: http://pad.ubuntu.com/uds-1303-servercloud-1303-juju-contributor-onramp
  Old spec:  
https://blueprints.launchpad.net/ubuntu/+spec/community-r-juju-contributor-onramp-1
  
  Idea:
  Regarding the use of other code hosting, mariusko_ suggests comparing with
how it is done with Node.js: npm help publish
(https://npmjs.org/doc/developers.html)
  
  [USER STORIES]
  
  James is a sysadmin at a company and is interested in getting his charms
  into the store for convenience/peer review. He's got some internal stuff
  that he can keep separate, but he doesn't want to maintain boring
  infrastructure charms on his own.
  
  Kirk is deploying a charm but finds that it's missing a feature or needs
  a bugfix and he thinks he knows how to fix it.
  
  Robert is a developer at a company who works on a database that is
  charmed up. He notices that the charm needs improvement to follow his
  project's recommended best practices, but he isn't sure how to claim
  ownership of a charm.
  
  Lars is used to GitHub and is totally confused about how to get the charms
  he's interested in.
  
  Cliff submitted a fix to a charm two weeks ago and has no idea how to
  get someone to land his fix.
  
  [ASSUMPTIONS]
  
  - Mims/Castro want to mirror how OpenStack does reviews and code
  contributions as well as their usage of LP Blueprints.
  
  [RISKS]
  
  - There's only so much simplification that can happen when it comes to 
submitting code.
  - Poor response time leads people to believe that we don't want to help them.
  - Workflow ties into the store, so we need to be careful about how we
simplify this.
  
  [IN SCOPE]
  
  - Refining workflow for submission
  - Measuring contributor metrics and enforcing a fast response time.
  
  [OUT OF SCOPE]
  
  [USER ACCEPTANCE]
  
  [RELEASE NOTE/BLOG]
  
  - We've made it easier than ever to contribute to charms with a new
  submission workflow.
+ 
+ 
+ [notes]
+ 
+ Low-hanging fruit:
+ Docs! A big issue for getting new contributors
+ Instructional videos will be a big help in resolving “How do I get going?” for 
new people
+ “Contribute to this charm” on jujucharms.com
+ 
+ Better charm tools to help contributors (charm-helpers, charm-tools,
+ charmsupport)
+ 
+ NO LOCAL PROVIDER FOR JUJU-CORE
+ Inhibitor for WebOps, charmers, community, and basically everyone

-- 
Juju Contributor Onramp
https://blueprints.launchpad.net/ubuntu/+spec/servercloud-s-juju-contributor-onramp

-- 
Ubuntu-server-bugs mailing list
Ubuntu-server-bugs@lists.ubuntu.com
Modify settings or unsubscribe at: 
https://lists.ubuntu.com/mailman/listinfo/ubuntu-server-bugs


[Blueprint servercloud-s-juju-charm-testing] Juju Charm Testing

2013-05-10 Thread Mark Mims
Blueprint changed by Mark Mims:

Whiteboard changed:
  Discussion:
   *Bug submission on charm failure.
   *Define a process around how charm maintainers respond to test failures and 
subsequent bugs. Can a user run a manual test and submit the result back to the 
bug report to update the testing status to green?
   *Enable the Autocharm tester to be more resilient against provider failures, 
and Jenkins usage.
   Simulate provider failure, and be able to recover: $ juju ssh MACHINE 
sudo shutdown now  (see the sketch after this list)
   * Define WIs to execute auto charm testing on Go.
   * Continuous Integration (also will help with gating on charm commits)
   * Juju Testing Blogging
   * Juju testing communication to Juju lists.
   * Work on integrating/fixing Charm runner (graph testing/ dependency/env set 
up testing).
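  
  A minimal sketch of the "simulate provider failure and recover" idea above:
  halt a unit's machine over SSH, then poll juju status until the unit's agent
  reports started again. The juju CLI calls are standard; the status field
  names are approximate and the polling logic is illustrative only.
  
    import json
    import subprocess
    import time
    
    def kill_machine(machine_id):
        # Halt the machine to mimic a dead provider instance.
        subprocess.call(["juju", "ssh", str(machine_id),
                         "sudo", "shutdown", "-h", "now"])
    
    def wait_until_started(unit_name, timeout=900):
        # Poll `juju status --format json` until the unit reports started.
        deadline = time.time() + timeout
        while time.time() < deadline:
            raw = subprocess.check_output(["juju", "status", "--format", "json"])
            status = json.loads(raw)
            for service in status.get("services", {}).values():
                unit = service.get("units", {}).get(unit_name, {})
                if unit.get("agent-state") == "started":
                    return True
            time.sleep(30)
        return False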
  
  Add a Jenkins workflow to run a charm or a set of charms in the following LXC 
environments:
   -raring container on raring host
   -raring container on precise host
   -precise container on raring host
   -precise container on precise host
  
  Two modes of testing (see the matrix sketch below):
   -Unit (does the charm start and report ready?)
   -Workload (exercise the charm's relations and push data through them)
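  
  A small sketch of the matrix implied above: four host/container series
  combinations crossed with the two test modes. The run_test driver script and
  environment variable names are placeholders for whatever the Jenkins job
  actually invokes; only the matrix itself comes from these notes.
  
    import itertools
    import os
    import subprocess
    
    SERIES = ["precise", "raring"]
    MODES = ["unit", "workload"]
    
    def run_matrix(charm):
        results = {}
        for host, container, mode in itertools.product(SERIES, SERIES, MODES):
            env = dict(os.environ,
                       HOST_SERIES=host, CONTAINER_SERIES=container)
            # Placeholder driver script; a Jenkins matrix job would map these
            # values to its axes instead.
            rc = subprocess.call(["./run_test", "--mode", mode, charm], env=env)
            results[(host, container, mode)] = rc
        return results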
  
  Reference Links:
   *Charm Test Spec [html] https://juju.ubuntu.com/docs/charm-tests.html
   * Charm Test Spec [source] 
http://bazaar.launchpad.net/~juju/juju/docs/view/head:/source/charm-tests.rst
   * CharmTester Charm http://jujucharms.com/~mark-mims/oneiric/charmtester
   * Charm Runner: https://launchpad.net/charmrunner
   * Jenkins Charm Testing: https://jenkins.qa.ubuntu.com/view/Charms/
  
  [USER STORIES]
  
  William is a juju user who wishes to know a charm's current stability
  
  Saul is patching a charm and wants to ensure his changes work with the
  current tests
  
  Laura is a charm maintainer and wants to add tests to ensure her charm
  is stable
  
  Kara is a charm maintainer and needs to know when her charm is broken
  
  Lee is a charmer who, while reviewing charm submissions, needs to know
  if these changes break backwards compatibility with currently deployed
  services
  
  Gaius is a charm maintainer from an upstream project and needs an easy
  way to learn how to write tests for his charm
  
  [ASSUMPTIONS]
  
  - Charm tester/charm tester control will work with gojuju for at least
  graph testing
  
  [RISKS]
  
  - Relying solely on graph testing may result in inaccurate test results due 
to a lack of embedded tests
  - Making tests too complicated may result in low adoption rate of embedded 
testing
  
  [IN SCOPE]
  
  [OUT OF SCOPE]
  
  [USER ACCEPTANCE]
  
  [RELEASE NOTE/BLOG]
  
  (Needs spec and WI definition) -[a.rosales; 12-DEC-2012]
  
  === UDS 1303 Notes ===
  Pad: http://pad.ubuntu.com/uds-1303-servercloud-r-juju-charm-testing
  
  Question:
  Is there a way in metadata to explicitly state provider support?
  -Example: Ceph: does the cloud provider have block storage support?
  -More broadly stated: does the cloud provider have the capabilities the 
charm needs?
  
  Idea:
    -In charm testing status, be able to show that a charm failure can be a 
result of the provider not providing the needed capabilities, i.e. the Ceph charm 
fails on a provider because it does not support object storage (see the 
capability-check sketch below).
    -Make interface usage more verbose in the charm description.
    -Need a rule/spec on how an interface should be implemented
  -Need to investigate possible enforcement of interfaces
   -**Have the testing framework iterate through the operational deployment 
requirements.
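  
  A hedged sketch of the capability idea above: the charm declares the provider
  features it needs, and the test harness skips (rather than fails) runs on
  providers that lack them. The capability names and the idea of a charm-level
  "required capabilities" declaration are hypothetical, not current juju
  metadata.
  
    # Hypothetical capability tables; in practice the charm side would live in
    # metadata and the provider side in harness configuration.
    PROVIDER_CAPS = {
        "ec2": {"block-storage", "object-storage"},
        "local": set(),          # LXC local provider: no cloud storage services
    }
    
    CHARM_NEEDS = {"ceph": {"block-storage"}}
    
    def test_outcome(charm, provider):
        missing = CHARM_NEEDS.get(charm, set()) - PROVIDER_CAPS.get(provider, set())
        if missing:
            return "SKIP: %s lacks %s" % (provider, ", ".join(sorted(missing)))
        return "RUN"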
  
  Interface doc links broken:
    -http://jujucharms.com/interfaces/ceph-client -- broken
    -https://juju.ubuntu.com/Interfaces/ceph-client -- broken
  
  Meta-language testing (http://paste.ubuntu.com/5588570/):
  
  Language suggestions:
  http://lettuce.it/
  http://cukes.info/
  
  Charm-Runner integration:
   - https://launchpad.net/juju-deployer
  
  Wrap Go/Py juju client status:
   - https://launchpad.net/python-jujuclient
+ 
+ 
+ ---
+ 
+ [notes from cloudsprint 2013-05]
+ Topics to cover
+ Current Testing
+ Current todos
+ Experiences from IS
+ Ideas
+ 
+ Review charm policy to include:
+ Charm must pass its tests
+ Charm must have tests
+ 
+ We want embedded tests!
+ All tests live in the charm
+ Functional Tests
+ /test (in charm)
+ Integration
+ /test.d (in charm)
+ How to make it low-barrier for charmers to add tests
+ charm create tests (charm-tools makes a stub _simple_ test)
+ leverage libraries, and possibly a deployment (dare I say declarative) 
testing language
+ Sidnei mocks all the juju calls (U1 testing); see the sketch at the end of 
these notes
+ have a library that stubs this for you
+ pull this into the charm-helper library
+ leverage Go-watch
+ leverage charm testing with charm upgrade
+ 
+ Story Points: (added to Blueprint work item)
+ Integration Testing includes framework that charm authors can write tests 
against (embedded in the charm).
+ Jenkins testing on new merge proposal, on success it is a candidate for review
+ Develop Juju test
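  
  A minimal sketch of the "stub all the juju calls" idea from the notes above:
  wrap the hook tools behind one helper so unit tests can monkey-patch it with
  canned responses. The helper and module names are illustrative, not an
  existing charm-helpers API; config-get and relation-get are real juju hook
  tools.
  
    import json
    import subprocess
    
    def hook_tool(cmd, *args):
        # Invoke a juju hook tool (config-get, relation-get, ...) and parse JSON.
        out = subprocess.check_output([cmd, "--format=json"] + list(args))
        return json.loads(out)
    
    def stub_hook_tool(responses):
        # Return a fake hook_tool that answers from a dict of canned responses.
        def _fake(cmd, *args):
            return responses[cmd]
        return _fake
    
    # In a unit test (plain monkey-patching or unittest.mock):
    #   myhook.hook_tool = stub_hook_tool({"config-get": {"port": 80}})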

[Blueprint servercloud-s-juju-contributor-onramp] Juju Contributor Onramp

2013-05-10 Thread Mark Mims
Blueprint changed by Mark Mims:

Whiteboard changed:
  UDS 1303 Pad Data:
  Pad: http://pad.ubuntu.com/uds-1303-servercloud-1303-juju-contributor-onramp
  Old spec:  
https://blueprints.launchpad.net/ubuntu/+spec/community-r-juju-contributor-onramp-1
  
  Idea:
  Regarding the use of other code hosting, mariusko_ suggests comparing with
how it is done with Node.js: npm help publish
(https://npmjs.org/doc/developers.html)
  
  [USER STORIES]
  
  James is a sysadmin at a company and is interested in getting his charms
  into the store for convenience/peer review. He's got some internal stuff
  that he can keep separate, but he doesn't want to maintain boring
  infrastructure charms on his own.
  
  Kirk is deploying a charm but finds that it's missing a feature or needs
  a bugfix and he thinks he knows how to fix it.
  
  Robert is a developer at a company who works on a database that is
  charmed up. He notices that the charm needs improvement to follow his
  project's recommended best practices, but he isn't sure how to claim
  ownership of a charm.
  
  Lars is used to GitHub and is totally confused about how to get the charms
  he's interested in.
  
  Cliff submitted a fix to a charm two weeks ago and has no idea how to
  get someone to land his fix.
  
  [ASSUMPTIONS]
  
  - Mims/Castro want to mirror how OpenStack does reviews and code
  contributions as well as their usage of LP Blueprints.
  
  [RISKS]
  
  - There's only so much simplification that can happen when it comes to 
submitting code.
  - Poor response time leads people to believe that we don't want to help them.
  - Workflow ties into the store, so we need to be careful about how we
simplify this.
  
  [IN SCOPE]
  
  - Refining workflow for submission
  - Measuring contributor metrics and enforcing a fast response time.
  
  [OUT OF SCOPE]
  
  [USER ACCEPTANCE]
  
  [RELEASE NOTE/BLOG]
  
  - We've made it easier than ever to contribute to charms with a new
  submission workflow.
  
- 
- [notes]
+ [notes from cloudsprint 2013-05]
  
  Low-hanging fruit:
  Docs! A big issue for getting new contributors
  Instructional videos will be a big help in resolving “How do I get going?” for 
new people
  “Contribute to this charm” on jujucharms.com
  
  Better charm tools to help contributors (charm-helpers, charm-tools,
  charmsupport)
  
  NO LOCAL PROVIDER FOR JUJU-CORE
  Inhibitor for WebOps, charmers, community, and basically everyone

-- 
Juju Contributor Onramp
https://blueprints.launchpad.net/ubuntu/+spec/servercloud-s-juju-contributor-onramp

-- 
Ubuntu-server-bugs mailing list
Ubuntu-server-bugs@lists.ubuntu.com
Modify settings or unsubscribe at: 
https://lists.ubuntu.com/mailman/listinfo/ubuntu-server-bugs


[Blueprint servercloud-s-juju-docs] Improve Juju Documentation

2013-05-10 Thread Mark Mims
Blueprint changed by Mark Mims:

Whiteboard changed:
  --- UDS 1303 Discussion ---
  PAD URL:  http://pad.ubuntu.com/uds-1303-servercloud-1303-juju-docs
    -Leave behind RST!
    -pandoc doesn't support some of the newer HTML5 elements (asides, etc)
    HTML5 w/ JavaScript it is!
    -Make sure the juju.ubuntu.com landing page tells the Juju story well (what 
is it, why should I use it?)
    -Dynamic Content
  -Screencasts
  -Code Examples
  -Better references in sidebars with improved navigation
  -Annotate the juju deploy command
  -Annotate Juju GUI deploy
  -Annotate the anatomy of a charm
  
  --- Ideas ---
  Good examples:
   *stripe.com (https://stripe.com/docs)
     -good journeys
     -not overwhelming
   *http://developer.android.com/sdk/index.html
     -http://developer.android.com/tools/workflow/index.html
  
  Would like to have good sidebar notes, screen shots, tips, and examples.
  
  Would like to have knowledge of where I am in a journey
    *Getting Started (Juju deploy)
    *Charm Development
    *Charm Deployment
    *Charm Discovery
    *Charm Debugging
  
  Model magazines
    *Wired
    *Linux Format
    *Content rich
  
  Need version control
  Want Markdown not RST
  Get rid of anything marked draft
  Need organization of what we currently have.
  Evaluate Pandoc instead of Sphinx
  Discuss GUI and Charm Browser
  
  --- Immediate Needs ---
  Verify the usability of docs (are the instructions correct).
  Identify a model for the docs (framework)
    *Not docbook - too complicated and print-biased
    *Sphinx? Needed since core is now Go?
    *Something markdown based?
    *simple HTML5?
  Milestone
    *13.04 default for Go
    *How to use it
    *Transition
  -Differences
  -Py and Go living together
  
  --- Organization ---
  
  juju users:
  Getting started
     Local provider configuration
     OpenStack provider configuration
     EC2 provider configuration
     Rackspace
     New Providers
     Deploying a Charm
     Exposing A charm
     Implicit Relations
     Machine Constraints
   * User tutorial - rename to “Using Juju” (should be embedded per topic)
  Charms
  The Juju Charm Store (for users)
  Handling common errors
  
  charm authors:
  Writing a charm
  The Juju Charm Store
  Charm Testing
  Hook debugging
  Charm Store Policy
  Charm Quality Rating
  Debugging
  
  Reference Guide
    Full table of all charm attributes
    Full table of all juju commands
    example: http://developer.android.com/reference/packages.html
    Glossary -appendix
    Operating Systems - appendix stuff
    Relation references - appendix stuff
  
  Evaluate for removal:
  -move-
   About juju - *move to juju.ubuntu.com.
  -delete-
  Frequently Asked Questions - DELETE THIS, link to autogenerated FAQ at AU.
  juju modules - remove Juju core specific
  Drafts - REMOVE.
  -content should be covered-
  Implementation details - (should be in reference guide for charm 
attributes)
  Commands to work with relation settings and membership (be sure to cover 
in charm author)
  Upgrades - both (be sure how to implement charm upgrade hooks are covered 
in charm author)
  Charm Upgrades (be sure to be covered in Charm Users: deployment)
  New relation-ids hook command -(should be covered in charm author)
  
  # Documentation Framework (Courseware)
  
  https://juju.ubuntu.com/docs/charm.html
  -Make sure to review the docs, and have a correlation between user journeys 
and screencasts.
  - Put screencasts in the docs
  -Screencasts and docs should reference each other.
  
  0 -- New to Juju
  1. [user] Installing and configuring Juju
   i. https://juju.ubuntu.com/docs/getting-started.html
   1. Charm Introduction
     i. https://juju.ubuntu.com/docs/charm.html
  1. Charm Discovery/ Charm Store
    i https://juju.ubuntu.com/docs/charm-store.html
  1. [user] Deploying your first charm (include constraints)
   i. https://juju.ubuntu.com/docs/user-tutorial.html
  
  1. [user] How to use relations
  creating relations
  removing relations
  1. [user] Service configuration/lifecycle management: using the 
configuration.yaml and juju set
    i. https://juju.ubuntu.com/docs/service-config.html
  
  1. [user] Scaling services
  1. [user] Debugging deployment - debug logs and friends
  1. [user] Advanced topics (deploy-to, jitsu and friends)
    -need to confirm how the plug-in model works in Juju 2.0
  
  1- New Charm Author
  Review https://juju.ubuntu.com/docs/write-charm.html
  
  1. Submitting your first contribution to a charm
  1. Writing a new charm from scratch.
    i. https://juju.ubuntu.com/docs/write-charm.html
    a. charm tools
    a. meta-data/readme/licensing/examples/description/how to use/why to use 
from Ops or Dev perspective
    a. Hooks
    a. relations
   i. interface
   i. requires
    a. testing
  1. Publishing to the Juju Charm Store
  1. How to improve your charm using

[Blueprint servercloud-s-juju-charmhelper2] Charm Helper 2 - Declarative Charming

2013-05-10 Thread Mark Mims
Blueprint changed by Mark Mims:

Whiteboard changed:
  [USER STORIES]
  
  Rodrigo is a new charm writer and needs tools to solve problems other
  authors have already solved in their charms.
  
  Dora is a seasoned charmer that has common functions she wants to
  publish and share with other authors
  
  Ricardo has written charms before and wants to know what shared
  functions are available for charms
  
  Lucy is rewriting several hooks to another language and needs to know
  what helpers exist in that language
  
  Bruce wants to review what parameters a helper function accepts and what
  the expected output is
  
  [ASSUMPTIONS]
- 
  
  [RISKS]
  
  - It'll be difficult for some functions to create a comparable cross-
  language function
  
  [IN SCOPE]
  
  [OUT OF SCOPE]
  
  [USER ACCEPTANCE]
  
  [RELEASE NOTE/BLOG]
  
  --- UDS Discussion ---
  
  We've learned a lot since the creation of charm helper. Debian packagers  
took 7 iterations before boiling all of debhelper's goodness into a  
declarative system. We can learn from them, and get there in our second  
iteration.
  Discussion:
  Charm-helper is not easily discoverable, poorly documented, and not as 
awesome as it could be.
  dannf likes to write makefiles
  sources list
  config settings with meaning
  want common things to go in a declarative charm
  get rid of copied lib files/folder
  Maintain charm-helper separately from juju, calling it at the top of the hook 
vs. included in juju trunk
  What happens during upgrade-charm or if a package gets removed?
  Juju can't help with leader election across units of a service, but 
charm-helper could
  Work:
  - describe how to handle lifecycle changes for packages in 
charm-helper/packages (also think of: upgrade-charm)
  - install packages
  - debconf preseeding too
  - templating and/or building config files with dotd/concat partials 
(erb_template _and_ cheetah_template helpers)? (see the sketch after this list)
  - remote_files
  - deploy from {distro,ppa,upstream?} ?
  - Manage config files (dynamic, static, etc)
  - Sanitize relation settings (potential risk of SQL injection, hijacking, 
terror)
  - Plugin based for easy extension of charm-helper
  - leader election (perhaps somewhere other than charmhelper)
  - private files (SSL certs) ... no clue...
  - Collaborate on files that no charm owns
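  
  A hedged sketch of the declarative direction discussed above: the charm lists
  what it needs (packages, config files to render) as plain data, and a helper
  applies it. The SPEC structure, file paths, and function names here are
  illustrative, not an existing charm-helper API; template rendering uses only
  the Python standard library.
  
    import subprocess
    from string import Template
    
    SPEC = {
        "packages": ["nginx", "python-yaml"],
        "templates": [
            # (template source, destination, values substituted into it)
            ("templates/site.conf.tmpl",
             "/etc/nginx/sites-enabled/default",
             {"port": "80"}),
        ],
    }
    
    def apply_spec(spec):
        # Install the declared packages (apt skips ones already installed).
        subprocess.check_call(["apt-get", "install", "-y"] + spec["packages"])
        # Render the declared templates with stdlib string.Template.
        for src, dest, context in spec["templates"]:
            with open(src) as f:
                rendered = Template(f.read()).substitute(context)
            with open(dest, "w") as f:
                f.write(rendered)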
+ 
+ ---
+ [notes from cloudsprint 2013-05]
+ - Wedgewood has “Charm Support” which is used in IS and online services charms
+   - Python based library
+   - Covers debian packaging
+   - Command line interface
+ - Bash support via argparse
+   - Has hook env for handling various hook specific items (nrpe, relation 
data, etc)
+   - Exec.d allows you to pre-seed stuff into the charm prior to deployment to 
perform “pre-install” tasks
+ - Handles items as exec.d/charm-pre-install in various hooks by invoking 
the matching command
+   - “Rudimentary” persistent storage support
+   - Charm-helpers doesn’t do what they wanted it to; charm-helpers wasn’t 
known to them until after they were using charm-support.
+ - OpenStack has “openstack-charm-helpers” which is used by the OpenStack 
charms for deployment
+   - Configuration/relation abstracting for easier python support
+   - “Storage” management like charm-support
+   - Conditional restart of services, mapping dicts to services to manage 
restarts (a sketch appears after this block)
+   - Caching of juju environment data, to speed up hook execution
+   - Local file syncs between peers
+ - Bi-directional file xfer between peers
+   - Didn’t use it because: Primitive, moving too fast, and didn’t achieve 
anything they needed at the time.
+- Current work: lp:~openstack-charmers/openstack-charm-helpers/ha-helpers
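  
  A hedged sketch of the conditional-restart idea mentioned above: map watched
  config files to the services that depend on them, hash the files around a
  hook, and restart only the services whose files changed. The map contents and
  function names are illustrative, not the openstack-charm-helpers API.
  
    import hashlib
    import subprocess
    
    RESTART_MAP = {
        "/etc/nova/nova.conf": ["nova-api", "nova-scheduler"],
        "/etc/nova/api-paste.ini": ["nova-api"],
    }
    
    def file_hash(path):
        try:
            with open(path, "rb") as f:
                return hashlib.md5(f.read()).hexdigest()
        except IOError:
            return None
    
    def restart_on_change(restart_map, func):
        # Run func(); restart any service whose watched config file changed.
        before = dict((p, file_hash(p)) for p in restart_map)
        func()
        for path, services in restart_map.items():
            if file_hash(path) != before[path]:
                for svc in services:
                    subprocess.check_call(["service", svc, "restart"])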
+ 
+ Gary’s Ideas:
+  - Python helpers of charm-helpers
+   - Three levels of stuff
+ - Shelltoolbox
+ - Current charm helpers
+ - some random lib stuff?
+   - utils.py branch/release management
+   - backend.py specify repos that a config wants, packages. Does really cool 
python stuff
+  
+ TIME TO GET DOWN TO BUSINESS
+  - Salt/chef/puppet for potential dropins to research. Configuration 
management tools and how they solve this
+  - Look at openstack for management of shared libraries
+  - Python package
+- Sub packages and modules
+   - Merge proposals for charms, make libs not txt?
+ 
+ 
+ 
+ Marco scratch
+  machine.py for machine level functionality

-- 
Charm Helper 2 - Declarative Charming
https://blueprints.launchpad.net/ubuntu/+spec/servercloud-s-juju-charmhelper2

-- 
Ubuntu-server-bugs mailing list
Ubuntu-server-bugs@lists.ubuntu.com
Modify settings or unsubscribe at: 
https://lists.ubuntu.com/mailman/listinfo/ubuntu-server-bugs


[Blueprint servercloud-q-juju-charm-best-practices] Juju Charm Best practices

2012-10-15 Thread Mark Mims
Blueprint changed by Mark Mims:

Work items changed:
  Work items:
  [jorge] Pick better example charms, flagbearer charms.: DONE
  [jorge] Choose a single place for presenting flagbearer charms (charm 
browser?): TODO
  [jorge] Move rules and charm best practices into juju/docs.: DONE
  [jorge] Add Maintainer field information to documentation. (thanks clint!): 
DONE
  [jorge] Reach out to maintainers to get their maintainer field filled out.: 
DONE
  [jorge] Add docs work items to patch pilot!: DONE
  [jorge] pull file bug or join the doc team from other projects: TODO
  [jorge] Clarify documentation on what to expect when you're using LXC and EC2. 
Cost, wait time, what you see, logs, downloading. You should expect this to 
download 300 megs, etc. Here's a table of where to look when troubleshooting your 
LXC thing: DONE
  [imbrandon] Reach out to maintainers to get their maintainer field filled out.: DONE
  [imbrandon] Put Doc instructions in the docs: http://askubuntu.com/q/52063 : 
TODO
- [mark-mims] (and individual maintainers!) clean up existing charms wrt best 
practices: TODO
+ [mark-mims] (and individual maintainers!) clean up existing charms wrt best 
practices: DONE
  [clint-fewbar] Comment and add links to docs in 'charm create' templates: 
POSTPONED
  [clint-fewbar] juju in Debian! NOW!: DONE
  [clint-fewbar] determine most likely maintainers based on bzr logs: DONE
  [clint-fewbar] fix doc repos wrt (trunk, go, docs): DONE
  [marcoceppi] tag:juju review on AU and SE network: DONE
  [hazmat] incorporate output of charm proof into charm browser: DONE
  [hazmat] clear out trunk docs for juju: DONE
  [imbrandon] Clarify documentation on what to expect when you're using LXC and 
EC2. Cost, wait time, what you see, logs, downloading. You should expect this 
to download 300 megs, etc. Here's a table of where to look when troubleshooting 
your LXC thing.: TODO
  [hazmat] make juju report on initial lxc image downloads ?: TODO
  [clint-fewbar] make juju lxc use lxc networking instead of libvirt networking 
?: INPROGRESS

-- 
Juju Charm Best practices
https://blueprints.launchpad.net/ubuntu/+spec/servercloud-q-juju-charm-best-practices

-- 
Ubuntu-server-bugs mailing list
Ubuntu-server-bugs@lists.ubuntu.com
Modify settings or unsubscribe at: 
https://lists.ubuntu.com/mailman/listinfo/ubuntu-server-bugs


[Blueprint servercloud-q-juju-charm-unit-tests] Juju Charm Unit Tests

2012-10-15 Thread Mark Mims
Blueprint changed by Mark Mims:

Work items changed:
  Work items:
  documentation for unit tests: INPROGRESS
- write examples of unit tests on flag-bearing charms: INPROGRESS
+ write examples of unit tests on flag-bearing charms (example charm): DONE
  [jimbaker] make 'jitsu run-tests' aware of error codes (Jitsu watch): 
INPROGRESS
  [jimbaker] improve charmtester logging information: DONE
  [hazmat] charm proof testing: DONE
- keep everything green: INPROGRESS
- [mark-mims] make charmtester easier to use one-off by anyone: INPROGRESS
+ keep everything green (ongoing task): DONE
+ [mark-mims] make charmtester easier to use one-off by anyone: DONE
  [mark-mims] make charmtester whitelist: DONE
  extend charm tools to help generate tests: POSTPONED
  run charmtester against maas environment: TODO

-- 
Juju Charm Unit Tests
https://blueprints.launchpad.net/ubuntu/+spec/servercloud-q-juju-charm-unit-tests

-- 
Ubuntu-server-bugs mailing list
Ubuntu-server-bugs@lists.ubuntu.com
Modify settings or unsubscribe at: 
https://lists.ubuntu.com/mailman/listinfo/ubuntu-server-bugs


[Blueprint servercloud-q-juju-charm-unit-tests] Juju Charm Unit Tests

2012-08-28 Thread Mark Mims
Blueprint changed by Mark Mims:

Work items changed:
  Work items:
  documentation for unit tests: INPROGRESS
- write examples of unit tests on flag-bearing charms: TODO
+ write examples of unit tests on flag-bearing charms: INPROGRESS
  [jimbaker] make 'jitsu run-tests' aware of error codes (Jitsu watch): 
INPROGRESS
  [jimbaker] improve charmtester logging information: DONE
  [hazmat] charm proof testing: TODO
  keep everything green: INPROGRESS
  [mark-mims] make charmtester easier to use one-off by anyone: INPROGRESS
- [mark-mims] make charmtester whitelist: INPROGRESS
- extend charm tools to help generate tests: TODO
+ [mark-mims] make charmtester whitelist: DONE
+ extend charm tools to help generate tests: POSTPONED
  run charmtester against maas environment: TODO

-- 
Juju Charm Unit Tests
https://blueprints.launchpad.net/ubuntu/+spec/servercloud-q-juju-charm-unit-tests

-- 
Ubuntu-server-bugs mailing list
Ubuntu-server-bugs@lists.ubuntu.com
Modify settings or unsubscribe at: 
https://lists.ubuntu.com/mailman/listinfo/ubuntu-server-bugs


[Blueprint servercloud-q-juju-integration] Juju Integration

2012-08-28 Thread Mark Mims
Blueprint changed by Mark Mims:

Work items changed:
  Work items:
- [mark-mims] charm to deploy chef-server (maybe w sub like clint's puppet sub) 
: TODO
- [mark-mims] charms that call chef solo recipes : TODO
+ [mark-mims] charm to deploy chef-server (maybe w sub like clint's puppet sub) 
: POSTPONED
+ [mark-mims] charms that call chef solo recipes : POSTPONED
  [negronjl] jujustrano ( Juju Capistrano integration ) : DONE
- [negronjl] go to town on clint's puppet sub ( puppetforge? ) ( add charm 
create option for puppet recipes ) : TODO
+ [negronjl] go to town on clint's puppet sub ( puppetforge? ) ( add charm 
create option for puppet recipes ) : POSTPONED
  [negronjl] define ways that juju would need to call Capistrano : DONE
  [negronjl] jrapi as juju-jitsu subcommand... 'jitsu api' : POSTPONED
  [negronjl] integrate Juju with facter : DONE
- [hazmat] export/import environment into juju-jitsu : TODO
+ [hazmat] export/import environment into juju-jitsu : DONE
  [negronjl] how to make charms out of puppetforge modules / github cookbooks : 
POSTPONED
  [negronjl] plugin for chef-search to interface with juju (possibly just chef 
sub) : POSTPONED
  [mike-mcclurg] cloudstack integration : TODO
  [imbrandon] investigate enstratus integration : TODO

-- 
Juju Integration
https://blueprints.launchpad.net/ubuntu/+spec/servercloud-q-juju-integration

-- 
Ubuntu-server-bugs mailing list
Ubuntu-server-bugs@lists.ubuntu.com
Modify settings or unsubscribe at: 
https://lists.ubuntu.com/mailman/listinfo/ubuntu-server-bugs


[Blueprint servercloud-q-juju-charm-unit-tests] Juju Charm Unit Tests

2012-08-28 Thread Mark Mims
Blueprint changed by Mark Mims:

Work items changed:
  Work items:
  documentation for unit tests: INPROGRESS
  write examples of unit tests on flag-bearing charms: INPROGRESS
  [jimbaker] make 'jitsu run-tests' aware of error codes (Jitsu watch): 
INPROGRESS
  [jimbaker] improve charmtester logging information: DONE
- [hazmat] charm proof testing: TODO
+ [hazmat] charm proof testing: DONE
  keep everything green: INPROGRESS
  [mark-mims] make charmtester easier to use one-off by anyone: INPROGRESS
  [mark-mims] make charmtester whitelist: DONE
  extend charm tools to help generate tests: POSTPONED
  run charmtester against maas environment: TODO

-- 
Juju Charm Unit Tests
https://blueprints.launchpad.net/ubuntu/+spec/servercloud-q-juju-charm-unit-tests

-- 
Ubuntu-server-bugs mailing list
Ubuntu-server-bugs@lists.ubuntu.com
Modify settings or unsubscribe at: 
https://lists.ubuntu.com/mailman/listinfo/ubuntu-server-bugs


[Bug 1037331] [NEW] lxc-create should clear the cache when interrupted

2012-08-15 Thread Mark Mims
Public bug reported:

hallyn: if you interrupt lxc-create, it *will* clear out the container,
but not the cache

** Affects: lxc (Ubuntu)
 Importance: Undecided
 Status: New

-- 
You received this bug notification because you are a member of Ubuntu
Server Team, which is subscribed to lxc in Ubuntu.
https://bugs.launchpad.net/bugs/1037331

Title:
  lxc-create should clear the cache when interrupted

To manage notifications about this bug go to:
https://bugs.launchpad.net/ubuntu/+source/lxc/+bug/1037331/+subscriptions

-- 
Ubuntu-server-bugs mailing list
Ubuntu-server-bugs@lists.ubuntu.com
Modify settings or unsubscribe at: 
https://lists.ubuntu.com/mailman/listinfo/ubuntu-server-bugs


[Blueprint servercloud-q-juju-charm-unit-tests] Juju Charm Unit Tests

2012-07-24 Thread Mark Mims
Blueprint changed by Mark Mims:

Work items changed:
  Work items:
- documentation for unit tests: TODO
+ documentation for unit tests: INPROGRESS
  write examples of unit tests on flag-bearing charms: TODO
  [jimbaker] make 'jitsu run-tests' aware of error codes (Jitsu watch): 
INPROGRESS
- [jimbaker] improve charmtester logging information: TODO
+ [jimbaker] improve charmtester logging information: DONE
  [hazmat] charm proof testing: TODO
- keep everything green: TODO
- [mark-mims] make charmtester easier to use one-off by anyone: TODO
- [mark-mims] make charmtester whitelist: TODO
+ keep everything green: INPROGRESS
+ [mark-mims] make charmtester easier to use one-off by anyone: INPROGRESS
+ [mark-mims] make charmtester whitelist: INPROGRESS
  extend charm tools to help generate tests: TODO
  run charmtester against maas environment: TODO

-- 
Juju Charm Unit Tests
https://blueprints.launchpad.net/ubuntu/+spec/servercloud-q-juju-charm-unit-tests

-- 
Ubuntu-server-bugs mailing list
Ubuntu-server-bugs@lists.ubuntu.com
Modify settings or unsubscribe at: 
https://lists.ubuntu.com/mailman/listinfo/ubuntu-server-bugs


[Blueprint servercloud-p-juju-charm-testing] Juju: automated testing of charms

2012-03-28 Thread Mark Mims
Blueprint changed by Mark Mims:

Whiteboard changed:
  Status: Spec nearing final approval. Merge proposals submitted against
  juju to implement some parts of the wrapper inside juju itself:
  http://pad.lv/939932 , http://pad.lv/939944
  
  Work Items:
  [clint-fewbar] write spec for charm testing facility: DONE
  implement specified testing framework: INPROGRESS
  [mark-mims] deploy testing framework for use with local provider (all phase1 
tests are done w/ local provider): DONE
- deploy testing framework for use against ec2: TODO
- deploy testing framework for use against canonistack: TODO
+ [mark-mims] deploy testing framework for use against ec2: INPROGRESS
+ [mark-mims] deploy testing framework for use against canonistack: INPROGRESS
  deploy testing framework for use against orchestra (managing VMs instead of 
machines): POSTPONED
  write charm tests for mysql: POSTPONED
  [clint-fewbar]  write charm tests for haproxy: POSTPONED
  [clint-fewbar]  write charm tests for wordpress: POSTPONED
  [mark-mims]  write charm tests for hadoop: POSTPONED
  [james-page]  add openstack tests: TODO
  [mark-mims]  jenkins charm to spawn basic charm tests: DONE
  [mark-mims]  basic charm tests... just test install hooks for now: DONE
  
  Session notes:
  Welcome to Ubuntu Developer Summit!
  #uds-p #track #topic
  put your session notes here
  Requirements of automated testing of charms:
  * LET'S KEEP IT SIMPLE! :-)
  * Detect breakage of a charm relating to an interface
  * Identification of individual change which breaks a given relationship
  * Maybe implement tests that mock a relation to ensure implementers are 
compliant
  * Test dependent charms when a provider charm changes
  * Run test NxN of providers and requirers so all permutations are sane 
(_very_ expensive, probably impossible)
  * Run testing against multiple environment providers (EC2/OpenStack/BareMetal)
  * Notify maintainers when the charm breaks, rather than waiting for polling
  * Verify idempotency of hooks (a small check is sketched after this list)
  * Tricky to _verify_, and not an enforced convention at the moment, so 
not sure
  * be able to specify multiple scenarios
  * For functional tests, they are in fact exercising multiple charms. Should 
those sit
    within the charms, or outside since it's in fact exercising the whole graph?
    * The place for these composed tests seem to be the stack
  * As much data as possible should be collected about the running tests so 
that a broken
    charm can be debugged and fixed.
  * Provide rich artifacts for failure analysis
  * Ideally tests will be run in lock step mode, so that breaking charms can 
be individually
    identified, but this is hard because changes may be co-dependent
  * It would be nice to have interface-specific tests that can run against any 
charms that
    implement such interfaces. In addition to working as tests, this is also a 
pragmatic
    way to document the interface.
  * support gerrit-like topics?  (What's that? :-) i.e., change-sets (across 
different branches)
  * We need a way to know which charms trigger which tests
  * Keep it simple
  * James mentioned he'd like to have that done by Alpha 1 (December) so that 
he can take
    that into account for the OpenStack testing effort.
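  
  A hedged sketch of one way to check hook idempotency (noted above as "tricky
  to verify"): run the install hook twice and compare a filesystem snapshot; a
  second run should succeed and change nothing. Entirely illustrative; the hook
  and watched-directory paths are assumptions.
  
    import hashlib
    import os
    import subprocess
    
    def tree_digest(root):
        # Hash file paths and contents under root into one digest.
        digest = hashlib.sha256()
        for dirpath, _, filenames in sorted(os.walk(root)):
            for name in sorted(filenames):
                path = os.path.join(dirpath, name)
                digest.update(path.encode("utf-8"))
                with open(path, "rb") as f:
                    digest.update(f.read())
        return digest.hexdigest()
    
    def check_idempotent(hook_path="hooks/install", watched_dir="/etc/myservice"):
        subprocess.check_call([hook_path])           # first run: sets things up
        before = tree_digest(watched_dir)
        subprocess.check_call([hook_path])           # second run: must be a no-op
        assert tree_digest(watched_dir) == before, "install hook is not idempotent"
  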
  ACTIONS:
  [niemeyer] spec
  [james-page] add openstack tests
  
  Proposal below is too complicated, rejected (Kept for posterity)
  
  Proposal:
  
  Each charm has a tests directory
  
  Under tests, you have executables:
  
  __install__ -- test to run after charm is installed with no relations
  
  Then two directories:
  provides/
  requires/
  
  These directories have a directory underneath for each interface
  provided/required. Those directories contain executables to run.
  
  The test runner follows the following method:
  
  deploy charm
  wait for installed status
  run __install__ script, FAIL if exits non-zero
  destroy service
  for interface in provides ; do
  calculate graph of all charms in store which require interface and all of 
its dependency combinations
  deploy requiring charm w/ dependencies and providing service
  add-relation between requiring/providing
  for test in provides/interface ; do
    run test with name of deployed requiring service
  for interface in requires ; do
  repeat process above with provides/requires transposed
  
  Each commit to any branch in charm store will queue up a run with only
  that change applied, none that have been done after it, and record
  pass/fail
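  
  For illustration only, a minimal sketch of the rejected runner loop above.
  The juju commands (deploy, add-relation, destroy-service) are real CLI calls;
  wait_for_installed and find_requirers are placeholders for the status-polling
  and charm-store-query pieces the proposal leaves open.
  
    import subprocess
    import time
    
    def sh(*cmd):
        subprocess.check_call(list(cmd))
    
    def wait_for_installed(service):
        # Placeholder: a real runner would poll `juju status` for the unit state.
        time.sleep(60)
    
    def find_requirers(interface):
        # Placeholder: a real runner would query the charm store for charms
        # that require this interface (plus their dependency combinations).
        return []
    
    def run_install_test(charm):
        sh("juju", "deploy", charm)
        wait_for_installed(charm)
        if subprocess.call(["tests/__install__"]) != 0:
            raise SystemExit("FAIL: __install__ for %s" % charm)
        sh("juju", "destroy-service", charm)
    
    def run_provides_tests(charm, provides, tests_by_interface):
        for interface in provides:
            for requirer in find_requirers(interface):
                sh("juju", "deploy", requirer)
                sh("juju", "add-relation", charm, requirer)
                for test in tests_by_interface.get(interface, []):
                    subprocess.check_call([test, requirer])
                sh("juju", "destroy-service", requirer)
        # The requires/ side repeats this loop with the roles transposed.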

-- 
Juju: automated testing of charms
https://blueprints.launchpad.net/ubuntu/+spec/servercloud-p-juju-charm-testing

-- 
Ubuntu-server-bugs mailing list
Ubuntu-server-bugs@lists.ubuntu.com
Modify settings or unsubscribe at: 
https://lists.ubuntu.com/mailman/listinfo/ubuntu-server-bugs


[Blueprint servercloud-p-juju-charm-testing] Juju: automated testing of charms

2012-03-28 Thread Mark Mims
Blueprint changed by Mark Mims:

Whiteboard changed:
  Status: Spec nearing final approval. Merge proposals submitted against
  juju to implement some parts of the wrapper inside juju itself:
  http://pad.lv/939932 , http://pad.lv/939944
  
  Work Items:
  [clint-fewbar] write spec for charm testing facility: DONE
  implement specified testing framework: INPROGRESS
- [mark-mims] deploy testing framework for use with local provider (all phase1 
tests are done w/ local provider): DONE
+ [mark-mims] deploy testing framework for use with local provider (all phase1 
tests and charm graph tests are done w/ local provider): DONE
  [mark-mims] deploy testing framework for use against ec2: INPROGRESS
  [mark-mims] deploy testing framework for use against canonistack: INPROGRESS
  deploy testing framework for use against orchestra (managing VMs instead of 
machines): POSTPONED
  write charm tests for mysql: POSTPONED
  [clint-fewbar]  write charm tests for haproxy: POSTPONED
  [clint-fewbar]  write charm tests for wordpress: POSTPONED
  [mark-mims]  write charm tests for hadoop: POSTPONED
  [james-page]  add openstack tests: TODO
  [mark-mims]  jenkins charm to spawn basic charm tests: DONE
  [mark-mims]  basic charm tests... just test install hooks for now: DONE
  
  Session notes:
  Welcome to Ubuntu Developer Summit!
  #uds-p #track #topic
  put your session notes here
  Requirements of automated testing of charms:
  * LET'S KEEP IT SIMPLE! :-)
  * Detect breakage of a charm relating to an interface
  * Identification of individual change which breaks a given relationship
  * Maybe implement tests that mock a relation to ensure implementers are 
compliant
  * Test dependent charms when a provider charm changes
  * Run test NxN of providers and requirers so all permutations are sane 
(_very_ expensive, probably impossible)
  * Run testing against multiple environment providers (EC2/OpenStack/BareMetal)
  * Notify maintainers when the charm breaks, rather than waiting for polling
  * Verify idempotency of hooks
  * Tricky to _verify_, and not an enforced convention at the moment, so 
not sure
  * be able to specify multiple scenarios
  * For functional tests, they are in fact exercising multiple charms. Should 
those sit
    within the charms, or outside since it's in fact exercising the whole graph?
    * The place for these composed tests seem to be the stack
  * As much data as possible should be collected about the running tests so 
that a broken
    charm can be debugged and fixed.
  * Provide rich artifacts for failure analysis
  * Ideally tests will be run in lock step mode, so that breaking charms can 
be individually
    identified, but this is hard because changes may be co-dependent
  * It would be nice to have interface-specific tests that can run against any 
charms that
    implement such interfaces. In addition to working as tests, this is also a 
pragmatic
    way to document the interface.
  * support gerrit-like topics?  (What's that? :-) i.e., change-sets (across 
different branches)
  * We need a way to know which charms trigger which tests
  * Keep it simple
  * James mentioned he'd like to have that done by Alpha 1 (December) so that 
he can take
    that into account for the OpenStack testing effort.
  ACTIONS:
  [niemeyer] spec
  [james-page] add openstack tests
  
  Proposal below is too complicated, rejected (Kept for posterity)
  
  Proposal:
  
  Each charm has a tests directory
  
  Under tests, you have executables:
  
  __install__ -- test to run after charm is installed with no relations
  
  Then two directories:
  provides/
  requires/
  
  These directories have a directory underneath for each interface
  provided/required. Those directories contain executables to run.
  
  The test runner follows the following method:
  
  deploy charm
  wait for installed status
  run __install__ script, FAIL if exits non-zero
  destroy service
  for interface in provides ; do
  calculate graph of all charms in store which require interface and all of 
its dependency combinations
  deploy requiring charm w/ dependencies and providing service
  add-relation between requiring/providing
  for test in provides/interface ; do
    run test with name of deployed requiring service
  for interface in requires ; do
  repeat process above with provides/requires transposed
  
  Each commit to any branch in charm store will queue up a run with only
  that change applied, none that have been done after it, and record
  pass/fail

-- 
Juju: automated testing of charms
https://blueprints.launchpad.net/ubuntu/+spec/servercloud-p-juju-charm-testing

-- 
Ubuntu-server-bugs mailing list
Ubuntu-server-bugs@lists.ubuntu.com
Modify settings or unsubscribe at: 
https://lists.ubuntu.com/mailman/listinfo/ubuntu-server-bugs


[Blueprint servercloud-p-juju-charm-testing] Juju: automated testing of charms

2012-03-28 Thread Mark Mims
Blueprint changed by Mark Mims:

Whiteboard changed:
  Status: Spec nearing final approval. Merge proposals submitted against
  juju to implement some parts of the wrapper inside juju itself:
  http://pad.lv/939932 , http://pad.lv/939944
  
  Work Items:
  [clint-fewbar] write spec for charm testing facility: DONE
- implement specified testing framework: INPROGRESS
+ implement specified testing framework (phase1): DONE
  [mark-mims] deploy testing framework for use with local provider (all phase1 
tests and charm graph tests are done w/ local provider): DONE
  [mark-mims] deploy testing framework for use against ec2: INPROGRESS
  [mark-mims] deploy testing framework for use against canonistack: INPROGRESS
  deploy testing framework for use against orchestra (managing VMs instead of 
machines): POSTPONED
  write charm tests for mysql: POSTPONED
  [clint-fewbar]  write charm tests for haproxy: POSTPONED
  [clint-fewbar]  write charm tests for wordpress: POSTPONED
  [mark-mims]  write charm tests for hadoop: POSTPONED
  [james-page]  add openstack tests: TODO
  [mark-mims]  jenkins charm to spawn basic charm tests: DONE
  [mark-mims]  basic charm tests... just test install hooks for now: DONE
  
  Session notes:
  Welcome to Ubuntu Developer Summit!
  #uds-p #track #topic
  put your session notes here
  Requirements of automated testing of charms:
  * LET'S KEEP IT SIMPLE! :-)
  * Detect breakage of a charm relating to an interface
  * Identification of individual change which breaks a given relationship
  * Maybe implement tests that mock a relation to ensure implementers are 
compliant
  * Test dependent charms when a provider charm changes
  * Run test NxN of providers and requirers so all permutations are sane 
(_very_ expensive, probably impossible)
  * Run testing against multiple environment providers (EC2/OpenStack/BareMetal)
  * Notify maintainers when the charm breaks, rather than waiting for polling
  * Verify idempotency of hooks
  * Tricky to _verify_, and not an enforced convention at the moment, so 
not sure
  * be able to specify multiple scenarios
  * For functional tests, they are in fact exercising multiple charms. Should 
those sit
    within the charms, or outside since it's in fact exercising the whole graph?
    * The place for these composed tests seem to be the stack
  * As much data as possible should be collected about the running tests so 
that a broken
    charm can be debugged and fixed.
  * Provide rich artifacts for failure analysis
  * Ideally tests will be run in lock step mode, so that breaking charms can 
be individually
    identified, but this is hard because changes may be co-dependent
  * It would be nice to have interface-specific tests that can run against any 
charms that
    implement such interfaces. In addition to working as tests, this is also a 
pragmatic
    way to document the interface.
  * support gerrit-like topics?  (What's that? :-) i.e., change-sets (across 
different branches)
  * We need a way to know which charms trigger which tests
  * Keep it simple
  * James mentioned he'd like to have that done by Alpha 1 (December) so that 
he can take
    that into account for the OpenStack testing effort.
  ACTIONS:
  [niemeyer] spec
  [james-page] add openstack tests
  
  Proposal below is too complicated, rejected (Kept for posterity)
  
  Proposal:
  
  Each charm has a tests directory
  
  Under tests, you have executables:
  
  __install__ -- test to run after charm is installed with no relations
  
  Then two directories:
  provides/
  requires/
  
  These directories have a directory underneath for each interface
  provided/required. Those directories contain executables to run.
  
  The test runner follows the following method:
  
  deploy charm
  wait for installed status
  run __install__ script, FAIL if exits non-zero
  destroy service
  for interface in provides ; do
  calculate graph of all charms in store which require interface and all of 
its dependency combinations
  deploy requiring charm w/ dependencies and providing service
  add-relation between requiring/providing
  for test in provides/interface ; do
    run test with name of deployed requiring service
  for interface in requires ; do
  repeat process above with provides/requires transposed
  
  Each commit to any branch in charm store will queue up a run with only
  that change applied, none that have been done after it, and record
  pass/fail

-- 
Juju: automated testing of charms
https://blueprints.launchpad.net/ubuntu/+spec/servercloud-p-juju-charm-testing

-- 
Ubuntu-server-bugs mailing list
Ubuntu-server-bugs@lists.ubuntu.com
Modify settings or unsubscribe at: 
https://lists.ubuntu.com/mailman/listinfo/ubuntu-server-bugs


[Blueprint servercloud-p-juju-charm-testing] Juju: automated testing of charms

2012-03-28 Thread Mark Mims
Blueprint changed by Mark Mims:

Whiteboard changed:
  Status: Spec nearing final approval. Merge proposals submitted against
  juju to implement some parts of the wrapper inside juju itself:
  http://pad.lv/939932 , http://pad.lv/939944
  
  Work Items:
  [clint-fewbar] write spec for charm testing facility: DONE
  implement specified testing framework (phase1): DONE
  [mark-mims] deploy testing framework for use with local provider (all phase1 
tests and charm graph tests are done w/ local provider): DONE
- [mark-mims] deploy testing framework for use against ec2: INPROGRESS
- [mark-mims] deploy testing framework for use against canonistack: INPROGRESS
+ [mark-mims] deploy testing framework for use against ec2: POSTPONED (this is 
an out-of-cycle activity)
+ [mark-mims] deploy testing framework for use against canonistack: POSTPONED 
(this is an out-of-cycle activity)
  deploy testing framework for use against orchestra (managing VMs instead of 
machines): POSTPONED
  write charm tests for mysql: POSTPONED
  [clint-fewbar]  write charm tests for haproxy: POSTPONED
  [clint-fewbar]  write charm tests for wordpress: POSTPONED
  [mark-mims]  write charm tests for hadoop: POSTPONED
  [james-page]  add openstack tests: TODO
  [mark-mims]  jenkins charm to spawn basic charm tests: DONE
  [mark-mims]  basic charm tests... just test install hooks for now: DONE
  
  Session notes:
  Welcome to Ubuntu Developer Summit!
  #uds-p #track #topic
  put your session notes here
  Requirements of automated testing of charms:
  * LET'S KEEP IT SIMPLE! :-)
  * Detect breakage of a charm relating to an interface
  * Identification of individual change which breaks a given relationship
  * Maybe implement tests that mock a relation to ensure implementers are 
compliant
  * Test dependent charms when a provider charm changes
  * Run test NxN of providers and requirers so all permutations are sane 
(_very_ expensive, probably impossible)
  * Run testing against multiple environment providers (EC2/OpenStack/BareMetal)
  * Notify maintainers when the charm breaks, rather than waiting for polling
  * Verify idempotency of hooks
  * Tricky to _verify_, and not an enforced convention at the moment, so 
not sure
  * be able to specify multiple scenarios
  * For functional tests, they are in fact exercising multiple charms. Should 
those sit
    within the charms, or outside since it's in fact exercising the whole graph?
    * The place for these composed tests seem to be the stack
  * As much data as possible should be collected about the running tests so 
that a broken
    charm can be debugged and fixed.
  * Provide rich artifacts for failure analysis
  * Ideally tests will be run in lock step mode, so that breaking charms can 
be individually
    identified, but this is hard because changes may be co-dependent
  * It would be nice to have interface-specific tests that can run against any 
charms that
    implement such interfaces. In addition to working as tests, this is also a 
pragmatic
    way to document the interface.
  * support gerrit-like topics?  (What's that? :-) i.e., change-sets (across 
different branches)
  * We need a way to know which charms trigger which tests
  * Keep it simple
  * James mentioned he'd like to have that done by Alpha 1 (December) so that 
he can take
    that into account for the OpenStack testing effort.
  ACTIONS:
  [niemeyer] spec
  [james-page] add openstack tests
  
  Proposal below is too complicated, rejected (Kept for posterity)
  
  Proposal:
  
  Each charm has a tests directory
  
  Under tests, you have executables:
  
  __install__ -- test to run after charm is installed with no relations
  
  Then two directories:
  provides/
  requires/
  
  These directories have a directory underneath for each interface
  provided/required. Those directories contain executables to run.
  
  The test runner follows the following method:
  
  deploy charm
  wait for installed status
  run __install__ script, FAIL if exits non-zero
  destroy service
  for interface in provides ; do
  calculate graph of all charms in store which require interface and all of 
its dependency combinations
  deploy requiring charm w/ dependencies and providing service
  add-relation between requiring/providing
  for test in provides/interface ; do
    run test with name of deployed requiring service
  for interface in requires ; do
  repeat process above with provides/requires transposed
  
  Each commit to any branch in charm store will queue up a run with only
  that change applied, none that have been done after it, and record
  pass/fail

-- 
Juju: automated testing of charms
https://blueprints.launchpad.net/ubuntu/+spec/servercloud-p-juju-charm-testing

-- 
Ubuntu-server-bugs mailing list
Ubuntu-server-bugs@lists.ubuntu.com
Modify settings or unsubscribe at: 
https://lists.ubuntu.com/mailman/listinfo/ubuntu-server-bugs


[Blueprint servercloud-p-juju-charm-testing] Juju: automated testing of charms

2012-01-17 Thread Mark Mims
Blueprint changed by Mark Mims:

Whiteboard changed:
  Status: spec needed, place-holder work items added to get a better
  handle on the scope of work. Items blocked on spec.
  
  Work Items:
  [niemeyer] write spec for charm testing facility: TODO
  implement specified testing framework: BLOCKED
  deploy testing framework for use with local provider: BLOCKED
  deploy testing framework for use against ec2: BLOCKED
  deploy testing framework for use against canonistack: BLOCKED
  deploy testing framework for use against orchestra (managing VMs instead of 
machines): BLOCKED
  write charm tests for mysql: BLOCKED
  [clint-fewbar] write charm tests for haproxy: BLOCKED
  [clint-fewbar] write charm tests for wordpress: BLOCKED
  [mark-mims] write charm tests for hadoop: BLOCKED
  [james-page] add openstack tests: BLOCKED
  
  Session notes:
  Welcome to Ubuntu Developer Summit!
  #uds-p #track #topic
  put your session notes here
  Requirements of automated testing of charms:
  * LET'S KEEP IT SIMPLE! :-)
  * Detect breakage of a charm relating to an interface
  * Identification of individual change which breaks a given relationship
  * Maybe implement tests that mock a relation to ensure implementers are 
compliant
  * Test dependent charms when a provider charm changes
  * Run test NxN of providers and requirers so all permutations are sane 
(_very_ expensive, probably impossible)
  * Run testing against multiple environment providers (EC2/OpenStack/BareMetal)
  * Notify maintainers when the charm breaks, rather than waiting for polling
  * Verify idempotency of hooks
  * Tricky to _verify_, and not an enforced convention at the moment, so 
not sure
  * be able to specify multiple scenarios
  * For functional tests, they are in fact exercising multiple charms. Should 
those sit
    within the charms, or outside since it's in fact exercising the whole graph?
    * The place for these composed tests seem to be the stack
  * As much data as possible should be collected about the running tests so 
that a broken
    charm can be debugged and fixed.
  * Provide rich artifacts for failure analysis
  * Ideally tests will be run in lock step mode, so that breaking charms can 
be individually
    identified, but this is hard because changes may be co-dependent
  * It would be nice to have interface-specific tests that can run against any 
charms that
    implement such interfaces. In addition to working as tests, this is also a 
pragmatic
    way to document the interface.
  * support gerrit-like topics?  (What's that? :-) i.e., change-sets (across 
different branches)
  * We need a way to know which charms trigger which tests
  * Keep it simple
  * James mentioned he'd like to have that done by Alpha 1 (December) so that 
he can take
    that into account for the OpenStack testing effort.
  ACTIONS:
  [niemeyer] spec
  [james-page] add openstack tests
+ [mark-mims] jenkins charm to spawn basic charm tests DONE
+ [mark-mims] basic charm tests... just test install hooks for now INPROGRESS
  
  Proposal below is too complicated, rejected (Kept for posterity)
  
  Proposal:
  
  Each charm has a tests directory
  
  Under tests, you have executables:
  
  __install__ -- test to run after charm is installed with no relations
  
  Then two directories:
  provides/
  requires/
  
  These directories have a directory underneath for each interface
  provided/required. Those directories contain executables to run.
  
  The test runner follows the following method:
  
  deploy charm
  wait for installed status
  run __install__ script, FAIL if exits non-zero
  destroy service
  for interface in provides ; do
  calculate graph of all charms in store which require interface and all of 
its dependency combinations
  deploy requiring charm w/ dependencies and providing service
  add-relation between requiring/providing
  for test in provides/interface ; do
    run test with name of deployed requiring service
  for interface in requires ; do
  repeat process above with provides/requires transposed
  
  Each commit to any branch in charm store will queue up a run with only
  that change applied, none that have been done after it, and record
  pass/fail

-- 
Juju: automated testing of charms
https://blueprints.launchpad.net/ubuntu/+spec/servercloud-p-juju-charm-testing

-- 
Ubuntu-server-bugs mailing list
Ubuntu-server-bugs@lists.ubuntu.com
Modify settings or unsubscribe at: 
https://lists.ubuntu.com/mailman/listinfo/ubuntu-server-bugs