Hi Sanjeewa,

This is excellent progress on achieving the coverage objectives. Your
feedback on test framework improvements is really valuable and will help us
improve the user experience of test automation. Please see my
comments/clarifications inline.

On Sat, Mar 1, 2014 at 3:49 PM, Sanjeewa Malalgoda <[email protected]> wrote:

> Hi All,
> The intention of this mail is to share our experiences writing integration
> tests for API Manager with our automation test framework (we used 4.2.5).
> Over the last few weeks we have been working on adding automated tests to
> cover some of the user scenarios in API Manager. The test automation
> framework provided great support for implementing new test cases; with its
> help we were able to reach nearly 60% line, 74% method, and 80% class
> coverage for the API Manager specific packages. We have discussed the
> process and posted updates on the developer mailing list under the subject
> "*[Dev][APIM] Status update on improving automated tests for API manager*".
> Based on the experience we gained, we would like to suggest some
> improvements and best practices for our automation framework and its
> users. Some of these improvements may already be available in the newer
> version of the automation framework (4.3.0); if so, please ignore them, as
> we plan to move to the next version in the future.
>
>
>    1. Having a single API for all HTTP/HTTPS client operations. Fix
>    some minor issues in the HTTP client.
>
> The test framework maintains three classes that facilitate HTTP/HTTPS
client operations. We will try to come up with a single class for this. A
Jira has been created to track the improvement -
https://wso2.org/jira/browse/TA-810

1. HttpsURLConnectionClient
2. HttpURLConnectionClient
3. HttpClientUtil

Can you create a TA Jira for the minor issues with the HTTP client?
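For the record, since HttpsURLConnection extends HttpURLConnection, a
single utility can already cover both schemes. Here is a minimal sketch of
what such a unified client could look like (the class and method names are
only illustrative - this is not the actual TA-810 design):

```java
import java.io.IOException;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.util.Map;

/** Illustrative sketch: HttpURLConnection handles both http:// and https:// URLs. */
public class UnifiedHttpClient {

    /** Builds a URL with an encoded query string, e.g. for GET requests. */
    public static String withQuery(String baseUrl, Map<String, String> params)
            throws IOException {
        StringBuilder sb = new StringBuilder(baseUrl);
        char sep = baseUrl.contains("?") ? '&' : '?';
        for (Map.Entry<String, String> e : params.entrySet()) {
            sb.append(sep)
              .append(URLEncoder.encode(e.getKey(), StandardCharsets.UTF_8.name()))
              .append('=')
              .append(URLEncoder.encode(e.getValue(), StandardCharsets.UTF_8.name()));
            sep = '&';
        }
        return sb.toString();
    }

    /** Performs a GET and returns the response body; works for http and https alike. */
    public static String doGet(String url, Map<String, String> headers)
            throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        conn.setRequestMethod("GET");
        for (Map.Entry<String, String> h : headers.entrySet()) {
            conn.setRequestProperty(h.getKey(), h.getValue());
        }
        try (InputStream in = conn.getInputStream()) {
            return new String(in.readAllBytes(), StandardCharsets.UTF_8);
        } finally {
            conn.disconnect();
        }
    }
}
```

The scheme-specific TLS setup (trust stores, hostname verification) that
currently justifies the separate classes could then live behind this single
entry point.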


>    1. Make all base test cases environment independent. With that we will
>    be able to run any test in builder-enabled or builder-disabled mode, and
>    against a distributed deployment or a single-server deployment.
>
> This is related to test design. You can implement your own base test case
class to provide support for all test execution environments. However, I
admit that the approach wasn't very straightforward with our old framework.
The new test framework (4.3.0) has enhanced support for this. I will come
up with a sample test case using the new framework to demonstrate the usage.
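To illustrate the general pattern (the names below are made up for the
sketch; the 4.3.0 framework has its own API for this), a base class can
resolve its endpoints from a system property, so the same test runs
unchanged against an embedded builder-mode server or an external
distributed deployment:

```java
/**
 * Illustrative sketch only: a base test class that resolves the backend URL
 * from a system property, so the same test runs in builder mode (default
 * localhost endpoint) or against an external distributed deployment
 * (endpoint supplied via -Dtest.backend.url=...), with no code changes.
 */
public class EnvironmentAwareBaseTest {

    /** Default endpoint used in builder-enabled single-server mode. */
    private static final String DEFAULT_BACKEND = "https://localhost:9443";

    /** Returns the backend URL, preferring an externally configured endpoint. */
    public String getBackendUrl() {
        return System.getProperty("test.backend.url", DEFAULT_BACKEND);
    }

    /** True when tests run against an externally managed (distributed) setup. */
    public boolean isExternalDeployment() {
        return System.getProperty("test.backend.url") != null;
    }
}
```

Concrete test classes would extend this and never hard-code host or port,
which is what makes them environment independent.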

>
>    1. It would be ideal if we could start a new server with a predefined
>    configuration set. For example, for the API Manager BAM integration we
>    can ask the user to put a BAM distribution into some folder and add its
>    path to a variable named bam.server.home. Then we can put all the
>    configurations into the src/resources/conf directory. As most
>    Carbon-based servers share a similar layout, we would be able to start
>    any server a test needs like this:
>    [IntegrationTestUtils.startCarbonServer("bam.server.home", "confDir")].
>    This could be implemented as a platform scenario, but sometimes we might
>    need to download an already released pack and test the integration
>    scenario with it (in the APIM case, testing compatibility of APIM 2.0.0
>    with the different BAM versions 2.0.0, 2.3.0, and 2.4.0).
>
> This comes under deployment automation, which we haven't started yet;
there is an RM item to track the feature implementation. Also, IMO, running
the deployment on a single server is not recommended - we need to go for a
distributed setup.
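The configuration-overlay half of the proposal is easy to sketch, though.
Something like the following (purely illustrative - startCarbonServer and
its naming are Sanjeewa's proposal, not an existing API) could copy a
test-specific conf directory over a pristine distribution before startup:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;
import java.util.stream.Stream;

/** Illustrative sketch of the config-overlay step of the proposed utility. */
public class ConfigOverlay {

    /**
     * Recursively copies every file under confDir into the matching location
     * under serverHome/repository/conf, overwriting the shipped defaults, so
     * the server starts with the test-specific configuration set.
     */
    public static void applyConfig(Path serverHome, Path confDir) throws IOException {
        Path target = serverHome.resolve("repository").resolve("conf");
        try (Stream<Path> files = Files.walk(confDir)) {
            for (Path src : (Iterable<Path>) files::iterator) {
                Path dst = target.resolve(confDir.relativize(src).toString());
                if (Files.isDirectory(src)) {
                    Files.createDirectories(dst);
                } else {
                    Files.createDirectories(dst.getParent());
                    Files.copy(src, dst, StandardCopyOption.REPLACE_EXISTING);
                }
            }
        }
    }
}
```

The actual server launch and port-offset handling would sit on top of this,
and that is the part that belongs in the deployment automation effort.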

>
>    1. It would also be ideal if we could start a few sample web services
>    as part of automation test framework initialization (as external
>    services), so that we can test against them. At the moment we are using
>    some externally hosted services to test our products. We could host a
>    few sample services on SLive and use them in test cases.
>
> A RESTful backend was provided for the ESB integration tests. The same
backend can be provided for APIM as well. A Jira has been created to track
the requirement - https://wso2.org/jira/browse/TA-811

Let's try to use APIM itself to host the SOAP services.

>
>    1. A configuration rewriter utility class. When we run tests with the
>    builder enabled, we sometimes need to edit configuration files based on
>    the scenario. As an example, consider the same test scenario with and
>    without caching: we need to enable caching by editing the cache settings
>    in the config file. We could keep a few configuration files per
>    scenario, but maintaining all of them would be a nightmare.
>
> Why can't we edit the configuration files at test run time and restart the
server? In that case, there is no need to maintain a separate configuration
file for each test case. You can use the TestNG @Factory annotation to
rerun the same test with different configurations.

Are there any other suggestions you would like to see implemented through
the test framework?

Usage of @Factory at -
https://docs.wso2.org/display/TA430/Multiple+User+Mode+Test+Case
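To make the "edit at run time, then restart" approach concrete, here is a
minimal sketch of the rewrite step (the file layout and the enableCache
element are just an example for illustration, not the actual APIM config
schema):

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

/** Illustrative sketch: rewrite one element of an XML config file in place. */
public class ConfigRewriter {

    /**
     * Replaces the text content of the given element, e.g. turning
     * <enableCache>false</enableCache> into <enableCache>true</enableCache>.
     * Regex-based on purpose to keep the sketch short; a real utility
     * should use a proper XML API instead.
     */
    public static void setElement(Path configFile, String element, String value)
            throws IOException {
        String xml = new String(Files.readAllBytes(configFile), StandardCharsets.UTF_8);
        String updated = xml.replaceAll(
                "<" + element + ">[^<]*</" + element + ">",
                "<" + element + ">" + value + "</" + element + ">");
        Files.write(configFile, updated.getBytes(StandardCharsets.UTF_8));
    }
}
```

A @Factory-created test instance would call this in its setup, restart the
server, and run the same assertions with and without caching enabled.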


>    1. Decouple the base test cases (ServerStartupBaseTest) from the test
>    scenarios and make them configurable.
>
> Base test classes can be configured through testng.xml files. However, we
have used a special custom annotation to skip these base classes in
platform test execution mode. Can you elaborate more on this requirement?

>
>    1. While writing APIM test modules for Jaggery-based applications we
>    wanted to copy sample Jaggery applications and validate the output
>    responses. We have already written some utility methods; we can improve
>    them and use the Jaggery test framework as well.
>
> +1

>
>    1. Adding a code-level breakdown to the EMMA reports (now we have
>    package-, class-, and method-level breakdowns). We might need to point
>    EMMA's runtime at the source code for this. With this we will be able
>    to identify missing scenarios and code blocks with minimal effort.
>
> A JIRA has already been created - https://wso2.org/jira/browse/TA-784

>
>    1. We have to think about some advanced scenarios like multiple user
>    stores and multiple database types. At this stage, for APIM-related
>    scenarios, we used Puppet scripts to set up a distributed environment
>    with MySQL and ran the integration tests against the configured
>    deployment. So Puppet will take care of setting up environments with
>    different database types, user stores, etc.
>
>
Multiple user store cases were covered under the IS integration tests, so
the same test cases can be used as a reference here.

You will need Puppet scripts covering different test dimensions to cover
the different DB types.

>
>    1. Is there any possibility of generating EMMA code coverage for
>    distributed external tests (builder disabled) as well? It would be
>    useful, IMO.
>
> We haven't thought about generating code coverage for distributed setups
yet. There is no direct way to do this, so it will involve some R&D effort.


>    1. Support for sending bulk requests (we are using JMeter tests as a
>    workaround).
>
>
Did you mean volume or concurrency tests? Running load tests during the
build phase (builder mode) is not recommended due to the high load on the
builder machines. Can you elaborate more on this requirement?

>
>
>
> *Things we need to focus when we develop tests using automation framework*
>
>    - Write a single test case that can run in builder-enabled
>    single-server mode or against an external distributed/clustered
>    deployment (in the APIM case, single-server mode versus a deployment
>    with the gateway, key manager, store, and publisher separated).
>    - Clean up all data added to the system during the tests.
>    - Always make all external services and test endpoints configurable.
>    - Use the EMMA report to check code coverage and identify missing
>    scenarios/methods/classes.
>    - Use proper EMMA instrumentation and filtering according to your
>    product's use cases.
>
>
> If we can add a considerable number of test cases covering the product's
> user scenarios, that would be very useful when we release new product
> versions frequently. We won't have to worry about already released
> features unless there is an API- or data-level change: the automated tests
> will take care of the released features, and we can pay more attention to
> newly added ones. It would be ideal if we could add automated tests to
> every product. At the initial stage that might be a bit difficult, but
> once it's done it will help us save Dev and QA time.
>

Absolutely +1

Thanks,
Krishantha.


> Thanks krishantha, nuwanW, dharshana and team for the support you all
> provided.
>
> Thanks,
> sanjeewa.
> --
>
> *Sanjeewa Malalgoda*
> Senior Software Engineer
> WSO2 Inc.
> Mobile : +94713068779
>
> blog : http://sanjeewamalalgoda.blogspot.com/
>
>
>


-- 
Krishantha Samaraweera
Senior Technical Lead - Test Automation
Mobile: +94 77 7759918
WSO2, Inc.; http://wso2.com/
lean . enterprise . middleware.
_______________________________________________
Architecture mailing list
[email protected]
https://mail.wso2.org/cgi-bin/mailman/listinfo/architecture
