> On Mar 23, 2015, at 9:01 PM, Amila Jayasekara <[email protected]> wrote:
>
>> On Sun, Mar 22, 2015 at 11:46 PM, Suresh Marru <[email protected]> wrote:
>> Hi All,
>>
>> The current integration tests cover a decent amount of basic capabilities. These
>> tests are useful to ensure all the components are working in harmony. But as
>> we move towards a 1.0 release and announce Airavata is ready for production,
>> we also need to do significant scalability and reliability tests (run similar
>> tests over a period of time). How about we program against the Java SDK and
>> write simple cron-like test triggers?
>
> Hi Suresh,
>
> What do you mean by "simple cron-like triggers"?
Hi Amila,

I mean that once we program against the client SDK, we can execute the tests on demand, or deploy an Airavata server and schedule tests to run periodically. For instance, there were some occasional issues with ZooKeeper becoming incoherent (now fixed), but if such sporadic issues come up again, it would be good to proactively uncover and fix them. Similar issues could come from the compute resources; these are not Airavata issues per se, but we need to ensure Airavata can tolerate them well. So my suggestion is to come up with a framework to battle-test the system in conditions as close to real-world usage as possible.

Suresh

> Thanks
> -AJ
>
>> We could start with something like: assume 10 gateways are sending 100
>> requests each, and establish that as the baseline.
>>
>> Any thoughts?
>>
>> Suresh
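The "cron-like trigger" idea above can be sketched in plain Java with a `ScheduledExecutorService`. This is a minimal illustration only: `runSmokeTest()` is a hypothetical placeholder, and a real implementation would call the Airavata client SDK (for example, submit a small experiment and verify it completes), run indefinitely instead of three demo iterations, and alert on failure.

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

// Minimal sketch of a "cron-like test trigger" in plain Java.
// runSmokeTest() is a hypothetical placeholder: a real check would be
// built on the Airavata client SDK.
public class PeriodicHealthCheck {
    static final AtomicInteger passed = new AtomicInteger();

    // Hypothetical check; always succeeds in this demo.
    static boolean runSmokeTest() {
        return true;
    }

    public static void main(String[] args) throws InterruptedException {
        ScheduledExecutorService scheduler =
                Executors.newSingleThreadScheduledExecutor();
        // For the demo, stop after three scheduled runs; a real deployment
        // would run indefinitely (e.g. every few minutes).
        CountDownLatch remaining = new CountDownLatch(3);

        scheduler.scheduleAtFixedRate(() -> {
            if (remaining.getCount() == 0) return; // demo guard
            if (runSmokeTest()) {
                passed.incrementAndGet();
            }
            remaining.countDown();
        }, 0, 100, TimeUnit.MILLISECONDS);

        remaining.await();
        scheduler.shutdownNow();
        System.out.println("passed=" + passed.get()); // prints passed=3
    }
}
```

The same skeleton could drive the baseline scenario mentioned in the quoted message: replace the single smoke test with a task that simulates 10 gateways submitting 100 requests each, and track pass/fail counts over time.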
