Hi Folks,
Just an FYI: the first pass of the ctakes-regression-test module has
been added.
In a nutshell, it will "auto-magically" do the following when mvn test is executed:
1) Run all pipelines defined in the desc/collection_processing_engine folder
2) Use all of the dummy test notes we've gathered in testdata/input/
3) Place all of the generated results into
testdata/generatedoutput/{cpefilename}/
4) Compare the XML results between:
testdata/expectedoutput
testdata/generatedoutput
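Conceptually, the comparison in step 4 boils down to something like the sketch below. This is illustrative only, not the actual module code: the class name XmlDirCompare is made up, and the real test may normalize whitespace or ignore volatile fields rather than require byte-identical files.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class XmlDirCompare {

    // Return true when every file under expectedDir exists at the same
    // relative path under generatedDir with identical content.
    // (Strict byte comparison; the real module may be more lenient.)
    static boolean compare(Path expectedDir, Path generatedDir) throws IOException {
        try (var files = Files.walk(expectedDir)) {
            return files.filter(Files::isRegularFile).allMatch(expected -> {
                Path generated = generatedDir.resolve(expectedDir.relativize(expected));
                try {
                    // Files.mismatch returns -1 when the two files are identical.
                    return Files.exists(generated)
                            && Files.mismatch(expected, generated) == -1L;
                } catch (IOException e) {
                    return false;
                }
            });
        }
    }

    public static void main(String[] args) throws IOException {
        // Tiny demo with temp directories standing in for testdata/.
        Path expected = Files.createTempDirectory("expectedoutput");
        Path generated = Files.createTempDirectory("generatedoutput");
        Files.writeString(expected.resolve("note1.xml"), "<cas/>");
        Files.writeString(generated.resolve("note1.xml"), "<cas/>");
        System.out.println(compare(expected, generated)); // identical trees -> true
    }
}
```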
I've only added the default pipeline and coreference so far. If someone has a
chance, they could add the other auxiliary pipelines (it is extremely easy:
just add the corresponding test cpe.xml to
desc/collection_processing_engine, run it once, and copy the xml files from
generatedoutput into expectedoutput). It will then do the
comparisons automatically in the future. No code changes required!
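The one-time "copy generated into expected" bootstrap step could be done by hand or scripted roughly as below. Again a hypothetical sketch (the class name BootstrapBaseline is made up); it just mirrors the generated tree into the expected tree so future runs have a baseline to diff against.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class BootstrapBaseline {

    // Recursively copy generatedDir into expectedDir, preserving the
    // relative layout (e.g. {cpefilename}/note1.xml).
    static void bootstrap(Path generatedDir, Path expectedDir) throws IOException {
        try (var files = Files.walk(generatedDir)) {
            for (Path src : (Iterable<Path>) files::iterator) {
                Path dst = expectedDir.resolve(generatedDir.relativize(src).toString());
                if (Files.isDirectory(src)) {
                    Files.createDirectories(dst);
                } else {
                    Files.createDirectories(dst.getParent());
                    Files.copy(src, dst, StandardCopyOption.REPLACE_EXISTING);
                }
            }
        }
    }

    public static void main(String[] args) throws IOException {
        // Demo with temp directories standing in for testdata/.
        Path generated = Files.createTempDirectory("generatedoutput");
        Path expected = Files.createTempDirectory("expectedoutput");
        Files.writeString(generated.resolve("note1.xml"), "<cas/>");
        bootstrap(generated, expected);
        System.out.println(Files.exists(expected.resolve("note1.xml"))); // true
    }
}
```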
I would strongly recommend that any new modules add their expected results to the
regression tests. There are many benefits to this:
- It automatically verifies that your new code did not break existing functionality.
- It catches cases where someone else or another component inadvertently breaks an
existing component.
--Pei
> -----Original Message-----
> From: Steven Bethard [mailto:[email protected]]
> Sent: Thursday, January 03, 2013 5:27 PM
> To: [email protected]
> Subject: Re: cTAKES Testing (CTAKES-84)
>
> On Jan 3, 2013, at 3:20 PM, "Chen, Pei" <[email protected]>
> wrote:
> > So I started looking into Jira:
> https://issues.apache.org/jira/browse/CTAKES-84
> > Independent of any unit tests, I was thinking of consolidating all of the
> > test
> dummy documents across the components and putting them into a directory
> within the clinical-pipeline component. Then create and run a single
> regression test pipeline that includes ALL of the components including
> optional ones (DrugNER, Smoking Status, etc.) and compare the xmi/cas
> output to make sure they all work together and generate the expected output.
> >
> > Open to other ideas though...
> >
> > This was already done one way or another (semi-manually?), but I think it
> would be nice to have it automatically run every time we cut a release.
>
> +1 on automatically running this kind of test before every release.
>
> Steve