1. (a) The one that comes with NAnt.
2. I use <solution> to compile my projects, so the references are picked up
from the Visual Studio projects in the solution, where they're already sorted out.
3. <nunit2>
4. I usually compile with VS.NET, since finding syntax errors is easier from its
task list, and then run the tests with NAnt in a command-line window. My tests
live in a separate "Testing" console project, and I use the <test> element
of the <nunit2> task to specify a .config file for the tests (there's a rough
build-file sketch below these answers).
5. I look at them on screen, log them to an XML file, and append
some simple stats to a CSV so I can graph the tests/failures/successes
over time.
6. What I have, and I'm pretty confident it won't scale, is a NAnt
task that restores my database as db_test, and then all my tests run against
db_test. The next time I run the unit tests, db_test gets reset, so I'm
always working with a clean slate (a rough sketch of the restore step follows
this list). The connection strings are all in the app.config for my Testing
console app. My tests add things to the database, then create new objects to
pull that data back in, checking that input values come out the same, that a
process generated the right number of rows, that it didn't generate duplicate
rows, things like that.
That approach works for me, but I doubt it's the best solution for DB
testing. I think it only works because my database doesn't really have any
data in it yet, about 8 MB. On a larger database the restore would take more
than a couple of seconds, so I'll have to figure out something else
eventually. A few other solutions I've tried or read about:
a. Wrap everything in a "BEGIN TRAN...ROLLBACK", which only works if your
application uses the same connection for all DB calls (not likely).
b. Make your tests clean up after themselves: delete everything they insert,
and so on. I tried this at first, but it was really, really difficult, and I
found myself changing my API, adding class members and output parameters just
so the tests could get back the information needed to delete everything.
c. Use a mock objects scheme (http://www.mockobjects.com/Faq.html#head-76f6eae2365ef12ff3dcb1bbfef7ab0a4dc77c92).
This didn't work all that well with my O/R mapping, and a lot of what I wanted
to test was stored procedures, so it wasn't a good fit for me.
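To make the answers above more concrete, my compile and test targets look
roughly like the sketch below. The solution file name, project paths, and
output directory are made up for illustration; the <solution>, <nunit2>,
<test>, and <formatter> elements are the standard NAnt ones, but double-check
the attributes against the NAnt release you're using.

    <target name="compile">
        <!-- Let the VS.NET project files drive the references;
             NAnt just builds the whole solution -->
        <solution solutionfile="MyApp.sln" configuration="Debug" />
    </target>

    <target name="test" depends="compile">
        <nunit2>
            <!-- Plain formatter prints results to the console,
                 Xml formatter writes an .xml result file per test assembly -->
            <formatter type="Plain" />
            <formatter type="Xml" usefile="true" extension=".xml" outputdir="results" />
            <!-- appconfig points the runner at the Testing project's .config,
                 which is where the connection strings live -->
            <test assemblyname="Testing/bin/Debug/Testing.exe"
                  appconfig="Testing/App.config" />
        </nunit2>
    </target>

The XML result files are what feeds the stats/graphing step mentioned in
answer 5.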
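The database reset from answer 6 isn't anything fancy; with plain NAnt it
boils down to something like an <exec> that shells out to osql and restores a
backup of the real database under the db_test name. This is only a sketch,
not my exact task: the server name, backup path, and logical file names are
placeholders you would have to adjust for your own database.

    <target name="restore-testdb">
        <!-- Recreate db_test from the latest backup so every test run
             starts from the same known state -->
        <exec program="osql">
            <arg value="-E" />           <!-- trusted connection -->
            <arg value="-S" />
            <arg value="localhost" />    <!-- placeholder server name -->
            <arg value="-Q" />
            <arg value="RESTORE DATABASE db_test FROM DISK='C:\backups\mydb.bak' WITH REPLACE, MOVE 'mydb_Data' TO 'C:\data\db_test.mdf', MOVE 'mydb_Log' TO 'C:\data\db_test.ldf'" />
        </exec>
    </target>

The test target then just adds restore-testdb to its depends list so the
restore happens before the tests run.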
I wrote up a little blurb with more information about how I use NAnt with
NUnit (and NDoc) on my blog:
Thanks,

From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]] On Behalf Of Jaroslaw Kowalski
Sent: Sunday, December 12, 2004 1:05 PM
To: [EMAIL PROTECTED]
Subject: [Nant-users] NUnit with NAnt

Hi guys,
I'd like to know how you use NUnit with NAnt.

Specifically, I'd like to hear about the pros and cons of the way you use NUnit:

1. Which version of nunit.framework.dll do you use:
   a) the one that comes with NAnt (framework-specific),
   b) the one installed by the NUnit installer,
   c) some binary version that comes with your application, or
   d) some version from the GAC?

2. What's the best way to locate the nunit.framework.dll to be used in <references>?

3. Do you <exec> nunit-console.exe or use <nunit2>?

4. Do you support dual compilation/running of test cases (using both VS.NET
and NAnt)? How do you deal with it? What about configuration file management?

5. What do you do with the test results (email, store in VCS, discard)?

6. (this is not related to NAnt) What's the best way to test your application
against a database? Do you use a separate database? How do you create it? How
do you specify the configuration settings? How do you check your database
contents for correctness? How do you deal with transactions in the test cases?
Jarek