Hi Leonid,

Thanks again for the well-thought-out response. If you don't mind, I'd
like to ask a few more questions.
Leonid Lastovkin wrote:
> Patrick,
>
> I think your situation is very specific, and cannot be implemented
> with just 20 lines of code. But it should not be hard to do by
> writing some of your own classes.
>
> First of all, it looks like you will be creating some files that the
> test suite will depend on. If you lose them or they become corrupt,
> then your tests won't be useful, at least for some time.

Yeah, I've thought about that. I guess I can always check the files
into the source control system; that way they won't get lost and they
can always be checked out on the build machine easily. The only problem
is that they could be large. Mmm, I'll have to think about a proper way
of storing them. A database could work but would be a lot of hassle,
although a database server would only have to be located in one spot
and could be reachable from everywhere. So many questions. I'll see
about all of this when I finally start writing the tests :-)

> The following stuff:
>
> [Source(Reader = typeof(MatrixEquationReader), File="SomeSillyFile.mtx")]
> [Compare(typeof(MatrixEquationComparer))]
> [Target(Writer=typeof(MatrixEquationWriter), File="SomeSillyResultFile.xml")]
>
> is syntactic sugar for a few dozen lines of C# code. I think you are
> trying to do something specific, and so you have to decide the format
> for "SomeSillyFile.mtx" and "SomeSillyResultFile.xml". By XML file
> you probably do not mean the XML test report; I do not think you want
> that anyway. I think you will want to write a Reader class and a
> Writer class yourself. My advice would be: once you know what you
> need, pick file formats that are as simple as possible. You can end
> up spending a large chunk of your time writing a file parser. Plain
> ASCII comma-separated values (CSV) is a good format for storing
> tabular data, and there are some libraries for it.

I probably didn't explain myself properly, but the main reason for the
attributes is that I'm not only going to be testing matrix solvers with
these tests. They will be used for all the verification tests I need,
ranging from matrix solvers (probably the first application) to my
fluid flow solver and so on. So there will be many different types of
solvers being tested, and for each solver there will be many different
tests. What I'm trying to do is reduce the overhead inside the tests,
not because of the typing but because I want to prevent unnecessary
copy/paste and unnecessary coupling between the data and the test. So I
figured that a test basically reads some test data, runs the solver on
that data, compares the output with the baseline and then writes the
results back to a file. That's how the attributes came about. (I've put
a rough sketch of what such a test might look like, expanded by hand,
below.)

> Now, I am not sure how extensively you want to test your solver and
> how many data sets you are creating (I'm guessing more than 10, if
> you bothered to look into fancy test decorators in the first place).

Eh, yeah, many more than 10. I don't know exactly how many, but I can
imagine that there will be hundreds or maybe even thousands of these
tests.

> That will affect how you structure the directory containing input
> files / output files. Suppose you can categorize the problems being
> solved into 3 types: "TypeA", "TypeB", "TypeC". Then, you could
> organize the directory like so:
>
> <stuff on file structure snipped>
>
> and then have a Scanner class, which scans the directory and builds
> up an object, which can iteratively return to you a pair of files of
> type a, b, c or all of them.

Good idea. I'll have a look at that.
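Just to make the attribute idea concrete, here is roughly what I
picture one of these tests looking like once the sugar is expanded by
hand. This is only a sketch: MatrixEquationReader,
MatrixEquationComparer and MatrixEquationWriter are the hypothetical
classes from my earlier mail, and the MatrixSolver, MatrixEquation and
MatrixSolution types (and all the method names) are made up as well.

using MbUnit.Framework;

[TestFixture]
public class MatrixSolverVerificationTests
{
    [Test]
    public void SolveAndCompareAgainstBaseline()
    {
        // Read the stored problem definition (hypothetical Reader class).
        MatrixEquationReader reader = new MatrixEquationReader();
        MatrixEquation equation = reader.Read("SomeSillyFile.mtx");

        // Run the solver under test on the input data.
        MatrixSolver solver = new MatrixSolver();
        MatrixSolution result = solver.Solve(equation);

        // Compare the new result against the stored baseline within
        // some tolerance (hypothetical Comparer class).
        MatrixSolution baseline = reader.ReadBaseline("SomeSillyBaseline.mtx");
        MatrixEquationComparer comparer = new MatrixEquationComparer();
        Assert.IsTrue(comparer.AreSimilar(baseline, result),
            "Result deviates too far from the baseline.");

        // Write the new result out for later trend analysis
        // (hypothetical Writer class).
        MatrixEquationWriter writer = new MatrixEquationWriter();
        writer.Write("SomeSillyResultFile.xml", result);
    }
}

The attributes would just fold the reader/comparer/writer boilerplate
away so that only the solver call is left in the test body.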
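And here is how I picture your Scanner suggestion working, as a minimal
sketch. I'm assuming a layout with TypeA / TypeB / TypeC
subdirectories, where each *.mtx input file has a matching *.baseline
file sitting next to it; the class and method names are invented:

using System.Collections.Generic;
using System.IO;

// Pairs an input file with the baseline it should be checked against.
public class TestCasePair
{
    public readonly string InputFile;
    public readonly string BaselineFile;

    public TestCasePair(string inputFile, string baselineFile)
    {
        InputFile = inputFile;
        BaselineFile = baselineFile;
    }
}

public class TestCaseScanner
{
    private readonly string rootDirectory;

    public TestCaseScanner(string rootDirectory)
    {
        this.rootDirectory = rootDirectory;
    }

    // Yields one pair per input file in the given category
    // subdirectory ("TypeA", "TypeB", ...), assuming the baseline
    // sits next to the input with a ".baseline" extension.
    public IEnumerable<TestCasePair> ScanCategory(string category)
    {
        string directory = Path.Combine(rootDirectory, category);
        foreach (string input in Directory.GetFiles(directory, "*.mtx"))
        {
            yield return new TestCasePair(
                input, Path.ChangeExtension(input, ".baseline"));
        }
    }
}

A test could then just loop over ScanCategory("TypeA") and run the
solver once per pair.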
I was planning on allowing directories to be scanned, so I'll have a go
based on your suggestions. Thanks :-)

> I do not know if the algorithm that you are using for solving changes
> over time, but you want to make sure that it is consistent. If it
> does, then you probably want to store the results in time-stamped
> files, and so the logic of the scanner class would become more
> complex.

Good idea. I'll add that to the list of things to implement :-)

> Another thing I am not sure about is how you compare the results. I
> am guessing that the results are different each time because it is
> not possible to come up with a closed-form solution, and the
> algorithm uses some sort of numerical approximation. You'd have to
> write the comparer class yourself and think about how it should work.
> It would have to make sure that one vector is similar to another.
> Well, it sounds to me like you want to create a baseline (several
> results) using your algorithm, when you know that your algorithm
> works correctly, and save those somewhere. Then, during testing you
> would solve the same equation several times, and then make sure that
> the N results in the baseline and the N results that you just got
> look reasonably close to each other. If the solver utilizes random
> number generators, then you may be able to find a stored result A and
> a new result B where A and B are not terribly similar (because A may
> be in the left tail of a distribution, and B may be in the right
> tail). But, when extracting the statistics about N stored results and
> N new results, the numbers should be consistent. You have to know how
> the algorithm works before you can decide what deviation is
> acceptable. Or, you have to know what sort of deviation between
> consecutive results the client is willing to tolerate. ... You
> probably know more than I do about how to compare the results.

This is pretty much how I was planning to implement it. I'm not
entirely sure about the exact implementation, but I'll work that out
when I need to. I'd probably track the results between different test
runs so that I can see if an implementation is getting better (more
accurate) or worse. It would also be nice to be able to link changes to
bug fixes and so on. By the way, it sounds like you have a lot more
knowledge about numerics than you give yourself credit for, and you
probably know more than I do ;-) Would you mind telling me what field
your work is in?

> However, I do not think you really need a decorator like this one:
> [Compare(typeof(MatrixEquationComparer))]
>
> It looks like the Comparer class won't be simple, so what's the big
> deal if you add a few more lines of code to your test suite? Just
> initialize the comparer in the constructor or init, and use it inside
> of a test. The decorator is not really shortening your test.

The reason for the comparer is that I want to be able to write
verification tests for many different algorithms, but I only want to
write the comparer once, and if possible I want to keep the tests
really simple. I've noticed in my current tests that much of a test
goes into setting up the test conditions and then verifying the
results, so I'm trying to find a way to provide a verification API
(through the comparer and friends) so that the tests are shorter,
simpler and clearer about what they actually test.
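In its simplest form I imagine the comparer doing little more than an
element-wise check within an absolute tolerance, something like the
rough sketch below (the plain double[] representation and the way the
tolerance is chosen are placeholders):

using System;

// Compares two result vectors element by element, allowing a
// configurable absolute tolerance per element.
public class VectorComparer
{
    private readonly double tolerance;

    public VectorComparer(double tolerance)
    {
        this.tolerance = tolerance;
    }

    public bool AreSimilar(double[] baseline, double[] result)
    {
        if (baseline.Length != result.Length)
            return false;

        for (int i = 0; i < baseline.Length; i++)
        {
            if (Math.Abs(baseline[i] - result[i]) > tolerance)
                return false;
        }
        return true;
    }
}

The statistical comparison over N runs that you describe would then sit
on top of something like this.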
Anyway, that's some more clarification about my (possibly silly) ideas.
Any more comments? I'd be grateful for them.

Regards,

Petrik
