On Thu, May 7, 2009 at 4:09 AM, Oleg Kobchenko <[email protected]> wrote:
>
> These issues are debatable.
>
>> 1. It should be easy to locate the failing test in the source code.
>
> The output tells the script name and verb name, which are unique.
> If it is hard to find a place within the code of a single verb,
> the verb should be revised or split.

The key is reducing the cognitive load. How many steps do we need to
get to the failure point in the code?

1. read and parse the failure
2. copy and paste the file location to open the file
3. ctrl+f to find the verb
4. scan through the lines

Each step takes time, and when several similar-looking test lines sit
in one verb, which happens often, step 4 can take a long time.

However, compare that to the one-step procedure:

1. Look at the failure and double-click on the failure report line
(which includes the file name and the line number).

You don't need to fill your working memory with unimportant details
and procedures.
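
To make the one-step procedure concrete, here is a minimal sketch in
Python (not J; the helper name `check` and its output format are my
own invention) of an assert-like verb that reports the caller's file
and line number, so the failure line itself is double-clickable in
most editors:

```python
import inspect

def check(condition, message=""):
    # Hypothetical helper: on failure, report the caller's file name
    # and line number, so an editor can jump straight to the spot.
    if not condition:
        caller = inspect.stack()[1]
        print(f"{caller.filename}:{caller.lineno}: FAIL {message}")
        return False
    return True

# A failing check prints something like
#   /path/to/two_test.py:14: FAIL 'qq123' should equal replaced string
ok = check('qq123' == 'qq123'.replace('qq', 'zz'),
           "'qq123' should equal replaced string")
```

The printed `file:line` pair is exactly what a GUI or editor needs to
jump to the failing assertion in one step.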

>
>> 2. It should be easy to get more information on the failure.
>
>   assert 'qq123' -: 'qq123'rplc'qq';'zz'
>
> means "Assert that X equals Y".
> The output clearly shows both the expected value and the expression.
> It does not really matter what was the wrong value--it should
> have been right. You don't want to mull over the output listing
> full of failures: you want to get it fixed quick and see all OKs.

When there is a failure, do you never inspect the actual result? I
find that hard to believe. There are difficult cases where you are
blind and biased, so you cannot easily spot your faulty logic or a
typo unless you see the comparison between the actual and the
expected values. With that information you can make a wiser choice. I
cannot agree with you, after many years of experience in test-driven
development with xUnits and after building unit-testing frameworks in
several languages.

See, for example, JSSpec: http://jania.pe.kr/jsspec/demo.html

A readable, informative failure message is always an asset, not a waste.
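
In Python terms (a sketch only; `should_match` is a hypothetical name,
not part of any framework mentioned here), the difference is between a
bare "assertion failure" and a message that carries both values:

```python
def should_match(actual, expected):
    # Hypothetical helper: on mismatch, report both values,
    # JSSpec-style, instead of a bare "assertion failure".
    if actual != expected:
        return f"FAIL: expected {expected!r} but got {actual!r}"
    return "OK"

# The failing rplc example from this thread, rendered in Python:
message = should_match('qq123'.replace('qq', 'zz'), 'qq123')
print(message)  # FAIL: expected 'qq123' but got 'zz123'
```

Seeing 'zz123' next to 'qq123' immediately suggests where the logic
went wrong, with no re-running or manual inspection.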


>
>
> "should_match" could be considered feature creep. For examples of
> existing approach, see code snippet at
>
>   http://en.wikipedia.org/wiki/Unit_testing#Design

The reason for introducing many asserts (assertEquals for each type,
for instance) was to support the Readable Failure pattern in an
environment where introspection isn't powerful enough. It is
considered an important practice in xUnit communities.

>
> also the latest generation of xUnits:
>
>   http://testng.org/doc/documentation-main.html#success-failure
>
> which uses standard Java "assert" mechanism.

Yes. However, those who have used xUnit for years would never use a
bare assert expression without a deep introspection facility behind
it.

Ruby, for example, has good introspection power; Python does too, if
not quite as much. There are xUnit frameworks in those languages where
you may write a simple assert and the framework will do the
introspection to find out the actual value.

See:

assert 3 == add(1,2)

When there is a failure, the framework looks into the line and prints
something like (suppose add has a bug and returns 4):

Failure: 3 == add(1,2)
3 is not equal to 4

It is only in such cases that a simple assert is preferred or
recommended.
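
A rough Python sketch of what such a framework does behind a bare
assert (my own illustration, not any framework's actual
implementation): parse the comparison, evaluate each side separately,
and report the actual value. Here `add` is deliberately buggy so the
assertion fails:

```python
import ast

def introspect_assert(expr, env):
    # Sketch of assertion introspection: split a comparison into its
    # operands, evaluate each side, and report the actual value.
    tree = ast.parse(expr, mode='eval').body
    if isinstance(tree, ast.Compare):
        left = eval(compile(ast.Expression(tree.left), '<expr>', 'eval'), env)
        right = eval(compile(ast.Expression(tree.comparators[0]),
                             '<expr>', 'eval'), env)
        if left != right:
            return f"Failure: {expr}\n{left} is not equal to {right}"
    return "OK"

def add(x, y):
    return x + y + 1   # deliberately buggy: add(1, 2) returns 4

msg = introspect_assert('3 == add(1, 2)', {'add': add})
print(msg)  # Failure: 3 == add(1, 2)
            # 3 is not equal to 4
```

The test author writes only the comparison; the framework supplies the
readable failure.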

JUnit, for instance, recommends assertThat:
http://www.artima.com/forums/flat.jsp?forum=155&thread=109515
They took this approach since Java does not have enough introspective
power.

>
> The thing to consider is how much additional value in relation to
> complicating the implementation does a new feature bring; in the
> meantime, are there more important features missing.

Yes, there might be.

>
> However, it would be good to see and discuss some concrete proposals.
>

Thank you for the discussion.

>
>
>> From: June Kim <[email protected]>
>>
>> Thank you for the link.
>>
>> However, there are two violations of principles of effective unit
>> testing in the addon script.
>>
>> 1. It should be easy to locate the failing test in the source code.
>> 2. It should be easy to get more information on the failure.
>>
>> To elaborate them, I'll take the example from the wiki page at
>> http://www.jsoftware.com/jwiki/Addons/general/unittest
>>
>>    unittest jpath '~addons/general/unittest/demo/two_test.ijs'
>> before bad
>> before ex2
>> before rplc
>> after rplc
>> Test: D:\Math\j602\addons\general\unittest\demo\two_test.ijs
>> bad: Error
>> |assertion failure: assert
>> |       assert'qq123'-:'qq123'rplc'qq';'zz'
>> ex2: OK
>> rplc: OK
>>
>> 1. We know that test_bad has failed. However, if the program told us
>> where the assert line lies (not the start of the verb test_bad) in the
>> source code, it would be more convenient to locate the place. All we
>> get from the current unittest program is just the name of the test
>> case.
>>
>> 2. We know that the test case fails. However, we don't get the full
>> information that is readily available and very helpful for finding the
>> cause of the bug. If the failure result showed us what the actual
>> value (the result of 'qq123'rplc'qq';'zz') was against the expected
>> value, we would be in a more advantageous position.
>>
>> Rectifying no. 2 isn't difficult with a new set of verbs, like should_match.
>>
>> Rectifying no. 1 is difficult.
>>
>> On Wed, May 6, 2009 at 8:11 PM, Oleg Kobchenko wrote:
>> >
>> > Such Unit Test framework already exists as an addon:
>> >  general/unittest
>> >
>> >  http://www.jsoftware.com/jwiki/Addons/general/unittest
>> >
>> > The mentioned issues were addressed. In fact J has very rich
>> > introspection facilities and exception handling, and being
>> > interpretive helps a lot.
>> >
>> > You might want to consider extending the framework with
>> > a GUI similar to JUnit/NUnit. The hierarchies of tests are
>> > in scripts and further in folder structures, which can be
>> > discovered by traversing the folder structure and naming
>> > conventions.
>> >
>> >
>> >
>> >
>> > ----- Original Message ----
>> >> From: June Kim
>> >>
>> >> I am trying to make a unit-testing framework for J, along the lines
>> >> of the other standard star-unit frameworks, for example JUnit:
>> >> http://en.wikipedia.org/wiki/JUnit
>> >>
>> >> However, I found the facilities for reflection and exception handling
>> >> in J are wanting compared to other languages like Java.
>> >>
>> >> ---------
>> >> sum=: +
>> >> mul=: *
>> >>
>> >> should_eq=: dyad :0
>> >>   :
>> >>   text=. ''
>> >>   if. -.x-:y do.
>> >>     text=. x ([,' -.@-: ',])&": y
>> >>   end.
>> >>
>> >>   text assert x-:y
>> >> )
>> >>
>> >> test_sum=: monad : 0
>> >>   10 should_eq sum/ 1 2 3 4
>> >>   10 should_eq sum/ 1 2 3
>> >> )
>> >> test_mul=: monad : 0
>> >>   24 should_eq mul/ 1 2 3 4
>> >>   6 should_eq mul/ 1 2 3 4
>> >> )
>> >>
>> >> all_test=: monad : 0
>> >>   test_sum''
>> >>   test_mul''
>> >> )
>> >>
>> >> all_test''
>> >>
>> >> ---------
>> >>
>> >> First limitation: I want to get the line number (in the script file,
>> >> along with the file name) when there is a failure. How do I get this?
>> >> (13!:13 was a possible path, but the line number there isn't the LN
>> >> from the top of the file.)
>> >>
>> >> Second limitation: each test case should run independently of the
>> >> others. That means a failure in test_sum shouldn't stop test_mul from
>> >> running. What is an easy and flexible way to do this? (Maybe making a
>> >> gerund array of test_sum and test_mul, and running through the gerund
>> >> array to execute each verb inside a try./catch. statement, handling
>> >> the failure appropriately afterwards.)
>> >>
>> >> Gathering all the verbs that start with "test_" wouldn't be that hard.
>> >> (reflection)
>> >> ----------------------------------------------------------------------
>> >> For information about J forums see http://www.jsoftware.com/forums.htm
>
>
>
