Agreed.  Tests that rely on timing are going to be fragile and/or somewhat
unpredictable.  Darius, why does the unit test require a delay?
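
If the only thing the delay buys us is that two void operations end up with
different timestamps, one option (just a sketch -- I haven't looked at the
actual test, so the variable names below are made up) would be to pin the
dates explicitly instead of sleeping:

    // Sketch: give the separately-voided item and the patient explicit,
    // distinct dateVoided values so the outcome no longer depends on how
    // fast the machine runs the test.
    Date patientVoidDate = new Date(1315594800000L);           // arbitrary fixed instant
    Date earlierVoidDate = new Date(1315594800000L - 60000L);  // one minute earlier

    order.setDateVoided(earlierVoidDate);      // voided on its own, before the patient
    encounter.setDateVoided(patientVoidDate);  // voided along with the patient
    patient.setDateVoided(patientVoidDate);

    // Unvoiding the patient should then bring back the encounter but leave
    // the order voided, with no Thread.sleep() at all.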

-Burke

On Mon, Sep 12, 2011 at 4:02 PM, Jeremy Keiper <[email protected]> wrote:

> I really think time-sensitive tests are a bad idea.  There must be a way to
> test our time-sensitive code without using delays.  If I try to build (with
> tests) on a slow machine, some tests inevitably fail.
>
> Jeremy Keiper
> OpenMRS Core Developer
> AMPATH / IU-Kenya Support
>
>
>
> On Sat, Sep 10, 2011 at 2:11 PM, Darius Jazayeri <[email protected]> wrote:
>
>> Yes, we apply the same dateVoided (in milliseconds) to all items
>> recursively voided in one transaction.
>>
>> Given that we're currently not using voidReason, if you can see an
>> argument for not changing things, I'm happy to leave the code as-is, since
>> adding a delay fixed the unit test.
>>
>> -Darius
>>
>>
>> On Sat, Sep 10, 2011 at 10:39 AM, Burke Mamlin <[email protected]> wrote:
>>
>>> If (1) we are recording the same dateVoided for all data voided in a
>>> single transaction (i.e., even if the voiding takes several seconds) and (2)
>>> dateVoided is recorded with second or millisecond precision, then I don't
>>> think we need to require that the voidReason match, since leaving it out
>>> keeps open the option of making voidReason appropriate to each object
>>> (e.g., we *could* set the voidReason for obs to "encounter voided" even
>>> though the encounter gets a voidReason of "invalid encounter").  I'm not
>>> saying we need to change things if we're using the same voidReason across
>>> all objects voided in a single transaction; rather, I'm saying we don't
>>> necessarily need to *require* the same voidReason across all objects if
>>> dateVoided is precise enough to make duplicate dateVoided values for the
>>> same patient vanishingly unlikely.
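>>>
>>> Concretely, what I have in mind is roughly the following (just a sketch
>>> from memory, not the actual handler code; the null-safe comparison and
>>> service call are illustrative):
>>>
>>>     // Treat a child object as "voided with the patient" when it shares
>>>     // the patient's voidedBy and dateVoided, deliberately ignoring
>>>     // voidReason so each object can carry its own, more specific reason.
>>>     boolean voidedWithPatient =
>>>         OpenmrsUtil.nullSafeEquals(patient.getVoidedBy(), encounter.getVoidedBy())
>>>             && OpenmrsUtil.nullSafeEquals(patient.getDateVoided(), encounter.getDateVoided());
>>>     if (voidedWithPatient) {
>>>         Context.getEncounterService().unvoidEncounter(encounter);
>>>     }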
>>>
>>> -Burke
>>>
>>>
>>> On Fri, Sep 9, 2011 at 11:54 PM, Darius Jazayeri <[email protected]> wrote:
>>>
>>>> FWIW, it looks like the unit test in question is bad, in that it's
>>>> sensitive to the clock time, so putting a Thread.sleep(100) in the middle
>>>> of the test fixes it.
>>>>
>>>> But looking at the code I realized that we're not doing things exactly
>>>> right. Currently, when you unvoid a patient, it also unvoids any of their
>>>> encounters and orders with the same voidedBy and dateVoided. But we're not
>>>> looking for the same voidReason. Any reason we shouldn't be looking at that
>>>> too?
>>>>
>>>> -Darius
>>>>
>>>>
>>>> On Fri, Sep 9, 2011 at 8:27 PM, Darius Jazayeri <[email protected]> wrote:
>>>>
>>>>> So, it looks like the specific problem I was running into is that I was
>>>>> using a workspace folder under one of the Parallels-mapped network drives
>>>>> (\\psf\...) and one of Eclipse/Maven/Java/WinXP didn't like that. So after
>>>>> switching to c:\workspace, I was able to get the build to run.
>>>>>
>>>>> Of course then I hit a unit test failure, which happens on this Windows
>>>>> VM, but not on the up-to-date checkout I have on my regular OS X install...
>>>>>
>>>>> PatientDataUnvoidHandlerTest
>>>>> org.openmrs.api.handler.PatientDataUnvoidHandlerTest
>>>>>
>>>>> handle_shouldNotUnvoidTheOrdersAndEncountersThatNeverGotVoidedWithThePatient(org.openmrs.api.handler.PatientDataUnvoidHandlerTest)
>>>>> java.lang.AssertionError:
>>>>> at org.junit.Assert.fail(Assert.java:91)
>>>>> at org.junit.Assert.assertTrue(Assert.java:43)
>>>>> at org.junit.Assert.assertTrue(Assert.java:54)
>>>>> at org.openmrs.api.handler.PatientDataUnvoidHandlerTest.handle_shouldNotUnvoidTheOrdersAndEncountersThatNeverGotVoidedWithThePatient(PatientDataUnvoidHandlerTest.java:134)
>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>> at java.lang.reflect.Method.invoke(Method.java:597)
>>>>> at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
>>>>> at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
>>>>> at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
>>>>> at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
>>>>> at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
>>>>> at org.springframework.test.context.junit4.statements.RunBeforeTestMethodCallbacks.evaluate(RunBeforeTestMethodCallbacks.java:74)
>>>>> at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31)
>>>>> at org.springframework.test.context.junit4.statements.RunAfterTestMethodCallbacks.evaluate(RunAfterTestMethodCallbacks.java:82)
>>>>> at org.springframework.test.context.junit4.statements.SpringRepeat.evaluate(SpringRepeat.java:72)
>>>>> at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:240)
>>>>> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
>>>>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
>>>>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
>>>>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
>>>>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
>>>>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
>>>>> at org.springframework.test.context.junit4.statements.RunBeforeTestClassCallbacks.evaluate(RunBeforeTestClassCallbacks.java:61)
>>>>> at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31)
>>>>> at org.springframework.test.context.junit4.statements.RunAfterTestClassCallbacks.evaluate(RunAfterTestClassCallbacks.java:70)
>>>>> at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
>>>>> at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.run(SpringJUnit4ClassRunner.java:180)
>>>>> at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:50)
>>>>> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
>>>>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:467)
>>>>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:683)
>>>>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:390)
>>>>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:197)
>>>>>
>>>>> -Darius
>>>>>