Regarding point 2, it's a little strange that test_spilling is being
skipped; I think that one should be run.
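
(If it helps: running the py.test suite with the -rs flag prints the
reason for each skip, so it should show why test_spilling's cases are
still being skipped; the "s" markers in its output line below are skips.)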

On Tue, Jul 19, 2016 at 8:39 AM, Tim Armstrong <[email protected]>
wrote:

> It looks like the benchmark-test issue is something to do with the
> granularity of the clock. It can get stuck in an infinite loop if the
> function call below always takes less than the smallest measurable unit of
> time (i.e. Start() and Stop() are called in the same time quantum).
>
>   while (sw.ElapsedTime() < target_cycles) {
>     sw.Start();
>     function(batch_size, args);
>     sw.Stop();
>     iters += batch_size;
>   }
>
> We use Intel's rdtsc instruction for a timer here, so I guess whatever PPC
> alternative you used may work a little differently. This is probably ok,
> but it's possible that it could affect timers elsewhere in Impala.
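>
> A quick way to compare the two counters' granularity is to read the
> counter twice back-to-back and print the delta. A minimal sketch,
> assuming GCC (__rdtsc() comes from <x86intrin.h> on x86-64;
> __builtin_ppc_get_timebase() is the GCC builtin on POWER):
>
>   #include <cstdint>
>   #include <cstdio>
>   #if defined(__x86_64__)
>   #include <x86intrin.h>
>   static uint64_t ReadCycleCounter() { return __rdtsc(); }
>   #elif defined(__powerpc64__)
>   static uint64_t ReadCycleCounter() { return __builtin_ppc_get_timebase(); }
>   #else
>   #error "add a counter read for this platform"
>   #endif
>
>   int main() {
>     uint64_t a = ReadCycleCounter();
>     uint64_t b = ReadCycleCounter();
>     // The POWER timebase typically ticks more slowly than the core
>     // clock, so a delta of 0 here would be consistent with Start() and
>     // Stop() landing in the same quantum.
>     std::printf("back-to-back delta: %llu ticks\n",
>                 (unsigned long long)(b - a));
>     return 0;
>   }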
>
> One solution would be to increase the default batch size.
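>
> A minimal sketch of that idea applied to the loop above (the doubling
> heuristic is my suggestion, not existing Impala code; sw, function,
> batch_size, iters, and target_cycles are as in the snippet):
>
>   while (sw.ElapsedTime() < target_cycles) {
>     uint64_t before = sw.ElapsedTime();
>     sw.Start();
>     function(batch_size, args);
>     sw.Stop();
>     iters += batch_size;
>     // If the stopwatch did not advance, the whole batch fit inside one
>     // clock quantum; double the batch so the next measurement spans at
>     // least one tick and the loop can make progress.
>     if (sw.ElapsedTime() == before) batch_size *= 2;
>   }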
>
> On Tue, Jul 19, 2016 at 5:29 AM, Valencia Serrao <[email protected]>
> wrote:
>
>> Hi Tim,
>>
>> Following are some observations:
>>
>> 1. *BE test issue: benchmark-test hangs*
>> I put trace logs like the ones below into benchmark.cc:
>>
>>   while (sw.ElapsedTime() < target_cycles) {
>>     LOG(INFO) << " in while (sw.ElapsedTime() < target_cycles)";
>>     sw.Start();
>>     function(batch_size, args);
>>     sw.Stop();
>>     iters += batch_size;
>>     LOG(INFO) << " in while: sw.ElapsedTime() = " << sw.ElapsedTime();
>>     LOG(INFO) << " in while: iters = " << iters;
>>   }
>>
>> In Release mode, I observed that *sw.ElapsedTime()* stays constant and
>> never increases, so the loop never exits and benchmark-test hangs. In
>> Debug mode, *sw.ElapsedTime()* keeps increasing, the loop exits, and
>> benchmark-test does not hang (see the cross-check sketched after these
>> points). I'm working on this issue; if you could give any pointers,
>> that would be really great.
>>
>> 2. *Custom cluster tests:* I have included the code changes in my
>> branch, and many of the 36 previously skipped tests now run and pass,
>> with the following exception (compared to the output in
>> https://issues.cloudera.org/browse/IMPALA-3614):
>> custom_cluster/test_spilling.py sss.
>>
>> *Current CC test stats:* 34 passed, 7 skipped, 3 warnings.
>>
>> 3. *End-to-End tests:* I haven't been able to dive into the EE tests
>> yet. I will let you know more about them as soon as I'm done.
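>>
>> As a cross-check on point 1 (a hypothetical standalone snippet, not
>> Impala code; the batch size of 10 and the stub body are placeholders):
>> timing one batch with std::chrono::steady_clock, which is independent
>> of the cycle-counter-based StopWatch, would show whether a batch really
>> finishes in less than one StopWatch tick in Release mode:
>>
>>   #include <chrono>
>>   #include <cstdio>
>>
>>   // Trivial stand-in mirroring the benchmarked function's call shape.
>>   static void function(int batch_size, void* args) {
>>     (void)args;
>>     for (volatile int i = 0; i < batch_size; ++i) {}  // placeholder work
>>   }
>>
>>   int main() {
>>     auto t0 = std::chrono::steady_clock::now();
>>     function(10, nullptr);  // one batch, as in the benchmark loop
>>     auto t1 = std::chrono::steady_clock::now();
>>     long long ns = std::chrono::duration_cast<
>>         std::chrono::nanoseconds>(t1 - t0).count();
>>     std::printf("one batch took %lld ns\n", ns);
>>     return 0;
>>   }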
>>
>> Regards,
>> Valencia
>>
>>
>> From: Valencia Serrao/Austin/Contr/IBM
>> To: Tim Armstrong <[email protected]>
>> Cc: [email protected], Manish Patil/Austin/Contr/IBM@IBMUS,
>> Nishidha Panpaliya/Austin/Contr/IBM@IBMUS, Sudarshan
>> Jagadale/Austin/Contr/IBM@IBMUS
>> Date: 07/19/2016 10:26 AM
>> Subject: Re: Issues with tests in Release-mode Impala build
>> ------------------------------
>>
>>
>> Hi Tim,
>>
>> Thank you for the information.
>>
>> I am working on the pointers you have given and also on the fix for
>> Custom cluster (skipped) tests. I will inform you on the findings.
>>
>> Regards,
>> Valencia
>>
>>
>>
>>
>> From: Tim Armstrong <[email protected]>
>> To: [email protected]
>> Cc: Valencia Serrao/Austin/Contr/IBM@IBMUS, Nishidha
>> Panpaliya/Austin/Contr/IBM@IBMUS, Sudarshan
>> Jagadale/Austin/Contr/IBM@IBMUS, Manish Patil/Austin/Contr/IBM@IBMUS
>> Date: 07/18/2016 09:19 PM
>> Subject: Re: Issues with tests in Release-mode Impala build
>> ------------------------------
>>
>>
>>
>> Hi Valencia,
>>
>> 1. We run tests in release mode nightly and it doesn't look like we've
>> seen this hang. I'd suggest you attach a debugger to the benchmark-test
>> process and see what it's doing (a sample invocation follows these
>> points). It could either be an actual hang or an infinite/very long
>> loop. That test only exercises our benchmarking utilities, not Impala
>> itself, but IMO it's always good to understand why something like that
>> is happening in case there's a more general problem.
>> 2. Sounds like https://issues.cloudera.org/browse/IMPALA-3614. Have you
>> got the fix for that in your branch?
>> 3. Look forward to hearing more.
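>>
>> (For point 1, one common way to do that, assuming benchmark-test is
>> already running: find its pid with pgrep benchmark-test, attach with
>> gdb -p <pid>, then run "thread apply all bt" to see where each thread
>> is stopped.)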
>>
>> Cheers,
>> Tim
>>
>> On Mon, Jul 18, 2016 at 2:49 AM, Valencia Serrao <[email protected]>
>> wrote:
>>
>>
>>    Hi All,
>>
>>    I have built Impala in Release mode and executed the tests. Here
>>    are some observations:
>>
>>    1. BE tests: Execution hangs at "benchmark-test". No errors are
>>    shown; it simply hangs at this test. This issue did not occur when
>>    I ran the BE tests in Debug mode earlier.
>>    2. Custom Cluster tests: 5 tests passed and 36 tests were skipped.
>>    All of the skipped cases give the message "INSERT not implemented
>>    for S3".
>>    3. EE tests: I've also seen some failures here (yet to check the
>>    details).
>>
>>    The FE and JDBC tests work fine; the Release-mode test output is
>>    the same as the Debug-mode output.
>>
>>    Is "benchmark-test" known to fail in Release mode, or am I missing
>>    some configuration? I'd also like to understand the significance of
>>    this test, in case we could ignore it and move ahead.
>>
>>    Regards,
>>    Valencia
>>
>
