Hi, a small update on this:

I improved the command in PerfKit; if you're interested, the link to the
PR is below[1]. I also noticed that the task used for running integration
tests sometimes gets cached (only locally; this doesn't happen on
Jenkins). The PR addressing this issue is also below[2].

Regarding the idea of collecting execution time from the profile report: it
seems doable and would require changing the beam_integration_benchmark.py
implementation[3]. I've postponed this, as I consider getting the builds
right a more pressing task.
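For context, the --profile flag writes an HTML report under
$buildDir/reports/profile. A minimal sketch of pulling per-task timings out
of such a report could look like the following; note that the table layout
and attributes assumed by the regex are illustrative guesses, not the
actual report format, which varies between Gradle versions:

```python
import re

def parse_profile_durations(html: str) -> dict:
    """Extract a task-name -> duration mapping from a Gradle --profile report.

    Assumes the "Task Execution" table lists each task name and its duration
    in adjacent <td> cells; treat this as a starting point, not a stable
    parser for every Gradle version.
    """
    pattern = re.compile(
        r'<td[^>]*>(:[\w:.-]+)</td>\s*<td[^>]*>([\d.]+s)</td>')
    return {task: duration for task, duration in pattern.findall(html)}

# Fabricated snippet shaped like one profile-report row:
sample = ('<tr><td class="name">:beam-sdks-java-io:integrationTest</td>'
          '<td>42.137s</td></tr>')
print(parse_profile_durations(sample))
```

If this approach works out, beam_integration_benchmark.py could report the
extracted task duration instead of wall-clock time for the whole command.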

Best regards,

[1] https://github.com/GoogleCloudPlatform/PerfKitBenchmarker/pull/1690
[2] https://github.com/apache/beam/pull/5395

2018-05-14 12:35 GMT+02:00 Łukasz Gajowy <lukasz.gaj...@gmail.com>:

> Hi,
> thanks for all the advice - much appreciated! During the mvn -> gradle
> migration we just "translated" the existing mvn commands to gradle. We
> definitely need to improve them in PerfKit now. I also like Scott's idea
> of using the --profile flag. It would be awesome to utilize this in
> PerfKit, so I will investigate the topic further.
> Best regards,
> Łukasz
> 2018-05-10 1:55 GMT+02:00 Lukasz Cwik <lc...@google.com>:
>> +1 on only specifying the target that you need to build. You should use
>> './gradlew -p path/to/java/project assemble' or './gradlew
>> :project-artifact-name:assemble' to build the jars you need.
>> You can run these commands in a checked out version of your workspace and
>> validate that they produce what you expect.
>> On Tue, May 8, 2018 at 9:17 AM Scott Wegner <sweg...@google.com> wrote:
>>> A few thoughts:
>>> 1. Gradle can intelligently build only the dependencies necessary for a
>>> task, so it shouldn't build all of Python for the test suite if you only
>>> specify the task you're interested in. I'm not sure of the command for
>>> "build all of the dependencies of my tests but don't run my tests"; maybe
>>> "./gradlew mytests -x mytests"?
>>> 2. Some tasks in the build are not yet cacheable for various reasons. So
>>> you may see them getting rebuilt on the second execution even on success,
>>> which would then be included in your overall build timing. Information
>>> about which tasks were used from the build cache is available in the Gradle
>>> build scan (--scan).
>>> Another idea for measuring the execution time of just your tests would
>>> be to pull this out of Gradle's build report. Adding the --profile flag
>>> generates a report in $buildDir/reports/profile, which should have the
>>> timing info for just the task you're interested in:
>>> https://docs.gradle.org/current/userguide/command_line_interface.html
>>> On Tue, May 8, 2018 at 8:23 AM Łukasz Gajowy <lukasz.gaj...@gmail.com>
>>> wrote:
>>>> Hi Beam Devs,
>>>> currently PerfkitBenchmarker (a tool used to invoke performance tests)
>>>> has two phases that run gradle commands:
>>>>    - Pre-build phase: this is where the whole beam repo is built. This
>>>>    phase prepares the necessary artifacts so that building doesn't
>>>>    happen when executing tests.
>>>>    - Actual test running phase: after all necessary code is built, we
>>>>    run the test and measure its execution time. The execution time is
>>>>    displayed on the PerfKit dashboard [1].
>>>> After the recent mvn -> gradle migration we noticed that we are unable
>>>> to "Pre build" the code[2]. Because one of the Python-related tasks
>>>> fails, the whole "preBuild" phase fails silently and the actual building
>>>> happens in the "test running" phase, which increases the execution time
>>>> (this is visible in the plots on the dashboard).
>>>> This whole situation made me wonder about several things, and I'd like
>>>> to ask you for opinions. I think:
>>>>    - we should skip all the Python-related tasks while building beam
>>>>    for Java performance tests in PerfKit. Those are not needed when we
>>>>    are running Java tests. Is it possible to skip them in one go (e.g.
>>>>    in the same fashion we skip all checks using the -x check option)?
>>>>    - the same goes for Python tests: we should skip all Java-related
>>>>    tasks when building beam for Python performance tests in PerfKit.
>>>>    Note that this bullet is something to be developed in the future, as
>>>>    the beam_PerformanceTests_Python job (the only Python performance
>>>>    test job) has been failing for 4 months now and seems abandoned. IMO
>>>>    it should be done when someone brings the test back to life. For now,
>>>>    the job should be disabled.
>>>>    - we should modify PerfKit so that when the pre-build phase fails
>>>>    for some reason, the test is not executed. Currently we don't do
>>>>    this, and the test execution time depends on whether the "gradle
>>>>    integrationTest" command builds something or just runs the test. IMO
>>>>    when the command has to build anything, the execution time should not
>>>>    be included in the dashboards, because it's a false result.
>>>> What do you think of all this?
>>>> [1] https://apache-beam-testing.appspot.com/explore?dashboard=5755685136498688
>>>> [2] https://issues.apache.org/jira/browse/BEAM-4256
>>>> Best regards,
>>>> Łukasz Gajowy
