[
https://issues.apache.org/jira/browse/CASSANDRA-16183?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17218319#comment-17218319
]
Adam Holmberg edited comment on CASSANDRA-16183 at 10/21/20, 2:42 PM:
----------------------------------------------------------------------
I started looking at this yesterday, and thought I would drop a note here to
see if there are preferences on how to proceed. I have considered three
approaches:
1.) Unit tests could be added touching all of these metrics. It would be
straightforward to validate CL mapping, counters, and latency sanity (i.e.,
non-zero and correct relative to each other). What we would not achieve is validating
Unavailables and Timeouts. We would just test that they are there, and zero.
If that is not sufficient we could augment this approach by tacking on
validation in dtests where those conditions are already being created.
2.) If the above is not rigorous enough, we could instead create new tests in
Python dtests. I think all the building blocks are there. What I am less sure
about is our current aversion to adding tests to this suite.
3.) If we need dtests and don't want to expand the Python suite, it's probably
possible to do in-JVM dtests. This one is probably the heaviest lift for me
just because I have no exposure to that machinery yet. I'm happy to figure it
out, but the upfront learning will be more.
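To make approach 1 concrete, here is a rough sketch of the kind of sanity check it would perform. Note that `FakeClientRequestMetrics` and its fields are simplified, hypothetical stand-ins invented for illustration; the real `ClientRequestMetrics` objects wrap Codahale counters and timers, but the validation logic would be in the same spirit:

```java
// Hypothetical, simplified stand-in for a ClientRequestMetrics-like object.
// The field names below are assumptions for illustration, not Cassandra's API.
final class FakeClientRequestMetrics {
    long requests;           // total requests observed
    long latencySamples;     // latency timer sample count
    long totalLatencyMicros; // sum of recorded latencies
    long unavailables;       // Unavailable exceptions seen
    long timeouts;           // Timeout exceptions seen

    // Record one successful request and its latency.
    void record(long latencyMicros) {
        requests++;
        latencySamples++;
        totalLatencyMicros += latencyMicros;
    }

    // Sanity checks in the spirit of approach 1: counters are consistent
    // with each other, latency is non-zero, and the error counters exist
    // but remain zero (since a unit test cannot easily trigger them).
    boolean sane() {
        return requests > 0
            && latencySamples == requests  // every request was timed
            && totalLatencyMicros > 0      // latency is non-zero
            && unavailables == 0           // no Unavailables triggered
            && timeouts == 0;              // no Timeouts triggered
    }
}
```

A unit test would drive a few requests through the real request path and then apply checks like `sane()` to each metric object; only a dtest could additionally assert that `unavailables` and `timeouts` increment when those failure conditions are actually induced.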
[~blerer] I'm interested in your thoughts, or those of anyone else who cares to weigh in.
> Add tests to cover ClientRequest metrics
> -----------------------------------------
>
> Key: CASSANDRA-16183
> URL: https://issues.apache.org/jira/browse/CASSANDRA-16183
> Project: Cassandra
> Issue Type: Improvement
> Components: Test/dtest/java
> Reporter: Benjamin Lerer
> Assignee: Adam Holmberg
> Priority: Normal
> Fix For: 4.0-beta
>
>
> We do not have tests that cover the ClientRequest metrics:
> * ClientRequestMetrics
> * CASClientRequestMetrics
> * CASClientWriteRequestMetrics
> * ClientWriteRequestMetrics
> * ViewWriteMetrics
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]