FWIW, we store additional information such as a test UUID in InfluxDB as a
separate field (rather than as part of the series name), which lets us
compare results across tests or filter on things like request labels across
tests in our summary dashboard on Flood IO, e.g.:

https://s3.amazonaws.com/flood-io-support/Flood_IO_2015-04-14_08-00-41.jpg

Since "series are cheap" in InfluxDB, we tend to organise series as
<accountId>.<projectId>.<metric>, which works well for analysing time-series
information.
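The scheme described above might look like the following rough sketch. This assumes the JSON write format InfluxDB used around the 0.8 era, and the account/project names and helper are invented for illustration:

```python
import json
import uuid

def build_write_body(account_id, project_id, metric, value, test_uuid):
    """Build a write payload in the 0.8-era InfluxDB JSON style: the
    series name encodes <accountId>.<projectId>.<metric>, while the
    test UUID travels as an ordinary column (field), so dashboards can
    filter or group across runs without parsing series names."""
    series = "%s.%s.%s" % (account_id, project_id, metric)
    return [{
        "name": series,
        "columns": ["value", "test_uuid"],
        "points": [[value, test_uuid]],
    }]

# One UUID per test run, attached to every point written for that run.
run_id = str(uuid.uuid4())
body = json.dumps(build_write_body("acct42", "projA", "response_time",
                                   873, run_id))
```

Because the UUID is a field rather than part of the series name, the number of series stays bounded by account/project/metric, and a dashboard query can still slice by run.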

In any reporting/dashboard solution, however, I don't think you'd escape the
need to store other metadata about each test in a relational DB or the
like, which is better suited to that type of info.
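A minimal sketch of that relational side, using an in-memory SQLite table as a stand-in (the schema and values are hypothetical): one row per run, keyed by the same test UUID that goes into InfluxDB, so a report UI can list prior runs before querying the time-series store.

```python
import sqlite3

# Hypothetical per-run metadata table; the time-series store keeps the
# samples, this keeps the "which runs exist" bookkeeping.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE test_run (
        test_uuid  TEXT PRIMARY KEY,
        project    TEXT NOT NULL,
        test_plan  TEXT NOT NULL,
        started_at TEXT NOT NULL
    )
""")
conn.execute(
    "INSERT INTO test_run VALUES (?, ?, ?, ?)",
    ("u-123", "projA", "planB", "2015-04-14T08:00:00Z"),
)

# List prior runs for a project/plan, newest first -- the lookup a
# "choose your run" report interface needs before fetching samples.
rows = conn.execute(
    "SELECT test_uuid, started_at FROM test_run "
    "WHERE project = ? AND test_plan = ? ORDER BY started_at DESC",
    ("projA", "planB"),
).fetchall()
```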

Hope that info is of use.

Regards,


Tim Koopmans

[image: Flood IO Pty Ltd] <https://flood.io>



On Tue, Apr 14, 2015 at 1:58 AM, Glenn Caccia <[email protected]>
wrote:

>
> Not really.  It certainly is a nice tool for quickly creating a new
> dashboard for a new script.  However, it doesn't address the issue of
> comparing results from different runs for the same script.  There are other
> ways of dealing with that, taking screen shots after each run and putting
> them in a common doc, for example, but ideally your reporting solution
> would be a one-stop shop for all your reporting needs.
>
>
>      From: Bob <[email protected]>
>  To: JMeter Users List <[email protected]>
>  Sent: Sunday, April 12, 2015 8:43 PM
>  Subject: Re: Thoughts on InfluxDB/Grafana integration
>
> Glenn, I think Chaitanya's Grafana Dashboard generator solves the problem?
>
> https://github.com/bhattchaitanya/Grafana-Dashboard-Generator/wiki
>
>
>
>
>
> On 11/04/15 01:45, Glenn Caccia wrote:
> > Thinking about this more, you could use a dynamic rootMetricsPrefix,
> > something like:
> >
> > jmeter.${__TestPlanName}.${__time}.
> >
> > That could then be used across all scripts and would satisfy the
> > basic requirement from a storage perspective, but Grafana itself
> > still can't easily handle the requirement from a display perspective.
> > Since queries are hard-coded into a graph, you'd be stuck either
> > creating a new dashboard for each test run or manually editing an
> > existing one for each run.  It would be a mess to work with.
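The dynamic-prefix idea quoted above can be illustrated with a toy simulation of the expansion. This is not JMeter's actual function engine, just a sketch of what `${__TestPlanName}` and `${__time}` would resolve to, and why each run would land in its own metrics subtree:

```python
import time

def expand_prefix(template, test_plan, now=None):
    """Crude stand-in for JMeter's function expansion: substitute
    ${__TestPlanName} and ${__time} so every run gets a distinct
    metrics subtree, e.g. jmeter.MyPlan.1428988980000."""
    epoch_ms = int((now if now is not None else time.time()) * 1000)
    out = template.replace("${__TestPlanName}", test_plan)
    return out.replace("${__time}", str(epoch_ms))

prefix = expand_prefix("jmeter.${__TestPlanName}.${__time}.",
                       "MyPlan", now=1428988980.0)
```

The per-run timestamp in the prefix is exactly what makes hard-coded Grafana queries painful: every run creates a new subtree that no existing query matches.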
> >        From: Glenn Caccia <[email protected]>
> >  To: JMeter Users List <[email protected]>
> >  Sent: Friday, April 10, 2015 1:23 PM
> >  Subject: Re: Thoughts on InfluxDB/Grafana integration
> >
> >
> > You could do that, but it would then require remembering to change
> > the root value each time you did a new run, which would also require
> > changing your dashboard queries each time to pick up the new run.  I
> > don't think that's a solution I would want to maintain.  I would
> > definitely use variations on the rootMetricsPrefix to distinguish
> > between test scripts, however.  The InfluxDB/Grafana solution is
> > great for real-time analysis, which is certainly important, but it
> > seems to fall short on the need to easily compare runs.
> >        From: Philippe Mouawad <[email protected]>
> >
> >
> >  To: JMeter Users List <[email protected]>
> >  Sent: Friday, April 10, 2015 11:54 AM
> >  Subject: Re: Thoughts on InfluxDB/Grafana integration
> >
> > Hi,
> > What about varying the rootMetricsPrefix to do that?
> >
> > Regarding SQL, do you know that you can now easily build a JDBC
> > backend to store results in a database?  You could contribute this to
> > core.
> >
> >
> > Regards
> >
> >
> >
> > On Friday, April 10, 2015, Glenn Caccia <[email protected]>
> wrote:
> >
> >> I've successfully installed InfluxDB and Grafana and did some basic
> >> testing where I can now see results in Grafana.  I'm beginning to
> >> wonder about the benefits of this system.  A while ago I had toyed
> >> with the idea of using Elasticsearch as a backend for JMeter test
> >> results and using Kibana to view results.  I ultimately dropped the
> >> idea because of the limitations of how the data is structured.  I
> >> see the exact same issue with InfluxDB and Grafana (either that, or
> >> I don't fully understand what can be done in these tools).
> >> What I want when viewing results is the ability to work with results
> >> in terms of projects, test plans, and results from a particular test
> >> run.  For example, I want to see results for project A, test plan B
> >> and compare results from the prior run with the current run.  With
> >> the InfluxDB/Grafana solution, there is no concept of a run.  If I
> >> run a test one day and then run the same test the subsequent day, I
> >> can't compare the results using the same view.  I can certainly
> >> change my time filter to see both inline (with a big gap in between)
> >> or view one and then view the other, but I can't stack them in
> >> separate graphs and see them at the same time, or display them in
> >> the same graph.  Likewise, if I want to see what performance was
> >> like the last time a test was run and I don't know when the last
> >> test was run, I have to do a bit of searching by playing with the
> >> time filter.
> >> A while ago I worked for a company that used SQL Server for a lot of
> >> their data storage needs.  This gave me access to the SQL Server
> >> Report Builder tool.  I was able to create a solution where JMeter
> >> results were loaded into SQL Server and we had a report interface
> >> where you could choose your project, choose your test plan and then
> >> see the dates/times for all prior runs.  From this, you could choose
> >> which run(s) to view.  I don't have access to tools like that with
> >> my current company, but I miss that kind of ability to structure and
> >> access test results.  A similar approach to storing and presenting
> >> results can be seen with Loadosophia.
> >> In short, it seems like this new solution is primarily useful for
> >> analyzing results from a current test run (which can already be done
> >> with existing listeners) but is not as useful a tool for comparing
> >> results or checking on results from prior runs.  Am I missing
> >> something, or is that a fair conclusion?
> >>
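Grafana at the time couldn't overlay runs recorded on different days, but the comparison described in the quoted message boils down to re-basing each run onto elapsed time. A minimal pre-processing sketch (the sample data is invented):

```python
def align_runs(*runs):
    """Re-key each run's samples from absolute timestamps to elapsed
    seconds since that run's first sample, so two runs recorded on
    different days share a common time axis and can be overlaid on one
    graph or compared point by point."""
    aligned = []
    for samples in runs:  # samples: list of (epoch_seconds, value)
        start = samples[0][0]
        aligned.append([(t - start, v) for t, v in samples])
    return aligned

# Two runs of the same plan on consecutive days (made-up numbers).
monday = [(1428796800, 210), (1428796810, 195)]
tuesday = [(1428883200, 340), (1428883210, 330)]
run_a, run_b = align_runs(monday, tuesday)
# Both runs now start at elapsed time 0, so plotting them together
# shows the regression directly instead of a big gap between days.
```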
> >
>
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: [email protected]
> For additional commands, e-mail: [email protected]
>
>
>
>
