Laurentiu,

I am cc-ing the analytics@ public e-mail list, where you can direct this type
of question.

>for this goal we need time series of the http requests (pagecount, traces,
and so on)  the resolution of milliseconds.
This is not entirely clear; you will need to provide a more thorough
description of the data you need.
For example, this page describes internal data that we retain for at most 90
days:
https://wikitech.wikimedia.org/wiki/Analytics/Data/Webrequest


Given that you need ms resolution, it is unlikely that we have a dataset you
can use, but maybe someone on this list can point you to a released dataset
that might be suitable.

Thanks,

Nuria

On Fri, Jan 27, 2017 at 3:00 PM, Laurentiu Checiu <[email protected]>
wrote:

> Dear Ms. Nuria Ruiz,
>
> I am a PhD student at the University of Ottawa, and my research focuses on
> stochastic models of cloud computing.
>
> I found "Page view statistics for Wikimedia projects":
> http://dumps.wikimedia.org/other/pagecounts-raw and from this source I
> can construct time series of http requests on an hourly basis. From these
> time series we can estimate a model for a cloud computing system. However,
> this hourly request rate is not quite suitable for our intended model. We
> are aiming at a model able to react at the level of seconds or even
> faster, and for this goal we need time series of the http requests
> (pagecounts, traces, and so on) at a resolution of milliseconds. We are
> interested only in the number of requests per time unit (ms), not in the
> actual source or destination of these http requests.
>
> May I ask you for help in this matter?
>
> Best regards,
> Laurentiu Checiu
>
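[Editorial note: the hourly aggregation described in the quoted message can be sketched as follows. This is a minimal illustration, assuming the hourly pagecounts-raw .gz dump files have already been downloaded locally and follow the documented "project page_title view_count bytes" line format; the function name and directory layout are hypothetical.]

```python
# Sketch: build an hourly request-count time series from pagecounts-raw dumps.
# Each hourly dump file (e.g. pagecounts-20170101-000000.gz) contains lines
# of the form: "project page_title view_count bytes_transferred".
import glob
import gzip
import os


def hourly_totals(dump_dir):
    """Return {filename: total_requests} for every pagecounts file in dump_dir.

    Summing the per-page view counts in one hourly file yields the total
    number of recorded requests for that hour.
    """
    totals = {}
    for path in sorted(glob.glob(os.path.join(dump_dir, "pagecounts-*.gz"))):
        total = 0
        with gzip.open(path, "rt", encoding="utf-8", errors="replace") as f:
            for line in f:
                parts = line.split()
                if len(parts) == 4:  # skip malformed lines
                    total += int(parts[2])
        totals[os.path.basename(path)] = total
    return totals
```

Since the dumps are bucketed per hour, this is also the finest resolution recoverable from pagecounts-raw; sub-second resolution would require a different (request-level) data source, as discussed above.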
_______________________________________________
Analytics mailing list
[email protected]
https://lists.wikimedia.org/mailman/listinfo/analytics
