I would highly recommend using the X-Analytics header for this, and
establishing "well known" key name(s). X-Analytics gets parsed into
key-value pairs (object fields) by our Varnish/Hadoop infrastructure,
whereas the user agent is basically a semi-free-form text string. Also,
the user agent cannot be
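A minimal sketch of what sending such a header could look like, assuming a hypothetical `tool` key; the actual well-known key names would still need to be agreed on:

```python
# Sketch: attaching an X-Analytics header to a WDQS request.
# The key names ("tool", "contact") are hypothetical placeholders;
# the point is that X-Analytics is parsed into key-value pairs
# downstream, unlike the free-form User-Agent string.
def build_headers(tool_name, contact_url):
    return {
        "X-Analytics": f"tool={tool_name};contact={contact_url}",
        "User-Agent": f"{tool_name} ({contact_url})",
    }

headers = build_headers("my-wikidata-bot", "https://example.org/bot-info")
print(headers["X-Analytics"])  # tool=my-wikidata-bot;contact=https://example.org/bot-info
```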
Hi!
> I'll try to throw in a #TOOL: comment where I can remember using SPARQL,
> but I'll be bound to forget a few...
Thanks, though using a distinct User-Agent may be easier for analysis,
since those are stored as separate fields, and doing operations on a
separate field would be much easier than
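To illustrate the point about separate fields, here is a sketch of tallying queries per tool once the User-Agent is its own log field; the log schema and values below are hypothetical:

```python
# Sketch: counting queries per User-Agent, assuming the request log
# exposes the agent as a separate field (schema is hypothetical).
from collections import Counter

log_rows = [
    {"user_agent": "my-tool/1.0 (https://example.org/info)", "query": "SELECT ..."},
    {"user_agent": "my-tool/1.0 (https://example.org/info)", "query": "ASK ..."},
    {"user_agent": "other-bot/2.1", "query": "SELECT ..."},
]

per_tool = Counter(row["user_agent"] for row in log_rows)
print(per_tool.most_common(1))
```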
Hi!
> Would it help if I add the following header to every large batch of queries?
I think having a distinct User-Agent header (maybe with a URL linking to
the rest of the info) would be enough. This is recorded in the request
log and can be used later in processing.
In general, every time you
>
> Once we have this, we would like to analyse for content (which properties
> and classes are used, etc.) but also for query features (how many OPTIONALs,
> GROUP BYs, etc. are used). Ideas on what to analyse further are welcome. Of
> course, SPARQL can only give a partial idea of "usage", since
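A rough sketch of the kind of feature tally described above, using plain keyword matching; a real analysis would parse the queries properly:

```python
# Sketch: counting SPARQL features by keyword matching. This is an
# approximation; keywords inside strings or comments would also match.
import re
from collections import Counter

FEATURES = ["OPTIONAL", "GROUP BY", "ORDER BY", "FILTER", "UNION"]

def feature_counts(query):
    upper = query.upper()
    return Counter({f: len(re.findall(re.escape(f), upper)) for f in FEATURES})

q = "SELECT ?x WHERE { ?x wdt:P31 wd:Q5 . OPTIONAL { ?x wdt:P569 ?d } } GROUP BY ?x"
counts = feature_counts(q)
print(counts["OPTIONAL"], counts["GROUP BY"])  # 1 1
```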
On 30.09.2016 20:47, Denny Vrandečić wrote:
> Markus, do you have access to the corresponding HTTP request logs? The
> fields there might be helpful (although I might be overly optimistic
> about it)
Yes, we can access all logs. For bot-based queries, this should be very
helpful indeed. I can
Hi Denny,
On 30-09-16 20:47, Denny Vrandečić wrote:
> Markus, do you have access to the corresponding HTTP request logs? The
> fields there might be helpful (although I might be overly optimistic
> about it)
I was about to say the same. I use pywikibot quite a lot and it sends
some nice headers
Hi Markus,
I assume I qualify for (1) and (2). I can add an identifiable comment
with a '#Tool:' prefix to every major SPARQL query done by our tools.
One bot run usually generates a few very heavy queries, and 10,000s of
smaller ones, depending on the actual task a bot performs. All of this
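A sketch of how such tagging could be automated on the bot side; the tool name is a placeholder:

```python
# Sketch: prepending the '#Tool:' comment to every outgoing query.
# SPARQL treats '#' lines as comments, so the tag does not change
# the query's meaning; the tool name here is hypothetical.
def tag_query(query, tool_name):
    return f"#Tool: {tool_name}\n{query}"

tagged = tag_query("SELECT ?x WHERE { ?x wdt:P31 wd:Q5 }", "my-batch-bot")
print(tagged.splitlines()[0])  # #Tool: my-batch-bot
```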
On 30.09.2016 19:50, Andra Waagmeester wrote:
> Just curious while we are on the topic. When you are inspecting the
> headers to separate between "organic" queries and bot queries, would it
> be possible to count the times a set of properties is used in the
> different queries? This would be a nice way
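The property-usage count asked about above could be approximated with a sketch like this; it is regex-based, and a parser would be more precise:

```python
# Sketch: tallying which Wikidata properties appear in logged query
# strings, matched by their P-identifiers. Regex matching is an
# approximation and would also catch P-ids in literals or comments.
import re
from collections import Counter

def property_usage(queries):
    counts = Counter()
    for q in queries:
        counts.update(re.findall(r"\bP\d+\b", q))
    return counts

queries = [
    "SELECT ?x WHERE { ?x wdt:P31 wd:Q5 . ?x wdt:P569 ?d }",
    "SELECT ?x WHERE { ?x wdt:P31 wd:Q146 }",
]
print(property_usage(queries).most_common(1))  # [('P31', 2)]
```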
Markus, do you have access to the corresponding HTTP request logs? The
fields there might be helpful (although I might be overly optimistic about
it)
On Fri, Sep 30, 2016 at 11:38 AM Yuri Astrakhan wrote:
> I guess I qualify for #2 several times:
> * The & support
I guess I qualify for #2 several times:
* The & support access to the geoshapes service, which
in turn can make requests to WDQS. For example, see
https://en.wikipedia.org/wiki/User:Yurik/maplink (click on "governor's
link")
* The wiki tag supports the same geoshapes service, as well as
Just curious while we are on the topic. When you are inspecting the headers
to separate between "organic" queries and bot queries, would it be possible
to count the times a set of properties is used in the different queries?
This would be a nice way to demonstrate to original external resources
On 30.09.2016 16:18, Andra Waagmeester wrote:
> Would it help if I add the following header to every large batch of queries?
> ###
> # access: (http://query.wikidata.org
> or https://query.wikidata.org/bigdata/namespace/wdq/sparql?query={SPARQL} .)
> # contact: email, account name, Twitter name, etc.
> #
Would it help if I add the following header to every large batch of queries?
###
# access: (http://query.wikidata.org or
https://query.wikidata.org/bigdata/namespace/wdq/sparql?query={SPARQL} .)
# contact: email, account name, Twitter name, etc.
# bot: True/False
# .
##
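A sketch of how the proposed header could be generated and prepended per batch; the contact value is a placeholder:

```python
# Sketch: building the comment header proposed above and prepending
# it to a batch query; the contact value is hypothetical.
def batch_header(contact, bot):
    return (
        "###\n"
        "# access: https://query.wikidata.org\n"
        f"# contact: {contact}\n"
        f"# bot: {bot}\n"
        "##\n"
    )

query = batch_header("researcher@example.org", True) + "SELECT ?x WHERE { ?x wdt:P31 wd:Q5 }"
print(query.splitlines()[3])  # # bot: True
```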
On Fri, Sep
Dear SPARQL users,
We are starting a research project to investigate the use of the
Wikidata SPARQL Query Service, with the goal of gaining insights that may
help to improve Wikidata and the query service [1]. Currently, we are
still waiting for all data to become available. Meanwhile, we would