Pintoch closed this task as "Resolved". Pintoch claimed this task. Pintoch added a comment.
This issue is resolved; the feature was released in OpenRefine 2.7-rc1.
TASK DETAIL: https://phabricator.wikimedia.org/T146740
EMAIL PREFERENCES: https://phabricator.wikimedia.org/settings/panel/emailpreferences/
Pintoch added a comment.
@Smalyshev thanks for your quick reply! Just for clarity, I am not personally working on the PST; I was just trying to find out if there was any established way to use RDF to represent a data import. If that is the case, then other tools could use that format too.
Pintoch added a comment.
@Lydia_Pintscher , @Smalyshev and @Tpt : is there any info about how RDF is expected to behave as an import format for Wikidata? As far as I can tell, the RDF that gets fed into the Query Service is not designed for import at all:
first, there is a lot of redundancy
Pintoch added a comment.
@Lydia_Pintscher that makes sense. Okay, thank you to you both, we are on the same page! Given all these tickets on the topic I was worried that I had missed something obvious about this issue…
TASK DETAIL: https://phabricator.wikimedia.org/T173749
Pintoch added a comment.
Two closely related examples which exhibit different behaviors:
http://tinyurl.com/y9ctl9vo (incorrect ordering)
http://tinyurl.com/y8gy77hf (correct ordering)
TASK DETAIL: https://phabricator.wikimedia.org/T162250
Pintoch added a comment.
The import was discussed at various places, including at the data import hub, the property talk page and my talk page.
TASK DETAIL: https://phabricator.wikimedia.org/T193728
Pintoch added a comment.
I think there are plenty of examples of non-CC0 data being imported in Wikidata.
PubMedCentral is being imported at a large scale and as far as I can tell it is not CC0 (at least https://europepmc.org/Copyright does not make that clear. It focuses on the full texts, which
Pintoch added a comment.
I am glad I got the discussion going then: you now have one concrete example to look at (or maybe two? you did not comment on PMC). I think it is fair to say that this is not exactly an isolated case (but I am surprised that you seem to (pretend) not to know? Maybe
Pintoch added a comment.
Both properties mentioned above have been created in the meantime:
https://www.wikidata.org/wiki/Property:P5037
https://www.wikidata.org/wiki/Property:P5115
So this constraint could be useful, I think.
TASK DETAIL: https://phabricator.wikimedia.org/T191963
Pintoch added a comment.
It would be fantastic to have more meaningful edit summaries with wbeditentity. It's of course hard to do this in general, but it would be great to have this for some common cases where a short summary seems doable (adding multiple statements with the same property
Pintoch created this task.Pintoch added projects: Wikimedia-Hackathon-2018, Wikidata.Herald added a subscriber: Aklapper.
TASK DESCRIPTIONCome to complain about anything you do not like about the process of matching data to Wikidata with OpenRefine!
This aspect of OpenRefine is suitable for quick
Pintoch added a comment.
As soon as this is supported by the Wikibase API, then it makes sense to build support for this directly in Wikidata-Toolkit. This is something that would be massively useful for many people.
As for OpenRefine, we need to brainstorm a bit to find when exactly these calls
Pintoch created this task.Pintoch added a project: Wikidata.Herald added a subscriber: Aklapper.
TASK DESCRIPTIONThe Wikibase date datatype has a notion of precision, and also a lesser known notion of upper and lower bounds expressed as integer multiples of this precision.
For instance
Pintoch updated the task description. (Show Details)
CHANGES TO TASK DESCRIPTION: ...If this is not a bug, then there is a bug in Wikidata-Toolkit, which agrees with [this → my] interpretation of the default value:...
TASK DETAIL: https://phabricator.wikimedia.org/T194869
Pintoch added a project: Wikibase-DataModel.
TASK DETAIL: https://phabricator.wikimedia.org/T194869
Pintoch updated the task description. (Show Details)
CHANGES TO TASK DESCRIPTION: ...When creating a claim with a date value in the Wikibase UI, by default the "After" parameter is set to 0 (see [[ https://www.wikidata.org/w/index.php?title=Q4115189&diff=prev&oldid=680600484 |this example]]). Is it i
Pintoch added a comment.
Note to self: for this we would need to rethink Wikidata authentication in OpenRefine, migrating it to OAuth. This would include adding OAuth support in Wikidata-Toolkit. This has not been done yet because OAuth is not suited for open source software that is run directly
Pintoch added subscribers: Spinster, DarTar.Pintoch added a comment.
@Spinster @DarTar That's in 20 minutes in Sala de Projectes! QC/0011
TASK DETAIL: https://phabricator.wikimedia.org/T193875
Pintoch added a comment.
When running software on localhost, the client needs to have OAuth consumer credentials, which are supposed to be private. If I apply for an OAuth consumer for OpenRefine, I cannot put the credentials in OpenRefine's source code, because it would allow anyone to reuse them
Pintoch added a comment.
Oh, I meant 10:30, fixing that now.
TASK DETAIL: https://phabricator.wikimedia.org/T194952
Pintoch created this task.Pintoch added projects: Wikidata, Wikimedia-Hackathon-2018.Herald added a subscriber: Aklapper.
TASK DESCRIPTIONI will introduce the EditGroups tool, designed to keep track of edit batches in Wikidata, review them and revert them easily:
https://tools.wmflabs.org
Pintoch added a comment.
@bcampbell that would be nice! but only if it's not too much effort :)
TASK DETAIL: https://phabricator.wikimedia.org/T194952
Pintoch added a subscriber: Tpt.Pintoch added a comment.
After discussion with @Tpt, for now we are just going to change Wikidata-Toolkit's behaviour to use 0 in the After parameter as well… but that's just because it's really hard to shift the default now.
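For readers unfamiliar with the datatype, here is a minimal sketch (in Python) of the standard Wikibase time-value JSON shape, showing where the "After" field discussed in this thread sits. The concrete date is illustrative; precision 11 means "day".

```python
# Shape of a Wikibase time value, per the Wikibase data model.
# The date itself is just an example value.
time_value = {
    "time": "+2018-05-01T00:00:00Z",
    "timezone": 0,
    "before": 0,   # lower bound, in integer multiples of the precision unit
    "after": 0,    # upper bound; the default discussed in this thread
    "precision": 11,  # 11 = day precision
    "calendarmodel": "http://www.wikidata.org/entity/Q1985727",
}
```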
Pintoch added a project: Wikibase-Quality-Constraints.
TASK DETAIL: https://phabricator.wikimedia.org/T197587
Pintoch created this task.Pintoch added a project: Wikibase-Containers.Herald added a subscriber: Aklapper.Herald added a project: Wikidata.
TASK DESCRIPTIONIt would be fabulous if we could just say "yes I want quality constraints in my own wikibase" and automatically populate th
Pintoch created this task.Pintoch added a project: Wikidata.Herald added a subscriber: Aklapper.
TASK DESCRIPTIONBackground: we want to make it possible for OpenRefine users to upload statements to arbitrary Wikibase instances.
Idea: the user should be able to specify which instance they want
Pintoch added a comment.
A sample of what such a manifest could look like is here:
https://gist.github.com/despens/d6ae4110c4e97944ddba29f23d78899f
It could be served at a predictable location for each Wikibase instance, for instance:
https://www.wikidata.org/manifest-v0.1.json
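Purely as an illustration of the idea (these field names are hypothetical sketches, not the actual contents of the gist linked above), such a manifest might look like:

```json
{
  "name": "Wikidata",
  "version": "0.1",
  "mediawiki_api": "https://www.wikidata.org/w/api.php",
  "sparql_endpoint": "https://query.wikidata.org/sparql"
}
```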
Pintoch added a comment.
One other approach to this problem would be to consider that these manifest files are not expected to be necessarily hosted by the Wikibase instance itself - these configuration files could be user-contributed and hosted anywhere (or derived automatically from the Wikibase
Pintoch closed this task as "Resolved".
TASK DETAIL: https://phabricator.wikimedia.org/T193875
Pintoch closed this task as "Resolved".
TASK DETAIL: https://phabricator.wikimedia.org/T194952
Pintoch added a comment.
Just noting that this prevents us from adding examples on lexeme-related properties, such as https://www.wikidata.org/wiki/Property:P5244.
TASK DETAIL: https://phabricator.wikimedia.org/T195615
Pintoch added a comment.
I have observed this bug multiple times now (also using Firefox).
TASK DETAIL: https://phabricator.wikimedia.org/T195258
Pintoch added a comment.
This bug seems to be fairly new; it has probably been introduced by a recent code change. I have just run into it and it is definitely a new behavior.
TASK DETAIL: https://phabricator.wikimedia.org/T186945
Pintoch added a comment.
Etalab (who runs the open data portal of the French government) have released a statement (in French) concerning the attribution requirement of their "licence ouverte", confirming that it only applies to the first re-user.
https://github.com/etalab/wiki-data-
Pintoch added a comment.
@Lydia_Pintscher @Ladsgroup any idea how I could be notified of any new automatic edit summaries, such as the wbeditentity-create-item that this change introduced? For any such summary, I need to add it to EditGroups, especially if the new auto summary replaces a highly
Pintoch updated the task description. (Show Details)
CHANGES TO TASK DESCRIPTIONSome Wikidata properties are expected to hold the same values when they are both present on the same item. This is the case for https://www.wikidata.org/wiki/Property:P4285 and https://www.wikidata.org/wiki
Pintoch created this task.Pintoch added projects: Wikibase-Quality-Constraints, Wikidata.Herald added a subscriber: Aklapper.
TASK DESCRIPTIONSome Wikidata properties are expected to hold the same values when they are both present on the same item. This is the case for P4285 and P269 for instance
Pintoch added a comment.
I think this ticket can be closed given that we cannot figure out what it is supposed to be about.
For reference, https://github.com/wetneb/openrefine-wikibase works for arbitrary Wikibase instances now (the first bullet point in my reply above).
Pintoch closed this task as "Invalid".
TASK DETAIL: https://phabricator.wikimedia.org/T192811
Pintoch added a comment.
Some of the OpenRefine edits were not tagged during development but all edits done with a released version should be. Some of the OpenRefine batches are uploaded via QuickStatements, in which case they are tagged as such. (The main benefits of using QS with OpenRefine
Pintoch claimed this task.Pintoch added a comment.
I would be interested in helping with this - I can guide you through the uploading process with OpenRefine.
If you want to prepare for this, feel free to download OpenRefine and have a look at tutorials like these:
https://www.wikidata.org/wiki
Pintoch added a comment.
Just to let you know that the problem with the ".0" will be solved in the next version of OpenRefine.
In the meantime, you can solve the issue by transforming your column with the following expression: value.toString().replace(".0","").
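In Python terms, the GREL expression above does roughly the following. Note that, like the GREL replace, it removes every occurrence of ".0", not just a trailing one, so it is a quick fix suited to columns of integer-like identifiers:

```python
def strip_point_zero(value):
    """Rough Python equivalent of the GREL value.toString().replace(".0","")."""
    return str(value).replace(".0", "")
```

For example, `strip_point_zero(42.0)` gives `"42"`, but `strip_point_zero("10.01")` gives `"101"`, which illustrates the caveat.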
Pintoch added a comment.
Thanks for the ping Lydia! Off the top of my head, the only uses of SPARQL in the tools I maintain are in the openrefine-wikidata interface:
queries to retrieve the list of subclasses of a given class - lag is not critical at all for this as the ontology is assumed
Pintoch added a comment.
Awesome! \o/ Actually OpenRefine could potentially help you already at that stage to do the matching - let me know if you want a quick demo :)
TASK DETAIL: https://phabricator.wikimedia.org/T207839
Pintoch added a comment.
@Gehel my service has been quite unstable for some time, but I haven't found the time yet to find out exactly where the problem is coming from - it could be SPARQL, the Wikidata API, redis or the webservice itself. I will add a few more metrics to understand what is going
Pintoch added a comment.
The search interface can also be used for that thanks to the haswbstatement command. That only gets you one id per query, so it might not be suited for all tools. I don't know if the lag is lower in this interface.
Retrieving items by identifiers is quite crucial in many
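As a sketch of what such a lookup involves (the property and value below are made-up examples), each `haswbstatement` search call can be parameterized like this:

```python
def haswbstatement_params(prop, value):
    """Build MediaWiki search API parameters for a haswbstatement lookup.

    As noted above, each such query matches a single property=value pair.
    """
    return {
        "action": "query",
        "list": "search",
        "srsearch": f"haswbstatement:{prop}={value}",
        "format": "json",
    }
```

These parameters would be sent to a wiki's `api.php` endpoint; batching many identifiers therefore means many requests, which is the limitation mentioned above.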
Pintoch edited subscribers, added: Pintoch; removed: Cloud-Services.Pintoch added a comment.
I have taken the liberty of removing "Cloud Services" as a subscriber to this ticket, as I do not think every toollabs user wants to receive notifications about this.
Pintoch added a comment.
I was thinking of the opposite: consider the violations related to the revision R of the item I to be the violations of the statements of I with respect to the state of Wikidata just before R+1 was saved.
Because for the current revision, you do want to keep invalidating
Pintoch added a comment.
I currently use my own custom hacky script to create properties, but having something stable and usable by anyone would be highly beneficial.
TASK DETAIL: https://phabricator.wikimedia.org/T139898
Pintoch created this task.Pintoch added a project: Wikidata-Gadgets.Restricted Application added a subscriber: Aklapper.Restricted Application added a project: Wikidata.
TASK DESCRIPTIONSome other Wikimedia sites have a gadget which makes it possible to display dynamic watchlist messages (above
Pintoch added a comment.
Related:
T213012: Enable the Watchlist Messages gadget in Wikidata
TASK DETAIL: https://phabricator.wikimedia.org/T205017
Pintoch added a comment.
Oh, can they? Sorry, I had no idea! Thanks, I will try to enable it myself.
TASK DETAIL: https://phabricator.wikimedia.org/T213012
Pintoch added a comment.
I have pinged a few interface admins on wiki to enable this.
TASK DETAIL: https://phabricator.wikimedia.org/T213012
Pintoch added subscribers: Tpt, Pintoch.Pintoch added a comment.
This ticket is fantastic news.
It's probably completely out of the scope of this, but let's mention it anyway: I think it would be massively useful for revision scoring if we could have access to both the current violation report
Pintoch added a comment.
@aborrero thanks for the ping. I do not recognize the shape of the queries as coming from this tool though. The openrefine-wikidata tool should do relatively few SPARQL queries, whose results are cached in redis. How did you determine that this tool is the source
Pintoch added a comment.
@Lydia_Pintscher personally here is what I would concretely implement in the EditGroups tool. For each edit that is part of an edit group:
fetch the constraints violations before and after the edit (this fetching would happen as the edit is retrieved, so in near real
Pintoch added a comment.
@Lydia_Pintscher yes indeed! For instance the aggregation at batch-level would probably not be meaningful for inverse constraints (unless there is a way to detect all the violations added and solved by an edit, not just on the item where the edit was made). But isn't
Pintoch added a comment.
@Lucas_Werkmeister_WMDE thank you very much for that!
TASK DETAIL: https://phabricator.wikimedia.org/T207484
Pintoch added a comment.
Concerning the dumps, it should be possible to add versioning information on
a per-entity basis, for instance by adding the revision id in the JSON
serialization of the entity, as is currently done in Special:EntityData. This
would arguably be more useful than a per
Pintoch added a comment.
I am wondering what is the status of this: is more discussion needed about
what version information to include, or are we simply waiting for a patch? I
vote for the revision id to serve as version id (possibly with other metadata
such as timestamp
Pintoch added a comment.
I think Wikidata-Toolkit could be used for that:
https://github.com/Wikidata/Wikidata-Toolkit/blob/master/wdtk-rdf/src/main/java/org/wikidata/wdtk/rdf/RdfSerializer.java
Obviously it would mean making sure the RDF serialization produced by it is
consistent
Pintoch added a comment.
@Smalyshev okay! Sorry if this is not the right place: I would be happy to
migrate the patch to another ticket. Indeed this only adds entity-level
metadata, not dump-level metadata. I think this would be less of a breaking
change, given that it does not require
Pintoch added a comment.
I agree with @Nicolastorzec above.
I suspect that the entity dumps are more popular and cheaper to generate than
the full SQL/XML dumps, so I would argue that they should be generated more
often.
Even if the entity dumps and the SQL/XML dumps were generated
Pintoch added a comment.
If you need a mapping from ISO language codes to Wikimedia ones,
Wikidata-Toolkit has such a mapping:
https://github.com/Wikidata/Wikidata-Toolkit/blob/3e62f93b137c25961c5a12172c7f213a720ecb67/wdtk-datamodel/src/main/java/org/wikidata/wdtk/datamodel/interfaces
Pintoch added a subscriber: Mvolz.
Pintoch added a comment.
> As a tool developer, it would be very useful to access the language fallback graph from an API.
I would use the resulting graph in
https://tools.wmflabs.org/openrefine-wikidata/ . This tool is basically a
wrapper o
Pintoch created this task.
Pintoch added projects: Discovery-Search, Wikidata.
Restricted Application added a subscriber: Aklapper.
TASK DESCRIPTION
The entity suggester only returns items, not properties. This is
counter-intuitive as Special:Search returns both items and properties
Pintoch added a comment.
What is the protocol to go forward on this? Should we hold a RFC on-wiki to
let people choose among the possible solutions above?
TASK DETAIL: https://phabricator.wikimedia.org/T206392
Pintoch added a parent task: T56328: Provide intraline diff format in API
action=compare.
TASK DETAIL: https://phabricator.wikimedia.org/T218779
Pintoch created this task.
Pintoch added a project: Wikidata-Campsite.
Restricted Application added a subscriber: Aklapper.
Restricted Application added a project: Wikidata.
TASK DESCRIPTION
As an API user, I would like to be able to access diffs between revisions of
items (or properties
Pintoch updated the task description.
TASK DETAIL: https://phabricator.wikimedia.org/T218779
Pintoch renamed this task from "Exposed structured diffs in Wikibase API" to
"Expose structured diffs in Wikibase API".
TASK DETAIL: https://phabricator.wikimedia.org/T218779
Pintoch added a comment.
@Lydia_Pintscher Sure! I have just deployed a demonstration of this use case
on EditGroups <https://tools.wmflabs.org/editgroups/>.
The goal is to index all Wikidata edit groups by the properties that they
change (as statements or qualifiers, added or r
Pintoch added a comment.
Useful solution from Nikki: add in your common.css:
/* from [[User:Nikki/common.css]] */
.wb-preferred { background-color: lavender }
.wb-deprecated { background-color: mistyrose }
TASK DETAIL: https://phabricator.wikimedia.org/T206392
Pintoch added a comment.
I have updated the Wikibase data model docs, which incorrectly mentioned precisions of hours, minutes and seconds. I assume that they were there because they were part of an earlier design?
TASK DETAIL: https://phabricator.wikimedia.org/T57755
Pintoch added a comment.
Ok great! I'll move the field to the end and try to make Jenkins happy then.
TASK DETAIL: https://phabricator.wikimedia.org/T87283
Pintoch added a comment.
On a similar note, we also have some properties whose value is expected to be
the subject entity id:
- https://www.wikidata.org/wiki/Property:P6482
- https://www.wikidata.org/wiki/Property:P6413
TASK DETAIL: https://phabricator.wikimedia.org/T191963
Pintoch added a project: OpenRefine.
TASK DETAIL: https://phabricator.wikimedia.org/T197588
Pintoch added a project: OpenRefine.
TASK DETAIL: https://phabricator.wikimedia.org/T194767
Pintoch added a comment.
Thanks all for your patience for this! Excited to see my first commit making
it into Wikibase \o/
TASK DETAIL: https://phabricator.wikimedia.org/T87283
Pintoch added a comment.
That's great! Is there a list of the supported external ids?
TASK DETAIL: https://phabricator.wikimedia.org/T223776
Pintoch added a subscriber: Lydia_Pintscher.
Pintoch added a comment.
@Lydia_Pintscher we would need your thoughts about this.
In a nutshell, the proposal is to add the `lastrevid` field currently exposed
in `Special:EntityData` and in the API (`action=wbgetentities`) to the JSON
dumps
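A sketch of what a per-entity dump record could look like under this proposal. The surrounding fields are abbreviated and the revision id is illustrative; only the `lastrevid` key itself mirrors what `wbgetentities` and `Special:EntityData` already expose:

```python
# Entity JSON as found in today's dumps (abbreviated), extended with
# the lastrevid field that the live API already returns per entity.
entity_record = {
    "type": "item",
    "id": "Q42",
    "labels": {},   # abbreviated
    "claims": {},   # abbreviated
    "lastrevid": 123456789,  # illustrative revision id
}
```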
Pintoch added a comment.
This is nice! However, when visualizing properties by category, it seems that subclasses are not taken into account: only the properties bearing that exact category as P31 value are listed. This gives a pretty inaccurat
Pintoch added a comment.
Great point! I did not think about that in this way. It sounds like a very
sensible route to follow.
TASK DETAIL: https://phabricator.wikimedia.org/T203557
Pintoch added a comment.
I thought for a moment that there was an issue with the fact that at the
moment, filtering by tags only works for Special:RecentChanges (which only
contain the most recent changes, not all of them). But Lucas pointed out that
it is also supported
Pintoch added a comment.
Exciting developments! I am wondering if that `comment_data` could be (or
already is) exposed in any public API? Exposing such diffs would also solve
T106306 <https://phabricator.wikimedia.org/T106306> in the same go.
As an API consumer I would have a
Pintoch added a comment.
Thanks! I am working on making OpenRefine easier to host, but it's a long
term project indeed (exciting announcement about that soon).
TASK DETAIL: https://phabricator.wikimedia.org/T238003
Pintoch added a comment.
So this is what we get with an exponential back-off (1.5 factor), at the
moment:
22:37:27.148 [..baseapi.WbEditingAction] [maxlag] Waiting for all:
5.49167 seconds lagged. -- pausing for 1000 milliseconds. (19338ms)
22:37:28.729
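For clarity, the pause schedule in the log above follows this pattern. This is a sketch: the initial pause and retry count are assumptions, and only the 1.5 factor comes from the message above:

```python
def backoff_pauses(initial_ms=1000, factor=1.5, retries=5):
    """Successive pause durations (in ms) under exponential back-off."""
    pauses, pause = [], initial_ms
    for _ in range(retries):
        pauses.append(int(pause))
        pause *= factor
    return pauses
```

With these defaults the pauses grow as 1000, 1500, 2250, 3375 ms and so on, which matches the 1000 ms first pause shown in the log.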
Pintoch added a comment.
I am first getting in touch with people who seem to be running bots with
maxlag greater than 5 or no maxlag parameter at all, to see if they would
accept to follow @Addshore's advice never to use maxlag greater than 5 at all.
Pintoch added a comment.
I've had a quick look at the code to see if I could submit a patch for this
myself but it is not clear to me where the edits are done - I have looked in
petscan_rs and wikibase_rs to no avail. Petscan edits might be done in the
browser by sending them to some Widar
Pintoch added a comment.
@Theklan let's move your issue to a different ticket as your issue does not
seem to be related: T240436 <https://phabricator.wikimedia.org/T240436>
TASK DETAIL: https://phabricator.wikimedia.org/T197588
Pintoch added a comment.
@Bugreporter have you got details of where this behaviour is currently
implemented in PetScan? In particular, how do you request the current maxlag
with the MediaWiki API?
TASK DETAIL: https://phabricator.wikimedia.org/T240370
Pintoch added a comment.
If clients are able to retrieve the current lag periodically (through some
MediaWiki API call? which one?), then this should not require any server-side
change. Clients can continue to use `maxlag=5` but to also throttle themselves
using the smoothed function
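A minimal sketch of this continuous self-throttling idea (the function shape and tuning constants are assumptions, not an established policy). One existing endpoint that reports database replication lag is `action=query&meta=siteinfo&siprop=dbrepllag`, though it does not cover the WDQS lag discussed in this thread:

```python
def edit_delay(lag_seconds, base_delay=1.0, threshold=1.0):
    """Grow the delay between edits smoothly with the observed lag,
    instead of hard-stopping once the lag exceeds maxlag=5.

    base_delay and threshold are illustrative tuning knobs.
    """
    if lag_seconds <= threshold:
        return base_delay
    return base_delay * (lag_seconds / threshold)
```

Under this sketch a client editing once per second at low lag would slow to one edit every five seconds when the lag reaches 5 s, rather than stopping abruptly.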
Pintoch created this task.
Pintoch added a project: Wikidata.
Restricted Application added a subscriber: Aklapper.
TASK DESCRIPTION
With the introduction of the WDQS lag in Wikidata's maxlag computation
(T221774 <https://phabricator.wikimedia.org/T221774>), we are now seeing the
behav
Pintoch added a comment.
Thanks! I think dynamically changing the maxlag value is likely to still
introduce some thresholds, whereas a continuous slowdown (by retrieving the lag
and compute one's edit rate based on it) should in theory reach an equilibrium
point.
In the meantime
Pintoch added a comment.
Thanks for the analysis! Whether this is a breaking change or not is not my
concern: Petscan and other mass-editing tools based on Widar should play by the
book. I can provide a simple patch which ensures maxlag=5 is applied to all
Widar edits: if someone wants
Pintoch created this task.
Pintoch added a project: Wikidata.
Restricted Application added a subscriber: Aklapper.
TASK DESCRIPTION
To ensure that Wikidata bot edits slow down when lag is high, we should
ensure that bot operators follow the guidelines which recommend to use the
`maxlag
Pintoch created this task.
Pintoch added a project: Wikidata.
TASK DESCRIPTION
Author Disambiguator edits go through even if maxlag is higher than 5, we
should fix the code to ensure it complies with the maxlag policy on Wikimedia
wikis <https://www.mediawiki.org/wiki/Manual:Maxlag_parame
Pintoch created this task.
Pintoch added a project: Wikidata.
TASK DESCRIPTION
Petscan edits go through even if maxlag is higher than 5, we should fix the
code to ensure it complies with the maxlag policy on Wikimedia wikis
<https://www.mediawiki.org/wiki/Manual:Maxlag_parameter> by
Pintoch created this task.
Pintoch added a project: Wikidata.
TASK DESCRIPTION
Edoderoobot edits go through even if maxlag is higher than 5, we should fix
the code to ensure it complies with the maxlag policy on Wikimedia wikis
<https://www.mediawiki.org/wiki/Manual:Maxlag_parameter> by
Pintoch created this task.
Pintoch added a project: Wikidata.
TASK DESCRIPTION
LargeDatasetBot edits go through even if maxlag is higher than 5, we should
fix the code to ensure it complies with the maxlag policy on Wikimedia wikis
<https://www.mediawiki.org/wiki/Manual:Maxlag_parame