gerritbot added a comment.
Change 388287 merged by jenkins-bot:
[wikidata/query/rdf@master] Add option to fetch ordinal of the result in MWAPI query:
https://gerrit.wikimedia.org/r/388287
TASK DETAIL
https://phabricator.wikimedia.org/T177275
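For reference, a minimal sketch of how the new option can be used from WDQS (my illustration based on the MWAPI service documentation; the endpoint and search term are only examples):

  SELECT ?title ?ordinal WHERE {
    SERVICE wikibase:mwapi {
      bd:serviceParam wikibase:api "Search" ;
                      wikibase:endpoint "en.wikipedia.org" ;
                      mwapi:srsearch "Norway" .
      ?title   wikibase:apiOutput  mwapi:title .
      ?ordinal wikibase:apiOrdinal true .   # ordinal (position) of the result
    }
  }
  ORDER BY ?ordinal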
Not sure if I would go for it, but…
"Precision for the location of the center should be one percent of the
square root of the area covered."
Oslo covers nearly 1000 km²; that would give 1 % of about 32 km, i.e. roughly 300
meters, or about 10 arc seconds.
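As a quick check of that rule of thumb (my arithmetic, with Oslo's area rounded to 1000 km²):

  p \approx 0.01\,\sqrt{A},\qquad
  p_{\mathrm{Oslo}} \approx 0.01\,\sqrt{1000~\mathrm{km}^2}
                    \approx 0.01 \times 31.6~\mathrm{km}
                    \approx 320~\mathrm{m}

At roughly 30.9 m per arc second of latitude, 320 m corresponds to about 10 arc seconds, i.e. a coordinate precision of roughly 0.003 degrees.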
On Mon, Nov 6, 2017 at 2:50 AM, John Erling Blad wrote:
Glittertinden, a mountain in Norway, has a geopos of 61.651222 N 8.557492 E,
alternate geopos 6835406.62, 476558.22 (EU89, UTM32).
Some of the mountains are measured to within a millimeter in elevation. For
example, Ørneflag is measured to be at 1242.808 meters, at position
6705530.826, 537607.272.
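As an aside, the precision itself is stored per statement in the Wikidata RDF model; a sketch of reading it back (the QID below is a placeholder, not the actual item for Glittertinden):

  SELECT ?lat ?lon ?precision WHERE {
    wd:Q1234567 p:P625 ?statement .           # placeholder QID; P625 = coordinate location
    ?statement psv:P625 ?value .
    ?value wikibase:geoLatitude  ?lat ;
           wikibase:geoLongitude ?lon ;
           wikibase:geoPrecision ?precision .  # precision, in degrees
  }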
hoo added a comment.
Note: I just also found T179793: Consider dropping the "wb_items_per_site.wb_ips_site_page" index while looking at this… maybe this can be done at once?!
TASK DETAIL
https://phabricator.wikimedia.org/T114904
hoo created this task.
hoo added projects: Wikidata, MediaWiki-extensions-WikibaseRepository, DBA.
Herald added a subscriber: Aklapper.
TASK DESCRIPTION
From db1070:
KEY `wb_ips_site_page` (`ips_site_page`),
This is useful for queries where we want to find a given linked page by title (like
hoo added a comment.
Given the size of the table, changing this shouldn't be overly horrible. It's a fair bit of migration work… but I assume doing this for maintenance queries and consistency is worth it.
TASK DETAIL
https://phabricator.wikimedia.org/T114904
ChristianKl created this task.
ChristianKl added a project: Wikidata.
Herald added a subscriber: Aklapper.
TASK DESCRIPTION
One of Wikidata's core problems at the moment is that it doesn't have enough review of edits, and thus vandalism often takes a while to get reverted.
This is partly
hoo added a comment.
In T170779#3734809, @Smalyshev wrote:
@Snaterlicious, @hoo, @thiemowmde Do you know why the check is there and what it is meant to be doing? @tstarling raised the following concern:
The search term is normalized by the server using $wgContLang->normalize(), which potentially
2017-11-05 19:13 GMT+01:00 Marco Neumann:
> Andrew,
>
> what would be your first choice for conflict resolution here? Write an
> entry on the relevant item's discussion page? Or go for a Requests for
> comment on the Community portal? Or contact the claim author
>
GerardM,
I don't mind the Wikidata mess; it's part of the open data ecosystem it
tries to embrace, and it will actually allow the project to grow along
some interesting real-world data challenges.
By the way, we could use the ranking feature to exclude the statement from
some of the queries. So a flag for
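A sketch of what using ranks for that could look like (illustration only; the truthy wdt: predicates already serve best-rank statements, and ranks can also be filtered explicitly):

  SELECT ?person ?country WHERE {
    ?person p:P27 ?st .                       # P27 = country of citizenship
    ?st ps:P27 ?country ;
        wikibase:rank ?rank .
    FILTER(?rank != wikibase:DeprecatedRank)  # skip statements marked deprecated
  }
  LIMIT 100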
Advice on the general case: a lot of country of citizenship properties,
especially ones created early, were a bit slapdash, and there are a lot of
edge cases which get smoothed over. So don't assume they're gospel truth
(or necessarily worth "disputing" rather than correcting) to begin with just
Hoi,
No Sjoerd, the primary post is about issues, how to fix them, and how to
signal them.
When it is about nationality, it is an old story and as far as I am
concerned countries have a start date and often an end date. Prior to this
start date the country does not exist. During the epoch of the
You are not answering the question (not the first time). P27 has a lot of
problems, due to people using it for citizenship and nationality at the same
time. Put this in combination with the "fantastic" Wikipedia category system!
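For concreteness, the temporal side can at least be expressed on the statements themselves; a sketch (illustration only) of pulling P27 together with its start and end time qualifiers:

  SELECT ?person ?country ?start ?end WHERE {
    ?person p:P27 ?st .
    ?st ps:P27 ?country .
    OPTIONAL { ?st pq:P580 ?start . }   # P580 = start time qualifier
    OPTIONAL { ?st pq:P582 ?end . }     # P582 = end time qualifier
  }
  LIMIT 100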
AFAIK there is still some discussion going on here:
ChristianKl added a comment.
Quora links to us with do-follow and we link to them with do-follow. In this case, I don't see a problem.
TASK DETAIL
https://phabricator.wikimedia.org/T175230
Addshore added a comment.
@Smalyshev we discussed dumping the JNL files used by Blazegraph directly at points during WikidataCon.
I'm aware that isn't an HDT dump, but I'm wondering if this would help in any way.
TASK DETAIL
https://phabricator.wikimedia.org/T179681
thalhamm removed thalhamm as the assignee of this task.
TASK DETAIL
https://phabricator.wikimedia.org/T143424
Nemo_bis added a comment.
This is still happening.
TASK DETAIL
https://phabricator.wikimedia.org/T175230
Arkanosis added a comment.
FWIW, I've just tried to convert the ttl dump of the 1st of November 2017 on a machine with 378 GiB of RAM and 0 GiB of swap and… well… it failed with std::bad_alloc after more than 21 hours of runtime. Granted, there was another process eating ~100 GiB of memory, but I
Hoi,
There is much more to this. When a publication has been denounced, when the
author is denounced for having had it ghost-written, or when the ghost-writing
is not reflected because of the stigma involved: we should forcefully flag
publications, findings and authors when there is a problem. A query
What's the current procedure for disputing a non-trivial claim on a
Wikidata item?
I know I can just go ahead and change a claim (statement and/or its
value), but the dispute itself would only be captured in the change log
of the respective Wikidata instance.
Would one create a discussion entry