brion removed a project: TimedMediaHandler.
brion added a comment.
Removing TimedMediaHandler as the patch landed last month.
TASK DETAIL
https://phabricator.wikimedia.org/T260735
EMAIL PREFERENCES
https://phabricator.wikimedia.org/settings/panel/emailpreferences/
To: brion
Cc: brion
brion added a comment.
PHP 5 is obsolete; use PHP 7.
cd /vagrant/mediawiki
sudo -u www-data php tests/phpunit/phpunit.php --wiki wiki
brion added a comment.
As for role ids -- perhaps we should primarily use the names, not the numbers, in the bit. It's analogous to a page's title (a primary identifier), not to its numeric IDs (which are provided informatively if you want to reproduce the database exactly, but can be freely discarded).
brion added a comment.
Ok, the proposed transitional schema looks like it imports cleanly via importDump (which uses the same code path as Special:Import). The proposed final schema, however, imports a revision with empty text (and throws a notice on Undefined index: text in /vagrant/mediawiki/includes
brion added a comment.
My concern with the two-step transition idea is that some consumers may not update on a reliable schedule, or may not be able to do so easily. For instance, if people are using Special:Export on one wiki and Special:Import'ing those pages on another that's *not* a Wikimedia
brion added a comment.
In T178047#4073991, @ArielGlenn wrote:
In T178047#4073899, @brion wrote:
Not sure offhand about the schema; Yahoo's old documentation seems to have vanished from the net. (Probably on the wayback machine but I can't find a URL reference)
We don't have a schema in our
brion added a comment.
Not sure offhand about the schema; Yahoo's old documentation seems to have vanished from the net. (Probably on the wayback machine but I can't find a URL reference)
Ideally, I think we'd want a way for the content handler to provide a text extract that can be used here
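A content-handler-supplied extract could look something like the following sketch (in Python for brevity; `get_text_extract` is a hypothetical hook name, not an existing MediaWiki ContentHandler method):

```python
import re

class WikitextExtractor:
    """Sketch of a hypothetical per-content-model text extract hook;
    get_text_extract is NOT an existing MediaWiki ContentHandler method."""

    def get_text_extract(self, wikitext: str, max_chars: int = 200) -> str:
        # Strip link markup, keeping the display text.
        text = re.sub(r"\[\[([^|\]]*\|)?([^\]]*)\]\]", r"\2", wikitext)
        # Strip bold/italic quote runs.
        text = re.sub(r"'{2,}", "", text)
        return text[:max_chars]

extract = WikitextExtractor().get_text_extract("'''Hello''' [[world|World]]!")
```

The point is only that each content model knows best how to reduce itself to plain text, so the extract logic belongs behind a per-model interface rather than in the dump code.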
brion added a comment.
Note there's some folks interested in IIIF/Wikimedia discussion from the IIIF end; some notes started recently on this doc: https://docs.google.com/document/d/1lqtwd1rwUIck6nmetxtmQkzuTpgnwOSPREhH9YEjmHI/edit
@SandraF_WMF I added a few notes on that doc about what I'm interested in
brion added a comment.
The idea's quite interesting but fell out of discussion some time ago. Shall we remove it from the TechCom RFC list, or is there a party interested in taking it back on?
brion added a comment.
@SandraF_WMF note I've been involved in the IIIF's A/V working group on extending the protocol to support audio and video, and have been at a few of the working group meetings for that. There's also a big IIIF working meeting in Toronto coming up in October; if there's
brion added a comment.
I wrote up some quick thoughts at https://www.mediawiki.org/wiki/User:Brion_VIBBER/MCR_alternative_thoughts
Mainly exploring along two lines:
what if we did a model with separate data tables for each new 'slot' instead of a common content-blob interface (possibly more
brion added a comment.
Quick update prior to today's RFC IRC discussion: we basically have two areas to discuss: 1) technical questions (data types, use as 'aggregation layer' for data from Wikidata), and 2) the hosting question (which wiki to put it on, or whether to give it its own).
brion added a comment.
In https://phabricator.wikimedia.org/T107595#2266131, @GWicke wrote:
> The use case for providing metadata is so that we can use stores like
> RESTBase, which already provide an API keyed on title, revision & render ID. It
> also already deals with the complexity
brion added a comment.
If I understand correctly, the case for passing more metadata to the blob store is as a hint for cross-blob data compression.
For this I think we mainly want to pass the identifier of a related blob: the blob with the data from the same slot in the previous revision.
brion added a comment.
> This assumes the BlobStore will actually talk to the (same) database. I
> would like to have Transaction separate from the DB stuff, so it can be used
> just as well with files, or Cassandra, or Swift, or whatever we come up with
> to store blobs. We shouldn't assume
brion added a comment.
In https://phabricator.wikimedia.org/T107595#2264334, @daniel wrote:
> We could (optionally?) provide a transaction context to the blob store like
> this:
I kinda like that, yeah. Maybe extend Database with a transactional interface
that takes a callback.
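That callback-taking transactional interface might look roughly like this (a Python/sqlite3 sketch; `do_atomic` is a hypothetical name, not the actual MediaWiki Database API):

```python
import sqlite3

class Database:
    """Sketch only: a stand-in for extending Database with a
    transactional interface that takes a callback."""

    def __init__(self):
        self.conn = sqlite3.connect(":memory:")
        self.conn.isolation_level = None  # manage transactions explicitly

    def do_atomic(self, callback):
        # Hypothetical method: run the callback inside one transaction,
        # committing on success and rolling back if it throws.
        self.conn.execute("BEGIN")
        try:
            result = callback(self.conn)
            self.conn.execute("COMMIT")
            return result
        except Exception:
            self.conn.execute("ROLLBACK")
            raise

db = Database()
db.conn.execute("CREATE TABLE revision (id INTEGER PRIMARY KEY, text TEXT)")

def insert_rev(conn):
    conn.execute("INSERT INTO revision (text) VALUES ('hello')")
    return conn.execute("SELECT COUNT(*) FROM revision").fetchone()[0]

count = db.do_atomic(insert_rev)
```

The attraction is that the caller never touches begin/commit/rollback directly, so the same callback could in principle run against a DB-backed store or some other transactional backend.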
brion added a comment.
(if RevisionBuilder takes a $dbw param via constructor/factory, then the
question of the connection is easier)
brion added a comment.
> The above code would replace much of what is in the Revision class now, in
> particular insertOn(). We can keep Revision around, but I'm not sure we can
> provide b/c for insertOn().
b/c here looks relatively straightforward to me; it creates a new revision
brion added a comment.
re this:
$bs->deleteBlob( $dataUrl ); // dk: this goes wrong if the URL is content/hash based!
I think the return from this:
$dataUrl = $bs->saveBlob( $content->serialize() );
needs to signal whether a blob was created or whether an existing one was reused.
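A sketch of what that signaling could look like for a content-addressed store (hypothetical API in Python for illustration, not MediaWiki's real BlobStore): saveBlob returns both the URL and a created flag, so the caller knows a delete may affect other revisions sharing the blob.

```python
import hashlib

class HashBlobStore:
    """Hypothetical content-addressed blob store illustrating the
    created-vs-reused signal; not MediaWiki's actual BlobStore API."""

    def __init__(self):
        self._blobs = {}

    def save_blob(self, data: bytes):
        url = "hash:" + hashlib.sha256(data).hexdigest()
        created = url not in self._blobs
        self._blobs[url] = data
        # Caller should only delete when created is True (or when a
        # refcount says nothing else shares this URL).
        return url, created

    def load_blob(self, url: str) -> bytes:
        return self._blobs[url]

bs = HashBlobStore()
url1, created1 = bs.save_blob(b"serialized content")
url2, created2 = bs.save_blob(b"serialized content")  # dedupes to same URL
```

With hash-based URLs, two identical serializations yield one stored blob, which is exactly why an unconditional deleteBlob on rollback "goes wrong".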
brion added a comment.
Regarding transactional nature:
Assuming the backing blob storage continues to work on the model of the
current `text` table blobs with external storage backing, the "easy way" is to
allow extra backend blobs to leak in case of transaction rollback, and
brion added a comment.
Ok in that case... I will trust nothing ;)
brion added a comment.
Aaa and now I see the bits in gerrit. I'll review all this tomorrow when I'm
a little bit rested. Hehehe
brion added a comment.
Ah great, that was mostly written before your post. ;) sounding good so
far... Do you have code fleshed out enough to share or should we take that
class structure and write fresh?
brion added a comment.
@daniel I'd like to help write up updated RfC text on MediaWiki.org for this,
as it's a thing a number of other potential work areas will depend on. The Editing
team is interested in putting more metadata in
(https://phabricator.wikimedia.org/T132072) which means James
brion added a comment.
I'm a fan of "inheritMetadata" :)
brion added a comment.
Side note -- the referenced source data: page should maybe get recorded as a
template link in the link tables? Or a file link at least. Some kind of
reference. :)
brion added a comment.
@yurik I like that -- maybe generalize it as a metadata inheritance model:
anything not filled out in the local JSON is taken from the referenced .tabular
item.
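The inheritance idea in miniature (illustrative Python; the field names are made up, not the actual tabular-data schema):

```python
def inherit_metadata(local: dict, referenced: dict) -> dict:
    # Hypothetical helper: fields the local JSON leaves empty (None here)
    # fall through to the referenced .tabular item's metadata.
    merged = dict(referenced)
    for key, value in local.items():
        if value is not None:
            merged[key] = value
    return merged

referenced = {"license": "CC0-1.0", "description": "Source table", "schema": "v1"}
local = {"description": "Local derived table", "license": None}
merged = inherit_metadata(local, referenced)
```

Local overrides win where present; everything else comes from the referenced item, so the local page only has to state what actually differs.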
brion added a comment.
@yurik: The W3C CSV on the Web working group's metadata model recommendation
refers to "columns" with attributes for "name" and "titles" (plural, allowing
alternates or per-language variants), with similar recommended character
restrictions
brion added a comment.
Re headers -- yeah, we need to distinguish between header labels (i18n-able text)
and column ids (identifiers for programs). As long as the capability is there I
don't mind the terms used; sounds like you're already working on that :)
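The distinction, following the CSVW column model (an illustrative Python sketch; keying `titles` by language code is one of the forms the spec allows):

```python
from dataclasses import dataclass, field

@dataclass
class Column:
    # 'name' is the machine identifier (no spaces); 'titles' holds the
    # i18n-able header labels, keyed here by language code.
    name: str
    titles: dict = field(default_factory=dict)

    def title(self, lang: str, fallback: str = "en") -> str:
        return self.titles.get(lang) or self.titles.get(fallback) or self.name

col = Column(name="population", titles={"en": "Population", "de": "Bevölkerung"})
```

Programs and queries key on `name`; renderers pick the best `titles` entry for the reader's language and fall back gracefully.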
brion added a comment.
Side note: headers are rejected if they contain spaces. That seems odd?
brion added a comment.
In https://phabricator.wikimedia.org/T120452#2227240, @matmarex wrote:
> As I understand it, these are stored as regular MediaWiki pages now, so they
> have a maximum length of 2 MB. Even naive queries pulling the whole thing into
> memory would be fast enough
brion added a comment.
Pulling individual data items out of large lists; pulling relevant columns in
order to sum them; pulling or updating a small number of cells during editing;
subsetting a large data set to graph the subset; subsetting a large data set
to perform operations on it in a
brion added a comment.
Couple quick notes:
- pretty cool. :)
- I worry about efficiency of storage and queries; for small tables JSON
blobs are fine but for large data sets this'll get extremely verbose, and
loading/saving small updates to a large table will get very slow. Consider
brion added a subscriber: brion.
brion added a subscriber: brion.
brion added a comment.
But yeah that code'll have to run on both. Feel free to ping me for review etc
brion added a comment.
If http://caniuse.com/#feat=srcset is to be believed, the main problem browser
on mobile is going to be Safari on iOS 8, since it supports native srcset but
only for density switching, not for the size specifications. So if you use a
customized polyfill, you can't rely
brion added a comment.
Well, desktop isn't as big a worry since a high-res screen usually means a large
screen (and you'll usually max out at 2x, rather than the 3x that some phones
like the iPhone 6 Plus reach)... :) So it should be safe there to load the 1x
and then polyfill-load the native-density
brion added a comment.
So if I'm reading the above comments correctly, these banners want to use the
width-capping descriptors ('640w') in img srcset, but Safari and our JS
polyfill only support switching on the device pixel ratio ('2x').
I suppose you might be able to rig up a fancier
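The gap is easy to see in the selection logic a density-only polyfill performs (sketched in Python for illustration; a real polyfill would be JS): width descriptors simply have nothing to match against the device pixel ratio.

```python
def pick_density_candidate(srcset: str, device_pixel_ratio: float) -> str:
    # Parse only density descriptors ('1x', '2x'); width descriptors
    # ('640w') are skipped, which is exactly the limitation described above.
    candidates = []
    for entry in srcset.split(","):
        parts = entry.split()
        url = parts[0]
        descriptor = parts[1] if len(parts) > 1 else "1x"
        if descriptor.endswith("x"):
            candidates.append((float(descriptor[:-1]), url))
    candidates.sort()
    for density, url in candidates:
        if density >= device_pixel_ratio:
            return url
    return candidates[-1][1] if candidates else ""

chosen = pick_density_candidate("a.jpg 1x, a@2x.jpg 2x, a@3x.jpg 3x", 2.0)
```

A srcset containing only 'w' descriptors yields no candidates at all under this logic, so such banners would fall back to the plain src on those browsers.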
brion added a comment.
(If the redirect gets cached that'd still have the roundtrip, though the first
request would be cheaper once the thumbnail's created as we'd avoid hitting PHP
app servers.)
brion added a comment.
Hmm, well if thumb.php redirected you'd have an extra round-trip plus the
overhead of hitting the PHP app servers in the first place... might not be
ideal either. :(
brion added a subscriber: brion.
brion added a comment.
Note that the thumbnail size will need to be selectable through some input
variable if you stuff it into one API req like this -- suitable size will
depend on the device and how the client software chooses to show thumbs.
(Note another
brion added a subscriber: brion.