On 2016-07-31 10:53 AM, Takashi OTA wrote:
When you want to check increase/decrease of linked domains in chronological
order through edit history
This is actually a harder problem than it seems, even at first glance:
if you want to examine the links over time then, when you are looking at
On 2016-08-01 12:21 PM, Gergo Tisza wrote:
the parser has changed over time and old templates
might not work anymore
Aaah. Good point. Also, the changes in extensions (or, indeed, what
extensions are installed at all) might break attempts to parse the past,
as it were.
You know, this is
On Mon, Aug 1, 2016 at 7:46 AM, Marc-Andre wrote:
> Clearly, all the data to do so is there in the database - and I seem to
> recall that there exists an extension that will allow you to use the parser
> in that way - but the Foundation projects do not have such an extension
>
On 08/01/2016 11:37 AM, Marc-Andre wrote:
...
Is there something we can do to make the passage of years hurt less?
Should we be laying groundwork now to prevent issues decades away?
One possibility is considering storing rendered HTML for old revisions.
It lets wikitext (and hence parser)
"Should we be laying groundwork now to prevent issues decades away?" I'll
answer that with "Yes". I could provide some interesting stories about
technological and budgetary headaches that result from repeatedly delaying
efforts to make legacy software forwards-compatible. The technical
details
Hi,
On 07/31/2016 07:53 AM, Takashi OTA wrote:
> Such links are stored in externallinks.sql.gz, in an expanded form.
>
> When you want to check increase/decrease of linked domains in chronological
> order through edit history, you have to check pages-meta-history1.xml etc.
> In such a case,
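The dump-scanning workflow Takashi describes (walking a pages-meta-history XML file and tallying linked domains per revision) can be sketched in a few lines of Python. This is a hedged illustration, not code from the thread: the element names follow the MediaWiki XML export format, `domains_per_revision` and the sample are hypothetical, and a real multi-gigabyte dump would be streamed with `iterparse` rather than loaded whole.

```python
import re
import xml.etree.ElementTree as ET
from collections import Counter
from urllib.parse import urlparse

# Matches bare http(s) URLs as they appear in wikitext external links.
URL_RE = re.compile(r"https?://[^\s\]|<>\"]+")

def domains_per_revision(xml_text):
    """Yield (timestamp, Counter of linked domains) for each <revision>
    element in a MediaWiki XML export."""
    root = ET.fromstring(xml_text)
    # Real dumps are namespaced; strip namespaces for brevity in this sketch.
    for elem in root.iter():
        elem.tag = elem.tag.split('}')[-1]
    for rev in root.iter("revision"):
        ts = rev.findtext("timestamp", default="")
        text = rev.findtext("text", default="") or ""
        counts = Counter(urlparse(u).netloc for u in URL_RE.findall(text))
        yield ts, counts

# Tiny inline sample standing in for pages-meta-history1.xml:
sample = """<mediawiki><page><title>T</title>
<revision><timestamp>2016-01-01T00:00:00Z</timestamp>
<text>See [http://example.org/a] and http://example.org/b</text></revision>
<revision><timestamp>2016-02-01T00:00:00Z</timestamp>
<text>Now only https://example.com/x</text></revision>
</page></mediawiki>"""

history = list(domains_per_revision(sample))
```

Comparing successive Counters then gives the increase/decrease of each domain over the edit history, without any parsing of the wikitext beyond URL extraction.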
On Mon, Aug 1, 2016 at 9:51 AM, Subramanya Sastry wrote:
> On 08/01/2016 11:37 AM, Marc-Andre wrote:
>> Is there something we can do to make the passage of years hurt less?
>> Should we be laying groundwork now to prevent issues decades away?
>
>
> One possibility is
On Mon, Aug 1, 2016 at 11:47 AM, Rob Lanphier wrote:
> > HTML storage comes with its own can of worms, but it seems like a solution
> > worth thinking about in some form.
> >
> > 1. storage costs (fully rendered HTML would be 5-10 times bigger than
> > wikitext for that
On Mon, Aug 1, 2016 at 12:19 PM, Gergo Tisza wrote:
> Specifying wikitext-html conversion sounds like a MediaWiki 2.0 type of
> project (ie. wouldn't expect it to happen in this decade), and even then it
> would not fully solve the problem[...]
You seem to be suggesting
On Mon, Aug 1, 2016 at 1:56 PM, Gergo Tisza wrote:
> On Mon, Aug 1, 2016 at 1:01 PM, Rob Lanphier wrote:
>> On Mon, Aug 1, 2016 at 12:19 PM, Gergo Tisza wrote:
>> > Specifying wikitext-html conversion sounds like a MediaWiki 2.0
On 1 August 2016 at 17:37, Marc-Andre wrote:
> We need to find a long-term view to a solution. I don't mean just keeping
> old versions of the software around - that would be of limited help. It'd
> be an interesting nightmare to try to run early versions of phase3 nowadays,
> One possibility is considering storing rendered HTML for old revisions. It
> lets wikitext (and hence parser) evolve without breaking old revisions. Plus
> rendered HTML will use the template revision at the time it was rendered vs.
> the latest revision (this is the problem Memento tries to
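A toy sketch of that idea: freeze each revision's HTML at save time, next to the wikitext, so old revisions stop depending on how a future parser or a later template edit would behave. The class and field names here are hypothetical, not MediaWiki's actual schema.

```python
from datetime import datetime, timezone

class RevisionArchive:
    """Toy store that archives each revision's HTML as rendered at save
    time, so later parser changes cannot alter how it displays."""
    def __init__(self, render):
        self.render = render      # current wikitext -> HTML function
        self.revisions = []       # newest last

    def save(self, wikitext):
        self.revisions.append({
            "wikitext": wikitext,                  # still kept for editing
            "html": self.render(wikitext),         # frozen at save time
            "saved_at": datetime.now(timezone.utc).isoformat(),
        })

    def view(self, rev_id):
        # Serve the archived HTML directly; no re-parse involved.
        return self.revisions[rev_id]["html"]

archive = RevisionArchive(render=lambda wt: "<p>" + wt + "</p>")
archive.save("hello")
archive.render = lambda wt: "<div>" + wt + "</div>"  # the "parser" changed
archive.save("hello")
```

The first revision keeps rendering as it did in 2016 even after the renderer changes, which is exactly the property wanted for decade-old revisions; the cost is the storage overhead raised elsewhere in the thread.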
I am not familiar with databases. I have old MySQL-based wiki sites
I cannot access anymore due to a change in PHP and MySQL versions. I
have old XML dumps. Is it possible to reload them into SQLite-backed
wikis? These were working group wikis: we are only interested in
restoring texts. We have
On Mon, Aug 1, 2016 at 1:01 PM, Rob Lanphier wrote:
> On Mon, Aug 1, 2016 at 12:19 PM, Gergo Tisza wrote:
> > Specifying wikitext-html conversion sounds like a MediaWiki 2.0 type of
> > project (ie. wouldn't expect it to happen in this decade), and
For mass imports, use importDump.php - see <
http://www.mediawiki.org/wiki/Manual:Importing_XML_dumps> for details.
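In practice that looks something like the following (paths and filenames are illustrative; the import goes through MediaWiki's database abstraction, so it works against a SQLite-backed wiki as well as MySQL, depending on $wgDBtype in LocalSettings.php):

```shell
# Run from the root of a working MediaWiki installation.
cd /var/www/mediawiki          # illustrative path

# Import every page and revision from the old XML dump.
php maintenance/importDump.php --conf LocalSettings.php dump.xml

# The manual recommends rebuilding derived tables afterwards.
php maintenance/rebuildrecentchanges.php
```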
On Mon, Aug 1, 2016 at 8:37 PM, Jefsey wrote:
> I am not familiar with databases. I have old MySQL based wikis sites I
> cannot access anymore due to a change
There is a slow moving discussion about this at
https://www.mediawiki.org/wiki/Talk:Requests_for_comment/Markdown
The bigger risk is that the rest of the world settles on using
CommonMark Markdown once it is properly specified. In the short term,
that will mean MediaWiki will need to support
TL;DR: You get to a spec by paying down technical debt that untangles
wikitext parsing from being intricately tied to the internals of the
MediaWiki implementation and its state.
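One concrete reading of that untangling is to make parsing a pure function of the wikitext plus an explicit, recordable context, rather than something that reaches into live wiki state mid-parse. A minimal sketch under that assumption (all names here are hypothetical, not MediaWiki's API):

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class ParseContext:
    """Everything the parser is allowed to know, passed in explicitly:
    pinned template sources, site config, etc. Recording this next to a
    revision makes the parse reproducible years later."""
    templates: dict = field(default_factory=dict)  # name -> wikitext, pinned
    site_name: str = "ExampleWiki"

def parse(wikitext, ctx):
    """Toy "parser": expands {{name}} from the pinned context only.
    Pure function: same (wikitext, ctx) in, same HTML out."""
    out = wikitext
    for name, body in ctx.templates.items():
        out = out.replace("{{" + name + "}}", body)
    return "<p>" + out + "</p>"

ctx_2016 = ParseContext(templates={"greet": "Hello"})
html = parse("{{greet}}, world", ctx_2016)
```

Because nothing outside `ctx` influences the output, the conversion becomes something you can specify and reimplement, which is the property the spec discussion is ultimately after.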
In discussions, there is far too much focus on the fact that you cannot
write a BNF grammar or yacc / lex / bison /
On Tue, Aug 2, 2016 at 8:34 AM, Gergo Tisza wrote:
> On Mon, Aug 1, 2016 at 5:27 PM, Rob Lanphier wrote:
>
>> Do you believe that declaring "the implementation is the spec" is a
>> sustainable way of encouraging contribution to our projects?
On Mon, Aug 1, 2016 at 5:27 PM, Rob Lanphier wrote:
> Do you believe that declaring "the implementation is the spec" is a
> sustainable way of encouraging contribution to our projects?
Reimplementing Wikipedia's parser (complete with template inclusions,
Wikidata fetches,