As a follow-up, it's worth talking about puppetization and how we're going
to accomplish it.

* Node.js itself should be installable via an apt package (we'll have to
build a custom package so that we get Node v0.10)
* Node dependencies will all be 'npm install'ed into a node_modules
submodule of the application's main repo, which we can then deploy with the
rest of the application code.
** It's worth noting that although this means we'll still be pulling our
dependencies from a separate source initially, whatever is currently in
production will be in our git repos. We can also version-lock dependencies
inside our configuration.
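
To make the plan above concrete, here is a rough sketch of what the Puppet
side could look like. The class name, paths, and repo URL are all
assumptions for illustration, not the actual Wikimedia puppet layout:

```puppet
# Sketch only: class name, paths, and source URL are placeholders.
class collection::renderer {
  # Node.js from our custom apt package; the 0.10 pin would be enforced
  # by the package version in our apt repo.
  package { 'nodejs':
    ensure => present,
  }

  # Application code -- node_modules is checked into the repo, so a plain
  # checkout brings the dependencies along with it.
  vcsrepo { '/srv/collection-renderer':
    ensure   => present,
    provider => git,
    source   => 'https://git.example.org/collection-renderer.git', # placeholder
    require  => Package['nodejs'],
  }
}
```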

~Matt Walker
Wikimedia Foundation
Fundraising Technology Team


On Wed, Nov 13, 2013 at 3:02 PM, Matthew Walker <[email protected]> wrote:

> Hey,
>
> For the new renderer backend for the Collections Extension we've come up
> with a tentative architecture that we would like operations' buy-in on. The
> living document is here [1]. It's worth saying explicitly that whatever
> setup we use must be able to handle the more than 150k requests a day we
> serve using the old setup.
>
> Basically we're looking at having
> * 'render servers' run node.js
> * doing job management in Redis
> * rendering content using PhantomJS and/or LaTeX
> * storing rendered files locally on the render servers (and streaming the
> rendered results through MediaWiki -- this is how it's done now as well).
> * having a garbage collector run routinely on the render servers to
> clean up old, stale content
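
As an inline aside on that last point: the cleanup pass could be as simple
as a cron'd find(1). The cache path and the 24-hour TTL below are
assumptions for illustration, not part of the proposal:

```shell
# Sketch of the garbage collection pass; path and TTL are assumptions.
CACHE_DIR="${CACHE_DIR:-/srv/collection-cache}"
if [ -d "$CACHE_DIR" ]; then
  # Remove rendered files untouched for more than 24 hours (1440 minutes).
  find "$CACHE_DIR" -type f -mmin +1440 -delete
fi
```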
>
> Post comments to the talk page please :)
>
> [1] https://www.mediawiki.org/wiki/PDF_rendering/Architecture
>
> ~Matt Walker
> Wikimedia Foundation
> Fundraising Technology Team
>
_______________________________________________
Wikitech-l mailing list
[email protected]
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
