Hi! This is a quick heads-up about the status of HHVM migration, and what
the MediaWiki Core team is working on.

There are three challenges that we have to solve before we can run HHVM in
production:

* We need good packages. The packages provided by Facebook have some deep
issues that need to be fixed before they meet our packaging standards.
 This is a good opportunity to recognize Faidon's leadership on this front:
he has been liaising with Facebook and Debian, working to resolve the
outstanding issues. Thanks, Faidon!
* We need to port a number of C extensions written for the Zend PHP
interpreter to HHVM. The most complex by far is LuaSandbox. Tim has been
working on that.
In the process, he has made substantial improvements to the Zend extension
compatibility layer provided by HHVM, which we are waiting to have merged
upstream: <https://github.com/facebook/hhvm/pull/1986>.  Once they are
merged, they will be in the queue for the next release.  Releases are cut
every eight weeks.
* I also want to recognize Max Semenik, who stepped up to port Wikidiff2,
producing a patch in short order.
* We need to adapt our app server configuration for HHVM. This includes
configuring HHVM itself as well as reconfiguring Apache to act as a fastcgi
reverse-proxy.
* We need to amend our deployment process so that it implements additional
requirements for HHVM.  Specifically, we will need to add a build step to
produce a bytecode archive in advance of deployment. We are not working on
that piece yet, but I think that Bryan's work on scap is going to make this
a lot easier to implement once we do tackle it.
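To make the fastcgi reverse-proxy piece concrete, here is a minimal sketch
of what the Apache side could look like, assuming Apache 2.4 with
mod_proxy_fcgi and HHVM listening on its default FastCGI port; the paths,
port, and server name are illustrative assumptions, not our actual
configuration:

```apache
# Hypothetical sketch: Apache terminates HTTP and hands PHP requests
# to HHVM over FastCGI. Paths and hostnames are placeholders.
<VirtualHost *:80>
    ServerName wiki.example.org
    DocumentRoot /srv/mediawiki

    # Proxy every .php request to HHVM's FastCGI listener.
    ProxyPassMatch ^/(.+\.php)$ fcgi://127.0.0.1:9000/srv/mediawiki/$1
</VirtualHost>
```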

What we've done so far is to use Facebook's packages in Labs and in
MediaWiki-Vagrant, configured Jenkins to run the unit tests under HHVM
(Antoine), and configured a Jenkins job to build HHVM from source hourly so
we can test patches (Chad). Aaron and I reasoned our way out of having to
port the igbinary extension, and Aaron is now working on porting
FastStringSearch. Along the way, we have been running into small
compatibility nits which we have fixed either by changing core's behavior
to be cross-compatible or by filing bugs and submitting patches upstream.

As you can see, there are some hard blockers that stand between us and HHVM
in production, and the biggest ones are not entirely in our hands (i.e.,
they depend on upstream merging patches and fixing packages). At the same
time, there is a lot of useful work left to do that can continue without
being blocked by these things. For that reason, the MediaWiki Core team is
currently targeting the Beta cluster for HHVM work.

Our target for the current sprint is to enable Apache to route requests to
either the Zend interpreter or HHVM based on the presence of a magic
cookie. By default, visitors to the beta cluster will be served pages
generated by the Zend interpreter, but when the cookie is set, Apache will
serve MediaWiki using HHVM instead.  This is an idea we got from Niklas,
who has implemented something very similar for
<http://dev.translatewiki.net/>.  Doing this would allow the beta cluster
to continue to be faithful to production and thus continue to be a good
target for testing, while at the same time providing a way for people
working on HHVM specifically to test ported extensions and to identify and
fix integration points in a production-like environment. It also gives us a
way of making our progress visible to you.
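As a rough illustration of the cookie-based switch, something along these
lines could be done with mod_rewrite's proxy flag; the cookie name, port,
and paths here are assumptions for the sketch, not the actual beta cluster
configuration:

```apache
# Hypothetical sketch: requests carrying an "hhvm=true" cookie are
# proxied to HHVM over FastCGI; all other requests fall through to the
# default handler backed by the Zend interpreter.
RewriteEngine On
RewriteCond %{HTTP_COOKIE} (^|;\s*)hhvm=true
RewriteRule ^/w/(.+\.php)$ fcgi://127.0.0.1:9000/srv/mediawiki/$1 [P]
```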

We have benchmarked different workloads on different hardware and have
found the performance of HHVM to be impressively better than the Zend
interpreter in most cases, but we don't yet have numbers to share that
project the impact on users, because we don't have the means of simulating
the load patterns of production, and because some parts of the stack are
still in the process of being ported. We expect that having the option of
running HHVM on the Beta cluster with the complete set of extensions that
Wikimedia uses will make it possible for us to project how it will perform
in production. In the meantime, we are optimistic, given what we've
observed and given
the spate of independent evaluations of HHVM from different corners of the
PHP community.

We are using Bugzilla to track our progress. You can search for bugs with
the 'hiphop' keyword, or simply head to <https://www.mediawiki.org/wiki/HHVM>,
which aggregates the most recently touched items via RSS. If you'd like to
get involved, pick an open bug, or get in touch via the lists or IRC.

Regards, Core Platform.
_______________________________________________
Wikitech-l mailing list
[email protected]
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
