https://bugzilla.wikimedia.org/show_bug.cgi?id=56590
--- Comment #15 from [email protected] ---

So far, we've been doing some basic integration testing on a single page (en:Barack_Obama) in a Jenkins job after each commit. While we could fully mock all API accesses, for large pages like en:Barack_Obama this would effectively require a capture and replay of every API access for the page: it is not just wikitext, but also templates, extensions, images, etc. -- some hundreds to thousands of API calls.

Scott does have some basic code for capturing these accesses, dumping them to a file, and using that file for replay (without relying on internet access). But that seems like extra complexity compared to just relying on internet access, if we really want full-page integration testing on a set of N pages (for small values of N, say 5) after each commit.

So, I think the real question is whether we should be doing full-page testing in a Jenkins/CI job after each commit or not. If yes, it seems simpler to rely on HTTP.

All that said, we do have a mock MW API server for testing PHP extensions (bug 45440); see parsoid/js/tests/mockAPI.js. Currently, we use it for running parserTests only.
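To illustrate the capture-and-replay idea discussed above, here is a minimal sketch in Node.js. The names (makeRecordingClient, makeReplayClient, the fake API) are hypothetical and purely illustrative; they are not Parsoid's actual code or mockAPI.js, just one way the record/replay split could be structured:

```javascript
// Hypothetical sketch of capture-and-replay for MW API accesses.
// Record mode wraps a real fetch function and remembers each response;
// replay mode answers purely from the captured cache, with no network.

function makeRecordingClient(realFetch, cache) {
  return async function (url) {
    const response = await realFetch(url); // hit the live API
    cache[url] = response;                 // capture for later replay
    return response;
  };
}

function makeReplayClient(cache) {
  return async function (url) {
    if (!(url in cache)) {
      throw new Error('No captured response for ' + url);
    }
    return cache[url]; // serve from the capture, no internet needed
  };
}

// Example: capture against a stand-in for the live MW API, then replay offline.
async function demo() {
  const fakeLiveApi = async (url) => ({ body: 'wikitext for ' + url });
  const cache = {};

  const recorder = makeRecordingClient(fakeLiveApi, cache);
  await recorder('action=parse&page=Barack_Obama');

  const replayer = makeReplayClient(cache);
  const replayed = await replayer('action=parse&page=Barack_Obama');
  return replayed.body;
}
```

In practice the cache would be serialized to a file after the recording run and loaded back for CI replays, which is roughly the shape of the extra machinery (and extra complexity) the comment weighs against simply using live HTTP.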
