First off, I must apologize for adding issues to the GitHub repo to have discussions when they should be posted here in the Google Group.
Anyway, I recently opened an issue to discuss the possibility of cloning JSHint tests to ESLint to improve our test coverage and further guarantee that we have parity with JSHint, which is a 0.1.0 milestone. The GitHub issue is here if you want to catch up: https://github.com/nzakas/eslint/issues/330

So, a bunch of good news. First, I emailed Anton, the author of JSHint, and he's totally cool with us pulling in as many tests as we want from JSHint to improve our test coverage and reach parity.

Second, last night I decided to investigate the JSHint source to understand how they have things architected and how I might go about bringing tests from there into ESLint in an orderly fashion. What I discovered is pretty awesome. Their test runner has some really good ideas that we can borrow and improve on. In fact, while I've only deeply investigated JSHint's parser tests so far, I think it may be possible to have a unified set of tests for both projects in a really, really DRY and straightforward approach that is more maintainable for both projects.

Basically, for both projects, almost all the tests are composed of a few pieces of data with little to no executable code (i.e. loops):

(1) A test name
(2) A test description or comment
(3) A fixture, usually 1 to 50 lines in length
(4) An options object used to configure the linter
(5) An array of error tuples, each with a line number and an error string

All the tests can then be converted into a recursive data structure like so:

    tests = [{
        name: "destructuring const as es5",
        description: "This test deals with blah blah blah blah. More info can be found in this GitHub issue and in Section foo of the ECMAScript standard. Blah blah blah",
        fixture: ["parser", "destructuring-const-as-es5"],
        tests: [
            {
                name: "Some sub test",
                description: "Description of subtest",
                options: { unused: true, undef: true },
                errors: [
                    [1, "'const' is only available in JavaScript 1.7."],
                    [1, "'destructuring expression' is only available in JavaScript 1.7."],
                    [6, "'const' is only available in JavaScript 1.7."]
                    // etc.
                ]
            },
            {
                name: "Some other sub test with different options but same fixture",
                description: "Description of subtest",
                options: { unused: false, undef: true },
                errors: [
                    // some different errors based on different options
                ]
            }
        ]
    }, {
        // another test suite
    }];

From what I can tell so far, the benefits of DRYing up all the tests for both projects completely outweigh the benefits of the few places where loops are used in the testing code. In the case of JSHint, they could continue to keep these tests in one big file like tests/unit/parser.js. For us, we could package these tests with each of the rules.

Where things get interesting (but this is totally premature optimization and would require agreement from both sides, Nicholas and Anton) would be having a super simple abstraction layer for interacting with the test data structures, allowing each project to inject the test runner and fixture loader of its choice. It's very functional, it's very easy to add tests directly from real examples, it eliminates the multiline-code-in-an-array ugliness, and I believe it can be made extremely fast since it is parallelizable (you can map/reduce the tests over all your cores if your test runner is capable of that).

Okay, so where am I with all this? I'm halfway done with a sed/awk bash script to extract the couple thousand lines of fixtures from test files like JSHint's tests/unit/parser.js. All tests are passing beautifully in JSHint, but I will have to go back and manually add some of the code fixture comments that were lost in my conversion/extraction process.
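To make the idea concrete, here's a minimal sketch (mine, not code from either project) of a runner that walks that data structure. The `lint` function and the fixture loader are stand-ins for whatever each project injects; every name below is hypothetical:

```javascript
// Toy stand-in linter: flags any line containing "const" when `undef` is set.
// A real runner would inject ESLint's or JSHint's verify function here.
function lint(source, options) {
    var errors = [];
    source.split("\n").forEach(function (text, i) {
        if (options.undef && text.indexOf("const") !== -1) {
            errors.push([i + 1, "'const' is only available in JavaScript 1.7."]);
        }
    });
    return errors;
}

// Walk every sub-test of every suite: load the shared fixture once,
// lint it under each options object, and compare against expected errors.
function runSuites(suites, loadFixture) {
    var results = [];
    suites.forEach(function (suite) {
        var source = loadFixture(suite.fixture);
        suite.tests.forEach(function (test) {
            var actual = lint(source, test.options);
            results.push({
                name: suite.name + " / " + test.name,
                passed: JSON.stringify(actual) === JSON.stringify(test.errors)
            });
        });
    });
    return results;
}

var suites = [{
    name: "destructuring const as es5",
    fixture: ["parser", "destructuring-const-as-es5"],
    tests: [{
        name: "const flagged",
        options: { undef: true },
        errors: [[1, "'const' is only available in JavaScript 1.7."]]
    }]
}];

// The fixture loader is injected, so each project can resolve the
// ["parser", "destructuring-const-as-es5"] path to its own fixtures dir.
var results = runSuites(suites, function () {
    return "const x = 1;"; // stand-in for fs.readFileSync on the fixture path
});
console.log(results);
```

Because the data is inert and each (suite, sub-test) pair is independent, a runner like this is trivially parallelizable: the outer forEach can become a map over worker processes.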
After I complete that, I will extract all the (linterOptions, errorArray) tuples into an array of test data structures.
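As a rough sketch of that second step (the tuple format and every name here are my assumptions, not JSHint's actual internals), folding extracted (linterOptions, errorArray) tuples into the recursive structure could look like:

```javascript
// Hypothetical helper: given a suite name, a fixture path, and an array of
// [options, errors] tuples extracted from the old test files, build one
// suite object in the shared data-structure format.
function buildSuite(name, fixturePath, tuples) {
    return {
        name: name,
        fixture: fixturePath,
        tests: tuples.map(function (tuple, i) {
            return {
                name: name + " variant " + (i + 1),
                options: tuple[0],
                errors: tuple[1]
            };
        })
    };
}

var suite = buildSuite(
    "destructuring const as es5",
    ["parser", "destructuring-const-as-es5"],
    [
        [{ unused: true, undef: true },
         [[1, "'const' is only available in JavaScript 1.7."]]],
        [{ unused: false, undef: true }, []]
    ]
);

console.log(suite.tests.length); // 2
```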
