GitHub user FavioVazquez commented on the pull request:

    https://github.com/apache/spark/pull/5786#issuecomment-97618915
  
    In the Jenkins build at 
https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/31337/ there 
is no test failure reported, but the console output shows this error:
    
    FAIL: test_count_by_value_and_window (__main__.WindowFunctionTests)
    ----------------------------------------------------------------------
    Traceback (most recent call last):
      File "pyspark/streaming/tests.py", line 418, in 
test_count_by_value_and_window
        self._test_func(input, func, expected)
      File "pyspark/streaming/tests.py", line 133, in _test_func
        self.assertEqual(expected, result)
    AssertionError: Lists differ: [[1], [2], [3], [4], [5], [6], [6], [6], [6], [6]] != [[1], [2], [3], [4], [5], [6], [6], [6], [6]]
    
    First list contains 1 additional elements.
    First extra element 9:
    [6]
    
    - [[1], [2], [3], [4], [5], [6], [6], [6], [6], [6]]
    ?                                          -----
    
    + [[1], [2], [3], [4], [5], [6], [6], [6], [6]]
    
    ----------------------------------------------------------------------
    Ran 40 tests in 134.429s
    
    FAILED (failures=1)
    ('timeout after', 20)
    ('timeout after', 20)
    ('timeout after', 20)
    ('timeout after', 5)
    Had test failures; see logs.
    [error] Got a return code of 255 on line 240 of the run-tests script.
    Archiving unit tests logs...
    
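    As a sanity check on how to read that diff: a standalone snippet like the 
one below (the class and test names are made up, this is not the Spark test 
harness) reproduces the same unittest output. Since the first argument to 
assertEqual is the expected value, the "1 additional elements" message means 
the streaming result came back one window short of what the test expects.
    
    import unittest
    
    class DiffDemo(unittest.TestCase):
        def test_lists_differ(self):
            # Same values as in the failure above: expected has ten window
            # counts, the collected result only nine.
            expected = [[1], [2], [3], [4], [5], [6], [6], [6], [6], [6]]
            result = [[1], [2], [3], [4], [5], [6], [6], [6], [6]]
            self.assertEqual(expected, result)
    
    if __name__ == "__main__":
        unittest.main()
    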
    So I'm not sure what happened.
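    
    For reference, the operation under test looks roughly like the sketch 
below: a minimal standalone pipeline, assuming a local master, a 1-second 
batch interval, and made-up window/slide durations and input batches rather 
than the values hard-coded in pyspark/streaming/tests.py.
    
    import tempfile
    
    from pyspark import SparkContext
    from pyspark.streaming import StreamingContext
    
    sc = SparkContext("local[2]", "countByValueAndWindow-sketch")
    ssc = StreamingContext(sc, 1)  # 1-second batches
    # Windowed counts keep state across batches, so set a checkpoint directory.
    ssc.checkpoint(tempfile.mkdtemp())
    
    # Six input batches ([0], [0, 1], ..., [0..5]) replayed through a queue stream.
    batches = [sc.parallelize(range(i + 1)) for i in range(6)]
    stream = ssc.queueStream(batches)
    
    # Count values over a 3-second window that slides every second and print each
    # window's result; this per-window output is what the test collects and compares
    # against its expected list, so a last window that is never produced in time
    # shows up as a missing trailing element, as in the failure above.
    stream.countByValueAndWindow(3, 1).pprint()
    
    ssc.start()
    ssc.awaitTerminationOrTimeout(12)  # give every window a chance to fire
    ssc.stop()  # also stops the underlying SparkContext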

