Github user tarekauel commented on the pull request:

    https://github.com/apache/spark/pull/6981#issuecomment-116502627
  
    The tests fail because Hive computes week number 18 for May 6, 2011, while Java's `Locale.US` returns 19.
    
    The reason is a differing definition of week 1 (see [The definition of Week of Year is locale-dependent.](http://stackoverflow.com/a/4608695/3532525)).
    
    
    ```
    scala> new java.text.SimpleDateFormat("w", java.util.Locale.US).format(new java.text.SimpleDateFormat("dd-MM-yyyy", java.util.Locale.US).parse("06-05-2011"))
    res16: String = 19
    ```
    
    ```
    scala> new java.text.SimpleDateFormat("w", java.util.Locale.GERMANY).format(new java.text.SimpleDateFormat("dd-MM-yyyy", java.util.Locale.GERMANY).parse("06-05-2011"))
    res15: String = 18
    ```
    
    @Davies It seems that `SimpleDateFormat` ignores the default time zone set in `beforeAll` of `HiveCompatibilitySuite`. Shall I add `WeekOfYear` to the whitelist?
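    
    For reference, the two locales can be reconciled without switching locale at all, by overriding the week rules on the formatter's `Calendar`. A minimal sketch (the class name `WeekOfYearDemo` is just for illustration; the assumption is that Hive's 18 matches ISO-8601 week numbering, i.e. weeks start on Monday and week 1 needs at least 4 days):
    
    ```java
    import java.text.SimpleDateFormat;
    import java.util.Calendar;
    import java.util.Date;
    import java.util.Locale;
    
    public class WeekOfYearDemo {
        public static void main(String[] args) throws Exception {
            Date date = new SimpleDateFormat("dd-MM-yyyy", Locale.US).parse("06-05-2011");
    
            // Default US rules: weeks start on Sunday, week 1 contains Jan 1.
            SimpleDateFormat usFmt = new SimpleDateFormat("w", Locale.US);
            System.out.println(usFmt.format(date)); // prints "19"
    
            // ISO-8601 rules (weeks start on Monday, week 1 has at least 4 days),
            // which the German locale follows and which yields Hive's result here.
            SimpleDateFormat isoFmt = new SimpleDateFormat("w", Locale.US);
            isoFmt.getCalendar().setFirstDayOfWeek(Calendar.MONDAY);
            isoFmt.getCalendar().setMinimalDaysInFirstWeek(4);
            System.out.println(isoFmt.format(date)); // prints "18"
        }
    }
    ```
    
    So the divergence is not in the time zone but in `Calendar.getFirstDayOfWeek()` / `getMinimalDaysInFirstWeek()`, which `SimpleDateFormat` takes from the locale, not from the default time zone.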

