yaooqinn edited a comment on pull request #28719:
URL: https://github.com/apache/spark/pull/28719#issuecomment-638585185


   Hmm... Bad news. I am afraid the concept of `THE FIRST DAY OF WEEK` is not
just about the letter 'u' itself!
   
   It affects all week-based patterns.
   
   e.g., take the date `2019-12-29` (a Sunday): in the Sunday-start system it
belongs to week-based-year 2020, while in the Monday-start system it stays in
2019. The week-of-week-based-year (`w`) is affected too.
   
   ```sql
   spark-sql> SELECT to_csv(named_struct('time', to_timestamp('2019-12-29', 
'yyyy-MM-dd')), map('timestampFormat', 'YYYY', 'locale', 'en-GB'));
   2019
   spark-sql> SELECT to_csv(named_struct('time', to_timestamp('2019-12-29', 
'yyyy-MM-dd')), map('timestampFormat', 'YYYY', 'locale', 'en-US'));
   2020
   
   spark-sql> SELECT to_csv(named_struct('time', to_timestamp('2019-12-29', 
'yyyy-MM-dd')), map('timestampFormat', 'YYYY-ww-uu', 'locale', 'en-US'));
   2020-01-01
   spark-sql> SELECT to_csv(named_struct('time', to_timestamp('2019-12-29', 
'yyyy-MM-dd')), map('timestampFormat', 'YYYY-ww-uu', 'locale', 'en-GB'));
   2019-52-07
   ```
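   Since Spark 3.x delegates datetime formatting to `java.time`, the same locale sensitivity can be reproduced outside Spark with `java.time.temporal.WeekFields` (a minimal standalone sketch, not Spark code):

   ```java
   import java.time.LocalDate;
   import java.time.temporal.WeekFields;
   import java.util.Locale;

   public class WeekFieldsDemo {
       public static void main(String[] args) {
           LocalDate date = LocalDate.of(2019, 12, 29); // a Sunday

           // en-US: weeks start on Sunday, so Dec 29 opens week 1 of 2020
           WeekFields us = WeekFields.of(Locale.US);
           System.out.printf("en-US: %d-%02d-%02d%n",
                   date.get(us.weekBasedYear()),
                   date.get(us.weekOfWeekBasedYear()),
                   date.get(us.dayOfWeek()));   // prints 2020-01-01

           // en-GB: ISO rules, weeks start on Monday, so Dec 29 closes
           // week 52 of 2019
           WeekFields gb = WeekFields.of(Locale.UK);
           System.out.printf("en-GB: %d-%02d-%02d%n",
                   date.get(gb.weekBasedYear()),
                   date.get(gb.weekOfWeekBasedYear()),
                   date.get(gb.dayOfWeek()));   // prints 2019-52-07
       }
   }
   ```

   The outputs match the two CSV results above, which shows the divergence comes from the locale's `WeekFields` (first day of week and minimal days in first week), not from anything CSV-specific.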
   
   Don't pay too much attention to the CSV function itself; I only use it to
mimic switching the default locale, which changes the rule for `THE FIRST DAY
OF WEEK`.
   
   
     


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]
