Roy T. Fielding wrote:
Oh, for crying out loud. Apps do not need microsecond resolution for time since epoch. None of them do.
I agree strongly with Roy's main argument, although I have objections to this particular part.
I can think of examples where apps need the time since the epoch in subsecond resolution: high-resolution loggers, unique ID generation, computing elapsed time for internal performance monitoring, etc.
However, for all of these cases, the interface provided by gettimeofday() works just fine: provide a time_t and a microseconds value separately, and let the app decide what to do with the microseconds (where "ignore the microseconds completely" is a valid answer for some apps). The problem with APR is that it does arbitrary and expensive transformations on the data (64-bit math to combine the seconds and microseconds into a single value).
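[A minimal sketch of the split interface described above, assuming the POSIX gettimeofday() call; this is an editorial illustration, not code from the thread:]

```c
/* The OS hands back seconds and microseconds separately (struct timeval),
 * and the app decides what, if anything, to do with the microseconds. */
#include <stdio.h>
#include <sys/time.h>   /* gettimeofday(), struct timeval */
#include <time.h>       /* time_t */

int main(void)
{
    struct timeval tv;
    if (gettimeofday(&tv, NULL) != 0) {
        perror("gettimeofday");
        return 1;
    }

    /* Apps that only care about whole seconds just read tv_sec ... */
    time_t secs = tv.tv_sec;
    printf("epoch seconds: %ld\n", (long)secs);

    /* ... while a high-resolution logger can also use tv_usec, with no
     * 64-bit combine forced on callers that don't want it. */
    printf("timestamp: %ld.%06ld\n", (long)tv.tv_sec, (long)tv.tv_usec);
    return 0;
}
```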
They need microsecond resolution for small interval timers; the vast majority of APR time uses are epoch times or intervals in seconds. There is nothing the app can do to "work around" APR's funky data type, because APR forces the conversion in every function whether that function needs microseconds or not. Using a raw time_t is not an option, because APR is supposed to be providing a portable interface to a time_now function. The fact of the matter is that using the same data structure for two different purposes is the wrong design for the portability library's basic interface to time, which is why none of the operating systems work that way.
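[An illustrative reconstruction of the conversion being objected to, not verbatim APR source; the names my_usec_time_t and usec_time_now are hypothetical stand-ins:]

```c
/* A single 64-bit microseconds-since-epoch value forces a multiply-and-add
 * on every call, even when the caller only wants whole seconds. */
#include <stdint.h>
#include <sys/time.h>

typedef int64_t my_usec_time_t;          /* stand-in for APR's 64-bit time type */
#define USEC_PER_SEC ((int64_t)1000000)

my_usec_time_t usec_time_now(void)
{
    struct timeval tv;
    (void)gettimeofday(&tv, NULL);
    /* The 64-bit math done on every call, needed or not: */
    return (my_usec_time_t)tv.tv_sec * USEC_PER_SEC + tv.tv_usec;
}

/* A caller that only wants epoch seconds must immediately divide the
 * combined value back apart, paying for both conversions. */
int64_t seconds_now(void)
{
    return usec_time_now() / USEC_PER_SEC;
}
```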
The trade-off is obvious -- just look at the profile results. If you aren't willing to use that as the determinant of which interface is better, then we shouldn't be using APR for a Web server.
+1
I really think APR should follow the model that's worked well for Unix:

- A gettimeofday function returns separate values for time_t and microseconds.
- All APIs that involve a timestamp use just a time_t.
- If an application needs to make use of the microsecond data (e.g., a JVM built on top of APR that needs to support Java-style millisecond-resolution time objects), it's up to the application programmer to manipulate the microseconds in whatever manner is most appropriate, based on the specific functional and performance needs of that app.
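[A sketch of the interface shape proposed above; the names portable_time_now and set_mtime are hypothetical, and APR's actual API may differ:]

```c
#include <stddef.h>
#include <sys/time.h>
#include <time.h>

/* Portable "now": seconds and microseconds returned separately, so
 * callers that don't want microseconds can pass NULL and pay nothing. */
int portable_time_now(time_t *secs, long *usecs)
{
    struct timeval tv;
    if (gettimeofday(&tv, NULL) != 0)
        return -1;
    if (secs)
        *secs = tv.tv_sec;
    if (usecs)
        *usecs = (long)tv.tv_usec;
    return 0;
}

/* Per the second point, timestamp-taking APIs deal only in time_t
 * (hypothetical prototype for illustration). */
int set_mtime(const char *path, time_t mtime);
```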
Brian
....Roy