HttpComponents team - I have some approved time this summer to work on an open source project, and I'd like to work on improving the caching support in the async HTTP client. Currently, requests to the origin are non-blocking, but requests to the cache are blocking. The async caching support appears to be implemented as a decorator of the HTTP client itself, while in the blocking client it's implemented by decorating the internal ClientExecChain instance.
My initial idea was to follow the same pattern in the async client as in the blocking client: use an internal ExecutorService to submit requests to the cache, and then block (with a timeout) on the returned Future for the cache lookup result. This is of course still blocking, but it at least provides a potentially configurable timeout when checking the cache. A rough sketch of what I have in mind is at the end of this message.

How should I approach this? I see a comment in https://issues.apache.org/jira/browse/HTTPASYNC-76 about the likely need for changes to the existing blocking HTTP client caching implementation along with changes to the core async HTTP client protocol pipeline processing. Are there any existing ideas or plans for making caching non-blocking in the async client? What changes would be needed in the blocking client's caching implementation? And is there enough need to justify this improvement? Thanks.
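
For concreteness, here is roughly the pattern I have in mind; the CacheBackend interface and the class/method names below are just placeholders to illustrate the idea, not existing HttpComponents types:

    import java.util.concurrent.Callable;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;
    import java.util.concurrent.TimeUnit;
    import java.util.concurrent.TimeoutException;

    public class BoundedCacheLookup {

        // Placeholder for the actual cache lookup call; not an existing
        // HttpComponents interface.
        interface CacheBackend {
            Object getEntry(String key) throws Exception;
        }

        private final ExecutorService cacheExecutor = Executors.newSingleThreadExecutor();

        /**
         * Submits the cache lookup to an internal executor and waits for the
         * result, bounded by a configurable timeout. A timeout is treated as a
         * cache miss so the request can fall through to the origin.
         */
        public Object lookupWithTimeout(final CacheBackend cache, final String key,
                                        final long timeout, final TimeUnit unit) throws Exception {
            final Future<Object> result = cacheExecutor.submit(new Callable<Object>() {
                public Object call() throws Exception {
                    return cache.getEntry(key);
                }
            });
            try {
                return result.get(timeout, unit);
            } catch (final TimeoutException ex) {
                result.cancel(true); // abandon the slow lookup
                return null;         // treat as a cache miss and go to the origin
            }
        }
    }

A lookup that times out would simply be treated as a cache miss, so a slow or unresponsive cache backend could not stall the request indefinitely.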
