For what it's worth (in case future searchers find this thread), here's what I did.
https://gist.github.com/ericeslinger/f2111549574bcfaa3c1b

It's pretty simple. I decided *not* to override anything on $http directly with interceptors, but instead to provide my own service; that way, if I have really important requests, I can still hit $http directly. All it really does is this: on a BatchRequest.get call, it checks whether a debounce timer is already counting down (timerPromise) and starts one if not. It then adds the current request to the batch and returns a promise for that request. When the timer fires, it takes all the currently queued requests, assembles a POST request formatted for Bassmaster (https://github.com/hapijs/bassmaster), and sends it. When the response to that POST comes back, it resolves or rejects the queued promises accordingly. Bassmaster returns data in order, so request 0 in the POST body ends up at position 0 in the response.

It's worth noting that Bassmaster always returns a 200, even if some of the sub-requests had error codes, so you have to handle error cases manually. You also lose some of the metadata you normally get from a $http call, but that's not the end of the world. In practice, it means that instead of passing response.data into my response parser ($http promises resolve to an object with .data, .headers, and so on), I just pass response to my parser.

Any feedback is welcome. This is my favorite approach to the problem so far, because it doesn't try to predict user behavior or make the server track the state of the client cache. Requests only get sent if the object isn't already in the cache. I do some clever pre-loading in certain situations (if you're loading a post thread, you're pretty much guaranteed to need, and to have never loaded, all the replies to that thread), but only in really constrained cases. The goal is to keep the API as dumb as possible and handle the cleverness on the client side.
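For anyone who doesn't want to click through to the gist, the debounce-and-batch idea can be sketched roughly like this. This is a framework-agnostic sketch, not the gist's actual code: the names (BatchQueue, flushDelayMs, sendBatch) are mine, and a real Angular version would use $q and $timeout inside a service rather than bare Promises and setTimeout.

```javascript
// Sketch of the debounce-and-batch approach described above.
// Assumption: sendBatch posts a Bassmaster-style payload and resolves
// with an array of sub-responses in the same order as the requests.
class BatchQueue {
  constructor(sendBatch, flushDelayMs = 10) {
    this.sendBatch = sendBatch;   // (payload) => Promise<responses[]>
    this.flushDelayMs = flushDelayMs;
    this.pending = [];            // queued { request, resolve, reject } entries
    this.timer = null;            // the debounce timer counting down
  }

  get(path) {
    return new Promise((resolve, reject) => {
      // Queue this request and hand back a promise for its sub-response.
      this.pending.push({ request: { method: 'get', path }, resolve, reject });
      // Start the debounce timer only if one isn't already counting down.
      if (this.timer === null) {
        this.timer = setTimeout(() => this.flush(), this.flushDelayMs);
      }
    });
  }

  flush() {
    const batch = this.pending;
    this.pending = [];
    this.timer = null;
    // Bassmaster-style payload: { requests: [{ method, path }, ...] }
    this.sendBatch({ requests: batch.map((item) => item.request) }).then(
      (responses) => {
        // Sub-responses come back in request order. The outer HTTP status
        // is always 200, so each sub-response must be checked for errors.
        batch.forEach((item, i) => {
          const res = responses[i];
          if (res && res.statusCode >= 400) {
            item.reject(res);
          } else {
            item.resolve(res);
          }
        });
      },
      (err) => batch.forEach((item) => item.reject(err))
    );
  }
}
```

The per-sub-response error check is the important part: because the batch endpoint reports success as a whole, rejecting the individual promises here is what keeps callers' error handlers working as if they had called $http directly.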
e

On Fri, Aug 15, 2014 at 10:42 PM, Sander Elias <[email protected]> wrote:
> Hi Eric,
>
> The idea you have is sound; however, a generic solution to this is a very
> hard thing to create, the main reason being that it depends on the server
> side. So a solution that works in your case will seldom work for anyone
> else's.
>
> There are also a lot of unknown variables, like the size of your poster
> base and the size of every profile. If those are reasonably small, one
> solution might be to send off the entire thing in one go and pre-populate
> your cache.
>
> You might also include the profiles in the initial request on the server,
> and again pre-populate your cache.
>
> Another solution is entirely client-side: before rendering your page, loop
> through your result, compare it to the cache, and fetch the missing
> profiles in a single request. (Here I'm assuming your server can provide a
> suitable answer!) When this request comes back, stuff it into the cache
> again, and then render the part of the page where you need it.
>
> Regards,
> Sander
>
> --
> You received this message because you are subscribed to the Google Groups
> "AngularJS" group.
> To unsubscribe from this group and stop receiving emails from it, send an
> email to [email protected].
> To post to this group, send email to [email protected].
> Visit this group at http://groups.google.com/group/angular.
> For more options, visit https://groups.google.com/d/optout.
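As a postscript: Sander's last suggestion (diff the results against the cache before rendering and fetch only the missing profiles in one request) can be sketched like this. The function name, the authorId/profile shapes, and fetchProfiles are all my own illustrative assumptions; the thread doesn't specify what the server endpoint looks like.

```javascript
// Sketch of the client-side approach: collect ids missing from the
// cache and fetch them in a single request before rendering.
// Assumption: fetchProfiles(ids) resolves with an array of profile
// objects, e.g. backed by something like GET /profiles?ids=1,2,3.
function loadMissingProfiles(posts, cache, fetchProfiles) {
  // Dedupe author ids, then keep only those not already cached.
  const missing = [...new Set(posts.map((p) => p.authorId))]
    .filter((id) => !(id in cache));
  if (missing.length === 0) return Promise.resolve(cache);
  return fetchProfiles(missing).then((profiles) => {
    // Stuff the fetched profiles back into the cache, then render.
    profiles.forEach((profile) => { cache[profile.id] = profile; });
    return cache;
  });
}
```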
