> I see a simple scenario like the following one:
> user asks for a very expensive task clicking section A
> while it's waiting for it, user changes idea clicking section B
> both section A and section B needs that very expensive async call
> drop "going to section A" info and put "go to section B" to that very same 
> promise
> whenever resolved, do that action
> A caching mechanism to trigger only once such expensive operation would also 
> work, yet it's not possible to drop "go into A" and put "go into B"

for your scenario, what you want is a cacheable background-task, where you can
piggyback B onto the task already initiated by A (e.g. a common-but-expensive
database-query that might take 10-60 seconds to execute).
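
in promise terms, piggybacking just means sharing one in-flight promise per
key. here's a minimal sketch of that idea (inflightTasks, taskPiggyback and
runExpensiveQuery are made-up names for illustration, not from any library):

```js
// minimal piggyback sketch - all names below are hypothetical
var inflightTasks = {};

function runExpensiveQuery() {
    // hypothetical stand-in for the 10-60 second database-query
    return new Promise(function (resolve) {
        setTimeout(function () {
            resolve('query-result');
        }, 10000);
    });
}

function taskPiggyback(key, runTask) {
    // if no task for this key is in-flight, start one and share its promise
    if (!inflightTasks[key]) {
        inflightTasks[key] = runTask().then(function (result) {
            delete inflightTasks[key];
            return result;
        }, function (error) {
            delete inflightTasks[key];
            throw error;
        });
    }
    // later callers piggyback onto the same in-flight promise
    return inflightTasks[key];
}

// clicking A then B - both piggyback onto one expensive call
taskPiggyback('expensiveQuery', runExpensiveQuery).then(function (result) {
    console.log('section A got', result);
});
taskPiggyback('expensiveQuery', runExpensiveQuery).then(function (result) {
    console.log('section B got', result);
});
```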

it's generally more trouble than it's worth to micromanage such tasks with
removeListeners or to make them cancellable (maybe later on C wants to
piggyback, even though A and B are no longer interested).  it's easier
implementation-wise to have the background-task run its course and save its
result to cache, and just have A ignore the result.  the logic is that because
this common-but-expensive task was recently called, it will likely be called
again in the near-future, so let it run its course and cache the result.
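
applied to your original scenario, that "let it run and ignore stale results"
logic looks roughly like this (again just a sketch - cacheDict, currentSection,
loadSection, renderSection and runExpensiveQuery are all made-up names):

```js
// sketch of "let it run its course and cache the result" - hypothetical names
var cacheDict = {};
var currentSection = null;

function runExpensiveQuery() {
    // hypothetical stand-in for the expensive async call
    return new Promise(function (resolve) {
        setTimeout(function () {
            resolve('query-result');
        }, 10000);
    });
}

function renderSection(section, result) {
    console.log('rendering section ' + section + ' with ' + result);
}

function loadSection(section) {
    currentSection = section;
    // reuse the cached / in-flight promise if present, otherwise start the task;
    // the task is never cancelled, so a later C can still reuse its result
    cacheDict.expensiveQuery = cacheDict.expensiveQuery || runExpensiveQuery();
    cacheDict.expensiveQuery.then(function (result) {
        // a stale click (section A) simply ignores the result
        if (currentSection === section) {
            renderSection(section, result);
        }
    });
}

// user clicks A, then changes their mind and clicks B before the task resolves -
// only one expensive call runs, and only section B gets rendered
loadSection('A');
loadSection('B');
```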

here's a real-world cacheable-task implementation for such a scenario; it
piggybacks the expensive gzipping of commonly-requested files instead of
database-queries [1] [2]

[1] https://github.com/kaizhu256/node-utility2/blob/2018.1.13/lib.utility2.js#L4372
    - piggyback gzipping of files onto a cacheable-task
[2] https://github.com/kaizhu256/node-utility2/blob/2018.1.13/lib.utility2.js#L5872
    - cacheable-task source-code

```js
/*jslint
    bitwise: true,
    browser: true,
    maxerr: 4,
    maxlen: 100,
    node: true,
    nomen: true,
    regexp: true,
    stupid: true
*/

local.middlewareAssetsCached = function (request, response, nextMiddleware) {
/*
 * this function will run the middleware that will serve cached gzipped-assets
 * 1. if cache-hit for the gzipped-asset, then immediately serve it to response
 * 2. run background-task (if not already) to re-gzip the asset and update cache
 * 3. save re-gzipped-asset to cache
 * 4. if cache-miss, then piggy-back onto the background-task
 */
    var options;
    options = {};
    local.onNext(options, function (error, data) {
        options.result = options.result ||
            local.assetsDict[request.urlParsed.pathname];
        if (options.result === undefined) {
            nextMiddleware(error);
            return;
        }
        switch (options.modeNext) {
        case 1:
            // skip gzip
            if (response.headersSent ||
                    !(/\bgzip\b/).test(request.headers['accept-encoding'])) {
                options.modeNext += 1;
                options.onNext();
                return;
            }
            // gzip and cache result
            local.taskCreateCached({
                cacheDict: 'middlewareAssetsCachedGzip',
                key: request.urlParsed.pathname
            }, function (onError) {
                local.zlib.gzip(options.result, function (error, data) {
                    onError(error, !error && data.toString('base64'));
                });
            }, options.onNext);
            break;
        case 2:
            // set gzip header
            options.result = local.base64ToBuffer(data);
            response.setHeader('Content-Encoding', 'gzip');
            response.setHeader('Content-Length', options.result.length);
            options.onNext();
            break;
        case 3:
            local.middlewareCacheControlLastModified(request, response,
                options.onNext);
            break;
        case 4:
            response.end(options.result);
            break;
        }
    });
    options.modeNext = 0;
    options.onNext();
};

...

local.taskCreateCached = function (options, onTask, onError) {
/*
 * this function will
 * 1. if cache-hit, then call onError with cacheValue
 * 2. run onTask in background to update cache
 * 3. save onTask's result to cache
 * 4. if cache-miss, then call onError with onTask's result
 */
    local.onNext(options, function (error, data) {
        switch (options.modeNext) {
        // 1. if cache-hit, then call onError with cacheValue
        case 1:
            // read cacheValue from memory-cache
            local.cacheDict[options.cacheDict] =
                local.cacheDict[options.cacheDict] || {};
            options.cacheValue = local.cacheDict[options.cacheDict][options.key];
            if (options.cacheValue) {
                // call onError with cacheValue
                options.modeCacheHit = true;
                onError(null, JSON.parse(options.cacheValue));
                if (!options.modeCacheUpdate) {
                    break;
                }
            }
            // run background-task with lower priority for cache-hit
            setTimeout(options.onNext, options.modeCacheHit && options.cacheTtl);
            break;
        // 2. run onTask in background to update cache
        case 2:
            local.taskCreate(options, onTask, options.onNext);
            break;
        default:
            // 3. save onTask's result to cache
            // JSON.stringify data to prevent side-effects on cache
            options.cacheValue = JSON.stringify(data);
            if (!error && options.cacheValue) {
                local.cacheDict[options.cacheDict][options.key] = options.cacheValue;
            }
            // 4. if cache-miss, then call onError with onTask's result
            if (!options.modeCacheHit) {
                onError(error, options.cacheValue && JSON.parse(options.cacheValue));
            }
            (options.onCacheWrite || local.nop)();
            break;
        }
    });
    options.modeNext = 0;
    options.onNext();
};
```
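
and here's roughly how taskCreateCached above would be called for the
database-query case (a sketch only - dbQueryExpensiveReport is a made-up
placeholder for your expensive call, while local is the utility2 namespace from
the linked source):

```js
local.taskCreateCached({
    cacheDict: 'dbQueryCached',
    key: 'expensive-report'
}, function (onError) {
    // the expensive task - per the design above, its result gets cached and
    // concurrent callers with the same cacheDict/key piggyback onto one run
    dbQueryExpensiveReport(function (error, data) {
        // data should be JSON-serializable, since the cache stores JSON.stringify(data)
        onError(error, data);
    });
}, function (error, data) {
    // called immediately with the cached value on cache-hit,
    // otherwise with the task's result once the background-task finishes
    console.log(error || data);
});
```

note that on a cache-hit the background refresh only runs if
options.modeCacheUpdate is set, and it's deferred by options.cacheTtl via the
setTimeout in case 1.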

> On 24 Apr 2018, at 6:06 PM, Oliver Dunk <[email protected]> wrote:
> 
> Based on feedback, I agree that a blanket `Promise.prototype.clear()` is a 
> bad idea. I don’t think that is worth pursuing.
> 
> I still think that there is value in this, especially the adding and removing 
> of listeners you have reference to as Andrea’s PoC shows. Listeners would 
> prevent the chaining issue or alternatively I think it would definitely be 
> possible to decide on intuitive behaviour with the clear mechanic. The 
> benefit of `clear(reference)` over listeners is that it adds less to the 
> semantics.
> 
> I think the proposed userland solutions are bigger than I would want for 
> something that I believe should be available by default, but I respect that a 
> lot of the people in this conversation are in a better position to make a 
> judgement about that than me.

_______________________________________________
es-discuss mailing list
[email protected]
https://mail.mozilla.org/listinfo/es-discuss
