Re: [Wikitech-l] Render with a slow process
Can you be a bit more concrete about what you are thinking of? In general I think most users would find it unacceptable if part of the page only became available some slow (<1 min) amount of time after the page rendered. The closest thing I can think of to this is image rendering (and especially video scaling), which is asynchronous. Page rendering itself also works kind of like this: if someone edits a page, and pool counter decides too many people are trying to render at once, the next viewer will get an old version of the page.

-- Brian

On Saturday, April 25, 2020, John Erling Blad wrote:

> Slow process, fast rendering
>
> Imagine someone edits a page, and that edit hits a very slow tag-function
> of some kind. You want to respond fast with something readable, some kind
> of temporary page, until the slow process has finished. Do you choose to
> reuse what you had from the last revision, if the content of the function
> hasn't changed, or do you respond with a note that you are still
> processing? The temporary page could then update missing content through
> an API.
>
> I assume that a plain rerender of the page will occur after the actual
> content is updated and available, i.e. a rerender will only happen some
> time after the edit, and the slow process would then be done. A last step
> of the process could then be to purge the temporary page.
>
> This works almost everywhere, but not for collections; they don't know
> about temporary pages. Is there some mechanism implemented that does this
> or something similar? It feels like something that should have a general
> solution.
>
> There are at least two different use cases: one where some existing
> information gets augmented with external data (without really changing the
> content), and one where some external data (which could be content) gets
> added to existing content. An example of the first could be verification
> of references, while the latter could be natural language generation.
>
> /jeblad

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
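[Archive note] The pool-counter fallback Brian describes — serve the last rendered copy instead of making every viewer wait on an in-flight render — could be sketched roughly as below. All names (`RenderPool`, `view`, `renderFn`) are invented for illustration; this is not MediaWiki's actual PoolCounter API.

```typescript
// Rough sketch (not MediaWiki code): if a fresh render of a page is already
// in flight, later viewers get the stale cached copy instead of waiting.

type Render = { html: string; fresh: boolean };

const sleep = (ms: number) => new Promise<void>((r) => setTimeout(r, ms));

class RenderPool {
  private cache = new Map<string, string>();            // page -> last rendered HTML
  private inFlight = new Map<string, Promise<string>>(); // renders currently running

  constructor(private renderFn: (page: string) => Promise<string>) {}

  async view(page: string): Promise<Render> {
    const running = this.inFlight.get(page);
    if (running) {
      const stale = this.cache.get(page);
      // An old copy exists: serve it immediately rather than block this viewer.
      if (stale !== undefined) return { html: stale, fresh: false };
      // First-ever render: nothing stale to fall back on, so wait.
      return { html: await running, fresh: true };
    }
    const job = this.renderFn(page).then((html) => {
      this.cache.set(page, html);
      return html;
    });
    this.inFlight.set(page, job);
    try {
      return { html: await job, fresh: true };
    } finally {
      this.inFlight.delete(page);
    }
  }
}

// Usage: an edit lands while a slow re-render is running; the next viewer
// sees the previous revision instead of waiting for the render to finish.
async function demo(): Promise<Render[]> {
  let source = "v1";
  const pool = new RenderPool(async (page) => {
    await sleep(20);                             // pretend rendering is slow
    return `${source}:${page}`;
  });
  const first = await pool.view("Main_Page");    // fresh "v1:Main_Page", now cached
  source = "v2";                                 // someone edits the page
  const rerender = pool.view("Main_Page");       // kicks off a slow re-render
  const during = await pool.view("Main_Page");   // arrives mid-render: stale v1
  const after = await rerender;                  // fresh "v2:Main_Page"
  return [first, during, after];
}
```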
Re: [Wikitech-l] Render with a slow process
To be honest I don't fully understand the question. What you wrote sounds like we have something like this already. Or did I get this wrong?

On a very high "user experience" level, unrelated to MediaWiki, I do have a suggestion: you could do it similar to how "like" features in social media clients are implemented. Clicking such a "like" button feels like it's immediately done. But in reality the job is not done yet. There is still a request going on in the background, which might even fail. In other words: the client side immediately gives the user the response that is most likely to happen, without actually knowing if it happens. In case of a later server-side error, the visual response is updated to let the user know.

When an element on a page is edited and the user clicks "save", the client side already knows everything, doesn't it? It could update the old rendering with the new information and present that as if it had already been saved. That's more or less what React frameworks are about.

What you probably don't want to do is re-implement complicated rendering pipelines on the client side. In this case you might need to use a spinner or some other kind of placeholder.

Best
Thiemo
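[Archive note] The optimistic "like" pattern Thiemo describes can be sketched as below; every name here (`OptimisticLike`, `click`, `send`) is invented for illustration, not taken from any real client.

```typescript
// Sketch of an optimistic UI update: flip the visible state immediately,
// confirm in the background, and roll back only if the request fails.

type LikeState = "liked" | "not-liked";

class OptimisticLike {
  shown: LikeState = "not-liked";              // what the UI displays right now
  private confirmed: LikeState = "not-liked";  // what the server has acknowledged

  // On click: update the UI at once, before the request settles, and
  // roll back to the last confirmed state if the request fails.
  click(send: () => Promise<void>): Promise<void> {
    this.shown = "liked";
    return send()
      .then(() => { this.confirmed = "liked"; })
      .catch(() => { this.shown = this.confirmed; });
  }
}

// Happy path: the button already looks "liked" while the request is pending.
const button = new OptimisticLike();
const request = button.click(() => Promise.resolve());
// button.shown === "liked" here, even though `request` has not settled yet.
```

The key design choice is keeping two states: the one shown and the one confirmed. The rollback target is always the confirmed state, so a failed request never leaves the UI claiming something the server rejected.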
[Wikitech-l] Render with a slow process
Slow process, fast rendering

Imagine someone edits a page, and that edit hits a very slow tag-function of some kind. You want to respond fast with something readable, some kind of temporary page, until the slow process has finished. Do you choose to reuse what you had from the last revision, if the content of the function hasn't changed, or do you respond with a note that you are still processing? The temporary page could then update missing content through an API.

I assume that a plain rerender of the page will occur after the actual content is updated and available, i.e. a rerender will only happen some time after the edit, and the slow process would then be done. A last step of the process could then be to purge the temporary page.

This works almost everywhere, but not for collections; they don't know about temporary pages. Is there some mechanism implemented that does this or something similar? It feels like something that should have a general solution.

There are at least two different use cases: one where some existing information gets augmented with external data (without really changing the content), and one where some external data (which could be content) gets added to existing content. An example of the first could be verification of references, while the latter could be natural language generation.

/jeblad
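[Archive note] The temporary-page idea in the first paragraph could look roughly like this. The `{{slow}}` marker, the job shape, and every function name are invented for illustration; nothing here is an existing MediaWiki interface.

```typescript
// Sketch of the temporary-page idea: render at once with either the last
// revision's output (if the slow tag's input is unchanged) or a "still
// processing" note, and swap in the real output when the slow job finishes.

type SlowJob = { done: boolean; output?: string };

// Render the page body, replacing the slow tag's placeholder marker.
function renderTemporary(
  body: string,
  job: SlowJob,
  lastRevisionOutput?: string  // reusable only if the tag's input didn't change
): string {
  if (job.done && job.output !== undefined) {
    return body.replace("{{slow}}", job.output);
  }
  return body.replace("{{slow}}", lastRevisionOutput ?? "(still processing)");
}

const sleep = (ms: number) => new Promise<void>((r) => setTimeout(r, ms));

// Poll a (hypothetical) status API until the slow job reports done, then
// produce the final rendering; a real system would purge the temporary
// page at this point, as the message above suggests.
async function finalRender(
  body: string,
  poll: () => Promise<SlowJob>,
  intervalMs = 10
): Promise<string> {
  for (;;) {
    const job = await poll();
    if (job.done) return renderTemporary(body, job);
    await sleep(intervalMs);
  }
}
```

This separates the two decisions raised above: `renderTemporary` picks between reuse and a processing note, and `finalRender` stands in for the later rerender once the slow process has actually finished.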