I'll take a step back and describe the challenge in more detail:

I have a skin (Bootstrap) which shows live data in charts. One design 
goal is to show the most accurate and most up-to-date values in the 
front end. The charts are updated with every loop value received by the 
MQTT-JS client in the browser. One challenge is to keep the charts in 
sync with the data in the backend; they run out of sync quickly for 
various reasons. A very common one is inactivity: the device running the 
browser is asleep, the browser tab showing the web page is inactive, and 
so on. A mechanism in the web page therefore checks regularly for 
updated backend data, reloads it asynchronously, and refreshes the 
charts. Such a check is done in two steps: the first step, done only 
when new data can be expected (after the next archive_interval), fetches 
a small JS file containing the timestamp of the latest backend data. If 
this timestamp is newer than the last known timestamp, the second step 
fetches the new backend data and reloads the charts.

The idea was to inform the front end when new backend data has been 
uploaded, so that it could fetch the latest backend data immediately.

But thinking it through a little further, this doesn't really fix or 
even improve the issue I have with this mechanism, which is: 

The new backend data is uploaded quite a while after the preceding 
archive_interval has ended. Depending on the number and complexity of 
the reports and on the backend machine's capabilities, this can take a 
minute or two. In most cases this gap isn't an issue, but in some cases 
you miss or lose events and readings until the next archive_interval's 
backend data becomes available. For instance:

Imagine rain is pouring down like crazy and your gauge goes up every 
couple of seconds. At the top of the archive_interval the live data is 
constantly updated by the incoming MQTT messages. After a minute or so, 
the backend has finished storing the new archive value, generated the 
reports, and uploaded the fresh backend data. The front end now fetches 
the backend data and reloads the charts, "losing" all loop data received 
in between. "Losing" in quotes, because this data will show up again, 
but only after the next archive_interval and its reports and uploads are 
done. Or imagine a very strong gust happening in this gap: it will 
disappear from the chart and reappear after the next archive_interval.

So all in all, this is not a big deal. But from the point of view of the 
design goals above, it is an issue.

To resolve the issue, informing the front end isn't a good approach: the 
gap is still there. To really solve it, the front end needs to keep all 
loop data with timestamps after the top of the newest archive interval, 
and re-apply it after syncing with the newest backend data.
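A minimal sketch of that merge step, assuming chart series are arrays of 
[epochSeconds, value] pairs and loop values are buffered as they arrive 
via MQTT (both assumptions for illustration):

```javascript
// Sketch of the proposed fix: after reloading the backend data, keep
// any buffered loop points that are newer than the newest archived
// point and re-append them, so nothing disappears in the gap.
// backendSeries and loopBuffer are arrays of [epochSeconds, value].

function mergeLoopData(backendSeries, loopBuffer) {
  const lastArchived = backendSeries.length
    ? backendSeries[backendSeries.length - 1][0]
    : 0;
  // Keep only loop points newer than the newest archived point.
  const tail = loopBuffer.filter(([ts]) => ts > lastArchived);
  return backendSeries.concat(tail);
}

// Example: archived data up to t=160, loop points at t=150 and t=170.
// The t=150 point is already covered by the archive; t=170 is kept:
// mergeLoopData([[100, 1], [160, 2]], [[150, 1.5], [170, 2.5]])
// → [[100, 1], [160, 2], [170, 2.5]]
```

The loop buffer itself can then be pruned to everything newer than 
lastArchived, so it doesn't grow without bound.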



vince schrieb am Sonntag, 20. August 2023 um 20:33:38 UTC+2:

> On Thu, Aug 17, 2023 at 12:58 AM [email protected] <[email protected]> 
> wrote:
>
> I want to inform the front end a certain event occurs on the backend, to 
> be more exact: when a report is finished. Does something like this exist?
>
>
> When StdReport runs, it changes the date+time last modified for the output 
> directory contents of the skin, so simply watch the timestamp on any file 
> the skin writes out.
>
> You will likely need to externally monitor the file with a python script 
> or something and then do a MQTT publish to your frontend.  How to do the 
> monitoring depends on how many skins you want to monitor and how realtime 
> you want the notification to be.
>
>

-- 
You received this message because you are subscribed to the Google Groups 
"weewx-user" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To view this discussion on the web visit 
https://groups.google.com/d/msgid/weewx-user/de115dc2-dedd-4b7f-9141-8972039a2809n%40googlegroups.com.
