See comments below.
Gary
On Saturday, 15 March 2025 at 03:47:33 UTC+10 Andrew McGinnis wrote:
If you want to use weectl import to import data into existing archive
records, each record in your import source data (in your case your CSV
data) needs to include the existing data for all fields as well as the
updated rainfall data. You then need to use the (as yet undocumented)
--update command line option when running weectl import.
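For reference, an invocation might look something like the following; the import config path is illustrative, and the --dry-run option lets you preview what would be imported before committing anything:

```shell
# Preview the import without touching the database
weectl import --import-config=/var/tmp/csv_import.conf --dry-run

# Run the import, updating existing archive records in place
weectl import --import-config=/var/tmp/csv_import.conf --update
```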
So rather than just the records I'm *missing*, I need to go from the first
missing record to the last missing record, inclusive of all in-between
records? I can easily do that, just not 100% on whether it's required.
No, you don't need the 'in between' records. Your import data should only
include the records you intend to update/alter, but you do need the data from
the other fields of those records. Perhaps an example will make it clearer.
Say your archive table contains the following:
dateTime     outTemp  inTemp  rain  outHumidity
1742000400   23.1     20.1    1.0   80
1742000100   23.2     20.1    2.0   81
1741999800   23.2     20.1    5.0   84
1741999500   23.3     20.2    0.0   83
1741999200   23.5     20.2    0.8   83
1741998900   23.5     20.2    0.6   84
1741998600   23.4     20.1    0.6   82
1741998300   23.3     20.1    0.8   83
1741998000   23.3     20.2    0.0   84
1741997700   23.3     20.1    1.0   82
Let's say you want to update the rain value in the records timestamped 1741999500
and 1741998000 (i.e. the 0.0 values), and your new rain values (per-archive-period
values) for these two records are 0.2 and 0.4 respectively. So you
might start with your import data as follows:
dateTime     rain
1741999500   0.2
1741998000   0.4
If you import this data using the --update command line option (or without
the --update option, if you first deleted the two archive table records
concerned), your archive table will be:
dateTime     outTemp  inTemp  rain  outHumidity
1742000400   23.1     20.1    1.0   80
1742000100   23.2     20.1    2.0   81
1741999800   23.2     20.1    5.0   84
1741999500                    0.2
1741999200   23.5     20.2    0.8   83
1741998900   23.5     20.2    0.6   84
1741998600   23.4     20.1    0.6   82
1741998300   23.3     20.1    0.8   83
1741998000                    0.4
1741997700   23.3     20.1    1.0   82
Each row in your import data is considered to be an archive record in
itself, and any fields that are not included in your import data will be left
empty. To keep the rest of the existing data in the records being updated
(in this case outTemp, inTemp and outHumidity), your import data needs to
include values for outTemp, inTemp and outHumidity. Your import data would
be:
dateTime     outTemp  inTemp  rain  outHumidity
1741999500   23.3     20.2    0.2   83
1741998000   23.3     20.2    0.4   84
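Expressed as a CSV file (weectl import's CSV source takes its field names from a header row, which your import config's field map then maps onto archive fields), that import data might look like:

```
dateTime,outTemp,inTemp,rain,outHumidity
1741999500,23.3,20.2,0.2,83
1741998000,23.3,20.2,0.4,84
```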
Importing this data will give you:
dateTime     outTemp  inTemp  rain  outHumidity
1742000400   23.1     20.1    1.0   80
1742000100   23.2     20.1    2.0   81
1741999800   23.2     20.1    5.0   84
1741999500   23.3     20.2    0.2   83
1741999200   23.5     20.2    0.8   83
1741998900   23.5     20.2    0.6   84
1741998600   23.4     20.1    0.6   82
1741998300   23.3     20.1    0.8   83
1741998000   23.3     20.2    0.4   84
1741997700   23.3     20.1    1.0   82
That, I suspect, is the desired result. Of course, this is a very simple and
unrealistic example, but hopefully you get the idea.
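Building that merged import data by hand gets tedious for more than a few records. As a sketch (not part of weectl itself), you could pull the existing fields from the archive with Python's built-in sqlite3 module and write out the CSV; the database path, field names and rain values below are illustrative, taken from the example above:

```python
import csv
import sqlite3

DB_PATH = "/var/lib/weewx/weewx.sdb"  # illustrative path
# New per-archive-period rain values keyed by record timestamp
NEW_RAIN = {1741999500: 0.2, 1741998000: 0.4}

def build_import_rows(db_path, new_rain):
    """Read the existing archive records for the target timestamps and
    return full rows with only the rain field replaced."""
    fields = ["dateTime", "outTemp", "inTemp", "rain", "outHumidity"]
    rows = []
    with sqlite3.connect(db_path) as conn:
        for ts, rain in sorted(new_rain.items(), reverse=True):
            cur = conn.execute(
                "SELECT dateTime, outTemp, inTemp, outHumidity "
                "FROM archive WHERE dateTime = ?", (ts,))
            rec = cur.fetchone()
            if rec is None:
                continue  # no such archive record; nothing to merge
            dt, out_t, in_t, out_h = rec
            rows.append([dt, out_t, in_t, rain, out_h])
    return fields, rows

def write_csv(path, fields, rows):
    """Write the merged records as a CSV file with a header row."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(fields)
        writer.writerows(rows)
```

You would then point weectl import at the resulting CSV file.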
Alternatively, you could avoid using the --update command line option by
first deleting the records to be imported from the WeeWX database and then
importing your merged/updated CSV data.
If you'll save me the research, how do I delete specific records/periods,
either a single period record, or a range? I just recently noticed that the
soilMoist1 observation, on my weewx testing install, recorded about 84
hours of very wrong values, that my production weewx install didn't (both
getting the data from the same mqtt topic). I know how to drop entire
observation columns via weectl database drop-columns NAME, but in this case
it's just a range of periods within the column that needs purging. Or is it as
simple as zeroing out the values in a range via sqlite> update archive set
soilMoist1=0.0 where dateTime > {startepochtime} and dateTime <=
{endepochtime}; ?
In terms of deleting records, I use the sqlite3 utility, which lets me run
SQL commands against the database. Some people use GUI-based SQL
editors to manipulate the database; I don't, so I can't help you there. Using
sqlite3, the following commands can be used to delete archive table
records:
$ sqlite3 /var/lib/weewx/weewx.sdb
sqlite> DELETE FROM archive WHERE dateTime=1741999500;
sqlite> .q
$
In this case we deleted the archive record timestamped 1741999500.
To delete a range of archive records by timestamp use something like:
sqlite> DELETE FROM archive WHERE dateTime>=1741969500 AND dateTime<=1741999500;
which would delete all archive records timestamped from 1741969500
to 1741999500 inclusive.
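If you prefer to script these clean-ups, here is a sketch using Python's built-in sqlite3 module (paths and timestamps are illustrative). It also covers the soilMoist1 case from the question above: rather than deleting whole records, you can blank just one column over a range by setting it to NULL, not 0.0, since WeeWX treats NULL as 'no data' while 0.0 is a real reading:

```python
import sqlite3

def delete_range(db_path, start_ts, end_ts):
    """Delete all archive records timestamped in [start_ts, end_ts] inclusive."""
    with sqlite3.connect(db_path) as conn:
        cur = conn.execute(
            "DELETE FROM archive WHERE dateTime >= ? AND dateTime <= ?",
            (start_ts, end_ts),
        )
        return cur.rowcount  # number of records deleted

def null_column_range(db_path, column, start_ts, end_ts):
    """Blank one observation column over a timestamp range, keeping the
    records and their other fields intact.  NULL, not 0.0, is how WeeWX
    represents a missing value."""
    if not column.isidentifier():
        # column names can't be bound as SQL parameters, so sanity-check
        raise ValueError("suspicious column name: %r" % column)
    with sqlite3.connect(db_path) as conn:
        cur = conn.execute(
            f"UPDATE archive SET {column} = NULL "
            "WHERE dateTime >= ? AND dateTime <= ?",
            (start_ts, end_ts),
        )
        return cur.rowcount  # number of records updated
```

As always before bulk edits, stop WeeWX and take a copy of the database first.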
Note that you may need to install the sqlite3 utility on your system using
something like (for Debian-based systems):
$ sudo apt install sqlite3