On 04/01/2024 11:40, Ralph Corderoy wrote:
> Hi,
>
> Tim wrote:
>
> > Probably obvious, but when you use containers do make sure you have a
> > solution in place for keeping current with updates.
>
> Not that obvious to me, as you probably suspected. I've not done much
> with containers before.
>
> Seems I need to poll to learn of a later tag on the remote hub.
> Presumably there's a command method instead of ‘browsing’ as various web
> pages suggest? And then it's a ‘pull, stop, run’ dance to get the new
> one running with the containers' data persisting through the image's
> volumes.
>
> ‘apt-get upgrade -u’ does have a certain charm to it, in comparison.
>
> Should I use Podman rather than Docker? I see it's daemonless. Would
> there be much other advantage if I'm just interested in a simple set up?
I've not explored Podman, but in terms of keeping things up to date I
use Watchtower. It runs as another container and can be configured to
auto-update, or just to notify. Personally I get it to both email me and
pop a message into my Discord server; the Discord bit is experimental as
much as anything, but I seem to be on so many of these platforms for
different things that I thought I'd get them all running and then ditch
whichever I didn't want to use.

Since I use docker-compose.yml files for setting up my containers, I can
simply run docker compose up -d to pull the new image and sort things
out. All my data is in external volumes or bind mounts. Named volumes
annoy me a bit because, to be platform agnostic, they have to live in a
set location, and with my disk setup that means I'm forced to use bind
mounts unless I want to loop back a Samba or NFS mount, which really
doesn't seem to make sense.
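
Roughly, the Watchtower side looks something like the sketch below in a
compose file. I'm typing this from memory, so treat the exact image name
and environment variable names as things to check against the Watchtower
docs rather than gospel:

    services:
      watchtower:
        image: containrrr/watchtower
        restart: unless-stopped
        volumes:
          # Watchtower watches the other containers via the Docker socket
          - /var/run/docker.sock:/var/run/docker.sock
        environment:
          # notify only; remove this to let it apply the updates itself
          - WATCHTOWER_MONITOR_ONLY=true
          - WATCHTOWER_NOTIFICATIONS=email
          - WATCHTOWER_NOTIFICATION_EMAIL_FROM=watchtower@example.com
          - WATCHTOWER_NOTIFICATION_EMAIL_TO=me@example.com
          - WATCHTOWER_NOTIFICATION_EMAIL_SERVER=smtp.example.com

If I do apply an update by hand, the dance then collapses to roughly:

    docker compose pull && docker compose up -d

which fetches any newer images and recreates only the containers whose
image or config has changed, leaving the volumes and bind mounts alone.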
Another option is Portainer, which I keep going back and forth on
whether I like or not. It notifies of updates, and I've got a free tier
business license (it was 5 nodes when I signed up, but I think that has
dropped to 3 now), which allows me to store my docker-compose.yml files
in Gitlab and pull from there. The secure data stays out of the repo: I
upload it separately in .env files that are stored locally, which also
lets me use a single docker-compose.yml file for multiple containers.

I'm trying to work out whether Portainer will work nicely with
Dockerfiles to let me modify an image, as I currently have a php-fpm
image that I run some commands against on creation to add MariaDB /
MySQL support and a few other bits, which takes a while when updating.
Building a new image instead would give a shorter downtime, but I've not
investigated how that would complicate monitoring for updates, as the
source image wouldn't actually be installed.
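
On the Dockerfile side, what I currently run by hand after creation
boils down to something like the sketch below (assuming the official
php:fpm image; docker-php-ext-install is the helper those images ship
with, though check the extension names for your PHP version):

    # Dockerfile
    FROM php:8.2-fpm
    # Bake in the MySQL/MariaDB extensions the stock image leaves out,
    # instead of installing them every time the container is recreated.
    RUN docker-php-ext-install mysqli pdo_mysql

    # docker-compose.yml
    services:
      php:
        build: ./php    # directory holding the Dockerfile above
        restart: unless-stopped

The snag, as I say, is that the update tools would then be watching my
locally built image rather than php:fpm itself, so the base image going
stale wouldn't be flagged automatically.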
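
Going back to the .env files, that side is just compose's normal
variable substitution, which is what lets one docker-compose.yml cover
several sites. A stripped-down example, with made-up names:

    # .env (kept on the server, not in Gitlab)
    SITE_NAME=example
    DB_ROOT_PASSWORD=changeme

    # docker-compose.yml
    services:
      db:
        image: mariadb:10.11
        environment:
          - MARIADB_ROOT_PASSWORD=${DB_ROOT_PASSWORD}
        volumes:
          - ./data/${SITE_NAME}/mysql:/var/lib/mysql

docker compose reads .env from the project directory automatically and
substitutes the ${...} values before starting anything.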
All this is fairly new stuff for me, and I've been out of things for a
couple of years caring for my dad with prostate cancer and Alzheimer's,
and after he died back in May I've been suffering from back problems and
my wife has been diagnosed with breast cancer. I started a server
migration to 64 bit and Docker containers and promptly went down with
Covid (as did my wife in the middle of chemotherapy), which stalled
things, and in spite of preparing with test setups I've found myself
completely reworking the setup to use shared Nginx, Mariadb and PHP
containers to ease the load on my poor old VPS.
Some of my Docker stuff is available to look at on Gitlab as snapshots
of what I'm actually playing with in a private repository. I've tried to
put decent notes with the config files:
https://gitlab.com/aptanet/docker-compose-files
--
Paul Tansom | Aptanet Ltd. | https://www.aptanet.com/ | 023 9238 0001
=============================================================================
Registered in England | Company No: 4905028 | Registered Office: Ralls House,
Parklands Business Park, Forrest Road, Denmead, Waterlooville, Hants, PO7 6XP
--
Next meeting: Online, Jitsi, Tuesday, 2024-02-06 20:00
Check to whom you are replying
Meetings, mailing list, IRC, ... http://dorset.lug.org.uk
New thread, don't hijack: mailto:dorset@mailman.lug.org.uk