On Fri, Mar 08, 2024 at 03:31:38PM +1000, Piers Rowan wrote:
> On 8/3/24 10:52, Craig Sanders via luv-main wrote:
> > 16GB isn't a lot these days. My guess is you're most likely running out
> > of RAM. The best solution is to add more RAM to the system if possible.
>
> That is not possible with this model. The strange thing is that this
That's a shame. Is the memory soldered in? If not, maybe there's a
third-party RAM kit that would fit?
Otherwise, do you have a desktop PC as well? Maybe you could offload some of
the workload to that, accessing it via ssh.
BTW, you can even tunnel X applications over ssh - I wouldn't want to use a
web browser or watch videos this way, but it should be fine for editing code,
and probably for LibreOffice Writer & Calc too. I use ssh like this quite
often, in particular whenever I need to scan documents with xsane, because my
scanner is connected to my home server rather than my workstation.
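For anyone who hasn't tried it, X forwarding is just a flag to ssh. A quick
sketch - the hostname and applications below are placeholders, not anything
specific to your setup:

```shell
# -X enables X11 forwarding (-Y for "trusted" forwarding, which some apps
# need); adding -C compresses the connection, which can help on slow links.
# "homeserver" and the application names are just examples.
ssh -X user@homeserver xsane
ssh -XC user@homeserver libreoffice --writer
```

Note that the remote end needs xauth installed, and its sshd_config must have
"X11Forwarding yes".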
Anyway, using ssh like this can work quite well if you're at the same location
(e.g. home) but can be inconvenient if you need to travel between multiple
locations (maybe a VPN could help, or at least static IP addresses).
As Sun used to say back in the 80s and 90s, "The Network is the
Computer". Using ssh like this (or rsh, as it would have been at the time) is
partly what they meant by that. Obviously, the faster and lower-latency the
network, the better - a wired 1Gbps or 2.5Gbps LAN would be better than
wireless, but even wireless is fine if there aren't too many other WLANs
nearby competing for bandwidth.
> behavior is recent. Running applications concurrently has never been an
> issue. Unless there is something bloaty about an update, this issue should
> never happen.
Unless you've drastically changed what you're doing lately, this is probably
just the "natural evolution" of software over time. Software tends to be
updated to do more stuff (to use a highly technical term), and doing more
stuff requires more resources - RAM in particular. Apps get bigger, and
the libraries they depend upon get bigger too. Software under rapid
development also tends to oscillate between growth spurts - new features and
experiments with little concern for optimisation - and periods of
consolidation, optimisation, and bug fixes.
> What I noticed was that the system chewed away at the swap to the tune of
> 2GB (its Max).
>
> I've increased the swap to 4GB:
>
> webgen@webgen-01:~$ sudo swapon -s
> Filename Type Size Used Priority
> /swapfile file 4194300 0 -2
I'd be inclined to increase that to at least 8GB if you have the disk space
available for it. And watch it closely for a while, just to see how close
your system gets to filling it up.
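For reference, growing the swapfile again is only a few commands. A sketch,
assuming your swap lives at /swapfile as shown in your swapon output (all of
this needs to run as root):

```shell
swapoff /swapfile            # take the old swapfile out of service
fallocate -l 8G /swapfile    # grow it (or: dd if=/dev/zero of=/swapfile bs=1M count=8192)
chmod 600 /swapfile          # swap files must not be readable by other users
mkswap /swapfile             # re-write the swap signature for the new size
swapon /swapfile             # enable it again; verify with swapon -s
```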
Also watch to see whether the swap usage fluctuates all the time (indicating
stuff is being constantly swapped in and out - this may be why you've used so
much of the drive's rated write endurance in only a few years) or whether it
settles down and mostly stabilises after a while.
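An easy way to watch for that is vmstat - the si/so columns show pages
swapped in and out per second, so sustained non-zero values there mean the
system is actively thrashing rather than just holding stale pages in swap:

```shell
vmstat 5             # print memory/swap stats every 5 seconds; watch si/so
watch -n 60 free -h  # or just glance at the swap totals once a minute
```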
BTW, in another message you were talking about replacing the NVMe drive. It's
worth mentioning that upgrading or replacing a drive doesn't have to require
a complete re-install. You can just copy one drive to another (either a
bit-wise copy using dd or similar, or a file copy using e.g. rsync, tar, or
cp -af), and optionally use gparted to adjust the size of your partition(s) if
necessary. If your laptop doesn't have two NVMe slots, you could back it up
to an external hard disk, e.g. using a bootable Clonezilla USB stick, and
then restore it to the new NVMe - this is probably the easiest way. Even if
you have two NVMe slots, Clonezilla is a convenient way to clone a system from
one drive to another.
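As a sketch of the two copy styles (the device names and mount points here
are examples only - check yours with lsblk before running anything this
destructive):

```shell
# Bit-wise clone of the whole drive, old -> new:
dd if=/dev/nvme0n1 of=/dev/nvme1n1 bs=4M status=progress conv=fsync

# Or a file-level copy between two already-mounted filesystems:
rsync -aAXH --info=progress2 /mnt/olddisk/ /mnt/newdisk/
```

The dd approach copies partition tables and bootloaders along with everything
else; the rsync approach needs you to partition and format the new drive
first, but copes with a smaller destination.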
If you're using UEFI to boot, it should just work because UEFI looks for the
FAT32 "EFI System" partition.
If you're using old-fashioned MBR booting, then you'll probably need
to re-install the grub boot sector on the new disk - you can do that from
the Clonezilla USB stick or some other "rescue" disk. You'll need to mount
your partition(s) under e.g. /target, then bind mount /proc, /sys, and /dev
under /target, then "chroot /target" before running grub-install. Then exit
from the chroot and unmount them. Clonezilla may do this for you
automatically when you clone or backup & restore an entire drive...or it may
not. I can't remember.
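Sketched out, the chroot dance looks something like this (the disk and
partition names are examples - substitute your own):

```shell
mount /dev/nvme1n1p2 /target              # root filesystem of the new drive
for d in proc sys dev; do
    mount --bind /$d /target/$d           # give the chroot a working /proc, /sys, /dev
done
chroot /target grub-install /dev/nvme1n1  # install the boot sector on the new disk
chroot /target update-grub                # regenerate grub.cfg
for d in proc sys dev; do umount /target/$d; done
umount /target
```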
> I am currently running:
>
> - Terminal
> - GIMP
Terminal probably doesn't use that much RAM, unless you have it set to
keep tens of thousands of lines in its scrollback buffer. GIMP can use a lot,
depending on what you're doing with it - roughly proportional to the size and
complexity of your artwork.
> - Visual Code
This is Microsoft's code editor, right? If so, it's probably bloatware -
MS aren't noted for efficient coding practices. So, it probably uses lots of
RAM. Dunno, I've never really been into using fancy GUI IDEs - vim + bash + a