Re: Best practices for updating systems over extremely slow links

2016-10-28 Thread Richard Owlett

On 10/28/2016 4:14 AM, Karl E. Jorgensen wrote:

> On Thu, Oct 27, 2016 at 03:45:21PM +, rbraun204 . wrote:
>
>> I have a couple of Debian boxes in very remote areas that are connected back to
>> our WAN via a 56kbps satellite link.  Most of the time we have a constant
>> stream of data coming/going to that machine, so the link is saturated quite a
>> bit.
>
> [snip]
> Normally, I'd suggest looking into running a private mirror and
> rsyncing it, but with only one machine at each location, that's
> overkill.
>
> I think you may want to look into apt-zip: this will decouple the
> download from apt, allowing you to get the data transferred by whatever
> method works (rsync?), and then pick up the transferred files with apt
> on the remote end...
>
> Unfortunately, apt-zip has been discontinued - it last appears in wheezy -
> but it may still work or be made to work?  It seems aimed at your
> exact use case...

> [snip]


Would apt-offline be appropriate? It is a maintained Debian package.

offline APT package manager
   https://packages.debian.org/jessie/apt-offline
Welcome to Offline APT Package Manager project!
   http://apt-offline.alioth.debian.org/
Offline Package Management for APT
   https://debian-administration.org/article/648/Offline_Package_Management_for_APT
Using APT Offline
   https://www.debian.org/doc/manuals/apt-offline/index.en.html
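
If I'm reading those docs right, the round trip looks roughly like this
(the file names below are just placeholders; check the manual above for
the exact options):

   # on the remote (disconnected) machine: record what it needs
   apt-offline set /tmp/remote.sig --update --upgrade

   # on a well-connected machine: fetch the index files and .debs
   apt-offline get /tmp/remote.sig --bundle /tmp/remote-bundle.zip

   # copy remote-bundle.zip across (rsync works), then on the remote machine:
   apt-offline install /tmp/remote-bundle.zip
   apt-get upgrade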





Re: Best practices for updating systems over extremely slow links

2016-10-28 Thread Karl E. Jorgensen
Hi

On Thu, Oct 27, 2016 at 03:45:21PM +, rbraun204 . wrote:
> I have a couple of Debian boxes in very remote areas that are connected back to
> our WAN via a 56kbps satellite link.  Most of the time we have a constant
> stream of data coming/going to that machine, so the link is saturated quite a
> bit.

Ouch. I guess that the data isn't for looking at cat videos then!

> I'm having all sorts of trouble getting apt to play nicely with the extremely
> slow link.  When I try to do an apt-get update, it seems to work for a while,
> then will start to download whichever list it's currently on all over again.
> I tried running apt-get update for about 24h and it would never completely
> download the amd64 main Packages.gz (around 7.5M).  It would just keep trying
> to start over and over again.  Maybe sometimes it will work, but more than 50%
> of the time it will crap out.  APT is configured to use a proxy server, and
> Acquire::http::Timeout is set to 300 via apt.conf.
> 
> FWIW, I can reliably rsync files over the sat link without issue.  It takes a
> while for sure, getting about 0.75-1.5 KB/s, so the files do get there.  It
> seems like whatever magic is baked into the rsync protocol to handle these
> slow links is working a lot more reliably for me than the HTTP GETs that apt
> is using.  Running rsync with bwlimit will work all day, I've found.

That's very odd. So TCP connections stay alive then.

> I'm currently trying to build a list of debs that the system wants using
> something like
>
> apt-get dist-upgrade --allow-unauthenticated -y --print-uris | grep -o "'http.*'" | tr "'" " " > downloads
>
> then wget'ing them locally and rsyncing them up to the remote.  It seems to be
> working so far, but the last failed apt-get update seemed to blow away the
> lists on the remote and I can no longer see any pending package upgrades on
> the system.
> 
> I've also tried tarring up /var/lib/apt/lists/* from a known working system and
> rsyncing that up to the remote, to try and update the lists manually I guess.
> But that didn't seem to work either.  After dropping the list files in
> /var/lib/apt/lists and running apt-get dist-upgrade, it still showed no pending
> updates, so I'm not sure why that would be.
> 
> So after all that, here are my questions :)
>
> 1.  Are there some crappy-link tweaks I can use in apt to help it with
> transferring data over a 1.5 KB/s link?
>
> 2.  In theory, if I wanted to transfer the apt-get update data via rsync,
> should I be able to tar up /var/lib/apt/lists/* and send that manually?  It
> didn't seem to work, but I would imagine there's more going on behind the
> scenes.

if the sources.list are identical, yes: I believe that should work.
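
Untested, but shipping the directory itself with rsync ought to work too -
something along these lines (the host name is a placeholder; --delete should
also clear out any half-downloaded lists left behind by the failed updates):

    rsync -az --partial --delete /var/lib/apt/lists/ root@remote-host:/var/lib/apt/lists/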

> 3.  Generally just curious what others have done when trying to keep systems
> up to date in very remote places with limited pipes.
>
>
> Worst case scenario: if we had to burn a CD full of debs monthly and ship it
> out to the remote, I guess that would work.  We also have our own custom repos
> with software that gets updated as well, but sometimes we would need to push
> those updates out asap.  Also, there is only 1 machine at each remote, so
> it's not an issue of running approx to save X machines all updating over the
> network at once.

Normally, I'd suggest looking into running a private mirror and
rsyncing it, but with only one machine at each location, that's
overkill.

I think you may want to look into apt-zip: this will decouple the
download from apt, allowing you to get the data transferred by whatever
method works (rsync?), and then pick up the transferred files with apt
on the remote end...

Unfortunately, apt-zip has been discontinued - it last appears in wheezy -
but it may still work or be made to work?  It seems aimed at your
exact use case...

Failing that I can imagine other (hand-crafted) solutions with these
components:

- make sure /etc/apt/sources.list (and /etc/apt/sources.list.d) are identical

- rsync /var/lib/apt/* across

- on the remote end: run:

apt-get upgrade --print-uris  # or similar

- grab the URLs

- download the *.debs and rsync them into /var/cache/apt/archives/

- on the remote end:

apt-get upgrade 

This is entirely off the top of my head, but with a bit more thought
and scripting, it _should_ work...
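
Roughly like this, as an untested sketch run from a well-connected machine
that has the same sources.list and a fresh apt-get update (the host name,
staging directory and bandwidth cap are placeholders):

  #!/bin/sh
  REMOTE=remote-host          # placeholder
  STAGE=/tmp/apt-stage        # local staging area for the .debs
  mkdir -p "$STAGE"

  # push the (identical) package lists across the slow link
  rsync -az --partial --bwlimit=1 /var/lib/apt/lists/ root@"$REMOTE":/var/lib/apt/lists/

  # ask the remote end which URIs an upgrade would fetch
  ssh root@"$REMOTE" "apt-get -y --print-uris upgrade" \
      | grep -o "'http[^']*'" | tr -d "'" > "$STAGE/uris.txt"

  # download the .debs locally, then trickle them into the remote's cache
  wget -q -P "$STAGE" -i "$STAGE/uris.txt"
  rsync -az --partial --bwlimit=1 "$STAGE"/*.deb root@"$REMOTE":/var/cache/apt/archives/

  # and finally, on the remote end:  apt-get upgrade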

Hope this helps
--
Karl



Re: Best practices for updating systems over extremely slow links

2016-10-27 Thread emetib
hello.

I don't know how many different setups you have going on in the boondocks,
but if they are basically all the same, you could install virtual machine(s)
at your location that mirror what is at your distant locations.

This is from page 108 of the Debian Handbook (comments added) -
Installing the same selection of packages several times

It can be useful to systematically install the same list of packages on several
computers. This can be done quite easily.

First, retrieve the list of packages installed on the computer which will serve
as the “model” to copy.

dpkg --get-selections > pkg-list

The pkg-list file then contains the list of installed packages. Next, transfer
the pkg-list file onto the computers you want to update and use the following 
commands:

## Update dpkg's database of known packages
# avail=`mktemp`
# apt-cache dumpavail > "$avail"
# dpkg --merge-avail "$avail"
# rm -f "$avail"
## Update dpkg's selections
# dpkg --set-selections < pkg-list
## Ask apt-get to install the selected packages
# apt-get dselect-upgrade
The first commands record the list of available packages in the dpkg database,
then dpkg --set-selections restores the selection of packages that you wish to
install, and the apt-get invocation executes the required operations! aptitude
does not have this command.

So with this you could basically mirror your remote system, running it as a
virtual guest.

With this you can do an 'apt-get -d upgrade' (on the virtual system); that
will only download the packages.  Then you can tar.gz them up, rsync the
tarball to the remote systems, unpack it in /var/cache/apt/archives/, and
run 'apt-get upgrade' (on the remote system), as sketched below.
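
Something like this (untested; the host name and paths are placeholders) -

   # on the virtual copy of the remote system
   apt-get update && apt-get -d -y upgrade     # -d = download only
   cd /var/cache/apt/archives && tar czf /tmp/debs.tar.gz *.deb
   rsync -az --partial --bwlimit=1 /tmp/debs.tar.gz remote-host:/tmp/

   # on the remote system
   tar xzf /tmp/debs.tar.gz -C /var/cache/apt/archives/
   apt-get upgrade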

If you set up a cron job to 'apt-get update' on the remote systems (before
you rsync the tar.gz), then you could just write a script that runs the
upgrade on them when the rsync is done.

For individual package updates, you could push those out on an as-needed
basis to upgrade what you need.

Depending on the variety of your remote systems, you could probably get away
with having all of the variants installed on one virtual system, and then push
the same tar.gz file to all of them; they will only install what is needed
from what is in the archives dir.



Best practices for updating systems over extremely slow links

2016-10-27 Thread rbraun204 .
I have a couple of Debian boxes in very remote areas that are connected
back to our WAN via a 56kbps satellite link.  Most of the time we have a
constant stream of data coming/going to that machine, so the link is
saturated quite a bit.

I'm having all sorts of trouble getting apt to play nicely with the
extremely slow link.  When I try to do an apt-get update, it seems to work
for a while, then will start to download whichever list it's currently on
all over again.  I tried running apt-get update for about 24h and it would
never completely download the amd64 main Packages.gz (around 7.5M).  It
would just keep trying to start over and over again.  Maybe sometimes it
will work, but more than 50% of the time it will crap out.  APT is
configured to use a proxy server, and Acquire::http::Timeout is set to 300
via apt.conf.
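
For reference, the relevant part of apt.conf looks roughly like this (the
proxy host is a placeholder; the commented-out lines are extra knobs apt
offers for flaky links that I haven't tried yet):

  Acquire::http::Proxy "http://proxy.example.com:3128/";
  Acquire::http::Timeout "300";
  // Acquire::Retries "10";
  // Acquire::http::Dl-Limit "1";   // in KB/s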

FWIW, I can reliably rsync files over the sat link without issue.  It
takes a while for sure, getting about 0.75-1.5 KB/s, so the files do get
there.  It seems like whatever magic is baked into the rsync protocol to
handle these slow links is working a lot more reliably for me than the
HTTP GETs that apt is using.  Running rsync with bwlimit will work all
day, I've found.
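
The invocation is something along these lines (file and host names are
placeholders; --bwlimit is in KB/s, and --partial keeps partially
transferred files around for the next attempt):

  rsync -avz --partial --bwlimit=1 some-file root@remotebox:/some/path/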

I'm currently trying to build a list of debs that the system wants using
something like

apt-get dist-upgrade --allow-unauthenticated -y --print-uris | grep -o "'http.*'" | tr "'" " " > downloads

then wget'ing them locally and rsyncing them up to the remote.  It seems to
be working so far, but the last failed apt-get update seemed to blow away
the lists on the remote and I can no longer see any pending package
upgrades on the system.

I've also tried tarring up /var/lib/apt/lists/* from a known working system
and rsyncing that up to the remote, to try and update the lists manually I
guess.  But that didn't seem to work either.  After dropping the list files
in /var/lib/apt/lists and running apt-get dist-upgrade, it still showed no
pending updates, so I'm not sure why that would be.

So after all that,  here are my questions :)

1.  Are there some crappy-link tweaks I can use in apt to help it with
transferring data over a 1.5 KB/s link?

2.  In theory, if I wanted to transfer the apt-get update data via rsync,
should I be able to tar up /var/lib/apt/lists/* and send that manually?
It didn't seem to work, but I would imagine there's more going on behind
the scenes.

3.  Generally just curious what others have done when trying to keep
systems up to date in very remote places with limited pipes.


Worst case scenario: if we had to burn a CD full of debs monthly and ship
it out to the remote, I guess that would work.  We also have our own custom
repos with software that gets updated as well, but sometimes we would need
to push those updates out asap.  Also, there is only 1 machine at each
remote, so it's not an issue of running approx to save X machines all
updating over the network at once.

Thanks guys.