Control: tag -1 -moreinfo

On Tue, Aug 13, 2019 at 08:44:07AM +0800, Paul Wise wrote:
> Package: apt
> Severity: wishlist
> X-Debbugs-CC:
> Control: block 871656 by -1
> For machines that are in a location with no Internet, apt-offline is a
> semi-convenient way to perform updates, upgrades and installs.
> There are two situations where offline machines can occur:
>  * systems in remote locations with no Internet access at all
>  * systems that are air-gapped and receive only incoming data, no
>    outgoing data is allowed for security reasons.
> Unfortunately it was discovered that apt-offline does not check
> signatures properly and the package was removed from Debian buster.
> In addition the interface that apt-offline uses for exporting the list
> of files that should be downloaded is just the --print-uris option,
> which I noticed only prints MD5 hashes when installing packages.
> It would be nice to resolve both of these issues properly by creating a
> bidirectional interface between external downloaders and apt.

I'm fine with having --print-uris fixed so that it reports all hashes,
and/or with exporting them via the JSON hooks.
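For illustration, this is roughly what --print-uris emits for an install today; note the MD5-only hash field. The package version, size and hash value below are made up:

```shell
# Hypothetical transcript; version, size and digest are placeholders.
$ apt-get install --print-uris hello
'http://deb.debian.org/debian/pool/main/h/hello/hello_2.10-2_amd64.deb' hello_2.10-2_amd64.deb 56132 MD5Sum:0123456789abcdef0123456789abcdef
```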

I think it's the wrong solution though. The correct solution is to bundle
the offline system's state (/var/lib/apt, /var/lib/dpkg/status, /etc/apt),
copy it to the online system, and then calculate and download the
update/upgrade there, and copy /var/lib/apt and /var/cache/apt back.

Hence I'd propose to have a bundle and an unbundle command or something.
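Until such commands exist, the workflow can be approximated by hand. A rough sketch, assuming both machines run the same architecture and a compatible apt, and using a hypothetical tarball name; the Dir:: options redirect apt at the copied state:

```shell
# On the offline machine: capture apt/dpkg state (hypothetical tarball name).
tar -cf offline-state.tar /var/lib/apt /var/lib/dpkg/status /etc/apt

# On the online machine: unpack into a scratch root and resolve/download
# against the offline machine's state instead of the local one.
ROOT=/tmp/offline-root
mkdir -p "$ROOT"
tar -xf offline-state.tar -C "$ROOT"
APT_OPTS="-o Dir::State=$ROOT/var/lib/apt \
  -o Dir::State::status=$ROOT/var/lib/dpkg/status \
  -o Dir::Etc=$ROOT/etc/apt \
  -o Dir::Cache=$ROOT/var/cache/apt"
apt-get $APT_OPTS update
apt-get $APT_OPTS --download-only dist-upgrade

# Copy $ROOT/var/lib/apt and $ROOT/var/cache/apt back to the offline
# machine, then run the actual upgrade there.
```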

> I suggest that such an interface should have these properties:
>  * be usable with all commands, including update, install, upgrade etc
>  * allow the downloader to be run on any kind of system with Internet
>    access, including Windows/macOS/Android etc machines

update is not possible: we don't know which files we'll have to
fetch, that depends on the server's response.

>  * allow the downloader to be as sophisticated or as dumb as needed
>  * tell the downloader what to download and what filenames to choose

>  * tell the downloader how to verify each download was correct,
>    including needed OpenPGP keys etc

Ugh, no. Verification is pretty complex; we don't want people to
reimplement it.

>  * optionally don't tell the downloader about local sources.list
>    transports like file:// cdrom:// copy:// since those probably won't
>    be available on the download system but in some circumstances they
>    could be if the sysadmins have set them up correctly
>  * some transports (mirror:// tor://) may need some special handling...

That's pointless: we can't know what the downloader supports, so we
can't leave anything out. We can tell it what to download, and that's
about it.

>  * allow imports of downloaded data from a directory, probably best to
>    leave it to apt-offline users to define how they transfer the data
>    to the import directory

Just put the files into apt's partial directory and run install; apt
will then recognize that they are already downloaded.
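In practice that could look like the following (default apt paths; the source directory and package name are hypothetical). apt verifies the files against the hashes in its lists, so a corrupted or tampered file is rejected rather than installed:

```shell
# Copy the pre-fetched .debs into apt's partial directory
# (hypothetical USB mount point).
cp /media/usb/*.deb /var/cache/apt/archives/partial/
# Running install makes apt check the files against its stored hashes
# and use them instead of re-downloading (hypothetical package name).
apt-get install somepackage
```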

>  * do verification twice, potentially once by the downloader (won't be
>    possible in all situations) and always by apt

See above.

debian developer - | - free software dev
ubuntu core developer                              i speak de, en
