Thanks for reporting this. I agree that it sounds useful, though it might
be very challenging due to the decentralized nature of git-lfs. I’m happy
to keep this bug open, but it seems better served by the upstream tracker
at https://github.com/git-lfs/git-lfs/issues.
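
In the meantime, a partial workaround may help, assuming the hang occurs
in the smudge (checkout-time download) or lock-verification steps; I have
not reproduced the commit-time hang locally, so treat this as a sketch
rather than a confirmed fix:

```shell
# Skip the smudge filter so clones/checkouts do not try to download
# LFS content from a possibly unreachable remote:
git lfs install --skip-smudge

# Or skip it for a single invocation via the environment:
GIT_LFS_SKIP_SMUDGE=1 git clone https://example.com/repo.git   # example URL

# If the stall is push-time lock verification, this stops that step
# from contacting the server:
git config lfs.locksverify false
```

Note that `git push --no-verify` also skips the pre-push hook that
uploads LFS objects, but then the remote will be missing those blobs
until you run `git lfs push` later.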

Stephen

On Jan 31, 2022 at 9:50:31 AM, Barak A. Pearlmutter <ba...@pearlmutter.net>
wrote:

> Package: git-lfs
> Version: 3.0.2-1
> Severity: wishlist
> X-Debbugs-Cc: none, Barak A. Pearlmutter <ba...@pearlmutter.net>
>
> I have a repo whose only remote is on a gitlab instance. I'm using
> git-lfs to manage large binary files in this repo. The remote goes down.
> Now "git add foo.pdf" followed by "git commit" freezes when *.pdf files
> are tracked by lfs: it waits forever for the remote while trying to
> transfer the big blobs.
>
> This violates what I consider a central concept of git, namely that
> operations are local unless you explicitly fetch or push. It means you
> cannot work with lfs while offline, like on an aeroplane, or even (as
> above) when the gitlab instance is offline for maintenance.
>
> There is also a potential security issue. Users might reasonably assume
> they can safely do "git add/commit/rebase" operations locally, with
> intermediate steps exposing secret information that is later removed
> before doing a push. Nope!
>
> Anyway: I *wish* git-lfs allowed offline operation, like git-annex does.
> It seems like it should be technically possible to wait until an
> lfs-tracked file (well, its https://git-lfs.github.com/spec/v1 smudge
> stub) is actually pushed before transferring the associated big binary
> blob. Or, at the very least, to give up and remember to try again later
> if there's a problem transferring a big binary blob.
>
> -- System Information:
> Versions of packages git-lfs depends on:
> ii  git    1:2.34.1-1
> ii  libc6  2.33-5
>
>
