On Thu, Jul 20, 2017 at 12:41 AM, Volodymyr Sendetskyi
<volodymy...@devcom.com> wrote:
> It is known that git handles the storage of binary files in its
> repositories badly.
> This is especially true for large files: even without any changes to
> these files, a full copy is kept in every snapshot. So even
> repositories with a small amount of code can grow very fast in size
> if they contain some large binary files. SVN is much better in this
> respect, because it sends changes to the server version of a file
> only when the file has actually been modified.
>
> So the question is: why not implement some feature that would
> handle this problem?

There are 'external' solutions such as git LFS and git-annex, mentioned
in replies nearby.
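Of the external options, Git LFS works by committing a small text pointer in place of each large file and keeping the actual content out of band on an LFS server. A minimal sketch, assuming the git-lfs extension is installed (the `*.psd` pattern is only an example):

```shell
# Sketch: track large Photoshop files with Git LFS.
# Assumes git-lfs is installed; '*.psd' is just an example pattern.
git lfs install                 # set up the LFS smudge/clean filters
git lfs track "*.psd"           # writes a tracking rule into .gitattributes
git add .gitattributes          # the rule is versioned like any other file
git add design.psd              # stored as a small pointer, not the full blob
git commit -m "Add design file via LFS"
# .gitattributes now contains a line of the form:
#   *.psd filter=lfs diff=lfs merge=lfs -text
```

The repository history then stays small regardless of how often the binary changes, since each commit records only the pointer.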

But note there are also efforts to handle large binary files internally:
https://public-inbox.org/git/3420d9ae9ef86b78af1abe721891233e3f5865a2.1500508695.git.jonathanta...@google.com/
https://public-inbox.org/git/20170713173459.3559-1-...@jeffhostetler.com/
https://public-inbox.org/git/20170620075523.26961-1-chrisc...@tuxfamily.org/
