Thanks for the reply. I just read the intro to Git, and I am concerned about
the part where it copies the whole repository into the developer's work area.
The developers really need only the one directory and the files under it; the
history contains TBs of data.
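For what it's worth, the two concerns above (pulling down the full history and the whole tree) can be narrowed with a shallow clone and the sparse-checkout recipe. Below is a minimal local sketch of the sparse-checkout side; the repository and directory names (origin.git, big-data/, tools/) are made up purely for illustration, and in practice you would add --depth 1 to the clone of a real remote URL to avoid the deep history as well:

```shell
# Sketch: check out only one directory of a repository (sparse checkout).
# Assumes a reasonably recent Git; names below are illustrative only.
set -e
tmp=$(mktemp -d); cd "$tmp"

# Build a toy repository with two top-level directories.
git init -q origin.git
cd origin.git
mkdir big-data tools
echo payload > big-data/file.bin
echo script  > tools/run.sh
git add .
git -c user.email=a@example.com -c user.name=a commit -qm init
cd ..

# Clone without checking anything out, then materialize only tools/.
# (Against a real remote you would also pass --depth 1 to skip old history.)
git clone -q --no-checkout origin.git work
cd work
git config core.sparseCheckout true
echo "tools/" > .git/info/sparse-checkout
git read-tree -mu HEAD

ls   # only tools/ appears in the working tree; big-data/ is skipped
```

Note that sparse checkout limits what lands in the working tree, not what is stored in .git; keeping TBs of history out of the clone is the job of a shallow clone (--depth) or of splitting the big binaries out of the repository entirely.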
From: Junio C Hamano [mailto:gits...@pobox.com]
Sent: Tuesday, May 20, 2014 1:18 PM
To: Stewart, Louis (IS)
Subject: EXT :Re: GIT and large files
"Stewart, Louis (IS)" <louis.stew...@ngc.com> writes:
> Can GIT handle versioning of large 20+ GB files in a directory?
I think you can "git add" such files, push/fetch histories that contain such
files over the wire, and "git checkout" such files, but naturally reading,
processing and writing 20+GB would take some time. In order to run operations
that need to see the changes, e.g. "git log -p", a real content-level merge,
etc., you would also need sufficient memory because we do things in-core.