Hi, I have a repository with about 55 GB of content, made up of binary files 
that are each under 100 MB (so none of them hit the limit that would force 
Git LFS), from a project that has nearly filled up an entire hard drive. I am 
trying to add all of the content to a git repo and push it to GitHub, but 
every time I run

git add .

in the folder with my contents after initializing and setting my remote, 
git starts copying every file into .git/objects, making the .git folder 
grow rapidly in size. All the files are binaries, so git cannot produce 
meaningful diffs between versions anyway, and there seems to be no reason 
to store full copies of them.
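The growth is easy to reproduce in isolation (a minimal sketch, assuming a 
POSIX shell with git and dd available; big.bin is just a stand-in name for 
one of the binaries):

```shell
# git add writes a compressed copy of every file into .git/objects
# as a blob, which is why the folder balloons for large binaries.
tmp=$(mktemp -d)
cd "$tmp"
git init -q
dd if=/dev/urandom of=big.bin bs=1M count=10 2>/dev/null  # stand-in binary
git add big.bin
du -sh .git/objects   # roughly the size of big.bin (random data barely compresses)
```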

Is there any way, such as by editing .gitattributes or changing something 
about how files are staged, to store only indexes or references to the 
files in the repository rather than copying them into the .git folder, 
while still being able to push all the data to GitHub?

-- 
You received this message because you are subscribed to the Google Groups "Git 
for human beings" group.