How do I download a large Git repository?

If you don't need to pull the whole history, you can limit the number of revisions to clone:

git clone <repo_url> --depth=1

Of course, this might not help if you have a particularly large file in your repository.
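In that case, if your Git is reasonably recent and the server supports partial clone (an assumption worth checking), you can also filter out large blobs at clone time; the 1m size limit below is just an illustrative value:

git clone <repo_url> --filter=blob:limit=1m

The skipped blobs are fetched on demand when a checkout actually needs them.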


1) You can initially clone a single branch with only the latest commit (depth=1); this significantly reduces the size of the download and still lets you work on the code base:

git clone --depth <Number> <repository> --branch <branch name> --single-branch

Example:
git clone --depth 1 https://github.com/dundermifflin/dwightsecrets.git --branch scranton --single-branch
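
To verify the clone really is shallow, you can check with standard Git commands (shown here only as a sanity check):

git rev-parse --is-shallow-repository   # prints "true"
git log --oneline                       # lists only the fetched commits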


2) Later you can fetch all the commits (after this, your repo will be in the same state as after a full git clone):

git fetch --unshallow

or, if that is still too much, fetch only the last 25 commits:

git fetch --depth=25
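
Alternatively, you can deepen the history incrementally; unlike --depth, which sets an absolute depth, --deepen extends the current history by that many extra commits:

git fetch --deepen=25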


Another way: git clone is not resumable, but you can first git clone the repository on a third-party server and then download the complete repo over HTTP/FTP, which is resumable.
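
A sketch of that workflow, assuming you have shell access to an intermediate server (the hostname and paths below are placeholders). A git bundle packs the whole repo into a single file you can download resumably and clone from:

# on the intermediate server, into a web-accessible directory
git clone --mirror <repo_url> repo.git
git -C repo.git bundle create /var/www/html/repo.bundle --all

# on your machine: wget -c resumes interrupted downloads
wget -c http://intermediate.example.com/repo.bundle
git clone repo.bundle myrepo

If the clone doesn't check out a working branch automatically, run git checkout <branch name> inside myrepo.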


The approach described in this answer (https://stackoverflow.com/a/22317479/6332374) worked perfectly for me, with one small improvement for a big repo:

First, turn off compression:

git config --global core.compression 0

then clone just a shallow part of your repo:

git clone --depth 1 <repo_URI>

and now "the rest"

git fetch --unshallow

But here is the trick: with a big repo you sometimes have to perform that step multiple times. So, again,

git fetch --unshallow

and so on.

Try it multiple times; you will probably see that each time you run 'unshallow' you get more objects before the error.
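
You can automate the retries with a small shell loop (a minimal sketch; the 5-second pause is an arbitrary choice):

until git fetch --unshallow; do
    echo "unshallow interrupted, retrying..."
    sleep 5
done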

And at the end, just to be sure:

git pull --all


One potential technique is just to clone a single branch; you can then pull in more later:

git clone [url_of_remote] --branch [branch_name] --single-branch
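
Later, to add more branches to a single-branch clone (the branch name is a placeholder, matching the bracket style above):

git remote set-branches --add origin [other_branch_name]
git fetch origin [other_branch_name]
git checkout [other_branch_name]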

Large repositories seem to be a major weakness with git. You can read about that at http://www.sitepoint.com/managing-huge-repositories-with-git/. This article mentions a git extension called git-annex that can help with large files. Check it out at https://git-annex.branchable.com/. It helps by allowing git to manage files without checking the files into git. Disclaimer: I've never tried it myself.
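For reference, basic git-annex usage looks roughly like this (untested here, per the disclaimer above; the filename is a placeholder):

git annex init
git annex add [big_file]
git commit -m "Add big file via git-annex"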

Some of the solutions at How do I clone a large Git repository on an unreliable connection? also may help.

EDIT: Since you just want the files, you may be able to try git archive. You'd use syntax something like:

git archive --remote=ssh://git@bitbucket.org/username/reponame.git --format=tar --output="file.tar" master

I tried to test on a repo at my AWS CodeCommit account, but it doesn't seem to allow it. Someone on BitBucket may be able to test. Note that on Windows you'd want to use zip rather than tar, and this all has to be done over an SSH connection, not HTTPS.
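
The zip variant would look something like this (same placeholder remote as above):

git archive --remote=ssh://git@bitbucket.org/username/reponame.git --format=zip --output="file.zip" master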

Read more about git archive at http://git-scm.com/docs/git-archive

Tags: git, bitbucket