Can't push to GitHub error: pack-objects died of signal 13

This might be caused by GitHub's limits on file sizes, or the packing process may be crashing under the sheer volume of data (as in this SO question from 2011: Alternative way to do an initial push of a large repo) before those limits can even be reported.

A few weeks ago, on a colleague's Mac OS X machine, I saw a similarly opaque error message when pushing (though only over HTTPS; SSH gave the proper error message). Trying to reproduce this just now by pushing a big file to GitHub from my own Mac, I got the expected informative error message over both protocols ("remote: error: File big is 976.56 MB; this exceeds GitHub's file size limit of 100 MB"). It's possible that my colleague had an older version of Git installed that couldn't report the error correctly.

If you decide you need to remove the large files from your Git history (as it's the only way you'll get them pushed up to GitHub), I can recommend The BFG:

$ java -jar bfg.jar  --strip-blobs-bigger-than 50M  my-repo.git

Full disclosure: I'm the author of the BFG Repo-Cleaner.
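One detail worth knowing: after the BFG rewrites history, the stripped blobs are only *unreachable*, not deleted, so the BFG's docs tell you to expire reflogs and run `git gc` before pushing. Here is a self-contained sketch of that pruning step on a throwaway repo, with a deliberately orphaned blob standing in for a BFG-stripped file (the repo and file names are made up for the demonstration):

```shell
# Demonstrate the post-BFG cleanup: rewritten-away blobs are merely
# unreachable until 'reflog expire' plus 'gc' removes them for real.
repo=$(mktemp -d)
cd "$repo"
git init -q .
git config user.email demo@example.com
git config user.name demo
echo kept > keep.txt
git add keep.txt
git commit -qm 'initial commit'
# Write a blob that nothing references (as if the BFG had stripped it)
blob=$(echo 'huge file contents' | git hash-object -w --stdin)
git cat-file -e "$blob" && echo "blob still present: $blob"
# The cleanup the BFG docs recommend before 'git push':
git reflog expire --expire=now --all
git gc --prune=now --aggressive -q
git cat-file -e "$blob" 2>/dev/null || echo "blob pruned: $blob"
```

On your real repository you would run the same two cleanup commands inside the cloned `my-repo.git` directory, then `git push`.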


A simple solution is to increase the HTTP POST buffer size so that larger chunks can be pushed up to the remote repo. To do that, run:

git config http.postBuffer 52428800

The value is in bytes, so in this case I have set it to 50 MB. The default is 1 MB.
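If you'd rather apply the setting to every repository for your user instead of just the current one, add `--global` (and you can read the stored value back to double-check it):

```shell
# Set the HTTP POST buffer to 50 MB (50 * 1024 * 1024 bytes) user-wide
git config --global http.postBuffer 52428800
# Verify what was stored
git config --global --get http.postBuffer
```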


If the process can't even start, it could also mean that the permissions on your server-side repository are wrong. I just faced this problem: the repository on the server was owned by root:root, so my git user couldn't write data there.

Check the ownership and permissions of the server's repository before trying to push again.
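A quick way to check is to inspect the bare repository on the server and test whether the pushing user can write to it. A sketch (the path `/srv/git/my-repo.git` and the `git` user/group are examples; substitute your own):

```shell
# Inspect who owns the bare repository on the server
repo=/srv/git/my-repo.git
ls -ld "$repo" 2>/dev/null || echo "repository not found at $repo"
# If the current (pushing) user cannot write there, fix the ownership
if [ ! -w "$repo" ]; then
    echo "no write access - fix ownership, e.g.: sudo chown -R git:git $repo"
else
    echo "write access OK"
fi
```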

Tags: Git, Github