
Error Cloning Repository Due to Large File Transfer (RPC failed) #91669

Open
alantomanu opened this issue Dec 4, 2024 · 5 comments

@alantomanu
Contributor

🐞 Problem

While trying to clone the repository using the standard git clone command, I encountered the following errors:

  • error: RPC failed; curl 18 Transferred a partial file
  • fetch-pack: unexpected disconnect while reading sideband packet
  • fatal: early EOF
  • fatal: fetch-pack: invalid index-pack output

These errors occurred because the repository is large enough that the transfer runs long, and the connection dropped before the full pack arrived, causing the clone to fail.
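For anyone debugging this themselves, Git supports tracing the transfer to see where the connection drops. A sketch (the URL is the same placeholder used below):

    # GIT_TRACE_PACKET logs the pack protocol; GIT_CURL_VERBOSE logs the HTTP layer
    GIT_TRACE_PACKET=1 GIT_CURL_VERBOSE=1 git clone https://github.com/your-name/first-contributions.git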


🎯 Goal

Provide a solution to address cloning issues for large repositories, ensuring contributors can clone the repository seamlessly without encountering RPC or buffer-related errors.


💡 Possible Solution

To resolve the issue, I used the following commands:

  1. Increase Git Buffer Size:

    git config --global http.postBuffer 524288000
  2. Perform a Shallow Clone:

    git clone --depth 1 https://github.com/your-name/first-contributions.git

This allowed me to successfully clone the repository with reduced data transfer requirements.
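Two related options may also be worth noting (a sketch; the partial-clone filter needs a reasonably recent Git and server-side support, which GitHub provides):

    # Expand a shallow clone to full history later, if ever needed
    git fetch --unshallow

    # Alternative to --depth 1: a blobless partial clone fetches commits and
    # trees up front but downloads file contents on demand
    git clone --filter=blob:none https://github.com/your-name/first-contributions.git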


📋 Steps to Solve the Problem

  1. Update the documentation or README file to include troubleshooting steps for cloning issues.
  2. Add the following solution for users facing similar problems:
    • Use the git config --global http.postBuffer command to increase the buffer size.
    • Consider performing a shallow clone using --depth 1.
  3. Optionally, provide an FAQ section or link to Git troubleshooting resources.

This issue will help future contributors who encounter similar problems while cloning large repositories. Thank you! 🎉

@Roshanjossey
Member

Hi @alantomanu, thank you for reporting this issue.

Also, props for figuring out the problem and fixing it yourself.

I'd rather not add extra commands or flags that aren't common in typical contribution workflows.

We've addressed the repo size problem before and solved it in #10499.

I'll take some time to measure this before making that change. In the meantime, let's keep this issue open for others who might face this.

@alantomanu
Contributor Author

Hi @Roshanjossey,

Thank you for the quick response and for acknowledging the issue! I completely understand the concern about introducing extra commands or flags that may not be necessary for the typical workflow.

I appreciate that the repository size problem has been addressed in the past (#10499), and I’m happy to keep this issue open so others facing the same challenge can find a solution. Please let me know if there’s any way I can assist in measuring or testing any proposed changes.

Thanks again, and I look forward to the resolution!

Best,
@alantomanu

@Roshanjossey
Member

> Please let me know if there’s any way I can assist in measuring or testing any proposed changes.

You could try it locally.

I use du to find the size.

This is what I'm getting from a fork:

 ❯ du -sh first-contributions                                                                       ~/open
 72M	first-contributions
 ❯ du -sh first-contributions/.git                                                                  ~/open
 70M	first-contributions/.git
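For a more detailed breakdown than du alone, git count-objects can help (a sketch, run inside the clone):

    # -v prints loose vs. packed object counts and sizes; -H makes sizes human-readable
    git count-objects -vH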

My solution would probably be to reset all changes to a single commit:

    git reset $(git commit-tree "HEAD^{tree}" -m "squash all changes to single commit")

Then measure again.
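Worth noting: the reset alone won't shrink .git until the old commits become unreachable and are pruned, so a fuller sketch (destructive; assumes a throwaway local clone) might be:

    # Point the current branch at a fresh single commit of the current tree
    git reset $(git commit-tree "HEAD^{tree}" -m "squash all changes to single commit")

    # Drop reflog entries that still reference the old history, then garbage-collect
    git reflog expire --expire=now --all
    git gc --prune=now --aggressive

    # Measure again
    du -sh .git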

I'll check what the default buffer size is and try to land below that.
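A quick way to check (per the git-config docs, http.postBuffer defaults to 1 MiB, and it mainly affects POST request bodies such as pushes rather than clone downloads, so it may not be the limiting factor here):

    # Prints the configured value if set; empty output means the 1 MiB default applies
    git config --get http.postBuffer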

Also, could you tell me what happens when you clone a repo with a similar number of commits? e.g. https://github.com/neovim/neovim

@alantomanu
Contributor Author

Thank you for the suggestion and detailed explanation! I'll try the steps locally and see if resetting the changes to a single commit reduces the size as expected. I'll measure using du as you suggested to verify the impact.

As for testing a repository with a similar number of commits, I’ll attempt to clone neovim/neovim and compare the behavior. I’ll report back on whether I encounter the same cloning issues and share the results here for further analysis.

Thanks again for your guidance and suggestions—I'll keep you updated!

Best,
@alantomanu

@alantomanu
Contributor Author

I tried your approach of using git reset to squash all changes into a single commit, but I ran into issues with the command. When I ran:

    git reset $(git commit-tree "HEAD^{tree}" -m "squash all changes to single commit")

I got an error about an unknown switch -m. I also hit errors when cloning other large repositories like neovim, which failed with a connection error (RPC failed; curl 18 Transferred a partial file), even after increasing the buffer size with git config --global http.postBuffer 524288000.
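In case it helps, commit-tree reads the commit message from stdin when -m is omitted, so a variant like this may sidestep the unknown-switch error (shell quoting of ^ can also matter, e.g. cmd.exe treats ^ as an escape character):

    # commit-tree without -m reads the message from standard input
    git reset "$(echo 'squash all changes to single commit' | git commit-tree 'HEAD^{tree}')"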

In response to your request, I tried cloning the neovim/neovim repository, which has a similar number of commits, but I encountered the same issue:

    git clone https://github.com/neovim/neovim.git
This led to the same error: RPC failed; curl 18 Transferred a partial file.

Additionally, based on my local testing, the .git directory in the first-contributions repo is quite large (around 70 MB), which may be contributing to the issue.

Do you have any further suggestions on how to address this, or would it be helpful to explore resetting the repo to a single commit as you proposed?

Thanks again!
