
Use memory-efficient Merkle tree computation when generating proofs #163

Open
noahfigueras opened this issue Sep 7, 2024 · 4 comments · May be fixed by #164
Comments

@noahfigueras

When trying to compute proofs for the beacon state validators array, it breaks with the following error:

memory allocation of 35184372088800 bytes failed
Aborted (core dumped)

I assume this is caused by the following line allocating the full node_count space. This can cause problems with larger lists, especially when they are not full. For example, if we try to prove that a validator in the BeaconState is active or exists, computing that proof will break because it allocates buffer slots for empty nodes, reserving space for the full list up to VALIDATOR_REGISTRY_LIMIT.
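For scale, a back-of-the-envelope sketch (assuming each tree node is one 32-byte SSZ chunk) of why the allocation is so large:

```rust
// Assumption: each Merkle tree node is one 32-byte SSZ chunk.
// A List with limit 2^40 has 2^40 leaf chunks, so eagerly allocating
// space for all leaves alone needs 2^40 * 32 = 2^45 bytes, i.e. ~35 TB,
// matching the magnitude of the failed allocation in the error above.
fn main() {
    const VALIDATOR_REGISTRY_LIMIT: u64 = 1 << 40; // 1099511627776
    const BYTES_PER_CHUNK: u64 = 32;
    let leaf_bytes = VALIDATOR_REGISTRY_LIMIT * BYTES_PER_CHUNK;
    println!("{leaf_bytes} bytes"); // 35184372088832 bytes
}
```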

You can reproduce it with the following code:

    use ssz_rs::prelude::*;

    // 2^40, the validator registry limit from the consensus specs
    const VALIDATOR_REGISTRY_LIMIT: usize = 1099511627776;

    let vec = vec![100u64, 1000u64];

    let h = List::<u64, VALIDATOR_REGISTRY_LIMIT>::try_from(vec).unwrap();
    let p = h.prove(&[1.into()]);
@noahfigueras noahfigueras changed the title Out-of-memory error due to excessive allocation size in Lists with big size. Out-of-memory error due to excessive allocation in Lists with big size. Sep 7, 2024
@ralexstokes
Owner

hi @noahfigueras thanks for this!

I think your assessment is correct. Basically, I used a more naive version of Merkle tree computation for proving, since in the initial implementation I focused on getting the APIs right, along with everything else around proof generation/verification.

Instead, the line you linked should use this implementation, which is memory-efficient and I think will resolve this problem.

I haven't had time to update the implementation but PRs are very welcome!

@ralexstokes ralexstokes changed the title Out-of-memory error due to excessive allocation in Lists with big size. Use memory-efficient Merkle tree computation when generating proofs Sep 9, 2024
@ralexstokes
Owner

I think a nice way forward would be to adjust Tree so that it can handle "virtual" nodes (subtrees that are instances of a "zero" tree), and then refactor merkleize_chunks_with_virtual_padding so that it uses this updated Tree (so just returning the root node from this function). Then, during proving, the Tree can be walked (via generalized indices) to compute the necessary internal nodes in a memory-efficient way.

@ralexstokes
Owner

ah, this is referenced in #141 btw

@noahfigueras
Author

hi @ralexstokes! I think that's a good approach; I'll try to get this working. Is it ok to move merkleize_chunks_with_virtual_padding inside Tree?
