Prefetch JUMPDESTs through RPC #427

Open
wants to merge 121 commits into base: develop
Changes from 34 commits
Commits (121)
7b01f5d
Implement JumpDest fetching from RPC.
einar-polygon Jul 15, 2024
4591482
feedback + cleanups
einar-polygon Sep 15, 2024
91c2945
cleanups
einar-polygon Sep 15, 2024
b58c5d6
fix overflow
einar-polygon Sep 15, 2024
037fb57
fmt
einar-polygon Sep 16, 2024
85ee8c2
fix testscripts
einar-polygon Sep 16, 2024
1243768
refactor
einar-polygon Sep 16, 2024
e7244c6
for testing
einar-polygon Sep 16, 2024
16e9c26
extract initcode
einar-polygon Sep 17, 2024
f3871d9
improve test script
einar-polygon Sep 17, 2024
4fd6b8b
fix stack issue
einar-polygon Sep 18, 2024
88eb73d
random fixes
einar-polygon Sep 18, 2024
39cd26c
fix CREATE2
einar-polygon Sep 18, 2024
8a964b8
fmt, clippy
einar-polygon Sep 18, 2024
32e68bf
investigate 15,35
einar-polygon Sep 19, 2024
71b003e
merge
einar-polygon Sep 19, 2024
76df518
Merge remote-tracking branch 'origin/develop' into einar/prefetch_tra…
einar-polygon Sep 19, 2024
184878d
fix scripts
einar-polygon Sep 19, 2024
c000b5a
remove redtape for JUMP/I
einar-polygon Sep 19, 2024
ec81701
misc
einar-polygon Sep 19, 2024
bff471e
fix ci
einar-polygon Sep 20, 2024
ca9620d
minimize diff
einar-polygon Sep 20, 2024
4c97c0f
include whole function in timeout
einar-polygon Sep 20, 2024
8bce013
avoid ensure macro
einar-polygon Sep 20, 2024
b0ebc2c
fix CREATE
einar-polygon Sep 20, 2024
62c7053
small adjustments
einar-polygon Sep 23, 2024
74b86fd
fmt
einar-polygon Sep 23, 2024
c8be888
feedback
einar-polygon Sep 23, 2024
d00439f
feedback
einar-polygon Sep 24, 2024
6876c07
Add JumpdestSrc parameter
einar-polygon Sep 24, 2024
60efef9
Refactor
einar-polygon Sep 24, 2024
b07752d
Add jmp src to native
einar-polygon Sep 24, 2024
66ea811
Feedback
einar-polygon Sep 24, 2024
f230b84
fixup! Feedback
einar-polygon Sep 24, 2024
90722a3
feedback
einar-polygon Sep 25, 2024
a0e0879
fix missing code for CREATE
einar-polygon Sep 25, 2024
6bff4e4
fix
einar-polygon Sep 26, 2024
5e4162d
Merge remote-tracking branch 'origin/develop' into einar/prefetch_tra…
einar-polygon Sep 26, 2024
c26f475
fix arguments
einar-polygon Sep 26, 2024
f367409
feedback
einar-polygon Sep 30, 2024
464dbc0
fix
einar-polygon Sep 30, 2024
2783ccd
debugging 460
einar-polygon Sep 30, 2024
abee812
debugging 460
einar-polygon Oct 1, 2024
8bfccdd
dbg
einar-polygon Oct 1, 2024
f9c2f76
bugfix
einar-polygon Oct 1, 2024
313d78d
dbg
einar-polygon Oct 1, 2024
09079e6
fix
einar-polygon Oct 1, 2024
7c84a63
batching working
einar-polygon Oct 1, 2024
4202ece
cleanups
einar-polygon Oct 1, 2024
e1124d3
feedback docs
einar-polygon Oct 2, 2024
1f8d476
feedback
einar-polygon Oct 2, 2024
7319a15
feedback filtermap
einar-polygon Oct 2, 2024
d4838e0
review
einar-polygon Oct 2, 2024
27b5719
fmt
einar-polygon Oct 3, 2024
eaf3ed7
fix set_jumpdest_analysis_inputs_rpc
einar-polygon Oct 3, 2024
10b6a22
discuss: deser in #427 (#681)
0xaatif Oct 3, 2024
c11d17d
feat: block structlog retrieval (#682)
atanmarko Oct 7, 2024
06b1913
better tracing
einar-polygon Oct 9, 2024
339f6af
bug fix
einar-polygon Oct 9, 2024
61a6b6a
json
einar-polygon Oct 9, 2024
f8f0a85
reinstantiate timeout
einar-polygon Oct 9, 2024
8d609ad
merge
einar-polygon Oct 9, 2024
dbb65ea
ignore None
einar-polygon Oct 9, 2024
d415d22
feedback
einar-polygon Oct 9, 2024
54a7df8
feedback: rustdoc
einar-polygon Oct 9, 2024
44b421c
feedback: add user-specified timeout
einar-polygon Oct 10, 2024
98b9c8e
feedback
einar-polygon Oct 11, 2024
4707d38
fix: addresses
einar-polygon Oct 14, 2024
4843501
todo: fix todo
einar-polygon Oct 14, 2024
8f980d2
testing: improve prove_stdio script
einar-polygon Oct 14, 2024
ee7e5f3
testing: improve test_native script
einar-polygon Oct 14, 2024
36557d1
Merge remote-tracking branch 'origin/develop' into einar/prefetch_tra…
einar-polygon Oct 14, 2024
5451399
fmt
einar-polygon Oct 14, 2024
e9a8702
Round 5
einar-polygon Oct 14, 2024
b2f66ed
testing
einar-polygon Oct 15, 2024
cfb293c
testing: improve reporting, add error cases
einar-polygon Oct 15, 2024
3c497cc
change exit code
einar-polygon Oct 15, 2024
2dc52cb
don't panic!
einar-polygon Oct 16, 2024
dd89251
fix type 5 errors
einar-polygon Oct 18, 2024
6c59c41
Fix: 19548491
einar-polygon Oct 19, 2024
0d7f6b7
add stats
einar-polygon Oct 19, 2024
b8cf325
dbg
einar-polygon Oct 21, 2024
e9ec9f8
remove test scripts
einar-polygon Oct 21, 2024
7263ea3
remove modifications
einar-polygon Oct 21, 2024
ced5d5f
rename a
einar-polygon Oct 21, 2024
7dc2f51
remove todo
einar-polygon Oct 21, 2024
2848ede
add derive_more and add docs
einar-polygon Oct 21, 2024
552d569
clean up
einar-polygon Oct 21, 2024
83a0820
reinstantiate failover simulation
einar-polygon Oct 21, 2024
81f847c
use Hash2code
einar-polygon Oct 21, 2024
2313337
re-add prove_stdio.sh
einar-polygon Oct 21, 2024
206e9a4
mv derive_more
einar-polygon Oct 21, 2024
0b7e997
cleanup
einar-polygon Oct 21, 2024
d67911e
remove tracing
einar-polygon Oct 21, 2024
ee64de9
Merge branch 'develop' into einar/prefetch_transaction_jumps/pr
einar-polygon Oct 21, 2024
6b88ff6
fix derive_more
einar-polygon Oct 21, 2024
d202dd9
Merge remote-tracking branch 'origin/develop' into einar/prefetch_tra…
einar-polygon Oct 21, 2024
ef42fb2
Merge branch 'develop' into einar/prefetch_transaction_jumps/pr
einar-polygon Nov 4, 2024
b1d5eaa
.
einar-polygon Nov 4, 2024
078d95a
.
einar-polygon Nov 5, 2024
89d3756
fix: search for witness index
einar-polygon Nov 7, 2024
04e2df9
remove ctx
einar-polygon Nov 7, 2024
3df0292
workaround
einar-polygon Nov 7, 2024
bd02ee3
.
einar-polygon Nov 8, 2024
7b7cb46
bug investigations
einar-polygon Nov 11, 2024
8811d23
before going back
einar-polygon Nov 11, 2024
c9bc69d
.
einar-polygon Nov 11, 2024
200f6ac
remove max_wctx
einar-polygon Nov 11, 2024
3f283c9
.
einar-polygon Nov 12, 2024
4483db6
add empty contexts
einar-polygon Nov 12, 2024
1d8b230
insert extra contexts. confirm empty empty tables
einar-polygon Nov 12, 2024
edab205
remove offset
einar-polygon Nov 13, 2024
5ee17da
.
einar-polygon Nov 13, 2024
71ecdf1
debug info
einar-polygon Nov 13, 2024
19a624c
fix
einar-polygon Nov 13, 2024
fde63a0
.
einar-polygon Nov 14, 2024
5157244
premerge
einar-polygon Nov 14, 2024
ca7ccac
merge
einar-polygon Nov 14, 2024
cb98478
748 batched passing
einar-polygon Nov 14, 2024
e47731d
n=18 ok
einar-polygon Nov 15, 2024
55a5dd8
test batching
einar-polygon Nov 15, 2024
2 changes: 2 additions & 0 deletions Cargo.lock

Some generated files are not rendered by default.

6 changes: 5 additions & 1 deletion Cargo.toml
@@ -36,6 +36,8 @@ alloy = { version = '0.3.0', default-features = false, features = [
"transport-http",
"rpc-types-debug",
] }
alloy-primitives = "0.8.0"
alloy-serde = "0.3.0"
anyhow = "1.0.86"
async-stream = "0.3.5"
axum = "0.7.5"
@@ -51,6 +53,7 @@ criterion = "0.5.1"
dotenvy = "0.15.7"
either = "1.12.0"
enum-as-inner = "0.6.0"
enumn = "0.1.13"
env_logger = "0.11.3"
eth_trie = "0.4.0"
ethereum-types = "0.14.1"
@@ -87,14 +90,15 @@ serde = "1.0.203"
serde-big-array = "0.5.1"
serde_json = "1.0.118"
serde_path_to_error = "0.1.16"
serde_with = "3.8.1"
sha2 = "0.10.8"
static_assertions = "1.1.0"
thiserror = "1.0.61"
tiny-keccak = "2.0.2"
tokio = { version = "1.38.0", features = ["full"] }
toml = "0.8.14"
tower = "0.4"
tracing = "0.1"
tracing = { version = "0.1", features = ["attributes"] }
tracing-subscriber = { version = "0.3", features = ["env-filter"] }
u4 = "0.1.0"
uint = "0.9.5"
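The `tracing` dependency now enables the `attributes` feature, which provides the `#[tracing::instrument]` attribute macro. A minimal sketch of the span-per-call style this unlocks (the helper below is hypothetical and not part of this PR):

```rust
use tracing::instrument;

// Hypothetical helper, for illustration only: `#[instrument]` opens a span per
// call and records the function's arguments, which is what the `attributes`
// feature of `tracing` enables.
#[instrument(level = "trace", skip(code))]
fn count_jumpdests(code: &[u8]) -> usize {
    // Naive byte scan (ignores PUSH immediates); the point here is the
    // instrumentation, not correct jumpdest analysis.
    let n = code.iter().filter(|&&op| op == 0x5b).count(); // 0x5b = JUMPDEST
    tracing::trace!(n, "counted JUMPDEST opcodes");
    n
}

fn main() {
    tracing_subscriber::fmt()
        .with_max_level(tracing::Level::TRACE)
        .init();
    // PUSH1 0x04, JUMP, JUMPDEST, STOP
    assert_eq!(count_jumpdests(&[0x60, 0x04, 0x56, 0x5b, 0x00]), 1);
}
```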
1 change: 1 addition & 0 deletions evm_arithmetization/benches/fibonacci_25m_gas.rs
@@ -193,6 +193,7 @@ fn prepare_setup() -> anyhow::Result<GenerationInputs<F>> {
prev_hashes: vec![H256::default(); 256],
cur_hash: H256::default(),
},
jumpdest_table: Default::default(),
})
}

73 changes: 65 additions & 8 deletions evm_arithmetization/src/cpu/kernel/interpreter.rs
@@ -5,11 +5,13 @@
//! the future execution and generate nondeterministically the corresponding
//! jumpdest table, before the actual CPU carries on with contract execution.

use core::option::Option::None;
use std::collections::{BTreeMap, BTreeSet, HashMap};

use anyhow::anyhow;
use ethereum_types::{BigEndianHash, U256};
use log::Level;
use keccak_hash::H256;
use log::{trace, Level};
use mpt_trie::partial_trie::PartialTrie;
use plonky2::hash::hash_types::RichField;
use serde::{Deserialize, Serialize};
@@ -19,7 +21,9 @@ use crate::cpu::columns::CpuColumnsView;
use crate::cpu::kernel::aggregator::KERNEL;
use crate::cpu::kernel::constants::global_metadata::GlobalMetadata;
use crate::generation::debug_inputs;
use crate::generation::jumpdest::{ContextJumpDests, JumpDestTableProcessed, JumpDestTableWitness};
use crate::generation::mpt::{load_linked_lists_and_txn_and_receipt_mpts, TrieRootPtrs};
use crate::generation::prover_input::get_proofs_and_jumpdests;
use crate::generation::rlp::all_rlp_prover_inputs_reversed;
use crate::generation::state::{
all_ger_prover_inputs, all_withdrawals_prover_inputs_reversed, GenerationState,
@@ -56,6 +60,7 @@ pub(crate) struct Interpreter<F: RichField> {
/// Counts the number of appearances of each opcode. For debugging purposes.
#[allow(unused)]
pub(crate) opcode_count: [usize; 0x100],
/// A table of contexts and their reached JUMPDESTs.
jumpdest_table: HashMap<usize, BTreeSet<usize>>,
/// `true` if we are currently carrying out a jumpdest analysis.
pub(crate) is_jumpdest_analysis: bool,
@@ -71,9 +76,9 @@ pub(crate) struct Interpreter<F: RichField> {
pub(crate) fn simulate_cpu_and_get_user_jumps<F: RichField>(
final_label: &str,
state: &GenerationState<F>,
) -> Option<HashMap<usize, Vec<usize>>> {
) -> Option<(JumpDestTableProcessed, JumpDestTableWitness)> {
match state.jumpdest_table {
Some(_) => None,
Some(_) => Default::default(),
None => {
let halt_pc = KERNEL.global_labels[final_label];
let initial_context = state.registers.context;
@@ -92,16 +97,15 @@ pub(crate) fn simulate_cpu_and_get_user_jumps<F: RichField>(

let clock = interpreter.get_clock();

interpreter
let jdtw = interpreter
.generation_state
.set_jumpdest_analysis_inputs(interpreter.jumpdest_table);
.set_jumpdest_analysis_inputs(interpreter.jumpdest_table.clone());

log::debug!(
"Simulated CPU for jumpdest analysis halted after {:?} cycles.",
clock
);

interpreter.generation_state.jumpdest_table
(interpreter.generation_state.jumpdest_table).map(|x| (x, jdtw))
}
}
}
@@ -114,7 +118,7 @@ pub(crate) struct ExtraSegmentData {
pub(crate) withdrawal_prover_inputs: Vec<U256>,
pub(crate) ger_prover_inputs: Vec<U256>,
pub(crate) trie_root_ptrs: TrieRootPtrs,
pub(crate) jumpdest_table: Option<HashMap<usize, Vec<usize>>>,
pub(crate) jumpdest_table: Option<JumpDestTableProcessed>,
pub(crate) accounts: BTreeMap<U256, usize>,
pub(crate) storage: BTreeMap<(U256, U256), usize>,
pub(crate) next_txn_index: usize,
@@ -150,6 +154,59 @@ pub(crate) fn set_registers_and_run<F: RichField>(
interpreter.run()
}

/// Computes the JUMPDEST proofs for each context.
///
/// # Arguments
///
/// - `jumpdest_table_rpc`: The raw table received from RPC.
/// - `code_map`: The corresponding database of contract code used in the trace.
pub(crate) fn set_jumpdest_analysis_inputs_rpc(
jumpdest_table_rpc: &JumpDestTableWitness,
code_map: &HashMap<H256, Vec<u8>>,
) -> JumpDestTableProcessed {
let ctx_proofs = (*jumpdest_table_rpc)
.iter()
.flat_map(|(code_addr, ctx_jumpdests)| {
let code = if code_map.contains_key(code_addr) {
&code_map[code_addr]
} else {
&vec![]
};
trace!(
"code: {:?}, code_addr: {:?} <============",
&code,
&code_addr
);
trace!("code_map: {:?}", &code_map);
Collaborator

Do we want to keep this?

Contributor Author

I want to keep all tracing until we are happy with the computed jumpdest tables. It doesn't take longer to re-add it than to remove it.

prove_context_jumpdests(code, ctx_jumpdests)
})
.collect();
JumpDestTableProcessed::new(ctx_proofs)
}

/// Orchestrates the proving of all contexts in a specific bytecode.
///
/// # Arguments
///
/// - `ctx_jumpdests`: Map from `ctx` to its list of offsets to reached
/// `JUMPDEST`s.
/// - `code`: The bytecode for the contexts. This is the same for all contexts.
fn prove_context_jumpdests(
code: &[u8],
ctx_jumpdests: &ContextJumpDests,
) -> HashMap<usize, Vec<usize>> {
ctx_jumpdests
.0
.iter()
.map(|(&ctx, jumpdests)| {
let proofs = jumpdests.last().map_or(Vec::default(), |&largest_address| {
get_proofs_and_jumpdests(code, largest_address, jumpdests.clone())
});
(ctx, proofs)
})
.collect()
}

impl<F: RichField> Interpreter<F> {
/// Returns an instance of `Interpreter` given `GenerationInputs`, and
/// assuming we are initializing with the `KERNEL` code.
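For orientation, a standalone sketch (not part of the diff) of the flattening that `prove_context_jumpdests` performs: each context's set of reached `JUMPDEST` offsets becomes one interleaved `[proof_0, jumpdest_0, proof_1, jumpdest_1, ...]` list. The `dummy_proofs` helper stands in for the real `get_proofs_and_jumpdests`, which computes actual proof offsets from the bytecode:

```rust
use std::collections::{BTreeSet, HashMap};

// Stand-in for `get_proofs_and_jumpdests`: here every jumpdest is simply
// paired with a zero proof so the interleaved layout stays visible.
fn dummy_proofs(_code: &[u8], _largest: usize, jumpdests: &BTreeSet<usize>) -> Vec<usize> {
    jumpdests.iter().flat_map(|&dest| [0usize, dest]).collect()
}

// Mirrors the shape of `prove_context_jumpdests`: one flattened proof list
// per context, empty if the context reached no jumpdest.
fn flatten(
    code: &[u8],
    ctx_jumpdests: &HashMap<usize, BTreeSet<usize>>,
) -> HashMap<usize, Vec<usize>> {
    ctx_jumpdests
        .iter()
        .map(|(&ctx, jumpdests)| {
            let proofs = jumpdests
                .last()
                .map_or(Vec::new(), |&largest| dummy_proofs(code, largest, jumpdests));
            (ctx, proofs)
        })
        .collect()
}

fn main() {
    let jumpdests: BTreeSet<usize> = [1, 5, 7].into_iter().collect();
    let table = flatten(&[], &HashMap::from([(3, jumpdests)]));
    // Proofs interleaved with jumpdest offsets, as in the jumpdest analysis test.
    assert_eq!(table[&3], vec![0, 1, 0, 5, 0, 7]);
}
```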
2 changes: 2 additions & 0 deletions evm_arithmetization/src/cpu/kernel/tests/add11.rs
@@ -194,6 +194,7 @@ fn test_add11_yml() {
prev_hashes: vec![H256::default(); 256],
cur_hash: H256::default(),
},
jumpdest_table: Default::default(),
};

let initial_stack = vec![];
@@ -371,6 +372,7 @@ fn test_add11_yml_with_exception() {
prev_hashes: vec![H256::default(); 256],
cur_hash: H256::default(),
},
jumpdest_table: Default::default(),
};

let initial_stack = vec![];
@@ -10,13 +10,14 @@ use plonky2::hash::hash_types::RichField;
use crate::cpu::kernel::aggregator::KERNEL;
use crate::cpu::kernel::interpreter::Interpreter;
use crate::cpu::kernel::opcodes::{get_opcode, get_push_opcode};
use crate::generation::jumpdest::JumpDestTableProcessed;
use crate::memory::segments::Segment;
use crate::witness::memory::MemoryAddress;
use crate::witness::operation::CONTEXT_SCALING_FACTOR;

impl<F: RichField> Interpreter<F> {
pub(crate) fn set_jumpdest_analysis_inputs(&mut self, jumps: HashMap<usize, BTreeSet<usize>>) {
self.generation_state.set_jumpdest_analysis_inputs(jumps);
let _ = self.generation_state.set_jumpdest_analysis_inputs(jumps);
}

pub(crate) fn get_jumpdest_bit(&self, offset: usize) -> U256 {
@@ -106,7 +107,10 @@ fn test_jumpdest_analysis() -> Result<()> {
interpreter.generation_state.jumpdest_table,
// Context 3 has jumpdest 1, 5, 7. All have proof 0 and hence
// the list [proof_0, jumpdest_0, ... ] is [0, 1, 0, 5, 0, 7, 8, 40]
Some(HashMap::from([(3, vec![0, 1, 0, 5, 0, 7, 8, 40])]))
Some(JumpDestTableProcessed::new(HashMap::from([(
3,
vec![0, 1, 0, 5, 0, 7, 8, 40]
)])))
);

// Run jumpdest analysis with context = 3
@@ -123,14 +127,14 @@

// We need to manually pop the jumpdest_table and push its value on the top of
// the stack
interpreter
(*interpreter
.generation_state
.jumpdest_table
.as_mut()
.unwrap()
.get_mut(&CONTEXT)
.unwrap()
.pop();
.unwrap())
.get_mut(&CONTEXT)
.unwrap()
.pop();
interpreter
.push(41.into())
.expect("The stack should not overflow");
@@ -175,7 +179,9 @@ fn test_packed_verification() -> Result<()> {
let mut interpreter: Interpreter<F> =
Interpreter::new(write_table_if_jumpdest, initial_stack.clone(), None);
interpreter.set_code(CONTEXT, code.clone());
interpreter.generation_state.jumpdest_table = Some(HashMap::from([(3, vec![1, 33])]));
interpreter.generation_state.jumpdest_table = Some(JumpDestTableProcessed::new(HashMap::from(
[(3, vec![1, 33])],
)));

interpreter.run()?;

@@ -188,7 +194,9 @@
let mut interpreter: Interpreter<F> =
Interpreter::new(write_table_if_jumpdest, initial_stack.clone(), None);
interpreter.set_code(CONTEXT, code.clone());
interpreter.generation_state.jumpdest_table = Some(HashMap::from([(3, vec![1, 33])]));
interpreter.generation_state.jumpdest_table = Some(JumpDestTableProcessed::new(
HashMap::from([(3, vec![1, 33])]),
));

assert!(interpreter.run().is_err());

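The test above reaches into the table via `(*interpreter.generation_state.jumpdest_table.as_mut().unwrap())`, i.e. the new `JumpDestTableProcessed` wrapper is dereferenced back to the underlying map. A minimal sketch of such a newtype, assuming it exposes the inner `HashMap` through `Deref`/`DerefMut` (the crate's actual definition is not shown in this diff; the commit history suggests it is built with `derive_more`):

```rust
use std::collections::HashMap;
use std::ops::{Deref, DerefMut};

// Hypothetical stand-in for the crate's `JumpDestTableProcessed`; only the
// `new(HashMap<usize, Vec<usize>>)` constructor is visible in the diff, and
// the Deref impls are an assumption.
#[derive(Clone, Debug, Default, PartialEq)]
pub struct JumpDestTableProcessed(HashMap<usize, Vec<usize>>);

impl JumpDestTableProcessed {
    pub fn new(ctx_proofs: HashMap<usize, Vec<usize>>) -> Self {
        Self(ctx_proofs)
    }
}

impl Deref for JumpDestTableProcessed {
    type Target = HashMap<usize, Vec<usize>>;
    fn deref(&self) -> &Self::Target {
        &self.0
    }
}

impl DerefMut for JumpDestTableProcessed {
    fn deref_mut(&mut self) -> &mut Self::Target {
        &mut self.0
    }
}

fn main() {
    let mut table =
        JumpDestTableProcessed::new(HashMap::from([(3, vec![0, 1, 0, 5, 0, 7, 8, 40])]));
    // Pop the last entry through the wrapper, as the test does before pushing
    // a replacement value onto the interpreter stack.
    (*table).get_mut(&3).unwrap().pop();
    assert_eq!(table.get(&3), Some(&vec![0, 1, 0, 5, 0, 7, 8]));
}
```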
@@ -101,6 +101,7 @@ fn test_init_exc_stop() {
cur_hash: H256::default(),
},
ger_data: None,
jumpdest_table: Default::default(),
};
let initial_stack = vec![];
let initial_offset = KERNEL.global_labels["init"];