
Encode Execution Engine Client Version In Graffiti #5290

Merged
9 commits merged into sigp:unstable on Apr 24, 2024

Conversation

ethDreamer
Member

Issue Addressed

Additional Info

All logic related to the choice of graffiti in the beacon node has been moved into:

beacon_node/beacon_chain/src/graffiti_calculator.rs

The graffiti field in the BeaconChain object has been replaced by an instance of GraffitiCalculator. The correct graffiti is obtained by calling the following method on chain.graffiti_calculator:

    /// Returns the appropriate graffiti to use for block production, prioritizing
    /// sources in the following order:
    /// 1. Graffiti specified by the validator client.
    /// 2. Graffiti specified by the user via beacon node CLI options.
    /// 3. The EL & CL client version string, applicable when the EL supports version specification.
    /// 4. The default lighthouse version string, used if the EL lacks version specification support.
    pub async fn get_graffiti(&self, validator_graffiti: Option<Graffiti>) -> Graffiti
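
For orientation, here is a minimal sketch of a call site for this method; the `chain` handle and the surrounding block-production code are assumed for illustration and are not taken from this PR:

    // Hypothetical call site during block production: pass through whatever
    // graffiti the validator client supplied (if any); the calculator falls
    // back through the priority list documented above.
    let graffiti = chain
        .graffiti_calculator
        .get_graffiti(validator_graffiti)
        .await;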

@ethDreamer changed the title from "El client version graffiti" to "Encode Execution Engine Client Version In Graffiti" on Feb 25, 2024
@ethDreamer added the ready-for-review label on Feb 25, 2024
@macladson (Member) left a comment

Looks good! Just had a few comments:

beacon_node/execution_layer/src/engine_api.rs (resolved)
beacon_node/client/src/builder.rs (resolved)
beacon_node/execution_layer/src/lib.rs (resolved)
beacon_node/execution_layer/src/engine_api.rs (resolved)
@macladson (Member) left a comment

LGTM!

pub fn start_engine_version_cache_refresh_service<T: BeaconChainTypes>(
Member

I'm not sure why we need to maintain a service to keep updating the version.
Why not instead do it once on startup and then again in BeaconChain::prepare_beacon_proposer? Repeatedly updating something that doesn't change seems quite wasteful to me.

ethDreamer (Member, Author)

We could do it there, but then it would happen right in the window between notifying the EL that there's a proposal and actually proposing the block, which is a critical time. The whole reason I wrote it this way was to avoid extra calls to the EL within that critical window.

Presumably we could have a service that only caches this result when we detect that we are connected to a validator that will have a proposal within the next epoch, but that code would surely be more complex than this service.
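
To make the trade-off in this thread concrete, here is a rough, illustrative sketch of the background-refresh approach: a task re-fetches the EL client version on a fixed cadence so the cached value is already warm when a proposal arrives. The names here (engine_version_refresh_loop, refresh) are hypothetical and not the PR's actual code; it assumes a tokio runtime.

    // Illustrative only: periodically refresh the cached EL client version so
    // block production never has to make this call in the critical window.
    async fn engine_version_refresh_loop<F, Fut>(
        epoch_duration: std::time::Duration,
        refresh: F,
    ) where
        F: Fn() -> Fut,
        Fut: std::future::Future<Output = ()>,
    {
        let mut interval = tokio::time::interval(epoch_duration);
        loop {
            interval.tick().await;
            // `refresh` stands in for whatever call re-queries the EL client
            // version (e.g. via engine_getClientVersionV1) and updates the cache.
            refresh().await;
        }
    }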

Member

I don't think that making a single call to an EL that returns a preset value should be very intensive, but your point is fair. Better not to cram more calls than necessary just before a proposal.

// this service should run 3/8 of the way through the epoch
let epoch_delay = (epoch_duration * 3) / 8;
Collaborator

Why at 3/8?

ethDreamer (Member, Author)

There are other scheduled tasks that take place at certain times in the epoch, so I just picked a time that wouldn't overlap with them.
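
For concreteness, here is a small illustrative calculation of that offset using mainnet parameters (12-second slots, 32 slots per epoch); this is not code from the PR:

    use std::time::Duration;

    fn main() {
        // Mainnet: 32 slots of 12 seconds = a 384-second epoch.
        let epoch_duration = Duration::from_secs(12) * 32;
        // 3/8 of the way through the epoch = 144 seconds, i.e. the start of
        // slot 12 of 32, picked simply to avoid other per-epoch scheduled tasks.
        let epoch_delay = (epoch_duration * 3) / 8;
        assert_eq!(epoch_delay, Duration::from_secs(144));
    }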

@pawanjay176 (Member) left a comment

Tested this out with geth and it works as expected.
LGTM.

@realbigsean (Member)

@mergify queue


mergify bot commented Apr 22, 2024

queue

🛑 The pull request has been removed from the queue default

The queue conditions cannot be satisfied due to failing checks.

You can take a look at Queue: Embarked in merge queue check runs for more details.

In case of a failure due to a flaky test, you should first retrigger the CI.
Then, re-embark the pull request into the merge queue by posting the comment
@mergifyio refresh on the pull request.

mergify bot added a commit that referenced this pull request Apr 22, 2024
@realbigsean (Member)

@mergify requeue


mergify bot commented Apr 22, 2024

requeue

✅ This pull request will be re-embarked automatically

The followup queue command will be automatically executed to re-embark the pull request


mergify bot commented Apr 22, 2024

queue

🛑 The pull request has been removed from the queue default

The queue conditions cannot be satisfied due to failing checks.

You can take a look at Queue: Embarked in merge queue check runs for more details.

In case of a failure due to a flaky test, you should first retrigger the CI.
Then, re-embark the pull request into the merge queue by posting the comment
@mergifyio refresh on the pull request.

mergify bot added a commit that referenced this pull request Apr 22, 2024
@realbigsean (Member)

@mergify requeue


mergify bot commented Apr 22, 2024

requeue

✅ This pull request will be re-embarked automatically

The followup queue command will be automatically executed to re-embark the pull request


mergify bot commented Apr 22, 2024

queue

🛑 The pull request has been removed from the queue default

The queue conditions cannot be satisfied due to failing checks.

You can take a look at Queue: Embarked in merge queue check runs for more details.

In case of a failure due to a flaky test, you should first retrigger the CI.
Then, re-embark the pull request into the merge queue by posting the comment
@mergifyio refresh on the pull request.

mergify bot added a commit that referenced this pull request Apr 22, 2024
@realbigsean added the ready-for-merge label and removed the ready-for-review label on Apr 22, 2024
# Conflicts:
#	beacon_node/beacon_chain/Cargo.toml
@jimmygchen (Member)

@mergify queue


mergify bot commented Apr 24, 2024

queue

✅ The pull request has been merged automatically

The pull request has been merged automatically at 4a48d7b

mergify bot added a commit that referenced this pull request Apr 24, 2024
mergify bot merged commit 4a48d7b into sigp:unstable on Apr 24, 2024
27 checks passed
Labels: ready-for-merge, v5.2.0 (Q2 2024)