
Switch to the authentication token-based runner creation workflow #23

Closed

Conversation

pinchartl
Contributor

Starting in GitLab v15.10 (and enabled by default in v16.0), the runner creation uses a workflow based on authentication tokens instead of registration tokens. This introduces a new runner property named system_id that needs to be passed to the /jobs/request API.

This changes the public API of the runner class, and will thus require a new release to flag the breakage. I haven't included that in this branch as I'm not sure how you would like it to be handled. Corresponding changes for lava-gitlab-runner are on their way.

Starting in GitLab v15.10 (and enabled by default in v16.0), the runner
creation uses a workflow based on authentication tokens instead of
registration tokens. This introduces a new runner property named
system_id that needs to be passed to the /jobs/request API.

Extend the Runner::new() and Runner::new_with_layout() functions with a
new system_id parameter, and plumb it through the implementation.
Additionally, update the documentation in README.md to explain the new
workflow.

Signed-off-by: Laurent Pinchart <[email protected]>
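
For illustration, a minimal sketch of the job request body that now has to carry the system_id alongside the runner token; the field names follow my reading of the /jobs/request endpoint, and the values are placeholders rather than anything from this PR:

```rust
// Hypothetical sketch of a /api/v4/jobs/request body including the new
// system_id property; values are placeholders.
use serde_json::json;

fn job_request_body(runner_token: &str, system_id: &str) -> serde_json::Value {
    json!({
        // Authentication token obtained when the runner is created in GitLab
        // (the new workflow), instead of a registration token.
        "token": runner_token,
        // Identifies the machine/instance this runner process runs on.
        "system_id": system_id,
    })
}
```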
Comment on lines +65 to +67
The system ID should be a unique string. GitLab doesn't currently require any
particular formatting, but it is recommended to follow the way the official
`gitlab-runner` creates system IDs:
Collaborator

The crate should probably be the one creating this system_id rather than requiring the user (either the end user or the runner implementation) to create it themselves.

In our typical setup we run these runners in Kubernetes, where it should really be unique per container instance.

I do wonder how/if GitLab garbage collects instances.

Contributor Author

I'm fine with creating it in the gitlab-runner-rs crate. That gets beyond my very limited Rust capabilities though; would you be able to give it a go? The logic from the official gitlab-runner can be found in https://gitlab.com/gitlab-org/gitlab-runner/-/blob/main/common/system_id_state.go?ref_type=heads#L89

According to https://docs.gitlab.com/ee/architecture/blueprints/runner_tokens/#ci_runner_machines-record-lifetime, entries in the ci_runner_machines table are automatically cleaned 7 days after the last contact from the respective runner. That page also explains how the system ID is stored in a .runner_system_id file, separate from the main runner configuration file.

Collaborator

Thanks for the pointers; I was just digging through the GitLab code and it's indeed implemented as mentioned (once an hour it cleans out all stale machines, i.e. those 7 days old or older). Also, as you mentioned, GitLab doesn't require a specific format, but it will check that the ID is less than 64 characters.

Collaborator

fwiw yeah happy to add the autogeneration; should be pretty simple
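
For reference, a rough sketch of what that autogeneration could look like, loosely following the gitlab-runner logic linked above: reuse an ID persisted in a `.runner_system_id` file if one exists, otherwise generate a random one. The `r_` prefix and 12-character suffix are assumptions based on that code rather than verified constants; as noted above, GitLab itself only checks that the ID stays under 64 characters.

```rust
use std::{fs, path::Path};

use rand::Rng;

// Load the persisted system ID, or generate and persist a new one.
fn load_or_create_system_id(state_file: &Path) -> std::io::Result<String> {
    if let Ok(existing) = fs::read_to_string(state_file) {
        let existing = existing.trim().to_string();
        if !existing.is_empty() {
            return Ok(existing);
        }
    }

    // Random lowercase alphanumeric suffix; "r_" marks a randomly generated ID
    // (as opposed to one derived from the machine ID).
    const CHARS: &[u8] = b"abcdefghijklmnopqrstuvwxyz0123456789";
    let mut rng = rand::thread_rng();
    let suffix: String = (0..12)
        .map(|_| CHARS[rng.gen_range(0..CHARS.len())] as char)
        .collect();
    let id = format!("r_{suffix}");

    fs::write(state_file, &id)?;
    Ok(id)
}
```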

Collaborator

@pinchartl I did the generation in PR #24. As it now happens behind the scenes, most of your changes were dropped, but I did pick up your documentation change as a separate patch with you as the author :)

@sjoerdsimons
Collaborator

As discussed, closing in favour of #24.
