[Core feature] Add option to the Databricks Agent to provide a default databricks instance #6253
Labels: enhancement (New feature or request)
Motivation: Why do you think this is important?
Today, users running Databricks tasks via the databricks webapi plugin can omit the databricks instance and token when registering their tasks, since the Flyte propeller deployment already contains this information by default through its config. This works well in multi-cluster scenarios where each propeller deployment points to a different Databricks instance, as users don't need to be concerned with platform-specific details.
On the other hand, the Databricks agent requires the user to provide the databricks instance (link):
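The user-facing effect can be sketched like this (illustrative only, not the actual agent code; the `databricks_instance` field name is borrowed from the flytekitplugins-spark task config, and `build_submit_url` is a hypothetical helper):

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class DatabricksTaskConfig:
    """Simplified stand-in for the Databricks task config."""
    databricks_conf: dict
    databricks_instance: Optional[str] = None  # the agent requires this today


def build_submit_url(config: DatabricksTaskConfig) -> str:
    # Without an agent-level default, the request cannot be built unless
    # the user supplied the instance explicitly in the task config.
    if not config.databricks_instance:
        raise ValueError("databricks_instance must be provided in the task config")
    return f"https://{config.databricks_instance}/api/2.1/jobs/runs/submit"
```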
This results in the following scenarios:
Furthermore, considering that databricks tokens are usually created per user/instance combination, and that the databricks token is already provided by the flyte agent, it makes sense to also provide the instance associated with it (unless overridden by workflow authors).
Goal: What should the final outcome look like, ideally?
Platform engineers should be able to provide a default databricks instance to the flyte agent. One easy implementation would be an environment variable provided through the agent chart (env variables on the agent deployment are already supported in the chart):
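For example (illustrative values-file fragment; the exact chart keys and env var name are assumptions):

```yaml
# values.yaml for the agent chart deployment
env:
  - name: DATABRICKS_INSTANCE
    value: "dbc-1234abcd-5678.cloud.databricks.com"
```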
which could then be read like so in the agent code:
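A minimal sketch of that read path, assuming the hypothetical `DATABRICKS_INSTANCE` env var from the chart example (this is not the actual agent implementation):

```python
import os
from typing import Optional

# Hypothetical env var name set on the agent deployment via the chart.
DEFAULT_INSTANCE_ENV = "DATABRICKS_INSTANCE"


def resolve_databricks_instance(task_instance: Optional[str]) -> str:
    """Prefer the instance from the task config; fall back to the agent-level default."""
    if task_instance:
        return task_instance
    default = os.getenv(DEFAULT_INSTANCE_ENV)
    if not default:
        raise ValueError(
            f"no databricks instance in the task config and {DEFAULT_INSTANCE_ENV} "
            "is not set on the agent deployment"
        )
    return default
```

The task-config value still wins, so workflow authors can override the platform default when needed.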
For users, this means that migrating from the databricks plugin to the databricks agent is as simple as updating the task-config import, given that the instance and token are set up by the platform engineers ahead of time.
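The import change could look like this (a sketch assuming the flytekitplugins-spark symbol names, where the agent-backed config is exposed as DatabricksV2; the exact names may differ):

```python
# Before: plugin-based task config (handled by the propeller webapi plugin)
from flytekitplugins.spark import Databricks

# After: agent-based task config
from flytekitplugins.spark import DatabricksV2 as Databricks
```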
Describe alternatives you've considered
N/A
Propose: Link/Inline OR Additional context
It would also be useful if this feature could be backported to flytekit 1.14, as that matches the version we're currently upgrading to.
Are you sure this issue hasn't been raised already?
Have you read the Code of Conduct?