We use dbx to deploy and launch Databricks jobs. We are currently on the 11.3 LTS runtime and are planning to migrate to 13.3 LTS. The runtime version is configured in two places: in dbx's deployment.yml for every job, and in the cluster policy. To simplify management, we would like to remove the spark_version parameter from deployment.yml and have the cluster policy enforce it instead. However, we are hitting the errors below.
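For reference, the relevant part of our deployment.yml looks roughly like this (a simplified sketch; the workflow name, node type, and policy ID are placeholders, but the structure matches the validation paths in the errors below):

```yaml
# deployment.yml (simplified sketch; names and IDs are placeholders)
environments:
  default:
    workflows:
      - name: "core_deployment"
        job_clusters:
          - job_cluster_key: "main"
            new_cluster:
              spark_version: "11.3.x-scala2.12"  # the field we want to drop
              node_type_id: "i3.xlarge"
              num_workers: 2
              policy_id: "ABC123"                # policy that also pins the runtime
```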
Databricks Runtime version: 11.3 LTS
Expected Behavior
With spark_version removed from deployment.yml, the deployment should succeed and the jobs should run on the runtime enforced by the cluster policy (13.3 LTS).
Current Behavior
Scenario 1: deployment.yml spark_version = 11.x and policy spark_version = 13.x. The job failed with the following error:
{ 'error_code': 'INVALID_PARAMETER_VALUE',
'message': 'Cluster validation error: Validation failed for spark_version, '
'must be 13.3.x-scala2.12 (is an element in '
'"List(11.3.x-aarch64-scala2.12, 11.3.x-scala2.12)")'}
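On the policy side, we presumably have the runtime pinned with a fixed policy element along these lines (a sketch; the exact policy definition may differ):

```json
{
  "spark_version": {
    "type": "fixed",
    "value": "13.3.x-scala2.12"
  }
}
```

With a `fixed` element, any job that explicitly submits a different spark_version (11.3.x here) is rejected at launch by policy validation, which is consistent with the error above.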
Scenario 2: spark_version removed from deployment.yml and policy spark_version = 13.x. The deployment failed with the following error:
ValidationError: 2 validation errors for Deployment
workflows -> 0 -> Workflow -> job_clusters -> 0 -> new_cluster -> spark_version
field required (type=value_error.missing)
workflows -> 1 -> Workflow -> job_clusters -> 0 -> new_cluster -> spark_version
field required (type=value_error.missing)
ERROR during core_deployment workflow deployment (1)!
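This ValidationError suggests dbx's own deployment model treats new_cluster.spark_version as a required field, regardless of what the policy enforces. One possible workaround sketch, assuming dbx resolves standard YAML anchors (it reads the file as plain YAML, and a top-level `custom` block is a common place to define them), is to keep the field but define the version once:

```yaml
# Workaround sketch: satisfy dbx's required spark_version field while
# defining the version in a single place via a YAML anchor.
# The workflow name and policy ID are placeholders.
custom:
  runtime: &spark_version "13.3.x-scala2.12"

environments:
  default:
    workflows:
      - name: "core_deployment"
        job_clusters:
          - job_cluster_key: "main"
            new_cluster:
              spark_version: *spark_version
              policy_id: "ABC123"
```

This keeps a single source of truth inside deployment.yml, though it still duplicates the value the cluster policy already pins.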
Context
We want to manage the runtime version in a single place (the cluster policy) rather than per job in deployment.yml, so that runtime migrations do not require editing every workflow definition.