I couldn't find specific information in the Cosmos documentation about handling task dependencies so that a job is triggered only when the previous one fails. However, in Apache Airflow itself you can achieve this with the TriggerRule class: set a task's trigger_rule parameter to TriggerRule.ONE_FAILED so that it runs only when at least one of its upstream tasks has failed.
Here's a basic example of how you might set this up in an Airflow DAG:
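A minimal sketch, assuming Airflow 2.x; the DAG id, schedule, and bash command are illustrative, and task_1 deliberately fails to demonstrate the trigger rule:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.utils.trigger_rule import TriggerRule

with DAG(
    dag_id="failure_triggered_example",  # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    # Simulated failure: "exit 1" makes this task fail every run.
    task_1 = BashOperator(task_id="task_1", bash_command="exit 1")

    # ONE_FAILED: run as soon as at least one upstream task has failed.
    task_2 = BashOperator(
        task_id="task_2",
        bash_command="echo 'running recovery step'",
        trigger_rule=TriggerRule.ONE_FAILED,
    )

    task_1 >> task_2
```

Note that with ONE_FAILED, task_2 is skipped when task_1 succeeds; if you need a task that runs in both cases, look at ALL_DONE instead.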
In this example, task_2 will only execute if task_1 fails. You can apply similar logic to your dbt tasks within the Cosmos integration by setting the trigger_rule for the relevant tasks.
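Applied to Cosmos, one possible shape is the following sketch. It assumes astronomer-cosmos's DbtTaskGroup, ProjectConfig, and ProfileConfig, and that the operator_args dict is forwarded as keyword arguments (including trigger_rule) to each generated dbt operator; all paths and names are placeholders:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.utils.trigger_rule import TriggerRule
from cosmos import DbtTaskGroup, ProfileConfig, ProjectConfig

with DAG(
    dag_id="dbt_on_failure_example",  # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    # Upstream batch job whose failure should trigger the dbt run.
    batch_job = BashOperator(task_id="batch_job", bash_command="exit 1")

    dbt_recovery = DbtTaskGroup(
        group_id="dbt_recovery",
        project_config=ProjectConfig("/path/to/dbt/project"),  # placeholder
        profile_config=ProfileConfig(
            profile_name="my_profile",  # placeholder
            target_name="dev",  # placeholder
            profiles_yml_filepath="/path/to/profiles.yml",  # placeholder
        ),
        # Assumption: operator_args is passed through to the generated
        # operators, and trigger_rule is a standard BaseOperator argument.
        operator_args={"trigger_rule": TriggerRule.ONE_FAILED},
    )

    batch_job >> dbt_recovery
```

Setting the rule via operator_args applies it to every task Cosmos generates in the group; check the Cosmos docs for your version to confirm which arguments are forwarded.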
Hi,
I want code for executing a batch job that is triggered when the previous job fails, in the dbt Cosmos and Airflow integration.