The Automic Automation integration with Apache Airflow enables users to include Airflow DAG executions in their enterprise business processes.
Integration with domain-specific automation tools such as Airflow provides end-to-end visibility and control across traditional on-premises workloads and modern born-in-the-cloud workloads. In this short demo, you will see how Automic Automation supports Google Cloud Composer - the Google Cloud managed service for Apache Airflow - to execute, monitor, and audit Airflow jobs.
First, a quick look at our Google Cloud Composer environment showing the list of available DAGs, or workflows. In the Automic web interface you can add an object, search for Google Cloud Composer, and select the "Run_DAG Job."
Now we enter the location and the name of our environment. At this point I could type a DAG ID directly, but if I want to browse instead, I first have to assign this object to the appropriate Google Cloud Composer agent that we have running by selecting the Google Cloud Composer agent - agent TCC.
Going back, I can now browse this list, which is exactly the same list we saw earlier in Airflow itself. Let's choose this example DAG, then save and execute.
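Under the hood, triggering a DAG run in a Composer environment ultimately goes through Airflow's stable REST API. As a rough sketch of what such a call looks like - the `/api/v1/dags/{dag_id}/dagRuns` route is Airflow's documented trigger endpoint, but the base URL, DAG ID, and token below are placeholders, and this is not a description of Automic's actual implementation:

```python
import json
from urllib import request


def build_trigger_request(base_url, dag_id, run_id=None, token=""):
    """Build a POST request that triggers a DAG run via Airflow's
    stable REST API (POST /api/v1/dags/{dag_id}/dagRuns)."""
    body = {}
    if run_id:
        # Optionally pin the run ID; otherwise Airflow generates one.
        body["dag_run_id"] = run_id
    return request.Request(
        url=f"{base_url}/api/v1/dags/{dag_id}/dagRuns",
        data=json.dumps(body).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",  # placeholder auth scheme
        },
        method="POST",
    )


# Example: build (but do not send) a trigger request for a sample DAG.
req = build_trigger_request(
    "https://example-airflow-web", "example_dag", run_id="demo_run"
)
```

Sending the request with `urllib.request.urlopen(req)` against a real Composer environment would also require a valid identity token for the Airflow web server, which is environment-specific and omitted here.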
We have now switched to the Process Monitoring view. As you can see, the job is active and running. In the Airflow UI we can see the same run; note that "Automic" is prepended to the run ID. Once completed, all logs for this DAG run are available from within the Automic web interface.
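That prefix makes runs launched from Automic easy to distinguish in the Airflow UI from scheduled or manually triggered runs. A hypothetical sketch of such a naming convention - the exact run-ID format Automic generates is not shown in the demo:

```python
from datetime import datetime, timezone


def automic_style_run_id(ts=None):
    # Hypothetical: prepend "Automic" to a timestamped run ID, mirroring
    # the prefixed run IDs visible in the Airflow UI during the demo.
    ts = ts or datetime.now(timezone.utc)
    return f"Automic_{ts.strftime('%Y-%m-%dT%H%M%S')}"
```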
In summary, Automic Automation provides a single pane of glass, enabling visibility and control for end-to-end business processes that include workloads running on Airflow.