    Video

    Automic Automation Cloud Integrations: Google Cloud Batch Agent Integration

    Broadcom's Google Cloud Batch Automation Agent enables you to easily schedule, queue, and execute batch processing workloads on Google Cloud resources. This video explains the Automic Automation Google Cloud Batch agent integration and its benefits. It presents its components and demonstrates how to install, configure, and use it.


    Video Transcript

    Welcome to this video on the Automic Automation Google Cloud Batch integration solution. In this video, we will explain the Google Cloud Batch integration and what it brings to the Automic Automation user community. Google Cloud Batch is a fully managed service that lets you schedule, queue, and execute batch processing workloads on Google Cloud resources. For example, consider using Batch for high performance computing (HPC), machine learning (ML), and data processing workloads. Batch provisions resources and manages capacity on your behalf, allowing your batch workloads to run at scale.

    Integrating Automic Automation with Google Cloud Batch allows you to run Google Cloud Batch create jobs in your workspace from Automic Automation. We'll provide some technical insights so that the integration components are clearly identified and the deployment sequence is understood. We'll focus on the configuration of the agent and the design of the two core object templates: connections and jobs. Finally, we'll run through a demo.

    Automic Automation plays a central role in orchestrating operations across multiple environments, including the cloud. Automic Automation synchronizes these processes with other non-cloud operations. By integrating Google Cloud Batch, we can configure process automation centrally in Automic Automation and then trigger, monitor, and supervise everything in one place. Google Cloud Batch processes can then be synchronized with all other environments routinely supported by Automic Automation. Google Cloud Batch's role is reduced to executing the jobs. All other functions, especially those concerning automation, are delegated to Automic Automation. This means that we don't have to log in to the Google Cloud Batch environment and keep refreshing it ourselves. Automic Automation handles all the execution and monitoring aspects. The Automic Automation integration provides a simplified view to run jobs in Google Cloud Batch. Automic Automation lets us build configurations with intuitive interfaces like drag-and-drop workflows, and supervise processes in simple dashboards designed natively for operations. Statuses are color-coded, and retrieving logs takes a single right-click.

    From an operations perspective, Automic Automation greatly simplifies the configuration and orchestration of Google Cloud Batch jobs. Externalizing operations to a tool with a high degree of third-party integration means we can synchronize cloud and non-cloud workloads using various agents and job object types. We can build sophisticated configurations involving multiple applications, database packages, system processes like backups and data consolidation, file transfers, web services, and other on-premises workloads.

    A conventional architecture involves two systems: the Automic Automation host and a dedicated system for the agent. The agent is configured with a simple INI file containing standard values: system and agent name, connection, and TLS. When we start the agent, it connects to the engine and adds two new object templates to the repository: a connection object to store the Google Cloud Batch endpoint and login data, and a job template designed to trigger Google Cloud Batch jobs. Let's assume we're automating for four instances of Google Cloud Batch. We create a connection object in Automic Automation for each instance by duplicating the CONN template. Lastly, we create a Google Cloud Batch job in Automic Automation for each corresponding process in Google Cloud Batch. Each Automic Automation job references the connection object for its target system. When we execute the jobs in Automic Automation, it triggers the corresponding process in Google Cloud Batch. We're able to retrieve the successive statuses, supervise the child processes in the cloud, and finally generate a job report. In Automic Automation, these jobs can be incorporated into workflows and integrated with other non-cloud processes.
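    As a sketch, the agent INI described above might look like the following. The exact section and key names vary by agent version, and all values here are illustrative placeholders; always start from the sample INI delivered in the integration package:

    ```ini
    [GLOBAL]
    ; Agent name as it will appear in the Automic system (placeholder)
    name=GCPBATCH01
    ; Name of the Automation Engine (AE) system (placeholder)
    system=AUTOMIC

    [TCP/IP]
    ; Host name and TLS port of the Automation Engine's JCP (placeholder)
    connection=ae-host.example.com:8443

    [AUTHORIZATION]
    ; Directory containing the trusted TLS certificate(s) (placeholder)
    trustedCertFolder=./certs
    ```

    These correspond to the four values the video calls out: agent name, AE system, JCP connection, and TLS certificate location.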

    The procedure to deploy the Google Cloud Batch integration is as follows: First, we download the integration package from the marketplace. This package contains all the necessary elements. We unzip the package, which produces a directory containing the agent, the INI configuration files, and several other items like the start command. We use the appropriate INI file for our specific platform. The Google Cloud Batch agent is a standard Automic agent. It requires at least four values to be updated: agent name, Automic system, JCP connection and TLS port, and finally the TLS certificate. When the agent is configured, we start it. New object templates are deployed. We create a connection to every Google Cloud Batch instance we need to support. For this, we use the CONN template object, which we duplicate as many times as needed. The CONN object references the Google Cloud Batch endpoint. Finally, we use the Google Cloud Batch template job to create the jobs we need. We match these Automic Automation jobs to the Google Cloud Batch jobs, reference the connection object, and run them. We're able to supervise the jobs and their children, generate logs, and retrieve the statuses. The jobs can then be incorporated into non-cloud workflows.

    We install, configure, and start an agent to deploy the Google Cloud Batch integration. The agent is included in the Google Cloud Batch package, which we download from the marketplace. We unzip the package, which creates a directory structure agents/batch/bin containing the agent files. Based on the platform, we rename the agent configuration file UCXJCITX and set a minimum of four values: the agent name, the AE system name, the host name and port connection to the Automation Engine's JCP, and finally the directory containing the TLS certificate. Finally, we start the agent by invoking the JAR file via the java command. The agent connects to the AE and deploys the object templates needed to support the integration: the CONN (connection) object and the Google Cloud Batch create job.

    In our demo, we will create a connection object. Once we have established the connection to the Google Cloud Batch environment, we'll create a Google Cloud Batch job. Finally, we'll execute and supervise the job. Let's examine the Google Cloud Batch environment before we look into the Automic system. Here you see the Google Cloud console, which provides an overview of the jobs we have already created. We are using the specific DO01 autoz project for the demo. All jobs that run through Automic will appear in the console under this project ID, regardless of whether they succeed or fail.

    Let's move on to the Automic system. Here we create connection and job objects with specific inputs that allow us to connect to Google Cloud Batch. If we open a connection object, we must enter the most important field, the endpoint, which is the URL of the Google Cloud Batch environment. Next, we specify an authentication type from the drop-down menu. Two types are supported: VM instance metadata, which requires a service account email for authentication (for example, test@gmail.com), and service account key, which we use for this demo. With service account key, we must provide the input type for the service account: either JSON, which requires us to enter the JSON payload definition, or file path, where we enter the path to the JSON file that contains the authentication information. We select JSON here and directly paste the JSON content from the GCP console. If you are using a proxy in your environment, you can specify the proxy host name, port, username, and password in the proxy section.
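    For reference, a service account key downloaded from the GCP console follows Google Cloud's standard JSON layout. The values below are placeholders, not working credentials:

    ```json
    {
      "type": "service_account",
      "project_id": "my-demo-project",
      "private_key_id": "<key-id>",
      "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
      "client_email": "batch-runner@my-demo-project.iam.gserviceaccount.com",
      "client_id": "<client-id>",
      "token_uri": "https://oauth2.googleapis.com/token"
    }
    ```

    Whether you paste this payload directly into the connection object or reference it via a file path, the file must be readable by the agent, and it should be protected like any other credential.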

    Once the connection object is defined and saved, you can create a Google Cloud Batch create job. Start by opening the Attributes page and make sure you select the agent that corresponds to the agent name you configured in the INI file. Now go back to the create job page and, in the Connection field, select the connection object we have just created to connect to the GCP Batch instance. It holds the necessary information to connect to the Google Cloud Batch environment. Following the connection input, the job requires the project ID, which is the ID of the Google Cloud project. We use the one we have already seen in the GCP Batch console. Optionally, we can define the location, which is the region where our Google Cloud Batch resources are hosted.

    The next field is Task Details, where you choose between two input methods. The JSON file path field defines the path to a JSON file containing the task attributes we want to pass to the application; make sure this file is accessible from the agent machine. The JSON field, on the other hand, allows us to enter the JSON payload definition directly, pasted from the GCP console or elsewhere into the text field. For this demo, we use a JSON payload. Once all the parameters are set, save and execute the job.
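    As an illustration of what such a payload might contain, here is a minimal Google Cloud Batch job definition following the Batch API's Job resource layout. The script, resources, and machine type are placeholders for the demo:

    ```json
    {
      "taskGroups": [
        {
          "taskSpec": {
            "runnables": [
              { "script": { "text": "echo Hello from Batch" } }
            ],
            "computeResource": { "cpuMilli": 1000, "memoryMib": 512 }
          },
          "taskCount": 1
        }
      ],
      "allocationPolicy": {
        "instances": [
          { "policy": { "machineType": "e2-standard-2" } }
        ]
      },
      "logsPolicy": { "destination": "CLOUD_LOGGING" }
    }
    ```

    A payload in this shape is what you would copy from the GCP console's "equivalent REST" view; the same content works whether it is pasted into the JSON field or saved to a file referenced by the JSON file path field.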

    Let's switch to the Executions view. The details pane shows the remote status field, which tracks the job status in the target environment. In our case, the execution was successful. The details pane also shows an object variable whose value uniquely identifies this specific job execution, which we can use for further analysis.

    Let's have a look at the reports. The first report contains all the information that the target system sends to Automic Automation when the job has been executed. This information is presented as structured output, including job status, execution details, and results in JSON format. In other words, the information is presented in a way that is easy to break down and analyze for further use within Automic Automation. Let's check the job status on the Google Cloud Batch instance. If you go to the job list, you'll see the job that we triggered, and it has ended with the status Succeeded.

    The next log type is the PLOG (agent log), which shows all the agent's actions step by step. It lists all the parameters used to run the job and contains the response received from the target system. In our case, once the status reached Succeeded, Automic Automation concluded monitoring and marked the job as successfully completed. That wraps up the demo on how Automic Automation can integrate with Google Cloud Batch to run, execute, and monitor jobs. Thank you for watching this video.



    Note: This transcript was generated with the assistance of an artificial intelligence language model. While we strive for accuracy and quality, please note that the transcription may not be entirely error-free. We recommend independently verifying the content and consulting with a product expert for specific advice or information. We do not assume any responsibility or liability for the use or interpretation of this content.

    Want more Automic content?

    Visit the Automic homepage, where you'll find blogs, videos, courses, and more!