April 30, 2021

How to Implement Automic Automation Processing of Real-Time Serverless Events

by: Tobias Stanzel

Before You Start

The content you will require is held in a GitHub repository; you can access it here.

It contains:

  1. Implementation Guide (this document)
  2. An Automic extract of the Automation Engine objects you will require
  3. Source for the Lambda function itself

To create this in your own environment you will need:

  • S3: a bucket that you will use to upload files
  • KMS: an encryption key, with the Lambda function role assigned as a key user
  • Lambda: the deployed and configured Automic REST API function
  • Automic Automation v12.3 with its REST interface accessible from the internet
Required Steps
  • Prepare Automic to receive the event via REST
  • Create/Choose an S3 Bucket
  • Create the Lambda function based on the template and set it up
    • Make sure the password is encrypted
  • It's Ready

Below, I detail the individual actions you need to take.


Serverless App: Automic Automation REST API

To make it easy for you to use this integration, I published a small function to AWS Lambda that can easily be configured to forward any AWS event received in Lambda as a JSON payload to the Automic Automation REST API.
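For orientation, here is a minimal Python sketch of what such a function does: it wraps the incoming AWS event as PromptSet input and POSTs it to the AE REST API executions endpoint. This is not the published function's actual source; the environment variable names and the exact shape of the request body are assumptions you should check against your AE installation.

```python
import json
import os
import urllib.request

def build_execution_request(event, workflow="JOBP.AWS_LAMBDA.S3"):
    """Wrap the raw AWS event as PromptSet input for the Automic workflow."""
    return {
        "object_name": workflow,
        # Key format for PromptSet inputs may differ per AE version; adjust as needed
        "inputs": {"&EVENT_JSON#": json.dumps(event)},
    }

def lambda_handler(event, context):
    # Environment variables configured on the Lambda function (names assumed)
    base_url = os.environ["AE_URL"]    # e.g. https://automic.example.com/ae/api/v1
    client = os.environ["AE_CLIENT"]   # AE client number, e.g. "100"
    user = os.environ["AE_USER"]       # typically "USER/DEPARTMENT"
    password = os.environ["AE_PASSWORD"]

    body = json.dumps(build_execution_request(event)).encode("utf-8")
    req = urllib.request.Request(
        f"{base_url}/{client}/executions",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    # Basic auth against the AE REST interface
    mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
    mgr.add_password(None, base_url, user, password)
    opener = urllib.request.build_opener(urllib.request.HTTPBasicAuthHandler(mgr))
    with opener.open(req) as resp:
        return json.loads(resp.read())
```

The workflow name and the `&EVENT_JSON#` input match the AE objects described below, so the PromptSet receives the full event payload untouched.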


Automic Automation Engine: Objects to process S3 event

Let’s start by creating the required objects to receive the S3 JSON event and write it to a static VARA object for further processing in the Automation Engine.

You can use the provided export (link!), which contains these objects:

Table of Automic objects

The workflow JOBP.AWS_LAMBDA.S3 is very simple: it uses the PromptSet PRPT.AWS_LAMBDA, which has only one parameter, &EVENT_JSON#, to receive the JSON payload,

Automic workflow screenshot

and one script, SCRI.ADD_FILE_TO_PROCESS, that parses the JSON into AE variables,

Automic job source code

and writes the file to the static VARA object VARA.S3.FILES_TO_PROCESS for further processing.

The test_payload.json file contains a JSON payload for an S3 PUT event on a bucket; it can be used to test the workflow.
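As a rough illustration (the actual test_payload.json is in the repository), an S3 PUT event carries the bucket name and object key in a nested Records structure. Trimmed to the fields the AE script cares about, it looks roughly like this, along with the kind of extraction the script performs:

```python
# A trimmed S3 "ObjectCreated:Put" event, shaped like the AWS sample payloads
test_payload = {
    "Records": [{
        "eventSource": "aws:s3",
        "eventName": "ObjectCreated:Put",
        "s3": {
            "bucket": {"name": "automic-aws-lambda-example"},
            "object": {"key": "incoming/report.csv", "size": 1024},
        },
    }]
}

def extract_file(event):
    """Pull out the fields the Automic script writes to the VARA object."""
    record = event["Records"][0]["s3"]
    return record["bucket"]["name"], record["object"]["key"]

bucket, key = extract_file(test_payload)
print(bucket, key)  # automic-aws-lambda-example incoming/report.csv
```

Real events contain many more fields (region, timestamps, requester details), but bucket and key are what end up in VARA.S3.FILES_TO_PROCESS.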


AWS S3 bucket

Choose a region where you want to create your example; here I am using EU Frankfurt (eu-central-1). Make sure you create all required resources in this region.

First, we need to create the S3 bucket that we will use for this demo.
I called mine “automic-aws-lambda-example”. If you choose another name, make sure to also update it in the Lambda function we will create.
AWS S3 Screenshot


AWS Lambda Function and Encryption Key

Next, we will create the AWS Lambda function. Choose “Browse serverless app repository” and search for “automic”.

AWS Lambda function screenshot

Select automic-rest-api and click “Deploy” to create the function.

To use encryption for the password field, we need to create (or reuse) an encryption key in the Key Management Service (KMS).

AWS Key Management Service

Choose “Symmetric” and click “Next”.

AWS Key Management Service labels

AWS Key Management Service - Define key administrative permissions

AWS Key Management Service - Define key usage permissions

Now we need to choose administrative and usage permissions.
AWS created a dedicated role for our Lambda function, so we need to assign it here in steps 3 and 4. The name of the role will start with “serverlessrepo-automic-rest-api-”.

AWS Key Management Service - Review and edit key policy

Create the key by clicking “Finish”. You should see something like this:

AWS Key Management Service - Customer managed keys

You can go back to your AWS Lambda function now and navigate to “Latest Configuration -> Environment” and Edit the environment variables.

AWS Lambda Functions - Edit environment variables

Choose “Enable helpers for encryption in transit”, enter the password, and, on the password field only, press “Encrypt” and select the key we just created.

AWS Lambda Functions - Encryption in transit

Now the password is encrypted and no longer stored in plain text.
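Inside the Lambda function, the encrypted variable has to be decrypted at runtime before it can be used against the AE REST interface. A minimal sketch of the usual pattern, assuming the variable is named AE_PASSWORD and relying on the encryption context the Lambda console helper applies:

```python
import os
from base64 import b64decode

def decrypt_password():
    """Decrypt a KMS-encrypted AE_PASSWORD environment variable (sketch)."""
    import boto3  # available in the AWS Lambda runtime

    ciphertext = b64decode(os.environ["AE_PASSWORD"])
    plaintext = boto3.client("kms").decrypt(
        CiphertextBlob=ciphertext,
        # The console's "encryption in transit" helper adds the function name
        # as encryption context, so decryption must supply it too
        EncryptionContext={
            "LambdaFunctionName": os.environ["AWS_LAMBDA_FUNCTION_NAME"]
        },
    )["Plaintext"]
    return plaintext.decode("utf-8")
```

Because the Lambda role was added as a key user when we created the KMS key, this decrypt call is permitted without any further policy changes.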

AWS Lambda Functions - Edit environment variables

Save the Environment variables.
Now we only need to configure the trigger for this function.
Navigate to “Latest configuration -> Trigger” and click “Add trigger”

AWS Lambda Functions - Add trigger

Choose “S3”

AWS Lambda Add trigger

Now select the S3 bucket we created before. I did not restrict the event definition further, but you can do so according to your use case.

AWS Lambda - Trigger configuration

AWS Lambda - Recursive invocation

A good hint by AWS ;) Don’t write to the same S3 bucket, or you will end up in recursion hell and consume a lot of compute resources!

As soon as the trigger is active, a REST call will be made to the Automic REST interface for every file that is uploaded to the S3 bucket.

Uploaded file to S3

AWS Lambda Functions - Object overview

The workflow triggered in the Automic Automation Engine

Automic Automation workflow that is triggered

You can see the mapped properties from the S3 event in the workflow!