
Guest blog by James Wakefield, EPM Practice Lead, and Carrick Mortimer, Servian Pty Ltd.

When it comes to data integration for your Enterprise Performance Management (EPM) solution, it's important to ensure you have: (1) trust in the process of sourcing data, which means reliable and secure integration to source systems; and (2) automation of data flows that is flexible enough for business users to maintain without being technical gurus. Most modern ETL (Extract, Transform, Load) tools can automate data integration, but they tend to be highly technical solutions that business users would find challenging to navigate.

Over the last 20 years, many organisations have tried to come up with a simple way for a business user to toggle certain data loads on and off at certain days and times of the month. Invariably, it ends up being some form of text file containing the configuration for an ETL tool to use. A configuration text file controlling batch automation is like a Swiss army knife: very flexible, but it comes with risks, including no auditability, data validation, or workflow. The irony is that if you invest in an EPM solution to replace Excel, it is inadvisable to use a string-and-baling-wire approach for what should be robust integration.

Recently, the Servian team worked with a forward-thinking client who had a requirement to manage and control an automation “Runbook” that lived within Anaplan. This Runbook could be used to control the importation of data from a variety of sources, including Snowflake, text files, SQL Server, files in AWS S3 buckets, and Microsoft Dynamics CRM, and tie these data sources to a particular combination of Anaplan Workspace, Model, Process, Import Action, and Import Data Source. To support this requirement, an Anaplan Module would house the Runbook configuration data, which would in turn drive and coordinate the load of data into the respective systems and execute the respective Anaplan Actions and Processes.

Housing the Runbook in an Anaplan Module naturally provides data validation, auditability, ease of use, and security. This solves half the puzzle; to solve the other half and execute the required integration and automation operations, a technology known as OneCloud is up to the task.

In OneCloud, a set of reusable OneCloud Chains can be deployed that are intuitive, business friendly, and code-free. These Chains execute the configuration instructions housed in the Anaplan Runbook. In addition to automating the tasks, OneCloud has pre-built drag & drop connectors for a range of technologies, including systems of record, performance management, and business intelligence, that co-exist on-premises and in the cloud. Given that OneCloud is software as a service, it is also easy to get set up fast and achieve quick ROI.

The How to Guide

Designing the Runbook

The following outlines an example Runbook pattern built to showcase in this article. The Runbook is managed in Anaplan, then exported by the OneCloud Chain and subsequently used as the configuration for all downstream data integration.

First, we build an Anaplan Module to house the Runbook configuration.


For the Runbook, we create a Module with two dimensions, an Index dimension (or numbered list in Anaplan) and Measures. The measures used are just an example, and the beauty of this design pattern is how extensible it is to your specific data integration needs.

Note that some of the line items are set to a specific list format; for example, the “Data_Integration_via” line item is set to the Data_Integration_via list format, as we have built a list containing the applicable options, which acts as data validation. This is especially useful for “Anaplan_Workspace_ID” and “Anaplan_Model_ID”, since we can force the user to input only valid IDs, and a lookup formula will then display the name for validation.


Each line item (column) represents a value that we want to use to parametrise commands in OneCloud. 

Here is an overview of their use:

    • An open text field that provides a description to the user of what the row does; it is also used as the name of the numbered list item.

    • A drop-down (the “Data_Integration_via” line item) identifying the integration type, so OneCloud knows the corresponding Chain logic to run.

    • A text field to house the SQL statement for the source system.

    • A drop-down (“Anaplan_Workspace_ID”) so the user can only choose a valid Workspace ID.

    • A lookup of the friendly Workspace name.

    • A drop-down (“Anaplan_Model_ID”) so the user can only choose a valid Model ID.

    • A lookup of the friendly Model name.

    • “Add Row Numbers?”, an optional ability to give a unique row number to each row in the data source, if your source system has no ID.

    • The name of the file you want the result to be in Anaplan.

    • The number of header records in the data file (typically one).

    • An Anaplan-specific chunk-size setting so large data loads are more efficient.

    • The name of the Anaplan Process to execute after the data has been uploaded.

    • A number used to sort the rows into the order in which they should run.

    • Email notification of users on success or errors.

    • The most important: a Boolean so a user can toggle whether that row is included in the next run.


The Runbook has been set up to cover:

    • Loading of Snowflake tables to Anaplan

    • Loading of Salesforce object tables to Anaplan

    • Execution of Anaplan Processes to load the data to lists and modules 

You could configure the Runbook to use any OneCloud Bizapp (datasource) found here:
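To make the design concrete, here is a hypothetical Runbook in Python along with the selection logic the Chains rely on. The measure names here are illustrative stand-ins for the line items above, not the client's actual configuration.

```python
# Hypothetical Runbook rows, modelled on the line items described above
# (measure names are illustrative).
runbook = [
    {"Description": "Load product dim", "Data_Integration_via": "Snowflake",
     "Run_Order": 2, "Include_In_Run": True},
    {"Description": "Load opportunities", "Data_Integration_via": "Salesforce",
     "Run_Order": 1, "Include_In_Run": True},
    {"Description": "Legacy load", "Data_Integration_via": "Snowflake",
     "Run_Order": 3, "Include_In_Run": False},
]

def rows_to_run(rows: list) -> list:
    """Keep only rows toggled on, ordered by the user-supplied run order."""
    return sorted(
        (r for r in rows if r["Include_In_Run"]),
        key=lambda r: r["Run_Order"],
    )

plan = rows_to_run(runbook)
```

The Boolean toggle and the sort-order line item are all a business user needs to touch to change what runs next, and in what sequence.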

Designing the OneCloud Chains

Within a OneCloud Workspace, the following four Chains were created to execute the Runbook. OneCloud Chains can call other OneCloud Chains and pass variables/files between them. In turn, this makes it easy to create reusable Chains.


Chains can be scheduled, executed manually by a user, or even executed via a REST API from other enterprise schedulers that may be deployed in the organization, such as Control-M or AutoSys.

To keep this example as simple as possible, there is a Master chain “_Runbook”.


This Chain runs an Anaplan Export Action to get the Runbook configuration values.
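For illustration, kicking off such an export programmatically looks roughly like the sketch below. The base URL and endpoint layout are assumed from Anaplan's Integration API v2, the IDs are placeholders, and authentication is omitted.

```python
# Sketch only: endpoint layout assumed from Anaplan's Integration API v2;
# the IDs used below are placeholders, and authentication is omitted.
BASE = "https://api.anaplan.com/2/0"

def export_task_url(workspace_id: str, model_id: str, export_id: str) -> str:
    """Build the URL a Chain would POST to in order to start an export task."""
    return (f"{BASE}/workspaces/{workspace_id}/models/{model_id}"
            f"/exports/{export_id}/tasks")

url = export_task_url("WS123", "MD456", "116000000001")
```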


Note that the variables applied here are specific to this Chain, i.e. the Anaplan Workspace and Model IDs are set at the Chain level so you only have to update these values in one place.

When the Chain is run, this is the type of result you would see. Note that the 12 rows exported are the direct output of the Anaplan Module.


In order for the Runbook to execute the correct OneCloud Commands, we have to filter the Runbook export based upon the “Data_Integration_via” column. To support this, OneCloud has a Command that allows the execution of a SQL statement against a text file. In this way, we can filter the Runbook text file and only return the rows where Data_Integration_via = “Snowflake”.  In this example, we leverage the Output of the Anaplan Command and apply the filter in a subsequent Command.
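OneCloud handles this filtering natively, but the idea can be approximated in a few lines of Python: load the exported text file into an in-memory SQLite table, then filter it with plain SQL. The column names mirror the Runbook line items, and the rows are illustrative.

```python
import csv
import io
import sqlite3

# A miniature stand-in for the exported Runbook file (rows are illustrative).
runbook_csv = """Description,Data_Integration_via
Load product dim,Snowflake
Load opportunities,Salesforce
Load sales fact,Snowflake
"""

def filter_runbook(csv_text: str, integration: str) -> list:
    """Load the CSV into an in-memory table, then filter it with plain SQL."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE runbook (Description TEXT, Data_Integration_via TEXT)"
    )
    conn.executemany(
        "INSERT INTO runbook VALUES (?, ?)",
        [(r["Description"], r["Data_Integration_via"]) for r in rows],
    )
    cur = conn.execute(
        "SELECT Description FROM runbook "
        "WHERE Data_Integration_via = ? ORDER BY rowid",
        (integration,),
    )
    return [row[0] for row in cur.fetchall()]

snowflake_rows = filter_runbook(runbook_csv, "Snowflake")
```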


Here is the result of the command when run. Note we now have 6 rows.


Once we have the filtered rows, we can then call a subsequent OneCloud Chain that will specifically run commands for loading Snowflake to Anaplan. In the example below, we are taking the output of the “Get Snowflake” command and passing it to the chain “Run Snowflake to Anaplan”.


The “Run Snowflake to Anaplan” chain has the following Commands. Note that any source system that accepts SQL will follow a similar pattern.


The Runtime Inputs command just allows the OneCloud Chain to receive a file upon start. In this case, we have it marked as “Required”.


The following OneCloud Command, CSV to JSON, converts the filtered Runbook tabular output to a JSON (JavaScript Object Notation) list. We do this because JSON is a cleaner structure for looping over list elements and retrieving key/value pairs than parsing tabular data.
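A rough Python equivalent of that conversion (column names are illustrative):

```python
import csv
import io
import json

# Two illustrative Runbook rows in tabular (CSV) form.
csv_text = """Description,SQL_Query
Load product dim,SELECT * FROM DIM_PRODUCT
Load sales fact,SELECT * FROM FACT_SALES
"""

def csv_to_json(text: str) -> str:
    """Turn tabular rows into a JSON list of objects (key/value pairs)."""
    records = list(csv.DictReader(io.StringIO(text)))
    return json.dumps(records, indent=2)

records = json.loads(csv_to_json(csv_text))
```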


The next two commands are specifically put together as a group because we need to loop over the JSON list. This group will retrieve each element in the JSON list and apply the correct key/value pairs for the respective Command. For example, the correct Snowflake SQL query will be applied and directly uploaded to Anaplan in combination with the associated Anaplan Import Data Source and Import Action. By organizing our OneCloud Commands like this, we have a logical serial order in which to process everything. Note in the example below, we have selected to iterate over the JSON file.
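The looping group can be pictured as iterating the JSON list and resolving each record into the parameters for one iteration. The field names below are illustrative stand-ins for the Runbook line items.

```python
# Each JSON record carries the parameters for one loop iteration
# (field names are illustrative stand-ins for the Runbook line items).
records = [
    {"SQL_Query": "SELECT * FROM DIM_PRODUCT",
     "Import_Data_Source": "Products.csv",
     "Anaplan_Import_Action": "Import Products"},
    {"SQL_Query": "SELECT * FROM FACT_SALES",
     "Import_Data_Source": "Sales.csv",
     "Anaplan_Import_Action": "Import Sales"},
]

def build_steps(recs: list) -> list:
    """Resolve each record into the (query, target file, action) for one pass."""
    return [
        (r["SQL_Query"], r["Import_Data_Source"], r["Anaplan_Import_Action"])
        for r in recs
    ]

steps = build_steps(records)
```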




When the OneCloud Chain is run, the six loops are executed (one per row) and the corresponding values are used when executing the Commands. A SQL query is executed against Snowflake, with a data set as the result/output.


The Anaplan Upload is the subsequent Command in the loop.  This Command uploads the Snowflake query Output from the prior Command which is controlled using the configuration detail of the Runbook.
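The chunk-size setting held in the Runbook drives how such an upload is broken up. Here is a minimal sketch of just the splitting logic; the Anaplan API calls themselves are omitted, and the payload is a stand-in.

```python
def split_into_chunks(data: bytes, chunk_mb: int) -> list:
    """Split an upload payload into fixed-size chunks, mirroring the
    chunk-size line item held in the Runbook."""
    size = chunk_mb * 1024 * 1024
    return [data[i:i + size] for i in range(0, len(data), size)]

payload = b"x" * (5 * 1024 * 1024)       # a 5 MB stand-in for query output
chunks = split_into_chunks(payload, 2)   # 2 MB chunks -> 2 MB, 2 MB, 1 MB
```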


The following is an example of the results with six loops uploading the Snowflake queries directly to Anaplan.


Meanwhile, in parallel, we have the Salesforce Commands also running on a separate branch in the Chain. These Commands follow the same concepts of filtering the Runbook rows to focus on the associated Salesforce queries, and uploading the output to the corresponding Anaplan Import Data Source.


The Salesforce chain looks and operates almost identically to the Snowflake chain, but uses the Salesforce connection instead.


Since we can query Salesforce with SOQL (which is pretty much just SQL for Salesforce), we execute each query by grabbing the SQL statement from each JSON record, running it, and passing the output to Anaplan for upload.
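That loop can be sketched as follows; `run_soql` is a stub standing in for OneCloud's Salesforce connector, and the records are hypothetical.

```python
# Hypothetical Runbook records, already filtered to the Salesforce rows.
records = [
    {"SQL_Query": "SELECT Id, Name FROM Account",
     "Import_Data_Source": "Accounts.csv"},
    {"SQL_Query": "SELECT Id, Amount FROM Opportunity",
     "Import_Data_Source": "Opps.csv"},
]

def run_soql(query: str) -> str:
    """Stub: a real Chain would execute this SOQL against Salesforce."""
    return f"results-of({query})"

# Grab each record's query, run it, and map the output to its Anaplan file.
uploads = {r["Import_Data_Source"]: run_soql(r["SQL_Query"]) for r in records}
```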


Once the “Snowflake to Anaplan” and “Salesforce to Anaplan” Chains have both completed, we can then execute the “Run Anaplan Process” chain.

Again, this follows the same pattern of filtering the Runbook file to just Data_Integration_via = “Anaplan”, turning the result set into JSON, and then iterating over each JSON record to grab the key/value parameters needed to execute the Anaplan Process load.



Hopefully this provides a starting point on how business users can control and manage their own integration needs for EPM applications. There are many other features we can also tie in, including email notifications, error handling, and uploading insightful success/failure logs into Anaplan. Stay tuned as we will highlight these in future posts.

If you would like to use OneCloud, then give the 30-day trial a go here:


Written by: 

James Wakefield - Servian EPM Practice Lead

Carrick Mortimer - Servian EPM Southern Region


For more information: replay the webinar "Simplify EPM Data Integration with Anaplan Run Book".