OneCloud Integration Studio is a powerful mechanism for creating integrations between enterprise systems. It can also be used to complete many administrative tasks and to ensure that your system processes run smoothly. In this Solution Highlight, we will explore how to use OIS to archive content on a network file system. While the example below discusses content stored within your network and managed by a GroundRunner, the concept can, with minor modifications, easily be extended to Cloud environments.

A common requirement is the need to archive log or application files, since writing entries to a single aggregate log file can degrade application performance. Moreover, troubleshooting with an unwieldy log file is difficult and inefficient. In other cases, simply archiving a series of log files can save disk space. Whatever the reason, OneCloud has a number of features that can fully automate this process.

Let’s review a configured Chain. In this Chain, we archive files that are older than 30 days. We start by getting a list of all of the files in the directory. Next, we create two directories: a snapshot directory, where a zip file containing all of the archived files will be stored, and a processing directory. Any files found in the directory that are more than 30 days old are moved to the processing directory. Once all of the files have been moved, the processing directory is zipped and stored in the snapshot directory, which allows files to be restored in the event that historical information is needed in the future. Finally, the processing directory is removed. This action ensures that the original versions of the files are deleted and disk space is recovered.

Configured Chain

Let’s look at each of the nodes of this Chain to understand the configuration.

List Directory Contents

We use the List Directory Command from the File Utilities BizApp to get the list of all of the files and subdirectories in the directory that needs to be cleaned.  We utilize a Workspace Variable to maintain the directory path.  
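Outside of OIS, the equivalent step in a standalone Python script might look like the sketch below. The `purge_dir` parameter stands in for the Purge Directory Workspace Variable; the name is illustrative, not part of the OIS configuration.

```python
from pathlib import Path

def list_directory(purge_dir):
    """Return every entry (files and subdirectories) in the directory
    to be cleaned, analogous to the List Directory Command's output."""
    return sorted(Path(purge_dir).iterdir())
```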

Step 1 - List Directory

Make Snapshot and Processing Directory

To make the snapshot and processing directories, we use the Make Directory Command from the File Utilities BizApp.  Notice the path creates a subdirectory within the directory specified in the Purge Directory Workspace Variable.

Step 2 - Make Snapshot Directory

Step 3 - Make Processing Directory

Additionally, when configuring the Make Directory Command, we leverage the built-in error handling of OneCloud Integration Studio and specify to ignore any errors caused by the directories that already exist.  This allows us to bypass the need to check if a directory exists before creating it.
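For comparison, the same "ignore the error if the directory already exists" behavior can be sketched in standalone Python, where `exist_ok=True` plays the role of the Command's error-handling setting:

```python
import os

def make_directory(path):
    # exist_ok=True means a second call for the same path is a no-op
    # rather than an error, so no existence pre-check is needed.
    os.makedirs(path, exist_ok=True)
```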

Ignore Error - Make Directory

Command Group

Next, we configure the Command Group.  Iteration is enabled for the Command Group to process each of the files found by the List Directory Command.

Command Group

Within the Command Group, we have three nodes - Get File/Folder Info Command to retrieve file metadata, a Conditional to check if the file is more than 30 days old, and a Move File Command to move the file to the processing folder if it is more than 30 days old.

Get File/Folder Info

The Get File/Folder Info Command from the File Utilities BizApp is used to retrieve the metadata of each file in the directory that is being swept.  Metadata includes information such as file creation date and file size.  The Output of this Command is used in the following Conditional to determine if a file is older than 30 days and therefore needs to be archived.
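As a rough standalone analogue, the metadata this Command returns can be sketched with a file stat call. The dictionary keys here are illustrative, not the actual Output field names of the Command:

```python
import os
from datetime import datetime

def get_file_info(path):
    """Return a small metadata dict for one directory entry."""
    st = os.stat(path)
    return {
        "size_bytes": st.st_size,
        "modified": datetime.fromtimestamp(st.st_mtime),
        "is_file": os.path.isfile(path),
    }
```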

Get File-Folder Info

Check if older than 30 days

This Conditional compares the file modification date against the run date of the Chain. If the modification date is more than 30 days before the system date, and the item being evaluated is a file rather than a directory, then the Conditional state will be success and the next Command in the Group will execute.


To achieve this outcome, we leverage a Variable Transformation on the System Date and Time Runtime Variable. First, we subtract 30 days from the current date and time to create a reference date.


Any file with a modification date before (less than) this date is more than 30 days old and can be flagged for archiving.

System-File Date

The second rule in the Conditional checks that a file, as opposed to a subdirectory, is being processed. We have limited the logic in this Conditional to files only, since archiving an entire subdirectory could have broader impacts.

Conditional Directory v File

File v Directory

If the Conditional has a success state, this means that the item in the iteration is a file that is more than 30 days old.  The next Command in the Group is then invoked.
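Expressed as a standalone check, the two rules of the Conditional combine as follows. This is a sketch of the equivalent logic, not the OIS configuration itself; the function and parameter names are illustrative:

```python
from datetime import datetime, timedelta
from pathlib import Path

def should_archive(path, now=None, max_age_days=30):
    """True when `path` is a file (not a directory) whose modification
    date falls before the reference date (now minus max_age_days)."""
    now = now or datetime.now()
    cutoff = now - timedelta(days=max_age_days)  # the reference date
    p = Path(path)
    return p.is_file() and datetime.fromtimestamp(p.stat().st_mtime) < cutoff
```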

Move to Processing Folder

The Move Command from the File Utilities BizApp is used to move the file from the directory that is being swept to the processing directory.  Each file in the list of items being iterated that matches the conditions will be moved as part of the Group iteration.
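In standalone form, the per-iteration move can be sketched like this (the `processing_dir` name is illustrative):

```python
import shutil
from pathlib import Path

def move_to_processing(file_path, processing_dir):
    # Moves one matched file into the processing directory, as the
    # Move Command does on each iteration of the Group.
    dest = Path(processing_dir) / Path(file_path).name
    return shutil.move(str(file_path), str(dest))
```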

Move File

Zip Processing Directory

At the end of the iteration, the Zip Command from the File Utilities BizApp is used to zip the processing directory and save it as a snapshot. The zip archive that is created is date and time stamped to clearly identify when it was created.
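A standalone sketch of the timestamped snapshot step might look like the following; the archive naming pattern is illustrative, since in OIS the name would be built from Runtime Variables:

```python
import os
import shutil
from datetime import datetime

def snapshot_processing_dir(processing_dir, snapshot_dir):
    """Zip the processing directory into a date/time-stamped archive
    stored in the snapshot directory; returns the archive path."""
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    base_name = os.path.join(snapshot_dir, f"archive_{stamp}")
    return shutil.make_archive(base_name, "zip", processing_dir)
```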


Delete Processing Directory

Finally, the Delete Directory Command from the File Utilities BizApp is used to delete the processing directory and all of the files that were moved into it during the Command Group iteration. We select the Recursive option to ensure that all files as well as the zProcessing directory itself are deleted.
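The recursive delete has a direct standalone equivalent, where the entire directory tree is removed in one call:

```python
import shutil

def delete_processing_dir(processing_dir):
    # Removes the directory and everything inside it, matching the
    # Delete Directory Command with the Recursive option selected.
    shutil.rmtree(processing_dir)
```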



The image below shows the resulting zip archive created in the Archive_Snapshots directory.


We hope that you find this Solution Highlight helpful. We encourage you to modify the process as needed to support the data archiving and data retention policies of your organization. We welcome any comments or questions, and if you need assistance configuring this solution, please don’t hesitate to contact us.