Automated file backups to Amazon S3

This sample creates a service that automatically backs up a file to Amazon S3 whenever a file in a specified directory is changed.

To achieve this, the Linx Solution uses the Amazon S3 plugin and the DirectoryWatch service contained in the File plugin.


Prerequisites

  • AWS user credentials and permissions have been set up on IAM to allow the user to perform the S3 PutObject function.

  • An Amazon S3 bucket to which files can be backed up has already been created.


To run the sample

  1. Download the sample file.

  2. Open the BackupFiles solution with Linx Designer.

  3. Click the Settings button on the toolbar.

  4. On the Settings tab, enter the AWS credentials of a user that has been registered on AWS IAM and has the required permissions for accessing S3.

    Example (where the values of the mandatory variables are represented by X's):

    {"Key":"XXXXXXXXXXXXX", "KeySecret":"XXXXXXXXXXXXXXXXX", "RegionEndpoint":"XXXXXXXXXX", "UserAccountNumber":"999999999999", "UserName":"XXXXXXXXXX"}

  5. In the Properties section for the DirectoryWatch service, note the Path value (c:\temp\backup); update this value to a suitable path, or create the c:\temp\backup directory on your machine.

  6. Follow the steps in the next section, Creating the sample, to recreate the end-to-end implementation of the sample.

Creating the sample

The following steps re-create the solution used in this sample; they can also serve as a guide when deploying and running it:

  1. Create a Solution, Project, and Process with descriptive names.

  2. Add the following Plugins to your Solution:

    • Amazon S3
    • File

  3. From the Amazon S3 plugin, add the PutObject function to your process, and rename it to UploadFile.

  4. In the Solution Explorer section, click on UploadFile to display the properties for the process; then do the following:

    a. Open the editor for Input

    b. Add an input field with a Name of "MyFilePath", select "String" as the Type, and leave the Value blank.

  5. Click on the UploadFile function in the main canvas, then enter the following details in the Properties section:

    a. AWS Credentials of the user performing the function

    b. Bucket name – the name of the bucket to which the file will be uploaded

    c. File path – the local path of the file to upload (in this example, select MyFilePath from the drop-down list)
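
    Outside Linx, the equivalent of this UploadFile/PutObject configuration can be sketched with the AWS SDK for Python (boto3). The bucket name, paths, and helper names below are placeholders for illustration, not values from the sample:

    ```python
    import os

    def s3_key_for(file_path):
        # Derive the S3 object key from the local file path (here, just the file name).
        return os.path.basename(file_path)

    def upload_file(file_path, bucket):
        # Mirrors the sample's UploadFile step: a PutObject-style upload to the bucket.
        import boto3  # deferred import so s3_key_for works without the AWS SDK installed
        s3 = boto3.client("s3")  # credentials are read from the environment/IAM configuration
        s3.upload_file(file_path, bucket, s3_key_for(file_path))

    # Example (requires valid AWS credentials and an existing bucket):
    # upload_file("/tmp/backup/report.txt", "my-backup-bucket")
    ```

    boto3's upload_file wraps the PutObject operation (adding multipart handling for large files); the raw call would be s3.put_object(Bucket=..., Key=..., Body=...).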

  6. From the File plugin, add the DirectoryWatch Service to the Project.

  7. Enter the details of the Properties associated with the DirectoryWatch Service:

    a. Notify filter – select FileName, Size, and LastWrite

    b. Watch options – Select Watch for Changes

    c. Path – indicate the path of the directory to watch

    The effect of the above settings is that ChangedEvent will be triggered whenever the name, size, or last-write date/time of any file in the specified directory changes.
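
    The three notify filters can be illustrated in plain Python: a file counts as "changed" when its presence, size, or last-write time differs between two looks at the directory. This polling snapshot is only a sketch of the semantics, not how the Linx File plugin is implemented:

    ```python
    import os

    def snapshot(directory):
        # Record (size, last-write time) for every file in the directory, keyed by name.
        state = {}
        for name in os.listdir(directory):
            path = os.path.join(directory, name)
            if os.path.isfile(path):
                st = os.stat(path)
                state[name] = (st.st_size, st.st_mtime)
        return state

    def changed_files(before, after):
        # Names whose presence, size, or last-write time differ between two snapshots.
        names = set(before) | set(after)
        return sorted(n for n in names if before.get(n) != after.get(n))
    ```

    A polling watcher would compare snapshots on a timer; the DirectoryWatch service instead raises ChangedEvent as the changes happen.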

  8. Set up ChangedEvent by doing the following:

    a. Click on ChangedEvent in the Solution Explorer section

    b. Drag and drop UploadFile from the Solution Explorer section onto the main canvas – the effect of this is that, when triggered, ChangedEvent will call the UploadFile function

    c. Select "=$.Input.Data.FullPath" from the drop-down list of the FilePath property
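
    In effect, step 8 binds the event's data to the function's input: when ChangedEvent fires, the changed file's FullPath becomes UploadFile's FilePath. A minimal sketch, with a hypothetical handler and a stand-in upload function:

    ```python
    def changed_event_handler(event_data, upload_file):
        # Mirrors step 8: ChangedEvent calls UploadFile with the changed file's path.
        # The dictionary lookup stands in for the =$.Input.Data.FullPath binding.
        upload_file(event_data["FullPath"])

    # Stand-in upload function that just records what it was asked to upload:
    uploaded = []
    changed_event_handler({"FullPath": "/tmp/backup/report.txt"}, uploaded.append)
    ```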

  9. Debug the process

    For details on how to debug, see the Linx documentation

  10. Deploy the Solution to Linx Server

  11. Start the Service on Linx Server by doing the following:

    a. Log in to Linx Server

    b. Select the applicable Solution

    c. Click the Start button to start the service

  12. Verify the successful backup of your files

    To check whether your service is executing as expected, do the following:

    a. From the specified directory, change a file’s content and save the file

    b. On Linx Server, check that the Event executed successfully

    c. On the AWS console (S3), check that the file that was changed was indeed backed up to the relevant S3 bucket