Building a file reader

Jul 14, 2022

A common task for any IT department, or individual developer, is to move data from one location to another. It is a simple task to do once, but if it must be done many times for many different files (each with its own metadata, file type and data structure), it can become tedious. Luckily, there are many low-code tools that make this task much more manageable. As an example, Linx allows us to import a file and make the data available elsewhere quickly.

In this post, we will look at a simple file import process with Linx and make the data available via a REST service.

Here are the sections in this post if you want to go to anything specifically:
1. Specification: A file reader
2. Getting Started
3. Loading the file
4. Building the REST API
5. Debugging the solution
6. What to do next

The specification: a file reader

Before anything can be built, we need to determine what we need to build. I decided to go with a generic solution that every IT department will be sure to come across: A simple file importer that makes the data available for consumption elsewhere.

In a few minutes, I was able to create a little app that loads data from a CSV file and makes it available via REST API (As per the below GIF):

You can even host the REST Service in debug mode, making testing a breeze. Here is what you get when you call the service from a browser (using Postman will give you a much nicer testing experience; however for our purposes, a browser is fine):

Something worth noting is that Linx has moved towards a "low-code for developers" model. This means that it is quite easy for a developer to use, since it follows patterns and concepts similar to coding. One notable change in the new Linx designer is that lists now need to be initialised. All this means is that before you can assign or add a value to a list, you need to set the list to be empty. More details follow in the Creating the solution section.

Creating the solution:

We will be using three significant elements for this solution:

  • A Custom type to store our customer data
  • A Function that will read the file and return a list of customers (one customer per row in the CSV file)
  • A SimpleRESTHost to make the data available as a JSON object 

The solution and CSV are available to download and tinker with. You can get them here

Getting started

I generated the CSV File using an online data generator. The file is quite simple, with 100 records containing Name, Surname and Number fields. 
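If you would rather reproduce the input without an online generator, a similar 100-record CSV with Name, Surname and Number headers can be written with a few lines of plain Python (a sketch only; the field values here are made up):

```python
import csv

def write_sample_csv(path="customers.csv", records=100):
    """Write a CSV with a header row and `records` made-up customer rows."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["Name", "Surname", "Number"])  # header fields from the post
        for i in range(1, records + 1):
            writer.writerow([f"Name{i}", f"Surname{i}", f"555-{i:04d}"])
```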

I created my first function with the name LoadCustomers since we will be reading from the file here. A personal choice is to break down each task into its own process. I want to ensure that processes are as small as possible to make future maintenance, enhancements or debugging easier. 

I also decided to create a custom type to store each customer's information. Think of custom types as objects: when we create a custom type for a customer, we are essentially creating an object that contains the three fields of that customer record. These custom types are handled as JSON objects.
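To illustrate the idea outside of Linx, the same customer type could be modelled in Python as a small data class that serialises to a JSON object (a sketch only; the field names are assumptions based on the example CSV):

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class Customer:
    """The three fields of one customer record from the CSV."""
    name: str
    surname: str
    number: str

# One customer record rendered as a JSON object:
print(json.dumps(asdict(Customer("Jane", "Doe", "555-0101"))))
```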

Loading the file

I added functionality in the LoadCustomers function to load the file. The function's result is set to a list of customers, which means that after it has executed, we should be presented with the 100 customer records from the file. The function follows these steps:

  • Initialise the output list
  • Create a separate list of customers
  • Read the file
  • Add the record to the list of customers
  • Assign the list of customers to be the output of the process
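The steps above might look like this in plain Python (a sketch of the logic, not the Linx implementation; the file name and header fields are assumptions based on the example CSV):

```python
import csv

def load_customers(path="customers.csv"):
    customers = []                      # initialise the output list as empty
    with open(path, newline="") as f:   # read the file
        reader = csv.DictReader(f)      # field names come from the header row
        for row in reader:              # one customer per row
            customers.append({          # add each record to the list
                "name": row["Name"],
                "surname": row["Surname"],
                "number": row["Number"],
            })
    return customers                    # the list becomes the function's result
```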

To read the file, we do need to add the File plugin to our solution. This is simply done by clicking on the ADD PLUGINS button and adding the File plugin. 

Something that I really appreciate is how lazy Linx allows me to be. The TextFileRead component has a great feature that lets you load fields from the file: if your file has headers for each field, these headers can be pulled in and will be added to the field list. With a small file like ours this saves little effort, but imagine working with a file that has 52 fields.

Of course, you do not have to read from a text file. You can also read from an Excel file, PDF, database, Google Drive file (such as a sheet) and more. It all depends on your requirement. 

After setting the result of the function to the list of customers we retrieved, we have a function that reads the file and returns our customer data.

Now that we have the data, we want to do something with it. There are a few options: you can load it into a database, reformat it, run calculations on it and write the results to a new file, build a report from it, email it to a recipient, make it available via an API, and more. For our example, we will be making the data available via a REST API.

Making the data available via a REST Service

We will be using a REST service for our API; you can read more about the power of REST here

Since we want to keep the process as simple as possible, I used a SimpleRESTHost component to build my API. We need to add the REST plugin to our solution before we can use this component. 

If you want a bit more control and you are looking to create a Swagger file for your API, you can use a RESTHost component. This component will allow you to import the definition for the web service through a Swagger API description file. 

You need to set a few things on your RESTHost:

  1. BaseURI – I chose to create a Setting because it might need to change later on. Settings can be specified when the solution is hosted via the Linx Server; this way we can have different BaseURIs for each environment. Whatever you choose, the value should be “http://localhost:8080/service”
  2. Operations – here you set up the operations, otherwise known as web methods, that the REST service will expose. I created a customer operation that returns a list of customers in its response body. I use this event to return the data in the CSV file as a JSON object.

If you choose to use the SimpleRESTHost, Linx can generate the API documentation for you, as either Swagger or Redocly. This documentation can be accessed by calling the service and adding either /swagger or /redocly to your Base URI.

Because we already did the work of reading from the file and outputting it as a list, all we need to do now is call the LoadCustomers function and assign its output to the Response body.
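Outside of Linx, that wiring can be sketched with Python's standard library: an HTTP handler that serves the customer path and returns the loaded records as JSON in the response body. This is a hypothetical equivalent, not the Linx implementation; here load_customers is a stand-in returning sample data rather than the 100 records from the file.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def load_customers():
    # Stand-in for the file-reading function; it would normally return
    # the list of customer records read from the CSV.
    return [{"name": "Jane", "surname": "Doe", "number": "555-0101"}]

class CustomerHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The customer operation: respond with the records as a JSON body.
        if self.path == "/service/customer":
            body = json.dumps(load_customers()).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):
        pass  # keep the sketch quiet

# To host the service on the BaseURI from the post:
# HTTPServer(("localhost", 8080), CustomerHandler).serve_forever()
```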

You can also choose to enhance your solution by adding security, error handling and logging.  

Debugging the solution

When ready, we can test our solution by debugging it. Clicking the debug button compiles the solution. The debugger will highlight any errors that are stopping the solution from compiling; if there are none, you are good to go. If you only want to debug the file loading, you can select that function and click the debug button.

As with any good IDE, breakpoints are part of the Linx debugging process. You can add a breakpoint in debug mode to stop the debugging process at a certain point: right-click on a component and select Add Breakpoint. This is useful when you want to review the process or inspect the values of variables and outputs at a specific point.

To debug the REST service as illustrated above, select the SimpleRESTHost component, click the debug button, and then click start. You can call the service from a browser, or Postman, using the BaseURI that was set plus the Path of the operation. For our example, that is localhost:8080/service/customer.

As a bonus, here is the output of Postman when calling the REST Service:

What to do next

After we have debugged the solution and are happy with the outcome, it can be deployed to a Linx server. The Linx server hosts solutions, which means you do not need to run them on your local machine. Depending on how they are set up, your solution’s processes, functions and services will run on their own: for example, the REST service we created will be available for users to call. There are also other service types such as timers, directory watchers (if you want to monitor a directory for an event) or any other event monitor you want to set up.

As previously mentioned, possible enhancements to this solution include:

  • Adding Error management
    • This can be done by adding TryCatch components to catch errors
  • Logging
    • To log successful and error events
    • To log the read data into a target data container such as a database
  • Data Quality and Cleanups
  • Notifications

All of this can be done with Linx. The solution and the CSV are available for you to download.