It slices! It dices! But wait! There is more!

Photo from Unsplash by Capar Camille

Use the right tool for the job. But it is awesome when we find a tool that can be used effectively in many different ways.

I use Node to develop solutions for my customers. Those solutions may be just an API, or an integration with a front end framework over a REST interface. But they sometimes involve setting up WebSocket servers, writing command line scripts, running one-off data manipulation tasks, creating or accessing job queues, or even building a comprehensive package that does everything in a modular way.

Here is how I break down my tasks (minimal sketches of most of these follow the list):

  • Data Access. I use Knex.js to create my data models when I am dealing with relational databases. For non-relational databases I’ll use the appropriate library. Either way, I wrap access to the data store in meaningful data objects that can be used independently of the application. This lets me call my data objects from batch scripts or other applications with little effort.

  • Business Logic. I like to isolate the application logic into a separate layer, especially for non-trivial applications. It is handy to separate what we want to do from how we ask for it to happen and from how the data is stored. I find that this separation gives me greater flexibility to respond to the evolving needs of the application without getting bogged down in the communication methods. It also makes breaking the application into smaller modules much easier.

  • APIs. I use Express.js to quickly build access points when needed. Those access points are designed to be very light: I extract the information I need from the request and then call a solution-specific logic layer to do any data processing or retrieval. That logic layer in turn calls out to data objects, which interact with the database to store and retrieve data from persistent storage. I’ll create REST endpoints where it makes sense, but I’ll also create POST endpoints for the more complex requests.

  • WebSockets. When it makes sense I’ll use the ws module to create a socket server. I may set up express-ws as well to expose the socket server through an HTTP/HTTPS endpoint. When possible I try to use the same business logic objects that the API calls.

  • Reporting. I leave the presentation of the data to dedicated systems — such as Tableau or a charting tool. I use Node though to prepare the data for those systems.

  • File Management. There are times we just need to move a file, remove it, or read its contents. I use the native fs module, or the fs-extra module, for these steps. I use through2 when I’m processing large data sets with streams and need to pipe structured data into another step. I’ll also use fast-csv to work with CSV files.

  • Pre-Deployment Processing. In practice this normally means running Webpack, Parcel, or another bundler against the code. Sometimes, though, there is preparation work to be done when the next set of code is deployed: applying database structure changes, revising any dynamic templates, collecting files, and so on. In these cases I write a script that handles whatever steps the bundler can’t. In essence it is batch processing.

  • Scheduled Tasks. I create JS files that each perform a specific task. This could be purging outdated logs, processing the job queue, sending bulk emails out of band, and so on. I write each script with a single intent, then use cron to call that script at a reasonable interval. These tasks are usually the more processing-intensive things that would not play well within a web page, or housekeeping chores that keep the application humming along at a brisk pace.

  • Batch Processing. Too often I have created an interface to allow for easy data capture, and then management wants to do a bulk entry but does not want to expose a bulk entry process in the application interface. This could be pre-populating the application with years of data, or ALL the products, or updating hundreds of user records at once. These tasks may be one-offs, or they may recur on a regular basis. Either way the work is pretty much the same: read the source data, massage it in some way, then do something with it.

  • Command Line Scripts. Similar to batch processing, but where a batch script has a much narrower usage pattern, a command line script can take arguments that define where the source data is, the desired location for the output, and other settings that adjust how the processing is done. I use yargs or minimist to parse the command line parameters. From there it is pretty much just like batch processing.
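
First, the data access pattern. Here is a minimal sketch of a Knex-backed data object, assuming a configured Knex instance exported from a db module; the table and function names are hypothetical:

```javascript
// users.js: a hypothetical data object wrapping a "users" table.
const db = require('./db'); // a configured Knex instance

// Every function returns a promise, so the same module works from an
// API route, a batch script, or a scheduled task.
module.exports = {
  findById(id) {
    return db('users').where({ id }).first();
  },
  findActive() {
    return db('users').where({ active: true });
  },
  create(user) {
    // returning() needs a database that supports it, such as Postgres.
    return db('users').insert(user).returning('id');
  },
};
```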
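
The logic layer builds on those data objects and knows nothing about HTTP or storage. A sketch, with a hypothetical registerUser rule:

```javascript
// user-service.js: a hypothetical logic-layer module. It decides what
// should happen; the data object decides how it is persisted.
const users = require('./users');

async function registerUser(input) {
  if (!input.email || !input.email.includes('@')) {
    throw new Error('A valid email address is required');
  }
  return users.create({ email: input.email, active: true });
}

module.exports = { registerUser };
```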
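
The Express access points stay thin: pull what you need from the request, delegate to the logic layer, and translate the result into a response. A sketch using the hypothetical modules above:

```javascript
const express = require('express');
const { registerUser } = require('./user-service');

const app = express();
app.use(express.json()); // parse JSON request bodies

app.post('/users', async (req, res) => {
  try {
    const id = await registerUser(req.body);
    res.status(201).json({ id });
  } catch (err) {
    res.status(400).json({ error: err.message });
  }
});

app.listen(3000);
```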
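
For the socket server, a minimal ws sketch. The message shape is made up, but the handler calls the same logic layer the REST route does:

```javascript
const WebSocket = require('ws');
const { registerUser } = require('./user-service');

const wss = new WebSocket.Server({ port: 8080 });

wss.on('connection', (ws) => {
  ws.on('message', async (raw) => {
    const msg = JSON.parse(raw); // assume JSON messages
    if (msg.type === 'register') {
      const id = await registerUser(msg.payload);
      ws.send(JSON.stringify({ type: 'registered', id }));
    }
  });
});
```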
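
For file management, here is a sketch of the streaming pattern: read a large CSV, reshape each row with through2, and write the result back out without holding the whole file in memory. The file and column names are hypothetical:

```javascript
const fs = require('fs-extra');
const csv = require('fast-csv');
const through2 = require('through2');

fs.createReadStream('input.csv')
  .pipe(csv.parse({ headers: true }))  // rows become objects
  .pipe(through2.obj(function (row, _enc, done) {
    // Massage each row as it streams through.
    done(null, { email: row.email.toLowerCase(), name: row.name.trim() });
  }))
  .pipe(csv.format({ headers: true })) // objects back to CSV
  .pipe(fs.createWriteStream('output.csv'));
```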
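
A pre-deployment script might look like this sketch, reusing the same Knex instance for migrations; the templates directory is hypothetical:

```javascript
// deploy-prep.js: the steps the bundler can't handle.
const fs = require('fs-extra');
const db = require('./db');

async function main() {
  await db.migrate.latest();                    // apply database structure changes
  await fs.copy('templates', 'dist/templates'); // collect files into place
  await db.destroy();                           // release the pool so the process exits
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```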
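
A single-intent scheduled script, with hypothetical paths and a sample crontab line:

```javascript
// purge-logs.js: one script, one intent.
// A crontab entry such as:
//   0 2 * * * /usr/bin/node /srv/app/scripts/purge-logs.js
// would run it nightly at 2 AM.
const db = require('./db');

async function main() {
  const cutoff = new Date(Date.now() - 30 * 24 * 60 * 60 * 1000);
  const deleted = await db('logs').where('created_at', '<', cutoff).del();
  console.log(`Purged ${deleted} log rows older than 30 days`);
  await db.destroy();
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```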
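
Batch and command line scripts share the same read-massage-write shape; the only difference is where the settings come from. A sketch using minimist, with hypothetical flags and files:

```javascript
// import-products.js: run as
//   node import-products.js --in legacy.csv --out products.json
const argv = require('minimist')(process.argv.slice(2));
const fs = require('fs-extra');
const csv = require('fast-csv');

const rows = [];
fs.createReadStream(argv.in)
  .pipe(csv.parse({ headers: true }))
  .on('data', (row) => {
    // Massage each record on the way through.
    rows.push({ sku: row.sku.trim(), price: Number(row.price) });
  })
  .on('end', () => fs.writeJson(argv.out, rows, { spaces: 2 }));
```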

For one-off scripts I just call these with node path/to/my_script.js. For the recurring tasks, though, I add a script property to the package.json file so I can execute the commands with npm run mytask-style statements.
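
The scripts block is just a mapping from a task name to the command you would otherwise type; the names here are hypothetical:

```json
{
  "scripts": {
    "purge-logs": "node scripts/purge-logs.js",
    "deploy-prep": "node scripts/deploy-prep.js"
  }
}
```

With that in place, npm run purge-logs is all anyone needs to remember.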

Do you use Node.js in a different way? If so, leave a comment and let me know. It would be great to hear of other approaches I can explore.