Node.js production best practices: performance, reliability and stability

Agrippa Solutions (www.agrippa.no), a company from Norway, bases its solutions on the Node.js platform. Every day its Node.js applications serve tens of thousands of operations and transactions per second — a load Node.js copes with easily, and their REST API works perfectly. But even on the best platform, a badly written application can lead to real failure. In this article I present performance, reliability and stability best practices for Node.js applications deployed to production.

Below is a list of best practices to improve the performance of your Node.js application:

  • Validate input parameters
  • Use bundle and gzip compression
  • Don’t use synchronous functions
  • Handle logging correctly
  • Handle exceptions properly
  • Optimize SQL queries and Stored Procedures

 

Validate input parameters

A Node.js application that does not handle exceptions can crash on invalid input parameters: an invalid type conversion, a string that is too long, and so on. That means downtime until the application is restarted, customers repeating the same operations many times, and a growing number of REST API requests that all end in failure. Even parameters that make it through the API into the database can cause exceptions and error messages there (not to mention values saved into the wrong columns). Validation is therefore the first link in the chain that stabilizes the operation of our server.

Every REST API function should check the correctness of its input parameters at the beginning, before passing them to the database or to other functions.

There are many ready-made validation packages available for this purpose. One of them is express-validator (https://www.npmjs.com/package/express-validator), which is well suited for production. With this module we can validate body, query and header data.

Below is a sample of how to use it:
1. First, register the module in the main file of your Node.js server, e.g. index.js or server.js:
var util = require('util'),
    bodyParser = require('body-parser'),
    express = require('express'),
    expressValidator = require('express-validator'),
    app = express();

app.use(bodyParser.json());

// this line must be immediately after any of the bodyParser middlewares!
app.use(expressValidator({
    errorFormatter: function (param, msg, value) {
        var namespace = param.split('.'),
            root = namespace.shift(),
            formParam = root;

        while (namespace.length) {
            formParam += '[' + namespace.shift() + ']';
        }
        return {
            param: formParam,
            msg: msg,
            value: value
        };
    }
}));

app.listen(8888);

2. Declare your own module, e.g. in a file validator.js, with the source code below. It will be used for validation in many files of our application:


module.exports = function (request, response, schema) {
    request.checkBody(schema);

    var errors = request.validationErrors();
    if (errors) {
        console.error({ Error: errors });
        response.status(422).send({ Error: { Validator: errors } });
        return true;
    }
    return false;
};

3. Declare the schema for validating the input parameters in a separate file:

module.exports = {
    'sensor_device_id': {
        isInt: {
            errorMessage: 'sensor_device_id - Integer expected'
        },
        errorMessage: 'sensor_device_id is required'
    },
    'active': {
        notEmpty: true,
        errorMessage: 'Active Validation error'
    },
    'organisation': {
        isInt: {
            errorMessage: 'Organisation - Integer expected'
        },
        errorMessage: 'Organisation is required'
    }
};

4. Place this instruction at the beginning of every action, substituting the correct name of the file with the defined schema:

if(validator(request,response,schema)) return next();

For instance:


const validator = require('../schema/validator.js');
const schema_sensorActive = require('../schema/sensorActive.js');

app.post('/sensor', function (request, response, next) {

    if (validator(request, response, schema_sensorActive)) return next();

    const mssql = request.service.mssql;
    const params = request.body;
    const {
        organisation = null,
        sensor_device_id = null,
        active = 1
    } = params;

    const sqlQuery = 'EXEC [AppCenter].[Sensor_Active_UPDATE] ?, ?, ?, ?';
    const sqlParams = [request.auth.UserID, organisation, sensor_device_id, active];

    mssql.query(sqlQuery, sqlParams, {
        success(res) {
            // try/catch guards against an exception from a duplicate send
            try {
                return response.status(200).send(res);
            } catch (err) {
                return response.status(201).send({ Error: res });
            }
        },
        error: handleSqlError(response, sqlQuery, sqlParams)
    });
});

Every time the validator detects an error, the parameter name and the error message we defined will be sent to the front-end client application.

You can also validate output parameters, e.g. after processing data fetched from the database, from a device, etc.

Use bundle and gzip compression

For static files (e.g. HTML files) or JSON responses, gzip compression can greatly decrease the size of the response body and hence increase the speed of a web app.
For applications built on Angular, Angular 2 or React, creating bundled and compressed files is very important; we have many tools for this operation, such as webpack.
We also can't forget about huge data structures sent in JSON format. Pagination does not cover edit operations, where we sometimes fetch a huge document structure in a single request. We could replace such requests with multiple fetches of smaller pieces of data — but compression is handled automatically by browsers and mobile devices, so it is usually the simpler win.

const express = require('express')
const compression = require('compression')
const app = express()

app.use(compression())

In the browser's developer tools you can see the JSON response compressed by our Node.js server (Content-Encoding: gzip).

Don’t use synchronous functions

Node.js is based on an asynchronous architecture: it is an open-source, cross-platform runtime environment for developing highly scalable server-side web applications, especially web servers, written in JavaScript. It provides an event-driven architecture and a non-blocking I/O API designed to optimize an application's throughput and scalability for real-time web applications. Synchronous functions and methods, by contrast, tie up the executing process until they return.

More details:

Node.js vs PHP

Node.js Server Architecture in practice (IBM, Microsoft, Yahoo!, Walmart, Groupon, SAP, LinkedIn, Rakuten, PayPal, Voxer and GoDaddy)

Node.js – Features and Application


app.get('/sensor', function (request, response, next) {

    if (validator(request, response, schema)) return next();

    mssql()
        .then(function (data) {
            return performData(data);
        })
        .then(function (result) {
            try {
                return response.status(200).send(result);
            } catch (err) {
                return response.status(201).send({ Error: result });
            }
        })
        .catch(handleSqlError(response, sqlQuery, sqlParams));
});

Handle logging correctly

In the previous point we learned that in Node.js we should use asynchronous functions. Remember that functions such as console.log() and console.error(), which print log messages, are synchronous when the destination is a terminal or a file, so they are not suitable for production. If you are going to log application activity (for example, to record exceptions), use an asynchronous logging module such as Winston instead of those synchronous functions.

 

Handle exceptions properly

Use a try-catch construct to catch exceptions, and not only in synchronous code. For instance, if a T-SQL stored procedure returns two result sets, the code below would try to send the response twice — without the try-catch, that exception would crash the Node.js server:

mssql()
    .then(function (data) {
        return performData(data);
    })
    .then(function (result) {
        try {
            return response.status(200).send(result);
        } catch (err) {
            return response.status(201).send({ Error: result });
        }
    })
    .catch(handleSqlError(response, sqlQuery, sqlParams));
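A related defensive pattern is to route every send through a small helper that refuses to answer twice. The helper below is a sketch of my own, not part of Express, but it relies on the real response.headersSent flag that Express sets once headers have gone out:

```javascript
// Guard against sending an HTTP response twice. Express exposes
// response.headersSent, which becomes true once headers are written.
function safeSend(response, status, body) {
  if (response.headersSent) {
    // A second send would throw and could crash an unguarded server.
    console.error('Attempted duplicate response; ignoring');
    return false;
  }
  response.status(status).send(body);
  return true;
}
```

With safeSend, a stored procedure that unexpectedly yields two result sets produces a logged warning instead of an uncaught exception.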

Optimize SQL queries and Stored Procedures

And the last, very important issue: even superbly written Node.js code will feel slow when the response from the DB server takes a few seconds, because that delay translates directly into the response time of our REST API. The second important point: do more data processing on the T-SQL side, in the database, and less on the Node.js side. This particularly applies to data collected from multiple tables.
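Slow stored procedures are easiest to fix once they are visible. A minimal timing wrapper — a sketch with assumed names and threshold, usable around any promise-returning query function — logs every call that exceeds a limit:

```javascript
// Wrap a promise-returning query function and warn about calls that
// exceed thresholdMs, so slow T-SQL shows up in the application logs.
function timedQuery(queryFn, sqlQuery, sqlParams, thresholdMs = 2000) {
  const start = process.hrtime.bigint();
  return Promise.resolve(queryFn(sqlQuery, sqlParams)).then((result) => {
    const elapsedMs = Number(process.hrtime.bigint() - start) / 1e6;
    if (elapsedMs > thresholdMs) {
      console.warn(`Slow query (${elapsedMs.toFixed(0)} ms): ${sqlQuery}`);
    }
    return result;
  });
}

// Example with a stub standing in for a real mssql driver call:
timedQuery(
  (sql, params) => Promise.resolve([{ sensor_device_id: 1, active: 1 }]),
  'EXEC [AppCenter].[Sensor_Active_UPDATE] ?, ?, ?, ?',
  [7, 42, 1, 1]
).then((rows) => console.log(rows.length));
```

The warnings point straight at the queries and stored procedures worth optimizing on the database side.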

 

My own company, JRB System (www.jrbsystem.com), cooperates with Agrippa Solutions on Node.js technologies.

March 1st, 2017
