Intro ⚡

When we create an application in Node.js, it handles requests sequentially. For example, user A makes a request to the /slow endpoint, which runs a very heavy process. If user B then hits the same application on a different endpoint, e.g. /, while A’s process is still running, user B has to wait for user A’s process to finish. In this article we’ll learn how to make our Node.js application handle requests concurrently. ⚡⚡


Before we talk about how to make our Node.js application run concurrently, it’s good to know the difference between concurrency and parallelism.


What is Concurrency / Concurrent?

“Concurrency means we can work on several jobs by doing one at a time and switching between them.”

For example: we eat -> drink -> open our cellphone -> eat -> drink -> chat. This is concurrency: at any one moment we are doing a single job, but WE CAN SWITCH BETWEEN JOBS.

What is Parallelism?

“Parallelism means doing several jobs at the same time.”

For example, we eat while watching a movie, we cook while on the phone, we walk while looking at our cellphone, and so on. WE DO SEVERAL JOBS AT THE SAME TIME.

In practice we don’t need to worry too much about the difference between concurrency and parallelism, because when we build a concurrent application it usually comes with some parallelism as well: we can switch between jobs, and on multiple cores we can also run several jobs at once.
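To make the switching concrete, here is a minimal standalone sketch (my own example, not from the repository) of concurrency on Node’s single thread: two “jobs” take turns by yielding back to the event loop with setTimeout.

```javascript
// Two "jobs" run concurrently on one thread: each step hands control
// back to the event loop with setTimeout, so the jobs interleave.
const order = [];

function job(name, steps) {
  let i = 0;
  function step() {
    order.push(`${name}:${i}`);
    i += 1;
    if (i < steps) setTimeout(step, 0); // yield so the other job can run
  }
  step();
}

job("eat", 2);
job("drink", 2);

setTimeout(() => console.log(order.join(" -> ")), 50);
// eat:0 -> drink:0 -> eat:1 -> drink:1
```

At any instant only one step is running, but the jobs still interleave: that is concurrency without parallelism.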

Concurrency and Parallelism Diagram


Jump into the code ⚡️

For practice we’ll use ExpressJS as the HTTP server. You can clone my repository, then install the dependencies with npm install or yarn. The project structure looks like this:

.
├── node_modules/
├── package.json
├── index.js
├── my-heavy-process.js
├── README.md
└── .gitignore

Next, we will create two endpoints: / and /slow.

// index.js
const app = require("express")();
const { fork } = require("child_process");

const PORT = 8888;

app.get("/", function (request, response) {
  response.send("This page should be fast"); // send() also ends the response
});

app.get("/slow", function (request, response) {
  const fiveBillion = 5_000_000_000;
  let count = 0;

  for (let i = 0; i < fiveBillion; i++) {
    count += i;
  }

  response.send(`The final count is : ${count}`); // send() also ends the response
});

app.listen(PORT, function () {
  console.info(`Server is running on port ${PORT}`);
});

Run the application with node index.js, then open http://localhost:8888/. It loads very fast because there is no heavy work behind it. But if we open the /slow endpoint, it is slow because of the CPU-heavy loop.

And then, what’s the problem? 🤔

Hmmmm, the problem is that if we open the /slow endpoint and, before it finishes, make another request to the / endpoint, that request waits for /slow to finish. Our application is handling requests sequentially, first in, first out: the single main thread is busy with the heavy loop and cannot serve anything else.
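You can observe this blocking without Express at all. In this standalone sketch (my own, not from the repository), a timer is scheduled for 10 ms, but a CPU-bound loop, standing in for the /slow handler, keeps the event loop busy, so the timer fires far later than requested.

```javascript
// The timer below wants to fire after 10 ms, but the synchronous loop
// (our stand-in for the /slow handler) blocks the single main thread,
// so the event loop cannot run the callback until the loop is done.
const scheduledAt = Date.now();
let delay;

setTimeout(() => {
  delay = Date.now() - scheduledAt;
  console.log(`Timer wanted 10 ms, actually fired after ${delay} ms`);
}, 10);

let count = 0;
for (let i = 0; i < 1_000_000_000; i++) {
  count += i; // CPU-bound work, never yields to the event loop
}
```

The timer here is in the same position as user B’s request to /: already queued, but stuck behind the loop.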


sample-1


Fix the problem

Now we’ll update our /slow endpoint so it can handle requests concurrently.

First, we fork the my-heavy-process.js file when a user calls our /slow endpoint, so the heavy work runs in the forked child process instead of the main process.

const childProcess = fork("./my-heavy-process.js");

Note: fork is imported from the child_process module; see the second line of the index.js file.


Next, we send the string “start” to the forked child process.

childProcess.send("start", function (error) {
  if (error) {
    console.error("Child process error");
    console.error(error.message);
  }
});

Next, we register a listener to capture messages coming back from our child process (my-heavy-process.js), and send the result to the client.

childProcess.on("message", function (data) {
  response.send(`The final count is : ${data}`); // send() also ends the response
});

So, our /slow endpoint will look like this.

// index.js
app.get("/slow", function (request, response) {
  const childProcess = fork("./my-heavy-process.js"); // <-- fork child process

  childProcess.send("start", function (error) {
    if (error) {
      console.error("Child process error");
      console.error(error.message);
    }
  });

  childProcess.on("message", function (data) {
    response.send(`The final count is : ${data}`); // send() also ends the response
  });
});

Now we’ll focus on our child process, the my-heavy-process.js file. There, we create a listener to capture the incoming data from the /slow endpoint.

// my-heavy-process.js
process.on("message", function (data) {
  // data will process here
});

If you remember, the /slow endpoint sends the string “start” to the child process, so we first check whether the incoming data equals “start”. If it does, we run the heavy loop; otherwise we skip it. Either way, we send the result back to the /slow endpoint and then kill the child process.

// my-heavy-process.js
let count = 0;

if (data === "start") {
  const fiveBillion = 5_000_000_000;

  for (let i = 0; i < fiveBillion; i++) {
    count += i;
  }
}

So, the full code will be like this.

// my-heavy-process.js
process.on("message", function (data) {
  let count = 0;

  if (data === "start") {
    const fiveBillion = 5_000_000_000;

    for (let i = 0; i < fiveBillion; i++) {
      count += i;
    }
  }

  process.send(count);

  process.exit(); // <-- kill child process
});

Now if we restart the application, the / endpoint will no longer wait for the process on the /slow endpoint.

sample-1


Conclusion

To put it simply: our parent process (index.js) hands the heavy work off to another process (the child) to execute. There are actually several ways to solve the sequential-processing problem: we can use Clustering, Child Processes, or Worker Threads. If you want to know more about the differences between them, read here.

Thank you for reading 🙂