
Separation of Concerns in Node.js


Key Takeaways

  • Separation of concerns is a design principle that, applied to Node.js architecture, ensures code readability, easy refactoring, and good code collaboration.
  • Using the separation of concerns principle, you can ensure that the final system is stable and maintainable.
  • The principle ensures that components are not duplicated, making the system easier to maintain and refactor.
  • It holds that business logic should be kept separate from controllers, which simplifies the development of thin controllers and the writing of tests.
  • It also aids code reusability, reducing maintenance cost and time by making it simple to determine where a fault is coming from and how to isolate it from the rest of the system.

Best Practices for Creating a Good Node.js Project Architecture

Most of the time, we work in large teams, with different people handling different parts of the system; things can get messy if everything isn't properly arranged and is jumbled together. More teams are working remotely as a result of the pandemic, and having a clear and well-defined code structure has never been more crucial.

Essentially, project structuring is an important topic because how you bootstrap your application can determine the overall development experience throughout the project's life cycle.

The amazing and somewhat frustrating aspect of Node.js is that you can structure your code however you want. There is no "correct way". You have the option of writing all of your code in a single app.js file or creating multiple files and placing them in different folders.

Most developers, however, would recommend structuring your projects by grouping related files together rather than lumping everything into one place. It is preferable to know you can change your models by browsing the models folder, rather than navigating a single file containing models, controllers, loaders, and services.

Why is good project architecture so important?

As previously stated, good project architecture is critical, and messy architecture can be problematic. Here are a few benefits of good architecture:

  • makes code more readable and tidy
  • makes repetition simpler to avoid
  • makes scaling and changes easier
  • simplifies test authoring

Separation of concerns

Separation of concerns is a design principle that divides a software program into sections. Each section addresses a distinct concern: a set of information that affects the program's code.

This concept essentially refers to an architecture pattern in which program logic is separated from program content and presentation. This makes the project easier to maintain and less prone to repetition. It also streamlines changes and team collaboration.

A Node.js project can be organized in a variety of ways, and every organizational method has advantages and disadvantages. At the end of the day, every developer's goal is to create scalable and clean code. Projects that follow this architecture pattern are often structured this way:

└───app.js        # Our application's entry point
└───api           # Contains controllers, routes, middlewares
└───config        # Application configs for development and production
└───loaders       # Contains the startup processes
└───models        # Database models
└───services      # Contains our business logic
└───jobs          # Job definitions (if you have cron jobs; we don't)
└───subscribers   # Event handlers for async tasks
└───test          # Our program's test files

In order to explain folder structuring and the concept of separation of concerns, we will create a simple authentication REST API. We will be constructing a scalable structure that facilitates team collaboration, utilizing Node.js, Express, and MongoDB. To get started, make sure you have Node.js and MongoDB installed.

Our example application is a simple REST API for authentication. When a user registers, their information is saved in our MongoDB database. When a user attempts to login, we verify their information and return a token if they are verified. While building this, we will implement a scalable project structure and see what is required to achieve it. 
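Assuming a fresh setup, the project and its dependencies can be scaffolded roughly as follows. This is a sketch: the package names are taken from the imports used throughout the article's code, while the project name and directory are arbitrary.

```shell
# create a project folder and initialize a package.json
mkdir node-auth-api && cd node-auth-api
npm init -y

# runtime dependencies used in the article's code
npm install express mongoose bcryptjs jsonwebtoken celebrate dotenv lodash \
    morgan helmet cors xss-clean express-rate-limit express-mongo-sanitize
```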

Setting up our project folders

Our application will be structured in the following manner:

  • All files and logic will be kept in a single folder called src.
  • The app entry and startup are taken care of by our server.js and app.js scripts.
  • The api folder comprises subfolders for controllers, middlewares, routes, and repositories, which are mostly used to handle tasks like data transmission, request processing, and validation.
  • Our configuration folder, config, contains information about how our development and production environments are managed.
  • The loaders folder contains the actions the program performs when it first launches. This comprises our database loader, which tells our database to start, and our express loader, which executes our express app.
  • The models folder contains files that describe the shape of the data transferred to or received from the database.
  • The services folder contains reusable business logic that handles tasks such as data processing, implementing unique business logic, calling the database, etc.
  • The utils folder includes files such as helpers, validators, error handlers, constants, etc. Other files in the application may call these files in order to help with an operation.

Utils Folder - Helper files

These files support other portions of the application. Because they have a reusable structure, several files or modules use them to validate or transform a request or piece of data. For instance, a helper function that verifies emails are written in a valid format can be reused wherever an email is accepted, such as sign-up or login.

Our utils folder contains four files: 

  • validator.js 
  • helpers.js
  • error_handler.js
  • error_response.js


The validator.js file contains a method called signupValidator, which verifies that the required arguments are supplied and passed correctly. For instance, we verify that the name and email are supplied and that the password is in the format we desire (at least 8 characters and a mix of alphanumeric and special characters).

// utils/validator.js
import { celebrate, Joi, Segments } from 'celebrate';

export default class Validator {
    static signupValidator = celebrate({
        [Segments.BODY]: Joi.object().keys({
            name: Joi.string().required(),
            email: Joi.string().email().required().trim().lowercase(),
            password: Joi.string()
                .regex(/^(?=.*[a-z])(?=.*[A-Z])(?=.*\d)[a-zA-Z\d\w\W]{8,}$/)
                .required()
                .label('Password')
                .messages({
                    "string.min": "{#label} must have at least 8 characters",
                    "string.pattern.base": "{#label} must include at least eight characters, one uppercase and one lowercase letter, and one number"
                })
        })
    });
}

The helpers.js file contains functions that manage the format of our JSON responses, the hashing of our passwords, the generation of random strings, and more. It simply gathers functions utilized by many services; instead of building these functions inside your services, import them as needed to keep code clean and accelerate development.

// utils/helpers.js
import bcrypt from 'bcryptjs';
import crypto from 'crypto';

const ENCRYPTION_KEY = "(some_r**n_5_str_$$8276_-yuiuj6]"; // Must be 256 bits (32 characters)
const IV_LENGTH = 16; // For AES, this is always 16

export class JsonResponse {
    constructor(statusCode = 200) {
        this.statusCode = statusCode;
    }

    error = (res, message, data) => {
        return res.status(this.statusCode).json({
            status: false,
            message,
            data
        });
    }

    success = (res, message, data) => {
        return res.status(this.statusCode).json({
            status: true,
            message,
            data
        });
    }
}

// simple Fisher-Yates shuffle used by randomString
const shuffle = (arr) => {
    for (let i = arr.length - 1; i > 0; i--) {
        const j = Math.floor(Math.random() * (i + 1));
        [arr[i], arr[j]] = [arr[j], arr[i]];
    }
    return arr;
};

export const randomString = (length) => {
    let numbers = "0123456789";
    let chars = "acdefhiklmnoqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXY";

    let randomstring = '';
    let randomstring2 = '';

    for (let x = 0; x < Math.floor(length / 2); x++) {
        let rnum = Math.floor(Math.random() * chars.length);
        randomstring += chars.substring(rnum, rnum + 1);
    }

    for (let y = 0; y < Math.floor(length / 2); y++) {
        let rnum2 = Math.floor(Math.random() * numbers.length);
        randomstring2 += numbers.substring(rnum2, rnum2 + 1);
    }

    let finalString = (randomstring + randomstring2).split('');
    return shuffle(finalString).join('');
};

// compare a plain string against a bcrypt hash
export const compareHash = (string, hash) =>, hash);

// hash a string with a freshly generated salt
export const hashString = async function (string) {
    const salt = await bcrypt.genSalt(10);
    return await bcrypt.hash(string, salt);
};

export const encryptData = data => {
    let iv = crypto.randomBytes(IV_LENGTH);
    let cipher = crypto.createCipheriv('aes-256-cbc', Buffer.from(ENCRYPTION_KEY), iv);
    let encrypted = cipher.update(data);
    encrypted = Buffer.concat([encrypted,]);
    return iv.toString('hex') + ':' + encrypted.toString('hex');
};

export const decryptData = data => {
    let textParts = data.split(':');
    let iv = Buffer.from(textParts.shift(), 'hex');
    let encryptedText = Buffer.from(textParts.join(':'), 'hex');
    let decipher = crypto.createDecipheriv('aes-256-cbc', Buffer.from(ENCRYPTION_KEY), iv);
    let decrypted = decipher.update(encryptedText);
    decrypted = Buffer.concat([decrypted,]);
    return decrypted.toString();
};

The error_response.js file defines the error response structure. For instance, you can call this in the catch block whenever you build a try-catch and provide the necessary parameters, such as the status, data, and message. Instead of declaring an error structure everywhere, you can reuse this class. It is crucial to report errors accurately, since it helps both the user and the developer consuming the API to understand the issue at hand.

// utils/error_response.js
export default class ErrorResponse extends Error {
    constructor(message, status) {
        super(message);
        this.status = status;
    }
}


As its name suggests, error_handler.js contains functions that handle different error conditions. For instance, it deals with 404 errors, duplicate fields in our database, and server issues.

// utils/error_handler.js
import ErrorResponse from './error_response';
import { isCelebrateError } from 'celebrate';

const errorHandler = (err, req, res, next) => {
    let error = { ...err };
    error.message = err.message;

    // celebrate validation error
    if (isCelebrateError(err)) {
        const errorBody = err.details.get('body');
        if (errorBody) {
            const { details: [errorDetails] } = errorBody;
            error = new ErrorResponse(errorDetails.message, 400);
        } else {
            error = new ErrorResponse("Invalid payload sent, review and try again", 400);
        }
    }

    // mongoose duplicate key error
    if (err.code === 11000) {
        error = new ErrorResponse("Field already exists or duplicate value encountered", 400);
    }

    // mongoose cast error (invalid ObjectId etc.)
    if ( === "CastError") {
        error = new ErrorResponse("Invalid parameter passed", 400);
    }

    // mongoose validation error
    if ( === "ValidationError") {
        const message = Object.values(err.errors).map(val => val.message);
        error = new ErrorResponse(message, 400);
    }

    res.status(error.status || 500).json({
        status: false,
        message: error.message || "Server error! request not completed",
        data: {}
    });
};

export default errorHandler;

Config Folder - Environment Management

Most of the time, we have environment variables that differ between environments. For instance, if we are working locally in our development environment, our MongoDB URI will most likely begin with localhost, whereas our production environment will contain a link to an Atlas database. Therefore, it is wise to handle these differences with care. Our config folder will contain three files: dev.js (for the development environment), prod.js (for the production environment), and an index.js file, where they get imported. Additionally, the index.js file has a switch case that determines which file gets utilized depending on the environment.

Don't forget to make a .env file with all the variables you require.
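For reference, a minimal .env might look like the following. The variable names come from the config files in this article; the values are placeholders you must replace with your own:

```
# .env (placeholder values; never commit real secrets)
NODE_ENV=development
PORT=4002
MONGO_URI_DEV=mongodb://localhost:27017/auth-demo
JWT_SECRET_DEV=some-long-random-dev-secret
MONGO_URI=<your Atlas connection string>
JWT_SECRET=<a long random production secret>
```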


// config/dev.js
import dotenv from 'dotenv';
dotenv.config();

export const config = {
  secrets: {
    jwt: process.env.JWT_SECRET_DEV,
    jwtExp: '100d'
  },
  dbUrl: process.env.MONGO_URI_DEV,
};


// config/prod.js
import dotenv from 'dotenv';
dotenv.config();

export const config = {
  secrets: {
    jwt: process.env.JWT_SECRET,
    jwtExp: '7d'
  },
  dbUrl: process.env.MONGO_URI,
};


// config/index.js
import { merge } from 'lodash';

const env = process.env.NODE_ENV || 'development';
const port = process.env.PORT || 4002;

const baseConfig = {
    env,
    port,
    isDev: env === 'development',
};

let envConfig = {};

switch (env) {
    case 'dev':
    case 'development':
        envConfig = require('./dev').config;
        break;
    case 'prod':
    case 'production':
        envConfig = require('./prod').config;
        break;
    default:
        envConfig = require('./dev').config;
}

export default merge(baseConfig, envConfig);

Loaders Folder

The loaders folder contains files required for the initialization of specific functions. For example, we have an express and a database loader that start the express app and database, respectively. 

The idea is to divide the application's startup process into testable components. The various loaders are imported into an index.js file in the loaders folder, which makes them available to other files.


// loaders/db-loader.js
import mongoose from 'mongoose';
import dotenv from 'dotenv';
import options from '../config';

dotenv.config({ path: __dirname + '/.env' });

export default (url = options.dbUrl, opts = {}) => {
  let dbOptions = { ...opts, useNewUrlParser: true, useUnifiedTopology: true };
  mongoose.connect(url, dbOptions);
  const conn = mongoose.connection;
  return conn;
};


// loaders/express-loader.js
import morgan from 'morgan';
import mongoSanitize from 'express-mongo-sanitize';
import rateLimit from 'express-rate-limit';
import helmet from 'helmet';
import xss from 'xss-clean';
import cors from 'cors';
import ErrorResponse from '../utils/error_response';
import errorHandler from '../utils/error_handler';

// import routes
import apiRoutes from '../api/routes';

const apiLimiter = rateLimit({
    windowMs: 20 * 60 * 1000, // 20 minutes
    max: 100, // Limit each IP to 100 requests per `window` (here, per 20 minutes)
    standardHeaders: true, // Return rate limit info in the `RateLimit-*` headers
    legacyHeaders: false, // Disable the `X-RateLimit-*` headers
    handler: (_request, res, _next) => res.status(429).json({
        status: false,
        message: "Too many requests, please try again later."
    })
});

export default ({ app, express }) => {
    app.use(express.json());
    app.use(express.urlencoded({ extended: true }));

    // Dev logging middleware
    if (process.env.NODE_ENV === 'development') {
        app.use(morgan('dev'));
    }

    app.enable('trust proxy');

    // sanitize inputs and add secure headers
    app.use(mongoSanitize());
    app.use(helmet());
    app.use(xss());
    app.use(cors());

    app.get('/ip', (request, response) => response.send(request.ip));
    app.use('/api/v1', apiLimiter, apiRoutes);

    // unknown routes and centralized error handling
    app.use((_req, _res, next) => next(new ErrorResponse('Route not found', 404)));
    app.use(errorHandler);

    return app;
};

// loaders/index.js
import dbConnect from './db-loader';
import expressLoader from './express-loader';

export default async ({ app, express }) => {
    const connection = dbConnect();
    console.log('MongoDB has been initialized');
    expressLoader({ app, express });
    console.log('Express app has been initialized');
    return connection;
};

Entry Files

Our app's entry point is app.js. It is common practice to put significant amounts of code here, but separation of concerns ensures that all logic is kept apart. We will create two entry points, namely server.js and app.js. In our server.js file, we import our loaders and configuration files and begin listening on our PORT. Our app.js file simply imports our server.js file. So, technically, when our server runs the application, it reaches app.js, which starts the functions specified in server.js.


// server.js
import express from 'express';
import dotenv from 'dotenv';
import appLoader from './loaders';
import appConfig from './config';

export const app = express();
dotenv.config({ path: __dirname + '/.env' });

export const start = async () => {
  try {
    await appLoader({ app, express });
    app.listen(appConfig.port, () => {
      console.log(`REST API on http://localhost:${appConfig.port}/api/v1`);
    });
  } catch (e) {
    console.error(e);
  }
};


// app.js
import { start } from './server';

process.on('unhandledRejection', (err, _) => {
    console.log(`Server error: ${err}`);
});

start();

So far, when we run our application, we get a message saying that our app is running on our preferred port, that the express server has started, and that our database has been successfully connected.


Then there are the models, which are simply interfaces between our application and the database. They structure the data that we pass around our application. As a result, we'll make two files in our models folder: a user model file and an index.js file into which we import every other model.


// models/user.model.js
import mongoose from 'mongoose';
import bcrypt from 'bcryptjs';
import { sign } from 'jsonwebtoken';
import config from '../config';

const UserSchema = new mongoose.Schema({
    name: {
        type: String,
        trim: true,
        required: [true, "Name is required"]
    },
    email: {
        type: String,
        trim: true,
        unique: true,
        match: [/^(([^<>()[\]\\.,;:\s@"]+(\.[^<>()[\]\\.,;:\s@"]+)*)|(".+"))@((\[[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\])|(([a-zA-Z\-0-9]+\.)+[a-zA-Z]{2,}))$/, "Please enter a valid email"],
        required: [true, "Email is required"]
    },
    password: {
        type: String,
        select: false,
        required: [true, "Password is required"]
    },
    created_at: {
        type: Date,
    }
});

// hash the password before it is saved
UserSchema.pre('save', async function (next) {
    if (!this.isModified('password')) return next();
    this.password = await bcrypt.hash(this.password, await bcrypt.genSalt(10));
    next();
});

// instance helpers used by the auth service
UserSchema.methods.comparePassword = function (password) { return, this.password); };
UserSchema.methods.getJwtToken = function () { return sign({ result: { id: this._id } }, config.secrets.jwt, { expiresIn: config.secrets.jwtExp }); };
UserSchema.methods.toMap = function () { const { password, ...user } = this.toObject(); return user; };

export default mongoose.model('User', UserSchema);


// models/index.js
import User from './user.model';
export { User };


Services handle things like data manipulation, database calls, and other business logic. Separating app services from controllers is a separation of concerns technique. The service layer contains business-related logic and nothing related to the HTTP layer. This technique allows for easier test writing, refactoring, and thinner controllers. Services implement our application's logic and communicate with the database via the data access layer before returning a necessary response to the controller. We made a simple auth service file that contains our signin and signup logic.


// services/auth.js
import { User } from '../models';
import ErrorResponse from '../utils/error_response';

export default class AuthService {
    // user registration
    async signup(data) {
        const { email, password, name } = data;
        // find user by email (case-insensitive)
        let query = { $or: [{ email: { $regex: email, $options: 'i' } }] };
        const hasEmail = await User.find(query);
        // throw an error if the email is already taken
        if (hasEmail.length > 0) { throw new ErrorResponse('Email already exists', 400); }
        const user = await User.create({ email, password, name });
        return user;
    }

    async signin(data) {
        let { email, password } = data;
        let query = { $or: [{ email: { $regex: email, $options: 'i' } }] };

        // find user by email
        const user = await User.findOne(query).select('+password');
        // throw an error if the user is not found
        if (!user) { throw new ErrorResponse('Invalid credentials', 401); }

        // check the user's password
        const isMatch = await user.comparePassword(password);
        if (!isMatch) { throw new ErrorResponse('Invalid credentials', 401); }

        return {
            user: user.toMap(),
            token: user.getJwtToken(),
        };
    }
}


Finally, we have our api folder, which contains three other important folders: controllers, routes, and middleware, which we will go over individually.


Middlewares are in charge of handling various validation or other general checks in an application. We'll make two files, async_handler.js and auth_handler.js, to handle res (response) and req (request) objects, as well as user authorization.


// api/middlewares/async_handler.js
export const asyncHandler = fn => (req, res, next) => Promise.resolve(fn(req, res, next)).catch(next);


// api/middlewares/auth_handler.js
import { verify } from 'jsonwebtoken';
import ErrorResponse from '../../utils/error_response';
import { asyncHandler } from './async_handler';
import config from '../../config';

export const userAuth = asyncHandler(
    async (req, res, next) => {
        let authHeader = req.headers.authorization;
        let token = authHeader && authHeader.startsWith('Bearer') && authHeader.split(' ')[1];

        if (!token) {
            return next(new ErrorResponse('Unauthorized access', 401));
        }
        try {
            const decoded = verify(token, config.secrets.jwt);
            req.user = decoded.result;
            next();
        } catch (e) {
            return next(new ErrorResponse('Unauthorized access', 401));
        }
    }
);


Controllers receive requests and call the required service; the service communicates with the database via the data access layer and returns its result to the controller, which delivers it to the client. We will create a file called index.js in our controllers folder that contains our signin and signup controllers. These controllers wrap the res and req objects with the asyncHandler from async_handler.js and pass requests on to the various services.


// api/controllers/index.js
import { asyncHandler } from '../middlewares/async_handler';
import { JsonResponse } from '../../utils/helpers';
import AuthService from '../../services/auth';

export default class IndexController {
    constructor() {
        this.authService = new AuthService();
    }

    index = asyncHandler(
        async (req, res, _) => {
            res.json({ status: true, message: "Base API Endpoint." });
        }
    );

    loginUser = asyncHandler(
        async (req, res, _) => {
            const { user, token } = await this.authService.signin(req.body);
            return new JsonResponse().success(res, "User logged in successfully", { user, token });
        }
    );

    registerUser = asyncHandler(
        async (req, res, _) => {
            await this.authService.signup(req.body);
            return new JsonResponse(201).success(res, "User account created successfully", {});
        }
    );
}




Routes simply define how our application should respond to HTTP requests from clients. It is the portion of our program's code related to HTTP verbs. Middleware may or may not protect these routes. Routes' primary function is to handle requests as they arrive.

For example, a POST route expects data to be posted or passed to it.

In our routes folder, we've created an index.js file that contains all of the routes required to access the platform's various services. Routes receive a request and forward it to the appropriate controller, which carries out the rest of the flow and returns the response.


import { Router } from 'express';
import Validator from '../../utils/validator';
import IndexController from '../controllers';

const router = Router();
// import all controllers
let indexController = new IndexController();

// register all routes
router.get('/', indexController.index);'/login', indexController.loginUser);'/register', Validator.signupValidator, indexController.registerUser);

//export the base router
export default router;


Every developer should strive for clean, readable, and reusable code, which makes it easier to refactor, collaborate with others, test, and make fewer mistakes. There are various approaches to designing an API architecture, and there are many right ways; whatever you choose, make sure that scalability and readability are your top considerations.

We do, however, recommend the separation of concerns technique because, as you can see, it has numerous advantages. It has proven useful in building projects regardless of project complexity or team size. You don't want anything to go wrong in production!
