CRUD API using Node, Sequelize, Postgres and Docker

In this article, we will build a CRUD Node API with Sequelize, Postgres and Docker

GitHub Repository: github.com/FrancescoXX/docker-nsp

NODE


Node is a back-end JavaScript runtime environment, which briefly means it can execute JavaScript code on a computer, for example yours or the one where Node is installed. The good thing is that, by having Docker, you DON'T actually need to install it: we will use the official Node image, which also avoids version mismatches between the Node installed on my machine and on yours

POSTGRES


Postgres (PostgreSQL) is a free, open-source relational database, very popular and stable

SEQUELIZE


Sequelize is a promise-based Object-Relational Mapping (ORM) library for Node. ORM is a technique for converting data between incompatible type systems using object-oriented programming languages.

It allows us to create and modify tables inside a database without writing SQL commands.

It also works with MySQL, MariaDB, SQLite, and Microsoft SQL Server.
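To make the idea concrete, here is a toy sketch (plain JavaScript, NOT Sequelize's actual API; the `define` helper below is hypothetical) of what an ORM does under the hood: it turns an object description of a table into the SQL we would otherwise write by hand.

```javascript
// Toy illustration of the ORM idea: an object description of a table
// is translated into SQL for us (hypothetical helper, NOT Sequelize's API).
function define(tableName, attributes) {
  const columns = Object.entries(attributes)
    .map(([name, opts]) => {
      let col = `${name} ${opts.type}`;
      if (opts.primaryKey) col += ' PRIMARY KEY';
      if (opts.allowNull === false) col += ' NOT NULL';
      return col;
    })
    .join(', ');
  return `CREATE TABLE ${tableName} (${columns});`;
}

const sql = define('users', {
  id: { type: 'SERIAL', primaryKey: true },
  username: { type: 'VARCHAR(255)', allowNull: false },
});

console.log(sql);
// CREATE TABLE users (id SERIAL PRIMARY KEY, username VARCHAR(255) NOT NULL);
```

Real Sequelize does much more (types, validation, associations, dialect differences), but the principle is the same.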

DOCKER


Docker is a platform to build, run, and share applications using the idea of containers. If you want a brief introduction, here is a short video:


youtu.be/eN_O4zd4D9o

Step by Step

  1. Create a folder named docker-nsp (which stands for node, sequelize, postgres) and enter it

    mkdir docker-nsp && cd docker-nsp

  2. Initialize the node application using npm

    npm init -y

  3. Install the dependencies (body-parser is used by our index.js; pg-hstore is required by Sequelize's postgres dialect)

    npm install express body-parser pg pg-hstore sequelize

  4. Create the structure (note: the folder is named util, as that is the path our code will require)

    mkdir app
    cd app
    mkdir controllers models routes util

Then create an index.js file

Our folder structure should look like this:

    app
    ├── controllers
    ├── models
    ├── routes
    ├── util
    └── index.js

Let's write the index.js file

const express = require('express');
const bodyParser = require('body-parser');

const sequelize = require('./util/database'); //database initializations
const User = require('./models/users'); //REQUIRED even if IDE says not used!

//INITIALIZE APP WITH EXPRESS
const app = express();

//BODYPARSER
app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: true }));

//Set proper Headers on Backend
app.use((req, res, next) => {
  res.setHeader('Access-Control-Allow-Origin', '*');
  res.setHeader('Access-Control-Allow-Methods', 'GET, POST, PUT, DELETE');
  res.setHeader('Access-Control-Allow-Headers', 'Content-Type, Authorization');
  next();
});

//ROUTES
app.use('/dev', require('./routes/dev')); //All test routes are placed here
app.use('/users', require('./routes/users')); //users crud

(async () => {
  try {
    await sequelize.sync(
      { force: true } //Reset db every time
    );
    app.listen(process.env.EXTERNAL_PORT); //Defined in docker-compose.yml
  } catch (error) {
    console.log(error);
  }
})();
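One caveat: when the app starts together with the database (as it will in Docker later), Postgres may not be accepting connections yet by the time sequelize.sync() runs. A simple retry loop around the sync helps. Here is a generic sketch (the `retry` helper is hypothetical, not part of Sequelize or of this article's code):

```javascript
// Hypothetical retry helper: try an async operation a few times
// before giving up (useful when the DB container is still starting).
async function retry(fn, attempts = 5, delayMs = 1000) {
  for (let i = 1; i <= attempts; i++) {
    try {
      return await fn();
    } catch (error) {
      if (i === attempts) throw error;
      console.log(`Attempt ${i} failed, retrying in ${delayMs}ms...`);
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}

// In index.js the usage would look like:
//   await retry(() => sequelize.sync({ force: true }));

// Demo with a fake operation that fails twice, then succeeds:
let calls = 0;
retry(async () => {
  calls += 1;
  if (calls < 3) throw new Error('DB not ready');
  return 'connected';
}, 5, 10).then((result) => console.log(result)); // prints "connected"
```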

Inside the util folder, let's create the connection to the Postgres DB:

cd util

Create a database.js file:

const Sequelize = require('sequelize');

//GET ENV VARIABLES (they will be set in docker-compose.yml)
const sequelize = new Sequelize(
    process.env.PGDATABASE,
    process.env.PGUSER,
    process.env.PGPASSWORD,
    {
        host: process.env.PGHOST,
        dialect: 'postgres'
    });

module.exports = sequelize;
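All the connection values come from environment variables, which we will set in the Docker part. If you also want to run the app outside Docker, a common pattern is to fall back to local defaults when a variable is unset. A minimal sketch (the `envOr` helper is hypothetical; variable names and values match the ones used in this article):

```javascript
// Read a config value from the environment, with a fallback for local runs.
function envOr(name, fallback) {
  const value = process.env[name];
  return value !== undefined && value !== '' ? value : fallback;
}

// Same variables that util/database.js reads:
const config = {
  database: envOr('PGDATABASE', 'nps_database'),
  user: envOr('PGUSER', 'francesco'),
  host: envOr('PGHOST', 'localhost'),
};

console.log(config.host); // "localhost" unless PGHOST is set
```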

Inside the routes folder, let's create a couple of files to handle HTTP requests: dev.js and users.js

The dev.js file inside the routes folder:

const controller = require('../controllers/dev'); 
const router = require('express').Router();

router.get('/config', controller.getConfig);
router.get('/version', controller.getVersion);
router.get('/seq', controller.seq); //test sequelize connection

module.exports = router;

the users.js file inside the routes folder:

const controller = require('../controllers/users');
const router = require('express').Router();

//Model-agnostic CRUD routes.
//Keep the parameterized '/:id' routes last, so they don't capture other URLs
router
  .get('/', controller.getAll)
  .get('/:id', controller.getOne)
  .post('/', controller.createOne)
  .put('/:id', controller.updateOne)
  .delete('/:id', controller.deleteOne);

module.exports = router;
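Why keep the parameterized routes last? Express matches routes in registration order, so a route like GET /:id would also match a literal path such as /config if it were registered first. A tiny sketch of the idea (plain JavaScript, not Express internals):

```javascript
// Simplified first-match routing: shows why '/:id' must come last.
function matchFirst(routes, path) {
  for (const route of routes) {
    if (route === path) return route;          // literal match
    if (route.startsWith('/:')) return route;  // parameter matches anything
  }
  return null;
}

// '/:id' registered first swallows every request:
console.log(matchFirst(['/:id', '/config'], '/config')); // "/:id"

// Literal routes first, '/:id' last, behaves as intended:
console.log(matchFirst(['/config', '/:id'], '/42')); // "/:id"
```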

Inside the models folder, let's create a users.js file to use as the model for our users:

const Sequelize = require('sequelize');
const db = require('../util/database');

const User = db.define('users', {
    id: {
        type: Sequelize.INTEGER,
        autoIncrement: true,
        allowNull: false,
        primaryKey: true
    },
    username: {
        type: Sequelize.STRING,
        allowNull: false,
        unique: true
    },
    email: {
        type: Sequelize.STRING,
        allowNull: false,
    },
    password: {
        type: Sequelize.STRING,
        allowNull: false
    }
});

module.exports = User;
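The allowNull and unique options become NOT NULL and UNIQUE constraints on the users table, and Sequelize also adds createdAt and updatedAt columns by default (we will see them later when we query the table). As a plain-JavaScript sketch of what allowNull: false enforces (a hypothetical check for illustration, not Sequelize code):

```javascript
// Hypothetical sketch of the allowNull: false checks the model declares.
const requiredFields = ['username', 'email', 'password'];

function validateUser(user) {
  const missing = requiredFields.filter(
    (field) => user[field] === undefined || user[field] === null
  );
  return { valid: missing.length === 0, missing };
}

console.log(validateUser({ username: 'francesco', email: 'a@b.c', password: 'x' }));
// { valid: true, missing: [] }

console.log(validateUser({ username: 'francesco' }).missing);
// [ 'email', 'password' ]
```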

Inside the controllers folder, let's create a couple of files, named dev.js and users.js

The dev.js file inside the controllers folder:

const packJson = require('../../package.json');
const sequelize = require('../util/database');

// [GET] ../dev/config
exports.getConfig = (req, res, next) => {
  return res.status(200).json({ packJson });
};

// [GET] ../dev/version
exports.getVersion = (req, res, next) => {
  return res.status(200).json({ 'nps Backend': packJson.version });
};

// [GET] ../dev/seq
exports.seq = async (req, res, next) => {
  try {
    await sequelize.authenticate();
    console.log('Sequelize Connection established');
    return res.status(200).json('Sequelize Connection established');
  } catch (error) {
    next(error);
  }
};

The users.js file inside the controllers folder:

const User = require("../models/users");

/**
 * CRUD CONTROLLERS
 */

//CREATE-ONE
exports.createOne = async (req, res, next) => {
    console.log("createOne: [POST] /users/");
    try {
        //note: only fields defined in the User model
        const USER_MODEL = {
            username: req.body.username,
            email: req.body.email,
            password: req.body.password,
        }

        try {
            const user = await User.create(USER_MODEL);
            console.log("OK createOne USER: ", user);
            return res.status(201).json(user);
        } catch (error) {
            console.log('ERROR in createOne ' + "USER:", error);
            return res.status(500).json(error);
        }
    } catch (error) {
        return res.status(400).json("Bad Request");
    }
};
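A note on building USER_MODEL by hand: copying fields one by one from req.body is a deliberate whitelist, so a client cannot inject arbitrary columns. The same idea as a small reusable helper (a hypothetical `pick` function, not part of the article's code):

```javascript
// Hypothetical helper: copy only the allowed fields from a request body.
function pick(source, allowedKeys) {
  const result = {};
  for (const key of allowedKeys) {
    if (source[key] !== undefined) result[key] = source[key];
  }
  return result;
}

const body = { username: 'francesco', password: 'x', isAdmin: true };
console.log(pick(body, ['username', 'email', 'password']));
// { username: 'francesco', password: 'x' }  -- isAdmin is dropped
```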

//GET-ALL
exports.getAll = async (req, res, next) => {
    console.log("getAll: [GET] /users/");
    try {
        const ALL = await User.findAll();
        console.log("OK getAll USER: ", ALL.map(el => el.dataValues));
        return res.status(200).json(ALL);
    } catch (error) {
        console.log('ERROR in getAll ' + "USER:", error);
        return res.status(500).json(error);
    }
};

//GET-ONE
exports.getOne = async (req, res, next) => {
    console.log("getOne: [GET] /users/:id");
    try {
        const u = await User.findByPk(req.params.id);
        console.log("OK getOne USER: ", u.dataValues);
        return res.status(200).json(u);
    } catch (error) {
        console.log('ERROR in getOne ' + "USER:", error);
        return res.status(500).json(error);
    }
};

//UPDATE-ONE.
exports.updateOne = async (req, res, next) => {
    console.log("updateOne: [PUT] /users/:id");
    try {
        //note: only fields defined in the User model
        const USER_MODEL = {
            username: req.body.username,
            email: req.body.email,
            password: req.body.password,
        }

        try {
            const u = await User.update(USER_MODEL, { where: { id: req.params.id } });
            console.log("OK updateOne USER: ", u);
            return res.status(200).json(u);
        } catch (error) {
            console.log('ERROR in updateOne ' + "USER:", error);
            return res.status(500).json(error);
        }
    } catch (error) {
        return res.status(400).json("Bad Request");
    }
};

//DELETE-ONE
exports.deleteOne = async (req, res, next) => {
    console.log("[DELETE] /users/:id");
    try {
        const u = await User.destroy({ where: { id: req.params.id } });
        console.log("OK deleteOne USER: ", u);
        return res.status(200).json(u);
    } catch (error) {
        console.log('ERROR in deleteOne ' + "USER:", error);
        return res.status(500).json(error);
    }
};
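One thing worth knowing: findByPk returns null when no row matches, so getOne as written would throw on u.dataValues and answer 500 for a missing user. A sketch of a friendlier 404 pattern (the fake model and plain-object response below are for illustration only):

```javascript
// Sketch: answer 404 when findByPk finds nothing, instead of crashing.
async function getOneSafe(model, id) {
  const user = await model.findByPk(id);
  if (user === null) {
    return { status: 404, body: 'User not found' };
  }
  return { status: 200, body: user };
}

// Fake model standing in for the Sequelize one:
const fakeModel = {
  findByPk: async (id) => (id === 1 ? { id: 1, username: 'francesco' } : null),
};

getOneSafe(fakeModel, 2).then((res) => console.log(res.status)); // prints 404
```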

DOCKER


Now the Docker Part!

In the main folder, create 3 files:

  • Dockerfile
  • docker-compose.yml
  • .dockerignore (it starts with a dot)

the .dockerignore file:

.git
node_modules
npm-debug.log

the Dockerfile:

FROM node:14

EXPOSE 3001

# Use latest version of npm
RUN npm i npm@latest -g

# set the working directory BEFORE copying files and installing dependencies
WORKDIR /opt

COPY package.json package-lock.json* ./

RUN npm install --no-optional && npm cache clean --force

# copy in our source code last, as it changes the most
COPY . .

CMD [ "node", "app/index.js" ]

The docker-compose.yml file:

version: "3.8"
services:
  nsp_backend:
    container_name: nsp_backend
    image: francescoxx/nsp-template:0.0.2
    build:
      context: .
    ports:
      - "3001:3001"
    environment:
      - EXTERNAL_PORT=3001
      - PGUSER=francesco
      - PGPASSWORD=12345
      - PGDATABASE=nps_database
      - PGHOST=nsp_db # NAME OF THE SERVICE
    depends_on:
      - nsp_db
  nsp_db:
    container_name: nsp_db
    image: "postgres:12"
    ports:
      - "5432:5432"
    environment:
      - POSTGRES_USER=francesco
      - POSTGRES_PASSWORD=12345
      - POSTGRES_DB=nps_database
    volumes:
      - nps_data:/var/lib/postgresql/data
volumes:
  nps_data: {}

replace the image "francescoxx/nsp-template:0.0.2" with an image name of your choice!
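A side note: the password is hardcoded twice in the compose file. docker-compose supports variable substitution from the shell environment (or from an .env file next to docker-compose.yml), so a fragment like this would let you keep the value in one place. This is an optional variation, not part of the article's setup:

```yaml
# Optional variation: read the password from the environment,
# with 12345 as the fallback if the variable is unset.
services:
  nsp_db:
    environment:
      - POSTGRES_PASSWORD=${POSTGRES_PASSWORD:-12345}
```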

Time to build our image!

from the folder where the docker-compose.yml file is located, run

docker-compose build

then let's start the nsp_db service:

docker-compose up -d nsp_db

we should have a Postgres DB up and running!

let's check what is inside the DB:

docker exec -it nsp_db psql -U francesco postgres

and once we are inside the container (you can verify this by checking the postgres=# prompt)

connect to the nps_database

\c nps_database

this means that a database named "nps_database" has been created by postgres, using the environment variable we passed at the beginning

and then:

\dt

and you should get the message:

"Did not find any relations."

This is because we have created the database, using the environment variable, but we haven't created any table or relationship yet

Exit the container:

exit

and you are back at your terminal

Now it's time to run our node application

docker-compose up -d nsp_backend

We can verify that both containers are running with the docker ps -a command


Sequelize will create the users table for us.

Run the command again:

docker exec -it nsp_db psql -U francesco postgres

And now

\dt

you should see the table "users". It has been created by Sequelize when we launched our Node application.

Now, if we execute

select * from users;

we should get an empty table:

nps_database=# select * from users;
 id | username | email | password | createdAt | updatedAt
----+----------+-------+----------+-----------+-----------
(0 rows)

And now the fun part: let's test the API!


POSTMAN


We will use Postman, but you can use whatever tool you want

Let's make a GET request to http://localhost:3001/dev/version

If we get a response, our server is up and running

Let's try to get all the users, with a GET request to /users (the list is still empty for now)

To add a user, we can make a POST request to /users, with username, email and password in the JSON body

Let's add another one, with a second POST request

Now, if we look again at the list of users, we should see both of them

We can also get one user, by adding the id of the user at the end of the URL (for example /users/1)

We can also UPDATE an existing user, with a PUT request to /users/:id

And if we GET the user again, we see the updated values

Now let's DELETE one user, using a DELETE request with the relative id

The server replies with the number of rows deleted (1 in this case)

And if we get all the users again, only the remaining one is shown


Conclusion

If you have tried to follow this article, I would like to know if you encountered any problems. Thanks!

GitHub Repository: github.com/FrancescoXX/docker-nsp
