Dockerize Django Postgres

March 21, 2021
PPL Uni Life Docker Django PostgreSQL

Hi friends! Have you ever wondered how to run an application without installing its dependencies on your local machine? We can do exactly that by building and containerizing our applications with Docker Engine. In this post, I’m going to show you, step by step, how I built and containerized (a.k.a. dockerized) Industri Pilar’s backend, which runs on Django and PostgreSQL.

Wait, what is Docker?

Taken from the Docker Docs, Docker is an open platform for developing, shipping, and running applications. It lets us separate our applications from our infrastructure by packaging and running each application in a loosely isolated environment called a container. This isolation and security allow us to run many containers simultaneously on a given host. Containers are lightweight and contain everything needed to run the application, so we don’t need to rely on what is currently installed on the host. Problems caused by developing on different operating systems also disappear, because everyone runs the same container and it behaves the same way everywhere.
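
If you already have Docker installed, you can see this in action with a quick one-liner (purely illustrative, not part of the Industri Pilar setup): the command below prints the Python version from inside a throwaway python:3.8-slim-buster container, even if Python 3.8 is not installed on the host.

  $ docker run --rm python:3.8-slim-buster python --version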

How to build the Docker container?

 

[Image] Industri Pilar Architecture Overview

 

For Industri Pilar’s backend, we use the Django framework with a PostgreSQL database to build the REST API. We containerize the application separately from the database, so we need Docker Compose to define and run this multi-container Docker application. To build the Docker image, we first create a Dockerfile in our project’s root folder that defines how the container is built.
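
For orientation, the project root would roughly look like this once all the files from this post are in place (an assumed layout for illustration; Industri Pilar’s actual tree contains more apps and files):

  <project-root>/
  ├── Dockerfile
  ├── docker-compose.yml
  ├── wait-for-it.sh
  ├── .env
  ├── requirements.txt
  ├── manage.py
  └── <Django project and app folders>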

Dockerfile

# Use a slim Python 3.8 base image
FROM python:3.8-slim-buster
# Send Python output straight to the terminal without buffering
ENV PYTHONUNBUFFERED=1
# Work from /code inside the container
WORKDIR /code
# Copy and install the Python dependencies first so this layer is cached
COPY requirements.txt /code/
RUN pip3 install -r requirements.txt
# Copy the project source into /code
COPY . /code/
# Also place wait-for-it.sh at the container root and make it executable
COPY wait-for-it.sh /wait-for-it.sh
RUN chmod +x /wait-for-it.sh
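
The Dockerfile expects two files next to it that the rest of this post relies on: requirements.txt and wait-for-it.sh. As a rough sketch, the requirements for a Django REST API on PostgreSQL could look like this (the versions are illustrative, not Industri Pilar’s actual pins):

requirements.txt

Django>=3.1,<3.2
djangorestframework>=3.12,<3.13
psycopg2-binary>=2.8,<2.9

wait-for-it.sh is a small standalone bash script that blocks until a given host:port is reachable. Assuming you use the well-known vishnubob/wait-for-it script from GitHub, you can fetch it into the project root with:

  $ curl -o wait-for-it.sh https://raw.githubusercontent.com/vishnubob/wait-for-it/master/wait-for-it.sh
  $ chmod +x wait-for-it.sh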

 

After that, we need to define a docker-compose.yml in our project’s root folder to configure the multi-container Docker application.

docker-compose.yml

version: "3.9"
   
services:
  db:
    image: postgres:13
    environment:
      - POSTGRES_DB=<database-name>
      - POSTGRES_USER=<database-user>
      - POSTGRES_PASSWORD=<database-password>
      - POSTGRES_HOST=<database-host>
  web:
    build: .
    # wait for Postgres to accept connections before migrating and serving
    command: bash -c "
        /wait-for-it.sh db:5432 &&
        python manage.py makemigrations &&
        python manage.py migrate &&
        python manage.py runserver 0.0.0.0:8000"
    volumes:
      - ./wait-for-it.sh:/wait-for-it.sh
      - .:/code
    ports:
      - "8000:8000"
    depends_on:
      - db
    env_file: .env

 

In this docker-compose.yml, we tell the database container, named db, to use the postgres:13 image from the Postgres Docker Official Images, and we define the environment variables the database needs. For the Django application container, named web, we build the image from the Dockerfile we created earlier. Its command first waits for the database on port 5432 to be reachable using the wait-for-it bash script, then makes and applies the migrations, and finally runs the Django development server on port 8000. The volumes part mounts the listed files and directories into the container as volume data. Don’t forget the depends_on part on the web container, which tells Compose that web depends on db; note that depends_on only controls start order, which is why we still need wait-for-it to wait until PostgreSQL actually accepts connections. Finally, env_file tells the container the name of the environment variables file that the web application needs.
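
Two pieces are referenced here but not shown: the .env file that env_file points at, and the Django settings that actually read these variables. Here is a minimal sketch of both, assuming the variable names above (Industri Pilar’s real files may differ); note that from inside the web container the database is reached through the Compose service name db, not localhost.

.env

POSTGRES_DB=<database-name>
POSTGRES_USER=<database-user>
POSTGRES_PASSWORD=<database-password>
POSTGRES_HOST=db
POSTGRES_PORT=5432

settings.py (database section only)

import os

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": os.environ.get("POSTGRES_DB"),
        "USER": os.environ.get("POSTGRES_USER"),
        "PASSWORD": os.environ.get("POSTGRES_PASSWORD"),
        "HOST": os.environ.get("POSTGRES_HOST", "db"),
        "PORT": os.environ.get("POSTGRES_PORT", "5432"),
    }
}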

After that, we just need to make sure the environment variables file named .env exists in our project’s root folder and then run Compose with sudo docker-compose up to build the image, create the containers, and start them. If everything is running properly, it will look like this:

 

[Image] Industri Pilar's backend running container

 

In another terminal, you can check the running containers and their IDs:

  $ docker ps

To run collectstatic, run:

  $ docker exec -it <web-container-id> python manage.py collectstatic --noinput

To create or update the API configuration in the database, run:

  $ docker exec -it <web-container-id> python manage.py createorupdateapiconfig

To access the database, run:

  $ docker exec -it <postgres-container-id> psql -U <database-user> -d <database-name> -h db
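
When you are done, you can stop the stack with Ctrl+C and remove the containers and the default network that Compose created with:

  $ sudo docker-compose down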

 

That’s it! We dockerized Industri Pilar’s Django and PostgreSQL backend using Docker Engine and Docker Compose. If you want to explore more about Docker, you can visit the official Docker Docs.

