In this tutorial, you will learn how to run a Postgres database with Docker and Docker Compose, either by building a Docker image locally or by using one from Docker Hub, and then how to connect to the Postgres database running in the container from Python code.
The reader should be familiar with Python and SQL, and understand concepts like defining functions and conditional statements in Python, as well as basic SQL commands such as CREATE and SELECT.
Docker and Docker Compose
A Dockerfile contains all the commands/instructions a user could call on the command line to assemble an image. Docker Compose is a tool for defining and running multi-container Docker applications. With Compose, you use a YAML file to configure your application’s services.
Running Postgres with Docker and Docker Compose makes it very easy to run and maintain, especially in a development environment. In this post, we will look at how to run and use Postgres with Docker and Docker Compose step by step, either by building a Docker image locally or by using one from Docker Hub, keeping things simple and easy. Let’s get started!
Before we dive into the CLI commands and code, here are some prerequisites it is best to have:
- Basic knowledge of Docker will be helpful for executing commands like docker run, docker exec, etc.
- Any prior grasp of Docker Compose would be useful, but is not necessary.
- An intermediate understanding of how relational databases work, especially PostgreSQL, would be highly beneficial.
- Docker Engine installed on your machine.
With those covered, we move on to the next section, where we will run some Docker commands. Get those itchy fingers ready now.
The following are the steps we are going to take to set up our project:
1. Create a project folder (in this tutorial, you can clone the project folder from this GitHub repo).
2. To build the Docker image, run the docker build command.
3. The file main.py contains the Python project's entry point.
4. Once you have finished the project, or want to test-run the code, run the following command in the root directory:
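The build and test-run commands might look like the sketch below. The image tag (postgres-python-demo) and the use of pytest as the test runner are assumptions for illustration, not taken from the repo:

```
# Build the Docker image from the Dockerfile in the project root
# (the tag name is an assumption)
docker build -t postgres-python-demo .

# Run the project's entry point locally
python main.py

# Test-run the code from the root directory (test_util.py relies on pytest)
pytest
```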
PostgreSQL is a powerful, open-source object-relational database system that uses and extends the SQL language, combined with many features that safely store and scale the most complicated data workloads. The image below shows what a Postgres database looks like, but the goal of this tutorial is to enable microservices with Postgres and Docker, not to teach you the basics of Postgres.
Now, it’s time to create the Python script that works with the database. The details are as follows:
- Define a function that contains the SQL queries or commands to create the three tables in the Postgres database. This function acts as an SQL script that also ingests the data into those tables.
- Define the connection parameters as environment variables and use them to build the connection string needed by psycopg2, which allows us to connect to PostgreSQL.
- Define two functions: the first fetches the results after the tables have been created and populated from the CSV files and prints them for inspection; the second retrieves the commands or queries.
- The test_util.py file, which relies on the pytest library, is used to test that everything is working correctly.
To make the Python script work, we specify its dependencies in the requirements.txt file.
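The connection setup described above can be sketched roughly as follows. This is a minimal illustration, assuming hypothetical environment-variable names and a hypothetical users table; the real script in the repo defines three tables and loads them from CSV files:

```python
import os

# A minimal sketch of the connection setup. The environment variable names
# (POSTGRES_HOST, POSTGRES_PORT, ...) and their defaults are assumptions
# for illustration, not taken from the tutorial's repository.
def connection_params():
    """Build psycopg2 connection keyword arguments from environment variables."""
    return {
        "host": os.environ.get("POSTGRES_HOST", "localhost"),
        "port": int(os.environ.get("POSTGRES_PORT", "5432")),
        "dbname": os.environ.get("POSTGRES_DB", "postgres"),
        "user": os.environ.get("POSTGRES_USER", "postgres"),
        "password": os.environ.get("POSTGRES_PASSWORD", "postgres"),
    }

def connect():
    """Open a connection to the Postgres container (needs a running database)."""
    import psycopg2  # imported lazily; installed via requirements.txt
    return psycopg2.connect(**connection_params())

# Example of a DDL command the table-creating function could run
# (the users table is hypothetical).
CREATE_TABLE_SQL = """
CREATE TABLE IF NOT EXISTS users (
    id SERIAL PRIMARY KEY,
    name TEXT NOT NULL
);
"""

def fetch_and_print(conn):
    """Create the table, then fetch and print a result for inspection."""
    with conn.cursor() as cur:
        cur.execute(CREATE_TABLE_SQL)
        cur.execute("SELECT COUNT(*) FROM users;")
        print(cur.fetchone()[0])
    conn.commit()
```

Keeping the parameters in environment variables means the same script works unchanged whether Postgres runs locally or inside a container.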
The next step is to create a Docker image through the Dockerfile in the project folder. Below are some basic instructions we need to be familiar with when writing a Dockerfile.
- FROM: specifies the base image that the Docker server needs to pull from Docker Hub. In this case, we pull the latest Python image from Docker Hub.
- WORKDIR: specifies the working directory, which is where the COPY directive copies files when no file path is given.
- COPY: copies files and contents from the local directory into the Docker image.
- RUN: runs a command while the image is being built. In this case, we install the libraries specified in requirements.txt (which has already been copied into the image's working directory).
For more details, refer to the Docker documentation.
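Putting those instructions together, a minimal Dockerfile for this project might look like the sketch below. The file layout and the final CMD line are assumptions, since the repo's Dockerfile is not shown here:

```
# Base image pulled from Docker Hub (the latest Python image, as described above)
FROM python:latest

# Working directory inside the image; COPY targets land here by default
WORKDIR /app

# Copy the project files from the local directory into the image
COPY . .

# Install the dependencies listed in requirements.txt during the build
RUN pip install -r requirements.txt

# Entry point of the Python project (an assumption, for completeness)
CMD ["python", "main.py"]
```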
Putting things together
The final step of the tutorial is to bring together the two images we created. The most elegant way to do that is to create a docker-compose.yml file at the root of the project.
In it, we declare three container services for our application:
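A docker-compose.yml along these lines might look like the following sketch. The service names, credentials, and the third (test-runner) service are assumptions, since the original file is not reproduced here:

```
services:
  # Postgres service, using the official image from Docker Hub
  db:
    image: postgres:latest
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
      POSTGRES_DB: demo
    ports:
      - "5432:5432"

  # The Python application, built from the local Dockerfile
  app:
    build: .
    depends_on:
      - db
    environment:
      POSTGRES_HOST: db

  # Test-runner service (an assumption for the third declared service)
  test:
    build: .
    command: pytest
    depends_on:
      - db
```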
The following Docker commands build the image, define and run the multi-container Docker application with docker-compose, and shut it down.
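Concretely, those lifecycle commands are typically run from the directory containing docker-compose.yml:

```
# Build (or rebuild) the images defined in docker-compose.yml
docker-compose build

# Define and run the multi-container application in the background
docker-compose up -d

# Shut the containers down and remove them
docker-compose down
```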