Docker Compose for Python and PostgreSQL: Setup Guide
Use a docker-compose.yml file to define services for Python and PostgreSQL. The Python service runs your app, and the PostgreSQL service runs the database; the two are connected through a shared network and configured with environment variables.
Syntax
A docker-compose.yml file defines multiple services that run together. Each service has a name and configuration like the image to use, ports to expose, environment variables, and volumes for data persistence.
Key parts:
- services: Lists containers to run.
- image: Docker image to use.
- build: Build context for custom images.
- ports: Maps container ports to host ports.
- environment: Sets environment variables inside containers.
- volumes: Shares files or folders between host and container.
```yaml
version: '3.8'
services:
  python-app:
    build: ./app
    ports:
      - "8000:8000"
    environment:
      - DATABASE_URL=postgresql://user:password@db:5432/mydb
    depends_on:
      - db
  db:
    image: postgres:15
    environment:
      POSTGRES_USER: user
      POSTGRES_PASSWORD: password
      POSTGRES_DB: mydb
    volumes:
      - pgdata:/var/lib/postgresql/data

volumes:
  pgdata: {}
```
Example
This example shows a Python app connecting to a PostgreSQL database using Docker Compose. The db service runs PostgreSQL, and the python-app service runs the Python code that connects to it.
The Python app uses environment variables to get database connection info.
```yaml
version: '3.8'
services:
  python-app:
    build: ./app
    ports:
      - "8000:8000"
    environment:
      - DATABASE_URL=postgresql://user:password@db:5432/mydb
    depends_on:
      - db
  db:
    image: postgres:15
    environment:
      POSTGRES_USER: user
      POSTGRES_PASSWORD: password
      POSTGRES_DB: mydb
    volumes:
      - pgdata:/var/lib/postgresql/data

volumes:
  pgdata: {}
```

app/Dockerfile:

```dockerfile
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt ./
RUN pip install -r requirements.txt
COPY . ./
CMD ["python", "app.py"]
```

app/app.py:

```python
import os
import time

import psycopg2

def connect_db():
    # Retry until PostgreSQL is ready to accept connections
    while True:
        try:
            conn = psycopg2.connect(os.getenv('DATABASE_URL'))
            print("Connected to database")
            conn.close()
            break
        except Exception as e:
            print(f"Waiting for DB: {e}")
            time.sleep(2)

if __name__ == '__main__':
    connect_db()
```
Output
Connected to database
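Note that the hostname in the connection URL is the Compose service name (`db`), which Docker's internal DNS resolves to the database container. A quick sketch of how that URL breaks down, using Python's standard urllib.parse:

```python
from urllib.parse import urlparse

# The URL the python-app service receives via its environment
url = urlparse("postgresql://user:password@db:5432/mydb")

print(url.hostname)          # "db" — the Compose service name, not localhost
print(url.port)              # 5432
print(url.path.lstrip("/"))  # "mydb"
```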
Common Pitfalls
Common mistakes include:
- Relying on depends_on alone: it controls start order, but the Python app may still try to connect before PostgreSQL is ready to accept connections.
- Missing environment variables for database credentials.
- Not using volumes for PostgreSQL data, causing data loss when the container is removed or recreated.
- Incorrect database URLs or ports.
Always check logs with docker-compose logs to debug connection issues.
```yaml
version: '3.8'
services:
  python-app:
    build: ./app
    ports:
      - "8000:8000"
    environment:
      - DATABASE_URL=postgresql://user:password@db:5432/mydb
    # Missing depends_on here causes a race condition
  db:
    image: postgres:15
    environment:
      POSTGRES_USER: user
      POSTGRES_PASSWORD: password
      POSTGRES_DB: mydb
    # Missing volume mount means data is lost when the container is removed

volumes: {}

# Corrected version adds depends_on and a named volume as shown in the previous examples.
```
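Because depends_on by itself only controls start order, Compose can additionally be told to wait until the database passes a health check. A minimal sketch, assuming the same service names as above (the pg_isready utility ships in the official postgres image):

```yaml
services:
  python-app:
    build: ./app
    depends_on:
      db:
        condition: service_healthy  # wait until the healthcheck below passes
  db:
    image: postgres:15
    environment:
      POSTGRES_USER: user
      POSTGRES_PASSWORD: password
      POSTGRES_DB: mydb
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U user -d mydb"]
      interval: 5s
      timeout: 3s
      retries: 5
```

With this in place, the retry loop in app.py becomes a safety net rather than the only line of defense.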
Quick Reference
- Use depends_on to control startup order.
- Set environment variables for database credentials and connection strings.
- Use volumes to persist database data.
- Map ports to access services from your host machine.
- Build Python app with a Dockerfile specifying dependencies.
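Following these points, the connection string can be assembled from individual environment variables instead of being hard-coded. A minimal sketch; the POSTGRES_HOST and POSTGRES_PORT variable names are illustrative choices for your own app, not something Compose sets automatically:

```python
import os

def database_url() -> str:
    """Build a PostgreSQL URL from environment variables, with local defaults."""
    user = os.getenv("POSTGRES_USER", "user")
    password = os.getenv("POSTGRES_PASSWORD", "password")
    host = os.getenv("POSTGRES_HOST", "db")  # the Compose service name
    port = os.getenv("POSTGRES_PORT", "5432")
    name = os.getenv("POSTGRES_DB", "mydb")
    return f"postgresql://{user}:{password}@{host}:{port}/{name}"

print(database_url())
```

The defaults match the Compose file above, so the same code runs inside the container and, with overrides, against a local database.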
Key Takeaways
- Define Python and PostgreSQL as separate services in docker-compose.yml for easy management.
- Use environment variables to pass database connection info securely to the Python app.
- Add depends_on so PostgreSQL starts before the Python app; keep in mind that start order alone does not guarantee the database is ready to accept connections.
- Use volumes to keep PostgreSQL data safe when containers are restarted or recreated.
- Check logs with docker-compose logs to troubleshoot connection or startup issues.