I’m very new to Apache Airflow, so I started learning from the quick start guide. Today I tried running Airflow in Docker as a first step.
Running Airflow in Docker
Dev environment
- Ubuntu version: 20.04.3 LTS
- Docker Engine version: 20.10.11
- docker-compose version: 1.29.2 (I upgraded from 1.25.0 following the Install Docker Compose guide)
docker-compose.yaml
I followed the quick start guide in the Apache Airflow documentation.
To begin with, I fetched docker-compose.yaml.
Created the airflow directory.
mkdir airflow && cd airflow
Fetched docker-compose.yaml using the curl command.
curl -LfO 'https://airflow.apache.org/docs/apache-airflow/2.2.4/docker-compose.yaml'
This file contains several service definitions, including airflow-scheduler, airflow-webserver, airflow-worker, airflow-init, flower, postgres, and redis.

Created the following directories, which are mounted into the containers:
- ./dags – you can put your DAG files here.
- ./logs – contains logs from task execution and the scheduler.
- ./plugins – you can put your custom plugins here.
mkdir -p ./dags ./logs ./plugins
You can check the volume settings in the docker-compose.yaml file.

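In the 2.2.4 docker-compose.yaml, the relevant part of the x-airflow-common section should look roughly like this (other versions may differ):
volumes:
  - ./dags:/opt/airflow/dags
  - ./logs:/opt/airflow/logs
  - ./plugins:/opt/airflow/plugins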
At this point, the airflow directory contains the dags, logs, and plugins directories and the docker-compose.yaml file.

Set the host user id for docker-compose.
echo -e "AIRFLOW_UID=$(id -u)" > .env
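The resulting .env file contains a single line with your host user id, for example (1000 is a typical value on Ubuntu; yours may differ):
AIRFLOW_UID=1000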
Initialized the database, which runs the database migrations and creates the first user account.
docker-compose up airflow-init
After initialization completed, a confirmation message was displayed.

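Per the quick start guide, the end of that output looks roughly like this (the version line and container name prefixes depend on your setup):
airflow-init_1       | Upgrades done
airflow-init_1       | Admin user airflow created
airflow-init_1       | 2.2.4
start_airflow-init_1 exited with code 0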
Started all services.
docker-compose up
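If you don’t want the logs to take over your terminal, you can also start the services in detached mode and check their status separately:
docker-compose up -d
docker-compose ps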
Logged in to the web interface. The webserver is available at http://localhost:8080.
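The default account created by airflow-init has the login airflow and the password airflow.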

I used Ubuntu in VirtualBox, so I set up network port forwarding in VirtualBox.
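A typical rule here (the exact values depend on your VM configuration) forwards TCP traffic from host port 8080 to guest port 8080, so that http://localhost:8080 on the host reaches the webserver inside the VM.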


I was able to start Apache Airflow in Docker. I’ll work on the Airflow tutorial in the next post.
Thank you,
Toge