afctl
afctl is a CLI tool that makes creating and deploying Apache Airflow (https://airflow.apache.org/) projects faster and smoother. At the time of writing, no other tool lets the user generate a boilerplate code structure for Airflow projects and makes both development and deployment of projects seamless.
Requirements
- Python 3.5+
- Docker
Getting Started
1. Installation
Create a new python virtualenv. You can use the following command.
python3 -m venv <name>
Activate your virtualenv
source /path_to_venv/bin/activate
pip3 install afctl
2. Initialize a new afctl project.
The project is created in your current working directory. Along with it, a configuration file with the same name is generated in the /home/.afctl_configs directory.
afctl init <name of the project>
Eg.
afctl init project_demo
- The following directory structure will be generated
.
├── deployments
│   └── project_demo-docker-compose.yml
├── migrations
├── plugins
├── project_demo
│   ├── commons
│   └── dags
├── requirements.txt
└── tests
If you already have a git repository and want to turn it into an afctl project, run the following command inside it :
afctl init .
3. Add a new module in the project.
afctl generate module -n <name of the module>
For example, running the following commands will generate the directory structure below :
afctl generate module -n first_module
afctl generate module -n second_module
.
├── deployments
│   └── project_demo-docker-compose.yml
├── migrations
├── plugins
├── project_demo
│   ├── commons
│   └── dags
│       ├── first_module
│       └── second_module
├── requirements.txt
└── tests
    ├── first_module
    └── second_module
4. Generate dag
afctl generate dag -n <name of dag> -m <name of module>
For example, running the following command will generate the directory structure below :
afctl generate dag -n new -m first_module
.
├── deployments
│   └── project_demo-docker-compose.yml
├── migrations
├── plugins
├── project_demo
│   ├── commons
│   └── dags
│       ├── first_module
│       │   └── new_dag.py
│       └── second_module
├── requirements.txt
└── tests
    ├── first_module
    └── second_module
The dag file will look like this :
from airflow import DAG
from datetime import datetime, timedelta
default_args = {
    'owner': 'project_demo',
    # 'depends_on_past': ,
    # 'start_date': ,
    # 'email': ,
    # 'email_on_failure': ,
    # 'email_on_retry': ,
    # 'retries': 0
}
dag = DAG(dag_id='new', default_args=default_args, schedule_interval='@once')
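The generated template only defines the DAG itself, with the default_args entries left commented out. To make it runnable you would typically uncomment a start_date and add tasks. A minimal sketch, assuming Airflow 1.10's BashOperator (the task shown here is purely illustrative and is not generated by afctl) :
from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from datetime import datetime

default_args = {
    'owner': 'project_demo',
    'start_date': datetime(2020, 1, 1),  # required before the DAG can be scheduled
    'retries': 0
}

dag = DAG(dag_id='new', default_args=default_args, schedule_interval='@once')

# A single illustrative task; any Airflow operator could be used here.
print_date = BashOperator(
    task_id='print_date',
    bash_command='date',
    dag=dag
)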
5. Deploy project locally
You can add Python packages that will be required by your dags to requirements.txt. They will automatically get installed.
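For example, a requirements.txt might look like the following (the package names and versions below are placeholders for whatever your dags actually import) :
pandas==0.25.3
requests==2.22.0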
- To deploy your project, run the following command (make sure Docker is running) :
afctl deploy local
If you do not want to see the logs, you can run
afctl deploy local -d
This will run it in detached mode and won't print the logs on the console.
- You can access the Airflow webserver in your browser at
localhost:8080
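If you prefer to verify the deployment from a script instead of the browser, a quick sketch using the requests package against Airflow's /health endpoint (available in recent 1.10.x releases) could look like this :
import requests

# Query the local Airflow webserver started by `afctl deploy local`.
resp = requests.get('http://localhost:8080/health')
print(resp.status_code)  # 200 when the webserver is up
print(resp.json())       # reports metadatabase and scheduler health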
6. Deploy project to production
- Here we will deploy our project to Qubole. Sign up at us.qubole.com.
- Add the git origin and access token (the token is only needed if you want to keep the project as a private repo on GitHub) to the configs. See the Manage configurations section below for how.
- Once the project is complete, push it to GitHub.
- Deploying to Qubole will require adding deployment configurations.
afctl config add -d qubole -n <name of deployment> -e <env> -c <cluster-label> -t <auth-token>
This command will modify your config file. You can see your config file with the following command :
afctl config show
For example -
afctl config add -d qubole -n demo -e https://api.qubole.com -c airflow_1102 -t khd34djs3
- To deploy, run the following command :
afctl deploy qubole -n <name>
A video walkthrough is available here : https://www.youtube.com/watch?v=A4rcZDGtJME&feature=youtu.be
Manage configurations
The configuration file used for deployment contains the following information :
global:
  airflow_version:
  git:
    origin:
    access-token:
deployment:
  qubole:
    local:
      compose:
- airflow_version can be added when you initialize the project :
afctl init <name> -v <version>
- Global configs (airflow_version, origin, access-token) can all be added or updated with the following command :
afctl config global -o <git-origin> -t <access-token> -v <airflow_version>
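Putting it together, after running the afctl config global and afctl config add examples above, the config file printed by afctl config show might look roughly like the sketch below. The nesting follows the schema listed above; the exact key names stored for a named Qubole deployment (env, cluster, token here) and the placeholder values are illustrative, so treat the output of afctl config show as the authoritative reference :
global:
  airflow_version: <airflow-version>
  git:
    origin: https://github.com/<user>/project_demo.git
    access-token: <access-token>
deployment:
  qubole:
    local:
      compose: <path to project_demo-docker-compose.yml>
    demo:
      env: https://api.qubole.com
      cluster: airflow_1102
      token: khd34djs3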
Usage
The commands currently supported are :
- init
- config
- deploy
- list
- generate
To learn more, run
afctl <command> -h
Caution
Not yet ported to Windows.
Credits
Docker-compose file : https://github.com/puckel/docker-airflow