Retrospective: https://github.com/kernelv5/Airflow-docker-SQS-EFS-RDS
Apache Airflow 1.10.12
Step 1: Create a Connection using the Airflow UI
In the Airflow UI, go to Admin > Connections and create a new connection with the following attributes:
Conn Id: my_conn_S3
Conn Type: S3
Extra: {"aws_access_key_id": "_your_aws_access_key_id_", "aws_secret_access_key": "_your_aws_secret_access_key_"}
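The same connection can also be created from the command line instead of the UI. A minimal sketch using the Airflow 1.10 CLI (replace the placeholder credentials with your own):

# Create the connection via the CLI; --conn_extra mirrors the Extra JSON above.
airflow connections --add \
    --conn_id my_conn_S3 \
    --conn_type s3 \
    --conn_extra '{"aws_access_key_id": "_your_aws_access_key_id_", "aws_secret_access_key": "_your_aws_secret_access_key_"}'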
If the webserver and scheduler run on separate nodes, use the AIRFLOW__CORE__FERNET_KEY
environment variable to set the key explicitly for both containers (see the docker run sketch after the variable list below). Otherwise each node will generate its own Fernet key, meaning they cannot decrypt each other's secrets. Airflow uses this Fernet key to encrypt and decrypt connection credentials stored in its metadata database.
AIRFLOW__CORE__REMOTE_LOGGING=True
AIRFLOW__CORE__REMOTE_BASE_LOG_FOLDER=s3://<bucketName>/<subFolder>
AIRFLOW__CORE__REMOTE_LOG_CONN_ID=my_conn_S3
AIRFLOW__CORE__FERNET_KEY=I9So6d3okNjVsj_Fb_FXB-piZ8NLSWaSeDNWz2aHs0c=
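For example, when starting the webserver and scheduler as separate containers with plain docker run commands, pass the same key and logging variables to both. This is a sketch: the image name, container names, and the host-side FERNET_KEY variable are placeholders/assumptions.

# Both containers must receive the SAME Fernet key, or secrets written by one
# cannot be decrypted by the other.
docker run -d --name airflow-webserver \
    -e AIRFLOW__CORE__FERNET_KEY="$FERNET_KEY" \
    -e AIRFLOW__CORE__REMOTE_LOGGING=True \
    -e AIRFLOW__CORE__REMOTE_BASE_LOG_FOLDER=s3://<bucketName>/<subFolder> \
    -e AIRFLOW__CORE__REMOTE_LOG_CONN_ID=my_conn_S3 \
    <your_airflow_image> webserver
docker run -d --name airflow-scheduler \
    -e AIRFLOW__CORE__FERNET_KEY="$FERNET_KEY" \
    -e AIRFLOW__CORE__REMOTE_LOGGING=True \
    -e AIRFLOW__CORE__REMOTE_BASE_LOG_FOLDER=s3://<bucketName>/<subFolder> \
    -e AIRFLOW__CORE__REMOTE_LOG_CONN_ID=my_conn_S3 \
    <your_airflow_image> scheduler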
Typing something random into AIRFLOW__CORE__FERNET_KEY will not work: the value must be a valid Fernet key, i.e. 32 url-safe base64-encoded bytes. Generate one with:
python3 -c "from cryptography.fernet import Fernet; print(Fernet.generate_key())"
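To confirm that a generated key is actually valid, you can round-trip a throwaway value through Fernet with the same key; a quick sanity check:

FERNET_KEY=$(python3 -c "from cryptography.fernet import Fernet; print(Fernet.generate_key().decode())")
# Encrypt and then decrypt a dummy payload with the same key; this should print b'test'.
python3 -c "from cryptography.fernet import Fernet; f = Fernet('$FERNET_KEY'); print(f.decrypt(f.encrypt(b'test')))"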
For installation and testing, see:
https://airflow.apache.org/docs/stable/installation.html
pip install 'apache-airflow[crypto]'
python3 -c "from cryptography.fernet import Fernet; print(Fernet.generate_key().decode())"
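Once the packages are installed and the connection exists, one way to test that Airflow can actually reach the bucket is to call the S3 hook directly. A sketch assuming the bucket from AIRFLOW__CORE__REMOTE_BASE_LOG_FOLDER; it prints True if the bucket is reachable with the stored credentials:

# Uses the my_conn_S3 connection created in Step 1 (Airflow 1.10 import path).
python3 -c "from airflow.hooks.S3_hook import S3Hook; print(S3Hook(aws_conn_id='my_conn_S3').check_for_bucket('<bucketName>'))"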