tsugumi-sys commented on issue #22863:
URL: https://github.com/apache/airflow/issues/22863#issuecomment-1132516126
In my case, I needed to re-install `pycurl` and set the environment variables shown below.
When I tried to use SQS, the worker instance raised `ImportError: The curl client requires the pycurl library.`
I then tried to re-install pycurl but hit a `Permission denied` error (from `curl-config`), so I ran `pip install pycurl` as root, and Airflow now seems to work on my laptop.
I also find it quite hard to understand from the documentation how to use SQS with Airflow. I am not sure why these settings work, but they might help someone.
Dockerfile
```Dockerfile
FROM apache/airflow:2.3.0
ENV AIRFLOW_HOME=/opt/airflow
ENV ENTRYPOINT_CONTAINER_FILE_PATH=/tmp/entrypoint.sh
ARG ENTRYPOINT_FILE_PATH
USER root
# Build dependencies: libcurl/librtmp headers for compiling pycurl, libpq for psycopg2
RUN apt-get update && apt-get install -y python3-pip \
    libcurl4-gnutls-dev \
    librtmp-dev \
    python3-dev \
    libpq-dev
COPY ${ENTRYPOINT_FILE_PATH} ${ENTRYPOINT_CONTAINER_FILE_PATH}
RUN chmod +x /tmp/entrypoint.sh
RUN chown -R airflow: ${AIRFLOW_HOME}
RUN echo "airflow ALL=(ALL) NOPASSWD:ALL" >> /etc/sudoers
USER airflow
# Re-install pycurl so it is built against the system libcurl;
# a plain `pip install` hit "Permission denied" on curl-config, hence sudo
RUN pip uninstall -y pycurl
RUN sudo pip install pycurl
WORKDIR ${AIRFLOW_HOME}
ENTRYPOINT [ "/tmp/entrypoint.sh" ]
```
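The content of `entrypoint.sh` is not included above. As a rough sketch only, an entrypoint for this kind of setup might look something like the following; the `airflow db upgrade` call and the argument dispatch are assumptions for illustration, not my actual script:

```bash
#!/usr/bin/env bash
# Hypothetical entrypoint sketch -- not the actual /tmp/entrypoint.sh used above.
set -euo pipefail

# Make sure the Airflow metadata DB schema is up to date before starting anything.
airflow db upgrade

# Run whatever component the container was started with,
# e.g. "webserver", "scheduler" or "celery worker".
exec airflow "$@"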
.env
```
# Airflow envs
AIRFLOW__DATABASE__SQL_ALCHEMY_CONN=postgresql+psycopg2://airflow:airflow@postgres/airflow
AIRFLOW__CELERY__RESULT_BACKEND=db+postgresql://airflow:airflow@postgres/airflow
AIRFLOW__CELERY__BROKER_URL=sqs://@sqs.ap-northeast-1.amazonaws.com/119814312462/akiranoda-test-sqs-for-airflow
AIRFLOW__CORE__EXECUTOR=CeleryExecutor
AIRFLOW_DAGS_FOLDER_PATH=./dags
AIRFLOW_ADMIN_PASS=test1234
[email protected]
# AWS
AWS_ACCESS_KEY_ID=***
AWS_SECRET_ACCESS_KEY=***
AWS_DEFAULT_REGION=***
```
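To check that the worker can actually import the freshly built pycurl (and so reach SQS through the Celery broker), something like this can be run against the worker container; the service name `airflow-worker` is just an assumption about the compose file:

```bash
# Verify pycurl is importable inside the worker container
# (service name "airflow-worker" is an assumption about the compose setup).
docker compose exec airflow-worker \
  python -c "import pycurl; print(pycurl.version)"

# Optionally confirm the Celery broker URL the worker picked up from .env.
docker compose exec airflow-worker \
  bash -c 'echo "$AIRFLOW__CELERY__BROKER_URL"'
```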