The MQ (rabbit / redis / ...) gets the `airflow run {dag_id} {task_id}
{...}` command to execute, and as far as I remember the worker runs it
blindly. That's not ideal from a security standpoint: if the MQ is
compromised, there's an open attack vector to the workers. Eventually it
would be safe
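Roughly, it's this pattern (just a sketch of the idea, not Airflow's
actual executor code; the broker URL and task name are made up):

    import subprocess

    from celery import Celery

    # Broker URL is an example; Airflow takes it from the Celery config.
    app = Celery("sketch", broker="redis://localhost:6379/0")

    @app.task
    def execute_command(command):
        # The command string arrives verbatim from the message queue and
        # is executed as-is, so anyone who can write to the queue can run
        # arbitrary commands on the worker.
        subprocess.check_call(command, shell=True)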
Hi Niranda,
What version of Airflow are you running? There are a lot of improvements
coming up in Airflow 1.10 regarding timezones.
https://github.com/apache/incubator-airflow/blob/master/airflow/config_templates/default_airflow.cfg#L77-L79
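For reference, the setting behind those lines looks roughly like this
(the value shown is the documented default; see the linked file for the
exact wording):

    [core]
    # Default timezone for naive datetimes: "utc" (the default),
    # "system", or an IANA timezone name such as Europe/Amsterdam.
    default_timezone = utc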
Cheers, Fokko
2018-07-05 14:57 GMT+02:00 Niranda Per
If there are fixes we want to get in, should we merge them into the 1-10-test
branch?
I'm not sure what the release/branching workflow is yet, and I don't think it's
written in the wiki anywhere (or I'm bad at finding it).
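Assuming it's the usual cherry-pick-onto-the-release-branch model, I'd
expect roughly the following, but that is a guess on my part (including
the remote name):

    git checkout 1-10-test
    git cherry-pick <sha-of-fix-on-master>
    git push apache 1-10-test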
-ash
> On 1 Jul 2018, at 11:03, Bolke de Bruin wrote:
>
> Hi All,
>
>
Hello,
In support of adding fine-grained Connection encryption (Jira Issue:
https://issues.apache.org/jira/browse/AIRFLOW-2062) I wanted to gather
feedback on a proposed design, as it affects a few different Airflow
components. A full design doc is coming next week.
The end goal is to allow per-C
Thanks Andreas for the quick answer.
On 06/07/2018, 12:16, "Andreas Koeltringer" wrote:
> Because systemd does not pick up environment variables from .bashrc; it
> has its own way of loading them.
> Have a look at the systemd unit file doc [1]. Look out for
> "Environmen
Because systemd does not pick up environment variables from .bashrc; it
has its own way of loading them.
Have a look at the systemd unit file doc [1]. Look out for
"Environment=" and "EnvironmentFile=" directives.
Another possibility would be to set the env vars systemwide (e.g. in
/et
Hi guys,
I have recently set up Airflow on a new VM with systemd integration. I have
added some environment variables to the .bashrc file for the airflow user. Now
when I try to run a BashOperator, first starting Airflow using `airflow
webserver -D` and `airflow scheduler -D`, it seems to have access
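A minimal DAG to check what the task actually sees could look like this
(the DAG id, dates, and variable name are placeholders):

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash_operator import BashOperator

    dag = DAG(
        "env_check",
        start_date=datetime(2018, 7, 1),
        schedule_interval=None,
    )

    # Prints the variable as the worker process sees it; if it was only
    # set in .bashrc it will likely come out empty under systemd.
    check_env = BashOperator(
        task_id="print_my_var",
        bash_command='echo "MY_VAR=$MY_VAR"',
        dag=dag,
    )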