Hi Sumit,
I think it is unrelated to the PR tool. Please check the link below:
https://stackoverflow.com/questions/44316292/ssl-sslerror-tlsv1-alert-protocol-version
Try running: pip install requests[security]
Regards,
Kaxil
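Before (or after) installing `requests[security]`, it can help to confirm what the local Python's `ssl` module was actually built against, since the "tlsv1 alert protocol version" error usually means the client offered a TLS version the server rejects. A small diagnostic sketch, separate from the fix above:

```python
import ssl

# "tlsv1 alert protocol version" usually means the client offered a TLS
# version the server no longer accepts; an old linked OpenSSL is the
# common culprit. Inspect what this interpreter was built with:
print(ssl.OPENSSL_VERSION)       # e.g. an OpenSSL build string
print(ssl.OPENSSL_VERSION_INFO)  # version tuple, easy to compare
```

If the reported OpenSSL is very old (pre-1.0.1, which lacks TLS 1.2), upgrading OpenSSL or Python itself is the more durable fix.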
On 15/05/2018, 07:46, "Sumit Maheshwari" wrote:
Hi all,
I have been using airflow with Celery executor in the background
https://hastebin.com/sipecovomi.ini --> airflow.cfg
https://hastebin.com/urutokuvoq.py --> The dag I have been using
This shows that the DAG is always in the running state.
Airflow flower shows nothing in the tasks or
Kaxil,
Can you try these steps and update the airflow wiki (committer guide) based on
your findings?
-s
Sent from Sid's iPhone
Begin forwarded message:
> From: Martin Gainty
> Date: May 15, 2018 at 4:21:13 AM PDT
> To: "san...@apache.org"
> Subject:
I have had very similar issues when there was a problem with the connection
string pointing to the message broker. Triple check those connection
strings and attempt to connect outside of airflow.
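One way to do that check outside Airflow is a plain TCP probe of the broker host and port. The URL below is a made-up example; substitute the actual broker_url from your airflow.cfg:

```python
import socket
from urllib.parse import urlparse

# Hypothetical broker URL -- replace with the broker_url from airflow.cfg.
broker_url = "redis://localhost:6379/0"

parsed = urlparse(broker_url)
try:
    # A raw TCP connect tells you whether the broker is reachable at all,
    # independent of any Airflow or Celery configuration.
    with socket.create_connection((parsed.hostname, parsed.port), timeout=3):
        print("broker reachable at", parsed.hostname, parsed.port)
except OSError as exc:
    print("broker unreachable:", exc)
```

If the raw connect fails, the problem is networking or the broker itself, not Airflow.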
On Tue, May 15, 2018 at 9:27 AM Goutham Pratapa
wrote:
> Hi all,
>
> I
Sure I will do this and update the wiki
On 15/05/2018, 14:51, "siddharth anand" wrote:
Kaxil,
Can you try these steps and update the airflow wiki (committer guide) based
on your findings?
-s
Sent from Sid's iPhone
Kaxil Naik
Data
What I find is that we hit this when Celery rejects tasks. In our case we don't
do work on the hosts themselves, so we solve it by over-provisioning tasks in Celery.
On Tue, May 15, 2018, 6:30 AM Andy Cooper wrote:
> I have had very similar issues when there was a problem with the connection
>
Hi Sid,
I think Jenkins triggers a build on the incubator SVN repo daily, as seen here:
https://builds.apache.org/view/H-L/view/Incubator/job/Incubator%20Site/426/console
When I checked http://incubator.apache.org/projects/airflow.html the page has
already been updated.
However, I still tried to
Thanks for the reply Kaxil.
Yeah, I too figured out that the issue is with my laptop's OpenSSL lib and the
Python installed against it. So I upgraded OpenSSL, installed Python 3.6, and
created a new virtualenv for now.
On Tue, May 15, 2018 at 1:07 PM, Naik Kaxil wrote:
> Hi Sumit,
>
> I
You are right, but that's within the same process. The way each operator
gets executed is that one `airflow run` command is generated and sent to
the local executor; the executor spins up a subprocess to run `airflow run
--raw` (which parses the file again and calls operator.execute()). Thus
each
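A toy illustration of the process boundary described above (this is not Airflow's actual code, just a sketch): the parent plays the executor, handing a command line to a child interpreter, and the child runs with none of the parent's in-memory state.

```python
import os
import subprocess
import sys

# The parent stands in for the executor; the child stands in for
# `airflow run --raw`, which re-parses the DAG file in a fresh process.
child_code = "import os; print('task ran in pid', os.getpid())"
result = subprocess.run(
    [sys.executable, "-c", child_code],
    capture_output=True, text=True, check=True,
)
print("executor pid:", os.getpid())
print(result.stdout.strip())
```

The two printed PIDs differ, which is why nothing set in one task's process is visible to another.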
Hi,
Are there any plans to update the HDFS_hook.py script to remove the reference
to the snakebite Python library? I'd like to run Airflow on Python 3, and this
is causing some issues. The hdfs_hook script is referenced in the sensors
module.
Any suggestions?
Thanks,
Cindy
Thanks for the explanation, really helpful.
Cheers,
Ali
On 2018/05/16 03:27:27, Ruiqin Yang wrote:
> You are right, but that's within the same process. The way each operator
> gets executed is that one `airflow run` command is generated and sent to
> the local executor,
I think there might be two ways:
1. Set up the connections via the Airflow UI:
http://airflow.readthedocs.io/en/latest/configuration.html#connections, I guess
this could be done in your code as well.
2. Put your connection setup into an operator at the beginning of your DAG
Not exactly answering your question, but the reason db.py is loaded in each
task might be that you have something like `import db` in each of your
*.py files, and Airflow spins up one process to parse each *.py file, so
your db.py is loaded multiple times.
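That behaviour can be reproduced without Airflow at all. The db.py below is a stand-in written to a temp directory; every fresh interpreter that imports it re-runs its module-level code, so one parsing process means one load:

```python
import os
import subprocess
import sys
import tempfile

with tempfile.TemporaryDirectory() as workdir:
    # Stand-in for the user's db.py: module-level code runs on every
    # import performed in a fresh interpreter.
    with open(os.path.join(workdir, "db.py"), "w") as f:
        f.write("import os\nprint('db.py loaded in pid', os.getpid())\n")

    # Two "parser" processes, two separate loads of db.py.
    for _ in range(2):
        subprocess.run([sys.executable, "-c", "import db"],
                       cwd=workdir, check=True)
```

Moving side effects out of module scope (e.g. behind a function the task calls) is the usual way to avoid repeated work at parse time.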
I'm not sure how you can share the