MatrixManAtYrService opened a new issue, #26042:
URL: https://github.com/apache/airflow/issues/26042

   ### Apache Airflow version
   
   main (development)
   
   ### What happened
   
   I was walking through this tutorial: 
https://astro-sdk-python.readthedocs.io/en/stable/getting-started/GETTING_STARTED.html
 which had me create an AWS connection.  After doing so, I clicked the "Test" 
button and saw this:
   
   <img width="403" alt="Screen Shot 2022-08-29 at 10 54 35 AM" src="https://user-images.githubusercontent.com/5834582/187253551-bbcace51-b203-444d-805e-e7608ec41b0e.png">
   
   
   And this in the webserver logs:
   ```
   [2022-08-29 10:52:59,759] {base.py:69} INFO - Using connection ID 'hSL34Jpz' for task execution.
   [2022-08-29 10:52:59,759] {connection_wrapper.py:210} INFO - AWS Connection (conn_id='hSL34Jpz', conn_type='aws') credentials retrieved from login and password.
   [2022-08-29 10:52:59 -0600] [64218] [WARNING] Worker with pid 64227 was terminated due to signal 11
   ```
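   For what it's worth, signal 11 is SIGSEGV, i.e. the worker process crashed in native code rather than raising a Python exception. A minimal illustration (plain stdlib, not Airflow code) of how a parent process like gunicorn's arbiter observes a worker killed by that signal:

   ```python
   import os
   import signal

   # Fork a child and have it die from SIGSEGV, mimicking a worker that
   # crashes in a native extension; the parent reads the signal number
   # out of the wait status, which is what gunicorn logs as "signal 11".
   pid = os.fork()
   if pid == 0:
       # child: deliver SIGSEGV to itself
       os.kill(os.getpid(), signal.SIGSEGV)
   else:
       _, status = os.waitpid(pid, 0)
       assert os.WIFSIGNALED(status)
       print("worker terminated due to signal", os.WTERMSIG(status))  # signal 11
   ```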
   
   ### What you think should happen instead
   
   If the test was successful, I should not see a red box.  If it failed, I 
should see a red box with something useful in it.
   
   ### How to reproduce
   
   Test a connection.  I've also seen this with the Snowflake provider, so I 
don't think it's AWS-specific.
   
   ### Operating System
   
   macOS
   
   ### Versions of Apache Airflow Providers
   
   apache-airflow-providers-amazon==5.0.0
   apache-airflow-providers-common-sql==1.1.0
   apache-airflow-providers-ftp==3.1.0
   apache-airflow-providers-http==4.0.0
   apache-airflow-providers-imap==3.0.0
   apache-airflow-providers-snowflake==3.2.0
   apache-airflow-providers-sqlite==3.2.0
   
   
   ### Deployment
   
   Virtualenv installation
   
   ### Deployment details
   
   I've been setting up Airflow venvs like this (nothing especially weird in 
this case):
   
   1. clone airflow (currently at 018f1d5ebb )
   2. `python scripts/ci/pre_commit/pre_commit_compile_www_assets.py`
   3. run postgres in docker in a separate shell and set up the venv like so:
   ```
   ENV_NAME="$(pwd | sed 's#/#_#g')"
   pyenv virtualenv 3.10.6 $ENV_NAME
   pyenv activate $ENV_NAME
   export AIRFLOW_HOME=$(pwd)
   pip install psycopg2
   pip install ~/src/airflow
   pip install 'astro-sdk-python[amazon,snowflake]>=0.11'
   airflow info
   sed -i .bak 's#^sql_alchemy_conn = .*$#sql_alchemy_conn = postgres://postgres:[email protected]#g' airflow.cfg
   sed -i .bak 's#^load_examples =.*$#load_examples = False#' airflow.cfg
   sed -i .bak 's#^executor =.*$#executor = LocalExecutor#' airflow.cfg
   rm *.bak
   airflow db init
   airflow users create --username admin --password admin --firstname admin --lastname admin --role Admin --email [email protected]
   ```
   4. modify airflow.cfg to include the following
   ```
   [core]
   enable_xcom_pickling = True
   
   [astro_sdk]
   SQL_SCHEMA = ASTRO_SDK_SCHEMA
   ```
   5. `airflow webserver`
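
   (Not part of the repro steps, but possibly useful for whoever picks this up: since the worker dies with SIGSEGV rather than a Python exception, starting the webserver with CPython's faulthandler enabled, e.g. `PYTHONFAULTHANDLER=1 airflow webserver`, should dump the Python-level traceback to stderr at the moment of the crash. That env var is standard CPython, not Airflow-specific.)

   ```python
   # Standard-library faulthandler: equivalent to setting PYTHONFAULTHANDLER=1
   # in the environment; on SIGSEGV it writes the Python traceback of every
   # thread to stderr before the process dies.
   import faulthandler

   faulthandler.enable()
   print(faulthandler.is_enabled())  # True
   ```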
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

