josh-fell commented on a change in pull request #19237:
URL: https://github.com/apache/airflow/pull/19237#discussion_r737829769
##########
File path: airflow/providers/amazon/aws/example_dags/example_google_api_to_s3_transfer_advanced.py
##########
@@ -33,13 +33,13 @@
on a YouTube channel you want to retrieve.
"""
+from datetime import datetime
from os import getenv
from airflow import DAG
from airflow.operators.dummy import DummyOperator
-from airflow.operators.python import BranchPythonOperator, get_current_context
+from airflow.operators.python import BranchPythonOperator
from airflow.providers.amazon.aws.transfers.google_api_to_s3 import GoogleApiToS3Operator
-from airflow.utils.dates import days_ago
# [START howto_operator_google_api_to_s3_transfer_advanced_env_variables]
YOUTUBE_CONN_ID = getenv("YOUTUBE_CONN_ID", "google_cloud_default")
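For anyone skimming the hunk above: the import changes reflect the example DAGs moving off the deprecated `airflow.utils.dates.days_ago` helper to a pinned `datetime` start date. A minimal sketch of what the resulting DAG definition looks like (the `dag_id`, date, and schedule are illustrative, not taken from the PR):

```python
from datetime import datetime

from airflow import DAG

# A static start_date replaces the deprecated days_ago() helper; the dag_id,
# date, and schedule below are placeholders, not the exact values in the PR.
with DAG(
    dag_id="example_google_api_to_s3_transfer_advanced",
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    ...
```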
Review comment:
This was left in for a couple of reasons.
- There is documentation that references these variables and specifically
states "This example relies on the following variables, which can be passed via
OS environment variables". I didn't want to remove any context related to the
main operator documentation (the `getenv` pattern in question is sketched
below).
- There is an Amazon provider system test which runs this DAG, so I wanted to
be cognizant of the impact on these tests when folks run them. (For context,
system tests rely on local credentials and _actually_ connect to external
services to run example DAGs. I'm not sure what automated testing the
maintainers of this provider do that might reference this environment
variable.)
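For reference, a minimal sketch of the `getenv` fallback pattern being kept here; the first line is verbatim from the diff, while the override shown in the comment is hypothetical rather than an actual system-test setup:

```python
from os import getenv

# The example DAG reads its connection ID from an OS environment variable,
# falling back to a default. System tests (and the docs quoted above) rely on
# being able to override this externally, e.g. `export YOUTUBE_CONN_ID=my_conn`
# before running the DAG, without editing the example file itself.
YOUTUBE_CONN_ID = getenv("YOUTUBE_CONN_ID", "google_cloud_default")
```

This is also why removing the variable could quietly break any test harness that exports it.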