ThiagoPositeli opened a new issue #21800:
URL: https://github.com/apache/airflow/issues/21800
### Apache Airflow Provider(s)
google
### Versions of Apache Airflow Providers
Google Cloud Composer-2.0.0-preview.5
Airflow-2.1.4
### Apache Airflow version
2.1.4
### Operating System
UNIX
### Deployment
Composer
### Deployment details
_No response_
### What happened
I wrote this DAG (below) in Cloud Composer to create and delete a Dataproc cluster.
However, when the cluster is created, even with the option
`enable_component_gateway=True`, the Component Gateway with access to the Jupyter
notebook is not enabled as parameterized in the DAG.
The additional optional components, on the other hand, are enabled as shown in the image.
```python
from airflow.contrib.sensors.gcs_sensor import GoogleCloudStoragePrefixSensor
from airflow import DAG
from datetime import datetime, timedelta
from airflow.contrib.operators.gcs_to_bq import GoogleCloudStorageToBigQueryOperator
from airflow.providers.google.cloud.operators.dataproc import (
    DataprocCreateClusterOperator,
    DataprocDeleteClusterOperator,
    ClusterGenerator,
)

yesterday = datetime.combine(datetime.today() - timedelta(1), datetime.min.time())

default_args = {
    'owner': 'teste3',
    'depends_on_past': False,
    'start_date': yesterday,
    'email': ['[email protected]'],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 0,
    'retry_delay': timedelta(minutes=5),
}

dag = DAG(
    'teste-dag-3',
    catchup=False,
    default_args=default_args,
    schedule_interval=None,
)

CLUSTER_GENERATOR = ClusterGenerator(
    project_id="sandbox-coe",
    cluster_name='teste-ge-{{ ds }}',
    num_masters=1,
    master_machine_type='n2-standard-8',
    worker_machine_type='n2-standard-8',
    worker_disk_size=500,
    master_disk_size=500,
    master_disk_type='pd-ssd',
    worker_disk_type='pd-ssd',
    image_version='1.5.56-ubuntu18',
    tags=['allow-dataproc-internal'],
    region='us-central1',
    zone='us-central1-f',
    storage_bucket='bucket-dataproc-ge',
    labels={'product': 'sample-label'},
    enable_component_gateway=True,  # this is not working
    optional_components=['JUPYTER', 'ANACONDA'],
).make()

create_cluster = DataprocCreateClusterOperator(
    dag=dag,
    task_id='start_cluster_example',
    cluster_name='teste-ge-{{ ds }}',
    project_id="sandbox-coe",
    cluster_config=CLUSTER_GENERATOR,
    region='us-central1',
)

stop_cluster_example = DataprocDeleteClusterOperator(
    dag=dag,
    task_id='stop_cluster_example',
    cluster_name='teste-ge-{{ ds }}',
    project_id='sandbox-coe',
    region='us-central1',
)  # deletes the running Dataproc cluster

create_cluster >> stop_cluster_example
```
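In case it helps with triage: a possible workaround, under the assumption that `ClusterGenerator.make()` simply drops the flag when building the config dict, is to set the Dataproc REST field `endpoint_config.enable_http_port_access` on the generated dict by hand before passing it to `DataprocCreateClusterOperator`. A minimal sketch (using a stand-in dict instead of the real generator so it runs without Airflow installed):

```python
# Stand-in for CLUSTER_GENERATOR = ClusterGenerator(...).make(), which
# returns a plain dict shaped like the Dataproc ClusterConfig message.
cluster_config = {"master_config": {}, "worker_config": {}}

# Component Gateway corresponds to EndpointConfig.enableHttpPortAccess in
# the Dataproc REST API; patch it in manually (assumed workaround, untested).
cluster_config["endpoint_config"] = {"enable_http_port_access": True}

print(cluster_config["endpoint_config"])
```

With this patch applied, the dict handed to `cluster_config=` should carry the gateway setting regardless of what the generator emits.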
<img width="650" alt="Captura de Tela 2022-02-24 às 14 46 35"
src="https://user-images.githubusercontent.com/92527247/155581482-a8943c98-8959-4cdc-bbf4-0633d0f9b771.png">
<img width="1045" alt="Captura de Tela 2022-02-24 às 15 02 05"
src="https://user-images.githubusercontent.com/92527247/155581497-39069190-8ad2-40e3-b1dd-79733563febc.png">
### What you expected to happen
I expect that when the cluster is created, the web interface components I
enabled at creation time appear, as in the image:
<img width="1045" alt="Captura de Tela 2022-02-24 às 15 02 05"
src="https://user-images.githubusercontent.com/92527247/155581576-c3923836-233c-42aa-ac4c-341d1d4772fa.png">
### How to reproduce
Run the DAG above in Cloud Composer with `DataprocCreateClusterOperator`.
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]