Taragolis commented on issue #23900: URL: https://github.com/apache/airflow/issues/23900#issuecomment-1137357273
@eladkal much better than fetching all logs through Airflow, because someone could spawn a Glue Job with 40 DPU, and in that case Airflow would have to fetch logs from 80 log prefixes (error and output).

Unfortunately it is not easy to integrate with the AWS UI. For example, the developer logs in the browser into `Account A` while Airflow executes the Glue Job in `Account B` - that means the link wouldn't work. But even this is better than fetching all Glue logs into Airflow.

Another disgusting fact which I found when I tried to implement an extra link for Glue job execution (spoiler alert - I abandoned work on it) is the URL pattern:

*Glue Studio*
`https://{region_name}.console.{aws_console_endpoint}/gluestudio/home?region={region_name}#/job/{job_name}/run/{job_id}`

*Glue Job (Legacy)*
`https://{region_name}.console.{aws_console_endpoint}/glue/home?region={region_name}#jobRun:jobName={job_name};jobRunId={job_id}`

Where `aws_console_endpoint` is one of:
- *aws.amazon.com* - AWS General
- *amazonaws.cn* - AWS China
- *amazonaws-us-gov.com* - AWS GovCloud (US)

Also `region_name` could be defined:
- In the operator
- In the connection
- As an environment variable

All these facts make it difficult to develop an Extra Link for AWS Glue (as well as for other AWS services), but nothing is impossible: e.g. create the link during `execute()`, save it in XCom, and finally fetch it from the extra link (rough sketch below).
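Roughly what I mean, as a minimal sketch - assuming Airflow >= 2.3 (for the `ti_key`-based `get_link()` signature and `XCom.get_value()`); the XCom key, the `build_glue_studio_run_url` helper and the link class are all made up here for illustration, not an existing provider API:

```python
from __future__ import annotations

from airflow.models import BaseOperatorLink, XCom

# Hypothetical XCom key, used only by this sketch.
XCOM_GLUE_RUN_URL_KEY = "glue_job_run_console_url"

# Glue Studio run URL pattern from above; aws_console_endpoint depends on the partition:
#   aws.amazon.com (AWS General), amazonaws.cn (AWS China), amazonaws-us-gov.com (GovCloud US)
GLUE_STUDIO_RUN_URL = (
    "https://{region_name}.console.{aws_console_endpoint}"
    "/gluestudio/home?region={region_name}#/job/{job_name}/run/{job_id}"
)


def build_glue_studio_run_url(
    region_name: str,
    job_name: str,
    job_id: str,
    aws_console_endpoint: str = "aws.amazon.com",
) -> str:
    """Render the console URL once region, job name and run id are actually known."""
    return GLUE_STUDIO_RUN_URL.format(
        region_name=region_name,
        aws_console_endpoint=aws_console_endpoint,
        job_name=job_name,
        job_id=job_id,
    )


class GlueJobRunLink(BaseOperatorLink):
    """Extra link that only reads back a URL which execute() already resolved and pushed."""

    name = "AWS Glue Job Run"

    def get_link(self, operator, *, ti_key) -> str:
        # The webserver doesn't have to resolve region/partition/account at all:
        # whatever execute() pushed into XCom is what the link returns.
        return XCom.get_value(ti_key=ti_key, key=XCOM_GLUE_RUN_URL_KEY) or ""
```

and on the operator side, somewhere at the end of `execute()` (region resolved from the operator argument, the connection or the environment variable - whichever applies):

```python
# `resolved_region` and `job_run_id` are placeholders for values the operator already has.
url = build_glue_studio_run_url(
    region_name=resolved_region,
    job_name=self.job_name,
    job_id=job_run_id,
)
context["ti"].xcom_push(key=XCOM_GLUE_RUN_URL_KEY, value=url)
```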
@virendhar-aws WDYT?