roveo opened a new issue #11476:
URL: https://github.com/apache/airflow/issues/11476
**Apache Airflow version**:
1.10.10
**Environment**:
Docker on Linux (but it doesn't really matter here).
**What happened**:
`VerticaOperator` uses `vertica_python`. By default, if a query contains multiple
statements, `vertica_python` returns only the first result set on
`cursor.fetchall()` and raises only the exceptions associated with that first
result set.
E.g. a task running this query:
```sql
select 1;
select throw_error('test');
```
executed by the operator will be marked as "Success" and won't show any
exceptions in the logs. `throw_error` is just a minimal way to raise an
exception; the same happens for any error on the Vertica side (most notably,
constraint violations).
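To illustrate the failure mode with a stand-in (a hypothetical DB-API-style cursor, not `vertica_python` itself): the first result set fetches cleanly, and the error from the second statement never surfaces unless the consumer advances to it with `nextset()`:

```python
# Hypothetical stand-in for a multi-result-set DB-API cursor; vertica_python
# behaves analogously for multi-statement queries: a later statement's error
# only surfaces once its result set is actually reached.
class MultiSetCursor:
    def __init__(self, result_sets):
        # Each entry is a list of rows, or an Exception to raise on fetch.
        self._sets = result_sets
        self._index = 0

    def fetchall(self):
        current = self._sets[self._index]
        if isinstance(current, Exception):
            raise current
        return current

    def nextset(self):
        # Advance to the next result set; True while one remains.
        if self._index + 1 < len(self._sets):
            self._index += 1
            return True
        return False


cursor = MultiSetCursor([[(1,)], RuntimeError("Vertica error: test")])
first = cursor.fetchall()  # returns [(1,)] with no hint of the later error
cursor.nextset()           # only advancing exposes the failed statement
try:
    cursor.fetchall()
except RuntimeError as exc:
    print("surfaced:", exc)
```

If the consumer never calls `nextset()`, as in the current operator path, the error stays buried and the task looks successful.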
**What you expected to happen**:
All exceptions to be raised, so the task fails instead of being marked as a success.
DAG:
```py
from airflow import DAG
from airflow.contrib.operators.vertica_operator import VerticaOperator
from airflow.utils.dates import days_ago
query = """
select 1;
select throw_error("test");
"""
with DAG(dag_id="test", schedule_interval="@daily", start_date=days_ago(0)) as dag:
VerticaOperator(task_id="test", sql=query, vertica_conn_id="dwh")
```
Then run:
```sh
airflow test test test 2020-01-01
```
Result on my setup:
```
airflow@d98a1f7740e6:/opt/airflow$ airflow test test test 2020-01-01
[2020-10-12 17:34:34,381] {__init__.py:51} INFO - Using executor SequentialExecutor
[2020-10-12 17:34:34,382] {dagbag.py:396} INFO - Filling up the DagBag from /opt/airflow/dags
[2020-10-12 17:34:36,856] {taskinstance.py:669} INFO - Dependencies all met for <TaskInstance: test.test 2020-01-01T00:00:00+00:00 [None]>
[2020-10-12 17:34:37,038] {taskinstance.py:669} INFO - Dependencies all met for <TaskInstance: test.test 2020-01-01T00:00:00+00:00 [None]>
[2020-10-12 17:34:37,038] {taskinstance.py:879} INFO -
--------------------------------------------------------------------------------
[2020-10-12 17:34:37,039] {taskinstance.py:880} INFO - Starting attempt 1 of 1
[2020-10-12 17:34:37,040] {taskinstance.py:881} INFO -
--------------------------------------------------------------------------------
[2020-10-12 17:34:37,042] {taskinstance.py:900} INFO - Executing <Task(VerticaOperator): test> on 2020-01-01T00:00:00+00:00
[2020-10-12 17:34:37,349] {vertica_operator.py:47} INFO - Executing:
select 1;
select throw_error("test");
[2020-10-12 17:34:37,414] {base_hook.py:87} INFO - Using connection to: id: ***. Host: ***, Port: None, Schema: ***, Login: ***, Password: ***, extra: None
[2020-10-12 17:34:37,506] {dbapi_hook.py:174} INFO -
select 1;
select throw_error("test");
[2020-10-12 17:34:37,618] {taskinstance.py:1065} INFO - Marking task as SUCCESS.dag_id=test, task_id=test, execution_date=20200101T000000, start_date=20201012T173436, end_date=20201012T173437
```
**Fix ideas**:
Ideally, we should run some variation of this code either in the operator or
in the hook:
```py
cursor.execute(query)
res = cursor.fetchone()
while res is not None:
    cursor.nextset()
    res = cursor.fetchone()
```
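A variant that drives the drain loop off `nextset()`'s return value instead of `fetchone()`, so a statement that returns zero rows doesn't stop the walk early (a sketch against the generic DB-API cursor surface; `FakeCursor` below is a hypothetical stand-in for exercising the idea, not part of `vertica_python`):

```python
def execute_and_drain(cursor, query):
    """Execute a possibly multi-statement query and advance through every
    result set, so an error from any later statement propagates instead of
    being silently dropped."""
    cursor.execute(query)
    results = [cursor.fetchall()]
    # nextset() returns a true value while more result sets remain;
    # reaching a failed statement's result set is what surfaces its error.
    while cursor.nextset():
        results.append(cursor.fetchall())
    return results


# Hypothetical stand-in cursor, for demonstration only.
class FakeCursor:
    def __init__(self, result_sets):
        self._template = result_sets

    def execute(self, query):
        self._sets = list(self._template)
        self._index = 0

    def fetchall(self):
        current = self._sets[self._index]
        if isinstance(current, Exception):
            raise current
        return current

    def nextset(self):
        if self._index + 1 < len(self._sets):
            self._index += 1
            return True
        return False
```

With two clean result sets the helper returns both; with a failing second statement the error propagates, which is exactly what would make the task fail instead of reporting success.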
I can create a PR, but the main questions are whether this is considered expected
behaviour (I certainly didn't expect it, and spent quite some time tracking down
where the problem was) and whether the fix belongs in the operator or the hook.