mrichman commented on PR #28338:
URL: https://github.com/apache/airflow/pull/28338#issuecomment-1489526670
> > It's failing on the last line of the test `assert sensor.poke(None)`. What am I doing wrong here?
>
> Can you include some more context (traceback, logs, etc) for the failure you're seeing?

Here's the output of `breeze testing tests ./tests/providers/amazon/aws/sensors/test_dynamodb.py`:
```
============================= test session starts ==============================
platform linux -- Python 3.10.10, pytest-7.2.2, pluggy-1.0.0
rootdir: /opt/airflow, configfile: pytest.ini
plugins: asyncio-0.21.0, httpx-0.21.3, time-machine-2.9.0, instafail-0.4.2, timeouts-1.2.1, requests-mock-1.10.0, anyio-3.6.2, xdist-3.2.1, cov-4.0.0, rerunfailures-11.1.2, capture-warnings-0.0.4
asyncio: mode=strict
setup timeout: 60.0s, execution timeout: 60.0s, teardown timeout: 60.0s
collected 3 items

tests/providers/amazon/aws/sensors/test_dynamodb.py ..F                  [100%]

=================================== FAILURES ===================================
______________ TestDynamoDBValueSensor.test_sensor_with_pk_and_sk ______________

self = <tests.providers.amazon.aws.sensors.test_dynamodb.TestDynamoDBValueSensor object at 0x7f4139d84f10>
ddb_mock = <MagicMock name='DynamoDBHook' id='139917553043200'>
    @mock_dynamodb
    @mock.patch("airflow.providers.amazon.aws.sensors.dynamodb.DynamoDBHook")
    def test_sensor_with_pk_and_sk(self, ddb_mock):
        hook = DynamoDBHook(
            aws_conn_id=AWS_CONN_ID, table_name=TABLE_NAME, table_keys=["PK"], region_name=REGION_NAME
        )
        hook.conn.create_table(
            TableName=TABLE_NAME,
            KeySchema=[
                {"AttributeName": "PK", "KeyType": "HASH"},
                {"AttributeName": "SK", "KeyType": "RANGE"},
            ],
            AttributeDefinitions=[
                {"AttributeName": "PK", "AttributeType": "S"},
                {"AttributeName": "SK", "AttributeType": "S"},
            ],
            BillingMode="PAY_PER_REQUEST",
        )
        table = hook.conn.Table(TABLE_NAME)
        table.meta.client.get_waiter("table_exists").wait(TableName=TABLE_NAME)
        assert table.table_status == "ACTIVE"
        sensor = DynamoDBValueSensor(
            task_id=TASK_ID,
            poke_interval=30,
            timeout=120,
            soft_fail=False,
            retries=10,
            table_name=TABLE_NAME,  # replace with your table name
            partition_key_name="PK",  # replace with your partition key name
            partition_key_value="Test",  # replace with your partition key value
            sort_key_name="SK",  # replace with your sort key name (if applicable)
            sort_key_value="2023-03-28T11:11:25-0400",  # replace with your sort key value (if applicable)
            attribute_name="Foo",  # replace with the attribute name to wait for
            attribute_value="Bar",  # replace with the attribute value to wait for (sensor will return true when this value matches the attribute value in the item)
        )
        assert not sensor.poke(None)
        table.put_item(Item={"PK": "123", "SK": "2023-03-28T11:11:25-0400", "Foo": "Bar"})
>       assert sensor.poke(None)
E       assert False
E        +  where False = <bound method DynamoDBValueSensor.poke of <Task(DynamoDBValueSensor): dynamodb_value_sensor>>(None)
E        +    where <bound method DynamoDBValueSensor.poke of <Task(DynamoDBValueSensor): dynamodb_value_sensor>> = <Task(DynamoDBValueSensor): dynamodb_value_sensor>.poke

tests/providers/amazon/aws/sensors/test_dynamodb.py:104: AssertionError
----------------------------- Captured stderr call -----------------------------
INFO [airflow.hooks.base] Using connection ID 'aws_default' for task execution.
INFO [botocore.credentials] Found credentials in environment variables.
INFO [airflow.task.operators] Checking table test_airflow foritem Partition Key: PK=Test
Sort Key: SK=2023-03-28T11:11:25-0400
attribute: Foo=Bar
INFO [airflow.task.operators] Response: <MagicMock name='DynamoDBHook().conn.Table().get_item()' id='139917471067984'>
INFO [airflow.task.operators] Checking table test_airflow foritem Partition Key: PK=Test
Sort Key: SK=2023-03-28T11:11:25-0400
attribute: Foo=Bar
INFO [airflow.task.operators] Response: <MagicMock name='DynamoDBHook().conn.Table().get_item()' id='139917471067984'>
------------------------------ Captured log call -------------------------------
INFO airflow.hooks.base:base.py:73 Using connection ID 'aws_default' for task execution.
INFO botocore.credentials:credentials.py:1124 Found credentials in environment variables.
INFO airflow.task.operators:dynamodb.py:79 Checking table test_airflow foritem Partition Key: PK=Test
Sort Key: SK=2023-03-28T11:11:25-0400
attribute: Foo=Bar
INFO airflow.task.operators:dynamodb.py:82 Response: <MagicMock name='DynamoDBHook().conn.Table().get_item()' id='139917471067984'>
INFO airflow.task.operators:dynamodb.py:79 Checking table test_airflow foritem Partition Key: PK=Test
Sort Key: SK=2023-03-28T11:11:25-0400
attribute: Foo=Bar
INFO airflow.task.operators:dynamodb.py:82 Response: <MagicMock name='DynamoDBHook().conn.Table().get_item()' id='139917471067984'>
----------- generated xml file: /files/test_result-All-postgres.xml ------------
============================ slowest 100 durations =============================
5.38s setup    tests/providers/amazon/aws/sensors/test_dynamodb.py::TestDynamoDBValueSensor::test_init
0.30s call     tests/providers/amazon/aws/sensors/test_dynamodb.py::TestDynamoDBValueSensor::test_sensor_with_pk_and_sk
0.27s call     tests/providers/amazon/aws/sensors/test_dynamodb.py::TestDynamoDBValueSensor::test_conn_returns_a_boto3_connection
(6 durations < 0.005s hidden. Use -vv to show these durations.)
=========================== short test summary info ============================
FAILED tests/providers/amazon/aws/sensors/test_dynamodb.py::TestDynamoDBValueSensor::test_sensor_with_pk_and_sk - assert False
  + where False = <bound method DynamoDBValueSensor.poke of <Task(DynamoDBValueSensor): dynamodb_value_sensor>>(None)
  + where <bound method DynamoDBValueSensor.poke of <Task(DynamoDBValueSensor): dynamodb_value_sensor>> = <Task(DynamoDBValueSensor): dynamodb_value_sensor>.poke
=================== 1 failed, 2 passed, 6 warnings in 6.27s ====================
Number of warnings: 0 /files/warnings-All-postgres.txt
```
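
One thing stands out in the captured log: `Response: <MagicMock name='DynamoDBHook().conn.Table().get_item()' ...>`. The `mock.patch` on `DynamoDBHook` replaces the hook the sensor builds internally, so `poke` is querying a bare `MagicMock` rather than the moto-backed table the test created, and a `MagicMock` response can never contain the expected attribute. Below is a minimal sketch of the same test with the patch dropped, so moto serves both the setup code and the sensor's own hook; it assumes the sensor instantiates its own `DynamoDBHook` inside `poke`, and the constants mirror the ones in the output above:

```
from moto import mock_dynamodb

from airflow.providers.amazon.aws.hooks.dynamodb import DynamoDBHook
from airflow.providers.amazon.aws.sensors.dynamodb import DynamoDBValueSensor

TABLE_NAME = "test_airflow"
TASK_ID = "dynamodb_value_sensor"


@mock_dynamodb
def test_sensor_with_pk_and_sk():
    # No mock.patch on DynamoDBHook: the table created here and the hook the
    # sensor creates in poke() both talk to the same moto-backed DynamoDB.
    hook = DynamoDBHook(table_name=TABLE_NAME, table_keys=["PK"])
    hook.conn.create_table(
        TableName=TABLE_NAME,
        KeySchema=[
            {"AttributeName": "PK", "KeyType": "HASH"},
            {"AttributeName": "SK", "KeyType": "RANGE"},
        ],
        AttributeDefinitions=[
            {"AttributeName": "PK", "AttributeType": "S"},
            {"AttributeName": "SK", "AttributeType": "S"},
        ],
        BillingMode="PAY_PER_REQUEST",
    )

    sensor = DynamoDBValueSensor(
        task_id=TASK_ID,
        table_name=TABLE_NAME,
        partition_key_name="PK",
        partition_key_value="Test",
        sort_key_name="SK",
        sort_key_value="2023-03-28T11:11:25-0400",
        attribute_name="Foo",
        attribute_value="Bar",
    )

    assert not sensor.poke(None)  # real empty response, not a MagicMock

    # The item's key must match the sensor's partition_key_value ("Test");
    # the original test puts PK="123", so get_item would still find nothing.
    hook.conn.Table(TABLE_NAME).put_item(
        Item={"PK": "Test", "SK": "2023-03-28T11:11:25-0400", "Foo": "Bar"}
    )
    assert sensor.poke(None)
```

If the patch is kept instead, `get_item` would need an explicit return value (e.g. `ddb_mock.return_value.conn.Table.return_value.get_item.return_value = {"Item": {...}}`), since an unconfigured `MagicMock` just returns a fresh mock on every call, which is exactly what the captured log shows. Note also the key mismatch flagged in the comment above: the original test writes `PK: "123"` while the sensor polls for `PK=Test`, so the item wouldn't match even against a real table.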