[GitHub] [airflow] nuclearpinguin commented on a change in pull request #7689: [AIRFLOW-7038] Cassandra sensors tests

2020-03-15 Thread GitBox
nuclearpinguin commented on a change in pull request #7689: [AIRFLOW-7038] 
Cassandra sensors tests
URL: https://github.com/apache/airflow/pull/7689#discussion_r392713867
 
 

 ##
 File path: tests/providers/apache/cassandra/sensors/test_record.py
 ##
 @@ -0,0 +1,71 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import unittest
+
+import pytest
+
+from airflow import DAG
+from airflow.providers.apache.cassandra.hooks.cassandra import CassandraHook
+from airflow.providers.apache.cassandra.sensors.record import CassandraRecordSensor
+from airflow.utils import timezone
+
+DEFAULT_DATE = timezone.datetime(2017, 1, 1)
+
+
+@pytest.mark.integration("cassandra")
+class TestCassandraRecordSensor(unittest.TestCase):
+    def setUp(self) -> None:
+        args = {
+            'owner': 'airflow',
+            'start_date': DEFAULT_DATE
+        }
+
+        self.dag = DAG('test_dag_id', default_args=args)
+
+    def test_poke(self):
+        hook = CassandraHook(cassandra_conn_id="cassandra_default")
+        session = hook.get_conn()
+        cqls = [
+            "DROP TABLE IF EXISTS s.t",
+            "CREATE TABLE s.t (pk1 text, pk2 text, c text, PRIMARY KEY (pk1, pk2))",
+            "INSERT INTO s.t (pk1, pk2, c) VALUES ('foo', 'bar', 'baz')",
+        ]
+        for cql in cqls:
+            session.execute(cql)
 
 Review comment:
   I would mock `CassandraHook` and:
   - check that it was initialized with the required parameters
   - check that `record_exists` on the initialized `hook` was called with the expected parameters
   - check that the expected value was returned (by mocking it)
   - make sure that all methods I mock are already covered by tests. In this case, I would check the tests for `CassandraHook`
   
   At least that is the approach we use for GCP operators / hooks. Once you cover the hook with tests, you can mock the hook in the sensor tests; see the sketch below. In my opinion, it's a nice way to separate and highlight what exactly is tested.
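   For illustration, a minimal sketch of such a mocked test could look like the following. It assumes `poke` builds a `CassandraHook` from the configured connection id and delegates to `record_exists(table, keys)`; the table name and keys are placeholder values, not real fixtures:

```python
import unittest
from unittest import mock

from airflow.providers.apache.cassandra.sensors.record import CassandraRecordSensor

# Hypothetical fixtures for the sketch.
TEST_TABLE = 'keyspace.table'
TEST_KEYS = {'pk1': 'foo', 'pk2': 'bar'}


class TestCassandraRecordSensorMocked(unittest.TestCase):
    # Patch the hook where the sensor imports it, so no real cluster is needed.
    @mock.patch('airflow.providers.apache.cassandra.sensors.record.CassandraHook')
    def test_poke(self, mock_hook):
        # The mocked result stands in for a real Cassandra lookup.
        mock_hook.return_value.record_exists.return_value = True

        sensor = CassandraRecordSensor(
            task_id='test_task',
            cassandra_conn_id='cassandra_default',
            table=TEST_TABLE,
            keys=TEST_KEYS,
        )
        result = sensor.poke({})

        # The sensor should report what the hook returned ...
        self.assertTrue(result)
        # ... after building the hook from the configured connection id ...
        mock_hook.assert_called_once_with('cassandra_default')
        # ... and querying it with the configured table and keys.
        mock_hook.return_value.record_exists.assert_called_once_with(TEST_TABLE, TEST_KEYS)
```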
   
   Of course, mocking only asserts that arguments are passed as expected and that certain methods are called. To check end-to-end integrity we should use a system test (one that runs an example DAG). This is still being discussed, but it is easy to do with `SystemTest`:
   https://github.com/apache/airflow/blob/master/TESTING.rst#id24
   




[GitHub] [airflow] nuclearpinguin commented on a change in pull request #7689: [AIRFLOW-7038] Cassandra sensors tests

2020-03-14 Thread GitBox
nuclearpinguin commented on a change in pull request #7689: [AIRFLOW-7038] 
Cassandra sensors tests
URL: https://github.com/apache/airflow/pull/7689#discussion_r392589839
 
 

 ##
 File path: tests/providers/apache/cassandra/sensors/test_record.py
 ##
 @@ -0,0 +1,71 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import unittest
+
+import pytest
+
+from airflow import DAG
+from airflow.providers.apache.cassandra.hooks.cassandra import CassandraHook
+from airflow.providers.apache.cassandra.sensors.record import CassandraRecordSensor
+from airflow.utils import timezone
+
+DEFAULT_DATE = timezone.datetime(2017, 1, 1)
+
+
+@pytest.mark.integration("cassandra")
+class TestCassandraRecordSensor(unittest.TestCase):
+    def setUp(self) -> None:
+        args = {
+            'owner': 'airflow',
+            'start_date': DEFAULT_DATE
+        }
+
+        self.dag = DAG('test_dag_id', default_args=args)
+
+    def test_poke(self):
+        hook = CassandraHook(cassandra_conn_id="cassandra_default")
+        session = hook.get_conn()
+        cqls = [
+            "DROP TABLE IF EXISTS s.t",
+            "CREATE TABLE s.t (pk1 text, pk2 text, c text, PRIMARY KEY (pk1, pk2))",
+            "INSERT INTO s.t (pk1, pk2, c) VALUES ('foo', 'bar', 'baz')",
+        ]
+        for cql in cqls:
+            session.execute(cql)
 
 Review comment:
   This is not a pure unit test and it has side effects (the table is not dropped in tearDown). I would suggest mocking the connection to the database (this will make the test faster) and adding a system test (and an example DAG); a rough sketch follows below. For example, here is the system test for PostgresToGCS:
   https://github.com/apache/airflow/blob/master/tests/providers/google/cloud/operators/test_postgres_to_gcs_system.py
   
   System tests are not run regularly yet, but this will hopefully change soon, as @potiuk is working on that :)
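   For illustration, such a system test could look roughly like the sketch below. It assumes the `SystemTest` base class and its `run_dag` helper described in TESTING.rst, plus an example DAG named `example_cassandra_sensors` in the provider's `example_dags` folder; the DAG name, folder path, and pytest marker are placeholders, not existing code:

```python
import os

import pytest

# SystemTest and its run_dag() helper are described in TESTING.rst; both are
# assumed here rather than verified against the current code base.
from tests.test_utils.system_tests_class import SystemTest

# Hypothetical location of the provider's example DAGs.
CASSANDRA_DAG_FOLDER = os.path.join(
    "airflow", "providers", "apache", "cassandra", "example_dags"
)


@pytest.mark.system("apache.cassandra")  # marker name is an assumption
class CassandraExampleDagsSystemTest(SystemTest):
    def test_run_example_dag(self):
        # Runs the (hypothetical) example DAG end to end against a real Cassandra instance.
        self.run_dag(dag_id="example_cassandra_sensors", dag_folder=CASSANDRA_DAG_FOLDER)
```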

