This is an automated email from the ASF dual-hosted git repository.

chengpan pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/kyuubi.git


The following commit(s) were added to refs/heads/master by this push:
     new 9075fbb62 [KYUUBI #6281][PY] Initialize github action for python unit testing
9075fbb62 is described below

commit 9075fbb62392a3578dbb493d8a5900fa1463d106
Author: Harry <[email protected]>
AuthorDate: Tue May 7 18:05:03 2024 +0800

    [KYUUBI #6281][PY] Initialize github action for python unit testing
    
    # :mag: Description
    ## Issue References 🔗
    
    This pull request fixes #6281
    
    ## Describe Your Solution 🔧
    
    The change initializes a CI job that runs unit tests for the Python client, including:
    - Set up a GitHub Actions workflow based on docker-compose (see the sketch after this list)
    - Update test cases; tests now pass for the `presto` and `trino` dialects
    - Temporarily disable the Hive-related tests because the test cases themselves are invalid, not because of connection problems
    - Update dev dependencies to support Python 3.10
    - Speed up testing with the `pytest-xdist` plugin
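    
    For illustration, here is a minimal sketch of the connectivity the `presto` and `trino` dialect tests rely on, assuming the port mappings from the docker-compose file added in this change (Presto published on localhost:8080, Trino on localhost:18080):
    
    ```python
    import contextlib
    
    from pyhive import presto, trino
    
    # Presto coordinator published by the presto-coordinator service
    # (port assumed from docker-compose.yml)
    with contextlib.closing(presto.connect(host='localhost', port=8080).cursor()) as cursor:
        cursor.execute('SELECT 1')
        print(cursor.fetchall())  # a single row containing 1
    
    # Trino published by the trino service (port assumed from docker-compose.yml)
    with contextlib.closing(trino.connect(host='localhost', port=18080).cursor()) as cursor:
        cursor.execute('SELECT 1')
        print(cursor.fetchall())  # a single row containing 1
    ```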
    
    ## Types of changes :bookmark:
    
    - [ ] Bugfix (non-breaking change which fixes an issue)
    - [ ] New feature (non-breaking change which adds functionality)
    - [ ] Breaking change (fix or feature that would cause existing functionality to change)
    
    ## Test Plan 🧪
    
    #### Behavior Without This Pull Request :coffin:
    Not able to run unit tests locally or on CI
    
    #### Behavior With This Pull Request :tada:
    Able to run the tests, with partial coverage of the existing test cases
    
    #### Related Unit Tests
    No
    
    ## Additional notes
    Next steps are to fix the failing test cases, or to skip some of them if necessary (see the sketch below)
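    
    As a reference for that follow-up, a minimal sketch of the skip pattern this change already applies, mirroring the markers added to `pyhive/tests/test_hive.py` (the reason strings below are illustrative):
    
    ```python
    import unittest
    
    import pytest
    
    
    # A class-level marker skips every test in the suite until the cases are fixed
    @pytest.mark.skip(reason="Temporarily disabled: test cases need fixing")
    class TestHive(unittest.TestCase):
    
        # A method-level marker skips a single test case
        @pytest.mark.skip(reason="Requires copying a hive-site.xml into the server environment")
        def test_ldap_connection(self):
            self.fail("never reached while the marker is present")
    ```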
    
    ---
    
    # Checklist 📝
    
    - [ ] This patch was not authored or co-authored using [Generative Tooling](https://www.apache.org/legal/generative-tooling.html)
    
    **Be nice. Be informative.**
    
    Closes #6343 from sudohainguyen/ci/init.
    
    Closes #6281
    
    682e575c4 [Harry] Remove xdist out of scope
    dc42ca1ff [Harry] Pin pytest packages version
    469f1d955 [Harry] Pin ubuntu version
    00cef476a [Harry] Use v4 checkout action
    96ef83148 [Harry] Remove unnecessary steps
    732344a2c [Harry] Add step to tear down containers
    1e2c2481a [Harry] Resolved trino and presto test
    5b33e3924 [Harry] Make tests runnable
    1be033ba3 [Harry] Remove random flag which causes failed test run
    2bc6dc036 [Harry] Switch action setup provider to docker
    ea2a76319 [Harry] Initialize github action for python unit testing
    
    Authored-by: Harry <[email protected]>
    Signed-off-by: Cheng Pan <[email protected]>
---
 .github/workflows/python.yml                       | 65 +++++++++++++++
 python/.gitignore                                  | 19 +++++
 python/dev_requirements.txt                        |  8 +-
 python/docker/conf/presto/catalog/hive.properties  |  2 +
 .../conf}/presto/config.properties                 |  0
 .../travis-conf => docker/conf}/presto/jvm.config  |  0
 .../conf}/presto/node.properties                   |  0
 python/docker/conf/trino/catalog/hive.properties   |  2 +
 .../conf}/trino/config.properties                  |  0
 .../travis-conf => docker/conf}/trino/jvm.config   |  0
 .../conf}/trino/node.properties                    |  0
 python/docker/docker-compose.yml                   | 61 ++++++++++++++
 python/docker/hadoop-hive.env                      | 30 +++++++
 python/pyhive/tests/test_hive.py                   | 16 ++--
 python/pyhive/tests/test_presto.py                 |  2 +
 python/pyhive/tests/test_sqlalchemy_hive.py        |  2 +
 python/pyhive/tests/test_trino.py                  |  6 +-
 .../dropbox/DummyPasswdAuthenticationProvider.java |  0
 .../hive/hive-site-custom.xml                      |  0
 .../{travis-conf => conf}/hive/hive-site-ldap.xml  |  0
 .../{travis-conf => conf}/hive/hive-site.xml       |  0
 python/scripts/install-deps.sh                     | 10 +++
 .../travis-conf/presto/catalog/hive.properties     |  2 -
 .../travis-conf/trino/catalog/hive.properties      |  2 -
 python/scripts/travis-install.sh                   | 96 ----------------------
 python/setup.cfg                                   | 13 +--
 26 files changed, 210 insertions(+), 126 deletions(-)

diff --git a/.github/workflows/python.yml b/.github/workflows/python.yml
new file mode 100644
index 000000000..ac49b0ecf
--- /dev/null
+++ b/.github/workflows/python.yml
@@ -0,0 +1,65 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+name: Python Client
+
+on:
+  push:
+    branches:
+      - master
+      - branch-*
+  pull_request:
+    branches:
+      - master
+      - branch-*
+
+concurrency:
+  group: python-${{ github.head_ref || github.run_id }}
+  cancel-in-progress: true
+
+jobs:
+  unit-test:
+    runs-on: ubuntu-22.04
+    strategy:
+      fail-fast: false
+      matrix:
+        python-version: ["3.8", "3.9", "3.10"]
+    env:
+      PYTHONHASHSEED: random
+    steps:
+      - uses: actions/checkout@v4
+      - name: Set up Python ${{ matrix.python-version }}
+        uses: actions/setup-python@v2
+        with:
+          python-version: ${{ matrix.python-version }}
+      - name: Start Testing Containers
+        run: |
+          cd python/docker/
+          docker compose up -d --wait
+          docker compose exec hive-server /opt/hive/scripts/make_test_tables.sh
+      - name: Install dependencies
+        run: |
+          cd python
+          ./scripts/install-deps.sh
+      - name: Run tests
+        run: |
+          cd python
+          pytest -v
+      - name: Tear down Containers
+        run: |
+          cd python/docker/
+          docker compose down --volumes
diff --git a/python/.gitignore b/python/.gitignore
new file mode 100644
index 000000000..a473be421
--- /dev/null
+++ b/python/.gitignore
@@ -0,0 +1,19 @@
+cover/
+.coverage
+/dist/
+/build/
+.DS_Store
+*.egg
+/env/
+/htmlcov/
+.idea/
+.project
+*.pyc
+.pydevproject
+/*.egg-info/
+.settings
+.cache/
+*.iml
+/scripts/.thrift_gen
+venv/
+.envrc
diff --git a/python/dev_requirements.txt b/python/dev_requirements.txt
index 40bb605a6..333b76eb4 100644
--- a/python/dev_requirements.txt
+++ b/python/dev_requirements.txt
@@ -2,13 +2,11 @@
 flake8==3.4.1
 mock==2.0.0
 pycodestyle==2.3.1
-pytest==3.2.1
-pytest-cov==2.5.1
-pytest-flake8==0.8.1
-pytest-random==0.2
-pytest-timeout==1.2.0
+pytest==7.4.4
+pytest-cov==5.0.0
 
 # actual dependencies: let things break if a package changes
+sqlalchemy>=1.3.0
 requests>=1.0.0
 requests_kerberos>=0.12.0
 sasl>=0.2.1
diff --git a/python/docker/conf/presto/catalog/hive.properties b/python/docker/conf/presto/catalog/hive.properties
new file mode 100644
index 000000000..628738ac6
--- /dev/null
+++ b/python/docker/conf/presto/catalog/hive.properties
@@ -0,0 +1,2 @@
+connector.name=hive-hadoop2
+hive.metastore.uri=thrift://hive-metastore:9083
diff --git a/python/scripts/travis-conf/presto/config.properties b/python/docker/conf/presto/config.properties
similarity index 100%
rename from python/scripts/travis-conf/presto/config.properties
rename to python/docker/conf/presto/config.properties
diff --git a/python/scripts/travis-conf/presto/jvm.config b/python/docker/conf/presto/jvm.config
similarity index 100%
rename from python/scripts/travis-conf/presto/jvm.config
rename to python/docker/conf/presto/jvm.config
diff --git a/python/scripts/travis-conf/presto/node.properties b/python/docker/conf/presto/node.properties
similarity index 100%
rename from python/scripts/travis-conf/presto/node.properties
rename to python/docker/conf/presto/node.properties
diff --git a/python/docker/conf/trino/catalog/hive.properties b/python/docker/conf/trino/catalog/hive.properties
new file mode 100644
index 000000000..628738ac6
--- /dev/null
+++ b/python/docker/conf/trino/catalog/hive.properties
@@ -0,0 +1,2 @@
+connector.name=hive-hadoop2
+hive.metastore.uri=thrift://hive-metastore:9083
diff --git a/python/scripts/travis-conf/trino/config.properties b/python/docker/conf/trino/config.properties
similarity index 100%
rename from python/scripts/travis-conf/trino/config.properties
rename to python/docker/conf/trino/config.properties
diff --git a/python/scripts/travis-conf/trino/jvm.config b/python/docker/conf/trino/jvm.config
similarity index 100%
rename from python/scripts/travis-conf/trino/jvm.config
rename to python/docker/conf/trino/jvm.config
diff --git a/python/scripts/travis-conf/trino/node.properties b/python/docker/conf/trino/node.properties
similarity index 100%
rename from python/scripts/travis-conf/trino/node.properties
rename to python/docker/conf/trino/node.properties
diff --git a/python/docker/docker-compose.yml b/python/docker/docker-compose.yml
new file mode 100644
index 000000000..f88989744
--- /dev/null
+++ b/python/docker/docker-compose.yml
@@ -0,0 +1,61 @@
+version: "3"
+
+services:
+  namenode:
+    image: bde2020/hadoop-namenode:2.0.0-hadoop2.7.4-java8
+    volumes:
+      - namenode:/hadoop/dfs/name
+    environment:
+      - CLUSTER_NAME=test
+    env_file:
+      - hadoop-hive.env
+    ports:
+      - "50070:50070"
+  datanode:
+    image: bde2020/hadoop-datanode:2.0.0-hadoop2.7.4-java8
+    volumes:
+      - datanode:/hadoop/dfs/data
+    env_file:
+      - hadoop-hive.env
+    environment:
+      SERVICE_PRECONDITION: "namenode:50070"
+    ports:
+      - "50075:50075"
+  hive-server:
+    image: bde2020/hive:2.3.2-postgresql-metastore
+    env_file:
+      - hadoop-hive.env
+    volumes:
+      - ../scripts:/opt/hive/scripts
+    environment:
+      HIVE_CORE_CONF_javax_jdo_option_ConnectionURL: "jdbc:postgresql://hive-metastore/metastore"
+      SERVICE_PRECONDITION: "hive-metastore:9083"
+    ports:
+      - "10000:10000"
+  hive-metastore:
+    image: bde2020/hive:2.3.2-postgresql-metastore
+    env_file:
+      - hadoop-hive.env
+    command: /opt/hive/bin/hive --service metastore
+    environment:
+      SERVICE_PRECONDITION: "namenode:50070 datanode:50075 hive-metastore-postgresql:5432"
+    ports:
+      - "9083:9083"
+  hive-metastore-postgresql:
+    image: bde2020/hive-metastore-postgresql:2.3.0
+  presto-coordinator:
+    image: shawnzhu/prestodb:0.181
+    ports:
+      - "8080:8080"
+    volumes:
+      - ./conf/presto/:/etc/presto
+  trino:
+    image: trinodb/trino:351
+    ports:     
+      - "18080:18080"
+    volumes:
+      - ./conf/trino:/etc/trino
+
+volumes:
+  namenode:
+  datanode:
diff --git a/python/docker/hadoop-hive.env b/python/docker/hadoop-hive.env
new file mode 100644
index 000000000..3da87a94c
--- /dev/null
+++ b/python/docker/hadoop-hive.env
@@ -0,0 +1,30 @@
+HIVE_SITE_CONF_javax_jdo_option_ConnectionURL=jdbc:postgresql://hive-metastore-postgresql/metastore
+HIVE_SITE_CONF_javax_jdo_option_ConnectionDriverName=org.postgresql.Driver
+HIVE_SITE_CONF_javax_jdo_option_ConnectionUserName=hive
+HIVE_SITE_CONF_javax_jdo_option_ConnectionPassword=hive
+HIVE_SITE_CONF_datanucleus_autoCreateSchema=false
+HIVE_SITE_CONF_hive_metastore_uris=thrift://hive-metastore:9083
+HDFS_CONF_dfs_namenode_datanode_registration_ip___hostname___check=false
+
+CORE_CONF_fs_defaultFS=hdfs://namenode:8020
+CORE_CONF_hadoop_http_staticuser_user=root
+CORE_CONF_hadoop_proxyuser_hue_hosts=*
+CORE_CONF_hadoop_proxyuser_hue_groups=*
+
+HDFS_CONF_dfs_webhdfs_enabled=true
+HDFS_CONF_dfs_permissions_enabled=false
+
+YARN_CONF_yarn_log___aggregation___enable=true
+YARN_CONF_yarn_resourcemanager_recovery_enabled=true
+YARN_CONF_yarn_resourcemanager_store_class=org.apache.hadoop.yarn.server.resourcemanager.recovery.FileSystemRMStateStore
+YARN_CONF_yarn_resourcemanager_fs_state___store_uri=/rmstate
+YARN_CONF_yarn_nodemanager_remote___app___log___dir=/app-logs
+YARN_CONF_yarn_log_server_url=http://historyserver:8188/applicationhistory/logs/
+YARN_CONF_yarn_timeline___service_enabled=true
+YARN_CONF_yarn_timeline___service_generic___application___history_enabled=true
+YARN_CONF_yarn_resourcemanager_system___metrics___publisher_enabled=true
+YARN_CONF_yarn_resourcemanager_hostname=resourcemanager
+YARN_CONF_yarn_timeline___service_hostname=historyserver
+YARN_CONF_yarn_resourcemanager_address=resourcemanager:8032
+YARN_CONF_yarn_resourcemanager_scheduler_address=resourcemanager:8030
+YARN_CONF_yarn_resourcemanager_resource__tracker_address=resourcemanager:8031
diff --git a/python/pyhive/tests/test_hive.py b/python/pyhive/tests/test_hive.py
index b49fc1904..73fda6334 100644
--- a/python/pyhive/tests/test_hive.py
+++ b/python/pyhive/tests/test_hive.py
@@ -17,6 +17,7 @@ import unittest
 from decimal import Decimal
 
 import mock
+import pytest
 import thrift.transport.TSocket
 import thrift.transport.TTransport
 import thrift_sasl
@@ -30,11 +31,12 @@ from pyhive.tests.dbapi_test_case import with_cursor
 _HOST = 'localhost'
 
 
[email protected](reason="Temporary disabled")
 class TestHive(unittest.TestCase, DBAPITestCase):
     __test__ = True
 
     def connect(self):
-        return hive.connect(host=_HOST, configuration={'mapred.job.tracker': 'local'})
+        return hive.connect(host=_HOST, port=10000, configuration={'mapred.job.tracker': 'local'})
 
     @with_cursor
     def test_description(self, cursor):
@@ -151,10 +153,11 @@ class TestHive(unittest.TestCase, DBAPITestCase):
         self.assertIsNone(cursor.description)
         self.assertRaises(hive.ProgrammingError, cursor.fetchone)
 
+    @pytest.mark.skip
     def test_ldap_connection(self):
         rootdir = os.path.dirname(os.path.dirname(os.path.dirname(__file__)))
-        orig_ldap = os.path.join(rootdir, 'scripts', 'travis-conf', 'hive', 'hive-site-ldap.xml')
-        orig_none = os.path.join(rootdir, 'scripts', 'travis-conf', 'hive', 'hive-site.xml')
+        orig_ldap = os.path.join(rootdir, 'scripts', 'conf', 'hive', 'hive-site-ldap.xml')
+        orig_none = os.path.join(rootdir, 'scripts', 'conf', 'hive', 'hive-site.xml')
         des = os.path.join('/', 'etc', 'hive', 'conf', 'hive-site.xml')
         try:
             subprocess.check_call(['sudo', 'cp', orig_ldap, des])
@@ -209,11 +212,12 @@ class TestHive(unittest.TestCase, DBAPITestCase):
             with contextlib.closing(conn.cursor()) as cursor:
                 cursor.execute('SELECT * FROM one_row')
                 self.assertEqual(cursor.fetchall(), [(1,)])
-
+    
+    @pytest.mark.skip
     def test_custom_connection(self):
         rootdir = os.path.dirname(os.path.dirname(os.path.dirname(__file__)))
-        orig_ldap = os.path.join(rootdir, 'scripts', 'travis-conf', 'hive', 'hive-site-custom.xml')
-        orig_none = os.path.join(rootdir, 'scripts', 'travis-conf', 'hive', 'hive-site.xml')
+        orig_ldap = os.path.join(rootdir, 'scripts', 'conf', 'hive', 'hive-site-custom.xml')
+        orig_none = os.path.join(rootdir, 'scripts', 'conf', 'hive', 'hive-site.xml')
         des = os.path.join('/', 'etc', 'hive', 'conf', 'hive-site.xml')
         try:
             subprocess.check_call(['sudo', 'cp', orig_ldap, des])
diff --git a/python/pyhive/tests/test_presto.py b/python/pyhive/tests/test_presto.py
index 187b1c214..dd60a7cdf 100644
--- a/python/pyhive/tests/test_presto.py
+++ b/python/pyhive/tests/test_presto.py
@@ -13,6 +13,7 @@ from decimal import Decimal
 
 import requests
 
+import pytest
 from pyhive import exc
 from pyhive import presto
 from pyhive.tests.dbapi_test_case import DBAPITestCase
@@ -231,6 +232,7 @@ class TestPresto(unittest.TestCase, DBAPITestCase):
             ).cursor()
         )
 
+    @pytest.mark.skip(reason='This test requires a proxy server running on localhost:9999')
     def test_requests_kwargs(self):
         connection = presto.connect(
             host=_HOST, port=_PORT, source=self.id(),
diff --git a/python/pyhive/tests/test_sqlalchemy_hive.py b/python/pyhive/tests/test_sqlalchemy_hive.py
index 790bec4c3..28968ab62 100644
--- a/python/pyhive/tests/test_sqlalchemy_hive.py
+++ b/python/pyhive/tests/test_sqlalchemy_hive.py
@@ -5,6 +5,7 @@ from pyhive.sqlalchemy_hive import HiveDate
 from pyhive.sqlalchemy_hive import HiveDecimal
 from pyhive.sqlalchemy_hive import HiveTimestamp
 from sqlalchemy.exc import NoSuchTableError, OperationalError
+import pytest
 from pyhive.tests.sqlalchemy_test_case import SqlAlchemyTestCase
 from pyhive.tests.sqlalchemy_test_case import with_engine_connection
 from sqlalchemy import types
@@ -60,6 +61,7 @@ _ONE_ROW_COMPLEX_CONTENTS = [
 # ]
 
 
[email protected](reason="Temporarily disabled")
 class TestSqlAlchemyHive(unittest.TestCase, SqlAlchemyTestCase):
     def create_engine(self):
         return create_engine('hive://localhost:10000/default')
diff --git a/python/pyhive/tests/test_trino.py b/python/pyhive/tests/test_trino.py
index 41bb489b6..59db54607 100644
--- a/python/pyhive/tests/test_trino.py
+++ b/python/pyhive/tests/test_trino.py
@@ -70,10 +70,10 @@ class TestTrino(TestPresto):
             ('timestamp', 'timestamp', None, None, None, None, True),
             ('binary', 'varbinary', None, None, None, None, True),
             ('array', 'array(integer)', None, None, None, None, True),
-            ('map', 'map(integer,integer)', None, None, None, None, True),
-            ('struct', 'row(a integer,b integer)', None, None, None, None, True),
+            ('map', 'map(integer, integer)', None, None, None, None, True),
+            ('struct', 'row(a integer, b integer)', None, None, None, None, True),
             # ('union', 'varchar', None, None, None, None, True),
-            ('decimal', 'decimal(10,1)', None, None, None, None, True),
+            ('decimal', 'decimal(10, 1)', None, None, None, None, True),
         ])
         rows = cursor.fetchall()
         expected = [(
diff --git a/python/scripts/travis-conf/com/dropbox/DummyPasswdAuthenticationProvider.java b/python/scripts/conf/com/dropbox/DummyPasswdAuthenticationProvider.java
similarity index 100%
rename from python/scripts/travis-conf/com/dropbox/DummyPasswdAuthenticationProvider.java
rename to python/scripts/conf/com/dropbox/DummyPasswdAuthenticationProvider.java
diff --git a/python/scripts/travis-conf/hive/hive-site-custom.xml b/python/scripts/conf/hive/hive-site-custom.xml
similarity index 100%
rename from python/scripts/travis-conf/hive/hive-site-custom.xml
rename to python/scripts/conf/hive/hive-site-custom.xml
diff --git a/python/scripts/travis-conf/hive/hive-site-ldap.xml b/python/scripts/conf/hive/hive-site-ldap.xml
similarity index 100%
rename from python/scripts/travis-conf/hive/hive-site-ldap.xml
rename to python/scripts/conf/hive/hive-site-ldap.xml
diff --git a/python/scripts/travis-conf/hive/hive-site.xml b/python/scripts/conf/hive/hive-site.xml
similarity index 100%
rename from python/scripts/travis-conf/hive/hive-site.xml
rename to python/scripts/conf/hive/hive-site.xml
diff --git a/python/scripts/install-deps.sh b/python/scripts/install-deps.sh
new file mode 100755
index 000000000..a67c44ecd
--- /dev/null
+++ b/python/scripts/install-deps.sh
@@ -0,0 +1,10 @@
+#!/bin/bash -eux
+
+source /etc/lsb-release
+
+sudo apt-get -q update
+sudo apt-get -q install -y g++ libsasl2-dev libkrb5-dev
+
+pip install --upgrade pip
+pip install -r dev_requirements.txt
+pip install -e .
diff --git a/python/scripts/travis-conf/presto/catalog/hive.properties b/python/scripts/travis-conf/presto/catalog/hive.properties
deleted file mode 100644
index 5129f3c37..000000000
--- a/python/scripts/travis-conf/presto/catalog/hive.properties
+++ /dev/null
@@ -1,2 +0,0 @@
-connector.name=hive-hadoop2
-hive.metastore.uri=thrift://localhost:9083
diff --git a/python/scripts/travis-conf/trino/catalog/hive.properties b/python/scripts/travis-conf/trino/catalog/hive.properties
deleted file mode 100644
index 5129f3c37..000000000
--- a/python/scripts/travis-conf/trino/catalog/hive.properties
+++ /dev/null
@@ -1,2 +0,0 @@
-connector.name=hive-hadoop2
-hive.metastore.uri=thrift://localhost:9083
diff --git a/python/scripts/travis-install.sh b/python/scripts/travis-install.sh
deleted file mode 100755
index 5bca8d98b..000000000
--- a/python/scripts/travis-install.sh
+++ /dev/null
@@ -1,96 +0,0 @@
-#!/bin/bash -eux
-
-source /etc/lsb-release
-
-echo "deb [arch=amd64] 
https://archive.cloudera.com/${CDH}/ubuntu/${DISTRIB_CODENAME}/amd64/cdh 
${DISTRIB_CODENAME}-cdh${CDH_VERSION} contrib
-deb-src 
https://archive.cloudera.com/${CDH}/ubuntu/${DISTRIB_CODENAME}/amd64/cdh 
${DISTRIB_CODENAME}-cdh${CDH_VERSION} contrib" | sudo tee 
/etc/apt/sources.list.d/cloudera.list
-sudo apt-key adv --keyserver keyserver.ubuntu.com --recv-keys 327574EE02A818DD
-sudo apt-get -q update
-
-sudo apt-get -q install -y oracle-java8-installer python-dev g++ libsasl2-dev maven
-sudo update-java-alternatives -s java-8-oracle
-
-#
-# LDAP
-#
-sudo apt-get -q -y --no-install-suggests --no-install-recommends --force-yes install ldap-utils slapd
-sudo mkdir -p /tmp/slapd
-sudo slapd -f $(dirname $0)/ldap_config/slapd.conf -h ldap://localhost:3389 &
-while ! nc -vz localhost 3389; do sleep 1; done
-sudo ldapadd -h localhost:3389 -D cn=admin,dc=example,dc=com -w test -f $(dirname $0)/../pyhive/tests/ldif_data/base.ldif
-sudo ldapadd -h localhost:3389 -D cn=admin,dc=example,dc=com -w test -f $(dirname $0)/../pyhive/tests/ldif_data/INITIAL_TESTDATA.ldif
-
-#
-# Hive
-#
-
-sudo apt-get -q install -y --force-yes hive
-
-javac -cp /usr/lib/hive/lib/hive-service.jar $(dirname $0)/travis-conf/com/dropbox/DummyPasswdAuthenticationProvider.java
-jar cf $(dirname $0)/dummy-auth.jar -C $(dirname $0)/travis-conf com
-sudo cp $(dirname $0)/dummy-auth.jar /usr/lib/hive/lib
-
-# Hack around broken symlink in Hive's installation
-# /usr/lib/hive/lib/zookeeper.jar -> ../../zookeeper/zookeeper.jar
-# Without this, Hive fails to start up due to failing to find ZK classes.
-sudo ln -nsfv /usr/share/java/zookeeper.jar /usr/lib/hive/lib/zookeeper.jar
-
-sudo mkdir -p /user/hive
-sudo chown hive:hive /user/hive
-sudo cp $(dirname $0)/travis-conf/hive/hive-site.xml /etc/hive/conf/hive-site.xml
-sudo apt-get -q install -y --force-yes hive-metastore hive-server2 || (grep . /var/log/hive/* && exit 2)
-
-while ! nc -vz localhost 9083; do sleep 1; done
-while ! nc -vz localhost 10000; do sleep 1; done
-
-sudo -Eu hive $(dirname $0)/make_test_tables.sh
-
-#
-# Presto
-#
-
-sudo apt-get -q install -y python # Use python2 for presto server
-
-mvn -q org.apache.maven.plugins:maven-dependency-plugin:3.0.0:copy \
-    -Dartifact=com.facebook.presto:presto-server:${PRESTO}:tar.gz \
-    -DoutputDirectory=.
-tar -x -z -f presto-server-*.tar.gz
-rm -rf presto-server
-mv presto-server-*/ presto-server
-
-cp -r $(dirname $0)/travis-conf/presto presto-server/etc
-
-/usr/bin/python2.7 presto-server/bin/launcher.py start
-
-#
-# Trino
-#
-
-sudo apt-get -q install -y python # Use python2 for trino server
-
-mvn -q org.apache.maven.plugins:maven-dependency-plugin:3.0.0:copy \
-    -Dartifact=io.trino:trino-server:${TRINO}:tar.gz \
-    -DoutputDirectory=.
-tar -x -z -f trino-server-*.tar.gz
-rm -rf trino-server
-mv trino-server-*/ trino-server
-
-cp -r $(dirname $0)/travis-conf/trino trino-server/etc
-
-/usr/bin/python2.7 trino-server/bin/launcher.py start
-
-#
-# Python
-#
-
-pip install $SQLALCHEMY
-pip install -e .
-pip install -r dev_requirements.txt
-
-# Sleep so Presto has time to start up.
-# Otherwise we might get 'No nodes available to run query' or 'Presto server is still initializing'
-while ! grep -q 'SERVER STARTED' /tmp/presto/data/var/log/server.log; do sleep 1; done
-
-# Sleep so Trino has time to start up.
-# Otherwise we might get 'No nodes available to run query' or 'Presto server is still initializing'
-while ! grep -q 'SERVER STARTED' /tmp/trino/data/var/log/server.log; do sleep 1; done
diff --git a/python/setup.cfg b/python/setup.cfg
index e19ce9d30..a26237aac 100644
--- a/python/setup.cfg
+++ b/python/setup.cfg
@@ -2,20 +2,9 @@
 tag_build =
 
 [tool:pytest]
-timeout = 100
-timeout_method = thread
-addopts = --random --tb=short --cov pyhive --cov-report html --cov-report term --flake8
+addopts = --tb=short --cov pyhive --cov-report html --cov-report term
 norecursedirs = env
 python_files = test_*.py
-flake8-max-line-length = 100
-flake8-ignore =
-    TCLIService/*.py ALL
-    pyhive/sqlalchemy_backports.py ALL
-    presto-server/** ALL
-    pyhive/hive.py F405
-    pyhive/presto.py F405
-    pyhive/trino.py F405
-    W503
 filterwarnings =
     error
     # For Python 2 flake8
