ozw1z5rd opened a new issue #10200:
URL: https://github.com/apache/airflow/issues/10200


   **Apache Airflow version**:
   1.10.10
   
   **Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
   none
   **Environment**:
   
   - **Cloud provider or hardware configuration**:
   - **OS** (e.g. from /etc/os-release):
   NAME="CentOS Linux"
   VERSION="7 (Core)"
   ID="centos"
   ID_LIKE="rhel fedora"
   VERSION_ID="7"
   PRETTY_NAME="CentOS Linux 7 (Core)"
   ANSI_COLOR="0;31"
   CPE_NAME="cpe:/o:centos:centos:7"
   HOME_URL="https://www.centos.org/"
   BUG_REPORT_URL="https://bugs.centos.org/"
   
   CENTOS_MANTISBT_PROJECT="CentOS-7"
   CENTOS_MANTISBT_PROJECT_VERSION="7"
   REDHAT_SUPPORT_PRODUCT="centos"
   REDHAT_SUPPORT_PRODUCT_VERSION="7"
   - **Kernel** (e.g. `uname -a`):
   Linux mid1-e-1 3.10.0-514.2.2.el7.x86_64 #1 SMP Tue Dec 6 23:06:41 UTC 2016 
x86_64 x86_64 x86_64 GNU/Linux
   - **Install tools**:
   pip yum
   - **Others**:
   
   **What happened**:
   
   If there are no DAGs, the last page button ( >> ) links to
   
   `http://airflow-test.buongiorno.com/home?search=&page=-1` — it should be 
_page=0_ 
   
   Following that link leads to this exception:
   
   ```
   Node: datalake-test.docomodigital.com
   
-------------------------------------------------------------------------------
   Traceback (most recent call last):
     File 
"/jhub/_prod/server_global_unifieddata_hadoop_airflow_daemon/lib/python2.7/site-packages/flask/app.py",
 line 2446, in wsgi_app
       response = self.full_dispatch_request()
     File 
"/jhub/_prod/server_global_unifieddata_hadoop_airflow_daemon/lib/python2.7/site-packages/flask/app.py",
 line 1951, in full_dispatch_request
       rv = self.handle_user_exception(e)
     File 
"/jhub/_prod/server_global_unifieddata_hadoop_airflow_daemon/lib/python2.7/site-packages/flask/app.py",
 line 1820, in handle_user_exception
       reraise(exc_type, exc_value, tb)
     File 
"/jhub/_prod/server_global_unifieddata_hadoop_airflow_daemon/lib/python2.7/site-packages/flask/app.py",
 line 1949, in full_dispatch_request
       rv = self.dispatch_request()
     File 
"/jhub/_prod/server_global_unifieddata_hadoop_airflow_daemon/lib/python2.7/site-packages/flask/app.py",
 line 1935, in dispatch_request
       return self.view_functions[rule.endpoint](**req.view_args)
     File 
"/jhub/_prod/server_global_unifieddata_hadoop_airflow_daemon/lib/python2.7/site-packages/flask_appbuilder/security/decorators.py",
 line 101, in wraps
       return f(self, *args, **kwargs)
     File 
"/jhub/_prod/server_global_unifieddata_hadoop_airflow_daemon/lib/python2.7/site-packages/airflow/utils/db.py",
 line 74, in wrapper
       return func(*args, **kwargs)
     File 
"/jhub/_prod/server_global_unifieddata_hadoop_airflow_daemon/lib/python2.7/site-packages/airflow/www_rbac/views.py",
 line 302, in index
       joinedload(DagModel.tags)).offset(start).limit(dags_per_page).all()
     File 
"/jhub/_prod/server_global_unifieddata_hadoop_airflow_daemon/lib64/python2.7/site-packages/sqlalchemy/orm/query.py",
 line 3244, in all
       return list(self)
     File 
"/jhub/_prod/server_global_unifieddata_hadoop_airflow_daemon/lib64/python2.7/site-packages/sqlalchemy/orm/query.py",
 line 3403, in __iter__
       return self._execute_and_instances(context)
     File 
"/jhub/_prod/server_global_unifieddata_hadoop_airflow_daemon/lib64/python2.7/site-packages/sqlalchemy/orm/query.py",
 line 3428, in _execute_and_instances
       result = conn.execute(querycontext.statement, self._params)
     File 
"/jhub/_prod/server_global_unifieddata_hadoop_airflow_daemon/lib64/python2.7/site-packages/sqlalchemy/engine/base.py",
 line 984, in execute
       return meth(self, multiparams, params)
     File 
"/jhub/_prod/server_global_unifieddata_hadoop_airflow_daemon/lib64/python2.7/site-packages/sqlalchemy/sql/elements.py",
 line 293, in _execute_on_connection
       return connection._execute_clauseelement(self, multiparams, params)
     File 
"/jhub/_prod/server_global_unifieddata_hadoop_airflow_daemon/lib64/python2.7/site-packages/sqlalchemy/engine/base.py",
 line 1103, in _execute_clauseelement
       distilled_params,
     File 
"/jhub/_prod/server_global_unifieddata_hadoop_airflow_daemon/lib64/python2.7/site-packages/sqlalchemy/engine/base.py",
 line 1288, in _execute_context
       e, statement, parameters, cursor, context
     File 
"/jhub/_prod/server_global_unifieddata_hadoop_airflow_daemon/lib64/python2.7/site-packages/sqlalchemy/engine/base.py",
 line 1482, in _handle_dbapi_exception
       sqlalchemy_exception, with_traceback=exc_info[2], from_=e
     File 
"/jhub/_prod/server_global_unifieddata_hadoop_airflow_daemon/lib64/python2.7/site-packages/sqlalchemy/engine/base.py",
 line 1248, in _execute_context
       cursor, statement, parameters, context
     File 
"/jhub/_prod/server_global_unifieddata_hadoop_airflow_daemon/lib64/python2.7/site-packages/sqlalchemy/engine/default.py",
 line 588, in do_execute
       cursor.execute(statement, parameters)
     File 
"/jhub/_prod/server_global_unifieddata_hadoop_airflow_daemon/lib64/python2.7/site-packages/MySQLdb/cursors.py",
 line 255, in execute
       self.errorhandler(self, exc, value)
     File 
"/jhub/_prod/server_global_unifieddata_hadoop_airflow_daemon/lib64/python2.7/site-packages/MySQLdb/connections.py",
 line 50, in defaulterrorhandler
       raise errorvalue
   ProgrammingError: (_mysql_exceptions.ProgrammingError) (1064, "You have an 
error in your SQL syntax; check the manual that corresponds to your MySQL 
server version for the right syntax to use near '-25, 25) AS anon_1 LEFT OUTER 
JOIN dag_tag AS dag_tag_1 ON anon_1.dag_dag_id = d' at line 7")
   [SQL: SELECT anon_1.dag_dag_id AS anon_1_dag_dag_id, anon_1.dag_root_dag_id 
AS anon_1_dag_root_dag_id, anon_1.dag_is_paused AS anon_1_dag_is_paused, 
anon_1.dag_is_subdag AS anon_1_dag_is_subdag, anon_1.dag_is_active AS 
anon_1_dag_is_active, anon_1.dag_last_scheduler_run AS 
anon_1_dag_last_scheduler_run, anon_1.dag_last_pickled AS 
anon_1_dag_last_pickled, anon_1.dag_last_expired AS anon_1_dag_last_expired, 
anon_1.dag_scheduler_lock AS anon_1_dag_scheduler_lock, anon_1.dag_pickle_id AS 
anon_1_dag_pickle_id, anon_1.dag_fileloc AS anon_1_dag_fileloc, 
anon_1.dag_owners AS anon_1_dag_owners, anon_1.dag_description AS 
anon_1_dag_description, anon_1.dag_default_view AS anon_1_dag_default_view, 
anon_1.dag_schedule_interval AS anon_1_dag_schedule_interval, dag_tag_1.name AS 
dag_tag_1_name, dag_tag_1.dag_id AS dag_tag_1_dag_id 
   FROM (SELECT dag.dag_id AS dag_dag_id, dag.root_dag_id AS dag_root_dag_id, 
dag.is_paused AS dag_is_paused, dag.is_subdag AS dag_is_subdag, dag.is_active 
AS dag_is_active, dag.last_scheduler_run AS dag_last_scheduler_run, 
dag.last_pickled AS dag_last_pickled, dag.last_expired AS dag_last_expired, 
dag.scheduler_lock AS dag_scheduler_lock, dag.pickle_id AS dag_pickle_id, 
dag.fileloc AS dag_fileloc, dag.owners AS dag_owners, dag.description AS 
dag_description, dag.default_view AS dag_default_view, dag.schedule_interval AS 
dag_schedule_interval 
   FROM dag 
   WHERE dag.is_subdag = 0 AND dag.is_active = 1 AND (EXISTS (SELECT 1 
   FROM dag_tag 
   WHERE dag.dag_id = dag_tag.dag_id AND dag_tag.name IN (%s))) ORDER BY 
dag.dag_id 
    LIMIT %s, %s) AS anon_1 LEFT OUTER JOIN dag_tag AS dag_tag_1 ON 
anon_1.dag_dag_id = dag_tag_1.dag_id ORDER BY anon_1.dag_dag_id]
   [parameters: (u'example', -25, 25)]
   (Background on this error at: http://sqlalche.me/e/f405)
   
   ```
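   The failure is consistent with the view computing the query offset as `page * dags_per_page`: with `page=-1` and 25 DAGs per page the offset becomes `-25`, producing `LIMIT -25, 25`, which MySQL rejects. A minimal sketch of the kind of guard that would avoid it (the helper name and signature below are hypothetical, not the actual code in `airflow/www_rbac/views.py`) is to clamp the raw `?page=` value into a valid range before computing the offset:

   ```python
   def safe_page(raw_page, num_of_pages):
       """Clamp a raw ?page= query value into [0, num_of_pages - 1].

       Hypothetical helper: coerces the string from the URL to an int,
       falling back to 0 on garbage, so the OFFSET can never go negative.
       """
       try:
           current_page = int(raw_page)
       except (TypeError, ValueError):
           current_page = 0
       return max(0, min(current_page, max(num_of_pages - 1, 0)))

   # With this in place, ?page=-1 would resolve to page 0, so the
   # offset passed to .offset(start).limit(dags_per_page) stays >= 0.
   ```

   The same clamping on the template side would also make the last-page button render `page=0` instead of `page=-1` when there are no DAGs.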
   
   **What you expected to happen**:
   
   I expected the last page button to behave like the first page button: nothing 
should happen.
   
   **How to reproduce it**:
   
   Set all the DAGs as inactive, then click the last page button.
   
   **Anything else we need to know**:
   
   It's just an annoyance.


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]
