juneauwang opened a new issue #13378:
URL: https://github.com/apache/superset/issues/13378


   We enabled impersonation on Presto, Hive, Impala, etc. data sources and enabled global async queries in Superset 1.0.0. However, the Celery worker can't get `effective_username`, so queries are run as the system user.
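
   For context, the "global async" part means dashboard chart queries are executed by Celery workers. Below is a minimal sketch of the kind of configuration involved, assuming the GLOBAL_ASYNC_QUERIES feature flag and a Redis broker; the URLs and module list are placeholders rather than our exact config. Impersonation itself is enabled per database via its impersonate-user setting.

   ```
   # superset_config.py (hedged sketch; broker URLs and imports are placeholders)
   FEATURE_FLAGS = {
       "GLOBAL_ASYNC_QUERIES": True,  # the "global async" feature referred to above
   }

   class CeleryConfig:
       # The worker processes run under the svc_acc_bdp_superset Linux user.
       broker_url = "redis://localhost:6379/0"
       imports = ("superset.sql_lab", "superset.tasks")
       result_backend = "redis://localhost:6379/0"

   CELERY_CONFIG = CeleryConfig
   ```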
   
   ### Expected results
   
   Dashboard queries should run as the actual logged-in Superset user when the datasource has impersonation enabled.
   
   ### Actual results
   
   Dashboard queries run as the system user regardless of whether impersonation is enabled on the datasource.
   
   #### Screenshots
   ```
   object user_name is None
   [2021-03-01 07:04:27,745: INFO/ForkPoolWorker-127] object user_name is None
   effective_username is None
   [2021-03-01 07:04:27,745: INFO/ForkPoolWorker-127] effective_username is None
   df username is None
   [2021-03-01 07:04:27,746: INFO/ForkPoolWorker-127] df username is None
   [2021-03-01 07:04:27,746: INFO/ForkPoolWorker-127] username is svc_acc_bdp_superset
   [2021-03-01 07:04:27,746: INFO/ForkPoolWorker-127] SELECT "db_name" AS "db_name",
          "event_name" AS "event_name",
          sum(nb_events) AS "count"
   FROM xxx
   WHERE "event_date" >= '2021-02-01 00:00:00.000000'
     AND "event_date" < '2021-03-01 00:00:00.000000'
   GROUP BY "db_name",
            "event_name"
   LIMIT 500
   ```
   I added loggers at superset/models/core.py lines 289 and 301 to debug. `svc_acc_bdp_superset` is the Linux user that runs the Celery workers.
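
   As a minimal illustration of why the query then runs as the service account (this is PyHive-level behaviour, not a copy of Superset's code path): Presto records the connection's `username` as the query user, so when `effective_username` is `None` on the worker, only the service account is left to fall back on.

   ```
   # Hedged sketch of the PyHive-level behaviour, not Superset's code path.
   # The `username` passed to the Presto connection is what Presto records as
   # the query user; with effective_username = None, it falls back to the
   # Linux service account that runs the Celery workers.
   from pyhive import presto

   effective_username = None                 # what the worker sees, per the log above
   service_account = "svc_acc_bdp_superset"  # Linux user running the workers

   conn = presto.connect(
       host="presto-coordinator.example.com",            # placeholder host
       port=8080,
       username=effective_username or service_account,   # falls back to the worker's user
   )
   cursor = conn.cursor()
   cursor.execute("SELECT 1")
   print(cursor.fetchall())
   ```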
   
   #### How to reproduce the bug
   
   1. Go to 'Dashboards'.
   2. Open any dashboard whose datasource has impersonation enabled.
   3. Check the worker log to see which user the query ran as (a Presto-side cross-check is sketched after this list).
   4. See that the query runs as the system user instead of the logged-in user.
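
   As a cross-check for step 3, the effective user can also be verified on the Presto side. A hedged sketch, with a placeholder coordinator host, using Presto's built-in system.runtime.queries table:

   ```
   # Ask the Presto coordinator which user the recent queries ran as.
   # With this bug, dashboard queries show up under the Celery worker's
   # service account instead of the logged-in Superset user.
   from pyhive import presto

   cursor = presto.connect(
       host="presto-coordinator.example.com",  # placeholder host
       port=8080,
       username="svc_acc_bdp_superset",
   ).cursor()
   cursor.execute(
       'SELECT "user", state, query FROM system.runtime.queries '
       "ORDER BY created DESC LIMIT 20"
   )
   for user, state, query in cursor.fetchall():
       print(user, state, query[:80])
   ```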
   
   ### Environment
   
   
   - Superset 1.0.0
   - Python 3.8.2
   - Flask 1.1.2
   - PyHive 0.6.2
   
   ### Checklist
   
   Make sure to follow these steps before submitting your issue - thank you!
   
   - [x] I have checked the superset logs for python stacktraces and included them here as text if there are any.
   - [x] I have reproduced the issue with at least the latest released version 
of superset.
   - [x] I have checked the issue tracker for the same issue and I haven't 
found one similar.
   
   ### Additional context
   
   Stack trace:
   
   ```
   Traceback (most recent call last):
     File "/srv/python3/lib/python3.8/site-packages/superset/connectors/sqla/models.py", line 1321, in query
       df = self.database.get_df(sql, self.schema, mutator)
     File "/srv/python3/lib/python3.8/site-packages/superset/models/core.py", line 392, in get_df
       data = self.db_engine_spec.fetch_data(cursor)
     File "/srv/python3/lib/python3.8/site-packages/superset/db_engine_specs/base.py", line 321, in fetch_data
       return cursor.fetchall()
     File "/srv/python3/lib/python3.8/site-packages/pyhive/common.py", line 136, in fetchall
       return list(iter(self.fetchone, None))
     File "/srv/python3/lib/python3.8/site-packages/pyhive/common.py", line 105, in fetchone
       self._fetch_while(lambda: not self._data and self._state != self._STATE_FINISHED)
     File "/srv/python3/lib/python3.8/site-packages/pyhive/common.py", line 45, in _fetch_while
       self._fetch_more()
     File "/srv/python3/lib/python3.8/site-packages/pyhive/presto.py", line 264, in _fetch_more
       self._process_response(self._requests_session.get(self._nextUri, **self._requests_kwargs, verify=False, auth=HTTPKerberosAuth(mutual_authentication=OPTIONAL)))
     File "/srv/python3/lib/python3.8/site-packages/pyhive/presto.py", line 303, in _process_response
       raise DatabaseError(response_json['error'])
   ```
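
   Note the hardcoded `verify=False, auth=HTTPKerberosAuth(...)` in the `_fetch_more` frame: that looks like a locally patched PyHive. For what it's worth, the traceback also shows `requests_kwargs` being forwarded into every Presto HTTP request, so the same TLS/Kerberos options could be passed without patching; a minimal sketch, with placeholder host and user:

   ```
   # Hedged sketch only: pass the TLS/Kerberos options through requests_kwargs
   # instead of hardcoding them in pyhive/presto.py. Host and username are
   # placeholders; the auth object mirrors the one in the traceback above.
   from pyhive import presto
   from requests_kerberos import HTTPKerberosAuth, OPTIONAL

   conn = presto.connect(
       host="presto-coordinator.example.com",
       port=8443,
       protocol="https",
       username="the_effective_user",
       requests_kwargs={
           "verify": False,
           "auth": HTTPKerberosAuth(mutual_authentication=OPTIONAL),
       },
   )
   ```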

