lanyusea opened a new issue #12052:
URL: https://github.com/apache/incubator-superset/issues/12052


   I have a dataset of about 4 million rows stored in MySQL, and I'm trying to load it into Superset and build a visualization.
   
   It works well with only a few thousand rows, but after I loaded the full dataset into MySQL, Superset fails with the error `An error occurred` on the chart page, with no further information there or in the log.
   
   ### Expected results
   Superset can fetch the data, even if it reports problems along the way.
   
   ### Actual results
   1. It says `An error occurred` with no result.
   2. A large temporary database is created in my MySQL instance.
   
   #### Screenshots
   
![image](https://user-images.githubusercontent.com/2766729/102171538-b888eb80-3ed1-11eb-9a9b-02779761f7fe.png)
   
   and the log file has nothing useful:
   
![image](https://user-images.githubusercontent.com/2766729/102172491-d6efe680-3ed3-11eb-8fa9-af3929020a81.png)
   
   #### How to reproduce the bug
   
   1. Add 4 million rows of data to a MySQL table.
   2. Build a visualization on it in Superset.
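   The issue doesn't describe the actual table, so here is a minimal, hypothetical way to generate data at roughly that scale for reproduction (the column layout and the table name `big_table` are my own illustration, not from the report): write a CSV with the Python standard library, then bulk-load it into MySQL.

```python
import csv
import random

def write_sample_csv(path: str, n_rows: int) -> None:
    """Write n_rows of (id, seconds_offset, value) to a CSV file.

    The schema is purely illustrative; substitute your real columns.
    """
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        for i in range(n_rows):
            writer.writerow([i, i % 86400, round(random.random() * 100, 3)])

# For the full reproduction use n_rows=4_000_000, then load the file
# into MySQL with something like:
#   LOAD DATA INFILE '/tmp/sample.csv'
#   INTO TABLE big_table FIELDS TERMINATED BY ',';
write_sample_csv("/tmp/sample.csv", 1000)
```
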
   
   ### Environment
   
   (please complete the following information):
   
   - superset version: `0.37.2`
   - python version: `3.6.9`
   - node.js version: not installed
   
   Superset is installed from scratch, not via Docker.
   
   
   ### Checklist
   
   Make sure to follow these steps before submitting your issue - thank you!
   
   - [ ] I have checked the superset logs for python stacktraces and included 
it here as text if there are any.
   - [ ] I have reproduced the issue with at least the latest released version 
of superset.
   - [ ] I have checked the issue tracker for the same issue and I haven't 
found one similar.
   
   ### Additional context
   
   I have searched and found little discussion of data at this scale. The official documentation only mentions [loading speed](https://superset.apache.org/docs/frequently-asked-questions#how-big-can-my-datasource-be), and in #4588 people also discuss speed; nobody seems to hit a hard failure like mine.
   
   Also, to make the query runnable at all, I have already extended the timeout to 3000s along with several other limit settings.
   
![image](https://user-images.githubusercontent.com/2766729/102172766-6d240c80-3ed4-11eb-956b-c57c12ea2959.png)
   
   The problem always happens after the query has been running for about 600s, so I suspect there is still some timeout on the connection, but I couldn't find any other timeout setting in Superset's config.py.
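   For reference, these are the timeout/limit knobs I'd expect to matter in `superset_config.py`. This is a sketch, not a confirmed fix; the names should be verified against the 0.37.2 source, and the values mirror the settings shown in the screenshot above.

```python
# superset_config.py -- a hedged sketch of the relevant settings,
# not a confirmed fix for the 600 s failure.

ROW_LIMIT = 4_000_000               # default row count fetched for charts
SQL_MAX_ROW = 4_000_000             # hard cap applied to returned rows
SUPERSET_WEBSERVER_TIMEOUT = 3000   # web server request timeout (seconds)
```

   Two other places a ~600s cutoff could come from, neither of which `config.py` controls: the process manager or proxy in front of Superset (e.g. gunicorn's `--timeout`, nginx's `proxy_read_timeout`), and the MySQL driver itself, whose connection timeouts (e.g. `read_timeout` for mysqlclient/PyMySQL) can be passed per database via the database's *Extra* field, e.g. `{"engine_params": {"connect_args": {"read_timeout": 3600}}}`.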
   
   So I'm wondering how I can find out what the exact error is, and how to solve it.
   
   Alternatively, is there any way to bypass this issue?
   
   Thanks!


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


