GitHub user pwendell commented on the pull request:

    https://github.com/apache/spark/pull/5843#issuecomment-101952647
  
    Okay I did another pass.
    
    Only one comment left. I noticed there is a limit to how far I can zoom outwards. For instance, I could not get the entire stage visualized in one view of the timeline when I ran this:
    
    ```
    val rdd = sc.parallelize(1 to 100000, 1000).map(x => (x, x)).reduceByKey(_ + _).cache.count()
    ```
    
    Why don't we just set `maxZoom` equal to the difference between the very first and last task (or maybe 1.5x that, to leave some buffer room)? Right now it seems like it is trying to constrain the zooming, but my preference would be for it to be unconstrained; otherwise it makes for a pretty confusing user experience.
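    
    Roughly something like this sketch (assuming the view is a vis.js `Timeline`, whose `zoomMax` option caps the visible window in milliseconds; the `tasks`, `launchTime`, and `finishTime` names are hypothetical, just for illustration):
    
    ```javascript
    // Hypothetical sketch: derive the zoom-out limit from the overall task span.
    var firstStart = Math.min.apply(null, tasks.map(function (t) { return t.launchTime; }));
    var lastFinish = Math.max.apply(null, tasks.map(function (t) { return t.finishTime; }));
    var options = {
      // 1.5x the full span leaves some buffer room around the stage
      zoomMax: (lastFinish - firstStart) * 1.5
    };
    var timeline = new vis.Timeline(container, items, options);
    ```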
    
    Is there a reason you wanted to constrain the maximum?

