Hi folks,

Well, at least I know what's possible. The problem I'm having isn't caused by GUI latency or anything like that. It's due to the nature of gtk_timeout_add() and gtk_timeout_remove(). It's a different paradigm.
If I have a timeout pulsing every 0.3 seconds and then change it to something else, I have removed the 0.3 second pulse, so of course it won't be smooth. While I'm updating the timer pulse via a slider or any other mechanism, unless the gap between successive updates is longer than the pulse interval, I won't hear anything at all. Every time I move the slider I create a new timeout, whose interval starts counting from time 0. A continuous slider move creates dozens of them, all waiting for their first beat.

This is why a gtk_timeout_update() would be so nice. All I really need to do is change the value of the timer interval, not reset the timer each time. There's a big difference. Am I making sense? It's a GTK design issue, not a latency / scheduling one.
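Here is a minimal sketch of one possible workaround with the existing API, assuming a metronome-style pulse driven by a slider (beat_cb, tempo_changed and the interval variables are just placeholder names, not anything from GTK): the slider handler only records the new interval, and the beat callback re-arms itself with that value at the next beat, so nothing gets removed and re-added mid-interval.

#include <gtk/gtk.h>

/* Placeholder names, purely illustrative. */
static guint32 current_interval_ms = 300;  /* written by the slider handler       */
static guint32 running_interval_ms = 300;  /* interval the timeout was armed with */

static gint beat_cb(gpointer data)
{
    g_print("tick\n");                     /* the metronome beat goes here */

    /* If the slider changed the tempo since the last beat, re-arm the
     * timeout with the new interval now, at a beat boundary, instead of
     * removing/re-adding it from the slider handler.  The pulse never
     * goes back to waiting for its first beat. */
    if (current_interval_ms != running_interval_ms) {
        running_interval_ms = current_interval_ms;
        gtk_timeout_add(running_interval_ms, beat_cb, NULL);
        return FALSE;                      /* drop the old timeout */
    }
    return TRUE;                           /* keep pulsing at the old rate */
}

/* Slider handler: only stores the new value; never touches the timer. */
static void tempo_changed(GtkAdjustment *adj, gpointer data)
{
    current_interval_ms = (guint32) adj->value;
}

int main(int argc, char *argv[])
{
    GtkWidget *window, *scale;
    GtkObject *adj;

    gtk_init(&argc, &argv);

    window = gtk_window_new(GTK_WINDOW_TOPLEVEL);
    gtk_signal_connect(GTK_OBJECT(window), "destroy",
                       GTK_SIGNAL_FUNC(gtk_main_quit), NULL);

    adj = gtk_adjustment_new(300.0, 50.0, 1000.0, 1.0, 10.0, 0.0);
    scale = gtk_hscale_new(GTK_ADJUSTMENT(adj));
    gtk_container_add(GTK_CONTAINER(window), scale);
    gtk_signal_connect(adj, "value_changed",
                       GTK_SIGNAL_FUNC(tempo_changed), NULL);

    gtk_timeout_add(running_interval_ms, beat_cb, NULL);

    gtk_widget_show_all(window);
    gtk_main();
    return 0;
}

At worst one extra beat arrives at the old tempo before the change takes effect, but the phase of the pulse is preserved. It's still only a workaround for the missing gtk_timeout_update(), not a substitute for it.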
DT
--
Technical Director - Virginia Center for Computer Music
http://www.virginia.edu/music/vccm.html