On Thu, Oct 25, 2012 at 4:11 PM, Peter Hutterer <[email protected]> wrote:
> A server without a config file that inits all input devices takes just over
> a second on my machine. Up this timeout so we don't get spurious signals
> later.
>
> Signed-off-by: Peter Hutterer <[email protected]>
> ---
>  src/xserver.cpp | 2 +-
>  1 file changed, 1 insertion(+), 1 deletion(-)
>
> diff --git a/src/xserver.cpp b/src/xserver.cpp
> index 29c0430..9b163bb 100644
> --- a/src/xserver.cpp
> +++ b/src/xserver.cpp
> @@ -419,7 +419,7 @@ void xorg::testing::XServer::Start(const std::string &program) {
>      std::string err_msg;
>
>      sigset_t sig_mask;
> -    struct timespec sig_timeout = {1, 0}; /* 1 sec + 0 nsec */
> +    struct timespec sig_timeout = {3, 0}; /* 3 sec + 0 nsec */
>
>      /* add SIGUSR1 to the signal mask */
>      sigemptyset(&sig_mask);
One might wonder if the test failing to start the server in 1 second is a
real failure :). I'm ok with this change, it just seems like it shouldn't
take that long...

For the series:

Reviewed-by: Chase Douglas <[email protected]>

_______________________________________________
[email protected]: X.Org development
Archives: http://lists.x.org/archives/xorg-devel
Info: http://lists.x.org/mailman/listinfo/xorg-devel
