Hi everyone!
It turned out to be a PostgreSQL concurrency issue: PostgreSQL handles concurrent transactions that update the same row/page poorly. Switching to DB2 Express solved it. Cheers.

ocoolio wrote:
>
> Hello!
>
> I have a bad performance problem.
>
> I'm using NMS with .NET (C#) and two queues:
> SUBMITQ
> DESTQ
>
> A server component listens on SUBMITQ. When a message is received, it
> transforms it and submits it to DESTQ. All of this is done in one
> transaction (AcknowledgementMode.Transactional and sess.Commit(),
> PrefetchSize=1). This works great with stuffed queues (>10 messages
> pending), since the receiver doesn't block the producer.
> The performance in this (filled-queue) case: ~130 msg/sec on the consumer.
> The producer can send about 500 msg/sec.
>
> The problem comes when the consumer empties the queue. When that happens,
> the producer becomes extremely slow (10 msg/sec), as does the consumer.
> The queue remains empty, so somehow the producer and consumer wait for
> each other. When I pause the consumer for a second, the queue gets a lot
> of messages and the performance returns to normal: the producer sends at
> 500, the consumer works at 130.
>
> I use persistent messages, the journal, and PostgreSQL.
> Because of .NET NMS, I cannot make async sends on the client; however,
> 500/sec is just enough for me.
>
> Please tell me what this is; I just cannot figure it out. I don't want to
> implement wait cycles on the consumer just to keep the queue packed with
> messages!
> (ActiveMQ 5.0, 2007 07 10)
>
> Thanks,
>
> Adam

--
View this message in context: http://www.nabble.com/Performance-problem---On-empty-queue-sender-gets-blocked---on-filled-queue-it%27s-OK-tf4142407s2354.html#a11941918
Sent from the ActiveMQ - User mailing list archive at Nabble.com.
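For readers trying to reproduce the setup: the consume-transform-forward transaction described in the quoted message can be sketched with the Apache.NMS API roughly as below. This is a minimal sketch under assumptions, not the original poster's code: the broker URL, the Transform() helper, and the `nms.PrefetchPolicy.QueuePrefetch` URI option (the NMS.ActiveMQ provider's way of expressing PrefetchSize=1) are all illustrative, and running it requires a live ActiveMQ broker plus the Apache.NMS client libraries.

```csharp
using Apache.NMS;
using Apache.NMS.ActiveMQ;

class Forwarder
{
    static void Main()
    {
        // Assumed broker URL; QueuePrefetch=1 mirrors the PrefetchSize=1
        // setting mentioned in the post.
        IConnectionFactory factory = new ConnectionFactory(
            "tcp://localhost:61616?nms.PrefetchPolicy.QueuePrefetch=1");

        using (IConnection connection = factory.CreateConnection())
        {
            connection.Start();

            // One transacted session covers both the receive and the forward.
            using (ISession session =
                connection.CreateSession(AcknowledgementMode.Transactional))
            {
                IDestination submitQ = session.GetQueue("SUBMITQ");
                IDestination destQ = session.GetQueue("DESTQ");

                using (IMessageConsumer consumer = session.CreateConsumer(submitQ))
                using (IMessageProducer producer = session.CreateProducer(destQ))
                {
                    // Persistent delivery, as in the original setup.
                    producer.DeliveryMode = MsgDeliveryMode.Persistent;

                    while (true)
                    {
                        ITextMessage msg = consumer.Receive() as ITextMessage;
                        if (msg == null) continue;

                        // Transform() is a placeholder for the real transformation.
                        ITextMessage outMsg =
                            session.CreateTextMessage(Transform(msg.Text));
                        producer.Send(outMsg);

                        // Commit acknowledges the receive and publishes the
                        // send as one atomic unit.
                        session.Commit();
                    }
                }
            }
        }
    }

    static string Transform(string body)
    {
        return body; // identity transform for the sketch
    }
}
```

Because the receive and the send share one transacted session, a crash between them rolls both back and the message stays on SUBMITQ, which is presumably why the poster chose this pattern.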
