On 23 January 2018 at 19:17, Petr Jelinek <petr.jeli...@2ndquadrant.com> wrote:
> I am not sure if this helps streaming use-case though as
> there is not going to be any external transaction management involved there.

So, I think we need some specific discussion of what to do in that case.

Streaming happens only with big transactions and only for short periods.
The problem only occurs when we are decoding and we hit a catalog table
change. Processing of that is short, then we continue. So it seems
perfectly fine to block aborts in those circumstances.

We can just mark that state in an in-memory array of
StreamingDecodedTransactions that has size SizeOf(TransactionId) *
MaxNumWalSenders.

We can add a check into RecordTransactionAbort() just before the
critical section to see if we are currently processing a
StreamingDecodedTransaction and, if so, poll until we're OK to abort.

The check will be quick, and the abort path is not considered one we
need to optimize.

-- 
Simon Riggs                http://www.2ndQuadrant.com/
PostgreSQL Development, 24x7 Support, Remote DBA, Training & Services
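
For illustration only, a minimal standalone C sketch of that idea (not actual PostgreSQL code): the array name StreamingDecodedTransactions and the slot count MaxNumWalSenders follow the wording above, while the helpers StreamingDecodeBegin/End, XidIsBeingStreamDecoded and WaitUntilSafeToAbort are invented for the example. A real patch would place the array in shared memory and call the wait from RecordTransactionAbort() just before its critical section.

/*
 * Hedged sketch, not PostgreSQL source: one slot per walsender records the
 * XID currently being streaming-decoded; the abort path polls that array
 * and waits until its XID is no longer being processed.
 */
#include <stdbool.h>
#include <stdint.h>
#include <unistd.h>

typedef uint32_t TransactionId;
#define InvalidTransactionId ((TransactionId) 0)
#define MaxNumWalSenders 10      /* stand-in for the max_wal_senders setting */

/* In a real patch this array would live in shared memory. */
static volatile TransactionId StreamingDecodedTransactions[MaxNumWalSenders];

/* Called by a walsender when it starts decoding a catalog change of xid. */
static void
StreamingDecodeBegin(int walsender_slot, TransactionId xid)
{
    StreamingDecodedTransactions[walsender_slot] = xid;
}

/* Called by the same walsender once that processing is finished. */
static void
StreamingDecodeEnd(int walsender_slot)
{
    StreamingDecodedTransactions[walsender_slot] = InvalidTransactionId;
}

/* True if any walsender is currently streaming-decoding the given xid. */
static bool
XidIsBeingStreamDecoded(TransactionId xid)
{
    for (int i = 0; i < MaxNumWalSenders; i++)
        if (StreamingDecodedTransactions[i] == xid)
            return true;
    return false;
}

/*
 * The check proposed for RecordTransactionAbort(): before entering the
 * abort critical section, poll until no walsender is in the middle of
 * decoding this transaction's catalog changes.
 */
static void
WaitUntilSafeToAbort(TransactionId xid)
{
    while (XidIsBeingStreamDecoded(xid))
        usleep(1000);            /* 1 ms; a real patch might use a latch */
}

Since aborts are rare and not performance-critical, a simple polling loop like this keeps the common (commit) path untouched; only the walsender's short catalog-processing window ever delays an abort.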