On Mar 6, 2018, at 5:19 PM, RW <rwmailli...@googlemail.com> wrote:
> 
> Or probably more commonly when running the spamassassin perl script as
> an ordinary user for test purposes.

Right, if the DB is owned by that user, then they would see the rule fire with 
spamassassin and might assume it's all good for everyone.  If the DB is owned 
by root/nobody/some other user, but they test as a regular non-owner user, the 
rule should fail and they might notice it... and then give up on fixing it 
because it's an obscure problem to track down! =)

> Actually it does get cached results from the DB and does fall back to
> doing its own lookup. The debug is a little misleading because it tries
> either to update the timestamp or create a new entry before it even logs
> the result, and that's where it fails.

Gotcha.  I didn't see the lookups happening, but maybe I wasn't looking 
closely enough.  Either way, the failure to write clearly stops things in 
their tracks: the HAS_SHORT_URL rule never triggers, so everything downstream 
of it misses as well.

> I don't really understand the database interface, but if the two
> relevant execute commands are placed in eval blocks, it seems to work.
> cache_add() does use an eval block for setting up the SQL, but the
> execute is outside the block.

That should be an easy patch, then.  Please submit on git!  I'm not 
sufficiently familiar with git yet...
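
For anyone following along, here's roughly the pattern RW describes -- just a 
sketch, not the plugin's actual code.  cache_add() is the real function name 
from the thread; the connect string, table, and column names below are 
illustrative only:

    #!/usr/bin/perl
    # Sketch of wrapping the DBI execute() in an eval so a failed
    # cache write is logged and skipped rather than killing the check.
    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect("dbi:SQLite:dbname=shorturl_cache.db", "", "",
                           { RaiseError => 1, PrintError => 0 });

    sub cache_add {
        my ($url, $decoded) = @_;
        # The plugin already wraps the prepare in an eval...
        my $sth = eval {
            $dbh->prepare("INSERT INTO short_url_cache
                           (short_url, decoded_url, created)
                           VALUES (?, ?, ?)");
        };
        return unless $sth;
        # ...but the execute was outside it, so a failed write (e.g. a
        # DB owned by another user) died and took the whole lookup with
        # it.  Wrapping it makes the cache write best-effort:
        eval { $sth->execute($url, $decoded, time()); };
        warn "short_url: cache write failed, continuing: $@" if $@;
    }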

But this only takes care of item #1: failing gracefully.  It no longer blocks 
but still fails to write to the DB.  So item #2 (per-user DBs) would still be 
needed for a fully robust fix, and I imagine that's more than a few lines of 
code, unfortunately.
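
Just to sketch what I mean by item #2 (purely hypothetical -- the plugin's 
real config handling is surely different): per-user DBs would mean deriving 
the cache path from each user's own state directory rather than one shared 
file, something in the spirit of:

    use strict;
    use warnings;
    use File::Spec;

    # Prefer the user's own ~/.spamassassin state dir; fall back to a
    # shared location only when no per-user home exists.
    my $dir = defined $ENV{HOME}
        ? File::Spec->catdir($ENV{HOME}, '.spamassassin')
        : '/var/lib/spamassassin';
    my $cache_db = File::Spec->catfile($dir, 'shorturl_cache.db');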

Thanks for testing!

--- Amir
