I put together a patch which adds a regression test for large objects;
hopefully it is attached to this message.  I would like some critique of
it, to see whether I have gone about it the right way.  I would also be
happy to hear of any additional tests that should be added to it.
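To give a feel for the shape of such a test, here is a minimal sketch
(illustrative only, not the contents of the attached patch) using the
server-side lo_* functions, with a temporary table to carry the OID and
descriptor across statements; x'60000'::int is INV_READ|INV_WRITE, and
the written string is arbitrary:

```sql
-- Illustrative sketch, not the attached patch.
CREATE TEMP TABLE lotest (loid oid);
INSERT INTO lotest SELECT lo_creat(-1);

BEGIN;
-- lo_open descriptors are only valid within the opening transaction,
-- so all reads and writes happen before COMMIT.
CREATE TEMP TABLE lofd ON COMMIT DROP AS
  SELECT lo_open(loid, x'60000'::int) AS fd FROM lotest;
SELECT lowrite(fd, 'some test data for the large object') FROM lofd;
SELECT lo_lseek(fd, 0, 0) FROM lofd;   -- rewind (SEEK_SET = 0)
SELECT loread(fd, 64) FROM lofd;       -- read back what was written
SELECT lo_close(fd) FROM lofd;
COMMIT;

SELECT lo_unlink(loid) FROM lotest;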

On Tue, 5 Sep 2006, Jeremy Drake wrote:

> I noticed when I was working on a patch quite a while back that there are
> no regression tests for large object support.  I know, large objects
> are not the most sexy part of the code-base, and I think they tend to be
> ignored/forgotten most of the time.  Which IMHO is all the more reason
> they should have some regression tests.  Otherwise, if someone managed
> to break them, it quite likely would not be noticed for some time.
> So in this vein, I have recently found myself with some free time, and a
> desire to contribute something, and decided this would be the perfect
> place to get my feet wet without stepping on any toes.
> I guess what I should ask is, would a patch to add a test for large
> objects to the regression suite be well received?  And, is there any
> advice for how to go about making these tests?
> I have been considering this, and I think that in order to really test
> large objects, I would need to load enough data into a large object to
> span more than one block (large object blocks were 1 or 2K, IIRC) so
> that the block-boundary case could be tested.  Is there any precedent
> for where to grab such a large chunk of data?  I was thinking about
> using an excerpt from a public-domain text such as Moby Dick, but on
> second thought binary data may be better to test with.
> My current effort, and probably the preliminary portion of the final
> test, involves loading a small amount (less than one block) of text
> into a large object inline from a SQL script and calling the various
> functions against it to verify that they do what they should.  In the
> course of doing so, I find it necessary to stash certain values across
> statements (large object ids, large object 'handles'), and so far I am
> using a temporary table to store these.  Is this reasonable, or is
> there a cleaner way to do it?
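On the block-boundary question above: the large object chunk size
(LOBLKSIZE) is BLCKSZ/4, i.e. 2K with the default 8K page size, so
crossing a boundary only takes a few kilobytes.  One way to get that
much binary data without shipping a file is to generate it in SQL; a
hedged sketch (lofd here stands for a hypothetical temp table holding a
descriptor previously obtained from lo_open):

```sql
-- Illustrative only: build more than 2K of bytea inline.  The hex
-- pattern is 16 bytes; repeated 256 times it decodes to 4096 bytes,
-- which spans at least one 2K large object block.
SELECT lowrite(fd,
       decode(repeat('00112233445566778899aabbccddeeff', 256), 'hex'))
  FROM lofd;
```

Subsequent loread/lo_lseek calls can then exercise reads and seeks that
straddle the boundary as well as ones contained in a single block.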

One seldom sees a monument to a committee.

Attachment: largeobj-regress.patch.gz
Description: Binary data
