On Sun, 7 Apr 2024 at 19:30, Andres Freund <[email protected]> wrote:
> Good call. Added and pushed.
I understand you're already aware of the reference in the comment to
heapgetpage(), which no longer exists as of 44086b097.
Melanie and I had discussed the heap_prepare_pagescan() name while I
was reviewing that recent refactor. Aside from fixing the comment, how
about also renaming heapgetpage_collect() to
heap_prepare_pagescan_tuples()?
Patch attached for reference. Not looking for any credit.
I'm also happy to revisit the heap_prepare_pagescan() name itself, and
then call heapgetpage_collect() some appropriate derivative of whatever
we rename that to.
Copied Melanie as she may want to chime in too.
David
diff --git a/src/backend/access/heap/heapam.c b/src/backend/access/heap/heapam.c
index 2663f52d1a..d3c2a60985 100644
--- a/src/backend/access/heap/heapam.c
+++ b/src/backend/access/heap/heapam.c
@@ -434,16 +434,15 @@ heap_setscanlimits(TableScanDesc sscan, BlockNumber startBlk, BlockNumber numBlks)
 }
 
 /*
- * Per-tuple loop for heapgetpage() in pagemode. Pulled out so it can be
- * called multiple times, with constant arguments for all_visible,
+ * Per-tuple loop for heap_prepare_pagescan(). Pulled out so it can be called
+ * multiple times, with constant arguments for all_visible,
  * check_serializable.
  */
 pg_attribute_always_inline
 static int
-heapgetpage_collect(HeapScanDesc scan, Snapshot snapshot,
-                    Page page, Buffer buffer,
-                    BlockNumber block, int lines,
-                    bool all_visible, bool check_serializable)
+heap_prepare_pagescan_tuples(HeapScanDesc scan, Snapshot snapshot, Page page,
+                             Buffer buffer, BlockNumber block, int lines,
+                             bool all_visible, bool check_serializable)
 {
     int         ntup = 0;
     OffsetNumber lineoff;
@@ -547,28 +546,36 @@ heap_prepare_pagescan(TableScanDesc sscan)
         CheckForSerializableConflictOutNeeded(scan->rs_base.rs_rd, snapshot);
 
     /*
-     * We call heapgetpage_collect() with constant arguments, to get the
-     * compiler to constant fold the constant arguments. Separate calls with
-     * constant arguments, rather than variables, are needed on several
+     * We call heap_prepare_pagescan_tuples() with constant arguments, to get
+     * the compiler to constant fold the constant arguments. Separate calls
+     * with constant arguments, rather than variables, are needed on several
      * compilers to actually perform constant folding.
      */
     if (likely(all_visible))
     {
         if (likely(!check_serializable))
-            scan->rs_ntuples = heapgetpage_collect(scan, snapshot, page, buffer,
-                                                   block, lines, true, false);
+            scan->rs_ntuples = heap_prepare_pagescan_tuples(scan, snapshot,
+                                                            page, buffer,
+                                                            block, lines,
+                                                            true, false);
         else
-            scan->rs_ntuples = heapgetpage_collect(scan, snapshot, page, buffer,
-                                                   block, lines, true, true);
+            scan->rs_ntuples = heap_prepare_pagescan_tuples(scan, snapshot,
+                                                            page, buffer,
+                                                            block, lines,
+                                                            true, true);
     }
     else
     {
         if (likely(!check_serializable))
-            scan->rs_ntuples = heapgetpage_collect(scan, snapshot, page, buffer,
-                                                   block, lines, false, false);
+            scan->rs_ntuples = heap_prepare_pagescan_tuples(scan, snapshot,
+                                                            page, buffer,
+                                                            block, lines,
+                                                            false, false);
         else
-            scan->rs_ntuples = heapgetpage_collect(scan, snapshot, page, buffer,
-                                                   block, lines, false, true);
+            scan->rs_ntuples = heap_prepare_pagescan_tuples(scan, snapshot,
+                                                            page, buffer,
+                                                            block, lines,
+                                                            false, true);
     }
 
     LockBuffer(buffer, BUFFER_LOCK_UNLOCK);