The following review has been posted through the commitfest application:
make installcheck-world:  not tested
Implements feature:       not tested
Spec compliant:           not tested
Documentation:            not tested

First part of the code review (covering about a third of the code):
pglogical_output.h:

+ /* Protocol capabilities */
+ #define PGLOGICAL_PROTO_VERSION_NUM 1
+ #define PGLOGICAL_PROTO_MIN_VERSION_NUM 1
Are these the current protocol version and the minimal recognized version,
or a major and a minor version number? I assume PGLOGICAL_PROTO_VERSION_NUM
is the current protocol version (as in the max_proto_version and
min_proto_version config parameters). But why do we have MIN_VERSION and not
MAX_VERSION?

From the code in pglogical_output.c (lines 215-225) it looks like
PGLOGICAL_PROTO_VERSION_NUM is treated as the maximum protocol version and
PGLOGICAL_PROTO_MIN_VERSION_NUM as the minimum.

I can see that these constants are exported in pglogical_infofuncs.c and
pglogical.sql, but I do not understand the omission of MAX.
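If the intent is a min/max range for negotiation, symmetric naming would make
that explicit. A rough sketch of what I would expect (the MAX constant and
the field names are my guesses, not what the patch uses):

    /* Protocol capabilities */
    #define PGLOGICAL_PROTO_MAX_VERSION_NUM 1   /* assumed rename of PGLOGICAL_PROTO_VERSION_NUM */
    #define PGLOGICAL_PROTO_MIN_VERSION_NUM 1

    /* negotiate against the client's advertised range */
    if (data->client_min_proto_version > PGLOGICAL_PROTO_MAX_VERSION_NUM ||
        data->client_max_proto_version < PGLOGICAL_PROTO_MIN_VERSION_NUM)
        ereport(ERROR,
                (errcode(ERRCODE_PROTOCOL_VIOLATION),
                 errmsg("client requested protocol [%u,%u], server supports [%u,%u]",
                        data->client_min_proto_version,
                        data->client_max_proto_version,
                        PGLOGICAL_PROTO_MIN_VERSION_NUM,
                        PGLOGICAL_PROTO_MAX_VERSION_NUM)));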


+ typedef struct HookFuncName
+ typedef struct PGLogicalTupleData
I haven't found these used anywhere, and they are not mentioned in the
documentation. Are these structures needed?


pglogical_config.c:

+               switch(get_param_key(elem->defname))
+               {
+                       val = get_param_value(elem, false, OUTPUT_PARAM_TYPE_UINT32);
Why do we need this line here? Placed before the first case label it is never
executed, and every case already contains some variant of
val = get_param_value(elem, false, TYPE);
as its first line after "case PARAM_*:", so this line should be removed.
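That is, each case would then read like this (the PARAM_* labels and field
names here are only my guesses at the shape, not verbatim from the patch):

    switch (get_param_key(elem->defname))
    {
        case PARAM_MAX_PROTO_VERSION:
            val = get_param_value(elem, false, OUTPUT_PARAM_TYPE_UINT32);
            data->client_max_proto_version = DatumGetUInt32(val);
            break;

        case PARAM_MIN_PROTO_VERSION:
            val = get_param_value(elem, false, OUTPUT_PARAM_TYPE_UINT32);
            data->client_min_proto_version = DatumGetUInt32(val);
            break;

        /* ... remaining cases unchanged ... */
    }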

+       val = get_param(options, "startup_params_format", false, false,
+                                       OUTPUT_PARAM_TYPE_UINT32, &found);
+ 
+       params_format = DatumGetUInt32(val);
Shouldn't we check "found" here? If the parameter is missing, we go on to
work with val (which is Datum(0)) - won't that cause a segfault, or similar?
Additionally, I haven't found any place where the code uses the "found" flag
passed back from get_param() - so if it stays unused, it might be removed.
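If we keep the flag, something along these lines would make a missing
parameter fail cleanly (the error wording is only a suggestion):

    val = get_param(options, "startup_params_format", false, false,
                    OUTPUT_PARAM_TYPE_UINT32, &found);
    if (!found)
        ereport(ERROR,
                (errcode(ERRCODE_INVALID_PARAMETER_VALUE),
                 errmsg("missing required startup parameter \"startup_params_format\"")));

    params_format = DatumGetUInt32(val);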


pglogical_output.c:

+               elog(DEBUG1, "Binary mode rejected: Server and client endian sizeof(Datum) mismatch");
+               return false;
+       }
+ 
+       if (data->client_binary_sizeofint != 0
+               && data->client_binary_sizeofint != sizeof(int))
+       {
+               elog(DEBUG1, "Binary mode rejected: Server and client endian sizeof(int) mismatch");
+               return false;
+       }
+ 
+       if (data->client_binary_sizeoflong != 0
+               && data->client_binary_sizeoflong != sizeof(long))
+       {
+               elog(DEBUG1, "Binary mode rejected: Server and client endian sizeof(long) mismatch");
Isn't "endian" here a copy-paste leftover from the first error? These error
messages should rather look like:
                elog(DEBUG1, "Binary mode rejected: Server and client sizeof(Datum) mismatch");

+ static void pg_decode_shutdown(LogicalDecodingContext * ctx)
In pg_decode_startup we create the main memory context and the hooks memory
context. In pg_decode_shutdown we delete the hooks memory context but not the
main one. Is this OK, or should we also add:
MemoryContextDelete(data->context);
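i.e. something like this (I'm assuming the name of the hooks context field
as I read it from pg_decode_startup):

    static void
    pg_decode_shutdown(LogicalDecodingContext *ctx)
    {
        PGLogicalOutputData *data = ctx->output_plugin_private;

        /* existing cleanup of the hooks memory context */
        MemoryContextDelete(data->hooks_mctx);

        /* proposed addition: free the plugin's main context too */
        MemoryContextDelete(data->context);
    }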
