The following review has been posted through the commitfest application:
make installcheck-world:  tested, failed
Implements feature:       tested, passed
Spec compliant:           tested, failed
Documentation:            not tested

When running gmake installcheck, two regression tests (int8 and numeric) fail:
[vagrant@localhost regress]$ cat /home/vagrant/postgresql/src/test/regress/regression.diffs
*** /home/vagrant/postgresql/src/test/regress/expected/int8.out 2016-02-11 22:41:33.983260509 -0500
--- /home/vagrant/postgresql/src/test/regress/results/int8.out  2016-02-11 22:51:58.631238323 -0500
***************
*** 583,593 ****
  SELECT '' AS to_char_13, to_char(q2, 'L9999999999999999.000')  FROM INT8_TBL;
   to_char_13 |        to_char         
  ------------+------------------------
!             |                456.000
!             |   4567890123456789.000
!             |                123.000
!             |   4567890123456789.000
!             |  -4567890123456789.000
  (5 rows)
  
  SELECT '' AS to_char_14, to_char(q2, 'FM9999999999999999.999') FROM INT8_TBL;
--- 583,593 ----
  SELECT '' AS to_char_13, to_char(q2, 'L9999999999999999.000')  FROM INT8_TBL;
   to_char_13 |        to_char         
  ------------+------------------------
!             | $              456.000
!             | $ 4567890123456789.000
!             | $              123.000
!             | $ 4567890123456789.000
!             | $-4567890123456789.000
  (5 rows)
  
  SELECT '' AS to_char_14, to_char(q2, 'FM9999999999999999.999') FROM INT8_TBL;

======================================================================

*** /home/vagrant/postgresql/src/test/regress/expected/numeric.out      2016-02-11 22:41:33.993260509 -0500
--- /home/vagrant/postgresql/src/test/regress/results/numeric.out       2016-02-11 22:51:58.865238315 -0500
***************
*** 1061,1076 ****
  SELECT '' AS to_char_16, to_char(val, 'L9999999999999999.099999999999999')    FROM num_data;
   to_char_16 |              to_char               
  ------------+------------------------------------
!             |                   .000000000000000
!             |                   .000000000000000
!             |          -34338492.215397047000000
!             |                  4.310000000000000
!             |            7799461.411900000000000
!             |              16397.038491000000000
!             |              93901.577630260000000
!             |          -83028485.000000000000000
!             |              74881.000000000000000
!             |          -24926804.045047420000000
  (10 rows)
  
  SELECT '' AS to_char_17, to_char(val, 'FM9999999999999999.99999999999999')    FROM num_data;
--- 1061,1076 ----
  SELECT '' AS to_char_16, to_char(val, 'L9999999999999999.099999999999999')    FROM num_data;
   to_char_16 |              to_char               
  ------------+------------------------------------
!             | $                 .000000000000000
!             | $                 .000000000000000
!             | $        -34338492.215397047000000
!             | $                4.310000000000000
!             | $          7799461.411900000000000
!             | $            16397.038491000000000
!             | $            93901.577630260000000
!             | $        -83028485.000000000000000
!             | $            74881.000000000000000
!             | $        -24926804.045047420000000
  (10 rows)
  
  SELECT '' AS to_char_17, to_char(val, 'FM9999999999999999.99999999999999')      FROM num_data;

======================================================================

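The '$' in the results output looks like a locale effect rather than something
introduced by the patch: the 'L' template pattern in to_char() substitutes the
currency symbol from lc_monetary, and the expected files assume a C locale,
where that symbol is empty. That suggests my cluster was initialized with a
non-C lc_monetary; this is my guess at the cause, not something I have
confirmed against the patch. A quick way to check:

SHOW lc_monetary;
-- With lc_monetary = 'C' the 'L' pattern prints only padding, while e.g.
-- 'en_US.UTF-8' prints '$', matching the diffs above.
SELECT to_char(456::int8, 'L9999999999999999.000');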

The feature seems to work as described, but is it necessary to enclose
multiple GUC settings in parentheses? This seems to be a deviation from the
usual syntax of altering multiple settings separated by commas.

I will test further once I receive a response from the author.

The new status of this patch is: Waiting on Author

