[jira] [Commented] (HAWQ-1578) Regression Test (Feature->Ranger)Failed because pxfwritable_import_beginscan function was not found

2018-01-02 Thread Chiyang Wan (JIRA)

[ 
https://issues.apache.org/jira/browse/HAWQ-1578?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16309258#comment-16309258
 ] 

Chiyang Wan commented on HAWQ-1578:
---

Sorry for introducing the bug; I will fix it as soon as possible.

> Regression Test (Feature->Ranger)Failed because pxfwritable_import_beginscan 
> function was not found 
> 
>
> Key: HAWQ-1578
> URL: https://issues.apache.org/jira/browse/HAWQ-1578
> Project: Apache HAWQ
>  Issue Type: Bug
>  Components: PXF, Tests
>Reporter: WANG Weinan
>Assignee: Chiyang Wan
>
> TestHawqRanger failed when running PXFHiveTest and PXFHBaseTest; the test 
> log is shown as follows:
> Note: Google Test filter = TestHawqRanger.PXFHiveTest
> [==] Running 1 test from 1 test case.
> [--] Global test environment set-up.
> [--] 1 test from TestHawqRanger
> [ RUN  ] TestHawqRanger.PXFHiveTest
> lib/sql_util.cpp:197: Failure
> Value of: is_sql_ans_diff
>   Actual: true
> Expected: false
> lib/sql_util.cpp:203: Failure
> Value of: true
>   Actual: true
> Expected: false
> [  FAILED  ] TestHawqRanger.PXFHiveTest (89777 ms)
> [--] 1 test from TestHawqRanger (89777 ms total)
> [--] Global test environment tear-down
> [==] 1 test from 1 test case ran. (89777 ms total)
> [  PASSED  ] 0 tests.
> [  FAILED  ] 1 test, listed below:
> [  FAILED  ] TestHawqRanger.PXFHiveTest
>  1 FAILED TEST
> [125/133] TestHawqRanger.PXFHiveTest returned/aborted with exit code 1 (89787 
> ms)
> [128/133] TestHawqRanger.PXFHBaseTest (87121 ms)  
>   
> Note: Google Test filter = TestHawqRanger.PXFHBaseTest
> [==] Running 1 test from 1 test case.
> [--] Global test environment set-up.
> [--] 1 test from TestHawqRanger
> [ RUN  ] TestHawqRanger.PXFHBaseTest
> lib/sql_util.cpp:197: Failure
> Value of: is_sql_ans_diff
>   Actual: true
> Expected: false
> lib/sql_util.cpp:203: Failure
> Value of: true
>   Actual: true
> Expected: false
> [  FAILED  ] TestHawqRanger.PXFHBaseTest (87098 ms)
> [--] 1 test from TestHawqRanger (87098 ms total)
> [--] Global test environment tear-down
> [==] 1 test from 1 test case ran. (87099 ms total)
> [  PASSED  ] 0 tests.
> [  FAILED  ] 1 test, listed below:
> [  FAILED  ] TestHawqRanger.PXFHBaseTest
> We can find a suspicious log entry in the master segment log file:
> 2018-01-03 05:21:30.170970 
> UTC,"gpadmin","hawq_feature_test_db",p109703,th-290256608,"127.0.0.1","56288",2018-01-03
>  05:21:29 
> UTC,14669,con2342,cmd4,seg-1,,,x14669,sx1,"ERROR","XX000","pxfwritable_import_beginscan
>  function was not found (nodeExternalscan.c:310)",,"select * from 
> test_hbase;",0,,"nodeExternalscan.c",310,"Stack trace:
> 1    0x8cf31e postgres errstart (elog.c:505)
> 2    0x8d11bb postgres elog_finish (elog.c:1459)
> 3    0x69134a postgres ExecInitExternalScan (nodeExternalscan.c:215)
> 4    0x670b9d postgres ExecInitNode (execProcnode.c:371)
> 5    0x69b7d1 postgres ExecInitMotion (nodeMotion.c:1096)
> 6    0x670064 postgres ExecInitNode (execProcnode.c:629)
> 7    0x66a407 postgres ExecutorStart (execMain.c:2048)
> 8    0x7f8fcd postgres PortalStart (pquery.c:1308)
> 9    0x7f0628 postgres  (postgres.c:1795)
> 10   0x7f1cb0 postgres PostgresMain (postgres.c:4897)
> 11   0x7a40c0 postgres  (postmaster.c:5486)
> 12   0x7a6e89 postgres PostmasterMain (postmaster.c:1459)
> 13   0x4a5a59 postgres main (main.c:226)
> 14   0x7fceea8a1d1d libc.so.6 __libc_start_main (??:0)
> 15   0x4a5ad9 postgres  (??:0)



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Commented] (HAWQ-1578) Regression Test (Feature->Ranger)Failed because pxfwritable_import_beginscan function was not found

2018-01-02 Thread Hongxu Ma (JIRA)

[ 
https://issues.apache.org/jira/browse/HAWQ-1578?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16309225#comment-16309225
 ] 

Hongxu Ma commented on HAWQ-1578:
-

[~Chiyang Wan]
This was caused by your recent commit:
https://github.com/apache/incubator-hawq/commit/76e38c53b9377a055e6a2db6f63dc2e984c25025

Please help fix it, thanks!




[jira] [Commented] (HAWQ-1578) Regression Test (Feature->Ranger)Failed because pxfwritable_import_beginscan function was not found

2018-01-02 Thread Radar Lei (JIRA)

[ 
https://issues.apache.org/jira/browse/HAWQ-1578?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16309224#comment-16309224
 ] 

Radar Lei commented on HAWQ-1578:
-

Seems this is related to commit 76e38c53b9377a055e6a2db6f63dc2e984c25025:
HAWQ-1565. Include Pluggable Storage Format Framework in External Table Scan

Assigning to [~chiyang1] for further investigation.



[jira] [Assigned] (HAWQ-1578) Regression Test (Feature->Ranger)Failed because pxfwritable_import_beginscan function was not found

2018-01-02 Thread Radar Lei (JIRA)

 [ 
https://issues.apache.org/jira/browse/HAWQ-1578?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Radar Lei reassigned HAWQ-1578:
---

Assignee: Chiyang Wan  (was: Ed Espino)



[jira] [Updated] (HAWQ-1578) Regression Test (Feature->Ranger)Failed because pxfwritable_import_beginscan function was not found

2018-01-02 Thread WANG Weinan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HAWQ-1578?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

WANG Weinan updated HAWQ-1578:
--
Summary: Regression Test (Feature->Ranger)Failed because 
pxfwritable_import_beginscan function was not found   (was: Regression Test 
(Feature->Ranger)Failed due to )



[jira] [Updated] (HAWQ-1578) Regression Test (Feature->Ranger)Failed due to

2018-01-02 Thread WANG Weinan (JIRA)

 [ 
https://issues.apache.org/jira/browse/HAWQ-1578?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

WANG Weinan updated HAWQ-1578:
--
Summary: Regression Test (Feature->Ranger)Failed due to   (was: Feature 
Test Failed)



[jira] [Created] (HAWQ-1578) Feature Test Failed

2018-01-02 Thread WANG Weinan (JIRA)
WANG Weinan created HAWQ-1578:
-

 Summary: Feature Test Failed
 Key: HAWQ-1578
 URL: https://issues.apache.org/jira/browse/HAWQ-1578
 Project: Apache HAWQ
  Issue Type: Bug
  Components: PXF, Tests
Reporter: WANG Weinan
Assignee: Ed Espino


TestHawqRanger failed when running PXFHiveTest and PXFHBaseTest; the test log is shown as follows:

Note: Google Test filter = TestHawqRanger.PXFHiveTest
[==] Running 1 test from 1 test case.
[--] Global test environment set-up.
[--] 1 test from TestHawqRanger
[ RUN  ] TestHawqRanger.PXFHiveTest
lib/sql_util.cpp:197: Failure
Value of: is_sql_ans_diff
  Actual: true
Expected: false
lib/sql_util.cpp:203: Failure
Value of: true
  Actual: true
Expected: false
[  FAILED  ] TestHawqRanger.PXFHiveTest (89777 ms)
[--] 1 test from TestHawqRanger (89777 ms total)

[--] Global test environment tear-down
[==] 1 test from 1 test case ran. (89777 ms total)
[  PASSED  ] 0 tests.
[  FAILED  ] 1 test, listed below:
[  FAILED  ] TestHawqRanger.PXFHiveTest

 1 FAILED TEST
[125/133] TestHawqRanger.PXFHiveTest returned/aborted with exit code 1 (89787 
ms)
[128/133] TestHawqRanger.PXFHBaseTest (87121 ms)
Note: Google Test filter = TestHawqRanger.PXFHBaseTest
[==] Running 1 test from 1 test case.
[--] Global test environment set-up.
[--] 1 test from TestHawqRanger
[ RUN  ] TestHawqRanger.PXFHBaseTest
lib/sql_util.cpp:197: Failure
Value of: is_sql_ans_diff
  Actual: true
Expected: false
lib/sql_util.cpp:203: Failure
Value of: true
  Actual: true
Expected: false
[  FAILED  ] TestHawqRanger.PXFHBaseTest (87098 ms)
[--] 1 test from TestHawqRanger (87098 ms total)

[--] Global test environment tear-down
[==] 1 test from 1 test case ran. (87099 ms total)
[  PASSED  ] 0 tests.
[  FAILED  ] 1 test, listed below:
[  FAILED  ] TestHawqRanger.PXFHBaseTest


We can find a suspicious log entry in the master segment log file:

2018-01-03 05:21:30.170970 
UTC,"gpadmin","hawq_feature_test_db",p109703,th-290256608,"127.0.0.1","56288",2018-01-03
 05:21:29 
UTC,14669,con2342,cmd4,seg-1,,,x14669,sx1,"ERROR","XX000","pxfwritable_import_beginscan
 function was not found (nodeExternalscan.c:310)",,"select * from 
test_hbase;",0,,"nodeExternalscan.c",310,"Stack trace:
1    0x8cf31e postgres errstart (elog.c:505)
2    0x8d11bb postgres elog_finish (elog.c:1459)
3    0x69134a postgres ExecInitExternalScan (nodeExternalscan.c:215)
4    0x670b9d postgres ExecInitNode (execProcnode.c:371)
5    0x69b7d1 postgres ExecInitMotion (nodeMotion.c:1096)
6    0x670064 postgres ExecInitNode (execProcnode.c:629)
7    0x66a407 postgres ExecutorStart (execMain.c:2048)
8    0x7f8fcd postgres PortalStart (pquery.c:1308)
9    0x7f0628 postgres  (postgres.c:1795)
10   0x7f1cb0 postgres PostgresMain (postgres.c:4897)
11   0x7a40c0 postgres  (postmaster.c:5486)
12   0x7a6e89 postgres PostmasterMain (postmaster.c:1459)
13   0x4a5a59 postgres main (main.c:226)
14   0x7fceea8a1d1d libc.so.6 __libc_start_main (??:0)
15   0x4a5ad9 postgres  (??:0)





[GitHub] incubator-hawq issue #1325: HAWQ-1125. Running pl/python related feature_tes...

2018-01-02 Thread radarwave
Github user radarwave commented on the issue:

https://github.com/apache/incubator-hawq/pull/1325
  
Merged, please close this PR. Thanks.


---


[GitHub] incubator-hawq issue #1324: HAWQ-1573. Clear debug_query_string in proc_exit...

2018-01-02 Thread radarwave
Github user radarwave commented on the issue:

https://github.com/apache/incubator-hawq/pull/1324
  
Merged, please close this PR. Thanks.


---


[GitHub] incubator-hawq issue #1325: HAWQ-1125. Running pl/python related feature_tes...

2018-01-02 Thread radarwave
Github user radarwave commented on the issue:

https://github.com/apache/incubator-hawq/pull/1325
  
+1


---


[GitHub] incubator-hawq pull request #1326: HAWQ-1575. Implemented readable Parquet p...

2018-01-02 Thread denalex
Github user denalex commented on a diff in the pull request:

https://github.com/apache/incubator-hawq/pull/1326#discussion_r159308210
  
--- Diff: 
pxf/pxf-hdfs/src/main/java/org/apache/hawq/pxf/plugins/hdfs/utilities/HdfsUtilities.java
 ---
@@ -151,18 +153,42 @@ public static boolean isThreadSafe(String dataDir, String compCodec) {
      * @param fsp file split to be serialized
      * @return byte serialization of fsp
      * @throws IOException if I/O errors occur while writing to the underlying
-     * stream
+     * stream
      */
     public static byte[] prepareFragmentMetadata(FileSplit fsp)
             throws IOException {
-        ByteArrayOutputStream byteArrayStream = new ByteArrayOutputStream();
-        ObjectOutputStream objectStream = new ObjectOutputStream(byteArrayStream);
-        objectStream.writeLong(fsp.getStart());
-        objectStream.writeLong(fsp.getLength());
-        objectStream.writeObject(fsp.getLocations());
+
+        return prepareFragmentMetadata(fsp.getStart(), fsp.getLength(), fsp.getLocations());
+
+    }
+
+    public static byte[] prepareFragmentMetadata(long start, long length, String[] locations)
--- End diff --

Or, better, incorporate the two lines from this function into the parent 
function, if it is only used once.


---
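For context, the raw-field overload under discussion can be sketched as a self-contained class. The class name, stream handling, and main method here are assumptions for illustration, not the committed PXF code:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;

// Hypothetical sketch of the raw-field overload discussed in the review:
// the FileSplit-based method in the diff delegates its field extraction here.
public class FragmentMetadataSketch {

    public static byte[] prepareFragmentMetadata(long start, long length, String[] locations)
            throws IOException {
        ByteArrayOutputStream byteArrayStream = new ByteArrayOutputStream();
        ObjectOutputStream objectStream = new ObjectOutputStream(byteArrayStream);
        objectStream.writeLong(start);        // fragment start offset
        objectStream.writeLong(length);       // fragment length
        objectStream.writeObject(locations);  // host locations of the fragment
        objectStream.flush();
        return byteArrayStream.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] meta = prepareFragmentMetadata(0L, 1024L, new String[]{"host1", "host2"});
        // ObjectOutputStream writes a stream header plus the fields,
        // so the serialized metadata is always non-empty
        System.out.println(meta.length > 0);
    }
}
```

The reviewer's point is that if this overload ends up with a single caller, its body could simply be inlined into the FileSplit-based parent method.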


[GitHub] incubator-hawq pull request #1326: HAWQ-1575. Implemented readable Parquet p...

2018-01-02 Thread denalex
Github user denalex commented on a diff in the pull request:

https://github.com/apache/incubator-hawq/pull/1326#discussion_r159306278
  
--- Diff: 
pxf/pxf-hdfs/src/main/java/org/apache/hawq/pxf/plugins/hdfs/ParquetDataFragmenter.java
 ---
@@ -0,0 +1,103 @@
+package org.apache.hawq.pxf.plugins.hdfs;
+
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ * 
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ * 
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.fs.Path;
+import org.apache.hadoop.mapreduce.InputSplit;
+import org.apache.hadoop.mapreduce.lib.input.FileSplit;
+import org.apache.hadoop.mapred.JobConf;
+import org.apache.hadoop.mapreduce.Job;
+import org.apache.hawq.pxf.api.Fragment;
+import org.apache.hawq.pxf.api.Fragmenter;
+import org.apache.hawq.pxf.api.utilities.InputData;
+import org.apache.hawq.pxf.plugins.hdfs.utilities.HdfsUtilities;
+import org.apache.parquet.format.converter.ParquetMetadataConverter;
+import org.apache.parquet.hadoop.ParquetFileReader;
+import org.apache.parquet.hadoop.ParquetInputFormat;
+import org.apache.parquet.example.data.Group;
+import org.apache.parquet.hadoop.metadata.ParquetMetadata;
+import org.apache.parquet.schema.MessageType;
+
+import java.io.IOException;
+import java.util.ArrayList;
+import java.util.List;
+
+public class ParquetDataFragmenter extends Fragmenter {
+    private Job job;
+
+    public ParquetDataFragmenter(InputData md) {
+        super(md);
+        JobConf jobConf = new JobConf(new Configuration(), ParquetDataFragmenter.class);
+        try {
+            job = Job.getInstance(jobConf);
+        } catch (IOException e) {
+            throw new RuntimeException("Unable to instantiate a job for reading fragments", e);
+        }
+    }
+
+    @Override
+    public List getFragments() throws Exception {
+        String absoluteDataPath = HdfsUtilities.absoluteDataPath(inputData.getDataSource());
+        ArrayList splits = getSplits(new Path(absoluteDataPath));
--- End diff --

It is usually best to declare the type as List, especially since there are no 
direct-access calls that would justify narrowing this to ArrayList.


---
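The advice above (declare the variable as the interface type) can be shown with a minimal standalone sketch; the class name and values are hypothetical, not the PR's code:

```java
import java.util.ArrayList;
import java.util.List;

public class DeclareAsInterface {

    // Declared as List: callers stay decoupled from the concrete class.
    // Narrowing to ArrayList is only justified by ArrayList-specific calls
    // such as ensureCapacity() or trimToSize().
    static List<String> getSplits() {
        List<String> splits = new ArrayList<>();  // interface type on the left
        splits.add("split-0");
        splits.add("split-1");
        return splits;
    }

    public static void main(String[] args) {
        List<String> splits = getSplits();
        System.out.println(splits.size()); // prints 2
    }
}
```

Switching the backing implementation later (say, to LinkedList) would then touch only the one line that constructs it.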


[GitHub] incubator-hawq pull request #1326: HAWQ-1575. Implemented readable Parquet p...

2018-01-02 Thread denalex
Github user denalex commented on a diff in the pull request:

https://github.com/apache/incubator-hawq/pull/1326#discussion_r159308579
  
--- Diff: pxf/pxf-service/src/scripts/pxf-env.sh ---
@@ -54,3 +54,5 @@ export HADOOP_DISTRO=${HADOOP_DISTRO}
 # Parent directory of Hadoop client installation (optional)
 # used in case of tarball-based installation when all clients are under a 
common parent directory
 export HADOOP_ROOT=${HADOOP_ROOT}
+
+export 
CATALINA_OPTS="-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=5005"
--- End diff --

This is for debugging and should not be committed!


---


[GitHub] incubator-hawq pull request #1326: HAWQ-1575. Implemented readable Parquet p...

2018-01-02 Thread shivzone
Github user shivzone commented on a diff in the pull request:

https://github.com/apache/incubator-hawq/pull/1326#discussion_r159298493
  
--- Diff: 
pxf/pxf-hdfs/src/main/java/org/apache/hawq/pxf/plugins/hdfs/ParquetFileAccessor.java
 ---
@@ -0,0 +1,168 @@
+package org.apache.hawq.pxf.plugins.hdfs;
+
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ * 
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ * 
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.fs.Path;
+import org.apache.hadoop.mapred.FileSplit;
+import org.apache.hawq.pxf.api.OneRow;
+import org.apache.hawq.pxf.api.ReadAccessor;
+import org.apache.hawq.pxf.api.utilities.InputData;
+import org.apache.hawq.pxf.api.utilities.Plugin;
+import org.apache.hawq.pxf.plugins.hdfs.utilities.HdfsUtilities;
+
+import org.apache.parquet.column.page.PageReadStore;
+import org.apache.parquet.example.data.Group;
+import org.apache.parquet.example.data.simple.convert.GroupRecordConverter;
+import org.apache.parquet.format.converter.ParquetMetadataConverter;
+import org.apache.parquet.hadoop.ParquetFileReader;
+import org.apache.parquet.io.ColumnIOFactory;
+import org.apache.parquet.io.MessageColumnIO;
+import org.apache.parquet.io.RecordReader;
+import org.apache.parquet.schema.MessageType;
+
+import java.io.IOException;
+import java.util.Iterator;
+
+/**
+ * Parquet file accessor.
--- End diff --

Also mention what exactly the accessor call returns (record or chunk etc)


---


[GitHub] incubator-hawq pull request #1326: HAWQ-1575. Implemented readable Parquet p...

2018-01-02 Thread shivzone
Github user shivzone commented on a diff in the pull request:

https://github.com/apache/incubator-hawq/pull/1326#discussion_r159297466
  
--- Diff: pxf/pxf-hdfs/src/main/java/org/apache/hawq/pxf/plugins/hdfs/ParquetDataFragmenter.java ---
@@ -0,0 +1,103 @@
+package org.apache.hawq.pxf.plugins.hdfs;
+
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ * 
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ * 
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.fs.Path;
+import org.apache.hadoop.mapreduce.InputSplit;
+import org.apache.hadoop.mapreduce.lib.input.FileSplit;
+import org.apache.hadoop.mapred.JobConf;
+import org.apache.hadoop.mapreduce.Job;
+import org.apache.hawq.pxf.api.Fragment;
+import org.apache.hawq.pxf.api.Fragmenter;
+import org.apache.hawq.pxf.api.utilities.InputData;
+import org.apache.hawq.pxf.plugins.hdfs.utilities.HdfsUtilities;
+import org.apache.parquet.format.converter.ParquetMetadataConverter;
+import org.apache.parquet.hadoop.ParquetFileReader;
+import org.apache.parquet.hadoop.ParquetInputFormat;
+import org.apache.parquet.example.data.Group;
+import org.apache.parquet.hadoop.metadata.ParquetMetadata;
+import org.apache.parquet.schema.MessageType;
+
+import java.io.IOException;
+import java.util.ArrayList;
+import java.util.List;
+
+public class ParquetDataFragmenter extends Fragmenter {
+private Job job;
+
+public ParquetDataFragmenter(InputData md) {
+super(md);
+JobConf jobConf = new JobConf(new Configuration(), ParquetDataFragmenter.class);
+try {
+job = Job.getInstance(jobConf);
+} catch (IOException e) {
+throw new RuntimeException("Unable to instantiate a job for reading fragments", e);
+}
+}
+
+
+@Override
+public List<Fragment> getFragments() throws Exception {
+String absoluteDataPath = HdfsUtilities.absoluteDataPath(inputData.getDataSource());
+ArrayList<InputSplit> splits = getSplits(new Path(absoluteDataPath));
+
+for (InputSplit split : splits) {
+FileSplit fsp = (FileSplit) split;
+
+String filepath = fsp.getPath().toUri().getPath();
+String[] hosts = fsp.getLocations();
+
+Path file = new Path(filepath);
+
+ParquetMetadata metadata = ParquetFileReader.readFooter(
+job.getConfiguration(), file, ParquetMetadataConverter.NO_FILTER);
+MessageType schema = metadata.getFileMetaData().getSchema();
+
+byte[] fragmentMetadata = HdfsUtilities.prepareFragmentMetadata(fsp.getStart(), fsp.getLength(), fsp.getLocations());
--- End diff --

Can we simply use prepareFragmentMetadata(fsp)?


---


[GitHub] incubator-hawq pull request #1326: HAWQ-1575. Implemented readable Parquet p...

2018-01-02 Thread shivzone
Github user shivzone commented on a diff in the pull request:

https://github.com/apache/incubator-hawq/pull/1326#discussion_r159297714
  
--- Diff: pxf/pxf-hdfs/src/main/java/org/apache/hawq/pxf/plugins/hdfs/utilities/HdfsUtilities.java ---
@@ -151,18 +153,42 @@ public static boolean isThreadSafe(String dataDir, String compCodec) {
  * @param fsp file split to be serialized
  * @return byte serialization of fsp
  * @throws IOException if I/O errors occur while writing to the underlying
- * stream
+ * stream
  */
 public static byte[] prepareFragmentMetadata(FileSplit fsp)
 throws IOException {
-ByteArrayOutputStream byteArrayStream = new ByteArrayOutputStream();
-ObjectOutputStream objectStream = new ObjectOutputStream(
-byteArrayStream);
-objectStream.writeLong(fsp.getStart());
-objectStream.writeLong(fsp.getLength());
-objectStream.writeObject(fsp.getLocations());
+
+return prepareFragmentMetadata(fsp.getStart(), fsp.getLength(), fsp.getLocations());
+
+}
+
+public static byte[] prepareFragmentMetadata(long start, long length, String[] locations)
--- End diff --

This can be made private 
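The refactor under review can be sketched in isolation. This is a standalone illustration, not the PXF source: SimpleSplit is a stand-in for Hadoop's FileSplit, and the public overload delegates to a private helper that owns the serialization of start offset, length, and host locations, as the reviewer suggests.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;

public class FragmentMetadataUtil {

    /** Minimal stand-in for Hadoop's FileSplit, for illustration only. */
    public static class SimpleSplit {
        final long start;
        final long length;
        final String[] locations;

        public SimpleSplit(long start, long length, String[] locations) {
            this.start = start;
            this.length = length;
            this.locations = locations;
        }
    }

    /** Public API: serialize a split by delegating to the private helper. */
    public static byte[] prepareFragmentMetadata(SimpleSplit fsp) throws IOException {
        return prepareFragmentMetadata(fsp.start, fsp.length, fsp.locations);
    }

    // Kept private, per the review comment: the wire format
    // (start, then length, then locations) is defined in one place.
    private static byte[] prepareFragmentMetadata(long start, long length, String[] locations)
            throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        ObjectOutputStream out = new ObjectOutputStream(bytes);
        out.writeLong(start);
        out.writeLong(length);
        out.writeObject(locations);
        out.flush();
        return bytes.toByteArray();
    }
}
```

Keeping only the FileSplit-taking overload public means callers cannot accidentally bypass the split abstraction, while the private helper remains the single definition of the byte layout.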


---


[GitHub] incubator-hawq pull request #1326: HAWQ-1575. Implemented readable Parquet p...

2018-01-02 Thread shivzone
Github user shivzone commented on a diff in the pull request:

https://github.com/apache/incubator-hawq/pull/1326#discussion_r159299485
  
--- Diff: pxf/pxf-hdfs/src/main/java/org/apache/hawq/pxf/plugins/hdfs/ParquetFileAccessor.java ---
@@ -0,0 +1,168 @@
+package org.apache.hawq.pxf.plugins.hdfs;
+
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ * 
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ * 
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.fs.Path;
+import org.apache.hadoop.mapred.FileSplit;
+import org.apache.hawq.pxf.api.OneRow;
+import org.apache.hawq.pxf.api.ReadAccessor;
+import org.apache.hawq.pxf.api.utilities.InputData;
+import org.apache.hawq.pxf.api.utilities.Plugin;
+import org.apache.hawq.pxf.plugins.hdfs.utilities.HdfsUtilities;
+
+import org.apache.parquet.column.page.PageReadStore;
+import org.apache.parquet.example.data.Group;
+import org.apache.parquet.example.data.simple.convert.GroupRecordConverter;
+import org.apache.parquet.format.converter.ParquetMetadataConverter;
+import org.apache.parquet.hadoop.ParquetFileReader;
+import org.apache.parquet.io.ColumnIOFactory;
+import org.apache.parquet.io.MessageColumnIO;
+import org.apache.parquet.io.RecordReader;
+import org.apache.parquet.schema.MessageType;
+
+import java.io.IOException;
+import java.util.Iterator;
+
+/**
+ * Parquet file accessor.
+ */
+public class ParquetFileAccessor extends Plugin implements ReadAccessor {
+private ParquetFileReader reader;
+private MessageColumnIO columnIO;
+private RecordIterator recordIterator;
+private MessageType schema;
+
+
+private class RecordIterator implements Iterator<OneRow> {
+
+private final ParquetFileReader reader;
+private PageReadStore currentRowGroup;
+private RecordReader<Group> recordReader;
+private long rowsRemainedInRowGroup;
+
+public RecordIterator(ParquetFileReader reader) {
+this.reader = reader;
+readNextRowGroup();
+}
+
+@Override
+public boolean hasNext() {
+return rowsRemainedInRowGroup > 0;
+}
+
+@Override
+public OneRow next() {
+return new OneRow(null, readNextGroup());
+}
+
+@Override
+public void remove() {
+throw new UnsupportedOperationException();
+}
+
+private void readNextRowGroup() {
+try {
+currentRowGroup = reader.readNextRowGroup();
+} catch (IOException e) {
+throw new RuntimeException("Error occurred during reading new row group", e);
+}
+if (currentRowGroup == null)
+return;
+rowsRemainedInRowGroup = currentRowGroup.getRowCount();
+recordReader = columnIO.getRecordReader(currentRowGroup, new GroupRecordConverter(schema));
+}
+
+private Group readNextGroup() {
+Group g = null;
+if (rowsRemainedInRowGroup == 0) {
+readNextRowGroup();
+if (currentRowGroup != null) {
+g = recordReader.read();
+}
+} else {
+g = recordReader.read();
+if (g == null) {
--- End diff --

Instead of this, why don't we simply invoke recordReader.read() currentRowGroup.getRowCount() times? Decrementing rowsRemainedInRowGroup makes this code look more complicated than it needs to be.
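The suggested simplification can be sketched with stand-in types (RowGroup and its recordReader below are placeholders, not the actual Parquet API): read exactly getRowCount() records from each row group in a bounded loop, so no cross-group remaining-rows counter and no null check on the reader are needed.

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

public class RowGroupScan {

    /** Stand-in for a Parquet row group: a fixed row count plus a record reader. */
    public static class RowGroup {
        private final List<String> rows;

        public RowGroup(List<String> rows) {
            this.rows = rows;
        }

        public long getRowCount() {
            return rows.size();
        }

        public Iterator<String> recordReader() {
            return rows.iterator();
        }
    }

    /**
     * Reads every record by looping exactly getRowCount() times per row
     * group, as the review suggests, instead of decrementing a shared
     * rows-remaining counter across hasNext()/next() calls.
     */
    public static List<String> readAll(List<RowGroup> rowGroups) {
        List<String> out = new ArrayList<>();
        for (RowGroup group : rowGroups) {
            Iterator<String> reader = group.recordReader();
            for (long i = 0, n = group.getRowCount(); i < n; i++) {
                out.add(reader.next());
            }
        }
        return out;
    }
}
```

In a pull-based ReadAccessor the iteration must still be resumable between calls, but the per-group bound can replace the null-check branching shown in the diff.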


---


[GitHub] incubator-hawq pull request #1326: HAWQ-1575. Implemented readable Parquet p...

2018-01-02 Thread shivzone
Github user shivzone commented on a diff in the pull request:

https://github.com/apache/incubator-hawq/pull/1326#discussion_r159298191
  
--- Diff: pxf/pxf-hdfs/src/main/java/org/apache/hawq/pxf/plugins/hdfs/ParquetDataFragmenter.java ---
@@ -0,0 +1,103 @@
+package org.apache.hawq.pxf.plugins.hdfs;
+
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ * 
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ * 
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.fs.Path;
+import org.apache.hadoop.mapreduce.InputSplit;
+import org.apache.hadoop.mapreduce.lib.input.FileSplit;
+import org.apache.hadoop.mapred.JobConf;
+import org.apache.hadoop.mapreduce.Job;
+import org.apache.hawq.pxf.api.Fragment;
+import org.apache.hawq.pxf.api.Fragmenter;
+import org.apache.hawq.pxf.api.utilities.InputData;
+import org.apache.hawq.pxf.plugins.hdfs.utilities.HdfsUtilities;
+import org.apache.parquet.format.converter.ParquetMetadataConverter;
+import org.apache.parquet.hadoop.ParquetFileReader;
+import org.apache.parquet.hadoop.ParquetInputFormat;
+import org.apache.parquet.example.data.Group;
+import org.apache.parquet.hadoop.metadata.ParquetMetadata;
+import org.apache.parquet.schema.MessageType;
+
+import java.io.IOException;
+import java.util.ArrayList;
+import java.util.List;
+
+public class ParquetDataFragmenter extends Fragmenter {
+private Job job;
+
+public ParquetDataFragmenter(InputData md) {
+super(md);
+JobConf jobConf = new JobConf(new Configuration(), ParquetDataFragmenter.class);
+try {
+job = Job.getInstance(jobConf);
+} catch (IOException e) {
+throw new RuntimeException("Unable to instantiate a job for reading fragments", e);
+}
+}
+
+
+@Override
--- End diff --

Comments describing the components of the Fragment data would be useful here.


---


[GitHub] incubator-hawq pull request #1326: HAWQ-1575. Implemented readable Parquet p...

2018-01-02 Thread shivzone
Github user shivzone commented on a diff in the pull request:

https://github.com/apache/incubator-hawq/pull/1326#discussion_r159297386
  
--- Diff: pxf/pxf-hdfs/src/main/java/org/apache/hawq/pxf/plugins/hdfs/utilities/HdfsUtilities.java ---
@@ -151,18 +153,42 @@ public static boolean isThreadSafe(String dataDir, String compCodec) {
  * @param fsp file split to be serialized
  * @return byte serialization of fsp
  * @throws IOException if I/O errors occur while writing to the underlying
- * stream
+ * stream
  */
 public static byte[] prepareFragmentMetadata(FileSplit fsp)
 throws IOException {
-ByteArrayOutputStream byteArrayStream = new ByteArrayOutputStream();
-ObjectOutputStream objectStream = new ObjectOutputStream(
-byteArrayStream);
-objectStream.writeLong(fsp.getStart());
-objectStream.writeLong(fsp.getLength());
-objectStream.writeObject(fsp.getLocations());
+
+return prepareFragmentMetadata(fsp.getStart(), fsp.getLength(), fsp.getLocations());
+
+}
+
+public static byte[] prepareFragmentMetadata(long start, long length, String[] locations)
+throws IOException {
+
+ByteArrayOutputStream byteArrayStream = writeBaseFragmentInfo(start, length, locations);
 
 return byteArrayStream.toByteArray();
+
+}
+
+private static ByteArrayOutputStream writeBaseFragmentInfo(long start, long length, String[] locations) throws IOException {
+ByteArrayOutputStream byteArrayStream = new ByteArrayOutputStream();
+ObjectOutputStream objectStream = new ObjectOutputStream(byteArrayStream);
+objectStream.writeLong(start);
+objectStream.writeLong(length);
+objectStream.writeObject(locations);
+return byteArrayStream;
+}
+
+public static byte[] prepareFragmentMetadata(long start,
--- End diff --

I don't see this function used anywhere?


---


[GitHub] incubator-hawq issue #1324: HAWQ-1573. Clear debug_query_string in proc_exit...

2018-01-02 Thread linwen
Github user linwen commented on the issue:

https://github.com/apache/incubator-hawq/pull/1324
  
+1 


---