[jira] [Commented] (HAWQ-914) Improve user experience of HAWQ's build infrastructure
[ https://issues.apache.org/jira/browse/HAWQ-914?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15414685#comment-15414685 ] Paul Guo commented on HAWQ-914: --- There is also some info that needs to be more platform neutral, e.g. "bzip2-devel is required" > Improve user experience of HAWQ's build infrastructure > -- > > Key: HAWQ-914 > URL: https://issues.apache.org/jira/browse/HAWQ-914 > Project: Apache HAWQ > Issue Type: Improvement > Components: Build >Affects Versions: 2.0.0.0-incubating >Reporter: Roman Shaposhnik >Assignee: Paul Guo > Fix For: 2.0.1.0-incubating > > > This is likely to end up being an umbrella JIRA so feel free to fork off > sub-tasks whenever it makes sense. > As an end-user of HAWQ's build system, I'd like to see the default of the > build system (running configure/etc. with no arguments) to be: > # treating optional missing dependencies with a WARNING similar to what > PostgreSQL configure does in the following example: > {noformat} > checking for bison... no > configure: WARNING: > *** Without Bison you will not be able to build PostgreSQL from CVS nor > *** change any of the parser definition files. You can obtain Bison from > *** a GNU mirror site. (If you are using the official distribution of > *** PostgreSQL then you do not need to worry about this, because the Bison > *** output is pre-generated.) To use a different yacc program (possible, > *** but not recommended), set the environment variable YACC before running > *** 'configure'. > {noformat} > # treating all the missing suggested dependencies by failing the build and > suggesting how to point at binary copies of these missing dependencies > similar to what PostgreSQL configure does in the following example: > {noformat} > checking for -ledit... no > configure: error: readline library not found > If you have readline already installed, see config.log for details on the > failure. It is possible the compiler isn't looking in the proper directory. 
> Use --without-readline to disable readline support. > {noformat} > # treating the core dependencies the same as suggested dependencies, but > obviously without the option of continuing the build without them. -- This message was sent by Atlassian JIRA (v6.3.4#6332)
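The configure-time behavior requested above can be sketched in a few lines. This is an illustrative model only, not HAWQ's or PostgreSQL's actual build code; the dependency names and hint messages are placeholders chosen for the demo:

```python
# Hypothetical sketch of the requested configure behavior: missing
# optional tools produce a WARNING, missing required dependencies fail
# the check with a hint about an override flag. Not HAWQ's real code.
import shutil

def check_dependencies(optional, required, which=shutil.which):
    """Return (warnings, errors) lists for missing build dependencies.

    `optional` and `required` map a probe name to a hint message; `which`
    is injectable so the check can be tested without touching the system.
    """
    warnings, errors = [], []
    for tool, hint in optional.items():
        if which(tool) is None:
            warnings.append(f"WARNING: {tool} not found. {hint}")
    for tool, hint in required.items():
        if which(tool) is None:
            errors.append(f"ERROR: {tool} not found. {hint}")
    return warnings, errors

# Simulate a bare machine for the demo (every lookup fails).
warnings, errors = check_dependencies(
    optional={"bison": "Pre-generated parser output will be used instead."},
    required={"readline": "Use --without-readline to disable readline support."},
    which=lambda tool: None,
)
print("\n".join(warnings + errors))
```

The point of the split is exactly the one the issue makes: a missing optional dependency degrades the build with a visible warning, while a missing required one stops it with actionable advice.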
[jira] [Updated] (HAWQ-992) PXF Hive data type check in Fragmenter too restrictive
[ https://issues.apache.org/jira/browse/HAWQ-992?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Shivram Mani updated HAWQ-992: -- Description: HiveDataFragmenter used by both HiveText and HiveRC profiles has a very strict type check. Hawq type numeric(10,10) is compatible with hive's decimal(10,10) But Hawq type numeric is not compatible with hive's decimal(10,10) Similar issue exists with other data types which have variable optional arguments. The type check should be modified to allow a hawq type that is a compatible type but without optional precision/length arguments to work with the corresponding hive type. Support the following additional hive data types: date, varchar, char was: HiveDataFragmenter used by both HiveText and HiveRC profiles has a very strict type check. Hawq type numeric(10,10) is compatible with hive's decimal(10,10) Hawq type numeric is not compatible with hive's decimal(10,10) Similar issue exits with other data types which have variable optional arguments. The type check should be modified to allow hawq type that is a compabitle type but without optional precision/length arguments to work with the corresponding hive type. Support following additional hive data types: date, varchar, char > PXF Hive data type check in Fragmenter too restrictive > -- > > Key: HAWQ-992 > URL: https://issues.apache.org/jira/browse/HAWQ-992 > Project: Apache HAWQ > Issue Type: Bug > Components: PXF >Reporter: Shivram Mani >Assignee: Shivram Mani > Fix For: backlog > > > HiveDataFragmenter used by both HiveText and HiveRC profiles has a very > strict type check. > Hawq type numeric(10,10) is compatible with hive's decimal(10,10) > But Hawq type numeric is not compatible with hive's decimal(10,10) > Similar issue exists with other data types which have variable optional > arguments. 
The type check should be modified to allow a hawq type that is a > compatible type but without optional precision/length arguments to work with > the corresponding hive type. > Support the following additional hive data types: date, varchar, char -- This message was sent by Atlassian JIRA (v6.3.4#6332)
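The relaxed check the issue asks for can be illustrated like this. It is a hypothetical Python sketch, not PXF's actual Java implementation, and the base-type mapping is an assumption chosen for the demo:

```python
# Illustrative sketch (not PXF's HiveDataFragmenter): compare types by
# their base name, so a HAWQ type without precision/length arguments
# still matches the corresponding parameterized Hive type.
import re

# Assumed HAWQ-base-type -> Hive-base-type mapping for the demo only;
# PXF's real compatibility table is larger and lives in Java.
HAWQ_TO_HIVE = {"numeric": "decimal", "varchar": "varchar", "char": "char", "date": "date"}

def base_type(type_name):
    """Strip optional (precision[,scale]) arguments: numeric(10,10) -> numeric."""
    return re.sub(r"\(.*\)$", "", type_name.strip()).lower()

def compatible(hawq_type, hive_type):
    """True when the HAWQ base type maps to the Hive base type."""
    return HAWQ_TO_HIVE.get(base_type(hawq_type)) == base_type(hive_type)

print(compatible("numeric(10,10)", "decimal(10,10)"))  # strict match still works -> True
print(compatible("numeric", "decimal(10,10)"))         # relaxed: no precision on HAWQ side -> True
```

Comparing base names rather than full type strings is what makes numeric compatible with decimal(10,10) while leaving genuinely mismatched types rejected.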
[jira] [Updated] (HAWQ-975) Queries run much slower with 'explain analyze' than without 'explain analyze'
[ https://issues.apache.org/jira/browse/HAWQ-975?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Goden Yao updated HAWQ-975: --- Priority: Critical (was: Major) > Queries run much slower with 'explain analyze' than without 'explain > analyze' > > > Key: HAWQ-975 > URL: https://issues.apache.org/jira/browse/HAWQ-975 > Project: Apache HAWQ > Issue Type: Bug > Components: Core >Reporter: Chunling Wang >Assignee: Lei Chang >Priority: Critical > Labels: performance > Fix For: 2.0.1.0-incubating > > > When we run queries with 'explain analyze' in an AWS cluster, the total running > time is about 2-3 times longer than without 'explain analyze'. > Here is a group of TPC-H results for queries with 'explain analyze' and > queries without 'explain analyze'. > ||query ||without 'explain analyze' ||with 'explain analyze' > ||multiple > |TPCH_Query_01| 311843 | 818658 | 2.63 > |TPCH_Query_02| 34675 | 117884 | 3.40 > |TPCH_Query_03| 166155 | 422131 | 2.54 > |TPCH_Query_04| 157807 | 507143 | 3.21 > |TPCH_Query_05| 272657 | 710573 | 2.61 > |TPCH_Query_06| 12508 | 22276 | 1.78 > |TPCH_Query_07| 71893 | 370338 | 5.15 > |TPCH_Query_08| 12 | 672625 | 5.17 > |TPCH_Query_09| 575709 | 1171672 | 2.04 > |TPCH_Query_10| 93770 | 233391 | 2.49 > |TPCH_Query_11| 16252 | 58360 | 3.59 > |TPCH_Query_12| 142576 | 237270 | 1.66 > |TPCH_Query_13| 72682 | 343257 | 4.72 > |TPCH_Query_14| 10410 | 32337 | 3.11 > |TPCH_Query_15| 25719 | 98705 | 3.84 > |TPCH_Query_16| 21382 | 76877 | 3.60 > |TPCH_Query_17| 839683 | 2041169 | 2.43 > |TPCH_Query_18| 460570 | 1065940 | 2.31 > |TPCH_Query_19| 69075 | 82286 | 1.19 > |TPCH_Query_20| 78263 | 292041 | 3.73 > |TPCH_Query_21| 505606 | 1549690 | 3.07 > |TPCH_Query_22| 56450 | 329837 | 5.84 > |Total| 4125684 | 11254460| > 2.73 -- This message was sent by Atlassian JIRA (v6.3.4#6332)
[jira] [Updated] (HAWQ-975) Queries run much slower with 'explain analyze' than without 'explain analyze'
[ https://issues.apache.org/jira/browse/HAWQ-975?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Goden Yao updated HAWQ-975: --- Fix Version/s: (was: backlog) 2.0.1.0-incubating > Queries run much slower with 'explain analyze' than without 'explain > analyze' > > > Key: HAWQ-975 > URL: https://issues.apache.org/jira/browse/HAWQ-975 > Project: Apache HAWQ > Issue Type: Bug > Components: Core >Reporter: Chunling Wang >Assignee: Lei Chang > Labels: performance > Fix For: 2.0.1.0-incubating > > > When we run queries with 'explain analyze' in an AWS cluster, the total running > time is about 2-3 times longer than without 'explain analyze'. > Here is a group of TPC-H results for queries with 'explain analyze' and > queries without 'explain analyze'. > ||query ||without 'explain analyze' ||with 'explain analyze' > ||multiple > |TPCH_Query_01| 311843 | 818658 | 2.63 > |TPCH_Query_02| 34675 | 117884 | 3.40 > |TPCH_Query_03| 166155 | 422131 | 2.54 > |TPCH_Query_04| 157807 | 507143 | 3.21 > |TPCH_Query_05| 272657 | 710573 | 2.61 > |TPCH_Query_06| 12508 | 22276 | 1.78 > |TPCH_Query_07| 71893 | 370338 | 5.15 > |TPCH_Query_08| 12 | 672625 | 5.17 > |TPCH_Query_09| 575709 | 1171672 | 2.04 > |TPCH_Query_10| 93770 | 233391 | 2.49 > |TPCH_Query_11| 16252 | 58360 | 3.59 > |TPCH_Query_12| 142576 | 237270 | 1.66 > |TPCH_Query_13| 72682 | 343257 | 4.72 > |TPCH_Query_14| 10410 | 32337 | 3.11 > |TPCH_Query_15| 25719 | 98705 | 3.84 > |TPCH_Query_16| 21382 | 76877 | 3.60 > |TPCH_Query_17| 839683 | 2041169 | 2.43 > |TPCH_Query_18| 460570 | 1065940 | 2.31 > |TPCH_Query_19| 69075 | 82286 | 1.19 > |TPCH_Query_20| 78263 | 292041 | 3.73 > |TPCH_Query_21| 505606 | 1549690 | 3.07 > |TPCH_Query_22| 56450 | 329837 | 5.84 > |Total| 4125684 | 11254460| > 2.73 -- This message was sent by Atlassian JIRA (v6.3.4#6332)
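The "multiple" column in the TPC-H table above is simply the ratio of the two timings (milliseconds with EXPLAIN ANALYZE divided by milliseconds without), rounded to two places. A quick check of the arithmetic:

```python
# Reproduce the "multiple" column from the TPC-H table: ratio of the
# timing with 'explain analyze' to the timing without it.
def multiple(without_ms, with_ms):
    return round(with_ms / without_ms, 2)

print(multiple(311843, 818658))     # TPCH_Query_01 -> 2.63
print(multiple(4125684, 11254460))  # Total -> 2.73
```

The reproduced values match the table's first row and Total row, confirming how the column was computed.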
[jira] [Updated] (HAWQ-992) PXF Hive data type check in Fragmenter too restrictive
[ https://issues.apache.org/jira/browse/HAWQ-992?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Shivram Mani updated HAWQ-992: -- Description: HiveDataFragmenter used by both HiveText and HiveRC profiles has a very strict type check. Hawq type numeric(10,10) is compatible with hive's decimal(10,10) Hawq type numeric is not compatible with hive's decimal(10,10) Similar issue exists with other data types which have variable optional arguments. The type check should be modified to allow a hawq type that is a compatible type but without optional precision/length arguments to work with the corresponding hive type. Support the following additional hive data types: date, varchar, char was: HiveDataFragmenter used by both HiveText and HiveRC profiles has a very strict type check. Hawq type numeric(10,10) is compatible with hive's decimal(10,10) Hawq type numeric is not compatible with hive's decimal(10,10) Support following additional hive data types: date, varchar, char, Similar issue exits with other data types which have variable optional arguments. The type check should be modified to allow hawq type that is a compabitle type but without optional precision/length arguments to work with the corresponding hive type. > PXF Hive data type check in Fragmenter too restrictive > -- > > Key: HAWQ-992 > URL: https://issues.apache.org/jira/browse/HAWQ-992 > Project: Apache HAWQ > Issue Type: Bug > Components: PXF >Reporter: Shivram Mani >Assignee: Goden Yao > Fix For: backlog > > > HiveDataFragmenter used by both HiveText and HiveRC profiles has a very > strict type check. > Hawq type numeric(10,10) is compatible with hive's decimal(10,10) > Hawq type numeric is not compatible with hive's decimal(10,10) > Similar issue exists with other data types which have variable optional > arguments. The type check should be modified to allow a hawq type that is a > compatible type but without optional precision/length arguments to work with > the corresponding hive type. 
> Support the following additional hive data types: date, varchar, char -- This message was sent by Atlassian JIRA (v6.3.4#6332)
[jira] [Updated] (HAWQ-992) PXF Hive data type check in Fragmenter too restrictive
[ https://issues.apache.org/jira/browse/HAWQ-992?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Shivram Mani updated HAWQ-992: -- Description: HiveDataFragmenter used by both HiveText and HiveRC profiles has a very strict type check. Hawq type numeric(10,10) is compatible with hive's decimal(10,10) Hawq type numeric is not compatible with hive's decimal(10,10) Support the following additional hive data types: date, varchar, char. Similar issue exists with other data types which have variable optional arguments. The type check should be modified to allow a hawq type that is a compatible type but without optional precision/length arguments to work with the corresponding hive type. was: HiveDataFragmenter used by both HiveText and HiveRC profiles has a very strict type check. Hawq type numeric(10,10) is compatible with hive's decimal(10,10) Hawq type numeric is not compatible with hive's decimal(10,10) Similar issue exits with other data types which have variable optional arguments. The type check should be modified to allow hawq type that is a compabitle type but without optional precision/length arguments to work with the corresponding hive type. > PXF Hive data type check in Fragmenter too restrictive > -- > > Key: HAWQ-992 > URL: https://issues.apache.org/jira/browse/HAWQ-992 > Project: Apache HAWQ > Issue Type: Bug > Components: PXF >Reporter: Shivram Mani >Assignee: Goden Yao > Fix For: backlog > > > HiveDataFragmenter used by both HiveText and HiveRC profiles has a very > strict type check. > Hawq type numeric(10,10) is compatible with hive's decimal(10,10) > Hawq type numeric is not compatible with hive's decimal(10,10) > Support the following additional hive data types: date, varchar, char. > Similar issue exists with other data types which have variable optional > arguments. The type check should be modified to allow a hawq type that is a > compatible type but without optional precision/length arguments to work with > the corresponding hive type. 
-- This message was sent by Atlassian JIRA (v6.3.4#6332)
[jira] [Updated] (HAWQ-993) Symbolic link (contrib/pgcrypto) pointing to non-existent location
[ https://issues.apache.org/jira/browse/HAWQ-993?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Goden Yao updated HAWQ-993: --- Summary: Symbolic link (contrib/pgcrypto) pointing to non-existent location (was: Symbolic link (contrib/pgcrypto) pointing to non-existent loctation) > Symbolic link (contrib/pgcrypto) pointing to non-existent location > -- > > Key: HAWQ-993 > URL: https://issues.apache.org/jira/browse/HAWQ-993 > Project: Apache HAWQ > Issue Type: Bug > Components: Build >Affects Versions: 2.0.0.0-incubating >Reporter: Ed Espino >Assignee: Ed Espino >Priority: Minor > Fix For: 2.0.0.0-incubating > > > The following symbolic link points to a non-existent location. This > generates a potential error for the "rat" utility (see below): > contrib/pgcrypto@ -> ../depends/thirdparty/postgres/contrib/pgcrypto > {code} > java -Xms1024m -Xmx1024m -jar ~/Downloads/apache-rat-0.11/apache-rat-0.11.jar > . > Exception in thread "main" org.apache.rat.api.RatException: Cannot read header > at > org.apache.rat.report.claim.util.ClaimReporterMultiplexer.report(ClaimReporterMultiplexer.java:44) > at org.apache.rat.walker.DirectoryWalker.report(DirectoryWalker.java:147) > at > org.apache.rat.walker.DirectoryWalker.processNonDirectories(DirectoryWalker.java:131) > at org.apache.rat.walker.DirectoryWalker.process(DirectoryWalker.java:98) > at > org.apache.rat.walker.DirectoryWalker.processDirectory(DirectoryWalker.java:71) > at > org.apache.rat.walker.DirectoryWalker.processDirectories(DirectoryWalker.java:114) > at org.apache.rat.walker.DirectoryWalker.process(DirectoryWalker.java:99) > at org.apache.rat.walker.DirectoryWalker.run(DirectoryWalker.java:83) > at org.apache.rat.Report.report(Report.java:418) > at org.apache.rat.Report.report(Report.java:394) > at org.apache.rat.Report.report(Report.java:366) > at org.apache.rat.Report.styleReport(Report.java:346) > at org.apache.rat.Report.main(Report.java:109) > Caused by: 
org.apache.rat.document.RatDocumentAnalysisException: Cannot read > header > at > org.apache.rat.analysis.DocumentHeaderAnalyser.analyse(DocumentHeaderAnalyser.java:45) > at > org.apache.rat.analysis.DefaultAnalyserFactory$DefaultAnalyser.analyse(DefaultAnalyserFactory.java:60) > at > org.apache.rat.document.impl.util.DocumentAnalyserMultiplexer.analyse(DocumentAnalyserMultiplexer.java:36) > at > org.apache.rat.report.claim.util.ClaimReporterMultiplexer.report(ClaimReporterMultiplexer.java:42) > ... 12 more > Caused by: java.io.FileNotFoundException: ./contrib/pgcrypto (No such file or > directory) > at java.io.FileInputStream.open(Native Method) > at java.io.FileInputStream.<init>(FileInputStream.java:131) > at java.io.FileReader.<init>(FileReader.java:72) > at org.apache.rat.document.impl.FileDocument.reader(FileDocument.java:52) > at > org.apache.rat.analysis.DocumentHeaderAnalyser.analyse(DocumentHeaderAnalyser.java:40) > ... 15 more > ERROR: 'Pipe broken' > ERROR: 'com.sun.org.apache.xml.internal.utils.WrappedRuntimeException: Pipe > broken' > Exception in thread "Thread-0" org.apache.rat.ReportFailedRuntimeException: > javax.xml.transform.TransformerException: > com.sun.org.apache.xml.internal.utils.WrappedRuntimeException: Pipe broken > at org.apache.rat.ReportTransformer.run(ReportTransformer.java:59) > at java.lang.Thread.run(Thread.java:745) > Caused by: javax.xml.transform.TransformerException: > javax.xml.transform.TransformerException: > com.sun.org.apache.xml.internal.utils.WrappedRuntimeException: Pipe broken > at > com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.transform(TransformerImpl.java:755) > at > com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.transform(TransformerImpl.java:357) > at org.apache.rat.ReportTransformer.transform(ReportTransformer.java:64) > at org.apache.rat.ReportTransformer.run(ReportTransformer.java:57) > ... 
1 more > Caused by: javax.xml.transform.TransformerException: > com.sun.org.apache.xml.internal.utils.WrappedRuntimeException: Pipe broken > at > com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.getDOM(TransformerImpl.java:584) > at > com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.transform(TransformerImpl.java:745) > ... 4 more > Caused by: com.sun.org.apache.xml.internal.utils.WrappedRuntimeException: > Pipe broken > at > com.sun.org.apache.xalan.internal.xsltc.dom.XSLTCDTMManager.getDTM(XSLTCDTMManager.java:427) > at > com.sun.org.apache.xalan.internal.xsltc.dom.XSLTCDTMManager.getDTM(XSLTCDTMManager.java:215) > at >
[jira] [Updated] (HAWQ-992) PXF Hive data type check in Fragmenter too restrictive
[ https://issues.apache.org/jira/browse/HAWQ-992?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Goden Yao updated HAWQ-992: --- Fix Version/s: backlog > PXF Hive data type check in Fragmenter too restrictive > -- > > Key: HAWQ-992 > URL: https://issues.apache.org/jira/browse/HAWQ-992 > Project: Apache HAWQ > Issue Type: Bug > Components: PXF >Reporter: Shivram Mani >Assignee: Goden Yao > Fix For: backlog > > > HiveDataFragmenter used by both HiveText and HiveRC profiles has a very > strict type check. > Hawq type numeric(10,10) is compatible with hive's decimal(10,10) > Hawq type numeric is not compatible with hive's decimal(10,10) > Similar issue exists with other data types which have variable optional > arguments. The type check should be modified to allow a hawq type that is a > compatible type but without optional precision/length arguments to work with > the corresponding hive type. -- This message was sent by Atlassian JIRA (v6.3.4#6332)
[jira] [Updated] (HAWQ-993) Symbolic link (contrib/pgcrypto) pointing to non-existent loctation
[ https://issues.apache.org/jira/browse/HAWQ-993?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ed Espino updated HAWQ-993: --- Priority: Minor (was: Major) > Symbolic link (contrib/pgcrypto) pointing to non-existent loctation > --- > > Key: HAWQ-993 > URL: https://issues.apache.org/jira/browse/HAWQ-993 > Project: Apache HAWQ > Issue Type: Bug > Components: Build >Affects Versions: 2.0.0.0-incubating >Reporter: Ed Espino >Assignee: Ed Espino >Priority: Minor > > The following symbolic link points to a non-existent location. This > generates a potential error for the "rat" utility (see below): > contrib/pgcrypto@ -> ../depends/thirdparty/postgres/contrib/pgcrypto > {code} > java -Xms1024m -Xmx1024m -jar ~/Downloads/apache-rat-0.11/apache-rat-0.11.jar > . > Exception in thread "main" org.apache.rat.api.RatException: Cannot read header > at > org.apache.rat.report.claim.util.ClaimReporterMultiplexer.report(ClaimReporterMultiplexer.java:44) > at org.apache.rat.walker.DirectoryWalker.report(DirectoryWalker.java:147) > at > org.apache.rat.walker.DirectoryWalker.processNonDirectories(DirectoryWalker.java:131) > at org.apache.rat.walker.DirectoryWalker.process(DirectoryWalker.java:98) > at > org.apache.rat.walker.DirectoryWalker.processDirectory(DirectoryWalker.java:71) > at > org.apache.rat.walker.DirectoryWalker.processDirectories(DirectoryWalker.java:114) > at org.apache.rat.walker.DirectoryWalker.process(DirectoryWalker.java:99) > at org.apache.rat.walker.DirectoryWalker.run(DirectoryWalker.java:83) > at org.apache.rat.Report.report(Report.java:418) > at org.apache.rat.Report.report(Report.java:394) > at org.apache.rat.Report.report(Report.java:366) > at org.apache.rat.Report.styleReport(Report.java:346) > at org.apache.rat.Report.main(Report.java:109) > Caused by: org.apache.rat.document.RatDocumentAnalysisException: Cannot read > header > at > org.apache.rat.analysis.DocumentHeaderAnalyser.analyse(DocumentHeaderAnalyser.java:45) > at > 
org.apache.rat.analysis.DefaultAnalyserFactory$DefaultAnalyser.analyse(DefaultAnalyserFactory.java:60) > at > org.apache.rat.document.impl.util.DocumentAnalyserMultiplexer.analyse(DocumentAnalyserMultiplexer.java:36) > at > org.apache.rat.report.claim.util.ClaimReporterMultiplexer.report(ClaimReporterMultiplexer.java:42) > ... 12 more > Caused by: java.io.FileNotFoundException: ./contrib/pgcrypto (No such file or > directory) > at java.io.FileInputStream.open(Native Method) > at java.io.FileInputStream.<init>(FileInputStream.java:131) > at java.io.FileReader.<init>(FileReader.java:72) > at org.apache.rat.document.impl.FileDocument.reader(FileDocument.java:52) > at > org.apache.rat.analysis.DocumentHeaderAnalyser.analyse(DocumentHeaderAnalyser.java:40) > ... 15 more > ERROR: 'Pipe broken' > ERROR: 'com.sun.org.apache.xml.internal.utils.WrappedRuntimeException: Pipe > broken' > Exception in thread "Thread-0" org.apache.rat.ReportFailedRuntimeException: > javax.xml.transform.TransformerException: > com.sun.org.apache.xml.internal.utils.WrappedRuntimeException: Pipe broken > at org.apache.rat.ReportTransformer.run(ReportTransformer.java:59) > at java.lang.Thread.run(Thread.java:745) > Caused by: javax.xml.transform.TransformerException: > javax.xml.transform.TransformerException: > com.sun.org.apache.xml.internal.utils.WrappedRuntimeException: Pipe broken > at > com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.transform(TransformerImpl.java:755) > at > com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.transform(TransformerImpl.java:357) > at org.apache.rat.ReportTransformer.transform(ReportTransformer.java:64) > at org.apache.rat.ReportTransformer.run(ReportTransformer.java:57) > ... 
1 more > Caused by: javax.xml.transform.TransformerException: > com.sun.org.apache.xml.internal.utils.WrappedRuntimeException: Pipe broken > at > com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.getDOM(TransformerImpl.java:584) > at > com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.transform(TransformerImpl.java:745) > ... 4 more > Caused by: com.sun.org.apache.xml.internal.utils.WrappedRuntimeException: > Pipe broken > at > com.sun.org.apache.xalan.internal.xsltc.dom.XSLTCDTMManager.getDTM(XSLTCDTMManager.java:427) > at > com.sun.org.apache.xalan.internal.xsltc.dom.XSLTCDTMManager.getDTM(XSLTCDTMManager.java:215) > at > com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.getDOM(TransformerImpl.java:562) > ... 5 more > Espino-Pivotal:~/workspace/HAWQ-projects/apache-hawq-src-2.0.0.0-incubating > {code} -- This message was sent by Atlassian JIRA
[jira] [Updated] (HAWQ-993) Symbolic link (contrib/pgcrypto) pointing to non-existent loctation
[ https://issues.apache.org/jira/browse/HAWQ-993?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ed Espino updated HAWQ-993: --- Affects Version/s: 2.0.0.0-incubating > Symbolic link (contrib/pgcrypto) pointing to non-existent loctation > --- > > Key: HAWQ-993 > URL: https://issues.apache.org/jira/browse/HAWQ-993 > Project: Apache HAWQ > Issue Type: Bug > Components: Build >Affects Versions: 2.0.0.0-incubating >Reporter: Ed Espino >Assignee: Ed Espino > > The following symbolic link points to a non-existent location. This > generates a potential error for the "rat" utility (see below): > contrib/pgcrypto@ -> ../depends/thirdparty/postgres/contrib/pgcrypto > {code} > java -Xms1024m -Xmx1024m -jar ~/Downloads/apache-rat-0.11/apache-rat-0.11.jar > . > Exception in thread "main" org.apache.rat.api.RatException: Cannot read header > at > org.apache.rat.report.claim.util.ClaimReporterMultiplexer.report(ClaimReporterMultiplexer.java:44) > at org.apache.rat.walker.DirectoryWalker.report(DirectoryWalker.java:147) > at > org.apache.rat.walker.DirectoryWalker.processNonDirectories(DirectoryWalker.java:131) > at org.apache.rat.walker.DirectoryWalker.process(DirectoryWalker.java:98) > at > org.apache.rat.walker.DirectoryWalker.processDirectory(DirectoryWalker.java:71) > at > org.apache.rat.walker.DirectoryWalker.processDirectories(DirectoryWalker.java:114) > at org.apache.rat.walker.DirectoryWalker.process(DirectoryWalker.java:99) > at org.apache.rat.walker.DirectoryWalker.run(DirectoryWalker.java:83) > at org.apache.rat.Report.report(Report.java:418) > at org.apache.rat.Report.report(Report.java:394) > at org.apache.rat.Report.report(Report.java:366) > at org.apache.rat.Report.styleReport(Report.java:346) > at org.apache.rat.Report.main(Report.java:109) > Caused by: org.apache.rat.document.RatDocumentAnalysisException: Cannot read > header > at > org.apache.rat.analysis.DocumentHeaderAnalyser.analyse(DocumentHeaderAnalyser.java:45) > at > 
org.apache.rat.analysis.DefaultAnalyserFactory$DefaultAnalyser.analyse(DefaultAnalyserFactory.java:60) > at > org.apache.rat.document.impl.util.DocumentAnalyserMultiplexer.analyse(DocumentAnalyserMultiplexer.java:36) > at > org.apache.rat.report.claim.util.ClaimReporterMultiplexer.report(ClaimReporterMultiplexer.java:42) > ... 12 more > Caused by: java.io.FileNotFoundException: ./contrib/pgcrypto (No such file or > directory) > at java.io.FileInputStream.open(Native Method) > at java.io.FileInputStream.<init>(FileInputStream.java:131) > at java.io.FileReader.<init>(FileReader.java:72) > at org.apache.rat.document.impl.FileDocument.reader(FileDocument.java:52) > at > org.apache.rat.analysis.DocumentHeaderAnalyser.analyse(DocumentHeaderAnalyser.java:40) > ... 15 more > ERROR: 'Pipe broken' > ERROR: 'com.sun.org.apache.xml.internal.utils.WrappedRuntimeException: Pipe > broken' > Exception in thread "Thread-0" org.apache.rat.ReportFailedRuntimeException: > javax.xml.transform.TransformerException: > com.sun.org.apache.xml.internal.utils.WrappedRuntimeException: Pipe broken > at org.apache.rat.ReportTransformer.run(ReportTransformer.java:59) > at java.lang.Thread.run(Thread.java:745) > Caused by: javax.xml.transform.TransformerException: > javax.xml.transform.TransformerException: > com.sun.org.apache.xml.internal.utils.WrappedRuntimeException: Pipe broken > at > com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.transform(TransformerImpl.java:755) > at > com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.transform(TransformerImpl.java:357) > at org.apache.rat.ReportTransformer.transform(ReportTransformer.java:64) > at org.apache.rat.ReportTransformer.run(ReportTransformer.java:57) > ... 
1 more > Caused by: javax.xml.transform.TransformerException: > com.sun.org.apache.xml.internal.utils.WrappedRuntimeException: Pipe broken > at > com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.getDOM(TransformerImpl.java:584) > at > com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.transform(TransformerImpl.java:745) > ... 4 more > Caused by: com.sun.org.apache.xml.internal.utils.WrappedRuntimeException: > Pipe broken > at > com.sun.org.apache.xalan.internal.xsltc.dom.XSLTCDTMManager.getDTM(XSLTCDTMManager.java:427) > at > com.sun.org.apache.xalan.internal.xsltc.dom.XSLTCDTMManager.getDTM(XSLTCDTMManager.java:215) > at > com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.getDOM(TransformerImpl.java:562) > ... 5 more > Espino-Pivotal:~/workspace/HAWQ-projects/apache-hawq-src-2.0.0.0-incubating > {code} -- This message was sent by Atlassian JIRA (v6.3.4#6332)
[jira] [Created] (HAWQ-993) Symbolic link (contrib/pgcrypto) pointing to non-existent loctation
Ed Espino created HAWQ-993: -- Summary: Symbolic link (contrib/pgcrypto) pointing to non-existent loctation Key: HAWQ-993 URL: https://issues.apache.org/jira/browse/HAWQ-993 Project: Apache HAWQ Issue Type: Bug Components: Build Reporter: Ed Espino Assignee: Lei Chang The following symbolic link points to a non-existent location. This generates a potential error for the "rat" utility (see below): contrib/pgcrypto@ -> ../depends/thirdparty/postgres/contrib/pgcrypto {code} java -Xms1024m -Xmx1024m -jar ~/Downloads/apache-rat-0.11/apache-rat-0.11.jar . Exception in thread "main" org.apache.rat.api.RatException: Cannot read header at org.apache.rat.report.claim.util.ClaimReporterMultiplexer.report(ClaimReporterMultiplexer.java:44) at org.apache.rat.walker.DirectoryWalker.report(DirectoryWalker.java:147) at org.apache.rat.walker.DirectoryWalker.processNonDirectories(DirectoryWalker.java:131) at org.apache.rat.walker.DirectoryWalker.process(DirectoryWalker.java:98) at org.apache.rat.walker.DirectoryWalker.processDirectory(DirectoryWalker.java:71) at org.apache.rat.walker.DirectoryWalker.processDirectories(DirectoryWalker.java:114) at org.apache.rat.walker.DirectoryWalker.process(DirectoryWalker.java:99) at org.apache.rat.walker.DirectoryWalker.run(DirectoryWalker.java:83) at org.apache.rat.Report.report(Report.java:418) at org.apache.rat.Report.report(Report.java:394) at org.apache.rat.Report.report(Report.java:366) at org.apache.rat.Report.styleReport(Report.java:346) at org.apache.rat.Report.main(Report.java:109) Caused by: org.apache.rat.document.RatDocumentAnalysisException: Cannot read header at org.apache.rat.analysis.DocumentHeaderAnalyser.analyse(DocumentHeaderAnalyser.java:45) at org.apache.rat.analysis.DefaultAnalyserFactory$DefaultAnalyser.analyse(DefaultAnalyserFactory.java:60) at org.apache.rat.document.impl.util.DocumentAnalyserMultiplexer.analyse(DocumentAnalyserMultiplexer.java:36) at 
org.apache.rat.report.claim.util.ClaimReporterMultiplexer.report(ClaimReporterMultiplexer.java:42) ... 12 more Caused by: java.io.FileNotFoundException: ./contrib/pgcrypto (No such file or directory) at java.io.FileInputStream.open(Native Method) at java.io.FileInputStream.<init>(FileInputStream.java:131) at java.io.FileReader.<init>(FileReader.java:72) at org.apache.rat.document.impl.FileDocument.reader(FileDocument.java:52) at org.apache.rat.analysis.DocumentHeaderAnalyser.analyse(DocumentHeaderAnalyser.java:40) ... 15 more ERROR: 'Pipe broken' ERROR: 'com.sun.org.apache.xml.internal.utils.WrappedRuntimeException: Pipe broken' Exception in thread "Thread-0" org.apache.rat.ReportFailedRuntimeException: javax.xml.transform.TransformerException: com.sun.org.apache.xml.internal.utils.WrappedRuntimeException: Pipe broken at org.apache.rat.ReportTransformer.run(ReportTransformer.java:59) at java.lang.Thread.run(Thread.java:745) Caused by: javax.xml.transform.TransformerException: javax.xml.transform.TransformerException: com.sun.org.apache.xml.internal.utils.WrappedRuntimeException: Pipe broken at com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.transform(TransformerImpl.java:755) at com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.transform(TransformerImpl.java:357) at org.apache.rat.ReportTransformer.transform(ReportTransformer.java:64) at org.apache.rat.ReportTransformer.run(ReportTransformer.java:57) ... 1 more Caused by: javax.xml.transform.TransformerException: com.sun.org.apache.xml.internal.utils.WrappedRuntimeException: Pipe broken at com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.getDOM(TransformerImpl.java:584) at com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.transform(TransformerImpl.java:745) ... 
4 more Caused by: com.sun.org.apache.xml.internal.utils.WrappedRuntimeException: Pipe broken at com.sun.org.apache.xalan.internal.xsltc.dom.XSLTCDTMManager.getDTM(XSLTCDTMManager.java:427) at com.sun.org.apache.xalan.internal.xsltc.dom.XSLTCDTMManager.getDTM(XSLTCDTMManager.java:215) at com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.getDOM(TransformerImpl.java:562) ... 5 more Espino-Pivotal:~/workspace/HAWQ-projects/apache-hawq-src-2.0.0.0-incubating {code} -- This message was sent by Atlassian JIRA (v6.3.4#6332)
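A dangling symlink like contrib/pgcrypto can be caught before running rat with a short walk over the tree. This is a generic sketch, not part of HAWQ's tooling:

```python
# Find symlinks whose target does not exist, the condition that made
# rat fail on ./contrib/pgcrypto above. Generic sketch, not HAWQ code.
import os
import tempfile

def broken_symlinks(root):
    """Yield paths under root that are symlinks to non-existent targets."""
    for dirpath, dirnames, filenames in os.walk(root):
        for name in dirnames + filenames:
            path = os.path.join(dirpath, name)
            # os.path.exists() follows the link while os.path.islink()
            # does not, so the pair identifies dangling links.
            if os.path.islink(path) and not os.path.exists(path):
                yield path

# Demo in a throwaway directory containing one dangling link.
with tempfile.TemporaryDirectory() as d:
    os.symlink(os.path.join(d, "no-such-target"), os.path.join(d, "pgcrypto"))
    print(list(broken_symlinks(d)))  # one entry: the dangling "pgcrypto" link
```

Running such a check over the source tree (or fixing/removing the link, as the issue does) keeps header-scanning tools like rat from aborting mid-walk.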
[jira] [Created] (HAWQ-992) PXF Hive data type check in Fragmenter is too strict
Shivram Mani created HAWQ-992: - Summary: PXF Hive data type check in Fragmenter is too strict Key: HAWQ-992 URL: https://issues.apache.org/jira/browse/HAWQ-992 Project: Apache HAWQ Issue Type: Bug Components: PXF Reporter: Shivram Mani Assignee: Goden Yao HiveDataFragmenter, used by both the HiveText and HiveRC profiles, has a very strict type check: the HAWQ type numeric(10,10) is compatible with Hive's decimal(10,10), but the plain HAWQ type numeric is not compatible with Hive's decimal(10,10). A similar issue exists with other data types that take optional arguments. The type check should be relaxed so that a HAWQ type that is otherwise compatible, but omits the optional precision/length arguments, works with the corresponding Hive type.
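The relaxed check described in HAWQ-992 could be sketched roughly as below. This is a hypothetical illustration, not the actual HiveDataFragmenter code; the class name, the baseName helper, and the HAWQ-to-Hive type-name mapping are all assumptions:

```java
import java.util.Map;

public class TypeCompatibility {
    // Hypothetical mapping of base HAWQ type names to Hive type names.
    private static final Map<String, String> HAWQ_TO_HIVE = Map.of(
            "numeric", "decimal",
            "varchar", "varchar",
            "bpchar", "char");

    /** Strips optional precision/length arguments, e.g. "decimal(10,10)" -> "decimal". */
    static String baseName(String type) {
        int paren = type.indexOf('(');
        return (paren == -1 ? type : type.substring(0, paren)).trim().toLowerCase();
    }

    /**
     * Treats a HAWQ type as compatible with a Hive type when the base names
     * correspond and either the HAWQ side omits the optional arguments
     * entirely or both sides carry identical arguments.
     */
    static boolean isCompatible(String hawqType, String hiveType) {
        String hawqBase = baseName(hawqType);
        String hiveBase = baseName(hiveType);
        if (!hiveBase.equals(HAWQ_TO_HIVE.getOrDefault(hawqBase, hawqBase))) {
            return false;
        }
        // No modifiers on the HAWQ side: accept any Hive precision/length.
        if (!hawqType.contains("(")) {
            return true;
        }
        // Otherwise require exactly matching modifiers.
        String hawqArgs = hawqType.substring(hawqType.indexOf('(')).replace(" ", "");
        String hiveArgs = hiveType.contains("(")
                ? hiveType.substring(hiveType.indexOf('(')).replace(" ", "")
                : "";
        return hawqArgs.equals(hiveArgs);
    }
}
```

Under this scheme, both numeric(10,10) and bare numeric would match Hive's decimal(10,10), while a mismatched numeric(5,2) would still be rejected.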
[jira] [Comment Edited] (HAWQ-957) NOTICE file clean up
[ https://issues.apache.org/jira/browse/HAWQ-957?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15414355#comment-15414355 ] Goden Yao edited comment on HAWQ-957 at 8/9/16 10:47 PM: - We are reworking the NOTICE file in accordance with the IPMC member's ([~jmclean]) request. was (Author: espino): We are reworking the NOTICE file in accordance with the PCM member (Justin Mclean) request. > NOTICE file clean up > > > Key: HAWQ-957 > URL: https://issues.apache.org/jira/browse/HAWQ-957 > Project: Apache HAWQ > Issue Type: Task > Components: Documentation >Reporter: Goden Yao >Assignee: Ed Espino > Fix For: 2.0.0.0-incubating > > > From [~jmclean] IPMC review feedback: > {quote} > NOTICE incorrectly contains a long list of copyright statements. I would > expect to see one or perhaps two here i.e. the original authors who donated > the software and whose copyright statements were removed from the original > files. > {quote}
[jira] [Commented] (HAWQ-957) NOTICE file clean up
[ https://issues.apache.org/jira/browse/HAWQ-957?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15414355#comment-15414355 ] Ed Espino commented on HAWQ-957: We are reworking the NOTICE file in accordance with the PCM member (Justin Mclean) request. > NOTICE file clean up > > > Key: HAWQ-957 > URL: https://issues.apache.org/jira/browse/HAWQ-957 > Project: Apache HAWQ > Issue Type: Task > Components: Documentation >Reporter: Goden Yao >Assignee: Ed Espino > Fix For: 2.0.0.0-incubating > > > From [~jmclean] IPMC review feedback: > {quote} > NOTICE incorrectly contains a long list of copyright statements. I would > expect to see one or perhaps two here i.e. the original authors who donated > the software and whose copyright statements were removed from the original > files. > {quote}
[jira] [Commented] (HAWQ-952) Clean up COPYRIGHT file and review NOTICE File
[ https://issues.apache.org/jira/browse/HAWQ-952?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15414353#comment-15414353 ] Ed Espino commented on HAWQ-952: To be consistent with the file naming conventions, we will recommend removing the COPYRIGHT file and adding its contents to the NOTICE file. > Clean up COPYRIGHT file and review NOTICE File > -- > > Key: HAWQ-952 > URL: https://issues.apache.org/jira/browse/HAWQ-952 > Project: Apache HAWQ > Issue Type: Task > Components: Documentation >Reporter: Goden Yao >Assignee: Ed Espino > Fix For: 2.0.0.0-incubating > > > Per mentor's suggestion, we should not have a separate COPYRIGHT file, > and we need to review which copyrights should be in the NOTICE file and in what form. > (need more clarification and discussion with mentors)
[jira] [Commented] (HAWQ-991) Add support for "HAWQ register" that could register tables by using "hawq extract" output
[ https://issues.apache.org/jira/browse/HAWQ-991?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15413795#comment-15413795 ] Goden Yao commented on HAWQ-991: Can you add details to the description, e.g. the sample DDL you proposed? I have also marked this with the 2.0.1.0 milestone. > Add support for "HAWQ register" that could register tables by using "hawq > extract" output > - > > Key: HAWQ-991 > URL: https://issues.apache.org/jira/browse/HAWQ-991 > Project: Apache HAWQ > Issue Type: Improvement > Components: Command Line Tools, External Tables >Affects Versions: 2.0.1.0-incubating >Reporter: hongwu >Assignee: hongwu > Fix For: 2.0.1.0-incubating > >
[jira] [Updated] (HAWQ-991) Add support for "HAWQ register" that could register tables by using "hawq extract" output
[ https://issues.apache.org/jira/browse/HAWQ-991?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Goden Yao updated HAWQ-991: --- Fix Version/s: 2.0.1.0-incubating > Add support for "HAWQ register" that could register tables by using "hawq > extract" output > - > > Key: HAWQ-991 > URL: https://issues.apache.org/jira/browse/HAWQ-991 > Project: Apache HAWQ > Issue Type: Improvement > Components: Command Line Tools, External Tables >Affects Versions: 2.0.1.0-incubating >Reporter: hongwu >Assignee: hongwu > Fix For: 2.0.1.0-incubating > >
[GitHub] incubator-hawq pull request #820: HAWQ-953 hawq pxf-hive support partition c...
Github user jiadexin closed the pull request at: https://github.com/apache/incubator-hawq/pull/820 --- If your project is set up for it, you can reply to this email and have your reply appear on GitHub as well. If your project does not have this feature enabled and wishes so, or if the feature is enabled but not working, please contact infrastructure at infrastruct...@apache.org or file a JIRA ticket with INFRA. ---
[GitHub] incubator-hawq pull request #840: HAWQ-985. Add feature test for agg with gr...
Github user ztao1987 closed the pull request at: https://github.com/apache/incubator-hawq/pull/840
[jira] [Updated] (HAWQ-991) Add support for "HAWQ register" that could register tables by using "hawq extract" output
[ https://issues.apache.org/jira/browse/HAWQ-991?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] hongwu updated HAWQ-991: Affects Version/s: 2.0.1.0-incubating > Add support for "HAWQ register" that could register tables by using "hawq > extract" output > - > > Key: HAWQ-991 > URL: https://issues.apache.org/jira/browse/HAWQ-991 > Project: Apache HAWQ > Issue Type: Improvement > Components: Command Line Tools, External Tables >Affects Versions: 2.0.1.0-incubating >Reporter: hongwu >Assignee: hongwu >
[jira] [Assigned] (HAWQ-991) Add support for "HAWQ register" that could register tables by using "hawq extract" output
[ https://issues.apache.org/jira/browse/HAWQ-991?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] hongwu reassigned HAWQ-991: --- Assignee: hongwu (was: Lei Chang) > Add support for "HAWQ register" that could register tables by using "hawq > extract" output > - > > Key: HAWQ-991 > URL: https://issues.apache.org/jira/browse/HAWQ-991 > Project: Apache HAWQ > Issue Type: Improvement > Components: Command Line Tools, External Tables >Reporter: hongwu >Assignee: hongwu >
[jira] [Created] (HAWQ-991) Add support for "HAWQ register" that could register tables by using "hawq extract" output
hongwu created HAWQ-991: --- Summary: Add support for "HAWQ register" that could register tables by using "hawq extract" output Key: HAWQ-991 URL: https://issues.apache.org/jira/browse/HAWQ-991 Project: Apache HAWQ Issue Type: Improvement Components: Command Line Tools, External Tables Reporter: hongwu Assignee: Lei Chang
[GitHub] incubator-hawq issue #842: HAWQ-939. Add coverity scan badge
Github user huor commented on the issue: https://github.com/apache/incubator-hawq/pull/842 +1