[
https://issues.apache.org/jira/browse/HAWQ-1637?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Oushu_WangZiming updated HAWQ-1637:
-----------------------------------
Description:
Following the instructions at [https://cwiki.apache.org/confluence/display/HAWQ/Build+and+Install] to build Apache HAWQ on OSX 10.11, {{/usr/local/bin/mvn package -DskipTests}} fails due to: Failed to execute goal
org.apache.maven.plugins:maven-javadoc-plugin:2.9.1:aggregate-jar
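A possible way past the failing goal, sketched below and not verified against this tree: the failure comes from the javadoc {{aggregate-jar}} goal, so skipping javadoc generation ({{maven.javadoc.skip}} is a standard maven-javadoc-plugin property) should avoid it, and the plugin's own warnings in the log suggest running {{mvn install}} first so the reactor jars become resolvable.

```shell
# Failing invocation from the wiki instructions:
/usr/local/bin/mvn package -DskipTests

# Workaround 1 (suggested by the plugin's warnings): install the reactor
# artifacts first so the javadoc classpath can resolve the hawq-mapreduce-* jars.
mvn install -DskipTests

# Workaround 2: skip javadoc generation entirely; this bypasses the failing
# aggregate-jar goal (it does not fix the underlying missing parquet.* packages).
mvn package -DskipTests -Dmaven.javadoc.skip=true
```

Neither command has been tested on this checkout; they are standard Maven usage, not a confirmed fix for HAWQ-1637.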
{code:java}
/usr/local/bin/mvn package -DskipTests
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO]
[INFO] hawq-hadoop [pom]
[INFO] hawq-mapreduce-common [jar]
[INFO] hawq-mapreduce-ao [jar]
[INFO] hawq-mapreduce-parquet [jar]
[INFO] hawq-mapreduce-tool [jar]
[INFO]
[INFO] --------------------< com.pivotal.hawq:hawq-hadoop >--------------------
[INFO] Building hawq-hadoop 1.1.0 [1/5]
[INFO] --------------------------------[ pom ]---------------------------------
[INFO]
[INFO] --- maven-jar-plugin:2.4:test-jar (default) @ hawq-hadoop ---
[WARNING] JAR will be empty - no content was marked for inclusion!
[WARNING] The following dependencies could not be resolved at this point of the
build but seem to be part of the reactor:
[WARNING] o com.pivotal.hawq:hawq-mapreduce-common:jar:1.1.0 (compile)
[WARNING] Try running the build up to the lifecycle phase "package"
[WARNING] The following dependencies could not be resolved at this point of the
build but seem to be part of the reactor:
[WARNING] o com.pivotal.hawq:hawq-mapreduce-common:jar:1.1.0 (compile)
[WARNING] Try running the build up to the lifecycle phase "package"
[WARNING] The following dependencies could not be resolved at this point of the
build but seem to be part of the reactor:
[WARNING] o com.pivotal.hawq:hawq-mapreduce-ao:jar:1.1.0 (compile)
[WARNING] o com.pivotal.hawq:hawq-mapreduce-common:jar:1.1.0 (compile)
[WARNING] o com.pivotal.hawq:hawq-mapreduce-parquet:jar:1.1.0 (compile)
[WARNING] Try running the build up to the lifecycle phase "package"
[INFO]
[INFO] --- maven-javadoc-plugin:2.9.1:aggregate-jar (default) @ hawq-hadoop ---
[WARNING] The dependency: [com.pivotal.hawq:hawq-mapreduce-common:jar:1.1.0]
can't be resolved but has been found in the reactor (probably snapshots).
This dependency has been excluded from the Javadoc classpath. You should rerun
javadoc after executing mvn install.
[WARNING] IGNORED to add some artifacts in the classpath. See above.
[WARNING] The dependency: [com.pivotal.hawq:hawq-mapreduce-common:jar:1.1.0]
can't be resolved but has been found in the reactor (probably snapshots).
This dependency has been excluded from the Javadoc classpath. You should rerun
javadoc after executing mvn install.
[WARNING] IGNORED to add some artifacts in the classpath. See above.
[WARNING] The dependency: [com.pivotal.hawq:hawq-mapreduce-common:jar:1.1.0]
can't be resolved but has been found in the reactor (probably snapshots).
This dependency has been excluded from the Javadoc classpath. You should rerun
javadoc after executing mvn install.
[WARNING] The dependency: [com.pivotal.hawq:hawq-mapreduce-ao:jar:1.1.0] can't
be resolved but has been found in the reactor (probably snapshots).
This dependency has been excluded from the Javadoc classpath. You should rerun
javadoc after executing mvn install.
[WARNING] The dependency: [com.pivotal.hawq:hawq-mapreduce-parquet:jar:1.1.0]
can't be resolved but has been found in the reactor (probably snapshots).
This dependency has been excluded from the Javadoc classpath. You should rerun
javadoc after executing mvn install.
[WARNING] IGNORED to add some artifacts in the classpath. See above.
[INFO]
Loading source files for package com.pivotal.hawq.mapreduce.conf...
Loading source files for package com.pivotal.hawq.mapreduce.datatype...
Loading source files for package com.pivotal.hawq.mapreduce.file...
Loading source files for package com.pivotal.hawq.mapreduce...
Loading source files for package com.pivotal.hawq.mapreduce.metadata...
Loading source files for package com.pivotal.hawq.mapreduce.schema...
Loading source files for package com.pivotal.hawq.mapreduce.util...
Loading source files for package com.pivotal.hawq.mapreduce.ao.file...
Loading source files for package com.pivotal.hawq.mapreduce.ao...
Loading source files for package com.pivotal.hawq.mapreduce.ao.io...
Loading source files for package com.pivotal.hawq.mapreduce.ao.util...
Loading source files for package com.pivotal.hawq.mapreduce.parquet.convert...
Loading source files for package com.pivotal.hawq.mapreduce.parquet...
Loading source files for package com.pivotal.hawq.mapreduce.parquet.support...
Loading source files for package com.pivotal.hawq.mapreduce.parquet.util...
Constructing Javadoc information...
100 errors
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] hawq-hadoop 1.1.0 .................................. FAILURE [ 7.035 s]
[INFO] hawq-mapreduce-common .............................. SKIPPED
[INFO] hawq-mapreduce-ao .................................. SKIPPED
[INFO] hawq-mapreduce-parquet ............................. SKIPPED
[INFO] hawq-mapreduce-tool 1.1.0 .......................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 7.235 s
[INFO] Finished at: 2018-07-05T13:50:16+08:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal
org.apache.maven.plugins:maven-javadoc-plugin:2.9.1:aggregate-jar (default) on
project hawq-hadoop: MavenReportException: Error while creating archive:
[ERROR] Exit code: 1 -
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/HAWQParquetInputFormat.java:34:
error: package parquet.hadoop does not exist
[ERROR] import parquet.hadoop.ParquetInputFormat;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/HAWQParquetInputFormat.java:42:
error: cannot find symbol
[ERROR] public class HAWQParquetInputFormat extends
ParquetInputFormat<HAWQRecord> {
[ERROR] ^
[ERROR] symbol: class ParquetInputFormat
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQBoxConverter.java:25:
error: package parquet.io.api does not exist
[ERROR] import parquet.io.api.Converter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQBoxConverter.java:26:
error: package parquet.io.api does not exist
[ERROR] import parquet.io.api.GroupConverter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQBoxConverter.java:27:
error: package parquet.io.api does not exist
[ERROR] import parquet.io.api.PrimitiveConverter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQBoxConverter.java:38:
error: cannot find symbol
[ERROR] public class HAWQBoxConverter extends GroupConverter {
[ERROR] ^
[ERROR] symbol: class GroupConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQBoxConverter.java:41:
error: cannot find symbol
[ERROR] private Converter[] converters;
[ERROR] ^
[ERROR] symbol: class Converter
[ERROR] location: class HAWQBoxConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQBoxConverter.java:78:
error: cannot find symbol
[ERROR] public Converter getConverter(int fieldIndex) {
[ERROR] ^
[ERROR] symbol: class Converter
[ERROR] location: class HAWQBoxConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQCircleConverter.java:25:
error: package parquet.io.api does not exist
[ERROR] import parquet.io.api.Converter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQCircleConverter.java:26:
error: package parquet.io.api does not exist
[ERROR] import parquet.io.api.GroupConverter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQCircleConverter.java:27:
error: package parquet.io.api does not exist
[ERROR] import parquet.io.api.PrimitiveConverter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQCircleConverter.java:37:
error: cannot find symbol
[ERROR] public class HAWQCircleConverter extends GroupConverter {
[ERROR] ^
[ERROR] symbol: class GroupConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQCircleConverter.java:40:
error: cannot find symbol
[ERROR] private Converter[] converters;
[ERROR] ^
[ERROR] symbol: class Converter
[ERROR] location: class HAWQCircleConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQCircleConverter.java:70:
error: cannot find symbol
[ERROR] public Converter getConverter(int fieldIndex) {
[ERROR] ^
[ERROR] symbol: class Converter
[ERROR] location: class HAWQCircleConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQLineSegmentConverter.java:25:
error: package parquet.io.api does not exist
[ERROR] import parquet.io.api.Converter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQLineSegmentConverter.java:26:
error: package parquet.io.api does not exist
[ERROR] import parquet.io.api.GroupConverter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQLineSegmentConverter.java:27:
error: package parquet.io.api does not exist
[ERROR] import parquet.io.api.PrimitiveConverter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQLineSegmentConverter.java:38:
error: cannot find symbol
[ERROR] public class HAWQLineSegmentConverter extends GroupConverter {
[ERROR] ^
[ERROR] symbol: class GroupConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQLineSegmentConverter.java:41:
error: cannot find symbol
[ERROR] private Converter[] converters;
[ERROR] ^
[ERROR] symbol: class Converter
[ERROR] location: class HAWQLineSegmentConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQLineSegmentConverter.java:78:
error: cannot find symbol
[ERROR] public Converter getConverter(int fieldIndex) {
[ERROR] ^
[ERROR] symbol: class Converter
[ERROR] location: class HAWQLineSegmentConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPathConverter.java:26:
error: package parquet.io.api does not exist
[ERROR] import parquet.io.api.Converter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPathConverter.java:27:
error: package parquet.io.api does not exist
[ERROR] import parquet.io.api.GroupConverter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPathConverter.java:42:
error: cannot find symbol
[ERROR] public class HAWQPathConverter extends GroupConverter {
[ERROR] ^
[ERROR] symbol: class GroupConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPathConverter.java:45:
error: cannot find symbol
[ERROR] private Converter[] converters;
[ERROR] ^
[ERROR] symbol: class Converter
[ERROR] location: class HAWQPathConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPathConverter.java:70:
error: cannot find symbol
[ERROR] public Converter getConverter(int fieldIndex) {
[ERROR] ^
[ERROR] symbol: class Converter
[ERROR] location: class HAWQPathConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPointConverter.java:25:
error: package parquet.io.api does not exist
[ERROR] import parquet.io.api.Converter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPointConverter.java:26:
error: package parquet.io.api does not exist
[ERROR] import parquet.io.api.GroupConverter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPointConverter.java:27:
error: package parquet.io.api does not exist
[ERROR] import parquet.io.api.PrimitiveConverter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPointConverter.java:36:
error: cannot find symbol
[ERROR] public class HAWQPointConverter extends GroupConverter {
[ERROR] ^
[ERROR] symbol: class GroupConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPointConverter.java:39:
error: cannot find symbol
[ERROR] private Converter[] converters;
[ERROR] ^
[ERROR] symbol: class Converter
[ERROR] location: class HAWQPointConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPointConverter.java:62:
error: cannot find symbol
[ERROR] public Converter getConverter(int fieldIndex) {
[ERROR] ^
[ERROR] symbol: class Converter
[ERROR] location: class HAWQPointConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPolygonConverter.java:27:
error: package parquet.io.api does not exist
[ERROR] import parquet.io.api.Converter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPolygonConverter.java:28:
error: package parquet.io.api does not exist
[ERROR] import parquet.io.api.GroupConverter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPolygonConverter.java:48:
error: cannot find symbol
[ERROR] public class HAWQPolygonConverter extends GroupConverter {
[ERROR] ^
[ERROR] symbol: class GroupConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPolygonConverter.java:51:
error: cannot find symbol
[ERROR] private Converter[] converters;
[ERROR] ^
[ERROR] symbol: class Converter
[ERROR] location: class HAWQPolygonConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPolygonConverter.java:77:
error: cannot find symbol
[ERROR] public Converter getConverter(int fieldIndex) {
[ERROR] ^
[ERROR] symbol: class Converter
[ERROR] location: class HAWQPolygonConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:29:
error: package parquet.io.api does not exist
[ERROR] import parquet.io.api.Binary;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:30:
error: package parquet.io.api does not exist
[ERROR] import parquet.io.api.Converter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:31:
error: package parquet.io.api does not exist
[ERROR] import parquet.io.api.GroupConverter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:32:
error: package parquet.io.api does not exist
[ERROR] import parquet.io.api.PrimitiveConverter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:33:
error: package parquet.schema does not exist
[ERROR] import parquet.schema.MessageType;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:44:
error: cannot find symbol
[ERROR] public class HAWQRecordConverter extends GroupConverter {
[ERROR] ^
[ERROR] symbol: class GroupConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:48:
error: cannot find symbol
[ERROR] private final Converter[] converters;
[ERROR] ^
[ERROR] symbol: class Converter
[ERROR] location: class HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:52:
error: cannot find symbol
[ERROR] public HAWQRecordConverter(MessageType requestedSchema, HAWQSchema
hawqSchema) {
[ERROR] ^
[ERROR] symbol: class MessageType
[ERROR] location: class HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:56:
error: cannot find symbol
[ERROR] public HAWQRecordConverter(ParentValueContainer parent, MessageType
requestedSchema, HAWQSchema hawqSchema) {
[ERROR] ^
[ERROR] symbol: class MessageType
[ERROR] location: class HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:201:
error: cannot find symbol
[ERROR] private Converter newConverter(HAWQField hawqType, ParentValueContainer
parent) {
[ERROR] ^
[ERROR] symbol: class Converter
[ERROR] location: class HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:259:
error: cannot find symbol
[ERROR] public Converter getConverter(int fieldIndex) {
[ERROR] ^
[ERROR] symbol: class Converter
[ERROR] location: class HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:285:
error: cannot find symbol
[ERROR] static class HAWQPrimitiveConverter extends PrimitiveConverter {
[ERROR] ^
[ERROR] symbol: class PrimitiveConverter
[ERROR] location: class HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:328:
error: cannot find symbol
[ERROR] static class HAWQShortConverter extends PrimitiveConverter {
[ERROR] ^
[ERROR] symbol: class PrimitiveConverter
[ERROR] location: class HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:343:
error: cannot find symbol
[ERROR] static class HAWQBigDecimalConverter extends PrimitiveConverter {
[ERROR] ^
[ERROR] symbol: class PrimitiveConverter
[ERROR] location: class HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:351:
error: cannot find symbol
[ERROR] public void addBinary(Binary value) {
[ERROR] ^
[ERROR] symbol: class Binary
[ERROR] location: class HAWQBigDecimalConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:361:
error: cannot find symbol
[ERROR] static class HAWQStringConverter extends PrimitiveConverter {
[ERROR] ^
[ERROR] symbol: class PrimitiveConverter
[ERROR] location: class HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:369:
error: cannot find symbol
[ERROR] public void addBinary(Binary value) {
[ERROR] ^
[ERROR] symbol: class Binary
[ERROR] location: class HAWQStringConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:378:
error: cannot find symbol
[ERROR] static class HAWQBitsConverter extends PrimitiveConverter {
[ERROR] ^
[ERROR] symbol: class PrimitiveConverter
[ERROR] location: class HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:386:
error: cannot find symbol
[ERROR] public void addBinary(Binary value) {
[ERROR] ^
[ERROR] symbol: class Binary
[ERROR] location: class HAWQBitsConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:395:
error: cannot find symbol
[ERROR] static class HAWQByteArrayConverter extends PrimitiveConverter {
[ERROR] ^
[ERROR] symbol: class PrimitiveConverter
[ERROR] location: class HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:403:
error: cannot find symbol
[ERROR] public void addBinary(Binary value) {
[ERROR] ^
[ERROR] symbol: class Binary
[ERROR] location: class HAWQByteArrayConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:411:
error: cannot find symbol
[ERROR] static class HAWQDateConverter extends PrimitiveConverter {
[ERROR] ^
[ERROR] symbol: class PrimitiveConverter
[ERROR] location: class HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:430:
error: cannot find symbol
[ERROR] static class HAWQTimeConverter extends PrimitiveConverter {
[ERROR] ^
[ERROR] symbol: class PrimitiveConverter
[ERROR] location: class HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:448:
error: cannot find symbol
[ERROR] static class HAWQTimeTZConverter extends PrimitiveConverter {
[ERROR] ^
[ERROR] symbol: class PrimitiveConverter
[ERROR] location: class HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:456:
error: cannot find symbol
[ERROR] public void addBinary(Binary value) {
[ERROR] ^
[ERROR] symbol: class Binary
[ERROR] location: class HAWQTimeTZConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:466:
error: cannot find symbol
[ERROR] static class HAWQTimestampConverter extends PrimitiveConverter {
[ERROR] ^
[ERROR] symbol: class PrimitiveConverter
[ERROR] location: class HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:484:
error: cannot find symbol
[ERROR] static class HAWQTimestampTZConverter extends PrimitiveConverter {
[ERROR] ^
[ERROR] symbol: class PrimitiveConverter
[ERROR] location: class HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:502:
error: cannot find symbol
[ERROR] static class HAWQIntervalConverter extends PrimitiveConverter {
[ERROR] ^
[ERROR] symbol: class PrimitiveConverter
[ERROR] location: class HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:510:
error: cannot find symbol
[ERROR] public void addBinary(Binary value) {
[ERROR] ^
[ERROR] symbol: class Binary
[ERROR] location: class HAWQIntervalConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:520:
error: cannot find symbol
[ERROR] static class HAWQMacaddrConverter extends PrimitiveConverter {
[ERROR] ^
[ERROR] symbol: class PrimitiveConverter
[ERROR] location: class HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:528:
error: cannot find symbol
[ERROR] public void addBinary(Binary value) {
[ERROR] ^
[ERROR] symbol: class Binary
[ERROR] location: class HAWQMacaddrConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:538:
error: cannot find symbol
[ERROR] static class HAWQInetConverter extends PrimitiveConverter {
[ERROR] ^
[ERROR] symbol: class PrimitiveConverter
[ERROR] location: class HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:546:
error: cannot find symbol
[ERROR] public void addBinary(Binary value) {
[ERROR] ^
[ERROR] symbol: class Binary
[ERROR] location: class HAWQInetConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:556:
error: cannot find symbol
[ERROR] static class HAWQCidrConverter extends PrimitiveConverter {
[ERROR] ^
[ERROR] symbol: class PrimitiveConverter
[ERROR] location: class HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:564:
error: cannot find symbol
[ERROR] public void addBinary(Binary value) {
[ERROR] ^
[ERROR] symbol: class Binary
[ERROR] location: class HAWQCidrConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordMaterializer.java:25:
error: package parquet.io.api does not exist
[ERROR] import parquet.io.api.GroupConverter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordMaterializer.java:26:
error: package parquet.io.api does not exist
[ERROR] import parquet.io.api.RecordMaterializer;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordMaterializer.java:27:
error: package parquet.schema does not exist
[ERROR] import parquet.schema.MessageType;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordMaterializer.java:36:
error: cannot find symbol
[ERROR] public class HAWQRecordMaterializer extends
RecordMaterializer<HAWQRecord> {
[ERROR] ^
[ERROR] symbol: class RecordMaterializer
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordMaterializer.java:40:
error: cannot find symbol
[ERROR] public HAWQRecordMaterializer(MessageType requestedSchema, HAWQSchema
hawqSchema) {
[ERROR] ^
[ERROR] symbol: class MessageType
[ERROR] location: class HAWQRecordMaterializer
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordMaterializer.java:50:
error: cannot find symbol
[ERROR] public GroupConverter getRootConverter() {
[ERROR] ^
[ERROR] symbol: class GroupConverter
[ERROR] location: class HAWQRecordMaterializer
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordWriter.java:29:
error: package parquet.io.api does not exist
[ERROR] import parquet.io.api.Binary;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordWriter.java:30:
error: package parquet.io.api does not exist
[ERROR] import parquet.io.api.RecordConsumer;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordWriter.java:39:
error: cannot find symbol
[ERROR] private RecordConsumer consumer;
[ERROR] ^
[ERROR] symbol: class RecordConsumer
[ERROR] location: class HAWQRecordWriter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordWriter.java:42:
error: cannot find symbol
[ERROR] public HAWQRecordWriter(RecordConsumer consumer, HAWQSchema schema) {
[ERROR] ^
[ERROR] symbol: class RecordConsumer
[ERROR] location: class HAWQRecordWriter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/HAWQParquetOutputFormat.java:27:
error: package parquet.hadoop does not exist
[ERROR] import parquet.hadoop.ParquetOutputFormat;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/HAWQParquetOutputFormat.java:28:
error: package parquet.hadoop.util does not exist
[ERROR] import parquet.hadoop.util.ContextUtil;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/HAWQParquetOutputFormat.java:30:
error: cannot find symbol
[ERROR] class HAWQParquetOutputFormat extends ParquetOutputFormat<HAWQRecord> {
[ERROR] ^
[ERROR] symbol: class ParquetOutputFormat
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQReadSupport.java:27:
error: package parquet.hadoop.api does not exist
[ERROR] import parquet.hadoop.api.ReadSupport;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQReadSupport.java:28:
error: package parquet.io.api does not exist
[ERROR] import parquet.io.api.RecordMaterializer;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQReadSupport.java:29:
error: package parquet.schema does not exist
[ERROR] import parquet.schema.MessageType;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQReadSupport.java:37:
error: cannot find symbol
[ERROR] public class HAWQReadSupport extends ReadSupport<HAWQRecord> {
[ERROR] ^
[ERROR] symbol: class ReadSupport
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQReadSupport.java:45:
error: cannot find symbol
[ERROR] MessageType fileSchema) {
[ERROR] ^
[ERROR] symbol: class MessageType
[ERROR] location: class HAWQReadSupport
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQReadSupport.java:43:
error: cannot find symbol
[ERROR] public ReadContext init(Configuration configuration,
[ERROR] ^
[ERROR] symbol: class ReadContext
[ERROR] location: class HAWQReadSupport
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQReadSupport.java:63:
error: cannot find symbol
[ERROR] MessageType fileSchema, ReadContext readContext) {
[ERROR] ^
[ERROR] symbol: class MessageType
[ERROR] location: class HAWQReadSupport
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQReadSupport.java:63:
error: cannot find symbol
[ERROR] MessageType fileSchema, ReadContext readContext) {
[ERROR] ^
[ERROR] symbol: class ReadContext
[ERROR] location: class HAWQReadSupport
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQReadSupport.java:61:
error: cannot find symbol
[ERROR] public RecordMaterializer<HAWQRecord> prepareForRead(Configuration configuration,
[ERROR] ^
[ERROR] symbol: class RecordMaterializer
[ERROR] location: class HAWQReadSupport
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQWriteSupport.java:29:
error: package parquet.hadoop.api does not exist
[ERROR] import parquet.hadoop.api.WriteSupport;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQWriteSupport.java:30:
error: package parquet.io.api does not exist
[ERROR] import parquet.io.api.RecordConsumer;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQWriteSupport.java:31:
error: package parquet.schema does not exist
[ERROR] import parquet.schema.MessageType;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQWriteSupport.java:40:
error: cannot find symbol
[ERROR] public class HAWQWriteSupport extends WriteSupport<HAWQRecord> {
[ERROR] ^
[ERROR] symbol: class WriteSupport
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQWriteSupport.java:46:
error: cannot find symbol
[ERROR] private MessageType parquetSchema;
[ERROR] ^
[ERROR] symbol: class MessageType
[ERROR] location: class HAWQWriteSupport
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQWriteSupport.java:54:
error: cannot find symbol
[ERROR] public WriteContext init(Configuration configuration) {
[ERROR] ^
[ERROR] symbol: class WriteContext
[ERROR] location: class HAWQWriteSupport
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQWriteSupport.java:65:
error: cannot find symbol
[ERROR] public void prepareForWrite(RecordConsumer recordConsumer) {
[ERROR] ^
[ERROR] symbol: class RecordConsumer
[ERROR] location: class HAWQWriteSupport
[ERROR]
[ERROR] Command line was:
/Library/Java/JavaVirtualMachines/jdk-10.0.1.jdk/Contents/Home/bin/javadoc
@options @packages
[ERROR]
[ERROR] Refer to the generated Javadoc files in
'/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/target/apidocs'
dir.
[ERROR]
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please
read the following articles:
[ERROR] [Help 1]
http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
make[2]: *** [hawq-mapreduce-tool/target/hawq-mapreduce-tool-1.1.0.jar] Error 1
make[1]: *** [all] Error 2
make: *** [all] Error 2{code}
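
The failing goal is maven-javadoc-plugin:2.9.1:aggregate-jar, which cannot put the unbuilt reactor jars (hawq-mapreduce-common, -ao, -parquet) on the javadoc classpath, so javadoc's own compile pass cannot resolve the parquet packages. Two possible workarounds (untested on this setup; they assume the javadoc jar itself is not needed for the build) are to skip javadoc generation via the plugin's standard maven.javadoc.skip property, or to install the reactor artifacts first, as the plugin's warnings suggest:

{code:sh}
# Skip the failing javadoc aggregation entirely
/usr/local/bin/mvn package -DskipTests -Dmaven.javadoc.skip=true

# Or install the reactor modules first so javadoc can resolve them
/usr/local/bin/mvn install -DskipTests
{code}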
was:
Follow instruction ([https://cwiki.apache.org/confluence/disp
/usr/local/bin/mvn package
-DskipTestslay/HAWQ/Build+and+Install)|https://cwiki.apache.org/confluence/display/HAWQ/Build+and+Install)]
to build apache hawq on osx 10.11, it fails due to Failed to execute goal
org.apache.maven.plugins:maven-javadoc-plugin:2.9.1:aggregate-jar
{code:java}
[INFO] --- maven-jar-plugin:2.4:test-jar (default) @ hawq-hadoop --- [WARNING]
JAR will be empty - no content was marked for inclusion! [WARNING] The
following dependencies could not be resolved at this point of the build but
seem to be part of the reactor: [WARNING] o
com.pivotal.hawq:hawq-mapreduce-common:jar:1.1.0 (compile) [WARNING] Try
running the build up to the lifecycle phase "package" [WARNING] The following
dependencies could not be resolved at this point of the build but seem to be
part of the reactor: [WARNING] o
com.pivotal.hawq:hawq-mapreduce-common:jar:1.1.0 (compile) [WARNING] Try
running the build up to the lifecycle phase "package" [WARNING] The following
dependencies could not be resolved at this point of the build but seem to be
part of the reactor: [WARNING] o com.pivotal.hawq:hawq-mapreduce-ao:jar:1.1.0
(compile) [WARNING] o com.pivotal.hawq:hawq-mapreduce-common:jar:1.1.0
(compile) [WARNING] o com.pivotal.hawq:hawq-mapreduce-parquet:jar:1.1.0
(compile) [WARNING] Try running the build up to the lifecycle phase "package"
[INFO] [INFO] --- maven-javadoc-plugin:2.9.1:aggregate-jar (default) @
hawq-hadoop --- [WARNING] The dependency:
[com.pivotal.hawq:hawq-mapreduce-common:jar:1.1.0] can't be resolved but has
been found in the reactor (probably snapshots). This dependency has been
excluded from the Javadoc classpath. You should rerun javadoc after executing
mvn install. [WARNING] IGNORED to add some artifacts in the classpath. See
above. [WARNING] The dependency:
[com.pivotal.hawq:hawq-mapreduce-common:jar:1.1.0] can't be resolved but has
been found in the reactor (probably snapshots). This dependency has been
excluded from the Javadoc classpath. You should rerun javadoc after executing
mvn install. [WARNING] IGNORED to add some artifacts in the classpath. See
above. [WARNING] The dependency:
[com.pivotal.hawq:hawq-mapreduce-common:jar:1.1.0] can't be resolved but has
been found in the reactor (probably snapshots). This dependency has been
excluded from the Javadoc classpath. You should rerun javadoc after executing
mvn install. [WARNING] The dependency:
[com.pivotal.hawq:hawq-mapreduce-ao:jar:1.1.0] can't be resolved but has been
found in the reactor (probably snapshots). This dependency has been excluded
from the Javadoc classpath. You should rerun javadoc after executing mvn
install. [WARNING] The dependency:
[com.pivotal.hawq:hawq-mapreduce-parquet:jar:1.1.0] can't be resolved but has
been found in the reactor (probably snapshots). This dependency has been
excluded from the Javadoc classpath. You should rerun javadoc after executing
mvn install. [WARNING] IGNORED to add some artifacts in the classpath. See
above.[ERROR] import parquet.hadoop.ParquetInputFormat;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/HAWQParquetInputFormat.java:43:
错误: 找不到符号
[ERROR] public class HAWQParquetInputFormat extends
ParquetInputFormat<HAWQRecord> {
[ERROR] ^
[ERROR] 符号: 类 ParquetInputFormat
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQBoxConverter.java:25:
错误: 程序包parquet.io.api不存在
[ERROR] import parquet.io.api.Converter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQBoxConverter.java:26:
错误: 程序包parquet.io.api不存在
[ERROR] import parquet.io.api.GroupConverter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQBoxConverter.java:27:
错误: 程序包parquet.io.api不存在
[ERROR] import parquet.io.api.PrimitiveConverter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQBoxConverter.java:38:
错误: 找不到符号
[ERROR] public class HAWQBoxConverter extends GroupConverter {
[ERROR] ^
[ERROR] 符号: 类 GroupConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQBoxConverter.java:41:
错误: 找不到符号
[ERROR] private Converter[] converters;
[ERROR] ^
[ERROR] 符号: 类 Converter
[ERROR] 位置: 类 HAWQBoxConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQBoxConverter.java:78:
错误: 找不到符号
[ERROR] public Converter getConverter(int fieldIndex) {
[ERROR] ^
[ERROR] 符号: 类 Converter
[ERROR] 位置: 类 HAWQBoxConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQCircleConverter.java:25:
错误: 程序包parquet.io.api不存在
[ERROR] import parquet.io.api.Converter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQCircleConverter.java:26:
错误: 程序包parquet.io.api不存在
[ERROR] import parquet.io.api.GroupConverter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQCircleConverter.java:27:
错误: 程序包parquet.io.api不存在
[ERROR] import parquet.io.api.PrimitiveConverter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQCircleConverter.java:37:
错误: 找不到符号
[ERROR] public class HAWQCircleConverter extends GroupConverter {
[ERROR] ^
[ERROR] 符号: 类 GroupConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQCircleConverter.java:40:
错误: 找不到符号
[ERROR] private Converter[] converters;
[ERROR] ^
[ERROR] 符号: 类 Converter
[ERROR] 位置: 类 HAWQCircleConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQCircleConverter.java:70:
错误: 找不到符号
[ERROR] public Converter getConverter(int fieldIndex) {
[ERROR] ^
[ERROR] 符号: 类 Converter
[ERROR] 位置: 类 HAWQCircleConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQLineSegmentConverter.java:25:
错误: 程序包parquet.io.api不存在
[ERROR] import parquet.io.api.Converter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQLineSegmentConverter.java:26:
错误: 程序包parquet.io.api不存在
[ERROR] import parquet.io.api.GroupConverter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQLineSegmentConverter.java:27:
错误: 程序包parquet.io.api不存在
[ERROR] import parquet.io.api.PrimitiveConverter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQLineSegmentConverter.java:38:
错误: 找不到符号
[ERROR] public class HAWQLineSegmentConverter extends GroupConverter {
[ERROR] ^
[ERROR] 符号: 类 GroupConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQLineSegmentConverter.java:41:
错误: 找不到符号
[ERROR] private Converter[] converters;
[ERROR] ^
[ERROR] 符号: 类 Converter
[ERROR] 位置: 类 HAWQLineSegmentConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQLineSegmentConverter.java:78:
错误: 找不到符号
[ERROR] public Converter getConverter(int fieldIndex) {
[ERROR] ^
[ERROR] 符号: 类 Converter
[ERROR] 位置: 类 HAWQLineSegmentConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPathConverter.java:26:
错误: 程序包parquet.io.api不存在
[ERROR] import parquet.io.api.Converter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPathConverter.java:27:
错误: 程序包parquet.io.api不存在
[ERROR] import parquet.io.api.GroupConverter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPathConverter.java:42:
错误: 找不到符号
[ERROR] public class HAWQPathConverter extends GroupConverter {
[ERROR] ^
[ERROR] 符号: 类 GroupConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPathConverter.java:45:
错误: 找不到符号
[ERROR] private Converter[] converters;
[ERROR] ^
[ERROR] 符号: 类 Converter
[ERROR] 位置: 类 HAWQPathConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPathConverter.java:70:
错误: 找不到符号
[ERROR] public Converter getConverter(int fieldIndex) {
[ERROR] ^
[ERROR] 符号: 类 Converter
[ERROR] 位置: 类 HAWQPathConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPointConverter.java:25:
错误: 程序包parquet.io.api不存在
[ERROR] import parquet.io.api.Converter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPointConverter.java:26:
错误: 程序包parquet.io.api不存在
[ERROR] import parquet.io.api.GroupConverter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPointConverter.java:27:
错误: 程序包parquet.io.api不存在
[ERROR] import parquet.io.api.PrimitiveConverter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPointConverter.java:36:
错误: 找不到符号
[ERROR] public class HAWQPointConverter extends GroupConverter {
[ERROR] ^
[ERROR] 符号: 类 GroupConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPointConverter.java:39:
错误: 找不到符号
[ERROR] private Converter[] converters;
[ERROR] ^
[ERROR] 符号: 类 Converter
[ERROR] 位置: 类 HAWQPointConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPointConverter.java:62:
错误: 找不到符号
[ERROR] public Converter getConverter(int fieldIndex) {
[ERROR] ^
[ERROR] 符号: 类 Converter
[ERROR] 位置: 类 HAWQPointConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPolygonConverter.java:27:
错误: 程序包parquet.io.api不存在
[ERROR] import parquet.io.api.Converter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPolygonConverter.java:28:
错误: 程序包parquet.io.api不存在
[ERROR] import parquet.io.api.GroupConverter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPolygonConverter.java:48:
错误: 找不到符号
[ERROR] public class HAWQPolygonConverter extends GroupConverter {
[ERROR] ^
[ERROR] 符号: 类 GroupConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPolygonConverter.java:51:
错误: 找不到符号
[ERROR] private Converter[] converters;
[ERROR] ^
[ERROR] 符号: 类 Converter
[ERROR] 位置: 类 HAWQPolygonConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPolygonConverter.java:77:
错误: 找不到符号
[ERROR] public Converter getConverter(int fieldIndex) {
[ERROR] ^
[ERROR] 符号: 类 Converter
[ERROR] 位置: 类 HAWQPolygonConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:29:
错误: 程序包parquet.io.api不存在
[ERROR] import parquet.io.api.Binary;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:30:
错误: 程序包parquet.io.api不存在
[ERROR] import parquet.io.api.Converter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:31:
错误: 程序包parquet.io.api不存在
[ERROR] import parquet.io.api.GroupConverter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:32:
错误: 程序包parquet.io.api不存在
[ERROR] import parquet.io.api.PrimitiveConverter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:33:
错误: 程序包parquet.schema不存在
[ERROR] import parquet.schema.MessageType;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:44:
错误: 找不到符号
[ERROR] public class HAWQRecordConverter extends GroupConverter {
[ERROR] ^
[ERROR] 符号: 类 GroupConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:48:
错误: 找不到符号
[ERROR] private final Converter[] converters;
[ERROR] ^
[ERROR] 符号: 类 Converter
[ERROR] 位置: 类 HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:52:
错误: 找不到符号
[ERROR] public HAWQRecordConverter(MessageType requestedSchema, HAWQSchema
hawqSchema) {
[ERROR] ^
[ERROR] 符号: 类 MessageType
[ERROR] 位置: 类 HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:56:
错误: 找不到符号
[ERROR] public HAWQRecordConverter(ParentValueContainer parent, MessageType
requestedSchema, HAWQSchema hawqSchema) {
[ERROR] ^
[ERROR] 符号: 类 MessageType
[ERROR] 位置: 类 HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:201:
错误: 找不到符号
[ERROR] private Converter newConverter(HAWQField hawqType, ParentValueContainer
parent) {
[ERROR] ^
[ERROR] 符号: 类 Converter
[ERROR] 位置: 类 HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:259:
错误: 找不到符号
[ERROR] public Converter getConverter(int fieldIndex) {
[ERROR] ^
[ERROR] 符号: 类 Converter
[ERROR] 位置: 类 HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:285:
错误: 找不到符号
[ERROR] static class HAWQPrimitiveConverter extends PrimitiveConverter {
[ERROR] ^
[ERROR] 符号: 类 PrimitiveConverter
[ERROR] 位置: 类 HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:328:
错误: 找不到符号
[ERROR] static class HAWQShortConverter extends PrimitiveConverter {
[ERROR] ^
[ERROR] 符号: 类 PrimitiveConverter
[ERROR] 位置: 类 HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:343:
错误: 找不到符号
[ERROR] static class HAWQBigDecimalConverter extends PrimitiveConverter {
[ERROR] ^
[ERROR] 符号: 类 PrimitiveConverter
[ERROR] 位置: 类 HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:351:
错误: 找不到符号
[ERROR] public void addBinary(Binary value) {
[ERROR] ^
[ERROR] 符号: 类 Binary
[ERROR] 位置: 类 HAWQBigDecimalConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:361:
错误: 找不到符号
[ERROR] static class HAWQStringConverter extends PrimitiveConverter {
[ERROR] ^
[ERROR] 符号: 类 PrimitiveConverter
[ERROR] 位置: 类 HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:369:
错误: 找不到符号
[ERROR] public void addBinary(Binary value) {
[ERROR] ^
[ERROR] 符号: 类 Binary
[ERROR] 位置: 类 HAWQStringConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:378:
错误: 找不到符号
[ERROR] static class HAWQBitsConverter extends PrimitiveConverter {
[ERROR] ^
[ERROR] 符号: 类 PrimitiveConverter
[ERROR] 位置: 类 HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:386:
错误: 找不到符号
[ERROR] public void addBinary(Binary value) {
[ERROR] ^
[ERROR] 符号: 类 Binary
[ERROR] 位置: 类 HAWQBitsConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:395:
错误: 找不到符号
[ERROR] static class HAWQByteArrayConverter extends PrimitiveConverter {
[ERROR] ^
[ERROR] 符号: 类 PrimitiveConverter
[ERROR] 位置: 类 HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:403:
错误: 找不到符号
[ERROR] public void addBinary(Binary value) {
[ERROR] ^
[ERROR] 符号: 类 Binary
[ERROR] 位置: 类 HAWQByteArrayConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:411:
错误: 找不到符号
[ERROR] static class HAWQDateConverter extends PrimitiveConverter {
[ERROR] ^
[ERROR] 符号: 类 PrimitiveConverter
[ERROR] 位置: 类 HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:430:
错误: 找不到符号
[ERROR] static class HAWQTimeConverter extends PrimitiveConverter {
[ERROR] ^
[ERROR] 符号: 类 PrimitiveConverter
[ERROR] 位置: 类 HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:448:
错误: 找不到符号
[ERROR] static class HAWQTimeTZConverter extends PrimitiveConverter {
[ERROR] ^
[ERROR] 符号: 类 PrimitiveConverter
[ERROR] 位置: 类 HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:456:
错误: 找不到符号
[ERROR] public void addBinary(Binary value) {
[ERROR] ^
[ERROR] 符号: 类 Binary
[ERROR] 位置: 类 HAWQTimeTZConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:466:
错误: 找不到符号
[ERROR] static class HAWQTimestampConverter extends PrimitiveConverter {
[ERROR] ^
[ERROR] 符号: 类 PrimitiveConverter
[ERROR] 位置: 类 HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:484:
错误: 找不到符号
[ERROR] static class HAWQTimestampTZConverter extends PrimitiveConverter {
[ERROR] ^
[ERROR] 符号: 类 PrimitiveConverter
[ERROR] 位置: 类 HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:502:
错误: 找不到符号
[ERROR] static class HAWQIntervalConverter extends PrimitiveConverter {
[ERROR] ^
[ERROR] 符号: 类 PrimitiveConverter
[ERROR] 位置: 类 HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:510:
错误: 找不到符号
[ERROR] public void addBinary(Binary value) {
[ERROR] ^
[ERROR] 符号: 类 Binary
[ERROR] 位置: 类 HAWQIntervalConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:520:
错误: 找不到符号
[ERROR] static class HAWQMacaddrConverter extends PrimitiveConverter {
[ERROR] ^
[ERROR] 符号: 类 PrimitiveConverter
[ERROR] 位置: 类 HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:528:
错误: 找不到符号
[ERROR] public void addBinary(Binary value) {
[ERROR] ^
[ERROR] 符号: 类 Binary
[ERROR] 位置: 类 HAWQMacaddrConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:538:
错误: 找不到符号
[ERROR] static class HAWQInetConverter extends PrimitiveConverter {
[ERROR] ^
[ERROR] 符号: 类 PrimitiveConverter
[ERROR] 位置: 类 HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:546:
错误: 找不到符号
[ERROR] public void addBinary(Binary value) {
[ERROR] ^
[ERROR] 符号: 类 Binary
[ERROR] 位置: 类 HAWQInetConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:556:
错误: 找不到符号
[ERROR] static class HAWQCidrConverter extends PrimitiveConverter {
[ERROR] ^
[ERROR] 符号: 类 PrimitiveConverter
[ERROR] 位置: 类 HAWQRecordConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:564:
错误: 找不到符号
[ERROR] public void addBinary(Binary value) {
[ERROR] ^
[ERROR] 符号: 类 Binary
[ERROR] 位置: 类 HAWQCidrConverter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordMaterializer.java:25:
错误: 程序包parquet.io.api不存在
[ERROR] import parquet.io.api.GroupConverter;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordMaterializer.java:26:
错误: 程序包parquet.io.api不存在
[ERROR] import parquet.io.api.RecordMaterializer;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordMaterializer.java:27:
错误: 程序包parquet.schema不存在
[ERROR] import parquet.schema.MessageType;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordMaterializer.java:36:
错误: 找不到符号
[ERROR] public class HAWQRecordMaterializer extends
RecordMaterializer<HAWQRecord> {
[ERROR] ^
[ERROR] 符号: 类 RecordMaterializer
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordMaterializer.java:40:
错误: 找不到符号
[ERROR] public HAWQRecordMaterializer(MessageType requestedSchema, HAWQSchema
hawqSchema) {
[ERROR] ^
[ERROR] 符号: 类 MessageType
[ERROR] 位置: 类 HAWQRecordMaterializer
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordMaterializer.java:50:
错误: 找不到符号
[ERROR] public GroupConverter getRootConverter() {
[ERROR] ^
[ERROR] 符号: 类 GroupConverter
[ERROR] 位置: 类 HAWQRecordMaterializer
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordWriter.java:29:
错误: 程序包parquet.io.api不存在
[ERROR] import parquet.io.api.Binary;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordWriter.java:30:
error: package parquet.io.api does not exist
[ERROR] import parquet.io.api.RecordConsumer;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordWriter.java:39:
error: cannot find symbol
[ERROR] private RecordConsumer consumer;
[ERROR] ^
[ERROR] symbol: class RecordConsumer
[ERROR] location: class HAWQRecordWriter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordWriter.java:42:
error: cannot find symbol
[ERROR] public HAWQRecordWriter(RecordConsumer consumer, HAWQSchema schema) {
[ERROR] ^
[ERROR] symbol: class RecordConsumer
[ERROR] location: class HAWQRecordWriter
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/HAWQParquetOutputFormat.java:27:
error: package parquet.hadoop does not exist
[ERROR] import parquet.hadoop.ParquetOutputFormat;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/HAWQParquetOutputFormat.java:28:
error: package parquet.hadoop.util does not exist
[ERROR] import parquet.hadoop.util.ContextUtil;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/HAWQParquetOutputFormat.java:30:
error: cannot find symbol
[ERROR] class HAWQParquetOutputFormat extends ParquetOutputFormat<HAWQRecord> {
[ERROR] ^
[ERROR] symbol: class ParquetOutputFormat
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQReadSupport.java:27:
error: package parquet.hadoop.api does not exist
[ERROR] import parquet.hadoop.api.ReadSupport;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQReadSupport.java:28:
error: package parquet.io.api does not exist
[ERROR] import parquet.io.api.RecordMaterializer;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQReadSupport.java:29:
error: package parquet.schema does not exist
[ERROR] import parquet.schema.MessageType;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQReadSupport.java:37:
error: cannot find symbol
[ERROR] public class HAWQReadSupport extends ReadSupport<HAWQRecord> {
[ERROR] ^
[ERROR] symbol: class ReadSupport
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQReadSupport.java:45:
error: cannot find symbol
[ERROR] MessageType fileSchema) {
[ERROR] ^
[ERROR] symbol: class MessageType
[ERROR] location: class HAWQReadSupport
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQReadSupport.java:43:
error: cannot find symbol
[ERROR] public ReadContext init(Configuration configuration,
[ERROR] ^
[ERROR] symbol: class ReadContext
[ERROR] location: class HAWQReadSupport
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQReadSupport.java:63:
error: cannot find symbol
[ERROR] MessageType fileSchema, ReadContext readContext) {
[ERROR] ^
[ERROR] symbol: class MessageType
[ERROR] location: class HAWQReadSupport
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQReadSupport.java:63:
error: cannot find symbol
[ERROR] MessageType fileSchema, ReadContext readContext) {
[ERROR] ^
[ERROR] symbol: class ReadContext
[ERROR] location: class HAWQReadSupport
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQReadSupport.java:61:
error: cannot find symbol
[ERROR] public RecordMaterializer<HAWQRecord> prepareForRead(Configuration configuration,
[ERROR] ^
[ERROR] symbol: class RecordMaterializer
[ERROR] location: class HAWQReadSupport
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQWriteSupport.java:29:
error: package parquet.hadoop.api does not exist
[ERROR] import parquet.hadoop.api.WriteSupport;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQWriteSupport.java:31:
error: package parquet.io.api does not exist
[ERROR] import parquet.io.api.RecordConsumer;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQWriteSupport.java:32:
error: package parquet.schema does not exist
[ERROR] import parquet.schema.MessageType;
[ERROR] ^
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQWriteSupport.java:41:
error: cannot find symbol
[ERROR] public class HAWQWriteSupport extends WriteSupport<HAWQRecord> {
[ERROR] ^
[ERROR] symbol: class WriteSupport
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQWriteSupport.java:47:
error: cannot find symbol
[ERROR] private MessageType parquetSchema;
[ERROR] ^
[ERROR] symbol: class MessageType
[ERROR] location: class HAWQWriteSupport
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQWriteSupport.java:55:
error: cannot find symbol
[ERROR] public WriteContext init(Configuration configuration) {
[ERROR] ^
[ERROR] symbol: class WriteContext
[ERROR] location: class HAWQWriteSupport
[ERROR]
/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQWriteSupport.java:66:
error: cannot find symbol
[ERROR] public void prepareForWrite(RecordConsumer recordConsumer) {
[ERROR] ^
[ERROR] symbol: class RecordConsumer
[ERROR] location: class HAWQWriteSupport
[ERROR]
[ERROR] Command line was:
/Library/Java/JavaVirtualMachines/jdk-10.0.1.jdk/Contents/Home/bin/javadoc
@options @packages
[ERROR]
[ERROR] Refer to the generated Javadoc files in
'/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/target/apidocs'
dir.
[ERROR]
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please
read the following articles:
[ERROR] [Help 1]
http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException{code}
> Compile apache hawq failure due to Failed to execute goal
> org.apache.maven.plugins:maven-javadoc-plugin:2.9.1:aggregate-jar on osx 10.11
> ----------------------------------------------------------------------------------------------------------------------------------------
>
> Key: HAWQ-1637
> URL: https://issues.apache.org/jira/browse/HAWQ-1637
> Project: Apache HAWQ
> Issue Type: Bug
> Components: Build
> Reporter: Oushu_WangZiming
> Assignee: Radar Lei
> Priority: Major
>
> Follow the instructions at [https://cwiki.apache.org/confluence/display/HAWQ/Build+and+Install] to build Apache HAWQ on OS X 10.11 with {{/usr/local/bin/mvn package -DskipTests}}; the build fails because Maven cannot execute the goal
> org.apache.maven.plugins:maven-javadoc-plugin:2.9.1:aggregate-jar.
>
> {code:java}
> /usr/local/bin/mvn package -DskipTests
> [INFO] Scanning for projects...
> [INFO]
> ------------------------------------------------------------------------
> [INFO] Reactor Build Order:
> [INFO]
> [INFO] hawq-hadoop [pom]
> [INFO] hawq-mapreduce-common [jar]
> [INFO] hawq-mapreduce-ao [jar]
> [INFO] hawq-mapreduce-parquet [jar]
> [INFO] hawq-mapreduce-tool [jar]
> [INFO]
> [INFO] --------------------< com.pivotal.hawq:hawq-hadoop
> >--------------------
> [INFO] Building hawq-hadoop 1.1.0 [1/5]
> [INFO] --------------------------------[ pom
> ]---------------------------------
> [INFO]
> [INFO] --- maven-jar-plugin:2.4:test-jar (default) @ hawq-hadoop ---
> [WARNING] JAR will be empty - no content was marked for inclusion!
> [WARNING] The following dependencies could not be resolved at this point of
> the build but seem to be part of the reactor:
> [WARNING] o com.pivotal.hawq:hawq-mapreduce-common:jar:1.1.0 (compile)
> [WARNING] Try running the build up to the lifecycle phase "package"
> [WARNING] The following dependencies could not be resolved at this point of
> the build but seem to be part of the reactor:
> [WARNING] o com.pivotal.hawq:hawq-mapreduce-common:jar:1.1.0 (compile)
> [WARNING] Try running the build up to the lifecycle phase "package"
> [WARNING] The following dependencies could not be resolved at this point of
> the build but seem to be part of the reactor:
> [WARNING] o com.pivotal.hawq:hawq-mapreduce-ao:jar:1.1.0 (compile)
> [WARNING] o com.pivotal.hawq:hawq-mapreduce-common:jar:1.1.0 (compile)
> [WARNING] o com.pivotal.hawq:hawq-mapreduce-parquet:jar:1.1.0 (compile)
> [WARNING] Try running the build up to the lifecycle phase "package"
> [INFO]
> [INFO] --- maven-javadoc-plugin:2.9.1:aggregate-jar (default) @ hawq-hadoop
> ---
> [WARNING] The dependency: [com.pivotal.hawq:hawq-mapreduce-common:jar:1.1.0]
> can't be resolved but has been found in the reactor (probably snapshots).
> This dependency has been excluded from the Javadoc classpath. You should
> rerun javadoc after executing mvn install.
> [WARNING] IGNORED to add some artifacts in the classpath. See above.
> [WARNING] The dependency: [com.pivotal.hawq:hawq-mapreduce-common:jar:1.1.0]
> can't be resolved but has been found in the reactor (probably snapshots).
> This dependency has been excluded from the Javadoc classpath. You should
> rerun javadoc after executing mvn install.
> [WARNING] IGNORED to add some artifacts in the classpath. See above.
> [WARNING] The dependency: [com.pivotal.hawq:hawq-mapreduce-common:jar:1.1.0]
> can't be resolved but has been found in the reactor (probably snapshots).
> This dependency has been excluded from the Javadoc classpath. You should
> rerun javadoc after executing mvn install.
> [WARNING] The dependency: [com.pivotal.hawq:hawq-mapreduce-ao:jar:1.1.0]
> can't be resolved but has been found in the reactor (probably snapshots).
> This dependency has been excluded from the Javadoc classpath. You should
> rerun javadoc after executing mvn install.
> [WARNING] The dependency: [com.pivotal.hawq:hawq-mapreduce-parquet:jar:1.1.0]
> can't be resolved but has been found in the reactor (probably snapshots).
> This dependency has been excluded from the Javadoc classpath. You should
> rerun javadoc after executing mvn install.
> [WARNING] IGNORED to add some artifacts in the classpath. See above.
> [INFO]
> Loading source files for package com.pivotal.hawq.mapreduce.conf...
> Loading source files for package com.pivotal.hawq.mapreduce.datatype...
> Loading source files for package com.pivotal.hawq.mapreduce.file...
> Loading source files for package com.pivotal.hawq.mapreduce...
> Loading source files for package com.pivotal.hawq.mapreduce.metadata...
> Loading source files for package com.pivotal.hawq.mapreduce.schema...
> Loading source files for package com.pivotal.hawq.mapreduce.util...
> Loading source files for package com.pivotal.hawq.mapreduce.ao.file...
> Loading source files for package com.pivotal.hawq.mapreduce.ao...
> Loading source files for package com.pivotal.hawq.mapreduce.ao.io...
> Loading source files for package com.pivotal.hawq.mapreduce.ao.util...
> Loading source files for package com.pivotal.hawq.mapreduce.parquet.convert...
> Loading source files for package com.pivotal.hawq.mapreduce.parquet...
> Loading source files for package com.pivotal.hawq.mapreduce.parquet.support...
> Loading source files for package com.pivotal.hawq.mapreduce.parquet.util...
> Constructing Javadoc information...
> 100 errors
> [INFO]
> ------------------------------------------------------------------------
> [INFO] Reactor Summary:
> [INFO]
> [INFO] hawq-hadoop 1.1.0 .................................. FAILURE [ 7.035 s]
> [INFO] hawq-mapreduce-common .............................. SKIPPED
> [INFO] hawq-mapreduce-ao .................................. SKIPPED
> [INFO] hawq-mapreduce-parquet ............................. SKIPPED
> [INFO] hawq-mapreduce-tool 1.1.0 .......................... SKIPPED
> [INFO]
> ------------------------------------------------------------------------
> [INFO] BUILD FAILURE
> [INFO]
> ------------------------------------------------------------------------
> [INFO] Total time: 7.235 s
> [INFO] Finished at: 2018-07-05T13:50:16+08:00
> [INFO]
> ------------------------------------------------------------------------
> [ERROR] Failed to execute goal
> org.apache.maven.plugins:maven-javadoc-plugin:2.9.1:aggregate-jar (default)
> on project hawq-hadoop: MavenReportException: Error while creating archive:
> [ERROR] Exit code: 1 -
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/HAWQParquetInputFormat.java:34:
> error: package parquet.hadoop does not exist
> [ERROR] import parquet.hadoop.ParquetInputFormat;
> [ERROR] ^
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/HAWQParquetInputFormat.java:42:
> error: cannot find symbol
> [ERROR] public class HAWQParquetInputFormat extends ParquetInputFormat<HAWQRecord> {
> [ERROR] ^
> [ERROR] symbol: class ParquetInputFormat
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQBoxConverter.java:25:
> error: package parquet.io.api does not exist
> [ERROR] import parquet.io.api.Converter;
> [ERROR] ^
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQBoxConverter.java:26:
> error: package parquet.io.api does not exist
> [ERROR] import parquet.io.api.GroupConverter;
> [ERROR] ^
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQBoxConverter.java:27:
> error: package parquet.io.api does not exist
> [ERROR] import parquet.io.api.PrimitiveConverter;
> [ERROR] ^
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQBoxConverter.java:38:
> error: cannot find symbol
> [ERROR] public class HAWQBoxConverter extends GroupConverter {
> [ERROR] ^
> [ERROR] symbol: class GroupConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQBoxConverter.java:41:
> error: cannot find symbol
> [ERROR] private Converter[] converters;
> [ERROR] ^
> [ERROR] symbol: class Converter
> [ERROR] location: class HAWQBoxConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQBoxConverter.java:78:
> error: cannot find symbol
> [ERROR] public Converter getConverter(int fieldIndex) {
> [ERROR] ^
> [ERROR] symbol: class Converter
> [ERROR] location: class HAWQBoxConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQCircleConverter.java:25:
> error: package parquet.io.api does not exist
> [ERROR] import parquet.io.api.Converter;
> [ERROR] ^
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQCircleConverter.java:26:
> error: package parquet.io.api does not exist
> [ERROR] import parquet.io.api.GroupConverter;
> [ERROR] ^
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQCircleConverter.java:27:
> error: package parquet.io.api does not exist
> [ERROR] import parquet.io.api.PrimitiveConverter;
> [ERROR] ^
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQCircleConverter.java:37:
> error: cannot find symbol
> [ERROR] public class HAWQCircleConverter extends GroupConverter {
> [ERROR] ^
> [ERROR] symbol: class GroupConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQCircleConverter.java:40:
> error: cannot find symbol
> [ERROR] private Converter[] converters;
> [ERROR] ^
> [ERROR] symbol: class Converter
> [ERROR] location: class HAWQCircleConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQCircleConverter.java:70:
> error: cannot find symbol
> [ERROR] public Converter getConverter(int fieldIndex) {
> [ERROR] ^
> [ERROR] symbol: class Converter
> [ERROR] location: class HAWQCircleConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQLineSegmentConverter.java:25:
> error: package parquet.io.api does not exist
> [ERROR] import parquet.io.api.Converter;
> [ERROR] ^
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQLineSegmentConverter.java:26:
> error: package parquet.io.api does not exist
> [ERROR] import parquet.io.api.GroupConverter;
> [ERROR] ^
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQLineSegmentConverter.java:27:
> error: package parquet.io.api does not exist
> [ERROR] import parquet.io.api.PrimitiveConverter;
> [ERROR] ^
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQLineSegmentConverter.java:38:
> error: cannot find symbol
> [ERROR] public class HAWQLineSegmentConverter extends GroupConverter {
> [ERROR] ^
> [ERROR] symbol: class GroupConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQLineSegmentConverter.java:41:
> error: cannot find symbol
> [ERROR] private Converter[] converters;
> [ERROR] ^
> [ERROR] symbol: class Converter
> [ERROR] location: class HAWQLineSegmentConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQLineSegmentConverter.java:78:
> error: cannot find symbol
> [ERROR] public Converter getConverter(int fieldIndex) {
> [ERROR] ^
> [ERROR] symbol: class Converter
> [ERROR] location: class HAWQLineSegmentConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPathConverter.java:26:
> error: package parquet.io.api does not exist
> [ERROR] import parquet.io.api.Converter;
> [ERROR] ^
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPathConverter.java:27:
> error: package parquet.io.api does not exist
> [ERROR] import parquet.io.api.GroupConverter;
> [ERROR] ^
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPathConverter.java:42:
> error: cannot find symbol
> [ERROR] public class HAWQPathConverter extends GroupConverter {
> [ERROR] ^
> [ERROR] symbol: class GroupConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPathConverter.java:45:
> error: cannot find symbol
> [ERROR] private Converter[] converters;
> [ERROR] ^
> [ERROR] symbol: class Converter
> [ERROR] location: class HAWQPathConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPathConverter.java:70:
> error: cannot find symbol
> [ERROR] public Converter getConverter(int fieldIndex) {
> [ERROR] ^
> [ERROR] symbol: class Converter
> [ERROR] location: class HAWQPathConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPointConverter.java:25:
> error: package parquet.io.api does not exist
> [ERROR] import parquet.io.api.Converter;
> [ERROR] ^
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPointConverter.java:26:
> error: package parquet.io.api does not exist
> [ERROR] import parquet.io.api.GroupConverter;
> [ERROR] ^
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPointConverter.java:27:
> error: package parquet.io.api does not exist
> [ERROR] import parquet.io.api.PrimitiveConverter;
> [ERROR] ^
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPointConverter.java:36:
> error: cannot find symbol
> [ERROR] public class HAWQPointConverter extends GroupConverter {
> [ERROR] ^
> [ERROR] symbol: class GroupConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPointConverter.java:39:
> error: cannot find symbol
> [ERROR] private Converter[] converters;
> [ERROR] ^
> [ERROR] symbol: class Converter
> [ERROR] location: class HAWQPointConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPointConverter.java:62:
> error: cannot find symbol
> [ERROR] public Converter getConverter(int fieldIndex) {
> [ERROR] ^
> [ERROR] symbol: class Converter
> [ERROR] location: class HAWQPointConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPolygonConverter.java:27:
> error: package parquet.io.api does not exist
> [ERROR] import parquet.io.api.Converter;
> [ERROR] ^
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPolygonConverter.java:28:
> error: package parquet.io.api does not exist
> [ERROR] import parquet.io.api.GroupConverter;
> [ERROR] ^
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPolygonConverter.java:48:
> error: cannot find symbol
> [ERROR] public class HAWQPolygonConverter extends GroupConverter {
> [ERROR] ^
> [ERROR] symbol: class GroupConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPolygonConverter.java:51:
> error: cannot find symbol
> [ERROR] private Converter[] converters;
> [ERROR] ^
> [ERROR] symbol: class Converter
> [ERROR] location: class HAWQPolygonConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQPolygonConverter.java:77:
> error: cannot find symbol
> [ERROR] public Converter getConverter(int fieldIndex) {
> [ERROR] ^
> [ERROR] symbol: class Converter
> [ERROR] location: class HAWQPolygonConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:29:
> error: package parquet.io.api does not exist
> [ERROR] import parquet.io.api.Binary;
> [ERROR] ^
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:30:
> error: package parquet.io.api does not exist
> [ERROR] import parquet.io.api.Converter;
> [ERROR] ^
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:31:
> error: package parquet.io.api does not exist
> [ERROR] import parquet.io.api.GroupConverter;
> [ERROR] ^
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:32:
> error: package parquet.io.api does not exist
> [ERROR] import parquet.io.api.PrimitiveConverter;
> [ERROR] ^
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:33:
> error: package parquet.schema does not exist
> [ERROR] import parquet.schema.MessageType;
> [ERROR] ^
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:44:
> error: cannot find symbol
> [ERROR] public class HAWQRecordConverter extends GroupConverter {
> [ERROR] ^
> [ERROR] symbol: class GroupConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:48:
> error: cannot find symbol
> [ERROR] private final Converter[] converters;
> [ERROR] ^
> [ERROR] symbol: class Converter
> [ERROR] location: class HAWQRecordConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:52:
> error: cannot find symbol
> [ERROR] public HAWQRecordConverter(MessageType requestedSchema, HAWQSchema hawqSchema) {
> [ERROR] ^
> [ERROR] symbol: class MessageType
> [ERROR] location: class HAWQRecordConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:56:
> error: cannot find symbol
> [ERROR] public HAWQRecordConverter(ParentValueContainer parent, MessageType requestedSchema, HAWQSchema hawqSchema) {
> [ERROR] ^
> [ERROR] symbol: class MessageType
> [ERROR] location: class HAWQRecordConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:201:
> error: cannot find symbol
> [ERROR] private Converter newConverter(HAWQField hawqType, ParentValueContainer parent) {
> [ERROR] ^
> [ERROR] symbol: class Converter
> [ERROR] location: class HAWQRecordConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:259:
> error: cannot find symbol
> [ERROR] public Converter getConverter(int fieldIndex) {
> [ERROR] ^
> [ERROR] symbol: class Converter
> [ERROR] location: class HAWQRecordConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:285:
> error: cannot find symbol
> [ERROR] static class HAWQPrimitiveConverter extends PrimitiveConverter {
> [ERROR] ^
> [ERROR] symbol: class PrimitiveConverter
> [ERROR] location: class HAWQRecordConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:328:
> error: cannot find symbol
> [ERROR] static class HAWQShortConverter extends PrimitiveConverter {
> [ERROR] ^
> [ERROR] symbol: class PrimitiveConverter
> [ERROR] location: class HAWQRecordConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:343:
> error: cannot find symbol
> [ERROR] static class HAWQBigDecimalConverter extends PrimitiveConverter {
> [ERROR] ^
> [ERROR] symbol: class PrimitiveConverter
> [ERROR] location: class HAWQRecordConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:351:
> error: cannot find symbol
> [ERROR] public void addBinary(Binary value) {
> [ERROR] ^
> [ERROR] symbol: class Binary
> [ERROR] location: class HAWQBigDecimalConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:361:
> error: cannot find symbol
> [ERROR] static class HAWQStringConverter extends PrimitiveConverter {
> [ERROR] ^
> [ERROR] symbol: class PrimitiveConverter
> [ERROR] location: class HAWQRecordConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:369:
> error: cannot find symbol
> [ERROR] public void addBinary(Binary value) {
> [ERROR] ^
> [ERROR] symbol: class Binary
> [ERROR] location: class HAWQStringConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:378:
> error: cannot find symbol
> [ERROR] static class HAWQBitsConverter extends PrimitiveConverter {
> [ERROR] ^
> [ERROR] symbol: class PrimitiveConverter
> [ERROR] location: class HAWQRecordConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:386:
> error: cannot find symbol
> [ERROR] public void addBinary(Binary value) {
> [ERROR] ^
> [ERROR] symbol: class Binary
> [ERROR] location: class HAWQBitsConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:395:
> error: cannot find symbol
> [ERROR] static class HAWQByteArrayConverter extends PrimitiveConverter {
> [ERROR] ^
> [ERROR] symbol: class PrimitiveConverter
> [ERROR] location: class HAWQRecordConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:403:
> error: cannot find symbol
> [ERROR] public void addBinary(Binary value) {
> [ERROR] ^
> [ERROR] symbol: class Binary
> [ERROR] location: class HAWQByteArrayConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:411:
> error: cannot find symbol
> [ERROR] static class HAWQDateConverter extends PrimitiveConverter {
> [ERROR] ^
> [ERROR] symbol: class PrimitiveConverter
> [ERROR] location: class HAWQRecordConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:430:
> error: cannot find symbol
> [ERROR] static class HAWQTimeConverter extends PrimitiveConverter {
> [ERROR] ^
> [ERROR] symbol: class PrimitiveConverter
> [ERROR] location: class HAWQRecordConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:448:
> error: cannot find symbol
> [ERROR] static class HAWQTimeTZConverter extends PrimitiveConverter {
> [ERROR] ^
> [ERROR] symbol: class PrimitiveConverter
> [ERROR] location: class HAWQRecordConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:456:
> error: cannot find symbol
> [ERROR] public void addBinary(Binary value) {
> [ERROR] ^
> [ERROR] symbol: class Binary
> [ERROR] location: class HAWQTimeTZConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:466:
> error: cannot find symbol
> [ERROR] static class HAWQTimestampConverter extends PrimitiveConverter {
> [ERROR] ^
> [ERROR] symbol: class PrimitiveConverter
> [ERROR] location: class HAWQRecordConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:484:
> error: cannot find symbol
> [ERROR] static class HAWQTimestampTZConverter extends PrimitiveConverter {
> [ERROR] ^
> [ERROR] symbol: class PrimitiveConverter
> [ERROR] location: class HAWQRecordConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:502:
> error: cannot find symbol
> [ERROR] static class HAWQIntervalConverter extends PrimitiveConverter {
> [ERROR] ^
> [ERROR] symbol: class PrimitiveConverter
> [ERROR] location: class HAWQRecordConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:510:
> error: cannot find symbol
> [ERROR] public void addBinary(Binary value) {
> [ERROR] ^
> [ERROR] symbol: class Binary
> [ERROR] location: class HAWQIntervalConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:520:
> error: cannot find symbol
> [ERROR] static class HAWQMacaddrConverter extends PrimitiveConverter {
> [ERROR] ^
> [ERROR] symbol: class PrimitiveConverter
> [ERROR] location: class HAWQRecordConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:528:
> error: cannot find symbol
> [ERROR] public void addBinary(Binary value) {
> [ERROR] ^
> [ERROR] symbol: class Binary
> [ERROR] location: class HAWQMacaddrConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:538:
> error: cannot find symbol
> [ERROR] static class HAWQInetConverter extends PrimitiveConverter {
> [ERROR] ^
> [ERROR] symbol: class PrimitiveConverter
> [ERROR] location: class HAWQRecordConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:546:
> error: cannot find symbol
> [ERROR] public void addBinary(Binary value) {
> [ERROR] ^
> [ERROR] symbol: class Binary
> [ERROR] location: class HAWQInetConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:556:
> error: cannot find symbol
> [ERROR] static class HAWQCidrConverter extends PrimitiveConverter {
> [ERROR] ^
> [ERROR] symbol: class PrimitiveConverter
> [ERROR] location: class HAWQRecordConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordConverter.java:564:
> error: cannot find symbol
> [ERROR] public void addBinary(Binary value) {
> [ERROR] ^
> [ERROR] symbol: class Binary
> [ERROR] location: class HAWQCidrConverter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordMaterializer.java:25:
> 错误: 程序包parquet.io.api不存在
> [ERROR] import parquet.io.api.GroupConverter;
> [ERROR] ^
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordMaterializer.java:26:
> 错误: 程序包parquet.io.api不存在
> [ERROR] import parquet.io.api.RecordMaterializer;
> [ERROR] ^
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordMaterializer.java:27:
> 错误: 程序包parquet.schema不存在
> [ERROR] import parquet.schema.MessageType;
> [ERROR] ^
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordMaterializer.java:36:
> 错误: 找不到符号
> [ERROR] public class HAWQRecordMaterializer extends
> RecordMaterializer<HAWQRecord> {
> [ERROR] ^
> [ERROR] 符号: 类 RecordMaterializer
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordMaterializer.java:40:
> 错误: 找不到符号
> [ERROR] public HAWQRecordMaterializer(MessageType requestedSchema, HAWQSchema
> hawqSchema) {
> [ERROR] ^
> [ERROR] 符号: 类 MessageType
> [ERROR] 位置: 类 HAWQRecordMaterializer
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordMaterializer.java:50:
> 错误: 找不到符号
> [ERROR] public GroupConverter getRootConverter() {
> [ERROR] ^
> [ERROR] 符号: 类 GroupConverter
> [ERROR] 位置: 类 HAWQRecordMaterializer
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordWriter.java:29:
> 错误: 程序包parquet.io.api不存在
> [ERROR] import parquet.io.api.Binary;
> [ERROR] ^
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordWriter.java:30:
> 错误: 程序包parquet.io.api不存在
> [ERROR] import parquet.io.api.RecordConsumer;
> [ERROR] ^
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordWriter.java:39:
> 错误: 找不到符号
> [ERROR] private RecordConsumer consumer;
> [ERROR] ^
> [ERROR] 符号: 类 RecordConsumer
> [ERROR] 位置: 类 HAWQRecordWriter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/convert/HAWQRecordWriter.java:42:
> 错误: 找不到符号
> [ERROR] public HAWQRecordWriter(RecordConsumer consumer, HAWQSchema schema) {
> [ERROR] ^
> [ERROR] 符号: 类 RecordConsumer
> [ERROR] 位置: 类 HAWQRecordWriter
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/HAWQParquetOutputFormat.java:27:
> 错误: 程序包parquet.hadoop不存在
> [ERROR] import parquet.hadoop.ParquetOutputFormat;
> [ERROR] ^
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/HAWQParquetOutputFormat.java:28:
> 错误: 程序包parquet.hadoop.util不存在
> [ERROR] import parquet.hadoop.util.ContextUtil;
> [ERROR] ^
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/HAWQParquetOutputFormat.java:30:
> 错误: 找不到符号
> [ERROR] class HAWQParquetOutputFormat extends ParquetOutputFormat<HAWQRecord>
> {
> [ERROR] ^
> [ERROR] 符号: 类 ParquetOutputFormat
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQReadSupport.java:27:
> 错误: 程序包parquet.hadoop.api不存在
> [ERROR] import parquet.hadoop.api.ReadSupport;
> [ERROR] ^
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQReadSupport.java:28:
> 错误: 程序包parquet.io.api不存在
> [ERROR] import parquet.io.api.RecordMaterializer;
> [ERROR] ^
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQReadSupport.java:29:
> 错误: 程序包parquet.schema不存在
> [ERROR] import parquet.schema.MessageType;
> [ERROR] ^
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQReadSupport.java:37:
> 错误: 找不到符号
> [ERROR] public class HAWQReadSupport extends ReadSupport<HAWQRecord> {
> [ERROR] ^
> [ERROR] 符号: 类 ReadSupport
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQReadSupport.java:45:
> 错误: 找不到符号
> [ERROR] MessageType fileSchema) {
> [ERROR] ^
> [ERROR] 符号: 类 MessageType
> [ERROR] 位置: 类 HAWQReadSupport
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQReadSupport.java:43:
> 错误: 找不到符号
> [ERROR] public ReadContext init(Configuration configuration,
> [ERROR] ^
> [ERROR] 符号: 类 ReadContext
> [ERROR] 位置: 类 HAWQReadSupport
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQReadSupport.java:63:
> 错误: 找不到符号
> [ERROR] MessageType fileSchema, ReadContext readContext) {
> [ERROR] ^
> [ERROR] 符号: 类 MessageType
> [ERROR] 位置: 类 HAWQReadSupport
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQReadSupport.java:63:
> 错误: 找不到符号
> [ERROR] MessageType fileSchema, ReadContext readContext) {
> [ERROR] ^
> [ERROR] 符号: 类 ReadContext
> [ERROR] 位置: 类 HAWQReadSupport
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQReadSupport.java:61:
> 错误: 找不到符号
> [ERROR] public RecordMaterializer<HAWQRecord> prepareForRead(Configuration
> configuration,
> [ERROR] ^
> [ERROR] 符号: 类 RecordMaterializer
> [ERROR] 位置: 类 HAWQReadSupport
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQWriteSupport.java:29:
> 错误: 程序包parquet.hadoop.api不存在
> [ERROR] import parquet.hadoop.api.WriteSupport;
> [ERROR] ^
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQWriteSupport.java:30:
> 错误: 程序包parquet.io.api不存在
> [ERROR] import parquet.io.api.RecordConsumer;
> [ERROR] ^
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQWriteSupport.java:31:
> 错误: 程序包parquet.schema不存在
> [ERROR] import parquet.schema.MessageType;
> [ERROR] ^
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQWriteSupport.java:40:
> 错误: 找不到符号
> [ERROR] public class HAWQWriteSupport extends WriteSupport<HAWQRecord> {
> [ERROR] ^
> [ERROR] 符号: 类 WriteSupport
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQWriteSupport.java:46:
> 错误: 找不到符号
> [ERROR] private MessageType parquetSchema;
> [ERROR] ^
> [ERROR] 符号: 类 MessageType
> [ERROR] 位置: 类 HAWQWriteSupport
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQWriteSupport.java:54:
> 错误: 找不到符号
> [ERROR] public WriteContext init(Configuration configuration) {
> [ERROR] ^
> [ERROR] 符号: 类 WriteContext
> [ERROR] 位置: 类 HAWQWriteSupport
> [ERROR]
> /Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/hawq-mapreduce-parquet/src/main/java/com/pivotal/hawq/mapreduce/parquet/support/HAWQWriteSupport.java:65:
> 错误: 找不到符号
> [ERROR] public void prepareForWrite(RecordConsumer recordConsumer) {
> [ERROR] ^
> [ERROR] 符号: 类 RecordConsumer
> [ERROR] 位置: 类 HAWQWriteSupport
> [ERROR]
> [ERROR] Command line was:
> /Library/Java/JavaVirtualMachines/jdk-10.0.1.jdk/Contents/Home/bin/javadoc
> @options @packages
> [ERROR]
> [ERROR] Refer to the generated Javadoc files in
> '/Users/wangziming/workplace/incubator-hawq/contrib/hawq-hadoop/target/apidocs'
> dir.
> [ERROR]
> [ERROR] -> [Help 1]
> [ERROR]
> [ERROR] To see the full stack trace of the errors, re-run Maven with the -e
> switch.
> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> [ERROR]
> [ERROR] For more information about the errors and possible solutions, please
> read the following articles:
> [ERROR] [Help 1]
> http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
> make[2]: *** [hawq-mapreduce-tool/target/hawq-mapreduce-tool-1.1.0.jar] Error
> 1
> make[1]: *** [all] Error 2
> make: *** [all] Error 2{code}
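
The log shows the failure occurring in maven-javadoc-plugin:2.9.1's aggregate-jar goal, invoked with the JDK 10 javadoc tool, which cannot resolve the parquet.* dependencies on the javadoc classpath. This suggests a possible workaround (an assumption on my part, not verified against this build): skip javadoc generation entirely via the plugin's standard `maven.javadoc.skip` property, e.g.:

{code:java}
# Unverified workaround: disable the maven-javadoc-plugin goals so the
# package phase can complete without invoking the JDK 10 javadoc tool.
/usr/local/bin/mvn package -DskipTests -Dmaven.javadoc.skip=true
{code}

Alternatively, building under JDK 8 (the toolchain maven-javadoc-plugin 2.9.1 predates JDK 9+'s javadoc changes for) may avoid the error; either approach would need to be confirmed on the reporter's osx 10.11 environment.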
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)