stoty commented on a change in pull request #26:
URL: https://github.com/apache/phoenix-connectors/pull/26#discussion_r470619252
##########
File path: phoenix-hive-base/src/main/java/org/apache/phoenix/hive/util/PhoenixStorageHandlerUtilBase.java
##########
@@ -20,18 +20,27 @@
import java.io.ByteArrayInputStream;
Review comment:
Please mark the *Base classes abstract, to make sure no one tries to
instantiate them.
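For illustration, something along these lines (class and member names below are simplified stand-ins, not the actual Phoenix code):

```java
// Minimal sketch: an abstract *Base utility class that cannot be
// instantiated directly. Names are illustrative, not the real Phoenix ones.
public abstract class StorageHandlerUtilBase {

    // Protected constructor: only subclasses may invoke it.
    protected StorageHandlerUtilBase() {
    }

    // Shared helper logic lives in the base class; version-specific
    // behavior goes into the phoenix4/phoenix5 subclasses.
    public static String joinHostPort(String host, int port) {
        return host + ":" + port;
    }
}
```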
##########
File path: phoenix-hive-base/phoenix5-hive/src/main/java/org/apache/phoenix/hive/SpecificPhoenixSeriazer.java
##########
@@ -15,21 +15,21 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
-
package org.apache.phoenix.hive;
-import static org.junit.Assert.fail;
-
-import org.junit.BeforeClass;
+import org.apache.hadoop.hive.common.type.Date;
+import org.apache.hadoop.hive.common.type.Timestamp;
-public class HiveMapReduceIT extends HivePhoenixStoreIT {
-
- @BeforeClass
- public static void setUpBeforeClass() throws Exception {
- final String hadoopConfDir = System.getenv("HADOOP_CONF_DIR");
- if (hadoopConfDir != null && hadoopConfDir.length() != 0) {
-            fail("HADOOP_CONF_DIR is non-empty in the current shell environment which will very likely cause this test to fail.");
+/**
+ * Serializer used in PhoenixSerDe and PhoenixRecordUpdater to produce Writable.
+ */
+public class SpecificPhoenixSeriazer {
Review comment:
This is a pretty awkward name.
The diff seems to be a bit confused, but if this is a new class, try to find a
better name for it.
##########
File path: phoenix-hive-base/pom.xml
##########
@@ -195,101 +180,75 @@
</exclusion>
</exclusions>
</dependency>
- <dependency>
- <groupId>org.eclipse.jetty</groupId>
- <artifactId>jetty-util</artifactId>
- <scope>test</scope>
- <version>${jetty.version}</version>
- </dependency>
- <dependency>
- <groupId>org.eclipse.jetty</groupId>
- <artifactId>jetty-http</artifactId>
- <scope>test</scope>
- <version>${jetty.version}</version>
- </dependency>
- <dependency>
- <groupId>org.eclipse.jetty</groupId>
- <artifactId>jetty-server</artifactId>
- <scope>test</scope>
- <version>${jetty.version}</version>
- </dependency>
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-all</artifactId>
<version>${mockito-all.version}</version>
<scope>test</scope>
</dependency>
- <dependency>
- <groupId>com.google.guava</groupId>
- <artifactId>guava</artifactId>
- <version>19.0</version>
- </dependency>
- <dependency>
- <groupId>org.apache.calcite.avatica</groupId>
- <artifactId>avatica</artifactId>
-      <!-- Overriding the version of Avatica that PQS uses so that Hive will work -->
- <version>${avatica.version}</version>
- <scope>test</scope>
-      <!-- And removing a bunch of dependencies that haven't been shaded in this older
-           Avatica version which conflict with HDFS -->
- <exclusions>
- <exclusion>
- <groupId>org.hsqldb</groupId>
- <artifactId>hsqldb</artifactId>
- </exclusion>
- <exclusion>
- <groupId>com.fasterxml.jackson.core</groupId>
- <artifactId>jackson-databind</artifactId>
- </exclusion>
- <exclusion>
- <groupId>com.fasterxml.jackson.core</groupId>
- <artifactId>jackson-annotations</artifactId>
- </exclusion>
- <exclusion>
- <groupId>com.fasterxml.jackson.core</groupId>
- <artifactId>jackson-core</artifactId>
- </exclusion>
- </exclusions>
- </dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>build-helper-maven-plugin</artifactId>
- </plugin>
- <plugin>
- <groupId>org.apache.maven.plugins</groupId>
- <artifactId>maven-failsafe-plugin</artifactId>
+ <version>3.0.0</version>
<executions>
<execution>
- <id>HBaseManagedTimeTests</id>
+ <id>add-source</id>
+ <phase>generate-sources</phase>
+ <goals>
+ <goal>add-source</goal>
+ </goals>
<configuration>
- <encoding>UTF-8</encoding>
- <forkCount>1</forkCount>
- <runOrder>alphabetical</runOrder>
- <reuseForks>false</reuseForks>
-            <argLine>-enableassertions -Xmx2000m -XX:MaxPermSize=256m
-              -Djava.security.egd=file:/dev/./urandom
-              "-Djava.library.path=${hadoop.library.path}${path.separator}${java.library.path}"
-              -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=./target/
-              -Dorg.apache.hadoop.hbase.shaded.io.netty.packagePrefix=org.apache.hadoop.hbase.shaded.
-            </argLine>
-            <redirectTestOutputToFile>${test.output.tofile}</redirectTestOutputToFile>
-            <testSourceDirectory>${basedir}/src/it/java</testSourceDirectory>
-            <groups>org.apache.phoenix.end2end.HBaseManagedTimeTest</groups>
-            <shutdown>kill</shutdown>
-            <useSystemClassLoader>false</useSystemClassLoader>
+            <sources>
+              <source>${project.parent.basedir}/src/main/java</source>
+            </sources>
+ </sources>
</configuration>
+ </execution>
+ <execution>
+ <id>add-test-source</id>
+ <phase>generate-sources</phase>
<goals>
- <goal>integration-test</goal>
- <goal>verify</goal>
+ <goal>add-test-source</goal>
</goals>
+ <configuration>
Review comment:
Depending on the common module would probably be better.
##########
File path: phoenix-hive-base/pom.xml
##########
@@ -195,101 +180,75 @@
</exclusion>
</exclusions>
</dependency>
- <dependency>
- <groupId>org.eclipse.jetty</groupId>
- <artifactId>jetty-util</artifactId>
- <scope>test</scope>
- <version>${jetty.version}</version>
- </dependency>
- <dependency>
- <groupId>org.eclipse.jetty</groupId>
- <artifactId>jetty-http</artifactId>
- <scope>test</scope>
- <version>${jetty.version}</version>
- </dependency>
- <dependency>
- <groupId>org.eclipse.jetty</groupId>
- <artifactId>jetty-server</artifactId>
- <scope>test</scope>
- <version>${jetty.version}</version>
- </dependency>
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-all</artifactId>
<version>${mockito-all.version}</version>
<scope>test</scope>
</dependency>
- <dependency>
- <groupId>com.google.guava</groupId>
- <artifactId>guava</artifactId>
- <version>19.0</version>
- </dependency>
- <dependency>
- <groupId>org.apache.calcite.avatica</groupId>
- <artifactId>avatica</artifactId>
-      <!-- Overriding the version of Avatica that PQS uses so that Hive will work -->
- <version>${avatica.version}</version>
- <scope>test</scope>
-      <!-- And removing a bunch of dependencies that haven't been shaded in this older
-           Avatica version which conflict with HDFS -->
- <exclusions>
- <exclusion>
- <groupId>org.hsqldb</groupId>
- <artifactId>hsqldb</artifactId>
- </exclusion>
- <exclusion>
- <groupId>com.fasterxml.jackson.core</groupId>
- <artifactId>jackson-databind</artifactId>
- </exclusion>
- <exclusion>
- <groupId>com.fasterxml.jackson.core</groupId>
- <artifactId>jackson-annotations</artifactId>
- </exclusion>
- <exclusion>
- <groupId>com.fasterxml.jackson.core</groupId>
- <artifactId>jackson-core</artifactId>
- </exclusion>
- </exclusions>
- </dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>build-helper-maven-plugin</artifactId>
- </plugin>
- <plugin>
- <groupId>org.apache.maven.plugins</groupId>
- <artifactId>maven-failsafe-plugin</artifactId>
+ <version>3.0.0</version>
<executions>
<execution>
- <id>HBaseManagedTimeTests</id>
+ <id>add-source</id>
+ <phase>generate-sources</phase>
+ <goals>
+ <goal>add-source</goal>
+ </goals>
<configuration>
- <encoding>UTF-8</encoding>
- <forkCount>1</forkCount>
- <runOrder>alphabetical</runOrder>
- <reuseForks>false</reuseForks>
-            <argLine>-enableassertions -Xmx2000m -XX:MaxPermSize=256m
-              -Djava.security.egd=file:/dev/./urandom
-              "-Djava.library.path=${hadoop.library.path}${path.separator}${java.library.path}"
-              -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=./target/
-              -Dorg.apache.hadoop.hbase.shaded.io.netty.packagePrefix=org.apache.hadoop.hbase.shaded.
-            </argLine>
-            <redirectTestOutputToFile>${test.output.tofile}</redirectTestOutputToFile>
-            <testSourceDirectory>${basedir}/src/it/java</testSourceDirectory>
-            <groups>org.apache.phoenix.end2end.HBaseManagedTimeTest</groups>
-            <shutdown>kill</shutdown>
-            <useSystemClassLoader>false</useSystemClassLoader>
+            <sources>
+              <source>${project.parent.basedir}/src/main/java</source>
+            </sources>
+ </sources>
</configuration>
+ </execution>
+ <execution>
+ <id>add-test-source</id>
+ <phase>generate-sources</phase>
<goals>
- <goal>integration-test</goal>
- <goal>verify</goal>
+ <goal>add-test-source</goal>
</goals>
+ <configuration>
+            <sources>
+              <source>${project.parent.basedir}/src/it/java</source>
+              <source>${project.basedir}/src/it/java</source>
+              <source>${project.parent.basedir}/src/test/java</source>
+            </sources>
+ </configuration>
</execution>
</executions>
</plugin>
+ <plugin>
Review comment:
I think that in this case we simply want to depend on -base, instead of
copying stuff from it.
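As a sketch of that direction (the coordinates below are assumed from this PR's module layout, not verified, and it also assumes -base is restructured so its shared code ships as a jar rather than a pom-only parent), the version-specific modules would declare a normal Maven dependency instead of re-adding the base sources with build-helper:

```xml
    <dependency>
      <groupId>org.apache.phoenix</groupId>
      <artifactId>phoenix-hive-base</artifactId>
      <version>${project.version}</version>
    </dependency>
```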
##########
File path: phoenix-hive-base/src/main/java/org/apache/phoenix/hive/PhoenixSerializer.java
##########
@@ -167,6 +168,8 @@ public Writable serialize(Object values, ObjectInspector objInspector, DmlType d
value = ((HiveDecimal) value).bigDecimalValue();
} else if (value instanceof HiveChar) {
value = ((HiveChar) value).getValue().trim();
+ } else if (CompatUtil.DateAndTimestampSupport()){
+ value = SpecificPhoenixSeriazer.GetValue(value);
Review comment:
Fix the typo while you're at it.
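For example, the corrected class might look roughly like this (the `SpecificPhoenixSerializer`/`getValue` names are suggestions only, and the body is a placeholder, not the actual conversion logic):

```java
// Hypothetical rename sketch: "SpecificPhoenixSeriazer.GetValue" would
// become "SpecificPhoenixSerializer.getValue", fixing the typo and using
// lowerCamelCase for the method name.
public final class SpecificPhoenixSerializer {

    private SpecificPhoenixSerializer() {
    }

    // Placeholder for the Hive Date/Timestamp conversion; this sketch
    // simply passes the value through unchanged.
    public static Object getValue(Object value) {
        return value;
    }
}
```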
##########
File path: phoenix-hive-base/pom.xml
##########
@@ -29,42 +29,46 @@
<artifactId>phoenix-connectors</artifactId>
<version>6.0.0-SNAPSHOT</version>
</parent>
- <artifactId>phoenix-hive3</artifactId>
- <name>Phoenix Hive Connector for Phoenix 5</name>
+ <artifactId>phoenix-hive-base</artifactId>
+ <name>Phoenix Hive Connector - Base</name>
+
+ <packaging>pom</packaging>
+ <modules>
+ <module>phoenix4-hive</module>
+ <module>phoenix5-hive</module>
+ </modules>
+
<properties>
<test.tmp.dir>${project.build.directory}/tmp</test.tmp.dir>
- <netty.version>4.1.47.Final</netty.version>
- <phoenix.version>5.1.0-SNAPSHOT</phoenix.version>
- <hbase.version>2.2.4</hbase.version>
- <hadoop.version>3.0.3</hadoop.version>
- <avatica.version>1.12.0</avatica.version>
- <hive3.version>3.1.2</hive3.version>
- <curator.version>4.0.0</curator.version>
+ <netty.version>4.0.52.Final</netty.version>
Review comment:
These dependencies are pretty ancient.
Use the Hive 3 dependency versions if you can.
##########
File path: phoenix-hive-base/src/main/java/org/apache/phoenix/hive/PhoenixSerializer.java
##########
@@ -167,6 +168,8 @@ public Writable serialize(Object values, ObjectInspector objInspector, DmlType d
value = ((HiveDecimal) value).bigDecimalValue();
} else if (value instanceof HiveChar) {
value = ((HiveChar) value).getValue().trim();
+ } else if (CompatUtil.DateAndTimestampSupport()){
Review comment:
Nice solution, but it doesn't follow the Java naming conventions.
Use .isDateAndTimeSupported() instead.
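A sketch of the renamed helper (assuming `CompatUtil` stays a static utility class; the body is a stand-in for the real version check):

```java
// Illustrative only: boolean capability checks conventionally use an
// "is...Supported" style name rather than "DateAndTimestampSupport".
public final class CompatUtil {

    private CompatUtil() {
    }

    // Suggested Java-conventional rename of DateAndTimestampSupport().
    public static boolean isDateAndTimeSupported() {
        // The real implementation would check the Hive/Phoenix version.
        return true;
    }
}
```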
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]