[jira] [Commented] (HIVE-11339) org.apache.hadoop.hive.serde2.io.TimestampWritable.write(DataOutput out) makes incorrect cast

2016-05-02 Thread Ashutosh Chauhan (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-11339?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15266931#comment-15266931
 ] 

Ashutosh Chauhan commented on HIVE-11339:
-

+1

> org.apache.hadoop.hive.serde2.io.TimestampWritable.write(DataOutput out) 
> makes incorrect cast
> -
>
> Key: HIVE-11339
> URL: https://issues.apache.org/jira/browse/HIVE-11339
> Project: Hive
>  Issue Type: Bug
>  Components: Serializers/Deserializers
>Affects Versions: 0.14.0
>Reporter: Arnaud Linz
>Assignee: Zoltan Haindrich
>  Labels: easyfix, newbie
> Attachments: HIVE-11339.patch
>
>
> Hi, it's my first Jira and I don't know how to make patches, so I'll explain 
> the issue in the description as it is rather simple.
> I have a problem serializing "DefaultHCatRecord" with Apache Flink when 
> those records include Timestamps, because of an incorrect class cast in 
> org.apache.hadoop.hive.serde2.io.TimestampWritable.write(DataOutput out). It 
> is implemented using a cast to OutputStream:
> public void write(DataOutput out) throws IOException {
>   write((OutputStream) out);
> }
> But nothing guarantees that a DataOutput object is an OutputStream (and it is 
> not the case in Flink); it should rather be implemented using the same code 
> as write(OutputStream):
> {
>   checkBytes();
>   out.write(currentBytes, offset, getTotalLength());
> }
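
For reference, a minimal sketch of the fix described above: write(DataOutput) delegates to DataOutput.write(byte[], int, int) instead of casting the sink to OutputStream. It reuses the checkBytes(), currentBytes, offset and getTotalLength() members quoted from TimestampWritable; this is a sketch of the approach, not necessarily the exact code in HIVE-11339.patch.

{noformat}
// Sketch: in org.apache.hadoop.hive.serde2.io.TimestampWritable
public void write(DataOutput out) throws IOException {
  // Make sure currentBytes holds the serialized timestamp before writing.
  checkBytes();
  // java.io.DataOutput declares write(byte[], int, int), so no cast to
  // OutputStream is needed and any DataOutput implementation works.
  out.write(currentBytes, offset, getTotalLength());
}
{noformat}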



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (HIVE-11339) org.apache.hadoop.hive.serde2.io.TimestampWritable.write(DataOutput out) makes incorrect cast

2016-05-02 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-11339?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15266479#comment-15266479
 ] 

Hive QA commented on HIVE-11339:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12801031/HIVE-11339.patch

{color:green}SUCCESS:{color} +1 due to 1 test(s) being added or modified.

{color:red}ERROR:{color} -1 due to 42 failed/errored test(s), 9983 tests 
executed
*Failed tests:*
{noformat}
TestHWISessionManager - did not produce a TEST-*.xml file
org.apache.hadoop.hive.cli.TestMiniSparkOnYarnCliDriver.testCliDriver_index_bitmap3
org.apache.hadoop.hive.cli.TestNegativeMinimrCliDriver.testNegativeCliDriver_minimr_broken_pipe
org.apache.hadoop.hive.metastore.TestAuthzApiEmbedAuthorizerInRemote.org.apache.hadoop.hive.metastore.TestAuthzApiEmbedAuthorizerInRemote
org.apache.hadoop.hive.metastore.TestFilterHooks.org.apache.hadoop.hive.metastore.TestFilterHooks
org.apache.hadoop.hive.metastore.TestMetaStoreEndFunctionListener.testEndFunctionListener
org.apache.hadoop.hive.metastore.TestMetaStoreEventListenerOnlyOnCommit.testEventStatus
org.apache.hadoop.hive.metastore.TestMetaStoreInitListener.testMetaStoreInitListener
org.apache.hadoop.hive.metastore.TestMetaStoreMetrics.org.apache.hadoop.hive.metastore.TestMetaStoreMetrics
org.apache.hadoop.hive.metastore.TestPartitionNameWhitelistValidation.testAppendPartitionWithCommas
org.apache.hadoop.hive.metastore.TestPartitionNameWhitelistValidation.testAppendPartitionWithUnicode
org.apache.hadoop.hive.metastore.TestPartitionNameWhitelistValidation.testAppendPartitionWithValidCharacters
org.apache.hadoop.hive.metastore.TestRetryingHMSHandler.testRetryingHMSHandler
org.apache.hadoop.hive.metastore.TestSetUGIOnOnlyServer.testSimpleTable
org.apache.hadoop.hive.ql.security.TestClientSideAuthorizationProvider.testSimplePrivileges
org.apache.hadoop.hive.ql.security.TestExtendedAcls.org.apache.hadoop.hive.ql.security.TestExtendedAcls
org.apache.hadoop.hive.ql.security.TestFolderPermissions.org.apache.hadoop.hive.ql.security.TestFolderPermissions
org.apache.hadoop.hive.ql.security.TestMetastoreAuthorizationProvider.testSimplePrivileges
org.apache.hadoop.hive.ql.security.TestMultiAuthorizationPreEventListener.org.apache.hadoop.hive.ql.security.TestMultiAuthorizationPreEventListener
org.apache.hadoop.hive.ql.security.TestStorageBasedClientSideAuthorizationProvider.testSimplePrivileges
org.apache.hadoop.hive.ql.security.TestStorageBasedMetastoreAuthorizationDrops.testDropPartition
org.apache.hadoop.hive.ql.security.TestStorageBasedMetastoreAuthorizationProvider.testSimplePrivileges
org.apache.hadoop.hive.ql.security.TestStorageBasedMetastoreAuthorizationProviderWithACL.testSimplePrivileges
org.apache.hadoop.hive.ql.security.TestStorageBasedMetastoreAuthorizationReads.testReadDbSuccess
org.apache.hadoop.hive.ql.security.TestStorageBasedMetastoreAuthorizationReads.testReadTableFailure
org.apache.hadoop.hive.thrift.TestHadoopAuthBridge23.testDelegationTokenSharedStore
org.apache.hadoop.hive.thrift.TestHadoopAuthBridge23.testMetastoreProxyUser
org.apache.hadoop.hive.thrift.TestHadoopAuthBridge23.testSaslWithHiveMetaStore
org.apache.hive.hcatalog.api.TestHCatClient.testBasicDDLCommands
org.apache.hive.hcatalog.api.TestHCatClient.testDatabaseLocation
org.apache.hive.hcatalog.api.TestHCatClient.testDropPartitionsWithPartialSpec
org.apache.hive.hcatalog.api.TestHCatClient.testDropTableException
org.apache.hive.hcatalog.api.TestHCatClient.testGetPartitionsWithPartialSpec
org.apache.hive.hcatalog.api.TestHCatClient.testObjectNotFoundException
org.apache.hive.hcatalog.api.TestHCatClient.testRenameTable
org.apache.hive.hcatalog.api.TestHCatClient.testReplicationTaskIter
org.apache.hive.hcatalog.api.TestHCatClient.testTransportFailure
org.apache.hive.hcatalog.api.repl.commands.TestCommands.org.apache.hive.hcatalog.api.repl.commands.TestCommands
org.apache.hive.hcatalog.listener.TestDbNotificationListener.cleanupNotifs
org.apache.hive.hcatalog.listener.TestDbNotificationListener.dropDatabase
org.apache.hive.jdbc.TestSSL.testSSLFetchHttp
org.apache.hive.minikdc.TestJdbcWithDBTokenStore.testNegativeTokenAuth
{noformat}

Test results: 
http://ec2-54-177-240-2.us-west-1.compute.amazonaws.com/job/PreCommit-HIVE-MASTER-Build/150/testReport
Console output: 
http://ec2-54-177-240-2.us-west-1.compute.amazonaws.com/job/PreCommit-HIVE-MASTER-Build/150/console
Test logs: 
http://ec2-50-18-27-0.us-west-1.compute.amazonaws.com/logs/PreCommit-HIVE-MASTER-Build-150/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 42 tests failed
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12801031 - PreCommit-HIVE-MASTER-Build

[jira] [Commented] (HIVE-11339) org.apache.hadoop.hive.serde2.io.TimestampWritable.write(DataOutput out) makes incorrect cast

2016-04-29 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-11339?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15265086#comment-15265086
 ] 

Hive QA commented on HIVE-11339:




Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12801031/HIVE-11339.patch

{color:green}SUCCESS:{color} +1 due to 1 test(s) being added or modified.

{color:red}ERROR:{color} -1 due to 23 failed/errored test(s), 10003 tests 
executed
*Failed tests:*
{noformat}
TestHWISessionManager - did not produce a TEST-*.xml file
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_groupby1_limit
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_nomore_ambiguous_table_col
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_regexp_extract
org.apache.hadoop.hive.cli.TestMiniSparkOnYarnCliDriver.testCliDriver_index_bitmap3
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_clustern3
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_clustern4
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_nonkey_groupby
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_selectDistinctStarNeg_2
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_subquery_shared_alias
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_udtf_not_supported1
org.apache.hadoop.hive.metastore.TestAuthzApiEmbedAuthorizerInRemote.org.apache.hadoop.hive.metastore.TestAuthzApiEmbedAuthorizerInRemote
org.apache.hadoop.hive.metastore.TestFilterHooks.org.apache.hadoop.hive.metastore.TestFilterHooks
org.apache.hadoop.hive.metastore.TestHiveMetaStoreGetMetaConf.testGetMetaConfDefault
org.apache.hadoop.hive.metastore.TestMetaStoreAuthorization.testMetaStoreAuthorization
org.apache.hadoop.hive.metastore.TestRetryingHMSHandler.testRetryingHMSHandler
org.apache.hadoop.hive.ql.security.TestClientSideAuthorizationProvider.testSimplePrivileges
org.apache.hadoop.hive.ql.security.TestExtendedAcls.org.apache.hadoop.hive.ql.security.TestExtendedAcls
org.apache.hadoop.hive.thrift.TestHadoopAuthBridge23.testDelegationTokenSharedStore
org.apache.hadoop.hive.thrift.TestHadoopAuthBridge23.testMetastoreProxyUser
org.apache.hadoop.hive.thrift.TestHadoopAuthBridge23.testSaslWithHiveMetaStore
org.apache.hive.hcatalog.listener.TestDbNotificationListener.dropDatabase
org.apache.hive.minikdc.TestJdbcWithDBTokenStore.testNegativeTokenAuth
{noformat}

Test results: 
http://ec2-54-177-240-2.us-west-1.compute.amazonaws.com/job/PreCommit-HIVE-MASTER-Build/122/testReport
Console output: 
http://ec2-54-177-240-2.us-west-1.compute.amazonaws.com/job/PreCommit-HIVE-MASTER-Build/122/console
Test logs: 
http://ec2-50-18-27-0.us-west-1.compute.amazonaws.com/logs/PreCommit-HIVE-MASTER-Build-122/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 23 tests failed
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12801031 - PreCommit-HIVE-MASTER-Build

> org.apache.hadoop.hive.serde2.io.TimestampWritable.write(DataOutput out) 
> makes incorrect cast
> -
>
> Key: HIVE-11339
> URL: https://issues.apache.org/jira/browse/HIVE-11339
> Project: Hive
>  Issue Type: Bug
>  Components: Serializers/Deserializers
>Affects Versions: 0.14.0
>Reporter: Arnaud Linz
>Assignee: Zoltan Haindrich
>  Labels: easyfix, newbie
> Attachments: HIVE-11339.patch
>
>
> Hi, it's my first Jira and I don't know how to make patches, so I'll explain 
> the issue in the description as it is rather simple.
> I have a problem serializing "DefaultHCatRecord" with Apache Flink when 
> those records include Timestamps, because of an incorrect class cast in 
> org.apache.hadoop.hive.serde2.io.TimestampWritable.write(DataOutput out). It 
> is implemented using a cast to OutputStream:
> public void write(DataOutput out) throws IOException {
>   write((OutputStream) out);
> }
> But nothing guarantees that a DataOutput object is an OutputStream (and it is 
> not the case in Flink); it should rather be implemented using the same code 
> as write(OutputStream):
> {
>   checkBytes();
>   out.write(currentBytes, offset, getTotalLength());
> }



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (HIVE-11339) org.apache.hadoop.hive.serde2.io.TimestampWritable.write(DataOutput out) makes incorrect cast

2016-04-28 Thread Arnaud Linz (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-11339?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15262393#comment-15262393
 ] 

Arnaud Linz commented on HIVE-11339:


Ok for me.

> org.apache.hadoop.hive.serde2.io.TimestampWritable.write(DataOutput out) 
> makes incorrect cast
> -
>
> Key: HIVE-11339
> URL: https://issues.apache.org/jira/browse/HIVE-11339
> Project: Hive
>  Issue Type: Bug
>  Components: Serializers/Deserializers
>Affects Versions: 0.14.0
>Reporter: Arnaud Linz
>Assignee: Zoltan Haindrich
>  Labels: easyfix, newbie
> Attachments: HIVE-11339.patch
>
>
> Hi, it's my first Jira and I don't know how to make patches, so I'll explain 
> the issue in the description as it is rather simple.
> I have a problem serializing "DefaultHCatRecord" with Apache Flink when 
> those records include Timestamps, because of an incorrect class cast in 
> org.apache.hadoop.hive.serde2.io.TimestampWritable.write(DataOutput out). It 
> is implemented using a cast to OutputStream:
> public void write(DataOutput out) throws IOException {
>   write((OutputStream) out);
> }
> But nothing guarantees that a DataOutput object is an OutputStream (and it is 
> not the case in Flink); it should rather be implemented using the same code 
> as write(OutputStream):
> {
>   checkBytes();
>   out.write(currentBytes, offset, getTotalLength());
> }
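
To illustrate why the unconditional cast is unsafe, here is a small standalone demo (my own example, not part of the patch): java.io.RandomAccessFile implements DataOutput but does not extend OutputStream, so it stands in for the kind of DataOutput that Flink passes to write().

{noformat}
import java.io.DataOutput;
import java.io.File;
import java.io.IOException;
import java.io.OutputStream;
import java.io.RandomAccessFile;

/** Demonstrates that a java.io.DataOutput need not be an OutputStream. */
public class DataOutputCastDemo {
  public static void main(String[] args) throws IOException {
    File tmp = File.createTempFile("hive-11339-demo", ".bin");
    tmp.deleteOnExit();
    try (RandomAccessFile raf = new RandomAccessFile(tmp, "rw")) {
      DataOutput out = raf;
      // Writing through the DataOutput interface works fine.
      out.write(new byte[] {1, 2, 3}, 0, 3);
      // The cast used by the old write(DataOutput) fails at runtime here,
      // because RandomAccessFile is not an OutputStream.
      try {
        OutputStream os = (OutputStream) out;
        os.write(4);
      } catch (ClassCastException e) {
        System.out.println("Cast failed as expected: " + e);
      }
    }
  }
}
{noformat}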



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)