[jira] [Commented] (HIVE-11209) Clean up dependencies in HiveDecimalWritable

2015-07-23 Prasanth Jayachandran (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-11209?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14639235#comment-14639235
 ] 

Prasanth Jayachandran commented on HIVE-11209:
--

I just checked: this is not a problem for the vectorized version, which does not 
call setFromBytes; it will only impact the non-vectorized code path. The 
non-vectorized code path already does many allocations for HiveDecimal, so the 
extra one shouldn't be noticeable. +1
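
To make that trade-off concrete, here is a minimal sketch of what the 
non-vectorized, per-row decimal path already pays, assuming hive-common's 
HiveDecimal is on the classpath; the decodeRow helper is hypothetical and only 
stands in for a setFromBytes-style caller, it is not Hive's actual reader code.

{noformat}
// Illustrative sketch only: roughly the per-value cost on the non-vectorized
// path. "decodeRow" is a hypothetical helper, not Hive's actual reader.
import java.math.BigInteger;

import org.apache.hadoop.hive.common.type.HiveDecimal;

public class DecimalRowCostSketch {
  static HiveDecimal decodeRow(byte[] unscaledBytes, int scale) {
    BigInteger unscaled = new BigInteger(unscaledBytes); // one allocation per value
    return HiveDecimal.create(unscaled, scale);          // at least one more per value
  }

  public static void main(String[] args) {
    byte[] bytes = BigInteger.valueOf(123456).toByteArray();
    System.out.println(decodeRow(bytes, 2)); // prints 1234.56
  }
}
{noformat}

Against that per-value cost, one extra short-lived scratch object per call is 
unlikely to show up in profiles.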

 Clean up dependencies in HiveDecimalWritable
 

 Key: HIVE-11209
 URL: https://issues.apache.org/jira/browse/HIVE-11209
 Project: Hive
  Issue Type: Sub-task
Reporter: Owen O'Malley
Assignee: Owen O'Malley
 Fix For: 2.0.0

 Attachments: HIVE-11209.patch, HIVE-11209.patch, HIVE-11209.patch, 
 HIVE-11209.patch


 Currently HiveDecimalWritable depends on:
 * org.apache.hadoop.hive.serde2.ByteStream
 * org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryUtils
 * org.apache.hadoop.hive.serde2.typeinfo.HiveDecimalUtils
 Since we need HiveDecimalWritable for the decimal VectorizedColumnBatch, 
 breaking these dependencies will improve things.
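
For readers skimming the thread, a minimal sketch of what a serde-free 
write/readFields could look like, using only Hadoop's own WritableUtils. This is 
illustrative, not the attached patch, and the field layout (two's-complement 
unscaled bytes plus a scale) is an assumption made for the example.

{noformat}
// A hedged sketch of a decimal Writable with no serde2 dependencies.
// Not the HIVE-11209 patch; field names and layout are hypothetical.
import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;

import org.apache.hadoop.io.Writable;
import org.apache.hadoop.io.WritableUtils;

public class DecimalWritableSketch implements Writable {
  private byte[] unscaledBytes = new byte[0]; // two's-complement unscaled value
  private int scale;

  @Override
  public void write(DataOutput out) throws IOException {
    WritableUtils.writeVInt(out, scale);
    WritableUtils.writeVInt(out, unscaledBytes.length);
    out.write(unscaledBytes, 0, unscaledBytes.length);
  }

  @Override
  public void readFields(DataInput in) throws IOException {
    scale = WritableUtils.readVInt(in);
    int len = WritableUtils.readVInt(in);
    // Reuse the buffer when the length matches; otherwise reallocate.
    if (unscaledBytes.length != len) {
      unscaledBytes = new byte[len];
    }
    in.readFully(unscaledBytes, 0, len);
  }
}
{noformat}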



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (HIVE-11209) Clean up dependencies in HiveDecimalWritable

2015-07-22 Owen O'Malley (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-11209?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14637099#comment-14637099
 ] 

Owen O'Malley commented on HIVE-11209:
--

For most types, it would be a problem. With decimal, we are doing so many 
allocations in the inner loop that this won't be noticeable. We really need to 
fix, or better yet reimplement, Decimal128 to get good performance on decimal 
columns.

If you really want me to fix this instance, how about a thread local? Hive's 
overuse of static caches has been a huge source of problems.
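
A hedged sketch of the thread-local idea: one scratch holder per thread, so the 
inner loop allocates nothing and nothing is shared across threads. VIntHolder 
below is a stand-in for a reusable VInt-style scratch object, not Hive's actual 
class.

{noformat}
// Sketch of per-thread scratch reuse instead of a static cache.
// "VIntHolder" is a hypothetical stand-in, not Hive's LazyBinaryUtils.VInt.
public class VIntScratch {
  // Mutable holder for a decoded variable-length int.
  static final class VIntHolder {
    int value;   // decoded value
    int length;  // number of bytes consumed
  }

  // One holder per thread: no per-call allocation, no cross-thread sharing.
  private static final ThreadLocal<VIntHolder> SCRATCH =
      ThreadLocal.withInitial(VIntHolder::new);

  static VIntHolder scratch() {
    return SCRATCH.get();
  }
}
{noformat}

The design point is that a ThreadLocal gives reuse without the correctness and 
leak problems a shared static cache can introduce.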



[jira] [Commented] (HIVE-11209) Clean up dependencies in HiveDecimalWritable

2015-07-21 Prasanth Jayachandran (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-11209?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14636017#comment-14636017
 ] 

Prasanth Jayachandran commented on HIVE-11209:
--

The VInt object is no longer reusable with this patch. Won't VInt allocation in 
the inner loop hurt performance?
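
To illustrate the concern, a hedged sketch of the two inner-loop patterns: one 
reusable holder for the whole loop versus a fresh holder per element. The Holder 
type and decodeVInt helper are hypothetical stand-ins, not Hive's API.

{noformat}
// Illustrative only: reusable scratch object vs. per-element allocation.
public class InnerLoopSketch {
  static final class Holder { int value; int length; }

  // Toy decoder that writes into a caller-supplied holder (single-byte values).
  static void decodeVInt(byte[] buf, int offset, Holder out) {
    out.value = buf[offset];
    out.length = 1;
  }

  // One allocation for the entire loop.
  static long sumReusing(byte[] buf, int count) {
    Holder h = new Holder();
    long sum = 0;
    int pos = 0;
    for (int i = 0; i < count; i++) {
      decodeVInt(buf, pos, h);
      sum += h.value;
      pos += h.length;
    }
    return sum;
  }

  // One allocation per element; this is the pattern the comment worries about.
  static long sumAllocating(byte[] buf, int count) {
    long sum = 0;
    int pos = 0;
    for (int i = 0; i < count; i++) {
      Holder h = new Holder();
      decodeVInt(buf, pos, h);
      sum += h.value;
      pos += h.length;
    }
    return sum;
  }
}
{noformat}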



[jira] [Commented] (HIVE-11209) Clean up dependencies in HiveDecimalWritable

2015-07-17 Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-11209?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14632168#comment-14632168
 ] 

Hive QA commented on HIVE-11209:




{color:red}Overall{color}: -1 at least one test failed

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12745893/HIVE-11209.patch

{color:red}ERROR:{color} -1 due to 1 failed/errored test(s), 9226 tests executed
*Failed tests:*
{noformat}
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_cbo_rp_join0
{noformat}

Test results: 
http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-TRUNK-Build/4643/testReport
Console output: 
http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-TRUNK-Build/4643/console
Test logs: 
http://ec2-174-129-184-35.compute-1.amazonaws.com/logs/PreCommit-HIVE-TRUNK-Build-4643/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 1 tests failed
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12745893 - PreCommit-HIVE-TRUNK-Build



[jira] [Commented] (HIVE-11209) Clean up dependencies in HiveDecimalWritable

2015-07-14 Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-11209?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14626023#comment-14626023
 ] 

Hive QA commented on HIVE-11209:




{color:red}Overall{color}: -1 at least one test failed

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12745188/HIVE-11209.patch

{color:red}ERROR:{color} -1 due to 74 failed/errored test(s), 9190 tests 
executed
*Failed tests:*
{noformat}
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_acid_join
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_compute_stats_decimal
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_decimal_2
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_decimal_3
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_decimal_4
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_decimal_5
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_decimal_join
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_decimal_join2
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_decimal_precision
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_decimal_trailing
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_decimal_udf
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_extrapolate_part_stats_partial_ndv
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_insert_values_dynamic_partitioned
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_insert_values_non_partitioned
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_insert_values_partitioned
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_insert_values_tmp_table
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_orc_analyze
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_orc_merge_incompat2
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_orc_min_max
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_parquet_decimal
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_sum_expr_with_order
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_udaf_collect_set_2
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_update_all_types
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vector_aggregate_9
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vector_between_in
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vector_binary_join_groupby
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vector_cast_constant
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vector_data_types
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vector_decimal_3
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vector_decimal_4
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vector_decimal_5
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vector_decimal_aggregate
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vector_decimal_mapjoin
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vector_decimal_precision
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vector_decimal_round
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vector_decimal_round_2
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vector_decimal_trailing
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vector_decimal_udf
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_vector_reduce_groupby_decimal
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_windowing_rank
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver_dynamic_partition_pruning_2
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver_hybridgrace_hashjoin_1
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver_insert_values_dynamic_partitioned
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver_insert_values_non_partitioned
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver_insert_values_tmp_table
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver_mapjoin_decimal
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver_orc_analyze
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver_orc_merge_incompat2
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver_tez_union_decimal
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver_update_all_types
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver_vector_aggregate_9
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver_vector_between_in
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver_vector_binary_join_groupby
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver_vector_cast_constant
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver_vector_data_types
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver_vector_decimal_3
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver_vector_decimal_4

[jira] [Commented] (HIVE-11209) Clean up dependencies in HiveDecimalWritable

2015-07-10 Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-11209?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14623229#comment-14623229
 ] 

Hive QA commented on HIVE-11209:




{color:red}Overall{color}: -1 no tests executed

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12744806/HIVE-11209.patch

Test results: 
http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-TRUNK-Build/4581/testReport
Console output: 
http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-TRUNK-Build/4581/console
Test logs: 
http://ec2-174-129-184-35.compute-1.amazonaws.com/logs/PreCommit-HIVE-TRUNK-Build-4581/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hive-ptest/working/scratch/source-prep.sh' failed with exit 
status 1 and output '+ [[ -n /usr/java/jdk1.7.0_45-cloudera ]]
+ export JAVA_HOME=/usr/java/jdk1.7.0_45-cloudera
+ JAVA_HOME=/usr/java/jdk1.7.0_45-cloudera
+ export 
PATH=/usr/java/jdk1.7.0_45-cloudera/bin/:/usr/local/apache-maven-3.0.5/bin:/usr/java/jdk1.7.0_45-cloudera/bin:/usr/local/apache-ant-1.9.1/bin:/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/sbin:/home/hiveptest/bin
+ 
PATH=/usr/java/jdk1.7.0_45-cloudera/bin/:/usr/local/apache-maven-3.0.5/bin:/usr/java/jdk1.7.0_45-cloudera/bin:/usr/local/apache-ant-1.9.1/bin:/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/sbin:/home/hiveptest/bin
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'M2_OPTS=-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost 
-Dhttp.proxyPort=3128'
+ M2_OPTS='-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost 
-Dhttp.proxyPort=3128'
+ cd /data/hive-ptest/working/
+ tee /data/hive-ptest/logs/PreCommit-HIVE-TRUNK-Build-4581/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ cd apache-github-source-source
+ git fetch origin
+ git reset --hard HEAD
HEAD is now at e6ea691 HIVE-11206 : CBO (Calcite Return Path): Join translation 
should update all ExprNode recursively (Jesus Camacho Rodriguez via Ashutosh 
Chauhan)
+ git clean -f -d
+ git checkout master
Already on 'master'
+ git reset --hard origin/master
HEAD is now at e6ea691 HIVE-11206 : CBO (Calcite Return Path): Join translation 
should update all ExprNode recursively (Jesus Camacho Rodriguez via Ashutosh 
Chauhan)
+ git merge --ff-only origin/master
Already up-to-date.
+ git gc
+ patchCommandPath=/data/hive-ptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hive-ptest/working/scratch/build.patch
+ [[ -f /data/hive-ptest/working/scratch/build.patch ]]
+ chmod +x /data/hive-ptest/working/scratch/smart-apply-patch.sh
+ /data/hive-ptest/working/scratch/smart-apply-patch.sh 
/data/hive-ptest/working/scratch/build.patch
The patch does not appear to apply with p0, p1, or p2
+ exit 1
'
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12744806 - PreCommit-HIVE-TRUNK-Build



[jira] [Commented] (HIVE-11209) Clean up dependencies in HiveDecimalWritable

2015-07-08 Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-11209?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14619489#comment-14619489
 ] 

Hive QA commented on HIVE-11209:




{color:red}Overall{color}: -1 no tests executed

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12744244/HIVE-11209.patch

Test results: 
http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-TRUNK-Build/4542/testReport
Console output: 
http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-TRUNK-Build/4542/console
Test logs: 
http://ec2-174-129-184-35.compute-1.amazonaws.com/logs/PreCommit-HIVE-TRUNK-Build-4542/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hive-ptest/working/scratch/source-prep.sh' failed with exit 
status 1 and output '+ [[ -n /usr/java/jdk1.7.0_45-cloudera ]]
+ export JAVA_HOME=/usr/java/jdk1.7.0_45-cloudera
+ JAVA_HOME=/usr/java/jdk1.7.0_45-cloudera
+ export 
PATH=/usr/java/jdk1.7.0_45-cloudera/bin/:/usr/local/apache-maven-3.0.5/bin:/usr/java/jdk1.7.0_45-cloudera/bin:/usr/local/apache-ant-1.9.1/bin:/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/sbin:/home/hiveptest/bin
+ 
PATH=/usr/java/jdk1.7.0_45-cloudera/bin/:/usr/local/apache-maven-3.0.5/bin:/usr/java/jdk1.7.0_45-cloudera/bin:/usr/local/apache-ant-1.9.1/bin:/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/sbin:/home/hiveptest/bin
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'M2_OPTS=-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost 
-Dhttp.proxyPort=3128'
+ M2_OPTS='-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost 
-Dhttp.proxyPort=3128'
+ cd /data/hive-ptest/working/
+ tee /data/hive-ptest/logs/PreCommit-HIVE-TRUNK-Build-4542/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ cd apache-github-source-source
+ git fetch origin
From https://github.com/apache/hive
   1e0eace..b92d3dd  llap   - origin/llap
+ git reset --hard HEAD
HEAD is now at f6ea8cb HIVE-10927 : Add number of HMS/HS2 connection metrics 
(Szehon, reviewed by Jimmy Xiang)
+ git clean -f -d
+ git checkout master
Already on 'master'
+ git reset --hard origin/master
HEAD is now at f6ea8cb HIVE-10927 : Add number of HMS/HS2 connection metrics 
(Szehon, reviewed by Jimmy Xiang)
+ git merge --ff-only origin/master
Already up-to-date.
+ git gc
+ patchCommandPath=/data/hive-ptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hive-ptest/working/scratch/build.patch
+ [[ -f /data/hive-ptest/working/scratch/build.patch ]]
+ chmod +x /data/hive-ptest/working/scratch/smart-apply-patch.sh
+ /data/hive-ptest/working/scratch/smart-apply-patch.sh 
/data/hive-ptest/working/scratch/build.patch
The patch does not appear to apply with p0, p1, or p2
+ exit 1
'
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12744244 - PreCommit-HIVE-TRUNK-Build
