[jira] [Commented] (HIVE-14366) Conversion of a Non-ACID table to an ACID table produces non-unique primary keys
[ https://issues.apache.org/jira/browse/HIVE-14366?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15400574#comment-15400574 ]

Lefty Leverenz commented on HIVE-14366:
---------------------------------------

[~ekoifman] or [~saketj], this was committed to master, branch-2.1, and branch-1, so please update the status and fix version/s.

> Conversion of a Non-ACID table to an ACID table produces non-unique primary keys
> --------------------------------------------------------------------------------
>
>                 Key: HIVE-14366
>                 URL: https://issues.apache.org/jira/browse/HIVE-14366
>             Project: Hive
>          Issue Type: Bug
>          Components: Transactions
>    Affects Versions: 1.0.0
>            Reporter: Saket Saurabh
>            Assignee: Saket Saurabh
>            Priority: Blocker
>         Attachments: HIVE-14366.01.patch, HIVE-14366.02.patch
>
> When a Non-ACID table is converted to an ACID table, the primary key consisting of (original transaction id, bucket_id, row_id) is not generated uniquely. Currently, the row_id is set to 0 for most rows, which leads to correctness issues for such tables.
> The quickest way to reproduce is to add the following unit test to ql/src/test/org/apache/hadoop/hive/ql/TestTxnCommands2.java:
> {code:title=ql/src/test/org/apache/hadoop/hive/ql/TestTxnCommands2.java|borderStyle=solid}
> @Test
> public void testOriginalReader() throws Exception {
>   FileSystem fs = FileSystem.get(hiveConf);
>   FileStatus[] status;
>   // 1. Insert five rows into the Non-ACID table.
>   runStatementOnDriver("insert into " + Table.NONACIDORCTBL + "(a,b) values(1,2),(3,4),(5,6),(7,8),(9,10)");
>   // 2. Convert NONACIDORCTBL to an ACID table.
>   runStatementOnDriver("alter table " + Table.NONACIDORCTBL + " SET TBLPROPERTIES ('transactional'='true')");
>   // 3. Perform a major compaction.
>   runStatementOnDriver("alter table " + Table.NONACIDORCTBL + " compact 'MAJOR'");
>   runWorker(hiveConf);
>   // 4. Perform a delete.
>   runStatementOnDriver("delete from " + Table.NONACIDORCTBL + " where a = 1");
>   // 5. A projection should now return (3,4),(5,6),(7,8),(9,10) only, since (1,2) has been deleted.
>   List<String> rs = runStatementOnDriver("select a,b from " + Table.NONACIDORCTBL + " order by a,b");
>   int[][] resultData = new int[][] {{3,4}, {5,6}, {7,8}, {9,10}};
>   Assert.assertEquals(stringifyValues(resultData), rs);
> }
> {code}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
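To illustrate why a constant row_id breaks the reproduction above: if every converted row in a bucket gets (original transaction id, bucket_id, 0), all rows collapse onto one key, so a delete event addressed at a single row matches every row in the bucket. The following is a minimal standalone sketch of that key-collision effect in plain Java; it is not Hive code, and the key encoding and method names are illustrative only.

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;

public class RowIdUniqueness {

    // Count distinct (origTxnId, bucketId, rowId) keys. Each key is encoded
    // as a string triple; this is a stand-in for Hive's ACID row identifier,
    // not the actual internal representation.
    static int distinctKeyCount(List<String> keys) {
        return new HashSet<>(keys).size();
    }

    public static void main(String[] args) {
        List<String> buggy = new ArrayList<>();  // row_id stuck at 0 (the bug)
        List<String> fixed = new ArrayList<>();  // row_id enumerates rows in the file
        // Five rows from a converted non-ACID file, all in bucket 0,
        // all carrying the same original transaction id 0.
        for (int rowId = 0; rowId < 5; rowId++) {
            buggy.add("txn=0,bucket=0,row=0");
            fixed.add("txn=0,bucket=0,row=" + rowId);
        }
        // With the bug, all five rows share one key, so a delete delta aimed
        // at one row matches all of them; with unique row_ids it matches one.
        System.out.println("buggy distinct keys: " + distinctKeyCount(buggy));
        System.out.println("fixed distinct keys: " + distinctKeyCount(fixed));
    }
}
```

This mirrors the failing test: after the delete of (1,2), a reader that identifies rows by a non-unique key cannot distinguish the deleted row from its neighbors.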
[jira] [Commented] (HIVE-14366) Conversion of a Non-ACID table to an ACID table produces non-unique primary keys
[ https://issues.apache.org/jira/browse/HIVE-14366?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15400404#comment-15400404 ]

Hive QA commented on HIVE-14366:
--------------------------------

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12820850/HIVE-14366.02.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: https://builds.apache.org/job/PreCommit-HIVE-MASTER-Build/691/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-MASTER-Build/691/console
Test logs: http://ec2-204-236-174-241.us-west-1.compute.amazonaws.com/logs/PreCommit-HIVE-MASTER-Build-691/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hive-ptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ [[ -n /usr/java/jdk1.8.0_25 ]]
+ export JAVA_HOME=/usr/java/jdk1.8.0_25
+ JAVA_HOME=/usr/java/jdk1.8.0_25
+ export PATH=/usr/java/jdk1.8.0_25/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ PATH=/usr/java/jdk1.8.0_25/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'M2_OPTS=-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
+ M2_OPTS='-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
+ cd /data/hive-ptest/working/
+ tee /data/hive-ptest/logs/PreCommit-HIVE-MASTER-Build-691/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ cd apache-github-source-source
+ git fetch origin
From https://github.com/apache/hive
   c8648aa..6b0131b  master     -> origin/master
+ git reset --hard HEAD
HEAD is now at c8648aa HIVE-14348: Add tests for alter table exchange partition (Vaibhav Gumashta reviewed by Thejas Nair)
+ git clean -f -d
+ git checkout master
Already on 'master'
Your branch is behind 'origin/master' by 1 commit, and can be fast-forwarded.
  (use "git pull" to update your local branch)
+ git reset --hard origin/master
HEAD is now at 6b0131b Revert "HIVE-14303: CommonJoinOperator.checkAndGenObject should return directly at CLOSE state to avoid NPE if ExecReducer.close is called twice. (Zhihai Xu, reviewed by Xuefu Zhang)"
+ git merge --ff-only origin/master
Already up-to-date.
+ git gc
+ patchCommandPath=/data/hive-ptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hive-ptest/working/scratch/build.patch
+ [[ -f /data/hive-ptest/working/scratch/build.patch ]]
+ chmod +x /data/hive-ptest/working/scratch/smart-apply-patch.sh
+ /data/hive-ptest/working/scratch/smart-apply-patch.sh /data/hive-ptest/working/scratch/build.patch
The patch does not appear to apply with p0, p1, or p2
+ exit 1
'
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12820850 - PreCommit-HIVE-MASTER-Build