[jira] [Updated] (HIVE-23072) ACID: Can't select from insert-only table with original files and deltas

2020-03-25 Thread Karen Coppage (Jira)


[ https://issues.apache.org/jira/browse/HIVE-23072?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Karen Coppage updated HIVE-23072:
---------------------------------
Attachment: HIVE-23072.02.patch

> ACID: Can't select from insert-only table with original files and deltas
> -------------------------------------------------------------------------
>
>                 Key: HIVE-23072
>                 URL: https://issues.apache.org/jira/browse/HIVE-23072
>             Project: Hive
>          Issue Type: Bug
>            Reporter: Karen Coppage
>            Priority: Major
>         Attachments: HIVE-23072.02.patch
>
>
> NO PRECOMMIT TESTS
>  1. Create a non-transactional table, not stored as ORC (ORC uses a different FileInputFormat implementation).
>  2. Run a couple of inserts. -> creates original files
>  3. Alter the table to make it insert-only.
>  4. Run a couple of inserts. -> creates delta dirs
>  5. Select from the table.
> The attached unit test recreates these steps and results in the error below [1].
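> For reference, a minimal HiveQL sketch of the same steps. The table name mm_nonpart matches the warehouse path in the trace below; the columns, values, and TEXTFILE storage are illustrative assumptions, not copied from the attached test:
> {code:sql}
> -- 1. Non-transactional table, not stored as ORC (assumed schema)
> CREATE TABLE mm_nonpart (a INT, b STRING) STORED AS TEXTFILE;
> -- 2. Inserts into the non-transactional table write "original" files
> --    directly under warehouse/mm_nonpart/
> INSERT INTO mm_nonpart VALUES (1, 'foo');
> INSERT INTO mm_nonpart VALUES (2, 'bar');
> -- 3. Convert the table to insert-only (MM) transactional
> ALTER TABLE mm_nonpart SET TBLPROPERTIES
>   ('transactional'='true', 'transactional_properties'='insert_only');
> -- 4. Inserts now create delta_x_x subdirectories next to the originals
> INSERT INTO mm_nonpart VALUES (3, 'baz');
> INSERT INTO mm_nonpart VALUES (4, 'qux');
> -- 5. Reading the mix of original files and delta dirs fails as in [1]
> SELECT * FROM mm_nonpart;
> {code}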
> Side notes:
> I tried toggling mapreduce.input.fileinputformat.input.dir.recursive, with no success so far, since it interferes with the ability to read, or skip, delta dirs.
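> The "Not a file" error in [1] is thrown by org.apache.hadoop.mapred.FileInputFormat.getSplits when one of the listed inputs is actually a directory (here a delta_x_x dir): by default this InputFormat lists only the top level of each input path. The property above is the standard Hadoop switch for recursive listing and can be toggled per Hive session (a sketch of the experiment, not a fix):
> {code:sql}
> -- Make FileInputFormat recurse into subdirectories of the input paths.
> -- This avoids the "Not a file" error on delta_x_x dirs, but every delta
> -- is then read unconditionally, i.e. we lose control over which delta
> -- dirs should and should not be read.
> set mapreduce.input.fileinputformat.input.dir.recursive=true;
> {code}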
> [1]
> {code:java}
> java.io.IOException: java.io.IOException: Not a file: file:/Users/karencoppage/upstream/hive/itests/hive-unit/target/tmp/org.apache.hadoop.hive.ql.txn.compactor.TestCompactor-1585121669806_289114708/warehouse/mm_nonpart/delta_002_002_
> at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:638)
> at org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:545)
> at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:150)
> at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:880)
> at org.apache.hadoop.hive.ql.reexec.ReExecDriver.getResults(ReExecDriver.java:241)
> at org.apache.hadoop.hive.ql.txn.compactor.TestCompactor.verifyFooBarResult(TestCompactor.java:1102)
> at org.apache.hadoop.hive.ql.txn.compactor.TestCompactor.mmTableOriginalsMinor(TestCompactor.java:942)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
> at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
> at org.apache.hive.common.util.Retry$RetryingStatement.evaluate(Retry.java:61)
> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.junit.runner.JUnitCore.run(JUnitCore.java:160)
> at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:68)
> at com.intellij.rt.execution.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:47)
> at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:242)
> at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:70)
> Caused by: java.io.IOException: Not a file: file:/Users/karencoppage/upstream/hive/itests/hive-unit/target/tmp/org.apache.hadoop.hive.ql.txn.compactor.TestCompactor-1585121669806_289114708/warehouse/mm_nonpart/delta_002_002_
> at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:329)
> at org.apache.hadoop.hive.ql.exec.FetchOperator.generateWrappedSplits(FetchOperator.java:461)
> at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextSplits(FetchOperator.java:439)
> at
>

[jira] [Updated] (HIVE-23072) ACID: Can't select from insert-only table with original files and deltas

2020-03-25 Thread Karen Coppage (Jira)


[ https://issues.apache.org/jira/browse/HIVE-23072?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Karen Coppage updated HIVE-23072:
---------------------------------
Attachment: (was: HIVE-23072.patch)


[jira] [Updated] (HIVE-23072) ACID: Can't select from insert-only table with original files and deltas

2020-03-25 Thread Karen Coppage (Jira)


[ https://issues.apache.org/jira/browse/HIVE-23072?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Karen Coppage updated HIVE-23072:
---------------------------------
Summary: ACID: Can't select from insert-only table with original files and deltas  (was: Can't select from table after minor compaction of MM table with original files)
