[
https://issues.apache.org/jira/browse/DRILL-3334?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14619610#comment-14619610
]
Hanifi Gunes commented on DRILL-3334:
-------------------------------------
[~mehant] I understand that hash join does not support schema changes on the
joined column. Does it support changing fields on non-join columns? I would
think his second query would be a non-fatal, soft schema change. A sketch of
the failing check follows the stack trace below.
{code}
org.apache.drill.common.exceptions.UserException: SYSTEM ERROR:
IllegalStateException: Failure while reading vector. Expected vector class of
org.apache.drill.exec.vector.NullableIntVector but was holding vector class
org.apache.drill.exec.vector.NullableBigIntVector.
Fragment 0:0
[Error Id: f06dfc1f-9637-4512-8e03-7ce3e95399f2 on 192.168.56.1:31010]
at
org.apache.drill.common.exceptions.UserException$Builder.build(UserException.java:523)
~[classes/:na]
at
org.apache.drill.exec.work.fragment.FragmentExecutor.sendFinalState(FragmentExecutor.java:323)
[classes/:na]
at
org.apache.drill.exec.work.fragment.FragmentExecutor.cleanup(FragmentExecutor.java:178)
[classes/:na]
at
org.apache.drill.exec.work.fragment.FragmentExecutor.run(FragmentExecutor.java:292)
[classes/:na]
at
org.apache.drill.common.SelfCleaningRunnable.run(SelfCleaningRunnable.java:38)
[classes/:na]
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
[na:1.7.0_65]
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
[na:1.7.0_65]
at java.lang.Thread.run(Thread.java:745) [na:1.7.0_65]
Caused by: java.lang.IllegalStateException: Failure while reading vector.
Expected vector class of org.apache.drill.exec.vector.NullableIntVector but was
holding vector class org.apache.drill.exec.vector.NullableBigIntVector.
at
org.apache.drill.exec.record.VectorContainer.getValueAccessorById(VectorContainer.java:241)
~[classes/:na]
at
org.apache.drill.exec.record.AbstractRecordBatch.getValueAccessorById(AbstractRecordBatch.java:199)
~[classes/:na]
at
org.apache.drill.exec.physical.impl.validate.IteratorValidatorBatchIterator.getValueAccessorById(IteratorValidatorBatchIterator.java:110)
~[classes/:na]
at
org.apache.drill.exec.test.generated.HashJoinProbeGen46.doSetup(HashJoinProbeTemplate.java:71)
~[na:na]
at
org.apache.drill.exec.test.generated.HashJoinProbeGen46.executeProbePhase(HashJoinProbeTemplate.java:138)
~[na:na]
at
org.apache.drill.exec.test.generated.HashJoinProbeGen46.probeAndProject(HashJoinProbeTemplate.java:223)
~[na:na]
at
org.apache.drill.exec.physical.impl.join.HashJoinBatch.innerNext(HashJoinBatch.java:234)
~[classes/:na]
at
org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:147)
~[classes/:na]
at
org.apache.drill.exec.physical.impl.validate.IteratorValidatorBatchIterator.next(IteratorValidatorBatchIterator.java:118)
~[classes/:na]
at
org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:105)
~[classes/:na]
at
org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:95)
~[classes/:na]
at
org.apache.drill.exec.record.AbstractSingleRecordBatch.innerNext(AbstractSingleRecordBatch.java:51)
~[classes/:na]
at
org.apache.drill.exec.physical.impl.project.ProjectRecordBatch.innerNext(ProjectRecordBatch.java:129)
~[classes/:na]
at
org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:147)
~[classes/:na]
at
org.apache.drill.exec.physical.impl.validate.IteratorValidatorBatchIterator.next(IteratorValidatorBatchIterator.java:118)
~[classes/:na]
at
org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:105)
~[classes/:na]
at
org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:95)
~[classes/:na]
at
org.apache.drill.exec.record.AbstractSingleRecordBatch.innerNext(AbstractSingleRecordBatch.java:51)
~[classes/:na]
at
org.apache.drill.exec.physical.impl.project.ProjectRecordBatch.innerNext(ProjectRecordBatch.java:129)
~[classes/:na]
at
org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:147)
~[classes/:na]
at
org.apache.drill.exec.physical.impl.validate.IteratorValidatorBatchIterator.next(IteratorValidatorBatchIterator.java:118)
~[classes/:na]
at
org.apache.drill.exec.physical.impl.BaseRootExec.next(BaseRootExec.java:83)
~[classes/:na]
at
org.apache.drill.exec.physical.impl.ScreenCreator$ScreenRoot.innerNext(ScreenCreator.java:79)
~[classes/:na]
at
org.apache.drill.exec.physical.impl.BaseRootExec.next(BaseRootExec.java:73)
~[classes/:na]
at
org.apache.drill.exec.work.fragment.FragmentExecutor$1.run(FragmentExecutor.java:258)
~[classes/:na]
at
org.apache.drill.exec.work.fragment.FragmentExecutor$1.run(FragmentExecutor.java:252)
~[classes/:na]
at java.security.AccessController.doPrivileged(Native Method)
~[na:1.7.0_65]
at javax.security.auth.Subject.doAs(Subject.java:415) ~[na:1.7.0_65]
at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1556)
~[hadoop-common-2.4.1.jar:na]
at
org.apache.drill.exec.work.fragment.FragmentExecutor.run(FragmentExecutor.java:252)
[classes/:na]
at
org.apache.drill.common.SelfCleaningRunnable.run(SelfCleaningRunnable.java:38)
[classes/:na]
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
[na:1.7.0_65]
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
[na:1.7.0_65]
at java.lang.Thread.run(Thread.java:745) [na:1.7.0_65]
... 4 more
{code}
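For reference, here is a minimal, self-contained Java sketch (not Drill's actual
code; the container, field id and stand-in vector classes are hypothetical) of the
kind of class check that VectorContainer.getValueAccessorById performs in the trace
above: the generated probe setup asks for the vector class it was compiled against,
and if the container is now holding a different class the lookup fails hard.
{code}
// Sketch only: illustrates the expected-vs-held vector class check; it is
// not Drill's implementation. All class names below are stand-ins.
import java.util.HashMap;
import java.util.Map;

public class VectorLookupSketch {

  // Stand-ins for Drill's value-vector classes.
  static class ValueVector {}
  static class NullableIntVector extends ValueVector {}
  static class NullableBigIntVector extends ValueVector {}

  private final Map<Integer, ValueVector> vectorsByFieldId = new HashMap<>();

  void put(int fieldId, ValueVector vector) {
    vectorsByFieldId.put(fieldId, vector);
  }

  // The caller passes the vector class it expects for the column; if the
  // container is holding a different class (e.g. after the JSON reader
  // re-typed the column), fail with the same kind of message as above.
  <T extends ValueVector> T getVectorById(int fieldId, Class<T> expectedClass) {
    ValueVector held = vectorsByFieldId.get(fieldId);
    if (held == null || !expectedClass.isAssignableFrom(held.getClass())) {
      throw new IllegalStateException(String.format(
          "Failure while reading vector. Expected vector class of %s "
              + "but was holding vector class %s.",
          expectedClass.getName(),
          held == null ? "null" : held.getClass().getName()));
    }
    return expectedClass.cast(held);
  }

  public static void main(String[] args) {
    VectorLookupSketch container = new VectorLookupSketch();
    // This batch typed the column as BIGINT...
    container.put(0, new NullableBigIntVector());
    // ...but the join probe was set up expecting INT: throws IllegalStateException.
    container.getVectorById(0, NullableIntVector.class);
  }
}
{code}
If the re-typed column is a non-join column (presumably orders.tax here, which is
absent from some documents), this is exactly the mismatch the question above is
about: the probe setup still fails even though the join key itself did not change
type.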
> "java.lang.IllegalStateException: Failure while reading vector.: raised when
> using dynamic schema in JSON
> ----------------------------------------------------------------------------------------------------------
>
> Key: DRILL-3334
> URL: https://issues.apache.org/jira/browse/DRILL-3334
> Project: Apache Drill
> Issue Type: Bug
> Components: Execution - Data Types
> Affects Versions: 1.0.0
> Environment: Single Node running on OSX
> and
> MapR Hadoop SandBox + Drill
> Reporter: Tugdual Grall
> Assignee: Hanifi Gunes
> Fix For: 1.2.0
>
> Attachments: test.zip
>
>
> I have a simple data set based on 3 JSON documents:
> - 1 customer
> - 2 orders
> (I have attached the documents to the JIRA)
> When I run the following queries, which join orders and customers, I can
> trigger an unexpected exception.
> A working query:
> {code}
> SELECT customers.id, orders.total
> FROM dfs.ecommerce.`customers/*.json` customers,
> dfs.ecommerce.`orders/*.json` orders
> WHERE customers.id = orders.cust_id
> AND customers.country = 'FRANCE'
> {code}
> It works since orders.total is present in all orders.
> Now, when I execute the following query (tax is not present in all documents):
> {code}
> SELECT customers.id, orders.tax
> FROM dfs.ecommerce.`customers/*.json` customers,
> dfs.ecommerce.`orders/*.json` orders
> WHERE customers.id = orders.cust_id
> AND customers.country = 'FRANCE'
> {code}
> This query raises the following exception:
> {code}
> org.apache.drill.common.exceptions.UserRemoteException: SYSTEM ERROR:
> java.lang.IllegalStateException: Failure while reading vector. Expected
> vector class of org.apache.drill.exec.vector.NullableIntVector but was
> holding vector class org.apache.drill.exec.vector.NullableBigIntVector.
> Fragment 0:0 [Error Id: a7ad300a-4446-41f3-8b1c-4bb7d1dbfb52 on
> maprdemo:31010]
> {code}
> If you cannot reproduce this with tax, you can try with the field
> orders.cool, or simply move the tax field from one document to the others
> (the field must be present in only 1 document).
> It looks like Drill is losing track of the list of columns present globally.
> Note: if I use a field that does not exist in any document, it works
> (orders.this_is_crazy).
> Note: if I use * instead of a projection, this raises another exception:
> {code}
> org.apache.drill.common.exceptions.UserRemoteException: SYSTEM ERROR:
> org.apache.drill.exec.exception.SchemaChangeException: Hash join does not
> support schema changes Fragment 0:0 [Error Id:
> 0b20d580-37a3-491a-9987-4d04fb6f2d43 on maprdemo:31010]
> {code}