[ https://issues.apache.org/jira/browse/BEAM-13959?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17494065#comment-17494065 ]

Brian Hulette commented on BEAM-13959:
--------------------------------------

Hi [~joelw], I'm trying to get a sense of how serious this issue is and whether 
it's worth starting another RC for 2.37.0. Is this something you've run into in 
production, or only when testing? It seems like it would be uncommon to have a 
field in production with the exact name 'f', but perhaps more common when 
testing.

> Unable to write to BigQuery tables with column named 'f'
> --------------------------------------------------------
>
>                 Key: BEAM-13959
>                 URL: https://issues.apache.org/jira/browse/BEAM-13959
>             Project: Beam
>          Issue Type: Bug
>          Components: io-java-gcp
>    Affects Versions: 2.36.0
>            Reporter: Joel Weierman
>            Assignee: Reuven Lax
>            Priority: P1
>          Time Spent: 1h
>  Remaining Estimate: 0h
>
> When using the BigQuery Storage Write API through the Java Beam SDK (both the 
> latest release 2.35.0 and 2.36.0-SNAPSHOT), there seems to be an issue when 
> converting a TableRow to a Storage API proto if a column is named 'f'. 
> Reproduction steps: a field named 'f' cannot be written to BigQuery; the 
> write fails with the error referenced below. 
> [1]
> {
>   "name": "item3",
>   "type": "RECORD",
>   "mode": "NULLABLE",
>   "fields": [
>     {
>       "name": "data",
>       "mode": "NULLABLE",
>       "type": "RECORD",
>       "fields": [
>         { "mode": "NULLABLE", "name": "a", "type": "FLOAT" },
>         { "mode": "NULLABLE", "name": "b", "type": "FLOAT" },
>         { "mode": "NULLABLE", "name": "c", "type": "FLOAT" },
>         { "mode": "NULLABLE", "name": "d", "type": "FLOAT" },
>         { "mode": "NULLABLE", "name": "e", "type": "FLOAT" },
>         { "mode": "NULLABLE", "name": "f", "type": "FLOAT" }
>       ]
>     }
>   ]
> }
> [2]
> {
>   ...
>   "item3": {
>     "data": {
>       "a": 1.627424812511E12,
>       "b": 3.0,
>       "c": 3.0,
>       "d": 530.0,
>       "e": 675.0
>     }
>   },
>   ...
> }
> The following error occurs:
> Exception in thread "main" org.apache.beam.sdk.Pipeline$PipelineExecutionException: java.lang.IllegalArgumentException: Can not set java.util.List field com.google.api.services.bigquery.model.TableRow.f to java.lang.Double
>   at org.apache.beam.runners.direct.DirectRunner$DirectPipelineResult.waitUntilFinish(DirectRunner.java:373)
>   at org.apache.beam.runners.direct.DirectRunner$DirectPipelineResult.waitUntilFinish(DirectRunner.java:341)
>   at org.apache.beam.runners.direct.DirectRunner.run(DirectRunner.java:218)
>   at org.apache.beam.runners.direct.DirectRunner.run(DirectRunner.java:67)
>   at org.apache.beam.sdk.Pipeline.run(Pipeline.java:323)
>   at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
>   at com.google.cloud.teleport.templates.PubSubToBigQuery.run(PubSubToBigQuery.java:342)
>   at com.google.cloud.teleport.templates.PubSubToBigQuery.main(PubSubToBigQuery.java:223)
> Caused by: java.lang.IllegalArgumentException: Can not set java.util.List field com.google.api.services.bigquery.model.TableRow.f to java.lang.Double
>   at sun.reflect.UnsafeFieldAccessorImpl.throwSetIllegalArgumentException(UnsafeFieldAccessorImpl.java:167)
>   at sun.reflect.UnsafeFieldAccessorImpl.throwSetIllegalArgumentException(UnsafeFieldAccessorImpl.java:171)
>   at sun.reflect.UnsafeObjectFieldAccessorImpl.set(UnsafeObjectFieldAccessorImpl.java:81)
>   at java.lang.reflect.Field.set(Field.java:764)
>   at com.google.api.client.util.FieldInfo.setFieldValue(FieldInfo.java:275)
>   at com.google.api.client.util.FieldInfo.setValue(FieldInfo.java:231)
>   at com.google.api.client.util.GenericData.set(GenericData.java:118)
>   at com.google.api.client.json.GenericJson.set(GenericJson.java:91)
>   at com.google.api.services.bigquery.model.TableRow.set(TableRow.java:64)
>   at com.google.api.services.bigquery.model.TableRow.set(TableRow.java:29)
>   at com.google.api.client.util.GenericData.putAll(GenericData.java:131)
>   at org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto.toProtoValue(TableRowToStorageApiProto.java:206)
>   at org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto.messageValueFromFieldValue(TableRowToStorageApiProto.java:175)
>   at org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto.messageFromTableRow(TableRowToStorageApiProto.java:103)
>   at org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto.toProtoValue(TableRowToStorageApiProto.java:207)
>   at org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto.messageValueFromFieldValue(TableRowToStorageApiProto.java:175)
>   at org.apache.beam.sdk.io.gcp.bigquery.TableRowToStorageApiProto.messageFromTableRow(TableRowToStorageApiProto.java:103)
>   at org.apache.beam.sdk.io.gcp.bigquery.StorageApiDynamicDestinationsTableRow$1.toMessage(StorageApiDynamicDestinationsTableRow.java:95)
>   at org.apache.beam.sdk.io.gcp.bigquery.StorageApiConvertMessages$ConvertMessagesDoFn.processElement(StorageApiConvertMessages.java:106)
> This error does not occur if I leave the write method set to Streaming 
> Inserts.
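
The stack trace points at a name collision: com.google.api.services.bigquery.model.TableRow extends GenericJson, and it declares an actual Java field named `f` (the legacy list of table cells), so a reflective set("f", ...) writes into that declared List field instead of the generic key/value map, and a Double cannot go there. A minimal, self-contained sketch of the mechanism (the Row class below is a hypothetical stand-in for TableRow, not Beam or Google API client code):

```java
import java.lang.reflect.Field;
import java.util.List;

// Minimal model of how com.google.api.client.util.GenericData backs known
// Java fields by reflection: a key that matches a declared field is written
// straight into that field, so the value's type must match the field's type.
public class FieldCollisionDemo {
    // Hypothetical stand-in for TableRow, which declares a List field named
    // "f" (BigQuery's legacy row format stores cells under the key "f").
    public static class Row {
        public List<Object> f; // the declared field named "f"
    }

    public static void main(String[] args) throws Exception {
        Row row = new Row();
        Field field = Row.class.getField("f");
        try {
            // Setting the user's FLOAT column "f" (a Double) collides with
            // the declared List field, like TableRow.set("f", 3.0) does.
            field.set(row, Double.valueOf(3.0));
        } catch (IllegalArgumentException e) {
            System.out.println("collision: " + e.getMessage());
        }
    }
}
```

Any column name that shadows a declared Java field of the row class would hit the same reflective type check; 'f' is simply the name TableRow happens to declare.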



--
This message was sent by Atlassian Jira
(v8.20.1#820001)