[ https://issues.apache.org/jira/browse/BEAM-13162?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sergei Lilichenko updated BEAM-13162:
-------------------------------------
    Description: 
Many BigQuery Storage API errors are logged at ERROR level even though they are 
expected and handled by Beam's transform. They should be suppressed, or logged 
at DEBUG/INFO level, to differentiate them from abnormal behavior.

Example:

{
  "jsonPayload": {
    "message": "Got error io.grpc.StatusRuntimeException: ALREADY_EXISTS: The offset is within stream, expected offset 52125, received 51264 Entity: projects/event-processing-demo/datasets/bigquery_io/tables/events/streams/CiQ2MmZlOTFjNS0wMDAwLTIzNTItOWMxYS01ODI0MjlhOWRiOGM closing projects/event-processing-demo/datasets/bigquery_io/tables/events/streams/CiQ2MmZlOTFjNS0wMDAwLTIzNTItOWMxYS01ODI0MjlhOWRiOGM",
    "step": "Save Rows to BigQuery/StorageApiLoads/StorageApiWriteSharded/Write Records",
    "worker": "data-processing-streaming-10250846-kps9-harness-hnj3",
    "stage": "P6",
    "job": "2021-10-25_08_46_20-4863073108555028756",
    "thread": "258",
    "work": "3800026c09938fac-5cf2f68daacdd",
    "logger": "org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritesShardedRecords"
  },
  "resource": {
    "type": "dataflow_step",
    "labels": {
      "project_id": "event-processing-demo",
      "step_id": "Save Rows to BigQuery/StorageApiLoads/StorageApiWriteSharded/Write Records",
      "job_id": "2021-10-25_08_46_20-4863073108555028756",
      "region": "us-central1",
      "job_name": "data-processing-streaming-storage-write-api-200-2"
    }
  },
  "timestamp": "2021-10-25T15:57:43.617Z",
  "severity": "ERROR",
  "labels": {
    "compute.googleapis.com/resource_type": "instance",
    "dataflow.googleapis.com/job_name": "data-processing-streaming-storage-write-api-200-2",
    "dataflow.googleapis.com/log_type": "supportability",
    "compute.googleapis.com/resource_id": "6365565973148954518",
    "dataflow.googleapis.com/job_id": "2021-10-25_08_46_20-4863073108555028756",
    "dataflow.googleapis.com/region": "us-central1",
    "compute.googleapis.com/resource_name": "data-processing-streaming-10250846-kps9-harness-hnj3"
  },
  "logName": "projects/event-processing-demo/logs/dataflow.googleapis.com%2Fworker",
  "receiveTimestamp": "2021-10-25T15:57:47.557344849Z"
}

Additional exceptions that fit this category:
 * Got error io.grpc.StatusRuntimeException: FAILED_PRECONDITION: Stream is 
closed due to com.google.api.gax.rpc.UnavailableException: 
io.grpc.StatusRuntimeException: UNAVAILABLE: Connection closed after GOAWAY. 
HTTP/2 error code: NO_ERROR, debug data: server_shutting_down closing
 * Got error com.google.api.gax.rpc.AbortedException: 
io.grpc.StatusRuntimeException: ABORTED: Closing the stream because server is 
restarted. This is expected and client is advised to reconnect.
 * Got error io.grpc.StatusRuntimeException: FAILED_PRECONDITION: Stream is 
closed due to com.google.api.gax.rpc.AbortedException: 
io.grpc.StatusRuntimeException: ABORTED: Closing the stream because server is 
restarted. This is expected and client is advised to reconnect.
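
One way the transform could differentiate these cases is to key the log level off the gRPC status code: statuses it already recovers from (ALREADY_EXISTS, ABORTED, FAILED_PRECONDITION, UNAVAILABLE) go to INFO, everything else stays at ERROR. The sketch below is only illustrative; the class and method names are hypothetical, not the existing StorageApiWritesShardedRecords code.

import io.grpc.Status;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

// Hypothetical helper sketching the requested behavior: errors the Storage
// Write API transform already handles are logged at INFO, the rest at ERROR.
class ExpectedErrorLogging {
  private static final Logger LOG = LoggerFactory.getLogger(ExpectedErrorLogging.class);

  // Status codes the write transform recovers from by retrying or reconnecting.
  private static boolean isExpected(Status.Code code) {
    switch (code) {
      case ALREADY_EXISTS:      // offset already within the stream
      case ABORTED:             // server restarted, client advised to reconnect
      case FAILED_PRECONDITION: // stream closed by the service
      case UNAVAILABLE:         // connection closed after GOAWAY
        return true;
      default:
        return false;
    }
  }

  static void logAppendError(Throwable error, String streamName) {
    Status.Code code = Status.fromThrowable(error).getCode();
    if (isExpected(code)) {
      // Handled by the transform; keep it out of the ERROR stream.
      LOG.info("Got expected error {} closing {}", error, streamName);
    } else {
      LOG.error("Got error {} closing {}", error, streamName);
    }
  }
}

As a stopgap, users can lower this logger's level wholesale (for example via Dataflow's --workerLogLevelOverrides option), but that also hides genuinely abnormal errors, which is exactly the ambiguity this issue asks to remove.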

 

  was:
Many BigQuery Storage API errors get logged at ERROR level even though they are 
expected and handled by Beam's transform. They should be suppressed, or 
output at DEBUG/INFO level to differentiate from abnormal behavior.

Example:

{
 "jsonPayload": {
 "message": "Got error io.grpc.StatusRuntimeException: ALREADY_EXISTS: The 
offset is within stream, expected offset 52125, received 51264 Entity: 
projects/event-processing-demo/datasets/bigquery_io/tables/events/streams/CiQ2MmZlOTFjNS0wMDAwLTIzNTItOWMxYS01ODI0MjlhOWRiOGM
 closing 
projects/event-processing-demo/datasets/bigquery_io/tables/events/streams/CiQ2MmZlOTFjNS0wMDAwLTIzNTItOWMxYS01ODI0MjlhOWRiOGM",
 "step": "Save Rows to BigQuery/StorageApiLoads/StorageApiWriteSharded/Write 
Records",
 "worker": "data-processing-streaming-10250846-kps9-harness-hnj3",
 "stage": "P6",
 "job": "2021-10-25_08_46_20-4863073108555028756",
 "thread": "258",
 "work": "3800026c09938fac-5cf2f68daacdd",
 "logger": "org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritesShardedRecords"
 },
 "resource": {
 "type": "dataflow_step",
 "labels": {
 "project_id": "event-processing-demo",
 "step_id": "Save Rows to BigQuery/StorageApiLoads/StorageApiWriteSharded/Write 
Records",
 "job_id": "2021-10-25_08_46_20-4863073108555028756",
 "region": "us-central1",
 "job_name": "data-processing-streaming-storage-write-api-200-2"
 }
 },
 "timestamp": "2021-10-25T15:57:43.617Z",
 "severity": "ERROR",
 "labels": {
 "compute.googleapis.com/resource_type": "instance",
 "dataflow.googleapis.com/job_name": 
"data-processing-streaming-storage-write-api-200-2",
 "dataflow.googleapis.com/log_type": "supportability",
 "compute.googleapis.com/resource_id": "6365565973148954518",
 "dataflow.googleapis.com/job_id": "2021-10-25_08_46_20-4863073108555028756",
 "dataflow.googleapis.com/region": "us-central1",
 "compute.googleapis.com/resource_name": 
"data-processing-streaming-10250846-kps9-harness-hnj3"
 },
 "logName": 
"projects/event-processing-demo/logs/dataflow.googleapis.com%2Fworker",
 "receiveTimestamp": "2021-10-25T15:57:47.557344849Z"
}

Additional exceptions that fit this category:
 * Got error io.grpc.StatusRuntimeException: FAILED_PRECONDITION: Stream is 
closed due to com.google.api.gax.rpc.UnavailableException: 
io.grpc.StatusRuntimeException: UNAVAILABLE: Connection closed after GOAWAY. 
HTTP/2 error code: NO_ERROR, debug data: server_shutting_down closing
 * Got error com.google.api.gax.rpc.AbortedException: 
io.grpc.StatusRuntimeException: ABORTED: Closing the stream because server is 
restarted. This is expected and client is advised to reconnect.
 * Got error io.grpc.StatusRuntimeException: FAILED_PRECONDITION: Stream is 
closed due to com.google.api.gax.rpc.AbortedException: 
io.grpc.StatusRuntimeException: ABORTED: Closing the stream because server is 
restarted. This is expected and client is advised to reconnect.

 


>  BigQueryIO Storage Write API method - suppress logging API errors for known 
> use cases.
> ---------------------------------------------------------------------------------------
>
>                 Key: BEAM-13162
>                 URL: https://issues.apache.org/jira/browse/BEAM-13162
>             Project: Beam
>          Issue Type: New Feature
>          Components: io-java-gcp
>    Affects Versions: 2.33.0
>            Reporter: Sergei Lilichenko
>            Priority: P2
>
> Many BigQuery Storage API errors are logged at ERROR level even though they 
> are expected and handled by Beam's transform. They should be suppressed, 
> or logged at DEBUG/INFO level, to differentiate them from abnormal behavior.
> Example:
> {
>   "jsonPayload": {
>     "message": "Got error io.grpc.StatusRuntimeException: ALREADY_EXISTS: The offset is within stream, expected offset 52125, received 51264 Entity: projects/event-processing-demo/datasets/bigquery_io/tables/events/streams/CiQ2MmZlOTFjNS0wMDAwLTIzNTItOWMxYS01ODI0MjlhOWRiOGM closing projects/event-processing-demo/datasets/bigquery_io/tables/events/streams/CiQ2MmZlOTFjNS0wMDAwLTIzNTItOWMxYS01ODI0MjlhOWRiOGM",
>     "step": "Save Rows to BigQuery/StorageApiLoads/StorageApiWriteSharded/Write Records",
>     "worker": "data-processing-streaming-10250846-kps9-harness-hnj3",
>     "stage": "P6",
>     "job": "2021-10-25_08_46_20-4863073108555028756",
>     "thread": "258",
>     "work": "3800026c09938fac-5cf2f68daacdd",
>     "logger": "org.apache.beam.sdk.io.gcp.bigquery.StorageApiWritesShardedRecords"
>   },
>   "resource": {
>     "type": "dataflow_step",
>     "labels": {
>       "project_id": "event-processing-demo",
>       "step_id": "Save Rows to BigQuery/StorageApiLoads/StorageApiWriteSharded/Write Records",
>       "job_id": "2021-10-25_08_46_20-4863073108555028756",
>       "region": "us-central1",
>       "job_name": "data-processing-streaming-storage-write-api-200-2"
>     }
>   },
>   "timestamp": "2021-10-25T15:57:43.617Z",
>   "severity": "ERROR",
>   "labels": {
>     "compute.googleapis.com/resource_type": "instance",
>     "dataflow.googleapis.com/job_name": "data-processing-streaming-storage-write-api-200-2",
>     "dataflow.googleapis.com/log_type": "supportability",
>     "compute.googleapis.com/resource_id": "6365565973148954518",
>     "dataflow.googleapis.com/job_id": "2021-10-25_08_46_20-4863073108555028756",
>     "dataflow.googleapis.com/region": "us-central1",
>     "compute.googleapis.com/resource_name": "data-processing-streaming-10250846-kps9-harness-hnj3"
>   },
>   "logName": "projects/event-processing-demo/logs/dataflow.googleapis.com%2Fworker",
>   "receiveTimestamp": "2021-10-25T15:57:47.557344849Z"
> }
> Additional exceptions that fit this category:
>  * Got error io.grpc.StatusRuntimeException: FAILED_PRECONDITION: Stream is 
> closed due to com.google.api.gax.rpc.UnavailableException: 
> io.grpc.StatusRuntimeException: UNAVAILABLE: Connection closed after GOAWAY. 
> HTTP/2 error code: NO_ERROR, debug data: server_shutting_down closing
>  * Got error com.google.api.gax.rpc.AbortedException: 
> io.grpc.StatusRuntimeException: ABORTED: Closing the stream because server is 
> restarted. This is expected and client is advised to reconnect.
>  * Got error io.grpc.StatusRuntimeException: FAILED_PRECONDITION: Stream is 
> closed due to com.google.api.gax.rpc.AbortedException: 
> io.grpc.StatusRuntimeException: ABORTED: Closing the stream because server is 
> restarted. This is expected and client is advised to reconnect.
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
