TheNeuralBit commented on PR #22415:
URL: https://github.com/apache/beam/pull/22415#issuecomment-1197509149

   > > > > Is there anywhere we can test this?
   > > > 
   > > > 
   > > > I couldn't find any tests related to it. Any suggestions?
   > > 
   > > 
   > > How did you detect the issue?
   > 
   > When I was trying to publish metrics to InfluxDB and BigQuery, the metrics were named like this: `INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: RunInferencePytorch_ytorchruninference/pardo(_runinferencedofn)_max_inference_batch_latency_micro_secs Value: 1592367`
   > 
   > https://ci-beam.apache.org/job/beam_Inference_Python_Benchmarks_Dataflow_PR/9/console
   
   Ah ok, thanks. I suppose this is just testing infrastructure so we don't 
need to be too picky.
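   
   For what it's worth (I can't tell from this thread whether it's the exact cause in `load_test_metrics_utils`), a mangled name like `ytorchruninference` with the leading `p` eaten is the classic symptom of `str.lstrip` being used where prefix removal was intended: `lstrip` strips a *set of characters*, not a prefix. A minimal sketch with a hypothetical step name:
   
   ```python
   # Illustration only, not the confirmed bug in load_test_metrics_utils.
   # str.lstrip treats its argument as a set of characters and keeps
   # stripping any leading character found in that set.
   step = "pytorch_total"  # hypothetical metric/step name
   
   # Buggy: strips every leading char in {p, y, t, o, r, c, h, _},
   # eating into the part of the name we wanted to keep.
   print(step.lstrip("pytorch_"))        # -> "al", not "total"
   
   # Correct (Python 3.9+): removes the exact prefix, at most once.
   print(step.removeprefix("pytorch_"))  # -> "total"
   ```
   
   `removeprefix` (or an explicit `startswith` check plus slicing, on older Pythons) avoids the character-set pitfall entirely.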


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
