I haven't tried this, but in general:

metrics_dataset = a BigQuery dataset (to publish metrics to)
metrics_table = a BigQuery table name
staging_location = a GCS path (of the form gs://...); also, I think this one should be optional, since 'staging_location' defaults to 'temp_location'
temp_location = a GCS path (of the form gs://...)
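For example, filling in placeholder values (the project, region, bucket, and dataset/table names below are made up, and I haven't run this exact invocation), something along these lines should work. Note that metrics_dataset is a BigQuery dataset name rather than a gs:// path, and I believe --sdk_location is only needed when you want to stage a locally built SDK tarball, so I've left it out:

    python -m apache_beam.io.gcp.bigquery_read_perf_test \
      --test-pipeline-options="
        --runner=TestDataflowRunner
        --project=my-gcp-project
        --region=us-central1
        --staging_location=gs://my-bucket/staging
        --temp_location=gs://my-bucket/temp
        --publish_to_big_query=true
        --metrics_dataset=beam_performance
        --metrics_table=bqio_read_metrics
        --input_dataset=beam_performance
        --input_table=bqio_read_input
        --input_options='{
          \"num_records\": 1024,
          \"key_size\": 1,
          \"value_size\": 1024
        }'"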
Also, note that this command assumes a DataflowRunner, hence you'll have to perform the Dataflow-specific setup mentioned here: https://beam.apache.org/documentation/runners/dataflow/#setup (a sketch of these setup commands follows after the quoted message below).

Thanks,
Cham

On Wed, Dec 16, 2020 at 10:51 AM Ahmet Altay <[email protected]> wrote:

> +Pablo Estrada <[email protected]> +Heejong Lee <[email protected]>
> +Chamikara Jayalath <[email protected]>
>
> On Wed, Dec 16, 2020 at 9:15 AM Faisal Maqsood <[email protected]> wrote:
>
>> Hey everyone,
>> I need some help with running a LoadTest on my local machine. In the file
>> (apache_beam.io.gcp.bigquery_read_perf_test) I found the command to run
>> the LoadTest, but I am not sure about the parameters to be passed to it.
>> Can someone please help me out with this and give an example command?
>>
>> What values should be passed for these parameters?
>>
>> metrics_dataset = ...
>> metrics_table = ...
>> staging_location = ...
>> temp_location = ...
>>
>> Example test run on DataflowRunner:
>>
>> python -m apache_beam.io.gcp.bigquery_read_perf_test \
>>   --test-pipeline-options="
>>     --runner=TestDataflowRunner
>>     --project=...
>>     --region=...
>>     --staging_location=gs://...
>>     --temp_location=gs://...
>>     --sdk_location=.../dist/apache-beam-x.x.x.dev0.tar.gz
>>     --publish_to_big_query=true
>>     --metrics_dataset=gs://...
>>     --metrics_table=...
>>     --input_dataset=...
>>     --input_table=...
>>     --input_options='{
>>       \"num_records\": 1024,
>>       \"key_size\": 1,
>>       \"value_size\": 1024
>>     }'"
>>
>> Thanks in advance :)
>>
>> Faisal Ali
>> Senior Software Engineer
>> Mobile: +92 3433016854
>> Skype: fessimax
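As a quick reference, the Dataflow-specific setup from the page linked above roughly amounts to the following (a sketch only; the project, bucket, and dataset names are the same placeholders as above, and the exact set of services you need to enable may vary):

    # Authenticate and select a project
    gcloud auth login
    gcloud auth application-default login
    gcloud config set project my-gcp-project

    # Enable the services Dataflow jobs typically use
    gcloud services enable dataflow.googleapis.com compute.googleapis.com storage.googleapis.com bigquery.googleapis.com

    # Create a GCS bucket for staging/temp files and a BigQuery dataset for the metrics
    gsutil mb gs://my-bucket
    bq mk my-gcp-project:beam_performance

    # Install Beam with the GCP extras
    pip install 'apache-beam[gcp]'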
