Hi Kamil,
I did not code or schedule the Spark performance tests, but I can see in the log
(https://builds.apache.org/blue/organizations/jenkins/beam_PerformanceTests_Spark/detail/beam_PerformanceTests_Spark/1628/pipeline)
that it uses PerfKit, and that the job being run is this one:
Hi Etienne,
I was recently playing a lot with BigQuery while working on an anomaly
detection tool, and I noticed that in the DB schema the timestamp is defined as FLOAT.
PerfKit also produces it as a float:
'timestamp': 1524485484.41655,
so the upload passes.
Probably it was defined as a float from the
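For context, Python's `time.time()` returns the Unix epoch as a float with sub-second precision, which would explain a value like the one above ending up in the results row. This is just a minimal illustration of where such a float timestamp can come from, not an excerpt from the PerfKit code:

```python
import time

# time.time() returns seconds since the Unix epoch as a float,
# e.g. 1524485484.41655 -- the same shape as the value in the log.
timestamp = time.time()
print(isinstance(timestamp, float))  # True

# A row built this way uploads cleanly into a FLOAT column:
row = {'timestamp': timestamp}
```

A value like this only fails to load if the BigQuery column type is later changed (e.g. to TIMESTAMP) without migrating the producer.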
Hi guys,
I noticed a failure in the performance tests job for Spark (I did not take a
look at the others): it seems to be related
to a schema update in the BigQuery output.
BigQuery error in load operation: Error processing job
'apache-beam-testing:bqjob_r2527a0e444514f2b_0162f128db2b_1':