Thanks Israel, that helped. There is no error any more, but the table remains
empty with the code below (adapted from
https://stackoverflow.com/questions/63035772/streaming-pipeline-in-dataflow-to-bigtable-python).
*Code*
import datetime

import apache_beam as beam
from apache_beam.io.gcp.bigtableio import WriteToBigTable
from google.cloud.bigtable import row

class CreateRowFn(beam.DoFn):
    def process(self, key):
        direct_row = row.DirectRow(row_key=key)
        direct_row.set_cell(
            "stats_summary",
            b"os_build",
            b"android",
            datetime.datetime.now())
        return [direct_row]

_ = (p
     | beam.Create(["phone#4c410523#20190501", "phone#4c410523#20190502"])
     | beam.ParDo(CreateRowFn())
     | WriteToBigTable(project_id=pipeline_options.bigtable_project,
                       instance_id=pipeline_options.bigtable_instance,
                       table_id=pipeline_options.bigtable_table))
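(For reference, since Bigtable row keys are ultimately byte strings, here is a
minimal stdlib-only sketch of how the composite keys above are built; the
make_row_key helper is hypothetical, not part of the pipeline.)

```python
# Hypothetical helper: build a Bigtable-style composite row key.
# Row keys are bytes; DirectRow encodes str keys as UTF-8 internally,
# so doing it explicitly is equivalent.
def make_row_key(device: str, serial: str, date: str) -> bytes:
    return "#".join([device, serial, date]).encode("utf-8")

keys = [make_row_key("phone", "4c410523", d)
        for d in ("20190501", "20190502")]
# keys[0] == b"phone#4c410523#20190501"
```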
*Issue*
The table is empty (checked with happybase:
check = [(key, row) for key, row in table.scan()]).
Thanks!
On Sat, 9 Oct 2021 at 21:37, Israel Herraiz <[email protected]> wrote:
> You have to write DirectRows to Bigtable, not strings. For more info,
> please see
> https://googleapis.dev/python/bigtable/latest/row.html#google.cloud.bigtable.row.DirectRow
>
--
Pierre