Hi,

bigqueryio.Write() infers the schema type from the PCollection you are writing. In your code you are writing the output of pubsubio.Read directly, which is a PCollection of []byte. You need to insert a ParDo that decodes each message and converts it to your schema type before writing to BigQuery. (A schema in the Go SDK is just a struct with exported fields; you can attach additional metadata with struct tags.) You can refer to this for an example: https://github.com/apache/beam/blob/master/sdks/go/examples/cookbook/max/max.go
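To make that concrete, here is a minimal sketch of the decode step. The Row struct, its field names, and the assumption that your Pub/Sub payload is JSON are all hypothetical — adjust them to your actual message format. The pipeline wiring (beam.ParDo between pubsubio.Read and bigqueryio.Write) is shown in comments; the runnable part below is just the decode function itself:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Row is a hypothetical schema type. Exported fields become BigQuery
// columns; struct tags can carry extra metadata for the encoders.
type Row struct {
	Name  string `json:"name" bigquery:"name"`
	Count int64  `json:"count" bigquery:"count"`
}

// decodeRow is the kind of function you would pass to beam.ParDo to sit
// between pubsubio.Read and bigqueryio.Write: it converts the raw
// []byte message into the schema struct.
func decodeRow(msg []byte) (Row, error) {
	var r Row
	err := json.Unmarshal(msg, &r)
	return r, err
}

func main() {
	// In the pipeline this would look roughly like:
	//   col := pubsubio.Read(s, project, *input,
	//       &pubsubio.ReadOptions{Subscription: sub.ID()})
	//   rows := beam.ParDo(s, decodeRow, col)
	//   bigqueryio.Write(s, project, *output, rows)
	r, err := decodeRow([]byte(`{"name":"abc","count":3}`))
	if err != nil {
		panic(err)
	}
	fmt.Printf("%s %d\n", r.Name, r.Count)
}
```

Because decodeRow returns (Row, error), the Beam Go SDK will treat a non-nil error as a failed element rather than silently dropping bad messages.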
On Wed, Sep 14, 2022 at 11:45 AM Abyakta Bal <[email protected]> wrote:

> *correcting this:-*
> *i am using this code: to read from pubsub message*
> col := pubsubio.Read(s, project, *input, &pubsubio.ReadOptions{
>     Subscription: sub.ID()})
> *i am using this code:- write in bigquery*
> bigqueryio.Write(s, project, *output, col)
>
> On Wed, Sep 14, 2022 at 6:39 PM Abyakta Bal <[email protected]> wrote:
>
>> Dear Team,
>> i am trying to write to a bigquery table using pubsub message
>> pull/subscribe. So i am unable to convert *beam.collection []uint8/bytes*
>> to *beam.collection* *schema type in the go apache beam sdk.* i am facing
>> this issue: *schema type must be struct: []uint8*
>>
>> *i am using this code: to read from pubsub message*
>> col := pubsubio.Read(s, project, *input, &pubsubio.ReadOptions{
>>     Subscription: sub.ID()})
>> *i am using this code:- write in bigquery*
>> bigqueryio.Write(s, project, *output, out)
>>
>> I request to all, please help me to solve this problem. if any
>> clarification is required please let me know and arrange a call.
>>
>> Thank you,
>> Abyakta bal
