Hi,
I am trying to read a CSV file, concurrently process each line, and build the result into a different schema (along with some metadata) that I can write out as a Parquet file. Are builders goroutine-safe? The very loose Go code below shows what I am trying to do. Is this possible, does it make sense, and are there better ways of doing it?

The aim is essentially to improve the performance of reading and processing these files and marshaling them into a different schema.

All feedback is appreciated, thank you.
for csvReader.Next() {
    record := csvReader.Record()
    go func() {
        process(record)
        builder.Append(record.Column(1).somedataetcetc)
    }()
}
Thanks,
Gus