I'm not sure whether I should be asking Parquet people or Avro people about 
this.

I'm reading a Parquet file via Avro. The Parquet file was produced by Spark. 
The Avro schema that I generated from the file (by deserializing it as a 
GenericData record and retrieving its schema) uses "record" types that have no 
"namespace" value. Therefore, when I generate Java classes from that Avro 
schema (so that I can deserialize the Parquet file to strongly typed objects), 
the classes end up in the default package. Obviously, that's not very 
desirable.
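
For reference, this is roughly how I'm pulling the schema out of the file 
(paths and class names below are just placeholders):

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.hadoop.fs.Path;
import org.apache.parquet.avro.AvroParquetReader;
import org.apache.parquet.hadoop.ParquetReader;

public class DumpSchema {
    public static void main(String[] args) throws Exception {
        // Read the first record as a GenericData record and grab its schema.
        try (ParquetReader<GenericRecord> reader =
                 AvroParquetReader.<GenericRecord>builder(new Path(args[0])).build()) {
            GenericRecord first = reader.read();
            Schema schema = first.getSchema();
            // The record types come back with no "namespace", so
            // "avro-tools compile schema" puts the generated classes
            // in the default package.
            System.out.println(schema.toString(true));
        }
    }
}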

Has anyone else run into this situation? And is there any way to work around 
it? It seems like I should be able to specify how the record types in the 
Parquet file map to a Java package, in particular so that I can prevent class 
name conflicts.
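
The kind of thing I had in mind is rewriting the schema to force a namespace 
before generating the classes. Something along these lines, as a rough sketch 
only (assumes a recent Avro; the class and method names are made up, and 
nested record types would need the same treatment):

import java.util.ArrayList;
import java.util.List;
import org.apache.avro.Schema;

public class SchemaNamespacer {
    // Copy a record schema, forcing a namespace so that classes generated
    // from it land in a real Java package. Field objects can't be reused
    // once attached to a schema, so they are copied here.
    // Nested record types would need the same treatment (not shown).
    static Schema withNamespace(Schema record, String namespace) {
        List<Schema.Field> fields = new ArrayList<>();
        for (Schema.Field f : record.getFields()) {
            fields.add(new Schema.Field(f.name(), f.schema(), f.doc(), f.defaultVal()));
        }
        return Schema.createRecord(record.getName(), record.getDoc(),
                namespace, record.isError(), fields);
    }
}

But that feels like a hack, so I'd like to know if there's a supported way to 
do this.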

Thanks!
