Hi all,
My issue is that every day I receive some JSON data files, and I want to
convert them to Parquet files and save them to HDFS.
The folder layout will look like this:
/my_table_base_floder
/my_table_base_floder/day_2
/my_table_base_floder/day_3
....
where the Parquet files for "day_1" are stored directly in /my_table_base_floder.
Then I run:
sqlContext.createExternalTable("tpc1.customer","hdfs://master1:9000/my_table_base_floder","parquet")
But when I save Parquet files to a subdirectory, for example
/my_table_base_floder/day_2, and refresh the metadata,
Spark doesn't recognize the data in the subdirectory. How can I do this?
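For reference, the daily step I'm describing looks roughly like this (Spark 1.x API as above; the JSON input path is just an example, not my real path):

```scala
// Sketch of the daily workflow described above. The JSON source path
// is hypothetical; the table folder matches the layout in the question.
val df = sqlContext.read.json("hdfs://master1:9000/json_in/day_2")

// Convert to Parquet under a per-day subdirectory of the table folder.
df.write.parquet("hdfs://master1:9000/my_table_base_floder/day_2")

// The external table was created once over the base folder:
//   sqlContext.createExternalTable("tpc1.customer",
//     "hdfs://master1:9000/my_table_base_floder", "parquet")
// After refreshing, Spark still only sees the files directly under the
// base folder, not the ones inside day_2.
sqlContext.refreshTable("tpc1.customer")
```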
2016-10-20
lk_spark