Hi,

I passed multiple file input paths to a Spark job, assuming the same
behavior as Hadoop's FileInputFormat.  However, the job failed.  The input
path I passed in is:

s3://pixlogstxt/ETL/output/2015/01/28/{00,01}

I expected this to be expanded to

"s3://pixlogstxt/ETL/output/2015/01/28/00,s3://pixlogstxt/ETL/output/2015/01/28/01"

But instead, it seems to have been expanded to

"s3://pixlogstxt/ETL/output/2015/01/28/00","s3://pixlogstxt/ETL/output/2015/01/28/01"

Does anybody know why Spark does this?  Thanks.
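For what it's worth, one workaround I'm considering (a sketch, not anything Spark itself provides) is to expand the braces myself before calling textFile, since Hadoop's FileInputFormat accepts a comma-separated list of paths.  The expand_braces helper below is hypothetical, written just to illustrate the idea:

```python
import itertools
import re


def expand_braces(path):
    """Expand {a,b,...} brace groups in a path into a list of concrete paths.

    Multiple brace groups are expanded as a cartesian product.
    """
    groups = re.findall(r"\{([^{}]*)\}", path)
    if not groups:
        return [path]
    # Replace each brace group with a format placeholder, then fill in
    # every combination of the alternatives.
    template = re.sub(r"\{[^{}]*\}", "{}", path)
    choices = [g.split(",") for g in groups]
    return [template.format(*combo) for combo in itertools.product(*choices)]


# Expand the glob first, then hand the comma-separated result to Spark, e.g.:
#   sc.textFile(",".join(expand_braces("s3://pixlogstxt/ETL/output/2015/01/28/{00,01}")))
paths = expand_braces("s3://pixlogstxt/ETL/output/2015/01/28/{00,01}")
print(",".join(paths))
```

That said, I'd still like to understand why the brace pattern is not handled the way FileInputFormat handles it.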


Ey-Chih Chow



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/multiple-input-file-paths-not-working-tp21500.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
