I think it should be possible by loading each collection as an RDD and then taking the union of those RDDs.
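A rough sketch of that idea in Java, assuming the mongo-hadoop connector (`MongoInputFormat`) is on the classpath and MongoDB is reachable at the URIs shown; the database and collection names here are illustrative only. Note one caveat for the original question: with mongo-hadoop each collection is addressed by its own `mongo.input.uri`, so you end up with one Hadoop `Configuration` per collection rather than a single shared one.

```java
// Hedged sketch, not a tested implementation: requires spark-core and the
// mongo-hadoop connector on the classpath, plus a running MongoDB instance.
import org.apache.hadoop.conf.Configuration;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.bson.BSONObject;
import com.mongodb.hadoop.MongoInputFormat;

public class MultiCollectionUnion {
    public static void main(String[] args) {
        JavaSparkContext sc = new JavaSparkContext(
                new SparkConf().setAppName("MultiCollectionUnion"));

        // mongo-hadoop reads one collection per input URI, so each collection
        // gets its own Configuration ("mydb", "collectionA/B" are placeholders).
        Configuration confA = new Configuration();
        confA.set("mongo.input.uri", "mongodb://localhost:27017/mydb.collectionA");
        Configuration confB = new Configuration();
        confB.set("mongo.input.uri", "mongodb://localhost:27017/mydb.collectionB");

        // Each call produces an RDD of (ObjectId, BSON document) pairs.
        JavaPairRDD<Object, BSONObject> rddA = sc.newAPIHadoopRDD(
                confA, MongoInputFormat.class, Object.class, BSONObject.class);
        JavaPairRDD<Object, BSONObject> rddB = sc.newAPIHadoopRDD(
                confB, MongoInputFormat.class, Object.class, BSONObject.class);

        // Union yields a single RDD spanning both collections, which can then
        // be filtered, mapped, or aggregated like any other RDD.
        JavaPairRDD<Object, BSONObject> combined = rddA.union(rddB);
        System.out.println("total documents: " + combined.count());

        sc.stop();
    }
}
```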
Regards,
Sandeep Giri
www.KnowBigData.com

On Fri, Sep 11, 2015 at 3:40 PM, Mishra, Abhishek <abhishek.mis...@xerox.com> wrote:

> Anything using Spark RDDs?
>
> Abhishek
>
> *From:* Sandeep Giri [mailto:sand...@knowbigdata.com]
> *Sent:* Friday, September 11, 2015 3:19 PM
> *To:* Mishra, Abhishek; u...@spark.apache.org; dev@spark.apache.org
> *Subject:* Re: MongoDB and Spark
>
> Use map-reduce.
>
> On Fri, Sep 11, 2015, 14:32 Mishra, Abhishek <abhishek.mis...@xerox.com> wrote:
>
> Hello,
>
> Is there any way to query multiple collections from MongoDB using Spark
> and Java? And I want to create only one Configuration object. Please help
> if anyone has something regarding this.
>
> Thank You,
> Abhishek