>
> On Jun 7, 2016, at 5:17 AM, Siddharth Dawar <siddharthdawa...@gmail.com>
> wrote:
>
> Hi,
>
> I wrote a program which creates Map-Reduce jobs in an iterative fashion as
> follows:
>
>
> while (true) {
>
> JobConf conf2 = new JobConf(getConf(), graphMining.class);
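The quoted loop is cut off here, but an unbounded `while (true)` submission loop is usually bounded by a convergence check between iterations. A minimal sketch of that control pattern (the Hadoop calls are stubbed as comments, and `runIteration` and its halving logic are invented stand-ins so the example is self-contained):

```java
// Sketch: bound an iterative-job loop with a convergence check.
// runIteration() stands in for configuring and running one MapReduce job;
// here it just halves a number so the example runs without a cluster.
public class IterativeDriver {
    static long runIteration(long previous) {
        // In real driver code this would be roughly:
        //   JobConf conf2 = new JobConf(getConf(), graphMining.class);
        //   RunningJob job = JobClient.runJob(conf2);
        //   return job.getCounters().findCounter(...).getValue();
        return previous / 2;  // stand-in for the finished job's counter
    }

    public static void main(String[] args) {
        long current = 1024;
        int iterations = 0;
        while (true) {
            long next = runIteration(current);
            iterations++;
            if (next == current || next == 0) {
                break;  // converged: stop submitting new jobs
            }
            current = next;
        }
        System.out.println("iterations=" + iterations);
    }
}
```

Reading a counter from the finished job is the conventional way to decide whether another iteration is needed, since each job runs to completion before the loop continues.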
> … the working directory. That obviates the need to even work with the
> DistributedCache class in your Mapper or Reducer, since you can just work
> with the file (or path using nio) directly.
>
>
>
> Hope that helps.
>
> -Jeff
>
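To make Jeff's point concrete: once a cached file is symlinked into the task's working directory, a Mapper can open it as an ordinary local file with `java.nio`, without touching the DistributedCache API. A self-contained sketch (a temp directory stands in for the task's working directory, and the file name `bloom_filter` is illustrative):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class WorkingDirRead {
    // Reads a file by its relative name, the way a Mapper reads a
    // cached file symlinked into its working directory.
    static List<String> readCachedFile(Path workingDir, String name) throws IOException {
        return Files.readAllLines(workingDir.resolve(name));
    }

    public static void main(String[] args) throws IOException {
        // Stand-ins for the task working directory and the cached file.
        Path dir = Files.createTempDirectory("task-workdir");
        Files.write(dir.resolve("bloom_filter"), List.of("alpha", "beta"));

        List<String> lines = readCachedFile(dir, "bloom_filter");
        System.out.println(lines.size() + " lines; first=" + lines.get(0));
    }
}
```

In a real Mapper the working directory is simply the process's current directory, so `Path.of("bloom_filter")` would resolve the symlink directly.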
Hi,
I want to use the distributed cache so that my mappers can access data in
Hadoop 2.7.2. In main, I'm using the following code:

String hdfs_path = "hdfs://localhost:9000/bloomfilter";
InputStream in = new BufferedInputStream(
        new FileInputStream("/home/siddharth/Desktop/data/bloom_filter"));
Configuration
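The message is cut off at `Configuration`, but the usual continuation of this pattern is to open the HDFS path for writing and copy the local stream into it (in real code via `FileSystem.get(conf)` and `fs.create(new Path(hdfs_path))`). The byte-copy loop itself is plain Java; a self-contained sketch with local temp files standing in for the local source and the HDFS target:

```java
import java.io.BufferedInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class CopyToCache {
    // Copies an InputStream to an OutputStream: the same loop used when
    // staging a local file (e.g. a bloom filter) into HDFS.
    static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[8192];
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
            total += n;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        Path src = Files.createTempFile("bloom_filter", null);
        Files.write(src, "0101".getBytes());
        // Stand-in for the HDFS target opened with fs.create(...).
        Path dst = Files.createTempFile("bloomfilter-hdfs", null);
        try (InputStream in = new BufferedInputStream(Files.newInputStream(src));
             OutputStream out = Files.newOutputStream(dst)) {
            System.out.println("copied " + copy(in, out) + " bytes");
        }
    }
}
```

Once the file is in HDFS, the driver registers it with `job.addCacheFile(new URI(hdfs_path))` so the framework localizes it for each task.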