Hi Jef, I have a C program that processes very large data files which are compressed, so the program has to have full control of how each file is processed. However, the input data can be broken down into chunks, and a separate (distributed) process can be run for each chunk. That is what I am doing now, but I am doing it manually at this time.
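Roughly, what I run by hand today looks like the lines below, one invocation per chunk on whichever machine is free (the program name, chunk file names and options are just placeholders for illustration):

    # one run of my C program per compressed chunk, started manually
    ./process_data --input chunk_0001.gz --output results_0001.dat
    ./process_data --input chunk_0002.gz --output results_0002.dat
    # ... and so on for every chunk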
I am looking to use a distributed system like Hadoop to do this, so that it controls the scheduling and all those great things I have read about Hadoop. I was wondering whether I can have Hadoop run a batch file (.bat on Windows or .sh on Linux) for each chunk, and I would also like to run this in virtual machines.

Thanks,
Michael
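P.S. To make the question concrete, the kind of wrapper script I would want Hadoop to run once per chunk would be something like the sketch below (the script name, program name and file names are made-up placeholders, not anything Hadoop-specific):

    #!/bin/sh
    # process_chunk.sh -- hypothetical wrapper around my C program.
    # Whatever schedules the job would pass in the chunk file to work on.
    CHUNK="$1"
    OUT="results_$(basename "$CHUNK" .gz).dat"
    # the C binary does the real work and needs the whole chunk file
    ./process_data --input "$CHUNK" --output "$OUT"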
