Just wondering if anybody has done, or is aware of, encoding/compressing large images into JPEG2000 format using Hadoop?
We have 1TB+ raw images that need to be compressed into JPEG2000 and other formats. On one beefy machine the compression rate is about 2GB/hour, so it takes more than 500 hours to compress one image. There is also the Matsu project ( http://code.google.com/p/matsu-project/ ), which uses MapReduce to process images ( http://www.cloudera.com/videos/hw10_video_hadoop_image_processing_for_disaster_relief ).
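One way I imagine this could be parallelised (just a sketch, not something I've run at scale): since the raw input is uncompressed and row-major, it can be split into fixed-height strips whose byte ranges are computed up front, so each mapper reads only its own strip and encodes it independently (e.g. by shelling out to a JPEG 2000 codec such as OpenJPEG). The strip geometry and the `strip_offsets` helper below are my own assumptions for illustration:

```python
# Hypothetical sketch (not from the post above): split a raw, uncompressed,
# row-major image with no row padding into horizontal strips that map tasks
# could encode independently. Strip height is an assumed tuning parameter.

def strip_offsets(width, height, bytes_per_pixel, rows_per_strip):
    """Return (byte_offset, byte_length, row_count) for each strip."""
    row_bytes = width * bytes_per_pixel
    strips = []
    row = 0
    while row < height:
        rows = min(rows_per_strip, height - row)  # last strip may be shorter
        strips.append((row * row_bytes, rows * row_bytes, rows))
        row += rows
    return strips

# Example: a 100,000 x 100,000 px, 3 bytes/px image (~30 GB) in 4,096-row strips.
strips = strip_offsets(100_000, 100_000, 3, 4_096)
print(len(strips))                              # number of independent map tasks
print(sum(length for _, length, _ in strips))   # total bytes covered
```

Each `(offset, length)` pair could then be emitted as a map input split, with the reducer (or a post-processing step) assembling the encoded tiles into the final container. How well this works depends on whether the target format lets tiles be encoded independently, which JPEG 2000's tiling model does in principle.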
