I am badly stuck and can't find a way out. I want to change my row key schema while copying data from one table to another. A straight MapReduce copy job won't work because of large row sizes (I get responseTooLarge errors), so I am left with a two-step process: exporting to HDFS files, then importing from those files into the second table. I wrote a custom exporter that changes the row key to newRowKey when doing context.write(newRowKey, result). But when I import these new files into the new table, it fails with this exception from Put: "The row in the recently added ... doesn't match the original one ....".
Is there really no way out for me? Please help, thanks.
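Roughly, this is what my custom exporter's map step looks like (a simplified sketch based on the stock HBase Export mapper; makeNewKey is a placeholder for my actual schema transformation, not the real code):

```java
import java.io.IOException;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapper;

// Sketch of the custom exporter: same shape as HBase's Export mapper,
// except the output key is rewritten before context.write().
public class RowKeyChangingExporter
    extends TableMapper<ImmutableBytesWritable, Result> {

  @Override
  protected void map(ImmutableBytesWritable oldKey, Result result, Context context)
      throws IOException, InterruptedException {
    byte[] newRowKey = makeNewKey(oldKey.get());
    // Note: the Result still contains KeyValues that carry the OLD row key,
    // which I suspect is what the Import-side Put is complaining about.
    context.write(new ImmutableBytesWritable(newRowKey), result);
  }

  // Placeholder for my actual row key schema change.
  private byte[] makeNewKey(byte[] oldKey) {
    return oldKey;
  }
}
```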
