I've been playing around with /robots.txt, and I think the RMR code might end up generating more browser traffic than we'd like for gadget-to-container communication under some circumstances.
Imagine N gadgets on a page, all trying to pass a message to the parent. Each gadget will create an iframe pointing at <parent>/robots.txt. If the headers on robots.txt don't explicitly allow browser caching, that ends up creating N server round trips. Or am I missing something?
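To make the concern concrete, here's a minimal sketch simulating the load pattern. The function names and URL are illustrative, not Shindig's actual API; it just models whether each relay iframe's fetch of robots.txt is answered by the browser cache or hits the server:

```javascript
// Hypothetical model of N gadgets each creating a relay iframe
// pointing at <parent>/robots.txt. If the response is cacheable,
// only the first load reaches the server; otherwise every gadget
// costs a round trip.
function simulateRelayLoads(numGadgets, cacheable) {
  const relayUrl = "https://parent.example.com/robots.txt"; // illustrative
  const browserCache = new Set();
  let serverHits = 0;
  for (let i = 0; i < numGadgets; i++) {
    if (cacheable && browserCache.has(relayUrl)) {
      continue; // served from the browser cache, no network traffic
    }
    serverHits++; // a full round trip to the parent's server
    if (cacheable) {
      browserCache.add(relayUrl);
    }
  }
  return serverHits;
}

console.log(simulateRelayLoads(10, false)); // 10 — one round trip per gadget
console.log(simulateRelayLoads(10, true));  // 1  — cached after the first load
```

If that model matches what's happening, serving robots.txt with an explicit caching header (e.g. `Cache-Control: public, max-age=86400`) would collapse the N loads down to one.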