When all is said and done, there is enough compute power (heck, River is a distributed system); it all comes down to trusting code. We have the tools to sign code, and we have the tools to check its integrity. If we can clarify proxy verification on the server side, we've got it made. How do we make sure the client side isn't compromised?
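As a concrete sketch of "the tools to check integrity" using today's standard Java APIs (the class name and rejection policy here are hypothetical, not River's actual verifier): open the jar in verifying mode, read each entry fully so its digests are checked, and require every substantive entry to carry at least one code signer.

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.Enumeration;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;

public class JarAudit {

    /** Returns true only if every non-directory, non-metadata entry is signed. */
    public static boolean fullySigned(String path) throws IOException {
        try (JarFile jar = new JarFile(path, true)) { // true => verify digests
            Enumeration<JarEntry> entries = jar.entries();
            while (entries.hasMoreElements()) {
                JarEntry entry = entries.nextElement();
                if (entry.isDirectory() || entry.getName().startsWith("META-INF/")) {
                    continue;
                }
                // An entry must be read completely before its signers are
                // available; a tampered digest throws SecurityException here.
                try (InputStream in = jar.getInputStream(entry)) {
                    in.transferTo(OutputStream.nullOutputStream());
                }
                if (entry.getCodeSigners() == null) {
                    return false; // unsigned entry: reject the whole bundle
                }
            }
        }
        return true;
    }
}
```

An unsigned or partially signed jar is rejected outright; deciding whether the signers are *trusted* auditors would be a separate policy step.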

If only publicly audited, signed code is permitted for distribution, and that code is granted only the minimal set of permissions declared in a jar bundle format, then what issues remain in executing it?
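In today's Java policy syntax, such a minimal grant might look something like the following (the keystore location, the "audited" signer alias, and the permission itself are all hypothetical; a jar bundle format would presumably carry an equivalent declaration inside the bundle):

```
// Grant audited, signed code only what it declares it needs.
keystore "file:${user.home}/.river/audit-keystore";

grant signedBy "audited" {
    // e.g. a proxy that only needs to connect back to its service
    permission java.net.SocketPermission "*:1024-65535", "connect";
};
```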

The focus of code auditing would be on the serialization interface, the required Permissions, and defensive construction, that is, proper encapsulation.
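A minimal sketch of what an auditor would look for on the defensive-construction side (the class and its fields are hypothetical): a final class with private final state that re-validates its invariants in readObject, so a forged byte stream cannot produce an invalid instance.

```java
import java.io.IOException;
import java.io.InvalidObjectException;
import java.io.ObjectInputStream;
import java.io.Serializable;

public final class Endpoint implements Serializable {
    private static final long serialVersionUID = 1L;

    private final String host;
    private final int port;

    public Endpoint(String host, int port) {
        this.host = host;
        this.port = port;
        validate();
    }

    private void validate() {
        if (host == null || host.isEmpty())
            throw new IllegalArgumentException("bad host");
        if (port < 1 || port > 65535)
            throw new IllegalArgumentException("bad port: " + port);
    }

    private void readObject(ObjectInputStream in)
            throws IOException, ClassNotFoundException {
        in.defaultReadObject();
        try {
            validate(); // never trust the incoming stream
        } catch (IllegalArgumentException e) {
            throw new InvalidObjectException(e.getMessage());
        }
    }
}
```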

Anyone up for an experiment?

Cheers,

Peter.

Peter Firmstone wrote:
Gregg Wonderly wrote:
I think that there are lots of choices about service location and activation of proxies. The MarshalledObject mechanism allows one to wrap proxies at any point and make them available for remote consumption.
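For instance, wrapping a proxy in a MarshalledObject and unwrapping it later might look like this (the Proxy stand-in class is hypothetical; a real smart proxy's codebase annotation would drive class downloading on the consumer's side):

```java
import java.io.Serializable;
import java.rmi.MarshalledObject;

public class MarshalDemo {
    /** Stand-in for a real smart proxy; must be serializable. */
    static class Proxy implements Serializable {
        private static final long serialVersionUID = 1L;
        final String serviceId;
        Proxy(String serviceId) { this.serviceId = serviceId; }
    }

    public static void main(String[] args) throws Exception {
        // Serialize the proxy at the point of wrapping.
        MarshalledObject<Proxy> wrapped =
                new MarshalledObject<>(new Proxy("service-42"));
        // Later, wherever the wrapped form ends up, reconstruct it.
        Proxy again = wrapped.get();
        System.out.println(again.serviceId);
    }
}
```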

Downloading a proxy via http instead of Reggie works fine.
Good point.

We need to document things and provide the convenience methods and classes that will promote standard practices.
We could set up an experimental documents area on svn (scanned scratchings, etc.) while we're hashing things out. The crawler lookup service might discard non-conformant proxies to help promote standard practices. The lookup service might also advertise which River platform versions it's compatible with.

Global Lookup involves some interesting compromises due to the sheer possible volume of entries: whether to provide convenience methods that perform querying at the server to reduce network bandwidth consumption, or to prohibit that due to security concerns or memory and CPU constraints. Lookup on the net would really be a search engine for proxies. The potential is there to make it very powerful if the security issues associated with remote code execution can be understood properly; a paradigm shift that overcomes the issues with current search engines could occur.

How did the net evolve in the beginning? As the number of web pages increased, internet search engines were created.

My gut feel is that a River Search Engine / Service Index (Lookup Service) would need to address the security issues of remote code execution.

Cheers,

Peter.
