On 2014-11-18 at 17:09, Robert J. Hansen wrote:
>> Would this not at the same time make it simple for MUAs to discover
>> that "this message is not from anyone you say you know. Delete
>> without reading?"
>
> Sure, but that also destroys the email ecosystem. One of email's
> strongest points has been that no introduction is necessary to begin a
> conversation. This year I found myself re-engaging with a friend I lost
> touch with a decade ago, who found me on a mailing list and figured to
> drop an email and see if maybe I was the same Rob Hansen she knew from
> back when. If my MUA/MTA had hidden it from me just because there was
> no introduction, or urged me to delete it without reading...
>
> Could email as a platform survive the shift to introduction-based
> systems? Sure. But it would totally transform the email experience,
> and maybe in ways we wouldn't like. That's why I'm so skeptical of
> proposals to fix email in this way: we might fix email, but we might
> also kill it at the same time.
That's completely true. However, Mark is right that it would help to do this client-side: on the client, you can use *all* of the user's private (meta)data without any privacy problem, and use it to detect spam much more accurately. That really would be useful (isn't it easy for you personally, knowing yourself, to tell whether something is spam or a message genuinely addressed to you?). As he said, contacts are useful. So yes, crudely filtering out mail from not-yet-introduced correspondents lacks flexibility and destroys several of email's nice features. But we can do something finer: lower the score assigned by the self-adjusting Bayesian filter instead of rejecting the message outright (a rough sketch of what I mean is at the end of this message).

>> Again, if it's provably from no one you say that you trust, the MUA
>> could refuse to execute runnable content without explicit
>> permission. (Which I say should be the normal and only setting for
>> all content, but I know I'm a crank.)
>
> It already is. Double-click on an executable attachment and a window
> will pop up with a warning about how you should only run code from
> people you know and trust, click "OK" to cancel running this, click "I
> know the risks" to run it, etc.
>
> An awful lot of people click "I know the risks."

A longer warning would help, something like: "You are giving this program permission to do whatever it wants with your data and configuration, including destroying, corrupting, stealing, spying on, or revealing anything." But the true solution is this: use only free software, software whose sources you are sure you can check. Even better: have the build information, sources, and binary signed cryptographically. Even better: be sure the binary comes from a reproducible build. Even better: make all of that available through a censorship-resistant P2P file-sharing system. (A sketch at the end of this message shows what verifying such a release could look like.)

> He said that of all the outcomes he imagined for his Ph.D., he never
> dreamed that it would be that his research could be accurately summed
> up as, "the technology works fine, it's *people* who are completely
> broken."

Yeah, we need interdisciplinarity: a great part of the work of changing the world, alongside technical progress, is education. It is maybe *the* biggest and most important thing. Sometimes you don't need to adapt yourself to society, but to adapt society to you and to people:

"The reasonable man adapts himself to the world: the unreasonable one persists in trying to adapt the world to himself. Therefore all progress depends on the unreasonable man." -- George Bernard Shaw
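To make that first point concrete, here is a minimal Python sketch of "lower the score instead of blocking". Everything in it is invented for illustration: the token probabilities, the contact list, and the discount factor. A real MUA would learn these numbers from the user's own mail; the point is only that a known contact softens the verdict, while an unknown sender is still scored rather than refused outright.

# Sketch only, not a real filter: a client-side spam score that combines
# per-token probabilities the naive-Bayes way, then softens the verdict for
# known contacts instead of hard-blocking strangers. All data is made up.

from math import prod

# Hypothetical per-token spam probabilities learned locally from the user's mail.
TOKEN_SPAM_PROB = {
    "viagra": 0.99,
    "meeting": 0.10,
    "unsubscribe": 0.80,
    "rob": 0.05,
}

# Stays on the client, never leaves the machine.
LOCAL_CONTACTS = {"[email protected]"}


def bayes_spam_score(tokens):
    """Combine token probabilities with the usual naive-Bayes formula:
    P(spam) = prod(p) / (prod(p) + prod(1 - p))."""
    # 0.4 is a mildly ham-leaning default for tokens the filter has never seen.
    probs = [TOKEN_SPAM_PROB.get(t.lower(), 0.4) for t in tokens]
    p_spam = prod(probs)
    p_ham = prod(1.0 - p for p in probs)
    return p_spam / (p_spam + p_ham)


def adjusted_score(sender, tokens, contact_discount=0.5):
    """Lower the score for senders already in the address book,
    rather than refusing mail from strangers outright."""
    score = bayes_spam_score(tokens)
    if sender in LOCAL_CONTACTS:
        score *= contact_discount
    return score


if __name__ == "__main__":
    print(adjusted_score("[email protected]", ["meeting", "rob"]))
    print(adjusted_score("[email protected]", ["viagra", "unsubscribe"]))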
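And for the "signed sources plus reproducible builds" part, a minimal sketch of the check a user or package manager could run. The file names and the published digest are hypothetical placeholders; gpg exits non-zero when the detached signature does not verify against keys you trust, and the hash comparison is exactly what a reproducible build makes meaningful.

# Sketch only: verify a detached OpenPGP signature on a release tarball with
# gpg, then compare the hash of a locally rebuilt binary with the published
# digest. File names and the digest below are placeholders, not real releases.

import hashlib
import subprocess

TARBALL = "gnupg-2.x.x.tar.bz2"          # hypothetical release tarball
SIGNATURE = "gnupg-2.x.x.tar.bz2.sig"    # detached signature shipped with it
LOCAL_BUILD = "./my-rebuilt-binary"      # what you built yourself from those sources
PUBLISHED_SHA256 = "..."                 # digest announced by the project


def signature_ok():
    # gpg --verify returns non-zero if the signature fails or the key is unknown.
    return subprocess.run(["gpg", "--verify", SIGNATURE, TARBALL]).returncode == 0


def build_is_reproducible():
    with open(LOCAL_BUILD, "rb") as f:
        local_digest = hashlib.sha256(f.read()).hexdigest()
    return local_digest == PUBLISHED_SHA256


if __name__ == "__main__":
    if signature_ok() and build_is_reproducible():
        print("Sources are signed and the binary matches the published build.")
    else:
        print("Do not trust this binary.")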
