Erm, I'm lost as to how a static document (a map, in your analogy) can do 
something active like "diverge" from something.  It must therefore be the 
evolving thing, i.e., reality (the territory, in your analogy), that is 
diverging.

Specification documents like standards track RFCs do define rules.  The 
dictionary defines "protocol" as "a set of conventions governing the treatment 
and especially the formatting of data in an electronic communications system."  
That sounds like rules to me.  "Enforcement" comes from strict adherence by 
both ends of a transaction, and rejection of things that don’t also adhere.
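To make "strict adherence and rejection" concrete: RFC 5322 (section 3.6) permits exactly one From: field per message, which is the malformation this thread is about. Here is a minimal sketch of what a strict check at the receiving end might look like, using Python's standard email library; the function name and the reject-on-violation policy are illustrative assumptions, not taken from any real implementation:

```python
# Sketch of strict enforcement at a receiving endpoint: RFC 5322
# section 3.6 permits exactly one From header field per message, so a
# strict receiver rejects anything else rather than guessing.
from email import message_from_string

def strictly_valid_from(raw_message: str) -> bool:
    """Return True only if the message has exactly one From header."""
    msg = message_from_string(raw_message)
    return len(msg.get_all("From") or []) == 1

good = "From: [email protected]\r\nSubject: hi\r\n\r\nbody\r\n"
bad = ("From: [email protected]\r\nFrom: [email protected]\r\n"
       "Subject: hi\r\n\r\nbody\r\n")
```

Both ends applying a check like this is what turns the specification's wording into something that behaves like a rule.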

Each portion of a standard provides some capability, but you're only certain to 
get the benefits of that capability if you implement according to the standard. 
 Slack implementations create ambiguities, and I've yet to hear of a case where 
introducing ambiguity is of benefit other than to the developer or author who 
got to save some keystrokes.

I am very aware that some standards are hard to follow, either due to poor 
writing or complexity, or the difficulty of dealing with layers of protocol and 
legacy implementations.  I empathize with people in that position, but this 
does not change the fact that laziness benefits approximately nobody.  Poor 
implementations create pressure that eventually causes the tightness of the 
system as a whole to sag, and that's what leads us to the situation we're in 
today.

Now, if there's a problem with the standards or reality (e.g., shifting 
priorities) has evolved sufficiently that they need updating, then there exist 
public processes available to any comer for amending them.  If they're broken 
or obsolete, let's fix them.  But if that isn't happening, maybe the blame 
isn't rightly placed there after all.

-MSK

From: Roland Turner <[email protected]>
Date: Sunday, August 4, 2013 10:54 PM
To: Murray Kucherawy <[email protected]>
Cc: "[email protected]" <[email protected]>
Subject: Re: [dmarc-discuss] multiple from

Hi Murray,

On 07/18/2013 12:40 AM, Murray Kucherawy wrote:

On 7/16/13 8:46 PM, "Roland Turner" <[email protected]> wrote:


Any time an RFC and reality diverge, it is the RFC that is
reality-ignorant, not reality that is RFC-ignorant.

If it happens that the DMARC specification reflects reality better than
existing RFCs - even standards track ones - then once again, it is those
RFCs that are in error, not the DMARC specification.


I don't agree with the first generalization.  RFCs specifying the format
of an Internet message have existed for a really long time.  It's reality
that decided to diverge,

Erm, if the map and the territory diverge then it's the map that's incorrect. 
RFCs are the map, not the territory.

RFCs do tend to reflect distilled wisdom, but to the extent that they embody 
designs requiring practitioners to go beyond acting in their own interests, 
they can pretty reliably be expected to be incorrect, at least in the sense 
that they present readers with a view that doesn't match actual practice.


largely out of laziness:

Sure. Laziness (/efficiency/different-priorities/...) is part of the reality 
that Internet protocols operate in. It is desirable that they be designed and 
documented to deal with this.


Email generating code
would be sloppy and cut corners, and user pressure caused mail submission
agents and other services to tolerate it rather than be strict about it.

Erm, user interests (/pressure) are not a distorting force in protocol design, 
they are the purpose of it.


We're left with a system where lots of software now supports the lazy
implementations.

This is both common sense, and desirable from a robustness perspective.
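The usual framing here is the robustness principle (RFC 1122, section 1.2.2): be conservative in what you send, liberal in what you accept. A toy sketch of that asymmetry follows; the function names and the specific tolerated malformation are illustrative assumptions, not drawn from any real MTA:

```python
# Toy illustration of the robustness principle (RFC 1122 section 1.2.2):
# the sender is strict about what it emits, while the receiver tolerates
# a common malformation (a header line with no colon is skipped rather
# than causing the whole message to be rejected).
def send_header(name: str, value: str) -> str:
    """Conservative sender: always emit a well-formed 'Name: value' line."""
    return f"{name.strip()}: {value.strip()}\r\n"

def receive_headers(raw: str) -> dict:
    """Liberal receiver: keep what parses, silently skip what does not."""
    headers = {}
    for line in raw.split("\r\n"):
        if ":" in line:
            name, _, value = line.partition(":")
            headers[name.strip()] = value.strip()
        # malformed lines (no colon) are tolerated by being ignored
    return headers
```

The tension in this thread is precisely over the receiver half: tolerance keeps mail flowing, but it also removes the pressure on senders to stay well-formed.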


There's a draft making its way through the IETF now that describes this
situation, pleads with software to become strict once again, and then goes
into a list of common malformations and provides advice about how to
handle or interpret them.  But even that advice about safe handling
doesn't render those messages compliant; they are still broken.

If it depends for its approach upon pleading then, presumably, it depends for 
its success upon practitioners doing things that don't actually advance their 
interests. I'd suggest that the odds of success for such an approach aren't 
great. (If writing RFCs were the solution, then the problem would already have 
been solved.)

I've not read the draft (are you talking about a successor to 
draft-kucherawy-mta-malformed-03?), but I'd hope that it at least points out 
"you will make your own life better by doing X" in situations where that's 
possible.


DMARC's acknowledgement of reality doesn't make those RFCs wrong,

DMARC doesn't affect the correctness of the underlying protocols at all, their 
own [mis-]match with reality does.


nor does
it excuse various components' lax enforcement of the rules.

The use of terms like "enforcement" and "rules" is a large part of the problem, 
because of the confusion it creates with actual rules and actual enforcement 
(roughly: armed officers turning up to compel you). Granted, I don't have 
better concise terms to suggest. Outside of those who are contractually or 
legally bound, no-one needs an excuse for not implementing the actions 
described by a particular protocol specification. Well-designed protocols 
describe the ways that practitioners tend to implement them; poorly designed 
ones describe something else.

I recognise of course that these decisions aren't actually being made in quite 
the way that I describe, in that Internet protocols aren't [generally] executed 
by humans; their execution is instead controlled by bodies of software which 
are typically reused and which tend to do things the same way repeatedly. An 
important consequence is that implementers will often pick up a component that 
appears to do what they intend and not look too closely at the details, because 
doing so would be wasteful (the benefit to be gained individually is almost 
certainly smaller than the resource cost). I suspect that most of the available 
progress lies in identifying the frequently reused components that are impeding 
matters most, and working with their maintainers to get the issues resolved:

  *   For open-source components this typically means providing patches.
  *   For closed-source components this usually means making the economic case 
to the vendor organisation, at a considerably higher level than the implementer 
of a piece of software that generates, processes or consumes email, and 
therefore at a higher level than that of anyone who's likely to read RFCs.

More RFCs, particularly those structured around encouraging compliance rather 
than making the self-interest case to the maintainers of the relevant code, 
don't seem relevant to either of these cases.

- Roland


--
  Roland Turner | Director, Labs
  TrustSphere Pte Ltd | 3 Phillip Street #13-03, Singapore 048693
  Mobile: +65 96700022 | Skype: roland.turner
  [email protected] | http://www.trustsphere.com/
_______________________________________________
dmarc-discuss mailing list
[email protected]
http://www.dmarc.org/mailman/listinfo/dmarc-discuss

NOTE: Participating in this list means you agree to the DMARC Note Well terms 
(http://www.dmarc.org/note_well.html)