Adding a clarification since I forgot to specify: this would be per-sender, per-source, not a set percentage of all mail received from a source, which obviously would not work as intended.

On 12/8/2022 6:52 AM, Mark Alley wrote:

This may have been thought of before, so forgive the potentially duplicate idea, but I was musing earlier about feedback reporting based on a percentage of the overall mail per source. I'm thinking of something similar in concept to the pct= tag for published policy.

This would reduce the overhead required to report on particular sources... But as I type this idea out, it seems less than feasible due to other considerations that come to mind. If a receiver configured to report on only 10% of mail received from a source were sent 100 emails from that source, and 80 of those emails were forwards, the feedback would be overwhelmingly biased toward forwarding data, and the sender would miss out on reports about direct (and therefore fully compliant, and arguably more useful) mail.

Evolving this thought: if a receiver reported subset percentages of all the different types of compliant/non-compliant email per source (SPF fails/DKIM passes, SPF passes/DKIM fails, etc.), this might provide the data needed while still keeping the reporting volume manageable for less-than-internet-scale receivers.
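As a rough sketch of what that per-source, per-bucket sampling might look like on the receiving side (the bucket names, the 10% rates, and the storage shape are all made up for illustration; nothing like this exists in DMARC today):

import random

# Illustrative only: per-bucket sampling rates. DMARC defines no such knobs
# today; the bucket names and the 10% figures are assumptions for this sketch.
SAMPLE_RATES = {
    "spf_pass_dkim_pass": 0.10,
    "spf_pass_dkim_fail": 0.10,
    "spf_fail_dkim_pass": 0.10,
    "spf_fail_dkim_fail": 0.10,
}

def bucket(spf_aligned_pass, dkim_aligned_pass):
    """Classify one evaluated message by its SPF/DKIM alignment outcome."""
    spf = "pass" if spf_aligned_pass else "fail"
    dkim = "pass" if dkim_aligned_pass else "fail"
    return "spf_%s_dkim_%s" % (spf, dkim)

def maybe_count(store, source_domain, spf_aligned_pass, dkim_aligned_pass):
    """Count a message toward the aggregate report only if it is sampled.

    Sampling is per source domain and per outcome bucket, so a flood of
    forwarded (SPF-breaking) mail cannot crowd direct, fully aligned mail
    out of the sampled data.
    """
    b = bucket(spf_aligned_pass, dkim_aligned_pass)
    if random.random() < SAMPLE_RATES[b]:
        key = (source_domain, b)
        store[key] = store.get(key, 0) + 1

The reporter would also have to disclose the rate it sampled at, so the domain owner could scale the sampled counts back toward actual volumes.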

Though it goes without saying, this type of reporting would be woefully inadequate in terms of data availability; it only gives an idea of the traffic types seen, and does not include the all-encompassing volumetric data that could normally be derived from feedback reporters that process all emails.


On 12/8/2022 12:58 AM, Douglas Foster wrote:

1) DMARC was a successful 2-company experiment, which was turned into a widely implemented Informational RFC. We are now writing the standards-track version of that concept. We hope that Standards Track will provide the basis for significantly increased adoption. This seems the appropriate time to ask whether the design can be optimized for efficiency. If you were designing from scratch, would this reporting design be the result? What alternatives have we considered and ruled out?

2) The burden of reporting is not experienced equally by all report senders. If I send a batch of messages from 1 source domain to:
- 10 target domains at Google, I will get 1 report, because Google consolidates across target domains.
- 10 target domains at Yahoo, I will get 10 reports, because Yahoo chooses to disaggregate by target domain.
- 10 target domains to Ironport clients, I will get 20 or 30 reports. These are client-specific appliances, many clients have multiple appliances configured in parallel for load balancing, and each appliance produces its own report.
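To put the same point in data-structure terms, the difference is just the key each reporter aggregates on. The key shapes below are my guesses at the observed behavior, not documented designs; the counts mirror the 1 / 10 / 20 cases above:

# Illustrative guesses at the keys each reporter aggregates on.
# One batch: 1 source domain, 10 target domains, 2 appliances per target.
results = [("example.com", "target%d.example" % t, "appliance%d" % a)
           for t in range(10) for a in range(2)]

by_reporting_org = {src for (src, tgt, app) in results}          # Google-style: 1
by_target_domain = {(src, tgt) for (src, tgt, app) in results}   # Yahoo-style: 10
by_appliance = {(src, tgt, app) for (src, tgt, app) in results}  # Ironport-style: 20

print(len(by_reporting_org), len(by_target_domain), len(by_appliance))  # 1 10 20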

Google presumably can dedicate servers to the reporting function, while the Ironport appliances seem to generate reports in parallel with message processing. Altogether, I conclude that Google can absorb an increase in workload much more easily than an appliance can.

3) The burden of reporting is not shared equally at present.   Substantially all of my reporting comes from the three sources just stated:  Google, Yahoo, and Ironport appliances.   Since these organizations have not been actively participating, perhaps you are right and they are happy with the present design.   On the other hand, perhaps someone with connections should ask them whether they want to see optimizations.

4) As DMARC participation grows, the growth curve is not really linear. Currently, 40% of my mailstream is covered by DMARC reporting, largely because more than 30% of my outbound mail goes to Google servers. Altogether, the number of reporting domains, from all sources, is somewhere around 40. To move from reporting that covers 40% of my messages to reporting requested by 40% of the domains I send to, the volume of reports will grow by orders of magnitude.
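To put rough numbers on that: the 5,000-domain daily reach below is invented purely for illustration; only the ~40 reporting domains comes from my own data.

reporting_domains_today = 40           # roughly what I see now (from above)
distinct_destination_domains = 5000    # assumed daily reach, for illustration
adoption = 0.40                        # 40% of destination domains requesting reports

reports_today = reporting_domains_today                        # ~40 reports/day
reports_later = int(distinct_destination_domains * adoption)   # ~2000 reports/day
print(reports_later / reports_today)                           # ~50x growth with these made-up numbers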

5) Which then raises the question of, "Who do we expect to do reporting?"    Several participants in this group have expressed the conviction that everyone who benefits from DMARC should also contribute to DMARC by doing reporting. This seems fair, but it is probably not necessary.  Reporting from Google alone is probably sufficient for domain owners to know whether or not their servers are properly configured.    But as long as we want everyone to participate, we cannot assume that everyone will have Google's resources to contribute to the reporting task.

All of which says to me that we should be looking to optimize the reporting function to minimize the cost of participation.

Doug Foster


On Tue, Dec 6, 2022 at 10:15 PM Seth Blank <s...@sethblank.com> wrote:

    I'm super unclear what you're talking about.

    https://dmarc.org/2022/03/dmarc-policies-up-84-for-2021/

    Aggregate reporting is used by the largest volume senders on
    earth, and the vast majority of mail received by mailbox
    providers comes with a DMARC record and reporting address attached.

    This is umpteen billions of messages a day that get aggregated
    into reports.

    What are you getting at? That seems pretty internet scale to me...

    Seth

    On Mon, Dec 5, 2022 at 2:01 PM Douglas Foster
    <dougfoster.emailstanda...@gmail.com> wrote:

        I began wondering if Aggregate Reporting works only because
        DMARC has been embraced by a small portion of domain owners.

        1) Is Aggregate Reporting a significant portion of all mail? 
        In some cases, Yes.

        My organization's data:
        Inbound volume is 11 times greater than my outbound volume.
        Inbound mail has 1 new domain for every 5 messages.

        Net result: If I were to do reporting, and reporting were
        requested for most or all domains, my outbound mail volume
        would triple, because my outbound report volume would be
        roughly twice as large as my outbound business mail volume.
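        A worked version of that arithmetic, using an arbitrary round
        number for my own volume:

        # Arbitrary baseline of 1,000 outbound business messages/day;
        # the 11x and 1-in-5 ratios are the figures above.
        outbound_business = 1000
        inbound = 11 * outbound_business        # 11,000 inbound/day
        domains_per_day = inbound / 5           # ~2,200 domains/day
        reports_per_day = domains_per_day       # ~1 aggregate report per domain
        print(reports_per_day / outbound_business)                        # ~2.2
        print((outbound_business + reports_per_day) / outbound_business)  # ~3.2, roughly triple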

        2) Is Aggregate Reporting efficient?   Restating previous
        concerns:

        "All Signature" reporting means:
        We keep evaluating even after successful authentication has
        been established,
        so that we can capture and store data of little actual value,
        even though it causes reduced aggregation and longer reports.
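        (For contrast, a sketch of short-circuit evaluation; verify()
        and is_aligned() are invented stand-ins for whatever the
        verifier actually provides:)

        def evaluate_dkim(signatures, verify, is_aligned):
            """Stop at the first aligned passing signature, rather than
            evaluating every signature just to feed the report."""
            for sig in signatures:
                if verify(sig) and is_aligned(sig):
                    return "pass"   # authenticated; skip the remaining work
            return "fail"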

        "No Problems found, No changes found" reporting means:
        We send redundant reports day after day.

        "All Requesters" reporting means:
        We send reports even to domain owners that were blocked
        because of domain reputation.

        A good place to start would be to extend the reporting
        interval for no-problem-found reports.
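        Something along these lines on the reporter side, with assumed
        intervals and flags (nothing like this exists in the spec):

        import datetime

        NORMAL_INTERVAL = datetime.timedelta(days=1)   # assumed default
        QUIET_INTERVAL = datetime.timedelta(days=7)    # assumed quiet interval

        def should_send(domain, all_pass, changed_since_last, last_sent, now):
            """Send daily while there is something to say; otherwise weekly."""
            quiet = all_pass and not changed_since_last
            interval = QUIET_INTERVAL if quiet else NORMAL_INTERVAL
            return now - last_sent.get(domain, datetime.datetime.min) >= interval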

        Doug Foster


_______________________________________________
dmarc mailing list
dmarc@ietf.org
https://www.ietf.org/mailman/listinfo/dmarc
