Hi Roman, thanks for the review, feedback and comments. See responses inline:

On Wed, Jan 21, 2026 at 6:32 PM Roman Danyliw via Datatracker
<[email protected]> wrote:
>
> Roman Danyliw has entered the following ballot position for
> draft-ietf-oauth-cross-device-security-14: Discuss
>
> When responding, please keep the subject line intact and reply to all
> email addresses included in the To and CC lines. (Feel free to cut this
> introductory paragraph, however.)
>
>
> Please refer to 
> https://www.ietf.org/about/groups/iesg/statements/handling-ballot-positions/
> for more information about how to handle DISCUSS and COMMENT positions.
>
>
> The document, along with other ballot positions, can be found here:
> https://datatracker.ietf.org/doc/draft-ietf-oauth-cross-device-security/
>
>
>
> ----------------------------------------------------------------------
> DISCUSS:
> ----------------------------------------------------------------------
>
> There are diverse use cases to satisfy with the guidance provided in this
> document.  In my assessment, additional consideration is needed to balance the
> listing the considerations and prescribing them in Section 2.  More
> specifically:
>
> ** Section 2.
>    1.  Implementers MUST perform a risk assessment before implementing
>        cross-device flows, weighing the risks from Cross-Device Consent
>        Phishing and Cross-Device Session Phishing attacks against
>        benefits for users.
>
> Unless there is a defined process for performing a risk assessment, it seems 
> to
> me to have little value to normatively require this step.  Without a defined
> process, nearly everything done by an implementer would be conformant.  To
> exaggerate a bit, would “just thinking about the risk for a fleeting second”
> not qualify as an assessment?
>

We modelled this text on RFC 9700, which makes similar statements
about applying mitigations subject to a risk assessment without
specifying the process itself. Being prescriptive about the process
may cause more challenges, as different industries, organisations and
jurisdictions take different approaches. The current text allows
implementers to use the risk management processes and systems they
already have.

> ** Section 2.
>    4.  Implementers MUST implement practical mitigations as listed in
>        Section 6.1 that are appropriate for the use case, architecture,
>        and selected protocols.
>
> With all of the helpful guidance in 6.1, how does an implementer know which
> they MUST implement based on the use case/architecture/selected protocol.  Is
> there a deterministic (interoperable) list to arrive at what is mandatory?
>

The mitigations themselves do not change the standardised wire
protocols; they are deployed around those protocols at a system or
user experience level. The choice of which mitigations to implement
comes down to the deployment and the risk assessment, which is implied
by bullet #1, but perhaps worth re-stating here - something like
"Implementers MUST select appropriate mitigations from Section 6.1 to
address risks identified in the risk assessment." I opened an issue to
capture that:
https://github.com/oauth-wg/oauth-cross-device-security/issues/255

> ** Section 2
>    5.  Implementers SHOULD implement proximity checks as defined in
>        Section 6.1.1 if possible.
>
> Doesn’t this bullet #5 conflict with bullet #4?  Bullet #4 seems to suggest
> that it is mandatory to implement all of Section 6.1 (of which proximity 
> checks
> in Section 6.1.1 is a subset) if it applies to the relevant use
> case/architecture/protocol while bullet #5 appears to say that it is merely a
> recommendation (SHOULD).  How does one implement both of these requirements at
> the same time consistently?
>

The intent with bullet #5 was to strongly encourage implementing
proximity checks, as they are one of the more effective mitigations
(within the limitations described), if at all possible. Perhaps some
wordsmithing would help: "Implementers SHOULD include proximity
checks, as defined in Section 6.1.1, as one of the selected
mitigations, if possible." See issue:
https://github.com/oauth-wg/oauth-cross-device-security/issues/256
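
For illustration, here is a minimal sketch of what an IP-geolocation
based proximity check could look like on the authorization server. It
assumes the server recorded the public IP of the initiating device
when the cross-device flow started and can geolocate both IPs; the
helper names and the 50 km threshold are purely illustrative and not
taken from the draft:

    from math import radians, sin, cos, asin, sqrt

    MAX_DISTANCE_KM = 50.0  # illustrative threshold, not from the draft

    def haversine_km(lat1, lon1, lat2, lon2):
        # Great-circle distance in kilometres between two lat/lon points.
        dlat = radians(lat2 - lat1)
        dlon = radians(lon2 - lon1)
        a = (sin(dlat / 2) ** 2
             + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2)
        return 2 * 6371.0 * asin(sqrt(a))

    def devices_proximate(initiating_location, authorizing_location):
        # Each location is a (lat, lon) pair, e.g. from an IP-geolocation
        # lookup (hypothetical here; a deployment would use its own service).
        lat1, lon1 = initiating_location
        lat2, lon2 = authorizing_location
        return haversine_km(lat1, lon1, lat2, lon2) <= MAX_DISTANCE_KM

    # Same city: allow; different continents: flag or deny the request.
    print(devices_proximate((51.50, -0.12), (51.51, -0.10)))   # True
    print(devices_proximate((51.50, -0.12), (40.71, -74.00)))  # False

Signals like this are of course best-effort (VPNs and carrier-grade
NAT weaken them), which is part of why the "if possible" qualifier is
there.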

>
> ----------------------------------------------------------------------
> COMMENT:
> ----------------------------------------------------------------------
>
> Thank you to Paul Kyzivat for the GENART review.
>
> ** Section 6.1.1
>    *  Physical connectivity: This is a good indicator of proximity, but
>       requires specific ports, cables and hardware and may be
>       challenging from a user experience perspective or may not be
>       possible in certain settings (e.g., when USB ports are blocked or
>       removed for security purposes).  Physical connectivity may be
>       better suited to dedicated hardware like FIDO devices that can be
>       used with protocols that are resistant to the exploits described
>       in this document.
>
> Doesn’t requiring one to plug into something, specifically if there is a data
> channel, present some risks with certain hardware (e.g., USB)
>

Yes, but those are different kinds of risks that go beyond consent
phishing, so providing guidance on them in any detail is out of scope
for the specification. Many phishing-resistant technologies like FIDO
2.0 use USB connections. I can add some text to highlight that the
risks that derive from physical connectivity may be considered as part
of a risk assessment, but are out of scope for this specification. See
issue:
https://github.com/oauth-wg/oauth-cross-device-security/issues/257

> ** Section 6.1.14
>    It SHOULD be clear to the user how to decline the request.
>
> Are there usecases or desirable user experience designs where it is optimal 
> for
> the user to be unclear on how to decline a request?

From a security perspective, no, but from a business perspective, it
depends on the incentives of the service provider. It is not uncommon
for business decision makers to think of authentication and
authorisation as "friction", resulting in so-called dark patterns
where providing less context or having fewer steps becomes desirable
to ensure authorisation is collected before a user drops out of a
decision or purchase funnel. Pre-selecting the "grant" button is one
such business optimisation; making the "decline" button small and hard
to find is another. I have even seen "decline" buttons that redirect
users to content meant to dissuade them from declining. The text
specifically encourages designs where decline options are pre-selected
or given equal prominence in the UX (even when a bad decision is made
to pre-select "grant", the "decline" button needs to be just as easy
to find and use).

> I also note that what is
> “clear” seems subjective given the huge variability in the “user” population.
>

Giving guidance on UX is tricky. The sentence is meant as an
over-arching goal for the design of the UX. The concern is that
omitting it would allow someone to claim that they pre-select the
"decline" button and give all buttons equal prominence, while
indulging in some other dark pattern we may not be aware of. The text
that follows the sentence "It SHOULD be clear to the user how to
decline the request." provides more detail, stating "To avoid
accidental authorization grants, the "decline" option SHOULD be the
default option or given similar prominence in the user experience as
the "grant" option."

Perhaps some wordsmithing along the lines of "It SHOULD be a user
experience design goal to provide a clear and unambiguous way to
decline a request" or "The user interface SHOULD provide an obvious
and unambiguous way for the user to decline or cancel a request." See
issue:
https://github.com/oauth-wg/oauth-cross-device-security/issues/258
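
To make that concrete, here is a minimal, purely illustrative sketch
(not from the draft) of a consent prompt where "decline" is the
default and an explicit action is required to grant:

    def prompt_consent(client_name, scopes):
        # "Decline" is the default: pressing Enter, or typing anything
        # other than an explicit "grant", declines the request.
        print(f"{client_name} is requesting access to: {', '.join(scopes)}")
        answer = input("Type 'grant' to approve, or press Enter to decline [decline]: ")
        return answer.strip().lower() == "grant"

    if prompt_consent("Example TV App", ["profile", "email"]):
        print("Authorization granted.")
    else:
        print("Request declined.")

The same principle carries over to a graphical UX: the "decline"
control gets the default focus and at least the same visual prominence
as the "grant" control.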

> ** Reference
>    [FIDOCTAP22]
>               Bradley, J., Jones, M.B., Kumar, A., Lindemann, R.,
>               Verrept, S., and D. Waite, "Client to Authenticator
>               Protocol (CTAP)", July 2025.
>
> Please provide more detail in this reference.  This citation provides no
> indication that this is a FIDO standard or which version applies.  Looking at
> https://fidoalliance.org/specifications/download/, it looks like there are
> various versions such as:
>
> V2.1 =
> https://fidoalliance.org/specs/fido-v2.1-ps-20210615/fido-client-to-authenticator-protocol-v2.1-ps-20210615.html
>
> V2.2 =
> https://fidoalliance.org/specs/fido-v2.2-ps-20250714/fido-client-to-authenticator-protocol-v2.2-ps-20250714.html
>
>

Thanks for pointing that out - this is tracked via issue:
https://github.com/oauth-wg/oauth-cross-device-security/issues/259


_______________________________________________
OAuth mailing list -- [email protected]
To unsubscribe send an email to [email protected]
