Hi,

On Wed, Oct 1, 2025 at 5:13 AM Miguel Ojeda
<[email protected]> wrote:
[...]
>   - My biggest concern is the overflow caveat, which is a fairly big
> one, especially if one is dealing with runtime values.
>
>     Can we do better? Accessing the discriminant via `as` is available
> in const context, and you already have every variant easily available,
> so can we check that every variant fits in the relevant target types?
>
>     For instance, by creating some item-level const blocks
> (`static_assert!`s) -- dummy example for an unsigned-to-unsigned case:
>
>         const _: () = { assert! (E::A as u128 <= u8::MAX as u128); };
>         const _: () = { assert! (E::B as u128 <= u8::MAX as u128); };
>         ...
>
>     and so on. There may be better ways to do it -- I just quickly
> brute forced that unsigned case with what you already had for
> handling variants:
>
>         variants.iter().map(|variant| {
>             quote! {
>                 const _: () = { assert!(#enum_ident::#variant as u128
> <= #ty::MAX as u128); };
>             }
>         });
>
>     Maybe this was already discussed elsewhere and there is a reason
> not to do something like this, but it seems to me that we should try
> to avoid that risk.

Thanks, I see your point, and I agree that compile-time checking for
potential overflows is a better and safer approach.

That said, it becomes a bit trickier when dealing with conversions
between signed and unsigned types, particularly when `u128` and `i128`
are involved. For example:

    #[derive(TryFrom, Into)]
    #[try_from(u128)]
    #[into(u128)]
    #[repr(i128)]
    enum MyEnum {
        A = 0xffffffffffffffff0, // larger than u64::MAX
        B = -1,
    }

In this case, since there's no numeric type that can encompass both
`u128` and `i128`, I don't think we can express a compile-time
assertion like the one you suggested. While such edge cases involving
128-bit numeric types are unlikely in practice, the broader challenge
is that, in signed-to-unsigned conversions, I think it's difficult to
detect overflows using only the `repr` type, the target type, and the
discriminant value interpreted as the target type (please correct me if
I've misunderstood something here).
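
To illustrate the problem (a standalone snippet, not from the patch):
casting a negative `i128` to `u128` wraps around, so an assertion of the
form `MyEnum::B as u128 <= u128::MAX` is vacuously true and cannot catch
the lossy conversion:

```rust
fn main() {
    // `B = -1` under `repr(i128)`: the cast to `u128` wraps to
    // u128::MAX, so `assert!(MyEnum::B as u128 <= u128::MAX)` would
    // pass even though the conversion loses the sign.
    let b: i128 = -1;
    assert_eq!(b as u128, u128::MAX);

    // `A` is larger than u64::MAX, so it needs the full `u128` range,
    // while `B` needs a signed type: no single numeric type covers both.
    let a: u128 = 0xffffffffffffffff0;
    assert!(a > u64::MAX as u128);

    println!("-1 as u128 == u128::MAX: {}", b as u128 == u128::MAX);
}
```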

I'm considering an alternative approach: performing these checks while
parsing the macro inputs, to handle all combinations of `repr` and
target types (such as `u128` in the above example) in a unified way. I
believe this makes the behavior easier to reason about and better
covers edge cases like conversions between `i128` and `u128`. For
example:

    const U128_ALLOWED: [&str; 9] =
        ["u8", "i8", "u16", "i16", "i32", "u32", "u64", "i64", "u128"];
    const I128_ALLOWED: [&str; 9] =
        ["u8", "i8", "u16", "i16", "i32", "u32", "u64", "i64", "i128"];
    ...

    // Use this function after parsing `#[repr(...)]`, `#[into(...)]`
    // and `#[try_from(...)]`.
    fn check_overflow<'a>(
        discriminant_repr: &str,
        helper_inputs: Vec<&'a str>,
    ) -> Vec<&'a str> {
        let mut violations = Vec::new();
        if discriminant_repr == "u128" {
            for ty in helper_inputs {
                if !U128_ALLOWED.contains(&ty) {
                    violations.push(ty);
                }
            }
        } else if discriminant_repr == "i128" {
            for ty in helper_inputs {
                if !I128_ALLOWED.contains(&ty) {
                    violations.push(ty);
                }
            }
        }
        ...
        violations
    }

This is a rough sketch, but it gives a consistent way to reject
obviously invalid combinations early during parsing. I'd appreciate
your thoughts -- does this approach seem reasonable to you as well?
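
To make the idea concrete, here is a self-contained version of the sketch
(only the `i128` branch, with the other elided branches omitted;
`check_overflow` and `I128_ALLOWED` are the hypothetical names from above,
not an existing API), flagging the `repr(i128)` / `into(u128)` combination
from the earlier enum:

```rust
// Allowed target types for a `repr(i128)` discriminant (hypothetical
// allow-list from the sketch above).
const I128_ALLOWED: [&str; 9] =
    ["u8", "i8", "u16", "i16", "i32", "u32", "u64", "i64", "i128"];

fn check_overflow<'a>(
    discriminant_repr: &str,
    helper_inputs: Vec<&'a str>,
) -> Vec<&'a str> {
    let mut violations = Vec::new();
    if discriminant_repr == "i128" {
        for ty in helper_inputs {
            if !I128_ALLOWED.contains(&ty) {
                violations.push(ty);
            }
        }
    }
    violations
}

fn main() {
    // `#[repr(i128)]` combined with `#[into(u128)]`, as in the enum
    // example above, is rejected during parsing.
    let violations = check_overflow("i128", vec!["u128"]);
    assert_eq!(violations, ["u128"]);
    println!("rejected targets: {:?}", violations);
}
```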

>   - On other news, I will post in the coming days the `syn` patches,
> and my plan is to merge them for next cycle, so when those are out,
> Benno thought you could give them a go (we can still merge this with
> your current approach and then convert, but still, more `syn` users
> and converting the existing macros would be nice :).
>
>     (By the way, the linked patches about converting the existing
> macros to `syn` are an RFC in the sense that they cannot be applied,
> but having `syn` itself is something we already agreed on a long time
> ago.)

Sounds good -- I'd be happy to give `syn` a try. It should simplify
the parsing logic quite a bit, and I believe it'll also make things
easier for reviewers.

>   - Couple nits: typo arise -> arises, and I would do `repr-rust`
> instead of `repr-rs` since that is the anchor in the reference that
> you are linking.

Thanks, I'll fix them in v3.

Best Regards,
Jesung

>
> Thanks a lot!
>
> Cheers,
> Miguel
