Re: [webkit-dev] Unsigned to avoid negative values
On Thu, Jan 26 2023 at 12:31:25 AM -0800, Myles Maxfield via webkit-dev wrote:
> Okay, sounds like we’re all pretty much in agreement.
>
> How about I add a rule to our style guide that says “use unsigned types
> to represent values which cannot be negative.” Good idea?

This is pretty strict.

___
webkit-dev mailing list
webkit-dev@lists.webkit.org
https://lists.webkit.org/mailman/listinfo/webkit-dev
Re: [webkit-dev] Unsigned to avoid negative values
I agree with Alex. Underflow is just as bad as overflow.

—Myles

> On Jan 26, 2023, at 8:12 PM, Alex Christensen wrote:
>
> If you are subtracting things that have not been verified to produce a
> positive value, then you hopefully aren’t dealing with values that can’t
> be negative, so this wouldn’t apply then. Forgetting to verify things is
> a bug in many places. I also think that a buffer offset of -1 is just
> about as bad as a buffer offset of 4294967295.
Re: [webkit-dev] Unsigned to avoid negative values
If you are subtracting things that have not been verified to produce a positive value, then you hopefully aren’t dealing with values that can’t be negative, so this wouldn’t apply then. Forgetting to verify things is a bug in many places. I also think that a buffer offset of -1 is just about as bad as a buffer offset of 4294967295.

> On Jan 26, 2023, at 6:44 PM, Simon Fraser via webkit-dev wrote:
>
> Avoiding unsigned is usually recommended to avoid inadvertent underflow.
> This is particularly bad when doing math on buffer offsets and sizes,
> and can result in OOB bugs.
Re: [webkit-dev] Unsigned to avoid negative values
Late to the party but….

Avoiding unsigned is usually recommended to avoid inadvertent underflow:

    unsigned big = 200;
    unsigned small = 100;
    auto result = small - big; // underflow

This is particularly bad when doing math on buffer offsets and sizes, and can result in OOB bugs. I believe Apple’s media frameworks code has a “no unsigned usage” rule because of that. I’m surprised that no-one has raised it in this discussion.

Simon

> On Jan 24, 2023, at 2:00 AM, Myles Maxfield via webkit-dev wrote:
>
> Should we be avoiding unsigneds for these purposes? WebKit uses
> unsigneds all over the place.
Re: [webkit-dev] Unsigned to avoid negative values
https://github.com/WebKit/WebKit/pull/9199

> On Jan 26, 2023, at 12:31 AM, Myles Maxfield via webkit-dev wrote:
>
> Okay, sounds like we’re all pretty much in agreement.
>
> How about I add a rule to our style guide that says “use unsigned types
> to represent values which cannot be negative.” Good idea?
Re: [webkit-dev] Unsigned to avoid negative values
Okay, sounds like we’re all pretty much in agreement.

How about I add a rule to our style guide that says “use unsigned types to represent values which cannot be negative.”

Good idea?

> On Jan 25, 2023, at 4:11 PM, Alex Christensen wrote:
>
> If a value represents a size or a count or something that inherently
> cannot be negative, I strongly prefer using unsigned types.
Re: [webkit-dev] Unsigned to avoid negative values
If a value represents a size or a count or something that inherently cannot be negative, I strongly prefer using unsigned types. It reduces the number of places where we need to ask ourselves “what if it’s negative?” when it can never be negative, leading to more straightforward code that doesn’t have to handle impossible cases. It also eliminates the possibility of malicious content somehow incrementing a signed 32-bit integer past its maximum value and executing code with unexpected negative values used in signed comparison operations.

> On Jan 24, 2023, at 11:44 AM, Ryosuke Niwa via webkit-dev wrote:
>
> I don’t think we should change our coding style guidelines just because
> the C++ Core Guidelines say something.
Re: [webkit-dev] Unsigned to avoid negative values
> On Jan 24, 2023, at 2:00 AM, Myles Maxfield via webkit-dev wrote:
>
> Should we be avoiding unsigneds for these purposes? But if it’s standard
> practice to just use int instead, perhaps we should update the style
> guide? What do you think?

I don’t think we should change our coding style guidelines just because the C++ Core Guidelines say something.

- R. Niwa
Re: [webkit-dev] Unsigned to avoid negative values
On Tue, Jan 24, 2023 at 11:00 AM Myles Maxfield via webkit-dev wrote:
> What do you think?

What this immediately made me think of is that Web IDL and the web platform at large use unsigned and signed integers of various types. And as those have different value spaces, you’d notice if you did something near the limits and didn’t carefully account for that.
Re: [webkit-dev] Unsigned to avoid negative values
Hi,

What’s the benefit? I don’t think we should be changing our long-time coding practices unless there are clear benefits from doing so. From your email, it is not yet clear to me what those benefits would be.

Chris.

> On Jan 24, 2023, at 6:58 AM, Michael Catanzaro via webkit-dev wrote:
>
> This has been a best practice for a long time now. It's a good rule to
> reduce bugs if adopted consistently, but I also fear that if we were to
> try to adapt existing WebKit code to follow these guidelines, we might
> accidentally introduce a lot of new bugs, especially regarding container
> types like Vector. So I don't know what to think!
Re: [webkit-dev] Unsigned to avoid negative values
On Tue, Jan 24 2023 at 02:00:11 AM -0800, Myles Maxfield via webkit-dev wrote:
> What do you think?

This has been a best practice for a long time now. It's a good rule to reduce bugs if adopted consistently, but I also fear that if we were to try to adapt existing WebKit code to follow these guidelines, we might accidentally introduce a lot of new bugs, especially regarding container types like Vector. So I don't know what to think!
Re: [webkit-dev] Unsigned to avoid negative values
I could have sworn I read a white paper a few years ago discussing signed vs. unsigned in the context of the Blink coding style, showing that using unsigned had a performance impact. Of course, now I can’t find a reference to it. But I clearly recall recommendations like the ones you mentioned.

Sent from my iPhone

> On 24 Jan 2023, at 9:00 pm, Myles Maxfield via webkit-dev wrote:
>
> Should we be avoiding unsigneds for these purposes?
[webkit-dev] Unsigned to avoid negative values
Hello!

I recently learned that the C++ Core Guidelines recommend against using unsigned to avoid negative values. Section 4.4 on page 73 of The C++ Programming Language says unsigned types should be used for bitfields and not in an attempt to ensure values are positive. Some talks by people on the C++ standards committee (e.g., Herb Sutter) recommend against using unsigned types simply because the value is expected to be positive.

Should we be avoiding unsigneds for these purposes? WebKit uses unsigneds all over the place, and I’m assuming a fair number of them are there to indicate that negative values are avoided. The C++ recommendation goes against my intuition that the type is there for clarity, to indicate expectations about the meaning and behavior of its value. But if it’s standard practice to just use int instead, perhaps we should update the style guide?

What do you think?

—Myles