Since I didn't see this mentioned anywhere, I thought I'd bring it up...

On 7/4/2011 1:58 PM, Andrei Alexandrescu wrote:
> On 7/4/11 2:48 PM, bearophile wrote:
>> In this case you wrap the code in something that allows it to
>> overflow without errors, like:
>>
>> unsafe(overflows) { // code here }
>
> This approach has a number of issues. First, addressing transitivity is
> difficult. If the code in such a scope calls a function, either every
> function has two versions, or chooses one way to go about it. Each
> choice has obvious drawbacks.

C# chooses to limit the scope to the current function, and it works pretty well. The construct modifies the behavior of the *operators* written inside it, so there is no transitivity issue: propagating into callees is simply not what it's used for.
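A minimal sketch of that scoping rule (my own illustration, not from the thread; the class and method names are made up):

    using System;

    static class ScopeDemo
    {
        static int Square(int x)
        {
            // Compiled in the default (unchecked) context. Calling this from
            // inside a checked block does NOT make this multiply checked.
            return x * x;
        }

        static void Main()
        {
            checked
            {
                int big = int.MaxValue;
                Console.WriteLine(Square(big));  // wraps silently: checked is not transitive
                //Console.WriteLine(big * big);  // this one WOULD throw OverflowException
            }
        }
    }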

> Second, programmers are notoriously bad at choosing which code is
> affecting bottom line performance, yet this feature explicitly puts the
> burden on the coder. So code will be littered with amends, yet still be
> overall slower. This feature has very poor scalability.

Actually, there is *no* performance issue -- at least not in C#. If you run the program below (with or without optimizations), you will see that the checked and unchecked timings come out essentially identical almost every time:

    using System;

    static class Program
    {
        static long globalVar = 0; // static so the store doesn't get optimized away

        static void Main()
        {
            const long COUNT = 100000000;
            for (;;)
            {
                var start = Environment.TickCount;
                for (long i = 0; i < COUNT; i++)
                    checked { globalVar = i * i; }
                Console.WriteLine("Checked: {0}", Environment.TickCount - start);

                start = Environment.TickCount;
                for (long i = 0; i < COUNT; i++)
                    unchecked { globalVar = i * i; }
                Console.WriteLine("Unchecked: {0}", Environment.TickCount - start);
            }
        }
    }

There is literally no performance issue. Ever.
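For completeness, a minimal sketch (again my own illustration) of what the two contexts actually change when an overflow *does* occur:

    using System;

    static class OverflowDemo
    {
        static void Main()
        {
            int n = int.MaxValue;

            // unchecked: the result silently wraps around.
            Console.WriteLine(unchecked(n + 1));   // prints -2147483648

            // checked: the same operation throws instead of wrapping.
            try
            {
                Console.WriteLine(checked(n + 1));
            }
            catch (OverflowException)
            {
                Console.WriteLine("overflow detected");
            }
        }
    }

So the cost question above is about paying (or not paying) for that runtime check, not about any difference in results on the non-overflowing path.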


Just my 2c
