http://d.puremagic.com/issues/show_bug.cgi?id=7177
--- Comment #12 from Jonathan M Davis <[email protected]> 2013-03-21 07:05:45 PDT ---
If you have a type which defines length and has slicing or indexing, does it make any sense whatsoever for length _not_ to be the same as opDollar? I'd strongly argue that any type (be it a range or otherwise) which has indexing or slicing and has length, but makes opDollar something other than length (or makes length something other than one past the last index), is very badly designed.

If we don't make length automatically alias to opDollar when opDollar isn't already defined, then almost every finite random-access range will have to manually alias length to opDollar, and we'll never be able to require that ranges with slicing define opDollar, because doing so would break too much code.

I see no downside to this and huge problems if we don't do it. This is the key to being able to rely on opDollar existing for the range types that should define it.
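A minimal sketch of the boilerplate in question (the SliceRange type here is hypothetical, not from the issue): without an automatic alias, every finite random-access range has to write the `alias opDollar = length;` line by hand.

```d
// Hypothetical finite random-access range. It defines length, indexing,
// and slicing, so opDollar should be length -- but without compiler
// support, the author must say so explicitly.
struct SliceRange
{
    int[] data;

    @property bool empty() { return data.length == 0; }
    @property int front() { return data[0]; }
    void popFront() { data = data[1 .. $]; }

    @property size_t length() { return data.length; }
    int opIndex(size_t i) { return data[i]; }
    SliceRange opSlice(size_t lo, size_t hi)
    {
        return SliceRange(data[lo .. hi]);
    }

    // The boilerplate the comment argues should be implicit: under the
    // proposal, length would serve as opDollar automatically.
    alias opDollar = length;
}

void main()
{
    auto r = SliceRange([1, 2, 3, 4]);
    assert(r[$ - 1] == 4);         // $ lowers to r.opDollar, i.e. r.length
    assert(r[1 .. $].length == 3); // slicing to the end via $
}
```

With the alias removed, `r[$ - 1]` and `r[1 .. $]` would fail to compile, which is exactly why generic range code cannot currently assume opDollar exists.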
