On Tuesday, January 24, 2017 11:50:16 Rene Zwanenburg via Digitalmars-d-learn wrote:
> On Tuesday, 24 January 2017 at 11:38:16 UTC, Jonathan M Davis wrote:
> > Likely because it does bounds checking, so you at least know
> > that it's not null. But I don't see why that would really
> > improve much considering that the odds are that you're really
> > going to be accessing far more than just the first element with
> > the pointer. It seems _slightly_ better from a safety
> > perspective but only slightly. So, I don't know what the point
> > is in suggesting it as an alternative.
> >
> > - Jonathan M Davis
>
> Pointer arithmetic is forbidden in @safe code so that's not a
> problem. The reason this was introduced was indeed bounds
> checking. For example:
>
> @safe:
>
> int parse(ref char[] input)
> {
>     // Pop all numeric characters from the front of the input
>     // slice and convert to int
> }
>
> void main()
> {
>     auto input = "123".dup;
>     parse(input);
>     // Since all numeric chars have been popped, input is now
>     // effectively input[$ .. $].
>     // This means input.ptr is pointing past the end of the array.
>     writeln(input.ptr); // Out of bounds access
> }
Sure, there can be problems with .ptr. It's not necessarily a problem that it's not @safe. But doing &arr[0] instead of arr.ptr is almost pointless. All it does is verify that the array isn't null or empty. If you're doing arr.ptr, you're almost certainly passing it to C code, and that code will almost certainly read well past arr[0]. So, while it makes sense to say that .ptr can't be used in @safe code, it really doesn't make sense to suggest &arr[0] as an alternative.

- Jonathan M Davis
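To make the discussion concrete, here is a minimal, runnable sketch of the scenario from the quoted post. The parse implementation is hypothetical (the original left its body as a comment); the point it illustrates is the one being debated: after parse consumes the whole slice, input.ptr silently yields a past-the-end pointer, while &input[0] is bounds checked and would fail at runtime on the empty slice.

```d
import std.ascii : isDigit;
import std.conv : to;

// Hypothetical completion of the parse sketch from the quoted post:
// pops leading digit characters off the slice and converts them to int.
int parse(ref char[] input) @safe
{
    size_t i;
    while (i < input.length && input[i].isDigit)
        ++i;
    immutable result = input[0 .. i].to!int;
    input = input[i .. $]; // may leave the empty slice input[$ .. $]
    return result;
}

void main() @safe
{
    auto input = "123".dup;
    assert(parse(input) == 123);
    assert(input.length == 0);

    // input.ptr (disallowed in @safe code) would now point one past
    // the end of the array with no check. By contrast, &input[0] is
    // bounds checked, so on this empty slice it would throw a
    // RangeError rather than produce an out-of-bounds pointer:
    // auto p = &input[0]; // would fail the bounds check here
}
```

Note that the bounds check on &input[0] only proves the array is non-empty at that instant, which is Jonathan's point: it adds little real safety when the resulting pointer is handed to C code that reads past the first element anyway.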