Re: Why is `scope` planned for deprecation?

2014-11-22 Thread via Digitalmars-d

On Friday, 21 November 2014 at 16:07:20 UTC, Abdulhaq wrote:
Hear, hear. One of the problems with many introductions to 
OOP-paradigmed languages such as C++ is that, by having to spend 
a lot of time explaining how to implement inheritance, the 
novice reader comes to think that OOP is the 'right' approach to 
solving many problems, when in fact other techniques ('prefer 
composition over inheritance' springs to mind) are far more 
appropriate. This is one of the primary problems I find even in 
the code of more experienced programmers.


Yes, the problem is that you should not teach OOP as such, but 
object-oriented analysis and object-oriented modelling in a 
language-agnostic fashion… but you need to touch on both 
structured and object-oriented programming first to create 
motivation in the student for learning analysis and modelling…


The same goes for performance and complexity. You should only 
cover structured programming/abstraction in the first programming 
course, with no regard to performance, then touch on performance 
and algorithmic complexity in the second course, then do 
complexity proofs in an advanced course.


If a single course covers too much ground, students get confused, 
the learning goals become hazy and you lose half your audience.


Re: Why is `scope` planned for deprecation?

2014-11-21 Thread bearophile via Digitalmars-d

Walter Bright:

Except that isn't really quicksort. Monads are the workaround 
functional languages use to deal with things that need mutation.


Also take a look at the Clean language. It doesn't use monads and 
it's a very efficient functional language.


Bye,
bearophile


Re: Why is `scope` planned for deprecation?

2014-11-21 Thread Paulo Pinto via Digitalmars-d
On Friday, 21 November 2014 at 02:56:09 UTC, Andrei Alexandrescu 
wrote:

On 11/20/14 5:09 PM, Walter Bright wrote:

On 11/20/2014 3:10 PM, Ola Fosheim Grøstad
ola.fosheim.grostad+dl...@gmail.com wrote:
On Thursday, 20 November 2014 at 22:47:27 UTC, Walter Bright 
wrote:

On 11/20/2014 1:55 PM, deadalnix wrote:
All of this is beautiful until you try to implement a 
quicksort

in Haskell.


[…]


Monads!


I think Deadalnix meant that you cannot do in-place quicksort 
easily

in Haskell.


That's correct.


Non-mutating quicksort is easy, no need for monads:

quicksort [] = []
quicksort (p:xs) = (quicksort lesser) ++ [p] ++ (quicksort greater)
    where
        lesser  = filter (< p) xs
        greater = filter (>= p) xs

https://www.haskell.org/haskellwiki/Introduction#Quicksort_in_Haskell


Except that isn't really quicksort. Monads are the workaround 
functional

languages use to deal with things that need mutation.


As I like to say, this troika has inflicted a lot of damage on 
both FP and those beginning to learn it:


* Linear-space factorial
* Doubly exponential Fibonacci
* (Non)Quicksort

These losers appear with depressing frequency in FP 
introductory texts.



Andrei


Just like the OOP introductory books that still insist on talking 
about Cars and Vehicles, Managers and Employees, Animals and 
Bees, always using inheritance for code reuse.


Barely talking about is-a and has-a, and all the issues about 
fragile base classes.


--
Paulo


Re: Why is `scope` planned for deprecation?

2014-11-21 Thread via Digitalmars-d
On Friday, 21 November 2014 at 02:56:09 UTC, Andrei Alexandrescu 
wrote:
As I like to say, this troika has inflicted a lot of damage on 
both FP and those beginning to learn it:


* Linear-space factorial
* Doubly exponential Fibonacci
* (Non)Quicksort

These losers appear with depressing frequency in FP 
introductory texts.


Be careful with that attitude. It is an excellent strategy to 
start with the simple implementation and then move on to other 
techniques in later chapters or more advanced texts.


https://www.haskell.org/haskellwiki/The_Fibonacci_sequence
https://www.haskell.org/haskellwiki/Memoization

Some compilers are even capable of adding memoization/caching 
behind the scenes, which brings naive Fibonacci down to O(n) with 
no change in the source.
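
For illustration, here is a minimal C sketch (names hypothetical) contrasting 
the naive doubly-recursive Fibonacci, which performs an exponential number of 
calls, with a memoized version that computes each value once and therefore 
runs in O(n):

#include <stdint.h>
#include <stdio.h>

/* Naive doubly-recursive Fibonacci: exponential number of calls. */
static uint64_t fib_naive(unsigned n)
{
    return n < 2 ? n : fib_naive(n - 1) + fib_naive(n - 2);
}

/* Same recursion with a memo table: each value is computed once, so O(n). */
static uint64_t fib_memo(unsigned n, uint64_t *memo)
{
    if (n < 2)
        return n;
    if (memo[n] == 0)
        memo[n] = fib_memo(n - 1, memo) + fib_memo(n - 2, memo);
    return memo[n];
}

int main(void)
{
    uint64_t memo[94] = {0};   /* fib(93) is the largest value that fits in 64 bits */
    printf("%llu\n", (unsigned long long)fib_naive(30));
    printf("%llu\n", (unsigned long long)fib_memo(90, memo));
    return 0;
}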


Also, keep in mind that non-mutating quicksort has the same 
space/time complexity as the mutating variant. The non-mutating 
variant is no doubt faster on massively parallel hardware. You 
can do quicksort on GPUs.


The landscape of performance and complexity is not so simple 
these days.


Re: Why is `scope` planned for deprecation?

2014-11-21 Thread Andrei Alexandrescu via Digitalmars-d

On 11/21/14 12:17 AM, Paulo Pinto wrote:

On Friday, 21 November 2014 at 02:56:09 UTC, Andrei Alexandrescu wrote:

On 11/20/14 5:09 PM, Walter Bright wrote:

On 11/20/2014 3:10 PM, Ola Fosheim Grøstad
ola.fosheim.grostad+dl...@gmail.com wrote:

On Thursday, 20 November 2014 at 22:47:27 UTC, Walter Bright wrote:

On 11/20/2014 1:55 PM, deadalnix wrote:

All of this is beautiful until you try to implement a quicksort
in Haskell.


[…]


Monads!


I think Deadalnix meant that you cannot do in-place quicksort easily
in Haskell.


That's correct.


Non-mutating quicksort is easy, no need for monads:

quicksort [] = []
quicksort (p:xs) = (quicksort lesser) ++ [p] ++ (quicksort greater)
where
    lesser  = filter (< p) xs
    greater = filter (>= p) xs

https://www.haskell.org/haskellwiki/Introduction#Quicksort_in_Haskell


Except that isn't really quicksort. Monads are the workaround functional
languages use to deal with things that need mutation.


As I like to say, this troika has inflicted a lot of damage on both FP
and those beginning to learn it:

* Linear-space factorial
* Doubly exponential Fibonacci
* (Non)Quicksort

These losers appear with depressing frequency in FP introductory texts.


Andrei


Just like the OOP introductory books that still insist in talking about
Cars and Vehicles, Managers and Employees, Animals and Bees, always
using inheritance as code reuse.


The first public example found by Google ("oop introduction") lists a 
class Student as the first example of a class:


http://www.codeproject.com/Articles/22769/Introduction-to-Object-Oriented-Programming-Concep#Object

and IOException inheriting Exception as the first example of inheritance:

http://www.codeproject.com/Articles/22769/Introduction-to-Object-Oriented-Programming-Concep#Inheritance

First example for overriding is Complex.ToString:

http://www.codeproject.com/Articles/22769/Introduction-to-Object-Oriented-Programming-Concep#Overloading


Barely talking about is-a and has-a, and all the issues about fragile
base classes.


Even to the extent those old texts have persisted, they are only poor 
style. In contrast, the three FP examples I mentioned are computationally 
bankrupt. There is really no excuse for teaching them.



Andrei



Re: Why is `scope` planned for deprecation?

2014-11-21 Thread Abdulhaq via Digitalmars-d




Just like the OOP introductory books that still insist in 
talking about Cars and Vehicles, Managers and Employees, 
Animals and Bees, always using inheritance as code reuse.


Barely talking about is-a and has-a, and all the issues about 
fragile base classes.


--
Paulo


Hear, hear. One of the problems with many introductions to 
OOP-paradigmed languages such as C++ is that, by having to spend a 
lot of time explaining how to implement inheritance, the novice 
reader comes to think that OOP is the 'right' approach to solving 
many problems, when in fact other techniques ('prefer composition 
over inheritance' springs to mind) are far more appropriate. This is 
one of the primary problems I find even in the code of more 
experienced programmers.


Re: Why is `scope` planned for deprecation?

2014-11-20 Thread Walter Bright via Digitalmars-d
On 11/18/2014 5:35 PM, Alo Miehsof Datsørg 
ola.fosheim.gros...@sucks.goat.ass wrote:

Argumentative ?!! More like a fucking gaping fucking asshole. His
posts are the blight of this group.


Rude posts are not welcome here.


Re: Why is `scope` planned for deprecation?

2014-11-20 Thread uri via Digitalmars-d
On Wednesday, 19 November 2014 at 01:35:19 UTC, Alo Miehsof 
Datsørg wrote:
On Tuesday, 18 November 2014 at 23:48:27 UTC, Walter Bright 
wrote:
On 11/18/2014 1:23 PM, Ola Fosheim Grøstad 
ola.fosheim.grostad+dl...@gmail.com wrote:
I am arguing against the position that it was a design 
mistake to keep the
semantic model simple and with few presumptions. On the 
contrary, it was the
design goal. Another goal for a language like C is ease of 
implementation so

that you can easily port it to new hardware.


The proposals I made do not change that in any way, and if K&R 
had designed C without those mistakes, it would not have made C 
more complex in the slightest.



VLAs have been available in gcc for a long time. They are not 
useless, I've used

them from time to time.


I know you're simply being argumentative when you defend VLAs, 
a complex and useless feature, and denigrate simple ptr/length 
pairs as complicated.


Argumentative ?!! More like a fucking gaping fucking asshole. 
His

posts are the blight of this group.


Wow, that's uncalled for.

I don't always agree with Ola, but his posts are rarely uninformed 
and often backed up with actual code examples or links supporting 
his arguments. They generally lead to very interesting 
discussions on the forum.


Cheers,
uri


Re: Why is `scope` planned for deprecation?

2014-11-20 Thread Max Samukha via Digitalmars-d

On Sunday, 16 November 2014 at 03:27:54 UTC, Walter Bright wrote:

On 11/14/2014 4:32 PM, deadalnix wrote:

To quote the guy from the PL for video games video series, an 85%
solution often is preferable.


Spoken like a true engineer!


85% often means being at the bottom of the uncanny valley. 65% or 
95% is preferable.


Re: Why is `scope` planned for deprecation?

2014-11-20 Thread deadalnix via Digitalmars-d

On Thursday, 20 November 2014 at 10:24:30 UTC, Max Samukha wrote:
On Sunday, 16 November 2014 at 03:27:54 UTC, Walter Bright 
wrote:

On 11/14/2014 4:32 PM, deadalnix wrote:
To quote the guy from the PL for video games video series, an 85% 
solution often is preferable.


Spoken like a true engineer!


85% often means being at the bottom of the uncanny valley. 65% 
or 95% is preferable.


85% is an image rather than an exact number. The point being,
every construct is good at some things and bad at others. Making
them capable of doing everything comes at a great complexity cost,
so it is preferable to aim for a solution that copes well with
most use cases, and provide alternative solutions for the
horrible cases.

Many languages make the mistake of thinking something is the holy
grail, be it OOP, functional programming or linear types. I think
it is a better engineering solution to provide decent support for
all of these; doing so, we don't need them to handle 100% of the
cases, as we have other language constructs/paradigms that suit
the difficult cases better anyway.


Re: Why is `scope` planned for deprecation?

2014-11-20 Thread via Digitalmars-d

On Thursday, 20 November 2014 at 20:15:03 UTC, deadalnix wrote:
Many languages make the mistake of thinking something is the holy
grail, be it OOP, functional programming or linear types. I think
it is a better engineering solution to provide decent support for
all of these; doing so, we don't need them to handle 100% of the
cases, as we have other language constructs/paradigms that suit
the difficult cases better anyway.


FWIW, among language designers it is usually considered a 
desirable trait to have orthogonality between constructs and let 
them be combinable in expressive ways. This reduces the burden on 
the user, who then only has to truly understand the key concepts 
to build a clear mental image of the semantic model. Then you can 
figure out ways to add syntactic sugar if needed.


Having a smaller set of basic constructs makes it easier to prove 
correctness, which in turn is important for optimization (which 
depends on the ability to prove equivalence over the pre/post 
semantics). It makes it easier to prove properties such as 
@(un)safe. It also makes it easier to later extend the language.


Just think about all the areas fibers in D affect. It affects 
garbage collection and memory handling. It affects the ability to 
do deep semantic analysis. It affects implementation of fast 
multi-threaded ADTs. One innocent feature can have a great impact.


Providing a 70% solution like Go is fine, as they have defined a 
narrow domain for the language (servers), so as a programmer you 
don't hit the 30% they left out.


But D has not defined a narrow use domain, so as a designer you 
cannot make up a good rationale for which 15-30% to leave out. 
Design is always related to a specific use scenario.


(I like the uncanny valley metaphor, had not thought about using 
it outside 3D. Cool association!)


Re: Why is `scope` planned for deprecation?

2014-11-20 Thread deadalnix via Digitalmars-d

On Thursday, 20 November 2014 at 21:26:18 UTC, Ola Fosheim
Grøstad wrote:

On Thursday, 20 November 2014 at 20:15:03 UTC, deadalnix wrote:
Many languages make the mistake of thinking something is the holy
grail, be it OOP, functional programming or linear types. I think
it is a better engineering solution to provide decent support for
all of these; doing so, we don't need them to handle 100% of the
cases, as we have other language constructs/paradigms that suit
the difficult cases better anyway.


FWIW, among language designers it is usually considered a 
desirable trait to have orthogonality between constructs and 
let them be combinable in expressive ways. This reduce the 
burden on the user who then only have to truly understand the 
key concepts to build a clear mental image of the semantic 
model. Then you can figure out ways to add syntactical sugar if 
needed.


Having a smaller set of basic constructs makes it easier to 
prove correctness, which turn is important for optimization 
(which depends on the ability to prove equivalence over the 
pre/post semantics). It makes it easier to prove properties 
such as @(un)safe. It also makes it easier to later extend 
the language.


Just think about all the areas fibers in D affect. It affects 
garbage collection and memory handling. It affects the ability 
to do deep semantic analysis. It affects implementation of fast 
multi-threaded ADTs. One innocent feature can have a great 
impact.


Providing a 70% solution like Go is fine as they have defined a 
narrow domain for the language, servers, thus as a programmer 
you don't hit the 30% they left out.


But D has not defined narrow use domain, so as a designer you 
cannot make up a good rationale for which 15-30% to leave out. 
Design is always related to a specific use scenario.


(I like the uncanny valley metaphor, had not thought about 
using it outside 3D. Cool association!)


All of this is beautiful until you try to implement a quicksort
in Haskell.

It is not that functional programming is bad (I actually like it
a lot), but there are problems where it is simply the wrong tool.
Once you acknowledge that, you have two roads forward:
  - You create bizarre features to implement quicksort in a
functional way. The concepts become more complex, but some expert
gurus will secure their jobs.
  - You keep your functional features as they are, but allow for
other styles, which cope better with quicksort.

Option 2 is the practical one. There is no point in creating an
awkward hammer that can also drive screws if I can have a hammer
and a screwdriver.

Obviously, this has a major drawback: you cannot tell everybody
that your favorite style is the one true thing that everybody
must use. That is a real bummer for religious zealots, but actual
engineers understand that this is a feature, not a bug.


Re: Why is `scope` planned for deprecation?

2014-11-20 Thread via Digitalmars-d

On Thursday, 20 November 2014 at 21:55:16 UTC, deadalnix wrote:

All of this is beautiful until you try to implement a quicksort
in Haskell.

It is not that functional programming is bad (I actually like it
a lot), but there are problems where it is simply the wrong tool.


Sure, I am not arguing in favour of functional programming. But 
it is possible to define a tight core language (or VM) with the 
understanding that all other high level constructs have to be 
expressed within that core language in the compiler internals. 
Then you can do all the analysis on that small critical subset of 
constructs. With this approach you can create/modify all kinds of 
convenience features without affecting the core semantics that 
keep it sound and clean.


Take for instance the concept of isolates, which I believe we 
both think can be useful. If the concept of an 
isolated-group-of-objects it is taken to the abstract level and 
married to a simple core language (or VM) in a sound way, then 
the more complicated stuff can hopefully be built on top of it. 
So you get a bottom-up approach to the language that meets the 
end user. Rather than what happens now where feature requests 
seem to be piled on top-down, they ought to be digested into 
something that can grow bottom-up. I believe this is what you try 
to do with your GC proposal.



Obviously, this has a major drawback: you cannot tell everybody
that your favorite style is the one true thing that everybody
must use. That is a real bummer for religious zealots, but actual
engineers understand that this is a feature, not a bug.


Well, I think this holds:

1. Good language creation goes bottom-up.
2. Good language evaluation goes top-down.
3. Good language design is a circular process between 1 and 2.

In essence having a tight engine is important (the bottom), but 
you also need to understand the use context and how it will be 
used (the top).


In D the bottom part is not so clear and could use a cleanup, 
but then the community would have to accept that the effects of 
that cleanup propagate to the top.


Without defining some use contexts for the language I think the 
debates get very long, because without data you cannot do 
analysis, and then you end up with "feels right to me", and that 
is not engineering; it is art, or what you are used to. And there 
are more good engineers in the world than good artists…


If one can define a single use scenario that is demanding enough 
to ensure that an evaluation against that scenario will also work 
for the other, less demanding scenarios, then maybe some more 
rational discussion about the direction of D as a language could 
be possible, and you could leave out, say, the 10% that are less 
useful. When everybody argues from their own line of work and 
habits… then they talk past each other.


Re: Why is `scope` planned for deprecation?

2014-11-20 Thread Walter Bright via Digitalmars-d

On 11/20/2014 1:55 PM, deadalnix wrote:

All of this is beautiful until you try to implement a quicksort
in Haskell.

It is not that functional programming is bad (I actually like it
a lot), but there are problems where it is simply the wrong tool.
Once you acknowledge that, you have two roads forward:
   - You create bizarre features to implement quicksort in a
functional way. The concepts become more complex, but some expert
gurus will secure their jobs.
   - You keep your functional features as they are, but allow for
other styles, which cope better with quicksort.

Option 2 is the practical one. There is no point in creating an
awkward hammer that can also drive screws if I can have a hammer
and a screwdriver.

Obviously, this has a major drawback: you cannot tell everybody
that your favorite style is the one true thing that everybody
must use. That is a real bummer for religious zealots, but actual
engineers understand that this is a feature, not a bug.


Monads!


Re: Why is `scope` planned for deprecation?

2014-11-20 Thread via Digitalmars-d
On Thursday, 20 November 2014 at 22:47:27 UTC, Walter Bright 
wrote:

On 11/20/2014 1:55 PM, deadalnix wrote:

All of this is beautiful until you try to implement a quicksort
in Haskell.


[…]


Monads!


I think Deadalnix meant that you cannot do in-place quicksort 
easily in Haskell. Non-mutating quicksort is easy, no need for 
monads:


quicksort [] = []
quicksort (p:xs) = (quicksort lesser) ++ [p] ++ (quicksort greater)
    where
        lesser  = filter (< p) xs
        greater = filter (>= p) xs

https://www.haskell.org/haskellwiki/Introduction#Quicksort_in_Haskell


Re: Why is `scope` planned for deprecation?

2014-11-20 Thread deadalnix via Digitalmars-d

You are a goalpost-shifting champion, aren't you?


Re: Why is `scope` planned for deprecation?

2014-11-20 Thread via Digitalmars-d

On Thursday, 20 November 2014 at 23:22:40 UTC, deadalnix wrote:

You are a goalpost-shifting champion, aren't you?


Nope, it follows your line of argument, but the 
screwdriver/hammer metaphor is not a good one.


You can implement your hammer and your screwdriver at the top if 
you have lower-level screwdriver/hammer components at the 
bottom. That is good.


Piling together hammers and screwdrivers and hoping that nobody 
is going to miss the remaining 35% involving glue and tape… That 
is bad.


Re: Why is `scope` planned for deprecation?

2014-11-20 Thread Walter Bright via Digitalmars-d
On 11/20/2014 3:10 PM, Ola Fosheim Grøstad 
ola.fosheim.grostad+dl...@gmail.com wrote:

On Thursday, 20 November 2014 at 22:47:27 UTC, Walter Bright wrote:

On 11/20/2014 1:55 PM, deadalnix wrote:

All of this is beautiful until you try to implement a quicksort
in Haskell.


[…]


Monads!


I think Deadalnix meant that you cannot do in-place quicksort easily in Haskell.


That's correct.


Non-mutating quicksort is easy, no need for monads:

quicksort [] = []
quicksort (p:xs) = (quicksort lesser) ++ [p] ++ (quicksort greater)
 where
 lesser  = filter (< p) xs
 greater = filter (>= p) xs

https://www.haskell.org/haskellwiki/Introduction#Quicksort_in_Haskell


Except that isn't really quicksort. Monads are the workaround functional 
languages use to deal with things that need mutation.




Re: Why is `scope` planned for deprecation?

2014-11-20 Thread via Digitalmars-d

On Friday, 21 November 2014 at 01:09:27 UTC, Walter Bright wrote:
Except that isn't really quicksort. Monads are the workaround 
functional languages use to deal with things that need mutation.


Yes, at least in Haskell, but I find monads in Haskell harder to 
read than regular imperative code. You can apparently cheat a 
little using libraries, this doesn't look too bad (from slashdot):


import qualified Data.Vector.Generic as V
import qualified Data.Vector.Generic.Mutable as M

qsort :: (V.Vector v a, Ord a) => v a -> v a
qsort = V.modify go where
    go xs | M.length xs < 2 = return ()
          | otherwise = do
              p <- M.read xs (M.length xs `div` 2)
              j <- M.unstablePartition (< p) xs
              let (l, pr) = M.splitAt j xs
              k <- M.unstablePartition (== p) pr
              go l; go $ M.drop k pr


http://stackoverflow.com/questions/7717691/why-is-the-minimalist-example-haskell-quicksort-not-a-true-quicksort


Re: Why is `scope` planned for deprecation?

2014-11-20 Thread Walter Bright via Digitalmars-d
On 11/20/2014 5:27 PM, Ola Fosheim Grøstad 
ola.fosheim.grostad+dl...@gmail.com wrote:

On Friday, 21 November 2014 at 01:09:27 UTC, Walter Bright wrote:

Except that isn't really quicksort. Monads are the workaround functional
languages use to deal with things that need mutation.


Yes, at least in Haskell, but I find monads in Haskell harder to read than
regular imperative code.


Exactly my point (and I presume deadalnix's, too).



Re: Why is `scope` planned for deprecation?

2014-11-20 Thread Andrei Alexandrescu via Digitalmars-d

On 11/20/14 5:09 PM, Walter Bright wrote:

On 11/20/2014 3:10 PM, Ola Fosheim Grøstad
ola.fosheim.grostad+dl...@gmail.com wrote:

On Thursday, 20 November 2014 at 22:47:27 UTC, Walter Bright wrote:

On 11/20/2014 1:55 PM, deadalnix wrote:

All of this is beautiful until you try to implement a quicksort
in Haskell.


[…]


Monads!


I think Deadalnix meant that you cannot do in-place quicksort easily
in Haskell.


That's correct.


Non-mutating quicksort is easy, no need for monads:

quicksort [] = []
quicksort (p:xs) = (quicksort lesser) ++ [p] ++ (quicksort greater)
 where
 lesser  = filter (< p) xs
 greater = filter (>= p) xs

https://www.haskell.org/haskellwiki/Introduction#Quicksort_in_Haskell


Except that isn't really quicksort. Monads are the workaround functional
languages use to deal with things that need mutation.


As I like to say, this troika has inflicted a lot of damage on both FP 
and those beginning to learn it:


* Linear-space factorial
* Doubly exponential Fibonacci
* (Non)Quicksort

These losers appear with depressing frequency in FP introductory texts.


Andrei



Re: Why is `scope` planned for deprecation?

2014-11-19 Thread via Digitalmars-d
On Wednesday, 19 November 2014 at 00:04:50 UTC, Walter Bright 
wrote:
I know you're simply being argumentative when you defend VLAs, 
a complex and
useless feature, and denigrate simple ptr/length pairs as 
complicated.


Wait, are we discussing the design goals of the original C or of 
the evolved C? VLAs did not fit the original C either, but in 
the Google discussion you find people who find VLAs very useful. 
It looks a lot better than alloca. The reason it was made 
optional is to make embedded-C compilers easier to write, I think.


But hey, it's simpler, faster, less code, less bug prone, 
easier to understand and uses less memory to:


1. strlen
2. allocate

…

Not faster, but if speed is no concern, sure. It seldom is when 
it comes to filenames.


I know you said "just allocate a large fixed size buffer", but 
I hope you realize that such practice is the root cause of most 
buffer overflow bugs,


strcat() should never have been created, but strlcat is safe.
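
A rough sketch of the difference (assuming a platform that provides strlcat, 
e.g. the BSDs, or libbsd on Linux): the size-bounded call is told the full 
destination size and truncates instead of overrunning the buffer.

#include <stdio.h>
#include <string.h>
/* Assumption: strlcat() comes from libbsd's <bsd/string.h> on Linux;
 * on the BSDs it is already declared in <string.h>. */
#include <bsd/string.h>

int main(void)
{
    char path[16] = "/tmp/";

    /* strcat() trusts the caller completely: a long suffix silently
     * overruns path[].  (Commented out: undefined behaviour.) */
    /* strcat(path, "a-much-too-long-file-name"); */

    /* strlcat() knows the full buffer size and truncates instead. */
    if (strlcat(path, "a-much-too-long-file-name", sizeof path) >= sizeof path)
        fprintf(stderr, "warning: path truncated\n");

    printf("%s\n", path);
    return 0;
}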

Now, I know that you'll never concede destruction, after all, 
this is the internet, but give it up :-)


I always concede destruction :-)


Re: Why is `scope` planned for deprecation?

2014-11-19 Thread via Digitalmars-d
On Wednesday, 19 November 2014 at 01:35:19 UTC, Alo Miehsof 
Datsørg wrote:
Argumentative ?!! More like a fucking gaping fucking asshole. 
His

posts are the blight of this group.


If you are going ad hominem, please post under your own name. I 
never go ad hominem, and therefore your response will achieve the 
exact opposite of what you are trying to achieve, so that ad 
hominem does not become an acceptable line of action.


Re: Why is `scope` planned for deprecation?

2014-11-19 Thread via Digitalmars-d

On Wednesday, 19 November 2014 at 01:13:04 UTC, deadalnix wrote:

There are good answers to most of this, but most importantly, this
does not contain anything actionable and is completely off topic
(reminder, the topic of the thread is SCOPE).


The topic is whether scope is planned for deprecation. That has 
been answered.


The thread has been off topic for a long time, so please don't 
use moderation for silencing an opposing viewpoint from a single 
party. Unless you are a moderator, but then you should also point 
to the rules of the forum.


If you are a moderator, then you should first and foremost 
moderate people who go ad hominem without providing an identity.


Self-appointed moderation will always lead to a very bad 
situation. If the forums need moderation, appoint a moderator.




Reader's time is precious, please don't waste it.


Then don't read. The topic of this thread was never actionable. 
Sounds like you want a pure developers forum.


If you want a community you need a lower-threshold general forum. 
D is the general forum until it has been defined not to be so, 
but you do need a low-threshold forum.


(I never give in to online bullies in unmoderated media… That 
gives them a foothold.)


Re: Why is `scope` planned for deprecation?

2014-11-18 Thread Paulo Pinto via Digitalmars-d

On Tuesday, 18 November 2014 at 02:35:41 UTC, Walter Bright wrote:
On 11/17/2014 3:15 PM, Ola Fosheim Grøstad 
ola.fosheim.grostad+dl...@gmail.com wrote:
Ok, but I would rather say it like this: the language C 
doesn't really provide
strings, it only provides literals in a particular format. So 
the literal-format
is a trade-off between having something generic and simple and 
having something
more complex and possibly limited (having a 255-char limit is 
not good enough in the long run).


The combination of the inescapable array-to-ptr decay when 
calling a function, coupled with the Standard library which is 
part of the language that takes char* as strings, means that 
for all practical purposes C does provide strings, and pretty 
much forces it on the programmer.



I think there is a certain kind of beauty to the minimalistic 
approach taken
with C (well, at least after ANSI-C came about in the late 
80s). I like the

language better than the libraries…


C is a brilliant language. That doesn't mean it hasn't made 
serious mistakes in its design. The array decay and 0 strings 
have proven to be very costly to programmers over the decades.


Heartbleed is a nice example.

The amount of money spent in developer time, delivering software 
updates to customers and buying new hardware with firmware that 
cannot be replaced.


This is just one case; the CVE list gets updated every day, and 
90% of the issues are the usual C suspects regarding pointer 
misuse and out-of-bounds access.


Anyone writing C code should be following practices like 
https://wiki.debian.org/Hardening


--
Paulo


Re: Why is `scope` planned for deprecation?

2014-11-18 Thread via Digitalmars-d
On Tuesday, 18 November 2014 at 04:58:43 UTC, Anonymous Coward 
wrote:

Stop wasting time with the mouth breather.


Please write under your full name.


Re: Why is `scope` planned for deprecation?

2014-11-18 Thread via Digitalmars-d

On Tuesday, 18 November 2014 at 02:35:41 UTC, Walter Bright wrote:
C is a brilliant language. That doesn't mean it hasn't made 
serious mistakes in its design. The array decay and 0 strings 
have proven to be very costly to programmers over the decades.


I'd rather say that it is the industry that has misappropriated 
C, which in my view basically was "typed portable assembly" with 
very few built-in presumptions by design. This is important 
when getting control over layout, and this transparency is a 
quality that only C gives me. BCPL might be considered to have 
more presumptions (such as string length), being a minimal 
bootstrapping subset of CPL.


You always had the ability in C to implement arrays as a 
variable-sized struct with a length and a trailing data section, 
so I'd say that C provided type-safe variable-length arrays. Many 
people don't use it. Many people don't know how to use it. OK, 
but then they don't understand that they are programming in a 
low-level language and are responsible for creating their own 
environment. I think C's standard lib mistakenly created an 
illusion of high-level programming that the language only 
partially supported.


Adding the ability to pass structs by value as parameters was 
probably not worth the implementation cost at the time… 
Having a magic struct/tuple that transfers a length or end pointer 
along with the head pointer does not fit the C design. If added, 
it should have been done as a struct, and to make that work you 
would have to add operator overloading. There's an avalanche 
effect of features and additional language design issues there.


I think K&R deserve credit for being able to say no and stay 
minimal, and I think the Go team deserves the same credit. As 
you've experienced with D, saying no is hard because there are 
often good arguments for features being useful, and it is 
difficult to say in advance with certainty what kind of avalanche 
effect adding features will have (in terms of semantics, special 
casing, new needs for additional support/features, and time to 
complete implementation/debugging). So saying no until practice 
shows that a feature is sorely missed is a sign of good language 
design practice.


The industry wanted portability and high speed and insisted on 
moving as a flock after C and BLINDLY after C++. Seriously, the 
media frenzy around C++ was hysterical despite C++ being a bad 
design from the start. The C++ media noise was worse than with 
Java, IIRC. Media are incredibly shallow when they are trying to 
sell mags/books based on "the next big thing", and they can 
accelerate adoption beyond merits, which both C++ and Java are 
good examples of.


There were alternatives such as Turbo Pascal, Modula-2/3, Simula, 
Beta, ML, Eiffel, Delphi and many more. Yet programmers thought 
C was cool because it was "portable assembly" and "industry 
standard" and "fast" and a "safe bet". So they were happy with it, 
because the C compiler emitted fast code. And fast was more 
important to them than safe. Well, they got what they deserved, 
right?


Not adding additional features is not a design mistake if you try 
hard to stay minimal and don't claim to support high level 
programming. The mistake is in using a tool as if it supports 
something it does not.


You might be right that K&R set the bar too high for adding extra 
features. Yet others might be right that D has been too willing 
to add features. As you know, the perfect balance is difficult to 
find and it is dependent on the use context, so it materializes 
after the fact (after implementation). And C's use context has 
expanded way beyond the original one, where people were not 
afraid to write assembly.


(But the incomprehensible typing notation for function pointers 
was a design mistake since that was a feature of the language.)


Re: Why is `scope` planned for deprecation?

2014-11-18 Thread Paulo Pinto via Digitalmars-d
On Tuesday, 18 November 2014 at 11:15:28 UTC, Ola Fosheim Grøstad 
wrote:
On Tuesday, 18 November 2014 at 02:35:41 UTC, Walter Bright 
wrote:
C is a brilliant language. That doesn't mean it hasn't made 
serious mistakes in its design. The array decay and 0 strings 
have proven to be very costly to programmers over the decades.


I'd rather say that it is the industry that has misappropriated 
C, which in my view basically was typed portable assembly 
with very little builtin presumptions by design.


Lint was created in 1979 when it was already clear most AT&T 
developers weren't writing correct C code!




I think K&R deserve credit for being able to say no and stay 
minimal, I think the Go team deserves the same credit.


Of course, two of them are from the same team.

The industry wanted portability and high speed and insisted 
moving as a flock after C and BLINDLY after C++. Seriously, the 
media frenzy around C++ was hysterical despite C++ being a bad 
design from the start. The C++ media noise was worse than with 
Java IIRC. Media are incredibly shallow when they are trying to 
sell mags/books based on the next big thing and they can 
accelerate adoption beyond merits. Which both C++ and Java are 
two good examples of.


There were alternatives such as Turbo Pascal, Modula-2/3, 
Simula, Beta, ML, Eiffel, Delphi and many more. Yet, 
programmers thought C was cool because it was "portable 
assembly" and "industry standard" and "fast" and a "safe bet".


This was a consequence of UNIX spreading into the enterprise: 
just as we have to endure JavaScript to target the browser, we 
were forced to code in C to target UNIX.

Other OSes just followed along, as we started wanting to port 
those big-iron utilities to smaller computers.

If UNIX had been written in XPTO-LALA, we would all be coding in 
XPTO-LALA today.



--
Paulo


Re: Why is `scope` planned for deprecation?

2014-11-18 Thread via Digitalmars-d

On Tuesday, 18 November 2014 at 08:28:19 UTC, Paulo  Pinto wrote:
This is just one case, the CVE List gets updated every day and 
90% of the issues are the usual C suspects regarding pointer 
misuse and out of bounds.


Sure, but these are not strictly language issues, since the same 
developers would turn off bounds-checking at the first 
opportunity anyway!


Professionalism does not involve blaming the tool; it involves 
picking the right tools and process for the task. Unfortunately 
the IT industry has over time suffered from a lack of formal 
education and immature markets. Software is considered to work 
when it crashes only once every 24 hours; we would not accept 
that from any other utility.


I've never heard anyone in academia claim that C is anything more 
than a small step up from assembler (i.e. low level), so why 
allow intermediate-skilled programmers to write C code if, for 
the same application, you would not allow an excellent programmer 
to write the same program in assembly (about the same risk of 
having a crash)? People get what they deserve.


Never blame the tool for bad management. You get to pick the tool 
and the process, right? Neither the tool nor testing will ensure 
correct behaviour on its own. You have many factors that need to 
play together (mindset, process and the tool set).


If you want a compiler that works, you're probably better off 
writing it in ML than in C, but people implement it in C. Why? 
Because they FEEL like it… It is not rational. It is emotional.


Re: Why is `scope` planned for deprecation?

2014-11-18 Thread via Digitalmars-d

On Tuesday, 18 November 2014 at 12:02:01 UTC, Paulo  Pinto wrote:
On Tuesday, 18 November 2014 at 11:15:28 UTC, Ola Fosheim 
Grøstad wrote:
I'd rather say that it is the industry that has 
misappropriated C, which in my view basically was typed 
portable assembly with very little builtin presumptions by 
design.


Lint was created in 1979 when it was already clear most AT&T 
developers weren't writing correct C code!


Sure, but most operating system vendors considered it a strategic 
move to ensure availability of high level languages on their 
mainframes. E.g. Univac provided Algol and gave a significant 
rebate to the developers of Simula on the purchase of a Univac to 
ensure that Simula would be available for high level programming.


There were alternatives such as Turbo Pascal, Modula-2/3, 
Simula, Beta, ML, Eiffel, Delphi and many more. Yet, 
programmers thought C was cool because it was "portable 
assembly" and "industry standard" and "fast" and a "safe bet".


This was a consequence of UNIX spreading into the enterprise, 
like we
have to endure JavaScript to target the browser, we were forced 
to

code in C to target UNIX.


Nobody was forced to write code in C to target anything; it was 
a choice. And a choice that grew out of a focus on performance 
and the fact that people still dropped down to write machine 
language quite frequently. Mentality matters.


JavaScript is different, since it is the exposed VM in the 
browser, but even there you don't have to write in JavaScript. 
You can write in a language that compiles to JavaScript.


Re: Why is `scope` planned for deprecation?

2014-11-18 Thread Paulo Pinto via Digitalmars-d
On Tuesday, 18 November 2014 at 13:50:59 UTC, Ola Fosheim Grøstad 
wrote:
On Tuesday, 18 November 2014 at 12:02:01 UTC, Paulo  Pinto 
wrote:



Nobody were forced to write code in C to target anything, it 
was a choice. And a choice that grew out of a focus on 
performance and the fact that people still dropped down to 
write machine language quit frequently. Mentality matters.


Javascript is different, since it is the exposed VM in the 
browser, but even there you don't have to write in Javascript. 
You can write in a language that compiles to javascript.


Since when do developers use a different systems programming 
language than the one sold by the OS vendor?


Who has the pleasure of wasting work hours writing FFI wrappers 
around SDK tools?


All successful systems programming languages, even if only for a 
few years, were tied to a specific OS.


--
Paulo


Re: Why is `scope` planned for deprecation?

2014-11-18 Thread via Digitalmars-d

On Tuesday, 18 November 2014 at 14:56:42 UTC, Paulo  Pinto wrote:
Since when do developers use a different systems programming 
language than the one sold by the OS vendor?


Who has the pleasure to waste work hours writing FFI wrappers 
around SDK tools?


All successful systems programming languages, even if only for 
a few years, were tied to a specific OS.


Depends on what you mean by system programming. I posit that most 
programs that have been written in C are primarily application 
level programs. Meaning that you could factor out the C component 
as a tiny unit and write the rest in another language… Most high 
level languages provide integration with C. These things are 
entirely cultural.


In the late 80s you could do the same stuff in Turbo Pascal as in 
C, and integrate with asm with no problem. Lots of decent 
software for MSDOS was written in TP, such as BBS server software 
dealing with many connections.


On regular micros you didn't have an MMU, so there was actually a 
great penalty for using an unsafe language even during 
development: the OS would reboot (or you would get the famous 
"guru meditation" on the Amiga). That sucked.


Re: Why is `scope` planned for deprecation?

2014-11-18 Thread via Digitalmars-d

On Sunday, 16 November 2014 at 21:54:40 UTC, Walter Bright wrote:
Besides, C was designed for the PDP-11, which had no such 
instructions.


BTW, this is not entirely correct. It had autoincrement on 
registers. This is the example given on Wikipedia:


        MOV #MSG,R1
1$:     MOVB (R1)+,R0
        BEQ DONE
        .TTYOUT
        BR 1$
DONE:   .EXIT

MSG:    .ASCIZ /Hello, world!/

The full example:

http://en.wikipedia.org/wiki/MACRO-11

So the print loop is 4 instructions (I assume .TTYOUT is an I/O 
instruction); with a length you would have at least 5 
instructions and use an extra register, as you would need an 
additional compare.


(As for concat, I almost never use that. In systems programming 
you mostly append to buffers and flush when the buffer is full. 
You don't need a length for that. Even in JavaScript and Python I 
avoid regular concat due to the inefficiency of concat versus a 
buffered join.)
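
A minimal sketch of that append-and-flush pattern (hypothetical names, stdout 
standing in for the real sink):

#include <stdio.h>
#include <string.h>

enum { CAP = 4096 };
static char   outbuf[CAP];
static size_t fill;                   /* how much of outbuf is in use */

static void flush_buf(void)
{
    fwrite(outbuf, 1, fill, stdout);  /* stand-in for the real output sink */
    fill = 0;
}

static void append(const char *p, size_t n)
{
    while (n > 0) {
        size_t room = CAP - fill;
        size_t take = n < room ? n : room;
        memcpy(outbuf + fill, p, take);
        fill += take;
        p    += take;
        n    -= take;
        if (fill == CAP)
            flush_buf();              /* flush when the buffer is full */
    }
}

int main(void)
{
    append("hello, ", 7);
    append("world\n", 6);
    flush_buf();
    return 0;
}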


Re: Why is `scope` planned for deprecation?

2014-11-18 Thread Walter Bright via Digitalmars-d
On 11/18/2014 4:18 AM, Ola Fosheim Grøstad 
ola.fosheim.grostad+dl...@gmail.com wrote:

Never blame the tool for bad management.


To bring up the aviation industry again, they long ago recognized that "blame 
the pilot" and "blame the mechanics" is not how safe airplanes are made. They 
are made, in part, by fixing the tools so mistakes cannot happen, as even the 
best humans keep making mistakes.


C is a mistake-prone tool, and suggesting that programmers get better educated 
about how to use it does not work.


As I showed, a great deal of C's propensity for buffer overflows can be 
eliminated by a TRIVIAL change to the language, one that is fully backwards 
compatible, and takes NOTHING away from C's power. I've brought this up in 
conference presentations more than once, and the blank silence I get from C 
programmers just baffles me.


Blaming the tools is often appropriate.
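
Purely as an illustration of the idea behind that change (not its proposed 
syntax, which the Dr. Dobbs article linked further down describes), the 
ptr/length pair can already be spelled out by hand in today's C:

#include <stdio.h>
#include <string.h>

/* Illustrative only: a slice that carries its length along with the pointer,
 * instead of passing a bare char* and hoping the callee guesses the size. */
struct slice {
    const char *ptr;
    size_t      len;
};

static struct slice slice_of(const char *s)
{
    struct slice v = { s, strlen(s) };
    return v;
}

/* The callee can bounds-check because the length travels with the pointer. */
static void print_at_most(struct slice s, size_t limit)
{
    size_t n = s.len < limit ? s.len : limit;
    fwrite(s.ptr, 1, n, stdout);
    putchar('\n');
}

int main(void)
{
    print_at_most(slice_of("no overflow possible here"), 10);
    return 0;
}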


Re: Why is `scope` planned for deprecation?

2014-11-18 Thread Walter Bright via Digitalmars-d
On 11/18/2014 3:15 AM, Ola Fosheim Grøstad 
ola.fosheim.grostad+dl...@gmail.com wrote:

On Tuesday, 18 November 2014 at 02:35:41 UTC, Walter Bright wrote:

C is a brilliant language. That doesn't mean it hasn't made serious mistakes
in its design. The array decay and 0 strings have proven to be very costly to
programmers over the decades.


I'd rather say that it is the industry that has misappropriated C, which in my
view basically was typed portable assembly with very little builtin
presumptions by design. This is important when getting control over layout, and
this transparency is a quality that only C gives me. BCPL might be considered to
have more presumptions (such as string length), being a minimal bootstrapping
subset of CPL.

You always had the ability in C to implement arrays as a variable sized struct
with a length and a trailing data section, so I'd say that the C provided type
safe variable length arrays. Many people don't use it. Many people don't know
how to use it. Ok, but then they don't understand that they are programming in a
low level language and are responsible for creating their own environment. I
think C's standard lib mistakenly created an illusion of high level programming
that the language only partially supported.

Adding the ability to transfer structs by value as a parameter was probably not
worth the implementation cost at the time… Having a magic struct/tuple that
transfer length or end pointer with the head pointer does not fit the C design.
If added it should have been done as a struct and to make that work you would
have to add operator overloading. There's an avalanche effect of features and
additional language design issues there.

I think K&R deserve credit for being able to say no and stay minimal, I think
the Go team deserves the same credit. As you've experienced with D, saying no is
hard because there are often good arguments for features being useful and
difficult to say in advance with certainty what kind of avalanche effect adding
features have (in terms of semantics, special casing and new needs for
additional support/features, time to complete implementation/debugging). So
saying no until practice shows that a feature is sorely missed is a sign of good
language design practice.

The industry wanted portability and high speed and insisted moving as a flock
after C and BLINDLY after C++. Seriously, the media frenzy around C++ was
hysterical despite C++ being a bad design from the start. The C++ media noise
was worse than with Java IIRC. Media are incredibly shallow when they are trying
to sell mags/books based on the next big thing and they can accelerate
adoption beyond merits. Which both C++ and Java are two good examples of.

There were alternatives such as Turbo Pascal, Modula-2/3, Simula, Beta, ML,
Eiffel, Delphi and many more. Yet, programmers thought C was cool because it was
"portable assembly" and "industry standard" and "fast" and a "safe bet". So they
were happy with it, because C compiler emitted fast code. And fast was more
important to them than safe. Well, they got what they deserved, right?

Not adding additional features is not a design mistake if you try hard to stay
minimal and don't claim to support high level programming. The mistake is in
using a tool as if it supports something it does not.

You might be right that K&R set the bar too high for adding extra features. Yet
others might be right that D has been too willing to add features. As you know,
the perfect balance is difficult to find and it is dependent on the use context,
so it materialize after the fact (after implementation). And C's use context has
expanded way beyond the original use context where people were not afraid to
write assembly.

(But the incomprehensible typing notation for function pointers was a design
mistake since that was a feature of the language.)


I'm sorry to say this, but these rationalizations as to why C cannot add a 
trivial enhancement that takes nothing away and solves most of the buffer 
overflow problems leave me shaking my head.


(C has added useless enhancements, like VLAs.)



Re: Why is `scope` planned for deprecation?

2014-11-18 Thread H. S. Teoh via Digitalmars-d
On Tue, Nov 18, 2014 at 11:45:13AM -0800, Walter Bright via Digitalmars-d wrote:
[...]
 I'm sorry to say this, but these rationalizations as to why C cannot
 add a trivial enhancement that takes nothing away and solves most of
 the buffer overflow problems leaves me shaking my head.
 
 (C has added useless enhancements, like VLAs.)

What's the trivial thing that will solve most buffer overflow problems?


T

-- 
Dogs have owners ... cats have staff. -- Krista Casada


Re: Why is `scope` planned for deprecation?

2014-11-18 Thread Walter Bright via Digitalmars-d

On 11/18/2014 12:10 PM, H. S. Teoh via Digitalmars-d wrote:

On Tue, Nov 18, 2014 at 11:45:13AM -0800, Walter Bright via Digitalmars-d wrote:

I'm sorry to say this, but these rationalizations as to why C cannot
add a trivial enhancement that takes nothing away and solves most of
the buffer overflow problems leaves me shaking my head.


What's the trivial thing that will solve most buffer overflow problems?


http://www.drdobbs.com/architecture-and-design/cs-biggest-mistake/228701625



Re: Why is `scope` planned for deprecation?

2014-11-18 Thread Paulo Pinto via Digitalmars-d

On Tuesday, 18 November 2014 at 19:45:12 UTC, Walter Bright wrote:
On 11/18/2014 3:15 AM, Ola Fosheim Grøstad 
ola.fosheim.grostad+dl...@gmail.com wrote:
On Tuesday, 18 November 2014 at 02:35:41 UTC, Walter Bright 
wrote:
C is a brilliant language. That doesn't mean it hasn't made 
serious mistakes
in its design. The array decay and 0 strings have proven to 
be very costly to

programmers over the decades.


I'd rather say that it is the industry that has 
misappropriated C, which in my
view basically was typed portable assembly with very little 
builtin
presumptions by design. This is important when getting control 
over layout, and
this transparency is a quality that only C gives me. BCPL 
might be considered to
have more presumptions (such as string length), being a 
minimal bootstrapping

subset of CPL.

You always had the ability in C to implement arrays as a 
variable sized struct
with a length and a trailing data section, so I'd say that the 
C provided type
safe variable length arrays. Many people don't use it. Many 
people don't know
how to use it. Ok, but then they don't understand that they 
are programming in a
low level language and are responsible for creating their own 
environment. I
think C's standard lib mistakenly created an illusion of high 
level programming

that the language only partially supported.

Adding the ability to transfer structs by value as a parameter 
was probably not
worth the implementation cost at the time… Having a magic 
struct/tuple that
transfer length or end pointer with the head pointer does not 
fit the C design.
If added it should have been done as a struct and to make that 
work you would
have to add operator overloading. There's an avalanche effect 
of features and

additional language design issues there.

I think K&R deserve credit for being able to say no and stay 
minimal, I think
the Go team deserves the same credit. As you've experienced 
with D, saying no is
hard because there are often good arguments for features being 
useful and
difficult to say in advance with certainty what kind of 
avalanche effect adding
features have (in terms of semantics, special casing and new 
needs for
additional support/features, time to complete 
implementation/debugging). So
saying no until practice shows that a feature is sorely missed 
is a sign of good

language design practice.

The industry wanted portability and high speed and insisted 
moving as a flock
after C and BLINDLY after C++. Seriously, the media frenzy 
around C++ was
hysterical despite C++ being a bad design from the start. The 
C++ media noise
was worse than with Java IIRC. Media are incredibly shallow 
when they are trying
to sell mags/books based on the next big thing and they can 
accelerate
adoption beyond merits. Which both C++ and Java are two good 
examples of.


There were alternatives such as Turbo Pascal, Modula-2/3, 
Simula, Beta, ML,
Eiffel, Delphi and many more. Yet, programmers thought C was 
cool because it was
"portable assembly" and "industry standard" and "fast" and a 
"safe bet". So they
were happy with it, because C compiler emitted fast code. And 
fast was more
important to them than safe. Well, they got what they 
deserved, right?


Not adding additional features is not a design mistake if you 
try hard to stay
minimal and don't claim to support high level programming. The 
mistake is in

using a tool as if it supports something it does not.

You might be right that K&R set the bar too high for adding 
extra features. Yet
others might be right that D has been too willing to add 
features. As you know,
the perfect balance is difficult to find and it is dependent 
on the use context,
so it materialize after the fact (after implementation). And 
C's use context has
expanded way beyond the original use context where people were 
not afraid to

write assembly.

(But the incomprehensible typing notation for function 
pointers was a design

mistake since that was a feature of the language.)


I'm sorry to say this, but these rationalizations as to why C 
cannot add a trivial enhancement that takes nothing away and 
solves most of the buffer overflow problems leaves me shaking 
my head.


(C has added useless enhancements, like VLAs.)


So useless that it became optional in C11.

https://groups.google.com/forum/#!topic/comp.std.c/AoB6LFHcd88


Re: Why is `scope` planned for deprecation?

2014-11-18 Thread Paulo Pinto via Digitalmars-d

On Tuesday, 18 November 2014 at 15:36:58 UTC, Ola Fosheim Grøstad
wrote:
On Tuesday, 18 November 2014 at 14:56:42 UTC, Paulo  Pinto 
wrote:
Since when do developers use a different systems programming 
language than the one sold by the OS vendor?


Who has the pleasure to waste work hours writing FFI wrappers 
around SDK tools?


All successful systems programming languages, even if only for 
a few years, were tied to a specific OS.


Depends on what you mean by system programming. I posit that 
most programs that have been written in C are primarily 
application level programs. Meaning that you could factor out 
the C component as a tiny unit and write the rest in another 
language… Most high level languages provide integration with C. 
These things are entirely cultural.



In the 80s almost everything was systems programming, even
business applications.

You are forgetting the UNIX factor again.

We only had C available as a compiled language on UNIX systems.
HP-UX was the only commercial UNIX I used where we had access to
compilers for other languages.

So who would pay for third-party tooling, especially with what
software used to cost?

Then of course, many wanted to do on their CP/M, Spectrum and
similar systems the type of coding possible at work or
university, which led to Small C and other C-based compilers,
thus spreading the language outside UNIX.




In the late 80s you could do the same stuff in Turbo Pascal as 
in C, and integrate with asm with no problem. Lots of decent 
software for MSDOS was written in TP, such as BBS server 
software dealing with many connections.


I was doing Turbo Pascal most of the time. By the time I learned C
with Turbo C 2.0, Turbo C++ 1.0 was just around the corner, and I
only touched pure C again at teachers' and employers' request.



On regular micros you didn't have a MMU so there was actually a 
great penalty for using an unsafe language even during 
development: the OS would reboot (or you would get the famous 
guru meditation on Amiga). That sucked.


The Amiga was programmed in assembly. Except for AMOS, we didn't
use anything else.

--
Paulo


Re: Why is `scope` planned for deprecation?

2014-11-18 Thread via Digitalmars-d

On Tuesday, 18 November 2014 at 19:42:20 UTC, Walter Bright wrote:
To bring up the aviation industry again, they long ago 
recognized that "blame the pilot" and "blame the mechanics" is 
not how safe airplanes are made. They are made, in part, by 
fixing the tools so mistakes cannot happen, as even the best 
humans keep making mistakes.


Please note that I said it was a management issue. Clearly, if 
management equips workers with unsafe tools, that is bad. But 
there have always been safer tools available. It has always been 
possible to do things differently. It has always been possible to 
do risk assessment and adapt to it, in tools, education and 
process.


I am sure the aviation industry is doing a lot better than the IT 
industry!



Blaming the tools is often appropriate.


If you are forced to use one while being asked to run for a 
deadline, sure.




Re: Why is `scope` planned for deprecation?

2014-11-18 Thread Walter Bright via Digitalmars-d
On 11/18/2014 9:01 AM, Ola Fosheim Grøstad 
ola.fosheim.grostad+dl...@gmail.com wrote:

On Sunday, 16 November 2014 at 21:54:40 UTC, Walter Bright wrote:

Besides, C was designed for the PDP-11, which had no such instructions.


BTW, this is not entirely correct. It had autoincrement on registers.


Those are not dedicated string instructions. Autoincrement was an addressing 
mode that could be used with any register and any instruction, including the 
stack and program counter (!).


Autoincrement/autodecrement gave rise to *p++ and *p-- in C.


This is the example given on Wikipedia:

        MOV #MSG,R1
1$:     MOVB (R1)+,R0
        BEQ DONE
        .TTYOUT
        BR 1$
DONE:   .EXIT

MSG:    .ASCIZ /Hello, world!/

The full example:

http://en.wikipedia.org/wiki/MACRO-11


More than destroyed by every time you have to call strlen().



So the print loop is 4 instructions (I assume .TTYOUT is an I/O instruction);
with a length you would have at least 5 instructions and use an extra register,
as you would have an additional compare.


.TTYOUT is a macro that expands to code that calls the operating system. The 11 
doesn't have I/O instructions.




(As for concat, that I almost never use. In systems programming you mostly
append to buffers and flush when the buffer is full. Don't need length for that.


Uh, you need the length to determine when the buffer is full.



Even in javascript and python I avoid regular concat due to the inefficiency of
concat versus a buffered join.)


Just try to manipulate paths, filenames, and extensions without using strlen() 
and strcat(). Your claim that C string code doesn't use strlen() is patently absurd.


Besides, you wouldn't be using javascript or python if efficiency mattered.


Re: Why is `scope` planned for deprecation?

2014-11-18 Thread H. S. Teoh via Digitalmars-d
On Tue, Nov 18, 2014 at 12:44:35PM -0800, Walter Bright via Digitalmars-d wrote:
 On 11/18/2014 12:10 PM, H. S. Teoh via Digitalmars-d wrote:
 On Tue, Nov 18, 2014 at 11:45:13AM -0800, Walter Bright via Digitalmars-d 
 wrote:
 I'm sorry to say this, but these rationalizations as to why C cannot
 add a trivial enhancement that takes nothing away and solves most of
 the buffer overflow problems leaves me shaking my head.
 
 What's the trivial thing that will solve most buffer overflow
 problems?
 
 http://www.drdobbs.com/architecture-and-design/cs-biggest-mistake/228701625

That's not a trivial change at all -- it will break pretty much every C
program there is out there. Just think of how much existing C code
relies on this conflation between arrays and pointers, and implicit
conversions between them.

Once you start going down that path, you might as well just start over
with a brand new language. Which ultimately leads to D. :-P


T

-- 
Microsoft is to operating systems & security ... what McDonalds is to gourmet 
cooking.


Re: Why is `scope` planned for deprecation?

2014-11-18 Thread Walter Bright via Digitalmars-d

On 11/18/2014 12:53 PM, Paulo Pinto wrote:

On Tuesday, 18 November 2014 at 19:45:12 UTC, Walter Bright wrote:

(C has added useless enhancements, like VLAs.)


So useless that it became optional in C11.

https://groups.google.com/forum/#!topic/comp.std.c/AoB6LFHcd88


Note the Rationale given:

---

- Putting arbitrarily large arrays on the stack causes trouble
in multithreaded programs in implementations where stack growth
is bounded.

- There's no way to recover from an out-of-memory condition when
allocating a VLA.

- Microsoft declines to support them.

- VLAs aren't used much.  There appear to be only three in
Google Code, and no VLA parameters. The Linux kernel had one,
but it was taken out because there was no way to handle an
out of space condition.  (If anyone can find an example of
a VLA parameter in publicly visible production code, please
let me know.)

- The semantics of VLA parameters is painful.  They're
automatically reduced to pointers, with the length information
lost.  sizeof returns the size of a pointer.

- Prototypes of functions with VLA parameters do not have
to exactly match the function definition. This is
incompatible with C++ style linkage and C++ function
overloading, preventing the extension of this feature
into C++.

John Nagle



Re: Why is `scope` planned for deprecation?

2014-11-18 Thread via Digitalmars-d

On Tuesday, 18 November 2014 at 19:45:12 UTC, Walter Bright wrote:
I'm sorry to say this, but these rationalizations as to why C 
cannot add a trivial enhancement that takes nothing away and


They can add whatever they want.

I am arguing against the position that it was a design mistake to 
keep the semantic model simple and with few presumptions. On the 
contrary, it was the design goal. Another goal for a language 
like C is ease of implementation so that you can easily port it 
to new hardware.


The original C was a very simple language. Most decent 
programmers can create their own C-compiler (but not a good 
optimizer).



(C has added useless enhancements, like VLAs.)


VLAs have been available in gcc for a long time. They are not 
useless, I've used them from time to time.
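
For reference, a minimal C99 sketch of what a VLA looks like (average() is an 
illustrative name, not from the thread); the rationale's stack concern applies 
when n is large, and sizeof on a VLA parameter only gives a pointer size:

  #include <stdio.h>

  /* C99 VLA sketch: the array size is a runtime value. If n is large
   * enough to exhaust the stack, there is no portable way to detect
   * or recover from the failure (the rationale's first two points). */
  double average(int n, const double values[n])
  {
      double scratch[n];              /* the VLA, on the stack */
      double sum = 0.0;
      for (int i = 0; i < n; i++) {
          scratch[i] = values[i];
          sum += scratch[i];
      }
      /* sizeof values here is sizeof(double*), not n * sizeof(double) */
      return n > 0 ? sum / n : 0.0;
  }

  int main(void)
  {
      double v[] = { 1.0, 2.0, 3.0 };
      printf("%f\n", average(3, v));
      return 0;
  }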


Re: Why is `scope` planned for deprecation?

2014-11-18 Thread via Digitalmars-d

On Tuesday, 18 November 2014 at 21:07:22 UTC, Walter Bright wrote:
Those are not dedicated string instructions. Autoincrement was 
an addressing mode that could be used with any register and any 
instruction, including the stack and program counter (!).


Yes, Motorola 68000 also had those. Very useful combined with 
sentinels! ;^) It was one of those things that made the 68K asm 
feel a bit like a high level language.



Autoincrement/autodecrement gave rise to *p++ and *p-- in C.


Might have, but not from the PDP-11. It came to C from B, which 
predated the PDP-11.



More than destroyed by every time you have to call strlen().


Don't!

And factor in the performance loss coming from reading from 
punched tape… ;-)


(Actually sentinels between fields are also better for recovery 
if you have data corruption in files, although there are many 
other solutions, but this is a non-problem today.)


.TTYOUT is a macro that expands to code that calls the 
operating system. The 11 doesn't have I/O instructions.


Ah, ok, so it was a system call.


Uh, you need the length to determine when the buffer is full.


For streaming: fixed size, modulo 2.

For allocating: worst case allocate, then release.
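
Roughly the append-and-flush pattern meant here, as a sketch (BUF_SIZE, outbuf, 
out_fill and flush_buffer are illustrative names; a fixed capacity plus a fill 
index replace per-string lengths):

  #include <stdio.h>
  #include <stddef.h>

  /* Fixed-size output buffer: append until full, then flush and reuse.
   * No strlen() anywhere; the fill index is the only bookkeeping. */
  #define BUF_SIZE 4096               /* fixed, power of two */

  static char   outbuf[BUF_SIZE];
  static size_t out_fill = 0;

  static void flush_buffer(void)
  {
      fwrite(outbuf, 1, out_fill, stdout);
      out_fill = 0;
  }

  static void append_char(char c)
  {
      if (out_fill == BUF_SIZE)       /* buffer full: flush, reuse */
          flush_buffer();
      outbuf[out_fill++] = c;
  }

  int main(void)
  {
      for (const char *p = "Hello, world!\n"; *p != '\0'; p++)
          append_char(*p);
      flush_buffer();                 /* flush the tail */
      return 0;
  }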

Just try to manipulate paths, filenames, and extensions without 
using strlen() and strcat(). Your claim that C string code 
doesn't use strlen() is patently absurd.


No, I don't claim that I never used strlen(), but I never used 
strcat() IIRC, and never had the need to repeatedly call strlen() 
on long strings. Long strings would usually sit in a struct where 
there is space for a length. Slightly annoying, but not a big 
deal.


Filenames are easy: just allocate a large fixed size buffer, fill 
it in, call open(), and reuse the buffer.


Besides, you wouldn't be using javascript or python if 
efficiency mattered.


Actually, lately most of my efficiency related programming is 
done in javascript!  I spend a lot of time breaking up javascript 
code into async calls to get good responsiveness. But most of my 
efficiency related problems are in browser engine layout-code 
(not javascript) that I have to work around somehow.


Javascript in isolation is getting insanely fast in the last 
generations of browser JITs. It is almost a bit scary, because 
that means that we might be stuck with it forever…


Re: Why is `scope` planned for deprecation?

2014-11-18 Thread Walter Bright via Digitalmars-d

On 11/18/2014 1:12 PM, H. S. Teoh via Digitalmars-d wrote:

On Tue, Nov 18, 2014 at 12:44:35PM -0800, Walter Bright via Digitalmars-d wrote:

On 11/18/2014 12:10 PM, H. S. Teoh via Digitalmars-d wrote:

On Tue, Nov 18, 2014 at 11:45:13AM -0800, Walter Bright via Digitalmars-d wrote:

I'm sorry to say this, but these rationalizations as to why C cannot
add a trivial enhancement that takes nothing away and solves most of
the buffer overflow problems leaves me shaking my head.


What's the trivial thing that will solve most buffer overflow
problems?


http://www.drdobbs.com/architecture-and-design/cs-biggest-mistake/228701625


That's not a trivial change at all -- it will break pretty much every C
program there is out there. Just think of how much existing C code
relies on this conflation between arrays and pointers, and implicit
conversions between them.


No, I proposed a new syntax that would have different behavior:

  void foo(char a[..])


Re: Why is `scope` planned for deprecation?

2014-11-18 Thread H. S. Teoh via Digitalmars-d
On Tue, Nov 18, 2014 at 02:14:20PM -0800, Walter Bright via Digitalmars-d wrote:
 On 11/18/2014 1:12 PM, H. S. Teoh via Digitalmars-d wrote:
 On Tue, Nov 18, 2014 at 12:44:35PM -0800, Walter Bright via Digitalmars-d 
 wrote:
 On 11/18/2014 12:10 PM, H. S. Teoh via Digitalmars-d wrote:
 On Tue, Nov 18, 2014 at 11:45:13AM -0800, Walter Bright via Digitalmars-d 
 wrote:
 I'm sorry to say this, but these rationalizations as to why C
 cannot add a trivial enhancement that takes nothing away and
 solves most of the buffer overflow problems leaves me shaking my
 head.
 
 What's the trivial thing that will solve most buffer overflow
 problems?
 
 http://www.drdobbs.com/architecture-and-design/cs-biggest-mistake/228701625
 
 That's not a trivial change at all -- it will break pretty much every
 C program there is out there. Just think of how much existing C code
 relies on this conflation between arrays and pointers, and implicit
 conversions between them.
 
 No, I proposed a new syntax that would have different behavior:
 
   void foo(char a[..])

Ah, I see. How would that be different from just declaring an array
struct and using it pervasively? Existing C code would not benefit from
such an addition without a lot of effort put into refactoring.
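
For concreteness, one way such an array struct could be spelled in C 
(char_slice and make_slice are illustrative names, not from the thread):

  #include <stdio.h>
  #include <stddef.h>
  #include <string.h>

  /* An explicit pointer+length pair, i.e. a hand-rolled slice. */
  typedef struct {
      const char *ptr;
      size_t      len;
  } char_slice;

  static char_slice make_slice(const char *s)  /* wrap a 0-terminated string */
  {
      char_slice sl = { s, strlen(s) };
      return sl;
  }

  static void print_slice(char_slice s)
  {
      fwrite(s.ptr, 1, s.len, stdout);
  }

  int main(void)
  {
      /* The friction Walter points out below: a string literal cannot
       * be passed directly, it has to be wrapped first. */
      print_slice(make_slice("hello\n"));
      return 0;
  }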


T

-- 
If you look at a thing nine hundred and ninety-nine times, you are perfectly 
safe; if you look at it the thousandth time, you are in frightful danger of 
seeing it for the first time. -- G. K. Chesterton


Re: Why is `scope` planned for deprecation?

2014-11-18 Thread Walter Bright via Digitalmars-d

On 11/18/2014 2:35 PM, H. S. Teoh via Digitalmars-d wrote:

On Tue, Nov 18, 2014 at 02:14:20PM -0800, Walter Bright via Digitalmars-d wrote:

On 11/18/2014 1:12 PM, H. S. Teoh via Digitalmars-d wrote:
No, I proposed a new syntax that would have different behavior:

   void foo(char a[..])


Ah, I see. How would that be different from just declaring an array
struct and using it pervasively?


  foo("string");

won't work using a struct parameter.



Existing C code would not benefit from
such an addition without a lot of effort put into refactoring.


Except that the syntax is not viral and can be done as convenient. And, as 
mentioned in the article, people do make the effort to do other, much more 
complicated, schemes.




Re: Why is `scope` planned for deprecation?

2014-11-18 Thread Walter Bright via Digitalmars-d
On 11/18/2014 1:23 PM, Ola Fosheim Grøstad 
ola.fosheim.grostad+dl...@gmail.com wrote:

I am arguing against the position that it was a design mistake to keep the
semantic model simple and with few presumptions. On the contrary, it was the
design goal. Another goal for a language like C is ease of implementation so
that you can easily port it to new hardware.


The proposals I made do not change that in any way, and if KR designed C 
without those mistakes, it would have not made C more complex in the slightest.




VLAs have been available in gcc for a long time. They are not useless, I've used
them from time to time.


I know you're simply being argumentative when you defend VLAs, a complex and 
useless feature, and denigrate simple ptr/length pairs as complicated.


Re: Why is `scope` planned for deprecation?

2014-11-18 Thread Walter Bright via Digitalmars-d
On 11/18/2014 1:56 PM, Ola Fosheim Grøstad 
ola.fosheim.grostad+dl...@gmail.com wrote:

Filenames are easy, just allocate a large fixed size buffer, then fill in.
open(). reuse buffer.


char s[] = "filename.ext";
foo(s[0..8]);

But hey, it's simpler, faster, less code, less bug prone, easier to understand 
and uses less memory to:


1. strlen
2. allocate
3. memcpy
4. append a 0
   foo
5. free

instead, right?
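
Spelled out in C, those steps look roughly like this (pass_prefix is an 
illustrative helper, and foo() stands in for any function taking a 
0-terminated string):

  #include <stdio.h>
  #include <stdlib.h>
  #include <string.h>

  static void foo(const char *s)      /* any 0-terminated string API */
  {
      puts(s);
  }

  static void pass_prefix(const char *name, size_t n)
  {
      size_t len = strlen(name);      /* 1. strlen            */
      if (n > len)
          n = len;
      char *tmp = malloc(n + 1);      /* 2. allocate          */
      if (tmp == NULL)
          return;                     /*    (handle failure)  */
      memcpy(tmp, name, n);           /* 3. memcpy            */
      tmp[n] = '\0';                  /* 4. append a 0        */
      foo(tmp);                       /*    call foo          */
      free(tmp);                      /* 5. free              */
  }

  int main(void)
  {
      pass_prefix("filename.ext", 8); /* prints "filename" */
      return 0;
  }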

I know you said "just allocate a large fixed size buffer", but I hope you 
realize that such practice is the root cause of most buffer overflow bugs, 
because "640K is enough for anyone" just never works.


And your "just use a struct" argument also promptly falls apart with:

foo("string")

Now, I know that you'll never concede destruction, after all, this is the 
internet, but give it up :-)




Re: Why is `scope` planned for deprecation?

2014-11-18 Thread deadalnix via Digitalmars-d

On Tuesday, 18 November 2014 at 12:18:16 UTC, Ola Fosheim Grøstad
wrote:
On Tuesday, 18 November 2014 at 08:28:19 UTC, Paulo  Pinto 
wrote:
This is just one case, the CVE List gets updated every day and 
90% of the issues are the usual C suspects regarding pointer 
misuse and out of bounds.


Sure, but these are not a strict language issues since the same 
developers would turn off bounds-checking at the first 
opportunity anyway!


Professionalism does not involve blaming the tool, it involves 
picking the right tools and process for the task. Unfortunately 
the IT industry has over time suffered from a lack of formal 
education and immature markets. Software is considered to work 
when it crashes only once every 24 hours; would we accept 
that from any other utility?


I've never heard anyone in academia claim that C is anything 
more than a small step up from assembler (i.e. low level), so 
why allow intermediately skilled programmers to write C code if, 
for the same application, you would not allow an excellent 
programmer to write the same program in assembly (about the 
same risk of having a crash)? People get what they deserve.


Never blame the tool for bad management. You get to pick the 
tool and the process, right? Neither the tool nor testing will 
ensure correct behaviour on its own. You have many factors that 
need to play together (mindset, process and the tool set).


If you want a compiler that works, you're probably better off 
writing it in ML than in C, but people implement it in C. Why? 
Because they FEEL like it… It is not rational. It is emotional.


There are good answers to most of this, but most importantly, this
does not contain anything actionable and is completely off topic
(reminder: the topic of the thread is SCOPE).

Readers' time is precious; please don't waste it.


Re: Why is `scope` planned for deprecation?

2014-11-18 Thread via Digitalmars-d

On Tuesday, 18 November 2014 at 23:48:27 UTC, Walter Bright wrote:
On 11/18/2014 1:23 PM, Ola Fosheim Grøstad 
ola.fosheim.grostad+dl...@gmail.com wrote:
I am arguing against the position that it was a design mistake 
to keep the
semantic model simple and with few presumptions. On the 
contrary, it was the
design goal. Another goal for a language like C is ease of 
implementation so

that you can easily port it to new hardware.


The proposals I made do not change that in any way, and if KR 
designed C without those mistakes, it would have not made C 
more complex in the slightest.



VLAs have been available in gcc for a long time. They are not 
useless, I've used

them from time to time.


I know you're simply being argumentative when you defend VLAs, 
a complex and useless feature, and denigrate simple ptr/length 
pairs as complicated.


Argumentative ?!! More like a fucking gaping fucking asshole. His
posts are the blight of this group.


Re: Why is `scope` planned for deprecation?

2014-11-17 Thread Walter Bright via Digitalmars-d
On 11/16/2014 5:43 PM, Ola Fosheim Grøstad 
ola.fosheim.grostad+dl...@gmail.com wrote:

On Monday, 17 November 2014 at 01:39:38 UTC, Walter Bright wrote:

Notice the total lack of strlen()'s in Warp.


Why would you need that? You know where the lexeme begins and ends? If we are
talking about old architectures you have to acknowledge that storage was premium
and that the major cost was getting the strings into memory in the first place.


The preprocessor stores lots of strings. Things like identifiers, keywords, 
string literals, expanded macro text, etc.


The C preprocessor I wrote in C years ago is filled with strlen(), as is about 
every C string processing program ever written. Heck, how do you think strcat() 
works?
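
Roughly like this; my_strcat below is a simplified illustration, not the 
actual libc source. The point is that the whole of dst has to be scanned 
before a single byte of src is copied:

  #include <stdio.h>

  /* Simplified strcat(): find the end of dst (a hidden strlen), then
   * copy src including its terminator. Repeated appends re-scan the
   * growing string every time. */
  char *my_strcat(char *dst, const char *src)
  {
      char *p = dst;
      while (*p != '\0')              /* the hidden strlen() */
          p++;
      while ((*p++ = *src++) != '\0')
          ;                           /* copy incl. the 0    */
      return dst;
  }

  int main(void)
  {
      char buf[32] = "Hello, ";
      my_strcat(buf, "world!");
      puts(buf);
      return 0;
  }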


(Another problem with strlen() is that the string pointed to is in a different 
piece of memory, and it'll have to be loaded into the cache to scan for the 0. 
Whereas with slices, the length data is in the hot cache.)




Nah, if you know that the file ends with zero then you can build an efficient
finite automata as a classifier.


deadalnix busted that myth a while back with benchmarks.


I haven't seen it,


It's in the n.g. archives somewhere in a thread about implementing lexers.



but it is difficult to avoid lexers being bandwidth limited
these days.

Besides, how do you actually implement a lexer without constructing a FA one way
or the other?


That's the wrong question. The question is does a trailing sentinel result in a 
faster FA? deadalnix demonstrated that the answer is 'no'.


You know, Ola, I've been in the trenches with this problem for decades. 
Sometimes I still learn something new, as I did with deadalnix's benchmark. But 
the stuff you are positing is well-trodden ground. There's a damn good reason 
why D uses slices and not 0 terminated strings.


Re: Why is `scope` planned for deprecation?

2014-11-17 Thread via Digitalmars-d

On Monday, 17 November 2014 at 10:18:41 UTC, Walter Bright wrote:
(Another problem with strlen() is that the string pointed to is 
in a different piece of memory, and it'll have to be loaded 
into the cache to scan for the 0. Whereas with slices, the 
length data is in the hot cache.)


Oh, I am not saying that strlen() is a good contemporary 
solution. I am saying that when you have 32KiB RAM total it 
makes sense to save space by not storing the string length.



deadalnix busted that myth a while back with benchmarks.


I haven't seen it,


It's in the n.g. archives somewhere in a thread about 
implementing lexers.


Well, then it is just words.

That's the wrong question. The question is does a trailing 
sentinel result in a faster FA? deadalnix demonstrated that the 
answer is 'no'.


I hear that, but the fact remains: you do less work. It should 
therefore be faster. So if it is not, then you're either doing 
something wrong or you have bubbles in the pipeline on a specific 
CPU that you fail to fill. On newer CPUs you have a tiny loop 
buffer for tight inner loops that runs micro-ops without decode, 
so you want to keep the code size down there.
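
To make the comparison concrete, the two inner loops in question look roughly 
like this (scan_ident_sentinel and scan_ident_bounded are illustrative names; 
whether the extra bounds compare costs anything in practice is exactly what 
such a benchmark measures):

  #include <stdio.h>
  #include <ctype.h>

  /* Sentinel version: one load and one classification per character,
   * relying on the buffer ending in '\0'. */
  static const char *scan_ident_sentinel(const char *p)
  {
      while (isalnum((unsigned char)*p) || *p == '_')
          p++;
      return p;
  }

  /* Length version: an extra end-of-buffer comparison per character. */
  static const char *scan_ident_bounded(const char *p, const char *end)
  {
      while (p < end && (isalnum((unsigned char)*p) || *p == '_'))
          p++;
      return p;
  }

  int main(void)
  {
      const char *src = "foo_bar42+rest";
      printf("%d %d\n",
             (int)(scan_ident_sentinel(src) - src),
             (int)(scan_ident_bounded(src, src + 14) - src)); /* 9 9 */
      return 0;
  }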


Does it matter a lot in the world of SIMD? Probably not, but then 
you get a more complex lexer to maintain.


deadalnix's benchmark. But the stuff you are positing is 
well-trodden ground. There's a damn good reason why D uses 
slices and not 0 terminated strings.


I've never said that D should use 0 terminated strings. Now you 
twist the debate.


Re: Why is `scope` planned for deprecation?

2014-11-17 Thread via Digitalmars-d
Remember that the alternative to zero-terminated strings at that 
time was to have 2 string types, one with a one byte length and 
one with a larger length. So I think C made the right choice for 
its time, to have a single string type without a length.


Re: Why is `scope` planned for deprecation?

2014-11-17 Thread Paulo Pinto via Digitalmars-d
On Monday, 17 November 2014 at 11:43:45 UTC, Ola Fosheim Grøstad 
wrote:
Remember that the alternative to zero-terminated strings at 
that time was to have 2 string types, one with a one byte 
length and one with a larger length. So I think C made the 
right choice for it's time, to have a single string type 
without a length.


Black hat hackers, virus and security tools vendors around the 
world rejoice at that decision...


It was anything but right.


Re: Why is `scope` planned for deprecation?

2014-11-17 Thread via Digitalmars-d

On Monday, 17 November 2014 at 12:36:49 UTC, Paulo  Pinto wrote:
On Monday, 17 November 2014 at 11:43:45 UTC, Ola Fosheim 
Grøstad wrote:
Remember that the alternative to zero-terminated strings at 
that time was to have 2 string types, one with a one byte 
length and one with a larger length. So I think C made the 
right choice for it's time, to have a single string type 
without a length.


Black hat hackers, virus and security tools vendors around the 
world rejoice of that decision...


It was anything but right.


I don't think buffer overflow and string fundamentals are closely 
related, if used reasonably, but I'm not surprised you favour 
Pascal's solution of having two string types: one for strings up 
to 255 bytes and another one for longer strings.


Anyway, here is the real reason for how C implemented strings:

«None of BCPL, B, or C supports character data strongly in the 
language; each treats strings much like vectors of integers and 
supplements general rules by a few conventions. In both BCPL and 
B a string literal denotes the address of a static area 
initialized with the characters of the string, packed into cells. 
In BCPL, the first packed byte contains the number of characters 
in the string; in B, there is no count and strings are terminated 
by a special character, which B spelled `*e'. This change was 
made partially to avoid the limitation on the length of a string 
caused by holding the count in an 8- or 9-bit slot, and partly 
because maintaining the count seemed, in our experience, less 
convenient than using a terminator.


Individual characters in a BCPL string were usually manipulated 
by spreading the string out into another array, one character per 
cell, and then repacking it later; B provided corresponding 
routines, but people more often used other library functions that 
accessed or replaced individual characters in a string.»


http://cm.bell-labs.com/cm/cs/who/dmr/chist.html



Re: Why is `scope` planned for deprecation?

2014-11-17 Thread Paulo Pinto via Digitalmars-d
On Monday, 17 November 2014 at 12:49:16 UTC, Ola Fosheim Grøstad 
wrote:

On Monday, 17 November 2014 at 12:36:49 UTC, Paulo  Pinto wrote:
On Monday, 17 November 2014 at 11:43:45 UTC, Ola Fosheim 
Grøstad wrote:
Remember that the alternative to zero-terminated strings at 
that time was to have 2 string types, one with a one byte 
length and one with a larger length. So I think C made the 
right choice for it's time, to have a single string type 
without a length.


Black hat hackers, virus and security tools vendors around the 
world rejoice of that decision...


It was anything but right.


I don't think buffer overflow and string fundamentals are 
closely related, if used reasonably, but I'm not surprised you 
favour Pascal's solution of having two string types: one for 
strings up to 255 bytes and another one for longer strings.


Anyway, here is the real reason for how C implemented strings:

«None of BCPL, B, or C supports character data strongly in the 
language; each treats strings much like vectors of integers and 
supplements general rules by a few conventions. In both BCPL 
and B a string literal denotes the address of a static area 
initialized with the characters of the string, packed into 
cells. In BCPL, the first packed byte contains the number of 
characters in the string; in B, there is no count and strings 
are terminated by a special character, which B spelled `*e'. 
This change was made partially to avoid the limitation on the 
length of a string caused by holding the count in an 8- or 
9-bit slot, and partly because maintaining the count seemed, in 
our experience, less convenient than using a terminator.


Individual characters in a BCPL string were usually manipulated 
by spreading the string out into another array, one character 
per cell, and then repacking it later; B provided corresponding 
routines, but people more often used other library functions 
that accessed or replaced individual characters in a string.»


http://cm.bell-labs.com/cm/cs/who/dmr/chist.html


I am fully aware how UNIX designers decided to ignore the systems 
programming being done in Algol variants, PL/I variants and many 
other wannabe systems programming languages that came before C.


Which they are repeating again with Go.

--
Paulo


Re: Why is `scope` planned for deprecation?

2014-11-17 Thread via Digitalmars-d

On Monday, 17 November 2014 at 13:39:05 UTC, Paulo  Pinto wrote:
I am fully aware how UNIX designers decided to ignore the 
systems programming being done in Algol variants, PL/I variants 
and many other wannabe systems programming languages that came 
before C.


I wouldn't say that Algol is a systems programming language, and 
Pascal originally only had fixed width strings!


(But Simula actually had decent GC backed string support with 
substrings pointing to the same buffer and a link to the full 
buffer from substrings, thus somewhat more advanced than D ;-)




Re: Why is `scope` planned for deprecation?

2014-11-17 Thread Walter Bright via Digitalmars-d
On 11/17/2014 3:43 AM, Ola Fosheim Grøstad 
ola.fosheim.grostad+dl...@gmail.com wrote:

Remember that the alternative to zero-terminated strings at that time was to
have 2 string types, one with a one byte length and one with a larger length.


No, that was not the alternative.




Re: Why is `scope` planned for deprecation?

2014-11-17 Thread Walter Bright via Digitalmars-d
On 11/17/2014 3:00 AM, Ola Fosheim Grøstad 
ola.fosheim.grostad+dl...@gmail.com wrote:

I am saying
that when you have 32KiB RAM total it makes sense to save space by not storing
the string length.


I know what you're saying.

You're saying without evidence that sentinels are faster. They are not.
You're saying without evidence that 0 terminated strings use less memory. They 
do not.


(It does not save space when "filename" and "filename.ext" cannot be 
overlapped.)



Re: Why is `scope` planned for deprecation?

2014-11-17 Thread via Digitalmars-d

On Monday, 17 November 2014 at 19:24:49 UTC, Walter Bright wrote:
You're saying without evidence that sentinels are faster. They 
are not.


You are twisting and turning so much in discussions that you make 
me dizzy.


I've been saying that for SOME OPERATIONS they are too, and that 
is not without evidence. Just plot it out for a 65xx, 680xx, Z80 
etc CPU and it becomes self-evident. Any system level programmer 
should be able to do it in a few minutes.


Using sentinels is a common trick for speeding up algorithms, it 
has some downsides, and some upsides, but they are used for a 
reason (either speed, convenience or both).


Pretending that sentinels are entirely useless is not a sane line 
of argument. I use sentinels in many situations and for many 
purposes, and they can greatly speed up and/or simplify code.


You're saying without evidence that 0 terminated strings use 
less memory. They do not.


(It does not save space when filename and filename.ext 
cannot be overlapped.)


0-terminated and shortstring (first byte being used for length) 
take the same amount of space, but permanent substring reference 
slices are very wasteful of memory in low memory situations:


1. you need a ref count on the base buffer (2-4 bytes)
2. you need pointer to base + 2 offsets (4-12 bytes)

And worst of all, you retain the whole buffer even if you only 
reference a tiny portion of it. Yuk! In such a use scenario you 
are generally better off reallocating or using compaction. For 
non-permanent substrings you can still use begin/end pointers.


And please, no, GC is not the answer. Simula had GC and the kind 
of strings and substrings you argued for, but it was not intended 
for system level programming and it was not resource efficient. 
It was convenient. Scripty style concatenation and substring 
slicing is fun, but it is not system level programming. System 
level programming is about taking control over the hardware and 
using it most efficiently. Abstractions that lie mess this up.


Is wasting space on meta information less critical today? YES, OF 
COURSE! It does matter that we have 100,000 times more RAM 
available.


Re: Why is `scope` planned for deprecation?

2014-11-17 Thread Walter Bright via Digitalmars-d
On 11/17/2014 1:08 PM, Ola Fosheim Grøstad 
ola.fosheim.grostad+dl...@gmail.com wrote:

I've been saying that for SOME OPERATIONS they are too, and that is not without
evidence. Just plot it out for a 65xx, 680xx, Z80 etc CPU and it becomes
self-evident. Any system level programmer should be able to do it in a few 
minutes.


When designing a language data type, you don't design it for some operations. 
You design it so that it works best most of the time, or at least let the user 
decide.


You can always add a sentinel for specific cases. But C forces its use for all 
strings for practical purposes. The design is backwards, and most of the time a 
sentinel is the wrong choice.


BTW, I learned how to program on a 6800. I'm not ignorant of those machines. And 
frankly, C is too high level for the 6800 (and the other 8 bit CPUs). The idea 
that C maps well onto those processors is mistaken. Which is hardly surprising, 
as C was developed for the PDP-11, a 16 bit machine.


Yes, I know that people did use C for 8 bit machines.


Re: Why is `scope` planned for deprecation?

2014-11-17 Thread via Digitalmars-d

On Monday, 17 November 2014 at 22:03:48 UTC, Walter Bright wrote:
You can always add a sentinel for specific cases. But C forces 
its use for all strings for practical purposes. The design is 
backwards, and most of the time a sentinel is the wrong choice.


Ok, but I would rather say it like this: the language C doesn't 
really provide strings, it only provides literals in a particular 
format. So the literal-format is a trade-off between having 
something generic and simple and having something more complex 
and possibly limited (having a 255 char limit is not good enough in 
the long run).


I think there is a certain kind of beauty to the minimalistic 
approach taken with C (well, at least after ANSI-C came about in 
the late 80s). I like the language better than the libraries…


BTW, I learned how to program on a 6800. I'm not ignorant of 
those machines. And frankly, C is too high level for the 6800 
(and the other 8 bit CPUs). The idea that C maps well onto 
those processors is mistaken.


Yes I agree, but those instruction sets are simple. :-) With only 
256 bytes of builtin RAM (IIRC) the 6800 was kind of skimpy on 
memory! We used it in high school for our classes in digital 
circuitry/projects.


(It is very difficult to discuss performance on x86, there is 
just too much clutter and machinery in the core that can skew 
results.)


Re: Why is `scope` planned for deprecation?

2014-11-17 Thread Walter Bright via Digitalmars-d
On 11/17/2014 3:15 PM, Ola Fosheim Grøstad 
ola.fosheim.grostad+dl...@gmail.com wrote:

Ok, but I would rather say it like this: the language C doesn't really provide
strings, it only provides literals in a particular format. So the literal-format
is a trade-off between having something generic and simple and having something
more complex and possibly limited (having 255 char limit is not good enough in
the long run).


The combination of the inescapable array-to-ptr decay when calling a function, 
coupled with the Standard library which is part of the language that takes char* 
as strings, means that for all practical purposes C does provide strings, and 
pretty much forces it on the programmer.
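
The decay itself is easy to demonstrate (a minimal example; the size printed 
inside f() is sizeof(char*), so platform dependent):

  #include <stdio.h>

  /* Inside f(), `a` is just a char*; the 10 in the prototype is gone. */
  void f(char a[10])
  {
      printf("%zu\n", sizeof a);      /* prints sizeof(char*), e.g. 8 */
  }

  int main(void)
  {
      char buf[10];
      printf("%zu\n", sizeof buf);    /* prints 10 */
      f(buf);                         /* buf decays to &buf[0] */
      return 0;
  }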




I think there is a certain kind of beauty to the minimalistic approach taken
with C (well, at least after ANSI-C came about in the late 80s). I like the
language better than the libraries…


C is a brilliant language. That doesn't mean it hasn't made serious mistakes in 
its design. The array decay and 0 strings have proven to be very costly to 
programmers over the decades.




Re: Why is `scope` planned for deprecation?

2014-11-17 Thread via Digitalmars-d

On Monday, 17 November 2014 at 19:24:49 UTC, Walter Bright wrote:
On 11/17/2014 3:00 AM, Ola Fosheim Grøstad 
ola.fosheim.grostad+dl...@gmail.com wrote:

I am saying
that when you have 32KiB RAM total it makes sense to save 
space by not storing

the string length.


I know what you're saying.

You're saying without evidence that sentinels are faster. They 
are not.
You're saying without evidence that 0 terminated strings use 
less memory. They do not.


(It does not save space when filename and filename.ext 
cannot be overlapped.)


Stop wasting time with the mouth breather.


Re: Why is `scope` planned for deprecation?

2014-11-16 Thread Paulo Pinto via Digitalmars-d

On 16.11.2014 at 08:44, Walter Bright wrote:

On 11/15/2014 11:14 PM, Paulo Pinto wrote:

On 16.11.2014 at 05:51, Walter Bright wrote:

What I find odd about the progress of C++ (11, 14, 17, ...) is that
there has been no concerted effort to make the preprocesser obsolete.

What about templates, compile time reflection, modules and compile
time code
execution?


Competent and prominent C++ coding teams still manage to find complex
and tangled uses for the preprocessor that rely on the most obscure
details of how the preprocessor works, and then hang their whole
codebase on it.

I find it baffling, but there it is. I've made some effort to get rid of
preprocessor use in the DMD source.



No need for the pre-processor other than textual inclusion and
conditional
compilation.


Andrei, Herb, and I made a proposal to the C++ committee to introduce
'static if'. It was promptly nailed to the wall and executed by firing
squad. :-)



That was quite bad how it happened.


Re: Why is `scope` planned for deprecation?

2014-11-16 Thread via Digitalmars-d

On Sunday, 16 November 2014 at 03:27:54 UTC, Walter Bright wrote:

On 11/14/2014 4:32 PM, deadalnix wrote:

To quote the guy from the PL for video games video serie, a 85%
solution often is preferable.


Spoken like a true engineer!


More like a consultant for self-help:

http://www.amazon.com/85%25-Solution-Personal-Accountability-Guarantees/dp/0470500166


Real world 85% engineered solutions:

1. Titanic
2. Chernobyl
3. Challenger
4. C++
…


Re: Why is `scope` planned for deprecation?

2014-11-16 Thread Walter Bright via Digitalmars-d

On 11/16/2014 3:30 AM, Ola Fosheim Grøstad
 Real world 85% engineered solutions:


1. Titanic


Everyone likes to rag on the Titanic's design, but I've read a fair amount about 
it, and it's quite an unfair rap. It was, for its day, the safest ship afloat, 
and did represent a significant step forward in safety:


1. The watertight compartments were innovative and kept the Titanic afloat for 
hours. Without them, it would have sunk very quickly. The damage the Titanic 
suffered was very unusual in its extensiveness, and would have sunk any ship of 
the day.


2. The wireless was new and state of the art, without it the Titanic would have 
sunk with all aboard without a trace, and what happened to it would have been a 
great mystery. The fault with the wireless had nothing to do with its 
engineering, but with its management (the California did not keep a 24 hr watch 
on the radio).


3. The hull steel was inferior by today's standards, but was the best available 
by the standards of its time.


4. The rudder was inadequate, but little was known at the time about how such 
large ships would handle, and they didn't exactly have computer simulation 
software available.


5. The oft-repeated thing about the lifeboats was a little unreasonable. The way 
ships usually sink, it's very difficult to launch any lifeboats successfully. If 
the ship listed, the boats on the high side could not be launched at all, and if 
it tilted down at a steeper angle none of them could be launched. The way the 
Titanic sank, slowly and fairly levelly, enabling nearly all the boats to be 
launched, was very unusual. The idea was with the watertight compartments it 
would sink slowly enough that the boats could be used to ferry the passengers to 
safety. That in fact would have worked if the California had been monitoring the 
wireless.


It's unfair to apply the hubris of hindsight. Apply instead the standards and 
practices of the foresight, and the Titanic comes off very well.


It was not designed to drive full speed into an iceberg, and modern ships can't 
handle that, either. Actually, the Titanic would likely have fared better than 
modern ships if it didn't try to turn but simply rammed it head on. The 
watertight compartments would have kept it afloat.


For comparison, look what happened to that Italian cruise ship a few years ago. 
It got a minor hole punched in the side by a rock, rolled over and sank.


Re: Why is `scope` planned for deprecation?

2014-11-16 Thread via Digitalmars-d

On Sunday, 16 November 2014 at 17:46:09 UTC, Walter Bright wrote:
Everyone likes to rag on the Titanic's design, but I've read a 
fair amount about it, and it's quite an unfair rap. It was, for 
its day, the safest ship afloat, and did represent a 
significant step forward in safety:


«The 20 lifeboats that she did carry could only take 1,178 
people, even though there were about 2,223 on board.» 
http://en.wikipedia.org/wiki/Lifeboats_of_the_RMS_Titanic


That's not even an 85% solution; it is a 53% solution.

It's unfair to apply the hubris of hindsight. Apply instead the 
standards and practices of the foresight, and the Titanic comes 
off very well.


I don't know, my grandfather's uncle went with one of the 
expeditions around Greenland and they did not sink. That ship 
(Fram) was designed to be frozen into the ice, as it was meant 
to be used to reach the North Pole. The shape of the hull was 
designed to pop out of the ice rather than being pushed down, so 
that the ship could float over the Arctic as part of the ice. It 
was later used for several trips, notably the famous trip to 
reach the South Pole. That's a lot closer to a 100% engineering 
solution!


It was not designed to drive full speed into an iceberg, and 
modern ships can't handle that, either.


It was not sane to drive at full speed, I guess. There was a lot 
of arrogance in the execution around the Titanic; both leaving 
with insufficient lifeboats and driving at full speed suggest a 
lack of understanding…



Returning to programming languages: if I cannot implement 100% of 
my design with a language then it is a non-solution. 85% is not 
enough.


In business applications people sometimes have to settle for 
ready-made 85% solutions and change their business practices to 
get the last 15%, but that is not good enough for systems 
programming IMO. That's how you think about frameworks, but not 
how you think about language design (or system level runtime).


Re: Why is `scope` planned for deprecation?

2014-11-16 Thread Walter Bright via Digitalmars-d
On 11/16/2014 10:27 AM, Ola Fosheim Grøstad 
ola.fosheim.grostad+dl...@gmail.com wrote:

Returning to programming languages: if I cannot implement 100% of my design with
a language then it is a non-solution. 85% is not enough.


You can do anything with a language if it is Turing complete.



In business applications people sometimes have to settle for ready-made 85%
solutions and change their business practices to get the last 15%, but that is
not good enough for systems programming IMO. That's how you think about
frameworks, but not how you think about language design (or system level 
runtime).


Be careful you don't fall into kitchen sink syndrome. Add enough features, and 
the language becomes unusable. Features are almost never orthogonal, they always 
interact and interfere with each other. This applies to all engineering, not 
just languages. There are no 100% solutions.


For example, D doesn't support multiple inheritance. This is on purpose. Yes, 
some C++ programmers believe D is badly broken because of this. I don't at all 
believe it is unreasonable that one should make adaptations in design in order 
to use a language successfully.


After all, D is explicitly designed to be a pragmatic language.



Re: Why is `scope` planned for deprecation?

2014-11-16 Thread via Digitalmars-d

On Sunday, 16 November 2014 at 18:36:01 UTC, Walter Bright wrote:

You can do anything with a language if it is Turing complete.


That's not the point. If you have to avoid features because they 
aren't general enough or have to change the design to fit the 
language and not the hardware, then the language design becomes a 
problem.


Be careful you don't fall into kitchen sink syndrome. Add 
enough features, and the language becomes unusable. Features 
are almost never orthogonal, they always interact and interfere 
with each other. This applies to all engineering, not just 
languages. There are no 100% solutions.


I think C is pretty close to a 98% solution for system level 
programming. Granted, it relies on macros to reach that.


For example, D doesn't support multiple inheritance. This is on 
purpose. Yes, some C++ programmers believe D is badly broken 
because of this. I don't at all believe it is unreasonable that 
one should make adaptations in design in order to use a 
language successfully.


After all, D is explicitly designed to be a pragmatic 
language.


I am not sure if OO-inheritance and virtual functions are all 
that important for system level programming, but generally 
features should work across the board if they can be implemented 
efficiently. E.g. creating weird typing rules due to ease of 
implementation does not sit well with me.


Or to put it in simpler terms: figuring out how a programming 
language works is a necessary investment, but having to figure 
out how a programming language does not work, and when, is really 
annoying.


Re: Why is `scope` planned for deprecation?

2014-11-16 Thread Walter Bright via Digitalmars-d
On 11/16/2014 10:52 AM, Ola Fosheim Grøstad 
ola.fosheim.grostad+dl...@gmail.com wrote:

I think C is pretty close to a 98% solution for system level programming.


Not at all in my view. It has two miserable failures:

1. C's Biggest Mistake

http://www.drdobbs.com/architecture-and-design/cs-biggest-mistake/228701625

This made C far, far more difficult and buggy to work with than it should have 
been.


2. 0 terminated strings

This makes it surprisingly difficult to do performant string manipulation, and 
also results in excessive memory consumption.




Granted, it relies on macros to reach that.


And it's a crummy macro system, even for its day.


My above remarks should be put in context of when C was designed. As with the 
Titanic, it is unfair to apply modern sensibilities to it. But if we were to, a 
vast amount of C could be dramatically improved without changing its fundamental 
nature.


Re: Why is `scope` planned for deprecation?

2014-11-16 Thread via Digitalmars-d

On Sunday, 16 November 2014 at 19:24:47 UTC, Walter Bright wrote:
This made C far, far more difficult and buggy to work with than 
it should have been.


Depends on your view of C: if you view C as a step above assembly 
then it makes sense to treat everything as pointers. It is a bit 
confusing in the beginning since it is more or less unique to C.



2. 0 terminated strings

This makes it surprisingly difficult to do performant string 
manipulation, and also results in a excessive memory 
consumption.


Whether using sentinels is slow or fast depends on what you want 
to do, but it arguably saves space for small strings (add a length 
+ alignment and you lose ~6 bytes).


Also dealing with a length means you cannot keep everything in 
registers on simple CPUs.


A lexer that takes zero terminated input is a lot easier to write 
and make fast than one that uses a length.


Nothing prevents you from creating a slice as a struct though.

sensibilities to it. But if we were to, a vast amount of C 
could be dramatically improved without changing its fundamental 
nature.


To me the fundamental nature of C is:

1. I can visually imagine how the code maps onto the hardware

2. I am not bound to a complicated runtime


Re: Why is `scope` planned for deprecation?

2014-11-16 Thread Walter Bright via Digitalmars-d
On 11/16/2014 11:59 AM, Ola Fosheim Grøstad 
ola.fosheim.grostad+dl...@gmail.com wrote:

On Sunday, 16 November 2014 at 19:24:47 UTC, Walter Bright wrote:

This made C far, far more difficult and buggy to work with than it should have
been.


Depends on your view of C, if you view C as step above assembly then it makes
sense to treat everything as pointers.


If you read my article, the fix does not take away anything.



2. 0 terminated strings

This makes it surprisingly difficult to do performant string manipulation, and
also results in a excessive memory consumption.

Whether using sentinels is slow or fast depends on what you want to do, but it
arguably save space for small strings (add a length + alignment and you loose ~6
bytes).



Also dealing with a length means you cannot keep everything in registers on
simple CPUs.

A lexer that takes zero terminated input is a lot easier to write and make fast
than one that use length.


I've worked enough with C to know that these arguments do not hold up in real 
code.



Nothing prevents you from creating a slice as a struct though.


I've tried that, too. Doesn't work - the C runtime library prevents it, as well 
as every other library.




sensibilities to it. But if we were to, a vast amount of C could be
dramatically improved without changing its fundamental nature.


To me the fundamental nature of C is:

1. I can visually imagine how the code maps onto the hardware

2. I am not bound to a complicated runtime


None of the fixes I've suggested impair that in any way.



Re: Why is `scope` planned for deprecation?

2014-11-16 Thread via Digitalmars-d

On Sunday, 16 November 2014 at 20:26:36 UTC, Walter Bright wrote:

If you read my article, the fix does not take away anything.


Yes, but that is just what all other languages had at the time, 
so leaving it out was obviously deliberate. I assume they wanted 
a very simple model where each parameter could fit in a register.


I've worked enough with C to know that these arguments do not 
hold up in real code.


But you have to admit that older CPUs / tight RAM do have an 
effect? Even the 8086 has dedicated string instructions with the 
ability to terminate on zero (REPNZ).


Re: Why is `scope` planned for deprecation?

2014-11-16 Thread Walter Bright via Digitalmars-d
On 11/16/2014 12:44 PM, Ola Fosheim Grøstad 
ola.fosheim.grostad+dl...@gmail.com wrote:

On Sunday, 16 November 2014 at 20:26:36 UTC, Walter Bright wrote:

If you read my article, the fix does not take away anything.


Yes, but that is just what all other languages had at the time, so leaving it
out was obviously deliberate. I assume they wanted a very simple model where
each parameter could fit in a register.


Since structs were supported, this rationale does not work.



I've worked enough with C to know that these arguments do not hold up in real
code.

But you have to admit that older CPUS/tight RAM does have an effect? Even 8086
have dedicated string instructions with the ability to terminate on zero (REPNZ)


Remember that I wrote successful C and C++ compilers for 16 bit 8086 machines, 
and programmed on it for a decade. I know about those instructions, and I'm 
familiar with the tradeoffs. It's not worth it.


Besides, C was designed for the PDP-11, which had no such instructions.


Re: Why is `scope` planned for deprecation?

2014-11-16 Thread ketmar via Digitalmars-d
On Sun, 16 Nov 2014 19:59:52 +
via Digitalmars-d digitalmars-d@puremagic.com wrote:

 A lexer that takes zero terminated input is a lot easier to write 
 and make fast than one that use length.
that's why warp is faster than cpp? ;-)




Re: Why is `scope` planned for deprecation?

2014-11-16 Thread via Digitalmars-d
On Sunday, 16 November 2014 at 22:00:10 UTC, ketmar via 
Digitalmars-d wrote:

that's why warp is faster than cpp? ;-)


Which implementation of cpp?

(Btw, take a look at lexer.c in DMD :-P)



Re: Why is `scope` planned for deprecation?

2014-11-16 Thread ketmar via Digitalmars-d
On Sun, 16 Nov 2014 22:09:00 +
via Digitalmars-d digitalmars-d@puremagic.com wrote:

 On Sunday, 16 November 2014 at 22:00:10 UTC, ketmar via 
 Digitalmars-d wrote:
  that's why warp is faster than cpp? ;-)
 Which implementation of cpp?
gcc implementation, afair. its slowness was the reason for warping.

 (Btw, take a look at lexer.c in DMD :-P)
c++ has no good string type, so there is not much choice.

as a writer of at least four serious scripting languages (and a lot
more as toy ones) i can tell you that zero-terminated strings are
a PITA. the only sane way to write a good lexer is working with a
structure which emulates D strings and slicing (if we must parse
text from an in-memory buffer, of course).




Re: Why is `scope` planned for deprecation?

2014-11-16 Thread Paulo Pinto via Digitalmars-d
On 16.11.2014 at 20:59, Ola Fosheim Grøstad 
ola.fosheim.grostad+dl...@gmail.com wrote:

On Sunday, 16 November 2014 at 19:24:47 UTC, Walter Bright wrote:

This made C far, far more difficult and buggy to work with than it
should have been.


Depends on your view of C, if you view C as step above assembly then it
makes sense to treat everything as pointers. It is a bit confusing in
the beginning since it is more or less unique to C.



My view is of a kind of portable macro assembler; even MASM and TASM 
were more feature-rich back in the day.


Actually I remember reading a DDJ article about a Texas Instruments 
Assembler that looked like C, just with one simple expression per line.


So you could not do

a = b + c * 4;

rather

r0 = b
r1 = c
r1 *= 4
r0 += r1

Just an idea, I don't remember any longer how it actually was.


--
Paulo





Re: Why is `scope` planned for deprecation?

2014-11-16 Thread via Digitalmars-d
On Sunday, 16 November 2014 at 22:18:51 UTC, ketmar via 
Digitalmars-d wrote:

On Sun, 16 Nov 2014 22:09:00 +
via Digitalmars-d digitalmars-d@puremagic.com wrote:

On Sunday, 16 November 2014 at 22:00:10 UTC, ketmar via 
Digitalmars-d wrote:

 that's why warp is faster than cpp? ;-)
Which implementation of cpp?
gcc implementation, afair. it's slowness was the reason for 
warping.


Ok, I haven't seen an independent benchmark, but I believe clang 
is faster. But…


https://github.com/facebook/warp/blob/master/lexer.d#L173

PITA. the only sane way to write a good lexer is working with 
structure

which emulates D string and slicing (if we must parse text from
in-memory buffer, of course).


Nah, if you know that the file ends with zero then you can build 
an efficient finite automaton as a classifier.


Re: Why is `scope` planned for deprecation?

2014-11-16 Thread ketmar via Digitalmars-d
On Sun, 16 Nov 2014 22:22:42 +
via Digitalmars-d digitalmars-d@puremagic.com wrote:

 Nah, if you know that the file ends with zero then you can build 
 an efficient finite automata as a classifier.
FSA code is a fsckn mess. either adding a dependency on an external tool
and a lot of messy output to the project, or writing that messy code
manually. and FSA is not necessarily faster, as it's bigger and so it
trashes the CPU cache.




Re: Why is `scope` planned for deprecation?

2014-11-16 Thread deadalnix via Digitalmars-d
On Sunday, 16 November 2014 at 11:30:01 UTC, Ola Fosheim Grøstad 
wrote:
On Sunday, 16 November 2014 at 03:27:54 UTC, Walter Bright 
wrote:

On 11/14/2014 4:32 PM, deadalnix wrote:
To quote the guy from the PL for video games video serie, a 
85%

solution often is preferable.


Spoken like a true engineer!


More like a consultant for self-help:

http://www.amazon.com/85%25-Solution-Personal-Accountability-Guarantees/dp/0470500166


Real world 85% engineered solutions:

1. Titanic
2. Chernobyl
3. Challenger
4. C++
…


Sorry but that is dumb, and the fact that you are on the D 
newsgroup rather than on a 100% solution language's newsgroup 
(Java is 100% OOP, Haskell is 100% functional, Rust is 100% 
linear types, Javascript is 100% callbacks, erlang is 100% 
concurrent, LISP is 100% meta, BASIC is 100% imperative, python 
is 100% slow, PHP is 100% inconsistent) tells me that not even 
you believe in your own bullshit.


Re: Why is `scope` planned for deprecation?

2014-11-16 Thread via Digitalmars-d

On Sunday, 16 November 2014 at 22:19:16 UTC, Paulo Pinto wrote:
My view is of a kind of portable macro assembler, even MASM 
and TASM were more feature rich back in the day.


Actually I remember reading a DDJ article about a Texas 
Instruments Assembler that looked like C, just with one simple 
expression per line.


So you could not do

a = b + c * 4;

rather

r0 = b
r1 = c
r1 *= 4
r0 += r1

Just an idea, I don't remember any longer how it actually was.


Not such a bad idea if you can blend it with regular assembly 
mnemonics. When I did the course in machine-near programming at 
university, I believe I chose to do the exam in Motorola 68000 
machine language because I found it no harder than C at the 
time… (?) I surely would not have done the same with the x86 
instruction set, though.


Re: Why is `scope` planned for deprecation?

2014-11-16 Thread via Digitalmars-d

On Sunday, 16 November 2014 at 22:55:54 UTC, deadalnix wrote:
Sorry but that is dumb, and the fact you are on the D newsgroup 
rather on 100% solution languages newsgroup (Java is 100% OOP, 
Haskell is 100% functional, Rust is 100% linear types, 
Javascript is 100% callbacks, erlang is 100% concurrent, LISP 
is 100% meta, BASIC is 100% imperative, python is 100% slow, 
PHP 100% inconsistent) tells me that not even you believe in 
your own bullshit.


Define what you mean by 100%. By 100% I mean that you can 
implement your system level design without bending it around 
special cases induced by the language.


The term 85% solution is used for implying that it only 
provides a solution to 85% of what you want to achieve (like a 
framework) and that you have to change your goals or go down a 
painful path to get the last 15%.


ASM is 100% (or 0%). You can do anything the hardware supports.

C is close to 98%. You can easily get the last 2% by writing asm.

Java/C# are 90%. You are locked up in abstracted frameworks.

HTML5/JS is 80%. You can do certain things efficiently, but other 
things are plain difficult.


Flash/ActionScript is 60%. …

What Jonathan Blow apparently wants is a language that is 
tailored to the typical patterns seen in games programming, so 
that might mean that e.g. certain allocation patterns are 
supported, but others not (leaving out the 15% that is not used 
in games programming). This is characteristic of programming 
frameworks.


I think it is reasonable to push back when D is moving towards 
becoming a framework. There are at least two factions in the D 
community. One faction is looking for an application framework 
and the other faction is looking for a low level programming 
language.


These two perspectives are not fully compatible.


Re: Why is `scope` planned for deprecation?

2014-11-16 Thread Walter Bright via Digitalmars-d
On 11/16/2014 2:22 PM, Ola Fosheim Grøstad 
ola.fosheim.grostad+dl...@gmail.com wrote:

On Sunday, 16 November 2014 at 22:18:51 UTC, ketmar via Digitalmars-d wrote:

On Sun, 16 Nov 2014 22:09:00 +
via Digitalmars-d digitalmars-d@puremagic.com wrote:


On Sunday, 16 November 2014 at 22:00:10 UTC, ketmar via Digitalmars-d wrote:
 that's why warp is faster than cpp? ;-)
Which implementation of cpp?

gcc implementation, afair. it's slowness was the reason for warping.


Ok, I haven't seen an independent benchmark, but I believe clang is faster. But…

https://github.com/facebook/warp/blob/master/lexer.d#L173


Notice the total lack of strlen()'s in Warp.



Nah, if you know that the file ends with zero then you can build an efficient
finite automata as a classifier.


deadalnix busted that myth a while back with benchmarks.


Re: Why is `scope` planned for deprecation?

2014-11-16 Thread via Digitalmars-d

On Monday, 17 November 2014 at 01:39:38 UTC, Walter Bright wrote:

Notice the total lack of strlen()'s in Warp.


Why would you need that? You know where the lexeme begins and 
ends? If we are talking about old architectures you have to 
acknowledge that storage was at a premium and that the major cost 
was getting the strings into memory in the first place.


Nah, if you know that the file ends with zero then you can 
build an efficient

finite automata as a classifier.


deadalnix busted that myth a while back with benchmarks.


I haven't seen it, but it is difficult to avoid lexers being 
bandwidth limited these days.


Besides, how do you actually implement a lexer without 
constructing a FA one way or the other?


Re: Why is `scope` planned for deprecation?

2014-11-15 Thread Jacob Carlborg via Digitalmars-d
On 2014-11-14 15:28, Ola Fosheim Grøstad 
ola.fosheim.grostad+dl...@gmail.com wrote:



I don't know yet, but the 5.1 simulator will probably have to run on
OS-X 10.6.8 from what I've found on the net. Maybe it is possible to do
as you said with clang… Hm.


The simulator bundled with Xcode 5 run on Yosemite but not the one 
bundled with Xcode 4.


--
/Jacob Carlborg


Re: Why is `scope` planned for deprecation?

2014-11-15 Thread Walter Bright via Digitalmars-d

On 11/14/2014 4:32 PM, deadalnix wrote:

To quote the guy from the PL for video games video series, an 85%
solution often is preferable.


Spoken like a true engineer!


Re: Why is `scope` planned for deprecation?

2014-11-15 Thread Walter Bright via Digitalmars-d

On 11/13/2014 5:55 AM, Manu via Digitalmars-d wrote:

I realised within minutes that it's almost impossible to live without slices.
On the plus side, I've already made lots of converts in my new office
from my constant ranting :P


You should submit a presentation proposal to the O'Reilly Software Architecture 
Conference!


  http://softwarearchitecturecon.com/sa2015
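
For anyone reading this from the C++ side and wondering what makes the 
slices Manu mentions so hard to give up: a D slice is just a 
(pointer, length) view over existing memory, so substrings and 
subarrays are cheap, bounds-checked, and never copy. A tiny example:

    import std.stdio;

    void main()
    {
        int[] numbers = [10, 20, 30, 40, 50];
        int[] middle = numbers[1 .. 4];   // a view, not a copy
        middle[0] = 99;                   // writes through to the original
        writeln(numbers);                 // [10, 99, 30, 40, 50]

        string s = "hello, world";
        writeln(s[7 .. $]);               // "world", again without copying
    }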


Re: Why is `scope` planned for deprecation?

2014-11-15 Thread Walter Bright via Digitalmars-d

On 11/13/2014 3:44 AM, Manu via Digitalmars-d wrote:

After having adapted to D and distancing from C++, trying to go back
is like some form of inhuman torture!
I really don't remember it being as bad as it is... the time away has
given me new perspective on how terrible C++ is, and I can say with
confidence, there is NOTHING C++ could do to make itself a 'better
option' at this point.


What I find odd about the progress of C++ (11, 14, 17, ...) is that there has 
been no concerted effort to make the preprocessor obsolete.




Re: Why is `scope` planned for deprecation?

2014-11-15 Thread Paulo Pinto via Digitalmars-d

Am 16.11.2014 um 05:51 schrieb Walter Bright:

On 11/13/2014 3:44 AM, Manu via Digitalmars-d wrote:

After having adapted to D and distancing from C++, trying to go back
is like some form of inhuman torture!
I really don't remember it being as bad as it is... the time away has
given me new perspective on how terrible C++ is, and I can say with
confidence, there is NOTHING C++ could do to make itself a 'better
option' at this point.


What I find odd about the progress of C++ (11, 14, 17, ...) is that
there has been no concerted effort to make the preprocessor obsolete.



What about templates, compile time reflection, modules and compile time 
code execution?


No need for the pre-processor other than textual inclusion and 
conditional compilation.
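
For concreteness, a small sketch of the usual preprocessor jobs done 
with those language features instead (manifest constants, templates, 
version blocks and CTFE); the names used here are made up for 
illustration only:

    import std.stdio;

    // #define MAX_USERS 64  becomes a typed, scoped manifest constant:
    enum maxUsers = 64;

    // #define SQUARE(x) ((x)*(x))  becomes a template function,
    // with no double-evaluation surprises:
    auto square(T)(T x) { return x * x; }

    // #ifdef / #else  becomes version blocks for conditional compilation:
    version (Windows)
        enum pathSep = '\\';
    else
        enum pathSep = '/';

    // Compile-time computation via CTFE, where C would use a macro
    // or an external code generator:
    int fib(int n) pure
    {
        int a = 0, b = 1;
        foreach (i; 0 .. n) { int t = a + b; a = b; b = t; }
        return a;
    }
    enum fib10 = fib(10);   // 55, evaluated by the compiler

    void main()
    {
        writeln(maxUsers, " ", square(7), " ", pathSep, " ", fib10);
    }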


--
Paulo


Re: Why is `scope` planned for deprecation?

2014-11-15 Thread Walter Bright via Digitalmars-d

On 11/15/2014 11:14 PM, Paulo Pinto wrote:

Am 16.11.2014 um 05:51 schrieb Walter Bright:

What I find odd about the progress of C++ (11, 14, 17, ...) is that
there has been no concerted effort to make the preprocessor obsolete.

What about templates, compile time reflection, modules and compile time code
execution?


Competent and prominent C++ coding teams still manage to find complex and 
tangled uses for the preprocessor that rely on the most obscure details of how 
the preprocessor works, and then hang their whole codebase on it.


I find it baffling, but there it is. I've made some effort to get rid of 
preprocessor use in the DMD source.




No need for the pre-processor other than textual inclusion and conditional
compilation.


Andrei, Herb, and I made a proposal to the C++ committee to introduce 'static 
if'. It was promptly nailed to the wall and executed by firing squad. :-)
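
For readers who have not seen it, this is roughly what 'static if' buys 
you in D today; the FixedBuffer type below is just a made-up example, 
not code from any of the proposals:

    import std.stdio;

    struct FixedBuffer(size_t n)
    {
        static if (n <= 256)
            ubyte[n] data;              // small: stored inline
        else
        {
            ubyte[] data;               // large: heap-allocated on demand
            void allocate() { data = new ubyte[n]; }
        }
    }

    void main()
    {
        FixedBuffer!64 small;           // the else-branch is never compiled in
        FixedBuffer!4096 big;
        big.allocate();
        writeln(small.data.length, " ", big.data.length);   // 64 4096
    }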




Re: Why is `scope` planned for deprecation?

2014-11-14 Thread Jacob Carlborg via Digitalmars-d
On 2014-11-13 23:00, Ola Fosheim Grøstad 
ola.fosheim.grostad+dl...@gmail.com wrote:



Unfortunately, I guess I can't use it on my next project anyway, since I
need to support iOS5.1 which probably means XCode… 4? Sigh…


Can't you use Xcode 6 and set the minimum deploy target to iOS 5.1? If 
that's not possible it should be possible to use Xcode 4 and replace the 
Clang compiler that Xcode uses with a later version.


--
/Jacob Carlborg


Re: Why is `scope` planned for deprecation?

2014-11-14 Thread via Digitalmars-d

On Friday, 14 November 2014 at 08:10:22 UTC, Jacob Carlborg wrote:
Can't you use Xcode 6 and set the minimum deploy target to iOS 
5.1? If that's not possible it should be possible to use Xcode 
4 and replace the Clang compiler that Xcode uses with a later 
version.


I don't know yet, but the 5.1 simulator will probably have to run 
on OS-X 10.6.8 from what I've found on the net. Maybe it is 
possible to do as you said with clang… Hm.


Re: Why is `scope` planned for deprecation?

2014-11-14 Thread Araq via Digitalmars-d
I think it makes sense to have something for ownership. The 
error of Rust wasn't going down that road, but going down that 
road 100%, which comes at a cost at the interface level that is 
too high. A simpler ownership system could fall back on the GC 
or unsafe features when it falls short.


I'm confident at this point that we can get most of the benefit 
of an ownership system with something way simpler than Rust's 
system if you accept not covering 100% of the scenarios.


Do you happen to have any concrete reasons for that? An example,
maybe? Maybe start by explaining in detail how Rust's system is
too complex? I'm sure the Rust people will be interested in how
you can simplify a (most likely sound) type system that took
years to come up with and refine.


Re: Why is `scope` planned for deprecation?

2014-11-14 Thread via Digitalmars-d

On Thursday, 13 November 2014 at 13:29:00 UTC, Wyatt wrote:
Unfortunately for your sanity, this isn't going to happen.  
Similarly unlikely are multiple pointer types, which Walter has 
repeatedly shot down.  I'd suggest bringing it back up if and 
when discussion of D3 begins in earnest.


D needs to start to focus on providing an assumption free system 
level programming language that supports the kind of modelling 
done for system level programming.


I am not sure if adding templates to D was a good idea, but now 
that you have gone that route to such a large extent, you might 
as well go all the way: better support for templated SYSTEM 
programming would make sense. Make it your advantage (including 
deforestation/common subexpression elimination, constraint 
systems, etc.).
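
As a rough illustration of the deforestation point (and only an 
analogy; D makes no formal fusion guarantee): a templated range 
pipeline in today's std.algorithm already composes into a single loop, 
with no intermediate arrays materialised.

    import std.algorithm : filter, map, sum;
    import std.range : iota;
    import std.stdio : writeln;

    void main()
    {
        // filter and map are lazy templates; sum drives the one real loop.
        auto total = iota(1, 1000)
            .filter!(n => n % 3 == 0)
            .map!(n => n * n)
            .sum;
        writeln(total);
    }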


As an application level programming language D stands no chance. 
More crutches and special casing will not make D a system level 
programming language. Neither will adding features designed for 
other languages geared towards functional programming (which is 
the antithesis of system level programming).


Yes, it can be done using a source to source upgrade tool.

No, attribute inference is not a silver bullet; it means changes 
to libraries would silently break applications.


Yes, function signatures matters. Function signatures are 
contracts, they need to be visually clear and the semantics have 
to be easy to grok.


No, piling up low hanging fruits that are not yet ripe is not a 
great way to do language design.


Re: Why is `scope` planned for deprecation?

2014-11-14 Thread deadalnix via Digitalmars-d

On Friday, 14 November 2014 at 14:59:39 UTC, Araq wrote:
I think it makes sense to have something for ownership. The 
error of Rust wasn't going down that road, but going down that 
road 100%, which comes at a cost at the interface level that is 
too high. A simpler ownership system could fall back on the GC 
or unsafe features when it falls short.


I'm confident at this point that we can get most of the 
benefit of an ownership system with something way simpler than 
Rust's system if you accept not covering 100% of the scenarios.


Do you happen to have any concrete reasons for that? An example,
maybe? Maybe start by explaining in detail how Rust's system is
too complex? I'm sure the Rust people will be interested in how
you can simplify a (most likely sound) type system that took
years to come up with and refine.


I'm not sure we understand "Rust's type system is too complicated"
the same way.

Let's be clear: there is no accidental complexity in Rust's type
system. It is sound and very powerful. There is no way I can
think of to make it simpler.

That being said, there are cases where Rust's type system shines,
for instance tree-like data structures with the same lifetime,
passing down immutable objects to pure functions, and so on.

But there are also cases where it becomes truly infamous, like a
digraph of objects with disparate lifetimes.

Rust made the choice of safe memory management that does not
rely on the GC, so it has to handle the infamous cases.
This requires a rich and complex type system.

My point is that we can support the nice cases with something
much simpler, while delegating the infamous ones to the GC or
unsafe constructs. The good news is that the nice cases are more
common than the hard ones (or Rust would be absolutely unusable),
so we can reap most of the benefits of a Rust-like approach
while introducing much less complexity in the language.
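
To make the split concrete, here is a hedged sketch in D terms of what 
"support the nice cases, delegate the rest" could look like; the exact 
semantics of scope checking is the open question of this thread, so 
treat the annotation below as aspirational rather than as current 
behaviour.

    // Nice case: a pure function borrows a slice for the duration of the
    // call and cannot retain it, so no ownership bookkeeping is needed.
    int total(scope const(int)[] values) pure
    {
        int s;
        foreach (v; values) s += v;
        return s;
    }

    // Infamous case: a digraph whose nodes have no single owner or
    // lifetime. Instead of encoding that in the type system, let the
    // GC own it.
    class Node
    {
        int id;
        Node[] edges;       // arbitrary sharing and cycles are fine under GC
        this(int id) { this.id = id; }
    }

    void main()
    {
        import std.stdio : writeln;

        int[3] buf = [1, 2, 3];
        writeln(total(buf[]));      // borrow a view of stack data: prints 6

        auto a = new Node(1), b = new Node(2);
        a.edges ~= b;
        b.edges ~= a;               // a cycle; the GC deals with it
        writeln(a.edges[0].id);     // prints 2
    }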

From a cost-benefit perspective, this seems like the right way
forward to me.

To quote the guy from the PL for video games video series, an 85%
solution is often preferable.


Re: Why is `scope` planned for deprecation?

2014-11-13 Thread Manu via Digitalmars-d
Are you guys saying you don't feel this proposal is practical?
http://wiki.dlang.org/User:Schuetzm/scope

I think it's a very interesting approach, and comes from a practical
point of view. It solves the long-standing issues, like scope return
values, in a very creative way.
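
For anyone who hasn't read the proposal, the "scope return values" 
problem it tackles looks roughly like this (an illustrative sketch, not 
the proposal's own example): a function that hands back a view into 
memory the caller supplied is safe, the same function used to return a 
view of a stack temporary is not, and today the compiler cannot tell 
the two apart.

    // Returning a borrowed view is the normal, efficient D idiom:
    const(char)[] firstWord(const(char)[] line)
    {
        size_t i;
        while (i < line.length && line[i] != ' ') ++i;
        return line[0 .. i];        // no copy, just a narrower slice
    }

    const(char)[] oops()
    {
        char[16] local = "hello world     ";
        return firstWord(local[]);  // escapes a slice of stack memory;
                                    // this is what scope checking should reject
    }

    void main()
    {
        import std.stdio : writeln;
        writeln(firstWord("scope planned for deprecation"));  // "scope": fine
    }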

On 13 November 2014 08:33, Andrei Alexandrescu via Digitalmars-d
digitalmars-d@puremagic.com wrote:
 On 11/12/14 2:10 PM, deadalnix wrote:

 On Wednesday, 12 November 2014 at 15:57:18 UTC, Nick Treleaven
 wrote:

 I think Rust's lifetimes would be a huge change if ported to D. In
 Rust user types often need annotations as well as function parameters.
 People tend to want Rust's guarantees without the limitations. I think
 D does need some kind of scope attribute verification, but we need to
 throw out some of the guarantees Rust makes to get an appropriate fit
 for existing D code.


 Rust is not the first language going that road. The problem is
 that you get great complexity if you don't want to be too
 limiting in what you can do. This complexity ultimately ends up
 costing more than what you gain.

 I think the sane road to go down is supporting
 ownership/borrowing for common cases, and falling back on the GC or
 unsafe constructs for the rest.

 One has to admit there is no silver bullet, and shoehorning
 everything into the same solution is not gonna work.


 I agree. This is one of those cases in which a good engineering solution may
 be a lot better than the perfect solution (and linear types are not even
 perfect...).

 Andrei

