Re: what should be the default extension?

2008-01-07 Thread Richard Hainsworth

May I suggest the following extension to the 'use' pragma, viz.
use <module name written in unicode and case sensitive> in <filename as
constrained by local system>


For justification, see below.

<aside>There were some hot replies to what I thought was a fairly
trivial question. A corollary perhaps of an observation in Parkinson's
Law - people on committees argue longest over the item with the
smallest cost. :)</aside>


The broken operating system, or rather family of systems (and I
converted away from them about three years ago), is still used by 90% of
users. In practice, it does matter what happens in that environment.


But also consider: whatever operating system is in use, it has to know
how to handle a file by default - interpret it as a script, run it as a
native executable, pipe it to an editor or renderer, etc. That
information has to be associated with the file in some way. One
operating system uses name extensions, another looks at the first line
for a #!, etc.


Personally, I find it useful to have a visible clue in the name (via an
extension) as to the content of the file. This seems to me more
widespread than just an OS requirement, as otherwise why have *.tar,
*.tar.gz, *.pdf, *.doc, *.png, etc., or even .* for hidden files in Unix?


If it doesn't matter - as far as perl6 is concerned - how the module is
named (see Larry Wall's response regarding unicode and
case-sensitivity), then the extensions too are irrelevant, no? So if I
choose to call my perl6 scripts *.p6 it should not matter? Other than
for the sake of tradition or conformity with the tribe's sense of
propriety :)


And that brings me to another question. Should it matter what the name
of the file is? For modules in perl5, as far as I can discern, the name
of the module in the file name has to match the package name
inside the script. I have found this default behaviour annoying at times.
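
For concreteness, a sketch of the Perl 5 convention in question (file and
package names hypothetical):

   # Perl 5: "use Foo::Bar;" searches @INC for the file Foo/Bar.pm,
   # and that file is expected to declare the package of the same name.
   package Foo::Bar;   # must agree with the path Foo/Bar.pm
   sub greet { print "hello from Foo::Bar\n"; }
   1;                  # a module must end with a true value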


By extending the 'use' pragma to include information about which 
container it can be found in, filenames become truly irrelevant. 
Moreover, the initiation file for a large project might just be a
configuration file containing all the module names, together with use
Main in <ProjectDebugStage.v03>.


[EMAIL PROTECTED] wrote:

No, some people put .pl on the end of their scripts because they are
running on broken operating systems.

So, I imagine, for Perl6, I'll be making the same strong recommendation
that Perl6 scripts, just like Perl5 and Perl4 scripts before them, have
*no* extension.

Randal L. Schwartz - Stonehenge Consulting Services, Inc. - +1 503 777

Enthusiastically seconded! Why should we let the language of Mordor
corrupt our discourse?





Re: what should be the default extension?

2008-01-07 Thread Trey Harris

In a message dated Mon, 7 Jan 2008, Richard Hainsworth writes:


May I suggest the following extension to the 'use' pragma, viz.
use <module name written in unicode and case sensitive> in <filename as
constrained by local system>


Oh please, no.

The entire point of the wording currently in the synopsis is so that we
can have platform-independent location of libraries.  The name-mangling
capability hypothesized lets the same C<use> get the requested resource,
regardless of the file (or database location!) where the resource is
actually located on the platform running Perl.


Your proposal would force us to have platform-dependent locations, and 
hence platform-dependent programs.  Do you really want to see dozens of 
switches like


  given $?OS {
     when m:i/win/ { use Foo in <WinFoo.pm> }
     when m:i/nix/ { use Foo in <UnixLikeFoo.pm> }
  }

at the top of programs?

The broken operating system, or rather family of systems (and I converted
away from them about three years ago), is still used by 90% of users. In
practice, it does matter what happens in that environment.


Yes--but better for implementors to make that decision than for each and
every Perl programmer (most of whom will have no experience with most OSes
Perl runs on, so will be making such decisions blind!) to make them
separately (and differently).


But also consider: whatever operating system is in use, it has to know how
to handle a file by default - interpret it as a script, run it as a native
executable, pipe it to an editor or renderer, etc. That information has to be
associated with the file in some way. One operating system uses name
extensions, another looks at the first line for a #!, etc.


Personally, I find it useful to have a visible clue in the name (via an
extension) as to the content of the file. This seems to me more widespread
than just an OS requirement, as otherwise why have *.tar, *.tar.gz, *.pdf,
*.doc, *.png, etc., or even .* for hidden files in Unix?


The .tar, etc., are typically named for transport, where you may need to 
know the filetype without having the benefit of its contents for 
examination.  *.pdf and *.doc are filetypes that did not exist until after 
Windows did, and so the required-extension behavior ossified.  The .* 
convention merely came about because it was a convenient kludge put into 
ls for the purpose.


By way of illustration,
   % mv foo.png foo.jpg
does not convert the file's image type.  If it did (or if it refused to do 
the move) you'd have an argument there.


If it doesn't matter - as far as perl6 is concerned - how the module is named
(see Larry Wall's response regarding unicode and case-sensitivity), then the
extensions too are irrelevant, no? So if I choose to call my perl6 scripts
*.p6 it should not matter? Other than for the sake of tradition or conformity
with the tribe's sense of propriety :)


Sure, knock yourself out.  Call it .perl6 if you like, or
.niftynewlanguagethatisn'tquitecompatiblewith.pl (except those violate
8.3 rules).


And that brings me to another question. Should it matter what the name of the
file is? For modules in perl5, as far as I can discern, the name of the module
in the file name has to match the package name inside the script. I
have found this default behaviour annoying at times.


To do otherwise is to again require platform-dependence.  If you truly
want to force it, load the code manually with C<require> in a C<BEGIN>
block.
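
A minimal sketch of that workaround, shown in the Perl 5 spelling since the
Perl 6 details were still settling (the path is hypothetical):

   BEGIN {
       # Bypass the name-based module search entirely and load a
       # specific file, by path, at compile time.
       require './local/MyRenamedFoo.pm';
   }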


By extending the 'use' pragma to include information about which container it 
can be found in, filenames become truly irrelevant.


Quite to the contrary--they become much more relevant.

Moreover, the initiation file for a large project might just be a
configuration file containing all the module names, together with use
Main in <ProjectDebugStage.v03>.


And that configuration file would be a huge comb of $?OS and $?OSVER 
switches in a typical platform-independent development.  And even so, the 
whole thing would be far more brittle than the automatic-location system 
we have today, as problematic as it is (not very, actually...).


Trey


Re: what should be the default extension?

2008-01-07 Thread Paul Hodges

A small tangent that might be relevant -- what's the current convention
for, say, putting several related packages in the same file?

In p5, I might write a great Foo.pm that loads Foo::Loader.pm and
Foo::Parser.pm and Foo::Object.pm; I'd usually drop them into separate
files and have one load the rest, so I could just use Foo; and get on
with my code, but that does add some maintenance that could be skipped
if loading the one file automatically grabbed all the relevant parts.
Plusses and minuses there. If Foo::Widget and Foo::Gadget are only
of use with something in Foo proper, maybe it would be reasonable
to toss them into the same file, though.
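
By way of a sketch (names hypothetical), the Perl 5 situation in question:
several packages can share one file, and only the outermost name has to
match the file name:

   # Foo.pm -- "use Foo;" loads this file by name...
   package Foo;
   sub new { my $class = shift; return bless {}, $class; }

   # ...but the same file can also define helper packages, which
   # then never get files (or loading logic) of their own.
   package Foo::Widget;
   sub new { my $class = shift; return bless {}, $class; }

   1;   # the file as a whole must still return a true value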

Looking at this proposal made me wonder if there were any value in
maybe tossing them all into one file, since that's how they'd usually
be used, but that if I just really had a use for a Foo::Parser and
didn't want any of the rest, maybe I could still
   use Foo::Parser in <Foo.pm>;

That could load just the one section from whatever file it found in a
usual search for Foo. I really can't see that it's anything very
urgent, but maybe someone sees a use for that? It would have the
possible advantage of making the more common case be easier, and the
rarer, exceptional case be the one that takes the extra work. Even so,
I'm just not sure it's worth the work it might take to build it in.

All in all, I'm still thinking in p5, so I'd still probably do it the
old way. =o)


$?OS semantics

2008-01-07 Thread Trey Harris

Sorry, quoting myself...

In a message dated Mon, 7 Jan 2008, Trey Harris writes:

 given $?OS {
    when m:i/win/ { use Foo in <WinFoo.pm> }
    when m:i/nix/ { use Foo in <UnixLikeFoo.pm> }
 }


It strikes me that $?OS and $?OSVER should probably not be strings (as 
they now are in Pugs) and should be objects--in fact, they can both be the 
same object, just with different stringifications.  Auto-instantiated 
singletons of classes named for the OS, to be precise.


That way, you could do Config- and OS-specific-type stuff through the $?OS
object--which is nice, though not that important--but better yet you can
do


  given $?OS {
     when Windows { ... }
     when MacOS   { ... }
     when Linux   { ... }
     when Unix    { ... }
     default      { ... }
  }

rather than having to do string-matching (and one would assume that Linux
and MacOS are subtypes of Unix, etc.).


Then we can have roles that describe cross-cutting behavior of various 
OS's (like POSIX):


  my &trytolink;
  given $?OS {
     when OS::HasSymlinks { &trytolink := &*symlink; }
     when OS::HasLinks    { &trytolink := &*link; }
     default              { &trytolink :=
                            -> ($old, $new) { fail X::LinkNotAvailable }
                          }
  }

  # (Aside: Is there some way to create a two-argument sub that ignores
  #  its arguments?  Would -> (undef, undef) or -> (*, *) or -> (Any, Any)
  #  work?)

So $?OS isn't the type of OS, it's *the OS*, and you can manipulate the 
OS through it.


Thoughts?

Trey


Re: $?OS semantics

2008-01-07 Thread Larry Wall
On Mon, Jan 07, 2008 at 11:42:06AM -0500, Trey Harris wrote:
 So $?OS isn't the type of OS, it's *the OS*, and you can manipulate the 
 OS through it.

Note that $?OS is the OS that is-or-was running at compile time,
whereas $*OS is the OS running right now (at run time).  Those don't
have to be the same, especially for modules distributed in some kind
of universal bytecodeish format.  That is, $?OS is a compile-time
constant, while $*OS is a variable.  In fact, if someone invents
a way for a process to freeze and restore the execution context in
a different place, $*OS could change more than once.
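
A minimal sketch of the distinction, using the 2008-era names from this
thread (implementations later renamed these variables):

    # $?OS is baked in when the code is compiled; $*OS is looked up
    # while the code runs.  With precompiled or bytecode distribution
    # the two can legitimately differ.
    say "compiled under: $?OS";   # compile-time constant
    say "running under:  $*OS";   # run-time variable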

All $? variables should be considered compile-time constants, in fact.
That's why self is no longer spelled $?SELF, for instance, because
it varies.

Larry


Re: $?OS semantics

2008-01-07 Thread Trey Harris

In a message dated Mon, 7 Jan 2008, Larry Wall writes:

On Mon, Jan 07, 2008 at 11:42:06AM -0500, Trey Harris wrote:

So $?OS isn't the type of OS, it's *the OS*, and you can manipulate the
OS through it.


Note that $?OS is the OS that is-or-was running at compile time,
whereas $*OS is the OS running right now (at run time).  Those don't
have to be the same, especially for modules distributed in some kind
of universal bytecodeish format.


Understood.  My earlier suggestions apply to $*OS and $*OSVER as well 
though.


That is, $?OS is a compile-time constant, while $*OS is a variable.  In 
fact, if someone invents a way for a process to freeze and restore the 
execution context in a different place, $*OS could change more than 
once.


Good point.  So you wouldn't want to assign a static behavior to my 
trytolink example; instead you'd probably want to do


  role BestEffortLinkAble {
      method trytolink ($old, $new) { ... }
  }

And mix the role into $*OS.  Then call $*OS.trytolink() to get the proper
behavior at the proper time.
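
A hedged sketch of that call pattern, in the spelling this thread uses
(the role name comes from above; the file names are illustrative):

   $*OS does BestEffortLinkAble;           # mix the role in at run time
   $*OS.trytolink('old.txt', 'new.txt');   # behavior follows the current OS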


Trey


will there be a computed goto in perl 6

2008-01-07 Thread herbert breunung

if we take TimTowtdi strictly, the answer would be yes :)

sorry for nagging but my question about the existence of ($min, $max) =
@array.minmax also seems vaporized.


cheers
herbert




Re: $?OS semantics

2008-01-07 Thread chromatic
On Monday 07 January 2008 08:42:06 Trey Harris wrote:

 Then we can have roles that describe cross-cutting behavior of various
 OS's (like POSIX):

    my &trytolink;
    given $?OS {
       when OS::HasSymlinks { &trytolink := &*symlink; }
       when OS::HasLinks    { &trytolink := &*link; }
       default              { &trytolink :=
                              -> ($old, $new) { fail X::LinkNotAvailable }
                            }
    }

I agree, but these examples seem more like characteristics of filesystems
than of an OS.

-- c


Re: $?OS semantics

2008-01-07 Thread Larry Wall
On Mon, Jan 07, 2008 at 02:05:18PM -0500, Trey Harris wrote:
 And mix the role in to $*OS.  Then call $*OS.trytolink() to get the proper 
 behavior at the proper time.

Imagine a Beowulf cluster of those, and now $*OS might even point to
thread-specific data.

Larry


Re: will there be a computed goto in perl 6

2008-01-07 Thread Larry Wall
On Mon, Jan 07, 2008 at 08:22:34PM +0100, herbert breunung wrote:
 if we take TimTowtdi strictly, the answer would be yes :)

Just as in Perl 5, you can say goto $label, with no guarantees
on efficiency.

 sorry for nagging but my question about the existence of ($min, $max) = 
 @array.minmax also seems vaporized.

We currently have a minmax infix list operator.  We haven't defined
a method yet.

($min, $max) = [minmax] @array

might do what you want, if minmax treats a scalar as both a min and a
max.
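
A hedged sketch of the effect being asked for, using the separate .min and
.max calls that were already specified (the single-pass [minmax] spelling
was, as noted, still tentative):

    my @array = 3, 1, 4, 1, 5, 9;
    my ($min, $max) = @array.min, @array.max;   # 1 and 9; two passes, but unambiguous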

Larry


Re: S02 interpolation of entire hashes

2008-01-07 Thread Jonathan Lang
Dave Whipp wrote:
The tests in S02 L<S02/Literal/"In order to interpolate an entire hash">
appear to assume that an interpolated hash renders its keys in a sorted
order. But this property doesn't seem to be stated in the text. Is it
true that the keys are always sorted for interpolation? (is it possible,
in P6, for hash keys to not be comparable?)

I don't know if it's stated anywhere; but ISTR something about %h.keys
returning a Set instead of a List, and %h.vals returning a Bag.  In
fact, I believe that this sort of thing was part of the motivation
behind adding Sets and Bags to the core language: hash keys are not
inherently sorted; and if you're using something other than strings as
your hash keys, the keys may not be sortable.
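
By way of a hedged sketch of the deterministic alternative: sort explicitly
rather than relying on any interpolation order:

    my %h = b => 2, a => 1, c => 3;
    # .sort on a hash sorts its pairs by key; no order is promised otherwise
    say %h.sort.map({ "{.key} => {.value}" }).join(', ');
    # output: a => 1, b => 2, c => 3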

-- 
Jonathan Dataweaver Lang


Re: S02 interpolation of entire hashes

2008-01-07 Thread Larry Wall
On Mon, Jan 07, 2008 at 05:23:36PM -0800, Dave Whipp wrote:
The tests in S02 L<S02/Literal/"In order to interpolate an entire hash">
appear to assume that an interpolated hash renders its keys in a sorted
order. But this property doesn't seem to be stated in the text. Is it true
that the keys are always sorted for interpolation?

No, the tests are probably assuming a particular Haskell implementation
that likely uses trees underneath, I would guess.  Such an assumption
would be erroneous.

 (is it possible, in P6, for hash keys to not be comparable?)

Certainly, though the default key type is Str, which is always comparable.
But we can presumably hash anything that supports an identity via .WHICH.
This minimal requirement does not imply an orderable type.

Larry