Auto-install (was autoloaded...)

2001-02-08 Thread Filipe Brandenburger


Branden wrote:
When I download a module from the Internet, say module Foo, I install
it and try to use it, and it promptly breaks when it tries to `use Bar'
and finds that Bar is not installed on my system. So I have to go on
the Internet again, find Bar, install it, and so on, until I find that
Bar needs Baz, or something like that.

Well, I think this could be handled by Perl itself.
That way, Perl could automatically fetch and install modules
(supposing module installation is not as hard as it is now, involving
makefiles and such...).



Well, I thought about this quite a bit, and I finally realised what bothered
me, and a (pretty reasonable) way to attenuate it a bit.

What bothers me is: suppose I write a perl script to handle, say, currency
conversion, or something like getting stock quotes from a variety of sites
and building a spreadsheet table with them. That's typically something that
would be used mostly by `users', especially the ones that are *not*
JAPHs. Now suppose my script uses modules X, Y, and Z. X is in the standard
library, but Y and Z are not. And Z uses module K, which isn't in the
standard library either. Now suppose I want to make my script available at a
web site for anyone who wants to download and use it.

Now suppose a user (one that uses Windoze, didn't read the Camel, and can
barely manage to install Perl in binary form -- InstallShield is too
complicated!) goes to the site, finds the script useful, and downloads it.
The first time he tries to run it, it will croak with a `module Y not
found!'. Now, even if I put instructions on the web site about how to get
the required modules Y, Z, and K (which I really don't want to do for every
little script I want to deploy), and even if the guy doesn't give up right
away (which I'm sure he will), I'm certain he'll *NEVER* get the thing
right, even if the modules are available in binary form, which I believe is
not the case for most of them.

The solution I propose to this problem is borrowed (copied) from what Java
did in version 1.1 with jars (did wrong, of course), and is somewhat like
RedHat's rpms. What I suggest is having a kind of archive that would be a
container for all the code needed to run the thing, so that it can be
downloaded and installed all at once. Not .tgz, but a format that perl
could recognize as its own. A tool (`perlinstall', or something like that)
distributed with perl would take this package, extract the scripts,
modules, etc., and install them in the right places, with minimal (ideally
zero) user choices.

Well, the thing with rpms is that they can be source, binary, or platform
independent, which is indicated by having .src, .i386/.ppc/.alpha/..., or
.noarch in the name. The same could be done with perl. The src kind would
include the source for all needed modules, besides the scripts that the
user is supposed to use. If any of the modules use XS (or its successor),
it would be possible to build binary packages, which would take the burden
of dealing with C compilers and make off the user (and make is one thing I
hope to see stripped out of module building in Perl 6 -- since we have a
MakeMaker, we could have a Maker that does it all! Advantages? Platform
independence and no need for external tools other than the C compiler).

Dependency generators would be necessary as well, but I think some already
exist.

Comments?

- Branden





Re: Auto-install (was autoloaded...)

2001-02-08 Thread Nicholas Clark

On Thu, Feb 08, 2001 at 08:53:07AM -, Filipe Brandenburger wrote:
 The solution I propose to this problem is borrowed (copied) from what Java 
 did in version 1.1 with jars (did wrong, of course), and somewhat like 
 RedHat's rpms. What I suggest is having a kind of archive that would be like 
 a container for all the code needed to run the thing, so that it can be 
 downloaded and installed all at once. Not .tgz, but a thing that perl could 
 recognize at his own's. A tool (`perlinstall', or something like that) 
 distributed with perl would take this package and extract scripts, modules, 
 ..., and install it in the right places, with minimal (ideal zero) user 
 choices.

 Comments?

If you don't need XS modules (or anything else that requires the services
of Makefile.PL or some other install-type script on the end machine), it
is possible to put the necessary modules in a zip file and append the zip
file to the perl script after __END__.
Run zip -A to "adjust the self-extracting exe" and you have a single file
which is both a valid perl script (at the front) and a valid zip file
(at the end).

If you have a perl capable of reading @INC from zip files - I've produced 2
different patches for perl5 that could do this (one using source filters,
one using sfio) - then you

BEGIN { unshift @INC, $0 }

and your bundled versions of the modules can be found without even
needing to unzip anything.

Problems:
1: you can't rely on EOF on the DATA handle in main
2: your script has to be transferred in binary mode by ftp, and not mangled
   by any web browsers when saved
3: you really need the new perlio implementation in 5.7 to make it work
   best in a "vanilla" perl
   I'm working towards being able to do this (see the experimental
   PerlIO::gzip on CPAN for gunzip in a perlio layer)
4: can't do XS
5: not sure if it's a sick hack or an elegant hack

It is also possible to place the included zipfile on the end of the perl
executable and put $^X in @INC to make a one-file perl distribution, but
I don't think anyone has ever tried this.

Nicholas Clark



Re: POST blocks (like END, but in a sub or sub-like scope)

2001-02-08 Thread Nicholas Clark

On Wed, Feb 07, 2001 at 10:59:00PM -0600, David L. Nicol wrote:
 Nicholas Clark wrote:
 
  on the other hand, I'll argue the other side that
  
  {
my $flag;
open(FOO, "bar") ? ($flag=1) : die "aargh $!";
...
  }
  post {
close FOO if $flag;
  }
  
  is clearer because the tidy up doesn't visually get in the way of the flow
  of what you're doing, and you can see what $flag is meant to be
 
 How is 
 
   $flag=1
 
 clearer than
 
   POST{close FOO;dropLock}

I didn't give $flag a good name to explain what I was thinking:

POST is clearer if the POST{} action fits on one line;
otherwise you end up with lots of cleanup stuff visually in the way of the
regular flow of control (in my opinion) --

stuff which feels like it belongs near the bit that returns.
Hmm. Or stuff that feels like it lives near the statement that it's
undoing.
I like both sides of the argument. I'll shut up.

 in this example?  I don't think the consistency issue is strong enough.

I do not think it is very strong, definitely not strong enough alone.

 We are also being inconsistent by not suggesting a PRE which would be
 analogous to BEGIN and would run at the beginning of a sub even though
 it is defined halfway into it. For interior blocks, a label would
 be used to say which block we're firing after.  Or we always use sub
 blocks and that's that.

Good point. "We provide PRE for symmetry with POST" sounds stupid;
"We don't provide a PRE to go with POST because there's little practical
use for it" sounds sensible.

Nicholas Clark



Re: Auto-install (was autoloaded...)

2001-02-08 Thread Michael G Schwern

On Thu, Feb 08, 2001 at 08:53:07AM -, Filipe Brandenburger wrote:
 Branden wrote:
 When I download a module from Internet, say module Foo, then I install
 it and try to use it, it promptly breaks when it tries to `use Bar'
 and sees that Bar is not installed on my system. So I have to go on
 to Internet again, find Bar, install it, so on, until I find Bar needs
 Baz, or anything
 like it.
 
 Well, I think this could be handled by Perl itself.

Oddly enough, Perl does handle this... mostly.  The CPAN shell can
automatically download and install prerequisites for modules, provided
the module explicitly declares the prereqs.  Class::DBI ultimately
needs something like 9 other CPAN modules, which would be a nightmare
but for this feature.

perl -MCPAN -e 'install Class::DBI'

It's only in the more recent versions of the CPAN shell (1.48), but
since the shell whines about updating itself every time you use it,
there's no reason not to have the latest.

I've taken advantage of this for local projects using the CPAN::Site
module and setting up a local repository for local code.

Now, the idea of perl automagically going out and downloading
untrusted code (all the code on CPAN is untrusted) from untrusted
sources (ergo, CPAN is untrusted) from untrusted scripts (if it's a
core feature, any script can do it) makes my feet itch.  We can't do
this until there's a way to security-audit CPAN... which is supposed
to be my job. :(

However, if you *really* want to do it, you can, pretty easily.  Just
code up a function which tries to use the given module; failing that, it
fires up the CPAN shell, installs the module, then tries to use it again.
I've been meaning to do this for a while now.
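A minimal sketch of that use-or-install function (not Schwern's actual code; the CPAN::Shell call is the CPAN module's programmatic interface, but treat the whole thing as illustrative):

```perl
use strict;
use warnings;

# Try to load a module; fall back to installing it via the CPAN shell.
sub use_or_install {
    my $module = shift;
    return 1 if eval "require $module; 1";   # already installed
    require CPAN;
    CPAN::Shell->install($module);           # fetch + build + install
    eval "require $module; 1"
        or die "still can't load $module: $@";
    return 1;
}

use_or_install('List::Util');   # a core module: loads without touching CPAN
```

For anything already on the system the CPAN branch is never reached, so this costs nothing in the common case.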

Another problem is the fact that not everything on CPAN installs
easily.  Most things do, but some require certain interaction and
configuration (LWP, Net::FTP, most any DBD driver...).  Those can't be
done automagically, but if we can do 80%, that's OK.


-- 

Michael G. Schwern   [EMAIL PROTECTED]http://www.pobox.com/~schwern/
If you have to shoot, shoot!  Don't talk.
-- Tuco, "The Good, The Bad And The Ugly"



Re: Auto-install (was autoloaded...)

2001-02-08 Thread Branden

Michael G Schwern wrote:

 Oddly enough, Perl does handle this... mostly.  The CPAN shell can
 automatically download and install prerequisites for modules, provided
 the module explicitly declares the prereqs.  Class::DBI ultimately
 needs something like 9 other CPAN modules, which would be a nightmare
 but for this feature.


The issue is actually not auto-downloading modules and their prerequisites,
but packaging several scripts and modules in one file, as Java's
jars do. I think supporting this would be neat.

As to the question of security: if you download a script from a site that
says it does XYZ and you actually trust that the script does XYZ (trust in
the sense that you *believe* it), I don't see why you wouldn't trust that
the script would load modules that aren't harmful, either from CPAN or from
another place.

And having to see the code before installing is no proof of security at
all, since Perl is the king of obfuscated languages and (I expect) Perl 6
will be able to distribute code in byte-code form. So I don't actually see
how auto-loading of modules from the Internet is so much less trustworthy
than manual loading of the same modules from the Internet, if they are
pretty obfuscated or are in byte-code form. Only a sandbox or something
like that can assure security in either case.

- Branden




Re: Auto-install (was autoloaded...)

2001-02-08 Thread Michael G Schwern

On Thu, Feb 08, 2001 at 12:07:18PM -0200, Branden wrote:
 The issue is actually not auto-downloading modules and their prerequisites,
 but actually packaging several scripts and modules in one file, so as Java's
 jar do. I think supporting this would be neat.

I thought about making a "par" utility.  It would basically do this:

# for each module needed...
perl Makefile.PL PREFIX=foo LIB=foo/lib
make test
make install

Then you just stick your program into foo/bin or something and tar it
all up and ship it off.  The "pun" utility (I couldn't resist) then
untars the thing and runs "perl -Ifoo/lib foo/bin/whatever.plx".

Any obvious flaws?  Poke me enough and I'll get around to doing it.


 As to the question of security, if you download a script on a site that says
 it does XYZ and you actually trust the script does XYZ (trust in the sense
 that you *believe* it), I don't see why wouldn't you trust that the script
 would load modules that aren't harmful, either from CPAN or from another
 place.

Download Memoize from CPAN sometime and install it.  Make sure you're
sitting down.  All it takes is one joker, or one person to have a bad
day, or get a little too drunk one night near a computer.

We *can* automate security auditing of CPAN.  I know it can be done
because I've seen it done on smaller scales and it will happen.  If
you missed it, look at the CPANTS synopsis
http:[EMAIL PROTECTED]/msg00148.html

It's vapor yet, but it's all within the realm of "solved problems".

-- 

Michael G. Schwern   [EMAIL PROTECTED]http://www.pobox.com/~schwern/
BOFH excuse #301:

appears to be a Slow/Narrow SCSI-0 Interface problem



Re: POST blocks (like END, but in a sub or sub-like scope)

2001-02-08 Thread John Porter

Bart Lateur wrote:
 The idea is inspired, not just by the existing BEGIN and END blocks
 syntax, but also by the fact that in 5.6.0, you can put a sub definition
 inside another sub. You can nest them. The effect is that nested sub is
 only visible from within the outer sub. That seems rather appropriate
 here.

Right.  I'm particularly concerned about lexical variables;
a post block ought to have access to the my vars in the block
to which it pertains.  Sticking it inside lexically makes this
clear.

-- 
John Porter




Re: Auto-install (was autoloaded...)

2001-02-08 Thread Branden

Michael G Schwern wrote:
 On Thu, Feb 08, 2001 at 12:07:18PM -0200, Branden wrote:
  The issue is actually not auto-downloading modules and their
prerequisites,
  but actually packaging several scripts and modules in one file, so as
Java's
  jar do. I think supporting this would be neat.

 I thought about making a "par" utility.  It would basically do this:

 # for each module needed...
 perl Makefile.PL PREFIX=foo LIB=foo/lib
 make test
 make install

 Then you just stick your program into foo/bin or something and tar it
 all up and ship it off.  The "pun" utility (I couldn't resist) then
 untars the thing and runs "perl -Ifoo/lib foo/bin/whatever.plx".


That's what I was talking about.

Only have `par' copy the scripts to foo/bin itself and handle tar/gzip
(maybe using Perl modules internally, so that platform-dependency issues
are isolated). Then it should save the file with a `.myarch.par' extension,
to signal it's a `Perl Archive' intended for the `MyArch' architecture
(replace `.myarch' with `.win32', `.linux-i386', ... and it makes sense).

`pun' could then check the file extension and issue some warnings, like:
``You are trying to install a file for architecture `Linux-Intel386' on a
machine running `Windows32'.''

`par' should support deploying .src archives, to be built on the target
machine, supplying a Makefile or similar that would build all needed modules
at once. And `pun' should recognize them and actually start the make
processing (perhaps prompting the user for it).

`pun' should be able to install the modules in a directory specified when
the par was created (like foo/lib), in a user-specified directory (like
~/myperlscripts/foo-1.2/lib), or in Perl's main module directories,
where the modules would be shared. `pun' could also check whether modules
shipped with the par archive are already present on the system, and skip
them (or update the system's modules, if the shipped ones are newer and the
user grants permission to do that -- i.e. he trusts the par).



And, more importantly, `par' and `pun' must ship with Perl 6.0.0. That
was the big mistake in Java: they only introduced `jar' in 1.1, but by then
Microsoft was already using `cab's for that, and `jar' never got to work for
applets, since the two big browsers supported different things.

Shipping them would make everyone's life easier. Script developers, because
they could package their code and give simple instructions like ``download
xyz-1.2.win32.par and run `pun xyz-1.2.win32.par', then run `perl
foo\bin\myscript.plx' to run the script''. And users, because they wouldn't
have to get into the issues of what modules are, how to get them, and how
to build them. And especially Perl, because having easy ways of deploying
and installing scripts/programs would lead many more developers to make
many more scripts and tools available to a much wider set of users.



 Any obvious flaws?  Poke me enough and I'll get around to doing it.


If no one points out flaws, I'll help do this thing. I think we should
only wait for the definition of how module building will work in Perl 6,
since I don't expect it to use `make', but rather a Perl-based tool.

- Branden




Closures and default lexical-scope for subs

2001-02-08 Thread Branden


I expect Perl 6 will have some way to make its variables lexically scoped
to the sub they are used in by default, either by the language default or
by a pragma, such as `use scope "subs";', as proposed in RFC 64.

If that's the case, I wonder how closures will be done, since having
lexical scope in every sub would mean that variables inside closures would
automatically be lexical, and thus different from the ones in the sub that
made the closure.

sub foo {
    $bar = 1;
    $closure = sub {
        return $bar++;
    };
    return $closure;
}

The problem is that $bar inside the closure is different from the one in
`foo', because it's lexically scoped to the sub it's in, which is the
closure. The code above would actually compile to

sub foo {
    my $bar = 1;
    my $closure = sub {
        my $bar;
        return $bar++;
    };
    return $closure;
}

One naive way to solve this would be saying `only named subs define a
lexical scope for their variables', but that isn't right if you consider
that anonymous subs can be named, as in

*foo = sub {
    $bar = 1;   # not lexically scoped???
    ...

Of course typeglobs will probably go away, but I doubt naming an anonymous
sub will be cut from the language, since it's used by many modules that
build classes (Class::*, I'm not sure which of them do it... Can someone
give some examples?).


Well, my suggestion for solving the problem is creating a new keyword of
the my/our/your/their/his/... family that would explicitly `import' the
variable from the parent sub. Of course this would be a compile-time thing
(like my), so it would only tell the compiler to do the right thing to make
that name access the parent sub's variable.

sub foo {
    $bar = 1;
    $closure = sub {
        parent_sub's $bar;
        return $bar++;
    };
    return $closure;
}

Of course `parent_sub's' sucks! But I have no better idea. Any suggestions?

I see a slight conceptual advantage in having a keyword to indicate the
closure, because the variable is actually stored together with the sub
reference somehow, and having a keyword to indicate that would make it
explicit.


References:
* See the way Python does it; it's explicit but rather clumsy. In
Python, you declare a parameter variable with the same name and set its
initial value to the outer variable's value. It does roughly the same thing
as the above, but uses default parameter values to do it, which is rather
confusing.

In http://www.python.org/doc/current/ref/function.html :
   # Return a function that returns its argument incremented by 'n'
   def make_incrementer(n):
       def increment(x, n=n):
           return x+n
       return increment

   add1 = make_incrementer(1)
   print add1(3)  # This prints '4'

Perl 5 would be:

sub make_incrementer {
    my $n = shift;
    my $increment = sub {
        my $x = shift;
        return $x + $n;
    };
    return $increment;
}
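The Perl 5 version runs as-is; a quick self-contained check of its behaviour (usage only, nothing new assumed):

```perl
use strict;
use warnings;

# Same closure as above: $n is captured lexically by the anonymous sub.
sub make_incrementer {
    my $n = shift;
    my $increment = sub {
        my $x = shift;
        return $x + $n;
    };
    return $increment;
}

my $add1 = make_incrementer(1);
print $add1->(3), "\n";   # prints 4, matching the Python example
```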

Perl 6 with parent_sub's (please give me a better name!) and
lexically-scoped variables by default:

use scope 'subs';
sub make_incrementer {
    $n = shift;
    $increment = sub {
        parent_sub's $n;
        $x = shift;
        return $x + $n;
    };
    return $increment;
}



Comments?

- Branden




Re: Auto-install (was autoloaded...)

2001-02-08 Thread Dan Sugalski

At 01:44 PM 2/8/2001 -0200, Branden wrote:
Michael G Schwern wrote:
  On Thu, Feb 08, 2001 at 12:07:18PM -0200, Branden wrote:
   The issue is actually not auto-downloading modules and their
prerequisites,
   but actually packaging several scripts and modules in one file, so as
Java's
   jar do. I think supporting this would be neat.
 
  I thought about making a "par" utility.  It would basically do this:
 
  # for each module needed...
  perl Makefile.PL PREFIX=foo LIB=foo/lib
  make test
  make install
 
  Then you just stick your program into foo/bin or something and tar it
  all up and ship it off.  The "pun" utility (I couldn't resist) then
  untars the thing and runs "perl -Ifoo/lib foo/bin/whatever.plx".


That's what I was talking about.

I'm not sure this is all necessary. Wouldn't we be reasonably better off if
we instead just shipped off bytecode-compiled versions of the scripts?
Seems easier to ship that way than as an archive of stuff. (We can, if it's
deemed useful, define the bytecode format in a way that allows us to
package up versions of modules that can be optionally loaded from the main
perl install instead.)

Seems simpler, and it also means you can, at the time the program is
initially compiled, crank up the optimization level a lot so you're handing
out the fastest version of your code you can.

Dan

--"it's like this"---
Dan Sugalski  even samurai
[EMAIL PROTECTED] have teddy bears and even
  teddy bears get drunk




Re: Auto-install (was autoloaded...)

2001-02-08 Thread Branden

Dan Sugalski wrote:

 I'm not sure this is all necessary. Wouldn't we be reasonably better off
if
 we instead just shipped off bytecode compiled versions of the scripts?
 Seems easier to ship that way than as an archive of stuff. (We can, if its
 deemed useful, define the bytecode format in a way that allows us to
 package up versions of modules that can be optionally loaded from the main
 perl install instead)

 Seems simpler, and it also means you can, at the time the program is
 initally compiled, crank up the optimization level a lot so you're handing
 out the fastest version of your code you can.

 Dan



At first glance, I really thought that was much better. But I see three
small problems:
1. updating the version of modules in the big-bloated-bytecode would
potentially mean recompiling the script and all the other modules.
2. it doesn't work for scripts/modules deployed in source-code format.
3. it doesn't work for modules that use C extensions.

Well, the first is not a big problem, since it's probably a very rare
situation, and the script code would probably change a bit if the modules'
versions change. The second also is not a big issue, since source code can
always be downloaded separately, and whoever wants the source code can
probably handle separate module downloading and installing.

Actually, I think the archive approach is more general, because it wouldn't
have these kinds of problems and would allow other resources to be deployed
together with the code, like documentation, perhaps even text files and
images used by a Perl/Tk application.



Another thing that looks like a difference at first is that the shipped
bytecode would run directly while the archive would have to be installed to
run. I think that's actually not true, as Perl 6 will probably use one of
its own new magic filehandles to read the sources/bytecode, and one of
those filehandles can be tied to decompress the archive on the fly and pass
the code to the interpreter.

And pure-bytecode applications would actually be distributed in the
`.noarch' kind of par archive, in contrast to the `.src' kind.

Did I miss something here? Is it just me, or do you also think this
(deploy/install) is essential for a language to be used by `layman-users',
and not only JAPHs?

- Branden




Re: Auto-install (was autoloaded...)

2001-02-08 Thread Michael G Schwern

On Thu, Feb 08, 2001 at 11:21:17AM -0500, Dan Sugalski wrote:
 I'm not sure this is all necessary. Wouldn't we be reasonably better off if 
 we instead just shipped off bytecode compiled versions of the scripts? 

Sure, except...
1) You lose your readable source code (discussions of B::Deparse as
   a viable alternative > /dev/null)
2) You have to make provisions to distribute your documentation
   separately.
3) It makes it harder to bundle non-Perl things, like configuration
   files, images, sound files, etc...  If you want to send those along
   with the bytecode you wind up needing a par-style utility anyway.
4) What Branden said
5) Do YOU have a stable bytecode compiler??  I don't.

Perhaps it wasn't clear: I don't mean to have par as part of 6.0, I
mean to have it out, like, maybe next month if I decide to work on it.

-- 

Michael G. Schwern   [EMAIL PROTECTED]http://www.pobox.com/~schwern/
Kids - don't try this at--oh, hell, go ahead, give it a whirl...



Re: Auto-install (was autoloaded...)

2001-02-08 Thread Dan Sugalski

At 11:52 AM 2/8/2001 +, Michael G Schwern wrote:
On Thu, Feb 08, 2001 at 11:21:17AM -0500, Dan Sugalski wrote:
  I'm not sure this is all necessary. Wouldn't we be reasonably better 
 off if
  we instead just shipped off bytecode compiled versions of the scripts?

Sure, except...
 1) You lose your readable source code (discussions of B::Deparse as
a viable alternative > /dev/null)

Not unless you strip the bytecode. I want to optionally package the source 
in the bytecode, since otherwise you can't do some optimizations after the 
fact on the generated bytecode stream.

 2) You have to make provisions to distribute your documentation
separately.

Presumably you'd package it up in the tar or zip archive containing the 
fully-compiled program.

 3) It makes it harder to bundle non-Perl things, like configuration
files, images, sound files, etc...  If you want to send those along
with the bytecode you wind up needing a par-style utility anyway.

Once again, you can package it up in the tar or zip archive that you're 
distributing the program in.

 4) What Branden said

Some of what Branden said isn't entirely applicable, though much of it is.

 5) Do YOU have a stable bytecode compiler??  I don't.

For perl 6? No. Not yet.

Perhaps it wasn't clear, I don't mean to have par as part of 6.0, I
mean to have it out, like, maybe next month if I decide to work on it.

I assumed that since you were discussing this on a perl 6 mailing list, you 
were talking about doing this with perl 6.

Dan

--"it's like this"---
Dan Sugalski  even samurai
[EMAIL PROTECTED] have teddy bears and even
  teddy bears get drunk




Re: Auto-install (was autoloaded...)

2001-02-08 Thread Dan Sugalski

At 02:43 PM 2/8/2001 -0200, Branden wrote:
Dan Sugalski wrote:
 
  I'm not sure this is all necessary. Wouldn't we be reasonably better off
if
  we instead just shipped off bytecode compiled versions of the scripts?
  Seems easier to ship that way than as an archive of stuff. (We can, if its
  deemed useful, define the bytecode format in a way that allows us to
  package up versions of modules that can be optionally loaded from the main
  perl install instead)
 
  Seems simpler, and it also means you can, at the time the program is
  initally compiled, crank up the optimization level a lot so you're handing
  out the fastest version of your code you can.
 
  Dan
 


At a first glance, I really found that's much better. But I saw three small
problems:
1. updating the version of modules in the big-bloated-bytecode would
potentially have to recompile the script and all other modules.

If the modules you're distributing change, you need to repackage things
anyway. Not much difference here, really. (And we wouldn't necessarily want
to automagically use a newer version of an installed module--that'd be
rather unpleasant if we had another case like we had with GD.)

2. it doesn't work for scripts/modules deployed in source code format.

Why are you assuming the source can't be packaged up in the generated bytecode?

3. it doesn't work for modules that use C extensions.

Definitely an issue, but packaging up a program with code that needs to be 
compiled on the target machine is problematic in a lot of ways anyway.

Actually, I think the archive approach is more general, because it wouldn't
have this kind of problems and would allow other resources to be deployed
together with the code, like documentation, perhaps even text files and
images used by a Perl/Tk application.

This is an excellent reason, and one I hadn't considered. I withdraw my
objections. Care to put together a PDD on how it should be handled?
(Including Archive::Tar as part of the base perl distribution's not
inappropriate, assuming we can get permission.)
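Archive::Tar (a real CPAN module) does make the bundling side easy; a minimal sketch, with all file and package names invented for illustration:

```perl
use strict;
use warnings;
use Archive::Tar;

# Bundle a tiny program plus a pure-Perl module into one archive.
my $tar = Archive::Tar->new;
$tar->add_data('foo/bin/app.plx',
    qq{use My::Mod;\nprint My::Mod::greet(), "\\n";\n});
$tar->add_data('foo/lib/My/Mod.pm',
    qq{package My::Mod;\nsub greet { "hello" }\n1;\n});
$tar->write('app.par');              # plain tar; external tools can read it

# On the "target machine": unpack and run against the bundled lib.
Archive::Tar->extract_archive('app.par');
system($^X, '-Ifoo/lib', 'foo/bin/app.plx');
```

Because the output is an ordinary tar file, it stays splittable by the usual external tools, which addresses the point raised below about archive formats.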

Did I miss something here? Is it just me, or you also think this
(deploy/install) is essential for a language be used by `layman-users', and
not only JAPHs.

Generally speaking, I assume installs require a certain minimum competence
on someone's part. The less that's assumed on the part of the end user, the
more is required on the part of the person packaging up the install. (It's
been my experience that it's an inverse-cube relationship.)

Doing this portably is an interesting exercise. Doing it non-portably is a 
waste of time, since if you're going to be platform-specific you're better 
off using the platform install tools.

Dan

--"it's like this"---
Dan Sugalski  even samurai
[EMAIL PROTECTED] have teddy bears and even
  teddy bears get drunk




Re: Auto-install (was autoloaded...)

2001-02-08 Thread Nicholas Clark

On Thu, Feb 08, 2001 at 12:26:59PM -0500, Dan Sugalski wrote:
 This is an excellent reason, and one I hadn't considered. I withdraw any 
 objections. Care to put together a PDD on how it should be handled? 
 (Including Archive::Tar as part of the base perl distribution's not 
 inappropriate, assuming we can get permission. )

Do we really want to use the tar format (over, say, cpio), as tar rounds
files up to 512-byte block boundaries and has some arbitrary restrictions
on filename lengths in the headers?

Nicholas Clark



Re: Auto-install (was autoloaded...)

2001-02-08 Thread Dan Sugalski

At 05:39 PM 2/8/2001 +, Nicholas Clark wrote:
On Thu, Feb 08, 2001 at 12:26:59PM -0500, Dan Sugalski wrote:
  This is an excellent reason, and one I hadn't considered. I withdraw any
  objections. Care to put together a PDD on how it should be handled?
  (Including Archive::Tar as part of the base perl distribution's not
  inappropriate, assuming we can get permission. )

Do we really want to use tar format (over say cpio) as tar rounds files
up to 512 block boundaries, and has some arbitrary restrictions on filename
lengths in the headers?

Having the perl archives unpackable by other available tools is a good
thing. Using the zip format's fine too--I don't much care either way.
(Zip's better in some ways since you can encode extra info in the file
headers, but I don't know that we'll need it, nor if any platform besides
VMS uses it.)

Dan

--"it's like this"---
Dan Sugalski  even samurai
[EMAIL PROTECTED] have teddy bears and even
  teddy bears get drunk




Re: Auto-install (was autoloaded...)

2001-02-08 Thread Michael G Schwern

On Thu, Feb 08, 2001 at 05:39:01PM +, Nicholas Clark wrote:
 Do we really want to use tar format (over say cpio) as tar rounds files
 up to 512 block boundaries, and has some arbitrary restrictions on filename
 lengths in the headers?

First cut will be tar.  Why?  It's simple, it's common, and we have a
well-developed Perl module for it.  Later it can be changed to
anything we want.  Encapsulation++

-- 

Michael G. Schwern   [EMAIL PROTECTED]http://www.pobox.com/~schwern/
Maybe they hooked you up with one of those ass-making magazines.
-- brian d. foy as misheard by Michael G Schwern



Re: Auto-install (was autoloaded...)

2001-02-08 Thread Nicholas Clark

On Thu, Feb 08, 2001 at 12:41:34PM -0500, Dan Sugalski wrote:
 At 05:39 PM 2/8/2001 +, Nicholas Clark wrote:

 Do we really want to use tar format (over say cpio) as tar rounds files
 up to 512 block boundaries, and has some arbitrary restrictions on filename
 lengths in the headers?
 
 Having the perl archives splittable by other available tools is a good 
 thing. Using the zip format's fine too--I don't much care either way. 

Yes, I agree. Hence cpio may not be great, as tools to deal with it
are much rarer.

 (Zip's better in some ways since you can encode extra info in the file 
 headers, but I don't know that we'll need it, nor if any platform besides 
 VMS uses it)

Acorn RISC OS zip tools use the extra info to store file metadata.
I think that the unix zip tools use the extra info field to store
create/access/modification times. There's a tagging format defined,
so a file can have multiple blocks of data in the extra info field
that programs that don't understand them treat as opaque.

zip's better in that it allows easy random access to a compressed file
[without having to decompress everything before it first], but worse for
the same reason, because you don't get as good a compression ratio by
compressing each file separately.
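For what it's worth, the random-access point is easy to see with the CPAN Archive::Zip module (not in the core distribution; the member names here are made up):

```perl
use strict;
use warnings;
use Archive::Zip qw(:ERROR_CODES);
use File::Temp qw(tempdir);

my $dir = tempdir(CLEANUP => 1);

# Two members, each compressed independently of the other.
my $zip = Archive::Zip->new;
$zip->addString("print \"hi\\n\";\n", 'script.pl');
$zip->addString('x' x 10_000,         'big.dat');
$zip->writeToFileNamed("$dir/bundle.zip") == AZ_OK or die 'write failed';

# Pull out one member without decompressing the other --
# zip's central directory points straight at it.
my $in = Archive::Zip->new("$dir/bundle.zip");
print $in->contents('script.pl');
```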

Nicholas Clark



Re: Auto-install (was autoloaded...)

2001-02-08 Thread Dan Sugalski

At 05:58 PM 2/8/2001 +, Nicholas Clark wrote:
On Thu, Feb 08, 2001 at 12:41:34PM -0500, Dan Sugalski wrote:
  At 05:39 PM 2/8/2001 +, Nicholas Clark wrote:

  Do we really want to use tar format (over say cpio) as tar rounds files
  up to 512 block boundaries, and has some arbitrary restrictions on 
 filename
  lengths in the headers?
 
  Having the perl archives splittable by other available tools is a good
  thing. Using the zip format's fine too--I don't much care either way.

Yes, I agree. Hence cpio may not be great as tools to deal with it
are much rarer

Yup, and finding them on non-unix platforms can be rather tricky, too. Zip 
and tar are probably the two biggies.

zip's better in that it allows easy random access to a compressed file
[without having to decompress everything before it first], but worse for
the same reason, because you don't get as good a compression ratio by
compressing each file separately.

I've seen it go both ways with compression, but I'm not sure that a few 
percent either way's a big deal. Packaging is more important than 
compression for this purpose anyway, I think.

Dan

--"it's like this"---
Dan Sugalski  even samurai
[EMAIL PROTECTED] have teddy bears and even
  teddy bears get drunk




Re: Auto-install (was autoloaded...)

2001-02-08 Thread Dan Sugalski

At 05:49 PM 2/8/2001 +, Michael G Schwern wrote:
On Thu, Feb 08, 2001 at 12:31:25PM -0500, Dan Sugalski wrote:
  Not unless you strip the bytecode. I want to optionally package the source
  in the bytecode, since otherwise you can't do some optimizations after the
  fact on the generated bytecode stream.

Clever dog!

Altogether too much so, I expect. :) Reading the optimizing compiler books 
makes my head hurt, but there's a lot of interesting stuff in them.

   2) You have to make provisions to distribute your documentation
  separately.
 
  Presumably you'd package it up in the tar or zip archive containing the
  fully-compiled program.

Separated documentation is no documentation.

At some point things are going to get split out, unless you wedge the docs 
into the actual program itself. (You were, after all, talking about config 
files and XS modules, and those can't usefully stay inside the archive)

   5) Do YOU have a stable bytecode compiler??  I don't.
 
  For perl 6? No. Not yet.

Is perlcc considered really stable and usable in 5.6?  Hmm, my little
test would say no. :(

I shan't be going there for lack of time. It'd be nice if someone had the 
time to make it work.

  I assumed that since you were discussing this on a perl 6 mailing list, 
 you
  were talking about doing this with perl 6.

What!  Me stay on topic?  HA!

Yeah, I know--what *was* I thinking?

Dan

--"it's like this"---
Dan Sugalski  even samurai
[EMAIL PROTECTED] have teddy bears and even
  teddy bears get drunk




Re: Auto-install (was autoloaded...)

2001-02-08 Thread Branden

I wrote:
 I think zip is the way to go! Is there any
 platform/license or any other restricting issue we should care about with
 zip? Is it ported to all platforms Perl currently runs on? Is there a Perl
 module for handling zips?


Aren't we re-inventing the wheel here? It strikes me now that ActiveState's
ActivePerl comes with PPM, or `Perl Package Manager'. AFAIK, it's only for
downloading from ActiveState's site, and it only handles installing of
individual modules (although it checks dependencies and fetches needed
modules as well, but it doesn't solve the problem for scripts/programs).

Anyone from ActiveState there? Can't we adapt PPM so that it handles what's
needed? Or is it too different from what we want? Does it use zip,
tar/gzip, or another format?

- Branden




Re: Auto-install (was autoloaded...)

2001-02-08 Thread Branden

Peter Scott wrote:
 Eh?  I thought PPM was simply "perl -MCPAN -e install" for Windows users,
 pointed to a set of modules which have XS content that they'd had to
fiddle
 with to port to Win32.


Sorry for the mistake... I've never actually used PPM, only read about it on
the web. I guess their file format is a disguised .tar.gz, right?

- Branden




Re: Auto-install (was autoloaded...)

2001-02-08 Thread Dan Sugalski

At 05:49 PM 2/8/2001 -0200, Branden wrote:
Peter Scott wrote:
  Eh?  I thought PPM was simply "perl -MCPAN -e install" for Windows users,
  pointed to a set of modules which have XS content that they'd had to
fiddle
  with to port to Win32.
 

Sorry for the mistake... I've never actually used PPM, only read about it on
the web. I guess their file format is a disguised .tar.gz, right?

Disguised .zip.

Dan

--"it's like this"---
Dan Sugalski  even samurai
[EMAIL PROTECTED] have teddy bears and even
  teddy bears get drunk




Re:

2001-02-08 Thread Bryan C. Warnock

On Wednesday 31 December 1969 18:59, Branden wrote:

 Yes. Packaging is what's important. I actually expect not to have to
 install the `par' and have perl6's magic filehandles decompress a file
 from the package `on-the-fly'. I think zip is the way to go! Is there any
 platform/license or any other restricting issue we should care about with
 zip? Is it ported to all platforms Perl currently runs on? Is there a Perl
 module for handling zips?

I've always handled this by slapping the .tgz package into the DATA section 
of a perl script that DWIMs.  Sort of a self-executable zip file.
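A sketch of that trick, kept self-contained here by building the payload in memory rather than pasting real .tgz bytes after `__DATA__` (the module choices and file name are my assumptions, not Bryan's exact recipe):

```perl
use strict;
use warnings;
use Archive::Tar;
use MIME::Base64 qw(encode_base64 decode_base64);

# Build a tiny archive and base64-encode it -- this string is what
# you would paste into the script's DATA section to stay 8-bit clean.
my $tar = Archive::Tar->new;
$tar->add_data('hello.pl', qq{print "hello\\n";\n});
my $payload = encode_base64($tar->write);   # write() with no args returns a string

# What the self-extracting script does at run time: decode the
# embedded payload and hand it to Archive::Tar via an in-memory handle.
my $bytes = decode_base64($payload);
open my $fh, '<', \$bytes or die "open: $!";
my $in = Archive::Tar->new($fh);
print join(',', $in->list_files), "\n";
```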


-- 
Bryan C. Warnock
bwarnock@(gtemail.net|capita.com)



Re: Auto-install (was autoloaded...)

2001-02-08 Thread Clayton Scott

Peter Scott wrote:
 
 Eh?  I thought PPM was simply "perl -MCPAN -e install" for Windows users,
 pointed to a set of modules which have XS content that they'd had to fiddle
 with to port to Win32.

Not by far. It is a replacement for CPAN that builds and
maintains its own local database of installed modules.


From PPM docs:
 DESCRIPTION

 PPM is a group of functions intended to simplify the tasks of 
 locating, installing, upgrading and removing software 'packages'. It 
 can determine if the most recent version of a software package is 
 installed on a system, and can install or upgrade that package from 
 a local or remote host. 

 PPM uses files containing a modified form of the Open Software 
 Distribution (OSD) specification for information about software 
 packages. These description files, which are written in Extensible
 Markup Language (XML) code, are referred to as 'PPD' files. 
 Information about OSD can be found at the W3C web site (at the 
 time of this writing, http://www.w3.org/TR/NOTE-OSD.html). The
 modifications to OSD used by PPM are documented in PPM::ppd. 

 PPD files for packages are generated from POD files using the pod2ppd
command. 



Clayton



Re: Closures and default lexical-scope for subs

2001-02-08 Thread John Porter


Branden foobar wrote:
 I expect Perl 6 will have some way to make its variables lexically scoped
 by default in the sub where they are used, either by the language default
 or by a pragma, such as `use scope "subs";', as proposed in RFC 64.

 If that's the case, I wonder how closures will be done, since having
 lexical scope in every sub would mean that variables inside closures would
 automatically be lexical, and thus different from the ones in the sub that
 made the closure.

Well, since the former isn't going to happen, the latter isn't going to be
a problem.


 my suggestion for solving the problem is creating a new keyword of the
 my/our/your/their/his/... family that would explicitly `import' the variable
 from the parent sub.

Ugh - upvar?  No thanks.


 I see a slight conceptual advantage in having a keyword to indicate the
 closure, because the variable is actually stored together with the sub
 reference somehow, and having a keyword would make that explicit.

Why should it be explicit?  What ambiguity needs to be cleared up?
I like the fact that perl handles the grotty details for me.
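For reference, a small Perl 5 example of those grotty details being handled automatically: each closure simply captures the lexical it was compiled with, no upvar-style keyword required.

```perl
use strict;
use warnings;

# Each call to make_counter gets a fresh $count; the returned sub
# closes over that particular copy automatically.
sub make_counter {
    my $count = shift || 0;
    return sub { return $count++ };
}

my $c1 = make_counter(10);
my $c2 = make_counter();
$c1->(); $c1->();        # advance only the first counter
print $c1->(), "\n";     # 12
print $c2->(), "\n";     # 0
```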


-- 
John Porter

You can't keep Perl6 Perl5.




Re: POST blocks (like END, but in a sub or sub-like scope)

2001-02-08 Thread John Porter

David L. Nicol wrote:
 
 Do you agree that they shouldn't get tacked on until execution passes their
 definition, unlike END blocks which get appended when they are parsed?

Yes, absolutely; that is an important point.

END blocks are different because there is only ever one activation record
for the file (ignoring threads); lexicals in the file are known at
compile time, and so are visible to the END block.
Lambdas deserve post blocks too. :-)
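In Perl 5 you can already fake a per-activation post block with a DESTROY-based guard (a sketch of the idiom, not a proposal for Perl 6 syntax):

```perl
use strict;
use warnings;

# A guard object runs its code when the enclosing activation record
# goes away -- whether the sub returns normally or dies.
package Guard;
sub new     { my ($class, $code) = @_; bless { code => $code }, $class }
sub DESTROY { $_[0]{code}->() }

package main;

sub risky {
    my $fail = shift;
    my $post = Guard->new(sub { print "cleanup\n" });
    print "working\n";
    die "oops\n" if $fail;
    print "done\n";
}

risky(0);            # working, done, cleanup
eval { risky(1) };   # working, cleanup -- the guard fires during unwinding
```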

-- 
John Porter

You can't keep Perl6 Perl5.




Re: assign to magic name-of-function variable instead of return

2001-02-08 Thread Jarkko Hietaniemi

On Tue, Feb 06, 2001 at 05:01:03AM +1100, Damian Conway wrote:
 Really?  Are lexicals in the sub visible in the post handler?
 
 No. Only the original arguments and the return value.

 (Of course I realize *F does not illustrate this...)
 
 Exactly. ;-)
 
 Actually, I do agree that Perl 6 ought to provide a universal "destructor"
 mechanism on *any* block. For historical reasons, I suppose it should be
 C<continue>, though I would much prefer a more generic name, such as
 C<cleanup>.

C<mop>? :-)

-- 
$jhi++; # http://www.iki.fi/jhi/
# There is this special biologist word we use for 'stable'.
# It is 'dead'. -- Jack Cohen