Re: RFC: Getopt::Modern
Eric Wilhelm [EMAIL PROTECTED] writes: Ok. Then my previous argument stands. If the --no- means unset any hard-coded or config-file defaults, then it shouldn't be evaluated in command-line order. Good deduction, but the premise does not hold. --no- does not mean unset any [...] defaults, it means: set the option value to 'false'. And b) mixing options and arguments, where --foo arg1 --no-foo arg2 means that arg1 is processed with --foo and arg2 with --no-foo. This is not something I'm trying to address. That's okay. There are several Getopt:: modules that implement a simplified subset of Getopt::Long for various reasons. -- Johan
Re: RFC: Getopt::Modern
Eric Wilhelm [EMAIL PROTECTED] writes: Independent of percentages, why disallow --foo --no-foo provided there's a clear definition of the semantics? I never suggested that it should be disallowed. Only that it should be equivalent to '--no-foo --foo'. That's part of the clearly defined semantics. The problem that I see is legacy. Many users expect the left-to-right behaviour, and will get confused if some tools act differently. (And no, I do not have a good solution for that.) -- Johan
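The semantics Johan describes, negation just sets the value to false, and the last occurrence on the command line wins, can be seen directly in Getopt::Long today. A minimal sketch (the `verbose` option name and its config-style default are made up for illustration):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Getopt::Long;

# A negated option simply sets the value to false; with left-to-right
# evaluation the last occurrence wins, so '--no-verbose --verbose'
# ends up true.
my $verbose = 1;    # default, e.g. loaded from a config file
{
    local @ARGV = qw(--no-verbose --verbose);
    GetOptions('verbose!' => \$verbose) or die "bad options\n";
}
print $verbose ? "verbose\n" : "quiet\n";
```

Reversing the two arguments would leave `$verbose` false, which is exactly the "equivalent to '--no-foo --foo'" question being debated.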
Re: RFC: Getopt::Modern
On Thu, 2005-06-16 at 20:12 -0700, Eric Wilhelm wrote: The purpose of a negated option is (in all of the usages that I have seen) to reset any hard-coded or config-file variable. This is not the purpose, see my other posting. -- Johan
Re: RFC: Getopt::Modern
On Sat, Jun 18, 2005 at 10:44:21AM +0200, Johan Vromans wrote: The problem that I see is legacy. Many users expect the left-to-right behaviour, and will get confused if some tools act differently. I don't think that's merely legacy; the majority of those on this list who have expressed a preference have said they prefer it that way. Mx.
Re: RFC: Getopt::Modern
Eric Wilhelm [EMAIL PROTECTED] writes: What I'm trying to do with Getopt::Modern here is to establish some conventions which allow this to happen internally. This saves the author some code and gives the user a guaranteed consistent experience with multiple programs. The debate on the usefulness of '--no-' appears to say it's useful. The debate on its behavior says that there are historical (and convenience) reasons to keep its evaluation in command-line order. What I'll probably end up with is something like '--un-' performing the above task of initializing internal values. I'd strongly suggest that if you really break the long-standing conventions, even for perfectly valid reasons, please select a distinctive form so users know what to expect. In an earlier message I referred to the former GNU convention of starting the (then new style of) command line options with a + instead of a minus. -- Johan

Re: Getopt::Long wishes
A. Pagaltzis [EMAIL PROTECTED] writes: * Johan Vromans [EMAIL PROTECTED] [2005-06-17 17:20]: I can make this information available, if users would be interested. Access to structured data is always nicer than implementing and re-implementing a parser for its serialized form. Would it be okay to have a generic 'deal with option' method that (upon config request; sorry Eric, yet another feature...) gets called for every option with a hash of relevant information? E.g., name of the option, type, real name (when called via an alias), the desired value, and a reference to where to store the value. What other information would you like to have access to? -- Johan
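A taste of this already exists: Getopt::Long accepts a code reference as an option destination and passes it the canonical option name and the value. The richer hook proposed above (type, alias, storage reference) would presumably extend this. A sketch using only the current, documented callback interface (option names here are made up):

```perl
use strict;
use warnings;
use Getopt::Long;

# Each callback receives the canonical option name and the parsed
# value; note that '-s' resolves to 'size' and '-v' to 'verbose'.
my %seen;
{
    local @ARGV = qw(--size 42 -v);
    GetOptions(
        'size|s=i'  => sub { my ($name, $value) = @_; $seen{"$name"} = $value },
        'verbose|v' => sub { my ($name, $value) = @_; $seen{"$name"} = $value },
    ) or die "bad options\n";
}
print "$_ => $seen{$_}\n" for sort keys %seen;
```

The explicit `"$name"` stringification is there because recent Getopt::Long versions pass an object with an overloaded string conversion rather than a plain string.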
Re: Getopt::Long wishes
A. Pagaltzis [EMAIL PROTECTED] writes: Since we're at this: the one thing I still fall back to Getopt::Std for is small scripts. I love Getopt::Long, but it incurs a pretty high startup cost. Is there any chance you can play some deferred compilation cards to make it go faster? This is one of the main reasons that Getopt::Long version 3 development stalled -- the startup overhead would double from version 2. But in the meantime computers have become 5 to 10 times as powerful, so would it still be a real problem? -- Johan
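The startup cost under discussion is easy to measure yourself. A rough sketch: time a fresh perl that only loads each module, and compare against a bare `perl -e1` baseline (the numbers include interpreter startup, so only the differences are meaningful):

```perl
use strict;
use warnings;
use Time::HiRes qw(gettimeofday tv_interval);

# Time 'perl -e1' alone, then with Getopt::Std and Getopt::Long
# loaded, to isolate each module's compile-time cost.
for my $args ( ['-e', '1'],
               ['-MGetopt::Std',  '-e', '1'],
               ['-MGetopt::Long', '-e', '1'] ) {
    my $t0 = [gettimeofday];
    system($^X, @$args) == 0 or die "perl @$args failed\n";
    printf "%-22s %6.1f ms\n", "@$args", 1000 * tv_interval($t0);
}
```

A single run is noisy; averaging a few dozen iterations gives steadier figures.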
Re: RFC: Getopt::Modern
Eric Wilhelm [EMAIL PROTECTED] writes: Because I had originally built that as a wrapper around Getopt::Long, I had a laundry-list of what didn't work. This would have been interesting for me to know. In fact, as I mentioned I would be happy for G::L to have this functionality, but I doubt that program-order evaluation (one of the main design goals) is going to fit without some serious restructuring. It might require serious restructuring, but I'm not afraid of that. I usually rewrite/refactor most of my programs when I get bored :-). Do you have a project page for your recent work? Though not recent, http://www.squirrel.nl/people/jvromans/sw_getopt3.html reveals some of the ideas behind G::L version 3. -- Johan
Re: Failing Reports due to 3rd Party Software...
On Jun 17, 2005, at 7:49 PM, Rob Janes wrote: so basically the executive summary is that cpanplus does not adequately report system dependency failures, like a missing c compiler or a missing library. As outlined in my other email, that is because it cannot (it has no way of knowing that a missing system dependency was the reason for a failure), nor should it (as an installer should be able to interpret installation errors). The idea is that perl xs modules have prereqs that are external to perl, be they compilers, utilities, file naming conventions (to backslash or not to backslash), libraries, etc. Failure in this regard is essentially a prereq failure. ExtUtils::MakeMaker and M::B would be changed to understand the concept of an external prereq. This external prereq would be tested by a piece of custom code which the module author would have to write, as I did with Compress::Bzip2 to test for the bzlib library. Likely E::M and M::B would have a list of names of external prereqs that would be understood, and reported on by the custom piece of code. The yaml file generated by E::M and M::B would list these external prereqs, by name and description. CPANPLUS understands perl prereqs. External prereqs would be more difficult, but if we had a naming convention some basic external prereqs could be pre-prepared since they are everywhere, for example system_cc=a c compiler, system_libbz2=bzlib devel library. Maybe something like user_xxx=my custom ext. prereq for external prereqs requiring helper subroutines. What you're describing here is what's covered by the 'Alien' idea: to have a perl module specifically responsible for satisfying a non-perl prereq (and trust me, a lot of thought has gone into this). The hard part is that this would require a package manager as rigid and advanced as dpkg to pull off, and perl has none.
And that leaves aside the desirability of wanting to entrust a package manager on users, rather than interacting with their package manager of choice for the OS they run... but i digress. The Alien manifesto: http://search.cpan.org/~abergman/Alien-0.91/lib/Alien.pm An implementation of Alien for Compress::Zlib: http://search.cpan.org/~kane/Alien-Zlib-0.00_01/lib/Alien/Zlib.pm with perl xs, i just made CPANPLUS's check for prereqs more complicated. E::M and M::B would have to assist in this. i think currently cpanplus uses the yaml to check the prereqs and bails before it even runs the .PL file, correct me if i'm wrong. You're wrong :) CPANPLUS runs 'perl Makefile.PL' and parses the makefile for dependencies. In case of Build.PL it asks the Module::Build API what the dependencies are. ok then, both scenarios. cpanplus runs the .PL file and picks up prereq failures from the logfile. this one is easy, the .PL file just has to report on the external prereq failures. For Makefile.PL this is correct. For Module::Build, again, we ask the API. cpanplus reads the yaml and checks for prereqs, and schedules perl prereqs for loading. CPANPLUS never reads the yaml file. For cpanplus to check for external prereqs, with a naming convention there would be some prereqs it could check for itself; more generally it will have to call the .PL file with some special options to make the .PL check the prereqs. We'd need a standard for these options, along with support from EU::MM and M::B, as well as a way of unambiguously probing the result, as well as interpreting the result. when invoked with those options, the .PL should not configure the module. CPANPLUS would only call the .PL file with those options (check external prereq) if the yaml indicated there were external prereqs (backward compatibility).
What you leave out is 'what do we do with these external prereqs that failed to load' -- you've described a complicated system to figure out 'it doesn't work', in this case only to please a few test reports. But it does not address the issue of how to actually *make it work*, which seems much more interesting. external prereq failure of a critical component should cause cpanplus to bail out of the build. unlike with perl prereqs, there is no recovery action available to cpanplus. There is no need for recovery if we can specify the prereq like we do with regular perl prereqs. Alien could be a solution for this. or, the .PL could supply a helper function for recovery action. for example, in Compress::Bzip2 if there is no bzlib installed, the recovery action is to activate the static build of the internal bzlib tagalong. you could go crazy with this one. like if a c compiler is missing you could fire up some p2p action and download a c compiler and install it. This does breach the 'do one thing and do it well' motto quite heavily. A more generic solution is to be preferred, if we're going through the trouble of altering EU::MM, M::B, CPANPLUS and perhaps CPAN.pm --
Re: Getopt::Long wishes
* Johan Vromans [EMAIL PROTECTED] [2005-06-18 13:05]: But in the meantime computers have become 5 to 10 times as powerful, so would it still be a real problem? Yes! The most problematic scripts I have are a set that's launched by GUI events, sometimes in rapid succession. Every microsecond counts even on a 2GHz Athlon with plenty of spare memory and very fast disks. I did my own share of sticking things in eval to avoid as much cost as possible, but in the end I had to throw out Getopt::Long in favour of ::Std to get acceptably consistent startup times. On slower machines they're still laggy. I've pondered copy-pasting the switch processing code. Optimizing for minimal one-shot cost isn't fun, it's painful. Even with less critical scripts, like my overhaul of the rename script (where there's no way I'm dropping to a less capable switch parser), the fixed overhead is noticeable when the script's task is not computation intensive. Regards, -- #Aristotle *AUTOLOAD=*_=sub{s/(.*)::(.*)/print$2,(,$\/, )[defined wantarray]/e;$1}; Just-another-Perl-hacker;
Re: RFC: Getopt::Modern
To Eric, Some might-be-helpful thoughts for you: 1. Bundling Problem with Getopt::Long: Yes, I was very troubled by that problem before. I even had the same thought as you, to parse the arguments by myself. But after a couple of tries I gave up. My code got all messed up. Then I found my life could be easier with Getopt::Long. Now I have neat code. Bundling problem? I never document the use of bundled options. It's OK if they hack and try that. But I'm not their mom. For me, trying to control every possibility is nonsense. They take their own responsibility for its behavior. They can hack, and they can have fun. You may not agree with me. That's fine. The bundling problem is rather complicated. It's a challenge to anyone. I won't get jealous if you solve it and I couldn't. ^_^ 2. The --no-fish or --un-fish Issue: Actually, if I were you, I would use --no-default-fish, or --no-def-fish. With

    go_shop --fish tuna --fish halibut --no-default-fish
    go_shop --no-default-fish --fish tuna --fish halibut

it makes perfect sense that these commands should get the same result. It reads naturally. 3. The 2 Lines Example: Actually, I can reduce it to one. The use of the $opt_nofish variable is nonsense:

    Getopt::Long::GetOptions(
        'fish=s'          => \@opt_fishes,
        'no-default-fish' => sub { @conf_fishes = qw() },
    );
    @fishes = (@conf_fishes, @opt_fishes);

As I said, you can work in a more flexible way on this issue. You might originally be tied to the impression of:

    Getopt::Long::GetOptions(
        'verbose!' => \$verbose,
    );

which may only be one convenient usage of Getopt::Long. -- Best regards, imacat ^_*' [EMAIL PROTECTED] PGP Key: http://www.imacat.idv.tw/me/pgpkey.txt Woman's Voice News: http://www.wov.idv.tw/ Tavern IMACAT's: http://www.imacat.idv.tw/ TLUG List Manager: http://www.linux.org.tw/mailman/listinfo/tlug
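imacat's --no-default-fish pattern can be turned into a small runnable script. The `shop()` wrapper and the "tuna" config default below are made up for the sketch, and it assumes a reasonably recent Getopt::Long (for `GetOptionsFromArray`); the point is that --no-default-fish only clears the config-file defaults, never the command-line values, so its position genuinely does not matter:

```perl
use strict;
use warnings;
use Getopt::Long qw(GetOptionsFromArray);

# --no-default-fish empties the config defaults; --fish accumulates
# command-line values. The two never interfere, so both orderings
# of the options give the same result.
sub shop {
    my @argv        = @_;
    my @conf_fishes = qw(tuna);    # pretend config-file defaults
    my @opt_fishes;
    GetOptionsFromArray(
        \@argv,
        'fish=s'          => \@opt_fishes,
        'no-default-fish' => sub { @conf_fishes = () },
    ) or die "bad options\n";
    return join ',', @conf_fishes, @opt_fishes;
}

print shop(qw(--fish salmon)), "\n";                     # tuna,salmon
print shop(qw(--fish salmon --no-default-fish)), "\n";   # salmon
print shop(qw(--no-default-fish --fish salmon)), "\n";   # salmon
```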
Re: RFC: Getopt::Modern
# The following was supposedly scribed by # Johan Vromans # on Saturday 18 June 2005 03:37 am: What I'll probably end up with is something like '--un-' performing the above task of initializing internal values. I'd strongly suggest that if you really break the long-standing conventions, even for perfectly valid reasons, please select a distinctive form so users know what to expect. Is '--un-' not distinctive enough? I've got this much of the design written up already: http://scratchcomputing.com/developers/Getopt-Crazy/ As I said, since '--no-' is used so much and is historically expected to behave in a certain way, I won't be changing that at all. Basically, I had just never wanted to use it that way and was therefore looking for a new kind of behavior which plays nicely with config-files. --Eric -- Politics is not a bad profession. If you succeed there are many rewards, if you disgrace yourself you can always write a book. --Ronald Reagan - http://scratchcomputing.com -
Re: Getopt::Long wishes
# The following was supposedly scribed by # Johan Vromans # on Saturday 18 June 2005 03:55 am: Would it be okay to have a generic 'deal with option' method... E.g., name of the option, type, real name (when called via an alias), the desired value, and a reference to where to store the value. Most of this comes out of Getopt::Crazy::parse_spec(). It does still need some work. But I wonder if this sort of thing (a method that the author can call for each option) might be a better approach than a hook (if I'm understanding what you propose correctly.) --Eric -- I've often gotten the feeling that the only people who have learned from computer assisted instruction are the authors. --Ben Schneiderman - http://scratchcomputing.com -
Re: RFC: Getopt::Modern
imacat [EMAIL PROTECTED] writes: [...] But, then, is this whole thread that meaningless? I don't think so. Many good ideas and suggestions have come by, and though not all ideas are equally viable, many people spend energy on communicating ideas -- which is fundamental to open source software. -- Johan
Re: RFC: Getopt::Modern
Eric Wilhelm [EMAIL PROTECTED] writes: Maybe we'll even manage to hammer it into a standard. That would really be nice! -- Johan
Re: RFC: Getopt::Modern
Orton, Yves [EMAIL PROTECTED] writes: I currently have two projects that address this issue: Getopt::Toolkit (which is based on Getopt::Long) and Getopt::Long version 3 (which is a complete redesign, a.k.a. Getopt::Long on steroids). Merging the two projects into a single new Getopt::Long version is somewhere on my TODO list. HOWEVER, since I highly appreciate my happy users, whatever comes out of the merge will be drop-in compatible with the current Getopt::Long. If this implies that you will not use it because it is too flexible, that's fine with me. One unhappy user against a zillion happy users. OOOH. Maybe I shouldn't upload Getopt::Long::INI after all... Don't hold your breath (see some other messages of mine). Besides, I'd like to reserve the Getopt::Long::* namespace for Getopt::Long internal modules... -- Johan
Re: Failing Reports due to 3rd Party Software...
On Sat, 18 Jun 2005, Ken Williams wrote: On Jun 18, 2005, at 6:58 AM, imacat wrote: But, what if we make clues there? Please correct me if I'm wrong, but from my experience working with GNU autoconf and automake, something like:

    test.c:  int main() { }
    $(CC) -lsomelib test.c

can be used to check the availability of libsomelib.(a|so|dll). Something like:

    test.c:  #include "someheader.h"
             int main() { }
    $(CC) test.c

can be used to check the availability of someheader.h. And there is already an existing method to check the availability of executables. Yeah, that might work well enough (but I don't have much experience with Configure/autoconf myself). Think we should add this to ExtUtils::CBuilder? That's the module that Module::Build uses to do all its C compiling and linking. It already has a have_compiler() method; we could add have_library($foo) and have_header($foo) methods too. There's also a have_library() method within, e.g., http://search.cpan.org/src/PHISH/XML-LibXML-Common-0.13/Makefile.PL which might be considered for ExtUtils::CBuilder. Although more complicated than the above, this tests if some xs glue can be built with $foo, and so can catch some perl-specific problems. It also has some necessary tweaks to work on Win32. -- best regards, randy
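A sketch of what such a probe might look like on top of the existing ExtUtils::CBuilder API. Only `have_compiler()` and `compile()` are real methods here; a `have_header()` wrapper like this is exactly the proposal above, not something the module ships:

```perl
use strict;
use warnings;
use ExtUtils::CBuilder;
use File::Temp qw(tempdir);
use File::Spec;

# Probe for a header by trying to compile a one-line test program,
# the same trick autoconf uses. compile() dies on failure, so we
# wrap it in eval and report the outcome.
my $cb = ExtUtils::CBuilder->new(quiet => 1);
if ($cb->have_compiler) {
    my $dir = tempdir(CLEANUP => 1);
    my $src = File::Spec->catfile($dir, 'probe.c');
    open my $fh, '>', $src or die "open: $!";
    print $fh qq{#include "stdio.h"\nint main(void) { return 0; }\n};
    close $fh;
    my $ok = eval { $cb->compile(source => $src); 1 };
    print $ok ? "header ok\n" : "header missing\n";
}
else {
    print "no C compiler\n";
}
```

A have_library() version would additionally link against `-lfoo`, which is what the XML-LibXML-Common Makefile.PL mentioned above does.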
Re: Failing Reports due to 3rd Party Software...
On Sat, Jun 18, 2005 at 12:06:49PM +0200, Jos I. Boumans wrote: If they can be published by Module::Build in an unambiguous way, that might be an option. Parsing the output of an error is not, as far as i'm concerned -- it would be the responsibility of the installer to tell us why it was unable to install. We're in agreement here. That means for 90% of the cases (as that's the group using EU::MM), whatever we decide here is not going to help much. Much the same way that any additional benefits added to CPANPLUS will not reach CPAN.pm users, this doesn't stop us from progressing anyway. The ratio of folks using the old system will shrink over time provided we're doing our job right. If we don't play under the assumption that Module::Build will be broadly adopted in the future then we might as well just give up right now. -- Michael G Schwern [EMAIL PROTECTED] http://www.pobox.com/~schwern ROCKS FALL! EVERYONE DIES! http://www.somethingpositive.net/sp05032002.shtml
Re: Getopt::Long wishes
A. Pagaltzis [EMAIL PROTECTED] writes: The most problematic scripts I have are a set that's launched by GUI events, sometimes in rapid succession. For purposes like this, I'd write dedicated scripts (e.g., no option parsing at all), or something simple like a first argument that can be checked quickly. Even with less critical scripts, like my overhaul of the rename script (where there's no way I'm dropping to a less capable switch parser), the fixed overhead is noticeable when the script's task is not computation intensive. The only option I see is to split Getopt::Long into a couple of smaller sub-modules, each optimized for a specific type of configuration (with / without bundling, variable references / options hash, and so on). -- Johan
Re: Getopt::Long wishes
* Johan Vromans [EMAIL PROTECTED] [2005-06-18 23:35]: The only option I see is to split Getopt::Long into a couple of smaller sub-modules, each optimized for a specific type of configuration (with / without bundling, variable references / options hash, and so on). I don't think it requires sub-modules (and extra files cause extra slowdown, too), just judicious use of eval STRING to defer compilation of as much code as possible until it's actually needed. If I were to look at the code to make specific suggestions (or even write a patch, I dunno), would they be obsolete again soon with the new design, or is the effort worth it? Regards, -- #Aristotle *AUTOLOAD=*_=sub{s/(.*)::(.*)/print$2,(,$\/, )[defined wantarray]/e;$1}; Just-another-Perl-hacker;
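The eval STRING deferral Aristotle suggests looks roughly like this. All names below are made up for the sketch; the idea is to keep a rarely used code path in a string and compile it only on first use, so scripts that never hit it never pay the compile cost:

```perl
use strict;
use warnings;

# The bundling code path is kept as source text and compiled lazily.
# First call pays the eval cost once; later calls reuse the coderef.
my $bundler;
sub split_bundle {
    $bundler ||= eval q{ sub { [ split //, $_[0] ] } } or die $@;
    return $bundler->(@_);
}

print "@{ split_bundle('vax') }\n";   # v a x
```

The trade-off is that syntax errors in the deferred string only surface at runtime, which is one reason to keep such blocks small and well-tested.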
Re: RFC: Getopt::Modern
* Johan Vromans [EMAIL PROTECTED] [2005-06-18 23:20]: Eric Wilhelm [EMAIL PROTECTED] writes: Is '--un-' not distinctive enough? I believe there's more involved than adding the --un- semantics. For example, precedence-order parsing yields different results from left-to-right parsing. I agree that --un- alone is not distinctive; you want something that tells the user this *looks* *so* different that it probably *behaves* differently too, and simply using an un prefix on the option is not enough for that. OTOH, with a switch like --no-default-fish the semantics should be obvious from the description itself. In this respect the un prefix is really horribly chosen. I don't know about anyone else here, but if I saw that I could tell this hypothetical shopping application that I want --un-fish, I'd have no idea what that's supposed to mean, whereas the meaning of --fish blah or --no-fish (in the traditional sense of the no prefix!) is immediately obvious. Something like ++ instead of --? I think that's ugly. I'd suggest simply adding another dash to signify the altered precedence, as in ---. It is also slightly evocative in a linguistic sense as the ASCII rendering of an em-dash, so you could think of it as a break in the sentence you're writing on the command line, something that can be added as an afterthought, like the subclause you're reading right now. For this hypothetical shopping application I'd probably be most inclined to have a --no-default switch which takes a type or a list thereof as its argument, so the user could say --no-default fish,meat,fruit. With the triple dash, I'd call it something else (though I don't quite know what), but it'd work the same way. Regards, -- Aristotle Like punning, programming is a play on words. Alan J. Perlis, Epigrams in Programming
Re: --un-fish
# The following was supposedly scribed by # A. Pagaltzis # on Saturday 18 June 2005 04:12 pm: OTOH, with a switch like --no-default-fish the semantics should be obvious from the description itself. In this respect the un prefix is really horribly chosen. With 'un' being short for undef() (or undefine), I thought it was pretty appropriate. --no-default is a lot to type. Maybe if one is an alias to the other? --Eric -- Everything goes wrong all at once. --Quantized Revision of Murphy's Law - http://scratchcomputing.com -