On Tue, Apr 04, 2006 at 10:20:34PM -0400, Justin Pryzby wrote:
> On Sun, Apr 02, 2006 at 11:08:27AM +0100, Julian Gilbey wrote:
> > > > BTW, I have an additional way of avoiding the "killing the BTS at
> > > > 02:00 local time": just change the heading to say:
> > > > 
> > > > # Best-practice information for Debian developers
> > > > # intended to be run periodically as a cronjob, for example weekly as:
> > > > # 0 hour * * day /usr/bin/dev-best-practice
> > > > # (replacing hour and day by your preferred times!)
> > > I prefer to give something copy-and-paste-able, and which doesn't
> > > assume that everyone won't use day=0 hour=0.
> > 
> > There are two options here:
> > (1) Trust that developers are competent enough to make the change
> >     themselves, or
> > (2) Not trust them.
> > 
> > And everyone using day=0 hour=0 will be identical to everyone using
> > day=0 hour=2!
> Er, I meant 2..
> 
> But I'm not sufficiently creative to come up with times more random
> than 01:23 :/

Well, if they stick it into their crontab as is, cron will simply
barf :)
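(Though if we really want times "more random than 01:23" without
anyone having to be creative, they could be derived from the username.
A sketch, untested:

  # Sketch: hash $LOGNAME into stable minute/hour/weekday fields, so
  # every user gets a different slot without editing anything.
  sum=$(printf '%s' "$LOGNAME" | cksum | cut -d' ' -f1)
  printf '%d %d * * %d /usr/bin/dev-best-practice\n' \
      $((sum % 60)) $((sum % 24)) $((sum % 7))

Something like that could even generate the suggested crontab line at
install time.)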
> > > # 0 2 * * 0 /usr/share/doc/devscripts/examples/dev-best-practice --delay --mail
> > Need to change this line...
> Do you mean about moving to /usr/bin/?

Yup.

> > > while test $# -gt 0; do
> > Use getopt :)
> Ugh.  afaics all the getopt foo requires looping anyway, so I just did
> it myself.

Correct.  But getopt handles abbreviated option names, and also the
choice of --opt=foo and --opt foo.  Simplifies the loop slightly.
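Something like this, say (a sketch only, using the two options from
your example crontab line, and assuming GNU getopt from util-linux,
which we can rely on in Debian):

  # Sketch: getopt normalises the command line before the loop sees
  # it, so unambiguous abbreviations like --del also work.
  TEMP=$(getopt -o dm -l delay,mail -n dev-best-practice -- "$@") || exit 1
  eval set -- "$TEMP"
  while true; do
      case "$1" in
          -d|--delay) DELAY=1; shift ;;
          -m|--mail)  XMAIL=1; shift ;;
          --) shift; break ;;
          *) break ;;
      esac
  done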
> > > if [ -n "$XMAIL" ]; then
> > >     exec 3>&1;
> > >     exec 1> >(mail -s "Debian fixme list for $LOGNAME on `date +%x`" $LOGNAME)
> > Nice one :)
> There's also
> 
>   exec >& >(tee -a "$0.log");
> 
> to make a self-logging script, or
> 
>   pdebuild >& >(tee sextractor.log)
> 
> Or, a 1-liner to implement annotate-output:
> 
>   ls >& >(while read l; do echo "`date +%X`: $l"; done)

Showoff :)

> > > echo "Checking for your bugs tagged moreinfo:";
> > > lynx -dump "$bts/from:$DEBEMAIL&include=moreinfo" |
> > 
> > I'd prefer not to rely on lynx/links if possible and make do with
> > wget/sed/perl.
> Then you need something like:
> 
>   wget -qO- 'http://bugs.debian.org/from:[EMAIL PROTECTED]&include=moreinfo' |
>       grep '^<li><a href="bugreport\.cgi?bug=[^"]*">' |
>       cut -d'>' -f3 | cut -d'<' -f1
> 
> lynx -dump is not wget -qO- .. this leaves crud like &quot; in the bug
> titles.  It both makes fragile assumptions about the BTS format and
> mangles bug titles.

True; wget -qO- simply gives the HTML as is.  The BTS format *is*
fragile, agreed, but using lynx may or may not get around that.  My
issue with lynx is simply needing yet another dependency for the
script.

Of course, we could have two options:

  if command -v lynx >/dev/null 2>&1; then
      lynx version
  else
      wget version
  fi
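Spelled out with the two commands from above, that might look like
this (a sketch; either branch would feed the same sed/grep
post-processing):

  # Sketch: prefer lynx when it is installed, fall back to wget.
  url="$bts/from:$DEBEMAIL&include=moreinfo"
  if command -v lynx >/dev/null 2>&1; then
      lynx -dump "$url"
  else
      wget -qO- "$url"    # raw HTML; entities like &quot; stay undecoded
  fi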
> > > sed -ne '/^[[:space:]]*\* \[[[:digit:]]\+\]\(#[[:digit:]]\)/s//\1/p';
> > > # grep -E '^[[:space:]]*\* \[[[:digit:]]+\]#[[:digit:]]+' |
> > > #     sed -e 's/^[^#]*//';
> > 
> > Why the commented-out lines?
> It's an alternate implementation, which I haven't yet discarded.

OK.

> > > if [ -n "$XMAIL" ]; then
> > >     exec >&-;
> > >     exec 1>&3;
> > > fi;
> > 
> > Not sure what this bit does; wouldn't exec 1>&3- be adequate?
> You're probably right.

> > Finally, I think it would be nice to have a devscripts option
> > DEVCHECKER_EXCLUDE which would be a colon-separated list of tests to
> > skip.  Then each test would read:
> > 
> >   case ":$EXCLUDE:" in
> >       :thistest:) # skip test
> >           ;;
> >       *) # do test
> >          [...]
> >           ;;
> >   esac
> I don't understand; are $EXCLUDE and $DEVCHECKER_EXCLUDE distinct?  Do
> you mean to split with IFS=':', and loop around it for each test?

Sorry, I jumped a step; DEVCHECKER_EXCLUDE would be the config file
option, which becomes EXCLUDE in the script.  A case statement is
nicer than looping :)

> Why not separate each check as a function, and list that function in
> $tests, and loop:
> 
>   for test in $tests; do
>       grep -q ":$test:" <<<":$DEVCHECKER_EXCLUDE:" && continue;
>       $test;
>   done;

Yup, this is the direction I was heading in; I like this a lot.

   Julian
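P.S.  To make the functions idea concrete, roughly what I now have in
mind (a sketch; the check names here are invented):

  # Sketch: each check is a shell function; DEVCHECKER_EXCLUDE from
  # the config file becomes EXCLUDE, a colon-separated list of checks
  # to skip.
  check_moreinfo() { echo "Checking for your bugs tagged moreinfo:"; }
  check_rc_bugs()  { echo "Checking for RC bugs on your packages:"; }

  tests="check_moreinfo check_rc_bugs"
  for t in $tests; do
      case ":$EXCLUDE:" in
          *":$t:"*) continue ;;    # excluded by the user
      esac
      $t
  done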