Hi!
Attached is a diff for rules_du_jour. I prefer to update my mail rules manually and push them out to my servers, rather than run cron on each one. Since rules_du_jour is only one component of my mail setup, I typically execute a script that updates it along with other things. Unfortunately, that means I trigger the DEBUG flag and get lots of output I don't want (see line 387), and it's not easy to disable in the present version.

The attached diff changes the way DEBUG is used. Unless it is explicitly defined as yes, true or y, it becomes defined as "false":

+[ "$DEBUG" = "yes" -o "$DEBUG" = "y" ] && DEBUG="true";
+[ "$DEBUG" = "true" ] || DEBUG="false"

Every test for DEBUG then needed adjustment:

- [ "${DEBUG}" ]
+ [ "${DEBUG}" = "true" ]

The end result was a little too quiet, so I added SOME_DEBUG, which is triggered when the script runs on a terminal and prints the URL being downloaded:

+    if [ "${SOME_DEBUG}" = "true" ] ; then
+        echo "Loading ${CF_URL}";
+    fi

Guess what: now it's easy to see which URL is slowing up the update process! But how to fix that? It seems to me that, in addition to an /etc/rulesdujour/config, an /etc/rulesdujour/registry is required. So I've removed the registry from the script, added code to source it from /etc/rulesdujour/registry, and attached that file here as well. I've also allowed RDJ_URL to be specified in the config (reasons below).

The two big benefits of separating the registry from the script are 1) easy local registry modifications and 2) easy maintenance of mirrors. If I want to publish my own mirror and registry information, my clients and I can continue to use the RDJ self-update function, per the RDJ_URL specified in the config, while I load the rules per the local registry file. In other words, I can update RDJ, the published registry and the rules on my mirror once per 24 hours, and add to that my own rules and constructed registry(ies), which my busy-body self updates every hour. My clients can then choose their registry of choice, per their config, and update their entire setup from my mirror every hour.

The RDJ script has a lot of code apparently designed for portability. I hope this diff is in line with its original intent and is acceptable (so I can continue to use the current version!). The only downside I see is that if people update RDJ but don't produce a registry file, their rules won't be upgraded when RDJ is run by cron. However, if the current "registry" is published, and RDJ is further modified to fetch and use that file when none is defined in the config, the upgrade should be seamless for everyone. I'm happy to add that code to my patch if the author(s) would rather not do it themselves.

Thanks!

// George

--
George Georgalis, systems architect, administrator   Linux BSD IXOYE
http://galis.org/george/ cell:646-331-2027 mailto:[EMAIL PROTECTED]
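For illustration, a minimal /etc/rulesdujour/config under this patch might look something like the sketch below. The mirror URL and the ruleset list are placeholders; SA_DIR and MAIL_ADDRESS are variables the stock script already reads, and TRUSTED_RULESETS is, if memory serves, the usual variable naming which rulesets to maintain:

    # /etc/rulesdujour/config -- example only; URLs and ruleset list are placeholders
    TRUSTED_RULESETS="TRIPWIRE SARE_ADULT SARE_SPECIFIC";  # rulesets to maintain
    SA_DIR="/etc/mail/spamassassin";                        # where the .cf files live
    MAIL_ADDRESS="postmaster@localhost";                    # where change notices go

    # New with this patch: point the self-update at a local mirror and
    # name the registry file explicitly.
    RDJ_URL="http://mirror.example.net/rdj/rules_du_jour";
    RDJ_REGISTRYFILE="/etc/rulesdujour/registry";

    # DEBUG now stays off unless explicitly set to yes, true or y.
    # DEBUG="yes";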
#########################################
####   Begin Rules File Registry     ####
#########################################

# If you add more RuleSets to your own registry, please contribute the settings to the www.exit0.us wiki
# http://www.exit0.us/index.php/RulesDuJourRuleSets

#### Here are settings for Tripwire. ####
TRIPWIRE=0;    # Index of Tripwire data into the arrays is 0
    CF_URLS[0]="http://www.rulesemporium.com/rules/99_FVGT_Tripwire.cf";
    CF_FILES[0]="tripwire.cf";
    CF_NAMES[0]="TripWire";
PARSE_NEW_VER_SCRIPTS[0]="${PERL} -ne 'print if /^\s*#.*(vers?|version|rev|revision)[:\.\s]*[0-9]/i;' | sort | ${TAIL}";
# CF_MUNGE_SCRIPTS[0]="nothing necessary for this ruleset.";

#### Here are settings for Big Evil. ####
BIGEVIL=1;    # Index of Big Evil is 1
    CF_URLS[1]="http://www.rulesemporium.com/rules/bigevil.cf";
    CF_FILES[1]="bigevil.cf";
    CF_NAMES[1]="Big Evil";
PARSE_NEW_VER_SCRIPTS[1]="${HEAD}";
# CF_MUNGE_SCRIPTS[1]="nothing necessary for this ruleset.";

### INDEX NUMBERS 2-6 ARE RESERVED. DO NOT USE.
# NOTE: As of 2004-02-26, backhair, weedsk, and chickenpox updates are no longer available.
# BACKHAIR=3;    # Index of Backhair is 3
# WEEDS1=4;    # Index of Weeds Set 1 is 4
# WEEDS2=5;    # Index of Weeds Set 2 is 5
# CHICKENPOX=6;    # Index of ChickenPox is 6

#### Here are settings for AntiDrug. ####
ANTIDRUG=7;    # Index of antidrug is 7
    CF_URLS[7]="http://mywebpages.comcast.net/mkettler/sa/antidrug.cf"
    CF_FILES[7]="antidrug.cf";
    CF_NAMES[7]="Matt Kettler's AntiDrug";
PARSE_NEW_VER_SCRIPTS[7]="${PERL} -ne 'print if /^\s*#.*(vers?|version|rev|revision)[:\.\s]*[0-9]/i;' | sort | ${TAIL}";
# CF_MUNGE_SCRIPTS[7]="nothing for this ruleset.";

#### Here are settings for evilnumber ####
EVILNUMBERS=8;    # Index of evilnumbers data into the arrays is 8
    CF_URLS[8]="http://www.rulesemporium.com/rules/evilnumbers.cf";
    CF_FILES[8]="evilnumbers.cf";
    CF_NAMES[8]="EvilNumber";
PARSE_NEW_VER_SCRIPTS[8]="${PERL} -ne 'print if /^\s*#.*(vers?|version|rev|revision)[:\.\s]*[0-9]/i;' | sort | ${TAIL}";
# CF_MUNGE_SCRIPTS[8]="nothing for this ruleset.";

#### Here are settings for sa-blacklist ####
BLACKLIST=9;    # Index of sa-blacklist data into the arrays is 9
    CF_URLS[9]="http://www.stearns.org/sa-blacklist/sa-blacklist.current";
    CF_FILES[9]="blacklist.cf";
    CF_NAMES[9]="William Stearn's sa-blacklist";
PARSE_NEW_VER_SCRIPTS[9]="${GREP} -i '^#.*sa-blacklist: 200' | sort | ${TAIL}";
# CF_MUNGE_SCRIPTS[9]="nothing for this ruleset.";

#### Here are settings for sa-blacklist-uri ####
BLACKLIST_URI=10;    # Index of sa-blacklist-uri data into the arrays is 10
    CF_URLS[10]="http://www.stearns.org/sa-blacklist/sa-blacklist.current.uri.cf";
    CF_FILES[10]="blacklist-uri.cf";
    CF_NAMES[10]="William Stearn's URI blacklist";
PARSE_NEW_VER_SCRIPTS[10]="${GREP} -i '^#.*sa-blacklist.uri: 200' | sort | ${TAIL}";
# CF_MUNGE_SCRIPTS[10]="nothing for this ruleset.";

#### Here are settings for sa-blacklist-random ####
RANDOMVAL=11;    # Index of sa-blacklist-random data into the arrays is 11
    CF_URLS[11]="http://www.stearns.org/sa-blacklist/random.current.cf";
    CF_FILES[11]="random.cf";
    CF_NAMES[11]="William Stearn's RANDOM WORD Ruleset";
PARSE_NEW_VER_SCRIPTS[11]="${GREP} -i '^#release' | ${TAIL}";
# CF_MUNGE_SCRIPTS[11]="nothing for this ruleset.";

#### Here are settings for Tim Jackson's (et al) bogus virus warnings ####
BOGUSVIRUS=12;    # Index of bogus-virus-warnings data into the arrays is 12
    CF_URLS[12]="http://www.timj.co.uk/linux/bogus-virus-warnings.cf";
    CF_FILES[12]="bogus-virus-warnings.cf";
    CF_NAMES[12]="Tim Jackson's (et al) bogus virus warnings";
PARSE_NEW_VER_SCRIPTS[12]="${PERL} -ne 'print if /^\s*#.*(vers?|version|rev|revision)[:\.\s]*[0-9]/i;' | sort | ${TAIL}";
# CF_MUNGE_SCRIPTS[12]="nothing for this ruleset.";

### INDEX NUMBER 13 IS RESERVED. DO NOT USE.
# NOTE: As of 2004-06-XX, MrWiggly has been deprecated. Use 70_sare_specific.cf (SARE_SPECIFIC) instead
# MRWIGGLY=13;    # Index of MrWiggly.cf data into the arrays is 13

#### Here are settings for sare_adult ####
SARE_ADULT=14;    # Index of sare_adult.cf data into the arrays is 14
    CF_URLS[14]="http://www.rulesemporium.com/rules/70_sare_adult.cf"
    CF_FILES[14]="70_sare_adult.cf";
    CF_NAMES[14]="SARE Adult Content Ruleset";
PARSE_NEW_VER_SCRIPTS[14]="${PERL} -ne 'print if /^\s*#.*(vers?|version|rev|revision)[:\.\s]*[0-9]/i;' | sort | ${TAIL}";
# CF_MUNGE_SCRIPTS[14]="nothing for this ruleset.";

#### Here are settings for sare_fraud_post25x ####
SARE_FRAUD=15;    # Index of sare_fraud_post25x data into the arrays is 15
    CF_URLS[15]="http://www.rulesemporium.com/rules/99_sare_fraud_post25x.cf"
    CF_FILES[15]="99_sare_fraud_post25x.cf";
    CF_NAMES[15]="SARE Fraud Detection Ruleset (for SA ver. 2.5x and greater)";
PARSE_NEW_VER_SCRIPTS[15]="${PERL} -ne 'print if /^\s*#.*(vers?|version|rev|revision)[:\.\s]*[0-9]/i;' | sort | ${TAIL}";
# CF_MUNGE_SCRIPTS[15]="nothing for this ruleset.";

#### Here are settings for sare_fraud_pre25x ####
SARE_FRAUD_PRE25X=16;    # Index of sare_fraud_pre25x data into the arrays is 16
    CF_URLS[16]="http://www.rulesemporium.com/rules/99_sare_fraud_pre25x.cf"
    CF_FILES[16]="99_sare_fraud_pre25x.cf";
    CF_NAMES[16]="SARE Fraud Detection Ruleset (for SA prior to ver. 2.5x)";
PARSE_NEW_VER_SCRIPTS[16]="${PERL} -ne 'print if /^\s*#.*(vers?|version|rev|revision)[:\.\s]*[0-9]/i;' | sort | ${TAIL}";
# CF_MUNGE_SCRIPTS[16]="nothing for this ruleset.";

#### Here are settings for sare_biz_market_learn_post25x ####
SARE_BML=17;    # Index of sare_biz_market_learn_post25x data into the arrays is 17
    CF_URLS[17]="http://www.rulesemporium.com/rules/72_sare_bml_post25x.cf"
    CF_FILES[17]="72_sare_bml_post25x.cf";
    CF_NAMES[17]="SARE BIZ/Marketing/Learning Ruleset (for SA ver. 2.5x and greater)";
PARSE_NEW_VER_SCRIPTS[17]="${PERL} -ne 'print if /^\s*#.*(vers?|version|rev|revision)[:\.\s]*[0-9]/i;' | sort | ${TAIL}";
# CF_MUNGE_SCRIPTS[17]="nothing for this ruleset.";

#### Here are settings for sare_biz_market_learn_pre25x ####
SARE_BML_PRE25X=18;    # Index of sare_biz_market_learn_pre25x data into the arrays is 18
    CF_URLS[18]="http://www.rulesemporium.com/rules/71_sare_bml_pre25x.cf"
    CF_FILES[18]="71_sare_bml_pre25x.cf";
    CF_NAMES[18]="SARE BIZ/Marketing/Learning Ruleset (for SA prior to ver. 2.5x)";
PARSE_NEW_VER_SCRIPTS[18]="${PERL} -ne 'print if /^\s*#.*(vers?|version|rev|revision)[:\.\s]*[0-9]/i;' | sort | ${TAIL}";
# CF_MUNGE_SCRIPTS[18]="nothing for this ruleset.";

#### Here are settings for ratware ####
SARE_RATWARE=19;    # Index of ratware data into the arrays is 19
    CF_URLS[19]="http://www.rulesemporium.com/rules/70_sare_ratware.cf"
    CF_FILES[19]="70_sare_ratware.cf";
    OLD_CF_FILES[19]="ratware.cf";
    CF_NAMES[19]="SARE Ratware Detection Ruleset";
PARSE_NEW_VER_SCRIPTS[19]="${PERL} -ne 'print if /^\s*#.*(vers?|version|rev|revision)[:\.\s]*[0-9]/i;' | sort | ${TAIL}";
# CF_MUNGE_SCRIPTS[19]="nothing for this ruleset.";

#### Here are settings for sare_spoof ####
SARE_SPOOF=20;    # Index of sare_spoof data into the arrays is 20
    CF_URLS[20]="http://www.rulesemporium.com/rules/70_sare_spoof.cf"
    CF_FILES[20]="70_sare_spoof.cf";
    CF_NAMES[20]="SARE Spoof Ruleset for SpamAssassin";
PARSE_NEW_VER_SCRIPTS[20]="${PERL} -ne 'print if /^\s*#.*(vers?|version|rev|revision)[:\.\s]*[0-9]/i;' | sort | ${TAIL}";
# CF_MUNGE_SCRIPTS[20]="nothing for this ruleset.";

#### Here are settings for sare_bayes_poison_nxm ####
SARE_BAYES_POISON_NXM=21;    # Index of sare_bayes_poison_nxm data into the arrays is 21
    CF_URLS[21]="http://www.rulesemporium.com/rules/70_sare_bayes_poison_nxm.cf"
    CF_FILES[21]="70_sare_bayes_poison_nxm.cf";
    CF_NAMES[21]="SARE 70_sare_bayes_poison_nxm.cf Ruleset for SpamAssassin";
PARSE_NEW_VER_SCRIPTS[21]="${PERL} -ne 'print if /^\s*#.*(vers?|version|rev|revision)[:\.\s]*[0-9]/i;' | sort | ${TAIL}";
# CF_MUNGE_SCRIPTS[21]="nothing for this ruleset.";

#### Here are settings for sare_oem ####
SARE_OEM=22;    # Index of sare_oem data into the arrays is 22
    CF_URLS[22]="http://www.rulesemporium.com/rules/70_sare_oem.cf"
    CF_FILES[22]="70_sare_oem.cf";
    CF_NAMES[22]="SARE OEM Ruleset for SpamAssassin";
PARSE_NEW_VER_SCRIPTS[22]="${PERL} -ne 'print if /^\s*#.*(vers?|version|rev|revision)[:\.\s]*[0-9]/i;' | sort | ${TAIL}";
# CF_MUNGE_SCRIPTS[22]="nothing for this ruleset.";

#### Here are settings for sare_random ####
SARE_RANDOM=23;    # Index of sare_random data into the arrays is 23
    CF_URLS[23]="http://www.rulesemporium.com/rules/70_sare_random.cf"
    CF_FILES[23]="70_sare_random.cf";
    CF_NAMES[23]="SARE Random Ruleset for SpamAssassin 2.5x and higher";
PARSE_NEW_VER_SCRIPTS[23]="${PERL} -ne 'print if /^\s*#.*(vers?|version|rev|revision)[:\.\s]*[0-9]/i;' | sort | ${TAIL}";
# CF_MUNGE_SCRIPTS[23]="nothing for this ruleset.";

#### Here are settings for sare_header ####
SARE_HEADER=24;    # Index of sare_header data into the arrays is 24
SARE_HEADER_ABUSE=24;    # Left here for backwards compatibility
    CF_URLS[24]="http://www.rulesemporium.com/rules/70_sare_header.cf"
    CF_FILES[24]="70_sare_header.cf";
    OLD_CF_FILES[24]="header_abuse.cf 70_sare_header_abuse.cf";
    CF_NAMES[24]="Ruleset for header abuse";
PARSE_NEW_VER_SCRIPTS[24]="${PERL} -ne 'print if /^\s*#.*(vers?|version|rev|revision)[:\.\s]*[0-9]/i;' | ${HEAD}";
# CF_MUNGE_SCRIPTS[24]="nothing for this ruleset.";

#### Here are settings for coding_html ####
SARE_CODING_HTML=25;    # Don't use. Use SARE_CODING instead. Kept for backwards compatibility.
    SARE_CODING=25;    # Index of coding_html data into the arrays is 25
    CF_URLS[25]="http://www.rulesemporium.com/rules/70_sare_html.cf"
    CF_FILES[25]="70_sare_html.cf";
    OLD_CF_FILES[25]="coding_html.cf";
    CF_NAMES[25]="Ruleset for html coding abuse";
PARSE_NEW_VER_SCRIPTS[25]="${PERL} -ne 'print if /^\s*#.*(vers?|version|rev|revision)[:\.\s]*[0-9]/i;' | sort | ${TAIL}";
# CF_MUNGE_SCRIPTS[25]="nothing for this ruleset.";

#### Here are settings for SARE Specific Ruleset ####
SARE_SPECIFIC=26;    # Index of SARE_SPECIFIC data into the arrays is 26
    CF_URLS[26]="http://www.rulesemporium.com/rules/70_sare_specific.cf";
    CF_FILES[26]="70_sare_specific.cf";
    CF_NAMES[26]="SARE Specific Ruleset";
PARSE_NEW_VER_SCRIPTS[26]="${GREP} -i '^# Version:' | ${HEAD}";
# CF_MUNGE_SCRIPTS[26]="nothing for this ruleset.";

#########################################
####   End Rules File Registry       ####
#########################################
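As an aside, a locally maintained ruleset appended to a copy of this registry might look something like the following sketch; the index, URL, file name and description are placeholders, and only the variable conventions (CF_URLS, CF_FILES, CF_NAMES, PARSE_NEW_VER_SCRIPTS) come from the entries above:

    #### Example: a locally published ruleset (placeholder values) ####
    MY_LOCAL_RULES=100;    # pick an index that is not used above
        CF_URLS[100]="http://mirror.example.net/rules/99_local.cf";
        CF_FILES[100]="99_local.cf";
        CF_NAMES[100]="Example local ruleset";
    PARSE_NEW_VER_SCRIPTS[100]="${PERL} -ne 'print if /^\s*#.*(vers?|version|rev|revision)[:\.\s]*[0-9]/i;' | sort | ${TAIL}";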
--- rules_du_jour.orig  Sat Aug 28 17:45:23 2004
+++ rules_du_jour       Wed Dec 22 21:19:13 2004
@@ -103,7 +103,9 @@
 [ "${MAILCMD}" ] || MAILCMD="mail";    # Location of the mail program
                                        # that takes and understand the -s flag
-# DEBUG="true";        # Uncomment this to force debug mode on (or use -D)
+[ "$DEBUG" = "yes" -o "$DEBUG" = "y" ] && DEBUG="true";
+[ "$DEBUG" = "true" ] || DEBUG="false"
+# DEBUG=true        # Uncomment this to force debug mode on (or use -D)
 
 
 #### End Local SpamAssassin Settings ####
 
@@ -116,7 +118,7 @@
 [ "${TMPDIR}" ] || TMPDIR="${SA_DIR}/RulesDuJour";    # Where we store old rulesets. If you delete
                                                       # this directory, RuleSets may be detected as
                                                       # out of date the next time you run rules_du_jour.
-RDJ_URL="http://sandgnat.com/rdj/rules_du_jour";    # URL to update this script
+[ "${RDJ_URL}" ] || RDJ_URL="http://sandgnat.com/rdj/rules_du_jour";    # URL to update this script
 
 #### CF Files information ####
 # These are bash Array Variables ("man bash" for more information)
@@ -127,201 +129,12 @@
 declare -a PARSE_NEW_VER_SCRIPTS;    # Command to run on the file to retrieve new version info
 [ ${CF_MUNGE_SCRIPTS} ] || declare -a CF_MUNGE_SCRIPTS;    # This (optionally) modifies the file; eg: lower scores
 
-
-#########################################
-####   Begin Rules File Registry     ####
-#########################################
-
-# If you add more RuleSets to your own registry, please contribute the settings to the www.exit0.us wiki
-# http://www.exit0.us/index.php/RulesDuJourRuleSets
-
-#### Here are settings for Tripwire. ####
-TRIPWIRE=0;    # Index of Tripwire data into the arrays is 0
-    CF_URLS[0]="http://www.rulesemporium.com/rules/99_FVGT_Tripwire.cf";
-    CF_FILES[0]="tripwire.cf";
-    CF_NAMES[0]="TripWire";
-PARSE_NEW_VER_SCRIPTS[0]="${PERL} -ne 'print if /^\s*#.*(vers?|version|rev|revision)[:\.\s]*[0-9]/i;' | sort | ${TAIL}";
-# CF_MUNGE_SCRIPTS[0]="nothing necessary for this ruleset.";
-
-#### Here are settings for Big Evil. ####
-BIGEVIL=1;    # Index of Big Evil is 1
-    CF_URLS[1]="http://www.rulesemporium.com/rules/bigevil.cf";
-    CF_FILES[1]="bigevil.cf";
-    CF_NAMES[1]="Big Evil";
-PARSE_NEW_VER_SCRIPTS[1]="${HEAD}";
-# CF_MUNGE_SCRIPTS[1]="nothing necessary for this ruleset.";
-
-### INDEX NUMBERS 2-6 ARE RESERVED. DO NOT USE.
-# NOTE: As of 2004-02-26, backhair, weedsk, and chickenpox updates are no longer available.
-# BACKHAIR=3;    # Index of Backhair is 3
-# WEEDS1=4;    # Index of Weeds Set 1 is 4
-# WEEDS2=5;    # Index of Weeds Set 2 is 5
-# CHICKENPOX=6;    # Index of ChickenPox is 6
-
-#### Here are settings for AntiDrug. ####
-ANTIDRUG=7;    # Index of antidrug is 7
-    CF_URLS[7]="http://mywebpages.comcast.net/mkettler/sa/antidrug.cf"
-    CF_FILES[7]="antidrug.cf";
-    CF_NAMES[7]="Matt Kettler's AntiDrug";
-PARSE_NEW_VER_SCRIPTS[7]="${PERL} -ne 'print if /^\s*#.*(vers?|version|rev|revision)[:\.\s]*[0-9]/i;' | sort | ${TAIL}";
-# CF_MUNGE_SCRIPTS[7]="nothing for this ruleset.";
-
-#### Here are settings for evilnumber ####
-EVILNUMBERS=8;    # Index of evilnumbers data into the arrays is 8
-    CF_URLS[8]="http://www.rulesemporium.com/rules/evilnumbers.cf";
-    CF_FILES[8]="evilnumbers.cf";
-    CF_NAMES[8]="EvilNumber";
-PARSE_NEW_VER_SCRIPTS[8]="${PERL} -ne 'print if /^\s*#.*(vers?|version|rev|revision)[:\.\s]*[0-9]/i;' | sort | ${TAIL}";
-# CF_MUNGE_SCRIPTS[8]="nothing for this ruleset.";
-
-#### Here are settings for sa-blacklist ####
-BLACKLIST=9;    # Index of sa-blacklist data into the arrays is 9
-    CF_URLS[9]="http://www.stearns.org/sa-blacklist/sa-blacklist.current";
-    CF_FILES[9]="blacklist.cf";
-    CF_NAMES[9]="William Stearn's sa-blacklist";
-PARSE_NEW_VER_SCRIPTS[9]="${GREP} -i '^#.*sa-blacklist: 200' | sort | ${TAIL}";
-# CF_MUNGE_SCRIPTS[9]="nothing for this ruleset.";
-
-#### Here are settings for sa-blacklist-uri ####
-BLACKLIST_URI=10;    # Index of sa-blacklist-uri data into the arrays is 10
-    CF_URLS[10]="http://www.stearns.org/sa-blacklist/sa-blacklist.current.uri.cf";
-    CF_FILES[10]="blacklist-uri.cf";
-    CF_NAMES[10]="William Stearn's URI blacklist";
-PARSE_NEW_VER_SCRIPTS[10]="${GREP} -i '^#.*sa-blacklist.uri: 200' | sort | ${TAIL}";
-# CF_MUNGE_SCRIPTS[10]="nothing for this ruleset.";
-
-#### Here are settings for sa-blacklist-random ####
-RANDOMVAL=11;    # Index of sa-blacklist-random data into the arrays is 11
-    CF_URLS[11]="http://www.stearns.org/sa-blacklist/random.current.cf";
-    CF_FILES[11]="random.cf";
-    CF_NAMES[11]="William Stearn's RANDOM WORD Ruleset";
-PARSE_NEW_VER_SCRIPTS[11]="${GREP} -i '^#release' | ${TAIL}";
-# CF_MUNGE_SCRIPTS[11]="nothing for this ruleset.";
-
-#### Here are settings for Tim Jackson's (et al) bogus virus warnings ####
-BOGUSVIRUS=12;    # Index of bogus-virus-warnings data into the arrays is 12
-    CF_URLS[12]="http://www.timj.co.uk/linux/bogus-virus-warnings.cf";
-    CF_FILES[12]="bogus-virus-warnings.cf";
-    CF_NAMES[12]="Tim Jackson's (et al) bogus virus warnings";
-PARSE_NEW_VER_SCRIPTS[12]="${PERL} -ne 'print if /^\s*#.*(vers?|version|rev|revision)[:\.\s]*[0-9]/i;' | sort | ${TAIL}";
-# CF_MUNGE_SCRIPTS[12]="nothing for this ruleset.";
-
-### INDEX NUMBER 13 IS RESERVED. DO NOT USE.
-# NOTE: As of 2004-06-XX, MrWiggly has been deprecated. Use 70_sare_specific.cf (SARE_SPECIFIC) instead
-# MRWIGGLY=13;    # Index of MrWiggly.cf data into the arrays is 13
-
-#### Here are settings for sare_adult ####
-SARE_ADULT=14;    # Index of sare_adult.cf data into the arrays is 14
-    CF_URLS[14]="http://www.rulesemporium.com/rules/70_sare_adult.cf"
-    CF_FILES[14]="70_sare_adult.cf";
-    CF_NAMES[14]="SARE Adult Content Ruleset";
-PARSE_NEW_VER_SCRIPTS[14]="${PERL} -ne 'print if /^\s*#.*(vers?|version|rev|revision)[:\.\s]*[0-9]/i;' | sort | ${TAIL}";
-# CF_MUNGE_SCRIPTS[14]="nothing for this ruleset.";
-
-#### Here are settings for sare_fraud_post25x ####
-SARE_FRAUD=15;    # Index of sare_fraud_post25x data into the arrays is 15
-    CF_URLS[15]="http://www.rulesemporium.com/rules/99_sare_fraud_post25x.cf"
-    CF_FILES[15]="99_sare_fraud_post25x.cf";
-    CF_NAMES[15]="SARE Fraud Detection Ruleset (for SA ver. 2.5x and greater)";
-PARSE_NEW_VER_SCRIPTS[15]="${PERL} -ne 'print if /^\s*#.*(vers?|version|rev|revision)[:\.\s]*[0-9]/i;' | sort | ${TAIL}";
-# CF_MUNGE_SCRIPTS[15]="nothing for this ruleset.";
-
-#### Here are settings for sare_fraud_pre25x ####
-SARE_FRAUD_PRE25X=16;    # Index of sare_fraud_pre25x data into the arrays is 16
-    CF_URLS[16]="http://www.rulesemporium.com/rules/99_sare_fraud_pre25x.cf"
-    CF_FILES[16]="99_sare_fraud_pre25x.cf";
-    CF_NAMES[16]="SARE Fraud Detection Ruleset (for SA prior to ver. 2.5x)";
-PARSE_NEW_VER_SCRIPTS[16]="${PERL} -ne 'print if /^\s*#.*(vers?|version|rev|revision)[:\.\s]*[0-9]/i;' | sort | ${TAIL}";
-# CF_MUNGE_SCRIPTS[16]="nothing for this ruleset.";
-
-#### Here are settings for sare_biz_market_learn_post25x ####
-SARE_BML=17;    # Index of sare_biz_market_learn_post25x data into the arrays is 17
-    CF_URLS[17]="http://www.rulesemporium.com/rules/72_sare_bml_post25x.cf"
-    CF_FILES[17]="72_sare_bml_post25x.cf";
-    CF_NAMES[17]="SARE BIZ/Marketing/Learning Ruleset (for SA ver. 2.5x and greater)";
-PARSE_NEW_VER_SCRIPTS[17]="${PERL} -ne 'print if /^\s*#.*(vers?|version|rev|revision)[:\.\s]*[0-9]/i;' | sort | ${TAIL}";
-# CF_MUNGE_SCRIPTS[17]="nothing for this ruleset.";
-
-#### Here are settings for sare_biz_market_learn_pre25x ####
-SARE_BML_PRE25X=18;    # Index of sare_biz_market_learn_pre25x data into the arrays is 18
-    CF_URLS[18]="http://www.rulesemporium.com/rules/71_sare_bml_pre25x.cf"
-    CF_FILES[18]="71_sare_bml_pre25x.cf";
-    CF_NAMES[18]="SARE BIZ/Marketing/Learning Ruleset (for SA prior to ver. 2.5x)";
-PARSE_NEW_VER_SCRIPTS[18]="${PERL} -ne 'print if /^\s*#.*(vers?|version|rev|revision)[:\.\s]*[0-9]/i;' | sort | ${TAIL}";
-# CF_MUNGE_SCRIPTS[18]="nothing for this ruleset.";
-
-#### Here are settings for ratware ####
-SARE_RATWARE=19;    # Index of ratware data into the arrays is 19
-    CF_URLS[19]="http://www.rulesemporium.com/rules/70_sare_ratware.cf"
-    CF_FILES[19]="70_sare_ratware.cf";
-    OLD_CF_FILES[19]="ratware.cf";
-    CF_NAMES[19]="SARE Ratware Detection Ruleset";
-PARSE_NEW_VER_SCRIPTS[19]="${PERL} -ne 'print if /^\s*#.*(vers?|version|rev|revision)[:\.\s]*[0-9]/i;' | sort | ${TAIL}";
-# CF_MUNGE_SCRIPTS[19]="nothing for this ruleset.";
-
-#### Here are settings for sare_spoof ####
-SARE_SPOOF=20;    # Index of sare_spoof data into the arrays is 20
-    CF_URLS[20]="http://www.rulesemporium.com/rules/70_sare_spoof.cf"
-    CF_FILES[20]="70_sare_spoof.cf";
-    CF_NAMES[20]="SARE Spoof Ruleset for SpamAssassin";
-PARSE_NEW_VER_SCRIPTS[20]="${PERL} -ne 'print if /^\s*#.*(vers?|version|rev|revision)[:\.\s]*[0-9]/i;' | sort | ${TAIL}";
-# CF_MUNGE_SCRIPTS[20]="nothing for this ruleset.";
-
-#### Here are settings for sare_bayes_poison_nxm ####
-SARE_BAYES_POISON_NXM=21;    # Index of sare_bayes_poison_nxm data into the arrays is 21
-    CF_URLS[21]="http://www.rulesemporium.com/rules/70_sare_bayes_poison_nxm.cf"
-    CF_FILES[21]="70_sare_bayes_poison_nxm.cf";
-    CF_NAMES[21]="SARE 70_sare_bayes_poison_nxm.cf Ruleset for SpamAssassin";
-PARSE_NEW_VER_SCRIPTS[21]="${PERL} -ne 'print if /^\s*#.*(vers?|version|rev|revision)[:\.\s]*[0-9]/i;' | sort | ${TAIL}";
-# CF_MUNGE_SCRIPTS[21]="nothing for this ruleset.";
-
-#### Here are settings for sare_oem ####
-SARE_OEM=22;    # Index of sare_oem data into the arrays is 22
-    CF_URLS[22]="http://www.rulesemporium.com/rules/70_sare_oem.cf"
-    CF_FILES[22]="70_sare_oem.cf";
-    CF_NAMES[22]="SARE OEM Ruleset for SpamAssassin";
-PARSE_NEW_VER_SCRIPTS[22]="${PERL} -ne 'print if /^\s*#.*(vers?|version|rev|revision)[:\.\s]*[0-9]/i;' | sort | ${TAIL}";
-# CF_MUNGE_SCRIPTS[22]="nothing for this ruleset.";
-
-#### Here are settings for sare_random ####
-SARE_RANDOM=23;    # Index of sare_random data into the arrays is 23
-    CF_URLS[23]="http://www.rulesemporium.com/rules/70_sare_random.cf"
-    CF_FILES[23]="70_sare_random.cf";
-    CF_NAMES[23]="SARE Random Ruleset for SpamAssassin 2.5x and higher";
-PARSE_NEW_VER_SCRIPTS[23]="${PERL} -ne 'print if /^\s*#.*(vers?|version|rev|revision)[:\.\s]*[0-9]/i;' | sort | ${TAIL}";
-# CF_MUNGE_SCRIPTS[23]="nothing for this ruleset.";
-
-#### Here are settings for sare_header ####
-SARE_HEADER=24;    # Index of sare_header data into the arrays is 24
-SARE_HEADER_ABUSE=24;    # Left here for backwards compatibility
-    CF_URLS[24]="http://www.rulesemporium.com/rules/70_sare_header.cf"
-    CF_FILES[24]="70_sare_header.cf";
-    OLD_CF_FILES[24]="header_abuse.cf 70_sare_header_abuse.cf";
-    CF_NAMES[24]="Ruleset for header abuse";
-PARSE_NEW_VER_SCRIPTS[24]="${PERL} -ne 'print if /^\s*#.*(vers?|version|rev|revision)[:\.\s]*[0-9]/i;' | ${HEAD}";
-# CF_MUNGE_SCRIPTS[24]="nothing for this ruleset.";
-
-#### Here are settings for coding_html ####
-SARE_CODING_HTML=25;    # Don't use. Use SARE_CODING instead. Kept for backwards compatibility.
-    SARE_CODING=25;    # Index of coding_html data into the arrays is 25
-    CF_URLS[25]="http://www.rulesemporium.com/rules/70_sare_html.cf"
-    CF_FILES[25]="70_sare_html.cf";
-    OLD_CF_FILES[25]="coding_html.cf";
-    CF_NAMES[25]="Ruleset for html coding abuse";
-PARSE_NEW_VER_SCRIPTS[25]="${PERL} -ne 'print if /^\s*#.*(vers?|version|rev|revision)[:\.\s]*[0-9]/i;' | sort | ${TAIL}";
-# CF_MUNGE_SCRIPTS[25]="nothing for this ruleset.";
-
-#### Here are settings for SARE Specific Ruleset ####
-SARE_SPECIFIC=26;    # Index of SARE_SPECIFIC data into the arrays is 26
-    CF_URLS[26]="http://www.rulesemporium.com/rules/70_sare_specific.cf";
-    CF_FILES[26]="70_sare_specific.cf";
-    CF_NAMES[26]="SARE Specific Ruleset";
-PARSE_NEW_VER_SCRIPTS[26]="${GREP} -i '^# Version:' | ${HEAD}";
-# CF_MUNGE_SCRIPTS[26]="nothing for this ruleset.";
-
-#########################################
-####   End Rules File Registry       ####
-#########################################
+# Registry file is read from one of the following locations.
+# Reads persistent registry files from /etc, if one exists.
+# /etc/rulesdujour/registry is the recommended registry file location.
+for i in ${RDJ_REGISTRYFILE} /etc/rulesdujour/registry /etc/rulesdujour.registry /etc/mail/rulesdujour.registry /etc/sysconfig/RulesDuJour.registry /etc/sysconfig/rulesdujour.registry ; do
+    [ -f "$i" ] && source $i ;
+done;
 
 # Do not update beyond this line unless you know what you are doing.
 
@@ -340,8 +153,8 @@
     CURL_OUTPUT=`${CURL} ${CURL_FILE} ${CURL_URL} 2>&1`;
    CURL_ERROR=$?;
    CURL_HTTP_CODE="${CURL_OUTPUT:0:3}";
-    [ "${DEBUG}" ] && echo "exec: ${CURL} ${CURL_FILE} ${CURL_URL} 2>&1";
-    [ "${DEBUG}" ] && echo "curl_output: ${CURL_OUTPUT}";
+    [ "${DEBUG}" = "true" ] && echo "exec: ${CURL} ${CURL_FILE} ${CURL_URL} 2>&1";
+    [ "${DEBUG}" = "true" ] && echo "curl_output: ${CURL_OUTPUT}";
     [ "${CURL_HTTP_CODE}" = "200" ] && DOWNLOADED="true";
     [ "${CURL_HTTP_CODE}" = "404" ] && WAS404="true";
     [ "${CURL_HTTP_CODE:0:1}" = "4" ] && WAS4xx5xx="true";
@@ -351,10 +164,11 @@
 }
 
 function WgetHttpGet() {
+env
     URL=$1;
     ${WGET} ${URL} > ${TMPDIR}/wget.log 2>&1;
-    [ "${DEBUG}" ] && echo "exec: ${WGET} ${URL} > ${TMPDIR}/wget.log 2>&1";
-    [ "${DEBUG}" ] && echo "wget_output: `cat ${TMPDIR}/wget.log`";
+    [ "${DEBUG}" = "true" ] && echo "exec: ${WGET} ${URL} > ${TMPDIR}/wget.log 2>&1";
+    [ "${DEBUG}" = "true" ] && echo "wget_output: `cat ${TMPDIR}/wget.log`";
     ${GREP} 'saved' ${TMPDIR}/wget.log > /dev/null;
     DOWNLOADED=$?;
     ${GREP} 'ERROR [45][0-9][0-9]' ${TMPDIR}/wget.log > /dev/null;
     WAS4xx5xx=$?;
@@ -384,7 +198,7 @@
 # if invoked with -D, enable DEBUG here.
 [ "$1" = "-D" ] && DEBUG="true";
 # if running interactively, enable DEBUG here.
-[ -t 0 ] && DEBUG="true";
+[ -t 0 ] && SOME_DEBUG="true";
 
 # Test if curl available (Bash feature)
 command -v ${CURL_PROG} > /dev/null 2>&1 ; CURL_AVAILABLE=$?;
@@ -399,7 +213,7 @@
     # Fall back to wget if version is less than 7.10
     if [ ! "${CURL_COMPRESSION}" ] ; then
         CURL_AVAILABLE=;
-        [ "${DEBUG}" ] && { echo "Curl version is ${CURL_VERSION} (Not 7.10 or greater). Falling back to wget."; sleep 3; };
+        [ "${DEBUG}" = "true" ] && { echo "Curl version is ${CURL_VERSION} (Not 7.10 or greater). Falling back to wget."; sleep 3; };
     fi
 fi
 
@@ -408,8 +222,8 @@
 MAXDELAY=3600;
 DELAY=0;
 [ ! -t 0 ] && [ ${MAXDELAY} -gt 0 ] && let DELAY="${RANDOM} % ${MAXDELAY}";
-[ "${DEBUG}" ] && [ ${DELAY} -gt 0 ] && echo "Probably running from cron... sleeping for a random interval (${DELAY} seconds)";
-[ ${DELAY} -gt 0 ] && sleep ${DELAY};
+[ "${DEBUG}" = "true" ] && [ ${DELAY} -gt 0 ] && echo "Probably running from cron... sleeping for a random interval (${DELAY} seconds)";
+[ ${DELAY} -gt 0 ] && sleep ${DELAY} && echo sleeping ;
 
 
 # Save old working dir
@@ -450,7 +264,7 @@
     INDEX=${!RULESET_NAME};
     if [ ! "${INDEX}" ] ; then
         MSG_INVALID_RULENAME="\nNo index found for ruleset named ${RULESET_NAME}. Check that this ruleset is still valid.";
-        [ "${DEBUG}" ] && echo -e "${MSG_INVALID_RULENAME}";
+        [ "${DEBUG}" = "true" ] && echo -e "${MSG_INVALID_RULENAME}";
         MESSAGES="${MESSAGES}\n${MSG_INVALID_RULENAME}";
     else
         CF_URL=${CF_URLS[${INDEX}]};
@@ -464,8 +278,13 @@
         CF_BASENAME=`basename ${CF_URL}`;
         DATE=`date +"%Y%m%d-%H%M"`
 
+
-        if [ "${DEBUG}" ] ; then
+        if [ "${SOME_DEBUG}" = "true" ] ; then
+            echo "Loading ${CF_URL}";
+        fi
+
+        if [ "${DEBUG}" = "true" ] ; then
             # Dump the variables to stdout
             echo "";
             echo "------ ${RULESET_NAME} ------";
@@ -478,11 +297,11 @@
             echo "CF_MUNGE_SCRIPT=${CF_MUNGE_SCRIPT}";
         fi
 
-        [ "${DEBUG}" ] && [ -f ${TMPDIR}/${CF_BASENAME} ] && echo "Old ${CF_BASENAME} already existed in ${TMPDIR}...";
-        [ "${DEBUG}" ] && [ ! -f ${TMPDIR}/${CF_BASENAME} ] && \
+        [ "${DEBUG}" = "true" ] && [ -f ${TMPDIR}/${CF_BASENAME} ] && echo "Old ${CF_BASENAME} already existed in ${TMPDIR}...";
+        [ "${DEBUG}" = "true" ] && [ ! -f ${TMPDIR}/${CF_BASENAME} ] && \
             [ ! -f ${SA_DIR}/${CF_FILE} ] && echo "This is the first time downloading ${CF_BASENAME}...";
-        [ "${DEBUG}" ] && [ ! -f ${TMPDIR}/${CF_BASENAME} ] && [ -f ${SA_DIR}/${CF_FILE} ] && \
+        [ "${DEBUG}" = "true" ] && [ ! -f ${TMPDIR}/${CF_BASENAME} ] && [ -f ${SA_DIR}/${CF_FILE} ] && \
             echo "Copying from ${SA_DIR}/${CF_FILE} to ${TMPDIR}/${CF_BASENAME}...";
         [ ! -f ${TMPDIR}/${CF_BASENAME} ] && [ -f ${SA_DIR}/${CF_FILE} ] && \
             cp ${SA_DIR}/${CF_FILE} ${TMPDIR}/${CF_BASENAME} && \
@@ -492,7 +311,7 @@
 
 
 
-        [ "${DEBUG}" ] && echo "Retrieving file from ${CF_URL}...";
+        [ "${DEBUG}" = "true" ] && echo "Retrieving file from ${CF_URL}...";
         # send wget output to a temp file for grepping
         HttpGet ${CF_URL} ${TMPDIR}/${CF_BASENAME};
@@ -501,12 +320,12 @@
         [ "${FAILED}" ] && RULES_THAT_404ED="${RULES_THAT_404ED}\n${CF_NAME} had an unknown error:\n${HTTP_ERROR}";
         [ "${WAS404}" ] && RULES_THAT_404ED="${RULES_THAT_404ED}\n${CF_NAME} not found (404) at ${CF_URL}";
         [ "${WAS4xx5xx}" ] && RULES_THAT_404ED="${RULES_THAT_404ED}\n${CF_NAME} was not retrieved because of: ${HTTP_ERROR} from ${CF_URL}.";
-        [ "${WAS4xx5xx}" ] && [ "${DEBUG}" ] && RULES_THAT_404ED="${RULES_THAT_404ED}\nAdditional Info:\n${HTTP_ERROR}";
+        [ "${WAS4xx5xx}" ] && [ "${DEBUG}" = "true" ] && RULES_THAT_404ED="${RULES_THAT_404ED}\nAdditional Info:\n${HTTP_ERROR}";
 
-        [ "${DEBUG}" ] && [ "${WAS4xx5xx}" ] && echo "Got ${HTTP_ERROR} from ${CF_NAME} at ${CF_URL} ...";
-        [ "${DEBUG}" ] && [ ! "${WAS4xx5xx}" ] && [ "${WAS404}" ] && echo "Got 404 from ${CF_NAME} at ${CF_URL} ...";
-        [ "${DEBUG}" ] && [ ! "${WAS4xx5xx}" ] && [ "${DOWNLOADED}" ] && echo "New version downloaded...";
-        [ "${DEBUG}" ] && [ ! "${WAS4xx5xx}" ] && [ ! "${DOWNLOADED}" ] && echo "${CF_BASENAME} was up to date [skipped downloading of ${CF_URL} ] ...";
+        [ "${DEBUG}" = "true" ] && [ "${WAS4xx5xx}" ] && echo "Got ${HTTP_ERROR} from ${CF_NAME} at ${CF_URL} ...";
+        [ "${DEBUG}" = "true" ] && [ ! "${WAS4xx5xx}" ] && [ "${WAS404}" ] && echo "Got 404 from ${CF_NAME} at ${CF_URL} ...";
+        [ "${DEBUG}" = "true" ] && [ ! "${WAS4xx5xx}" ] && [ "${DOWNLOADED}" ] && echo "New version downloaded...";
+        [ "${DEBUG}" = "true" ] && [ ! "${WAS4xx5xx}" ] && [ ! "${DOWNLOADED}" ] && echo "${CF_BASENAME} was up to date [skipped downloading of ${CF_URL} ] ...";
 
 
@@ -519,7 +338,7 @@
             ( [ ! -f ${SA_DIR}/${CF_FILE} ] && \
             [ -f ${TMPDIR}/${CF_BASENAME} ]) ) ; then
             if [ "${CF_MUNGE_SCRIPT}" ] ; then
-                [ "${DEBUG}" ] && echo "Munging output using command: ${CF_MUNGE_SCRIPT}";
+                [ "${DEBUG}" = "true" ] && echo "Munging output using command: ${CF_MUNGE_SCRIPT}";
                 sh -c "${CF_MUNGE_SCRIPT}" < ${TMPDIR}/${CF_BASENAME} > ${TMPDIR}/${CF_BASENAME}.2;
             else
                 cp ${TMPDIR}/${CF_BASENAME} ${TMPDIR}/${CF_BASENAME}.2;
@@ -543,7 +362,7 @@
             for OLD_CF_FILE in ${OLD_CF_FILES} ; do
                 if [ -f ${SA_DIR}/${OLD_CF_FILE} ] ; then
                     MSG_FILENAME_CHANGED="\n*** ${CF_NAME} has changed names from ${OLD_CF_FILE} to ${CF_FILE}.\nBecause of the name change I am removing the old file and installing the new file.";
-                    [ "${DEBUG}" ] && echo -e "${MSG_FILENAME_CHANGED}";
+                    [ "${DEBUG}" = "true" ] && echo -e "${MSG_FILENAME_CHANGED}";
                     MESSAGES="${MESSAGES}\n${MSG_FILENAME_CHANGED}";
                     [ "${SINGLE_EMAIL_ONLY}" ] && QUEUE_SINGLE_EMAIL="true" || \
                         echo -e ${MSG_FILENAME_CHANGED} | ${MAILCMD} -s "RulesDuJour/`hostname`: ${CF_NAME} RuleSet's name has changed" ${MAIL_ADDRESS}
@@ -552,8 +371,8 @@
                 fi
             done
 
-            [ "${DEBUG}" ] && [ ! -f ${SA_DIR}/${CF_FILE} ] && echo "Installing new ruleset from ${TMPDIR}/${CF_BASENAME}.2" ;
-            [ "${DEBUG}" ] && [ -f ${SA_DIR}/${CF_FILE} ] && echo "Old version ${SA_DIR}/${CF_FILE} differs from new version ${TMPDIR}/${CF_BASENAME}.2" && echo "Backing up old version...";
+            [ "${DEBUG}" = "true" ] && [ ! -f ${SA_DIR}/${CF_FILE} ] && echo "Installing new ruleset from ${TMPDIR}/${CF_BASENAME}.2" ;
+            [ "${DEBUG}" = "true" ] && [ -f ${SA_DIR}/${CF_FILE} ] && echo "Old version ${SA_DIR}/${CF_FILE} differs from new version ${TMPDIR}/${CF_BASENAME}.2" && echo "Backing up old version...";
             [ -f ${SA_DIR}/${CF_FILE} ] && mv -f ${SA_DIR}/${CF_FILE} ${TMPDIR}/${CF_FILE}.${DATE};
 
             # Save the command that can be used to undo this change, if rules won't --lint
@@ -562,14 +381,14 @@
                 UNDO_COMMAND="${UNDO_COMMAND} mv -f ${TMPDIR}/${CF_FILE}.${DATE} ${SA_DIR}/${CF_FILE};" || \
                 UNDO_COMMAND="${UNDO_COMMAND} rm -f ${SA_DIR}/${CF_FILE};";
 
-            [ "${DEBUG}" ] && [ -f ${TMPDIR}/${CF_BASENAME}.2 ] && echo "Installing new version...";
+            [ "${DEBUG}" = "true" ] && [ -f ${TMPDIR}/${CF_BASENAME}.2 ] && echo "Installing new version...";
             [ -f ${TMPDIR}/${CF_BASENAME}.2 ] && mv -f ${TMPDIR}/${CF_BASENAME}.2 ${SA_DIR}/${CF_FILE};
 
             NEWVER=`sh -c "cat ${SA_DIR}/${CF_FILE} | ${PARSE_NEW_VER_SCRIPT}"`;
             MSG_CHANGED="\n${CF_NAME} has changed on `hostname`.\nVersion line: ${NEWVER}";
             MESSAGES="${MESSAGES}\n${MSG_CHANGED}";
-            [ "${DEBUG}" ] && echo -e "${MSG_CHANGED}";
+            [ "${DEBUG}" = "true" ] && echo -e "${MSG_CHANGED}";
             [ "${SINGLE_EMAIL_ONLY}" ] && QUEUE_SINGLE_EMAIL="true" || \
                 echo -e ${MSG_CHANGED} | ${MAILCMD} -s "RulesDuJour/`hostname`: ${CF_NAME} RuleSet has been updated" ${MAIL_ADDRESS}
@@ -593,7 +412,7 @@
 }
 
 [ "${RESTART_REQUIRED}" ] && {
-    [ "${DEBUG}" ] && echo "Attempting to --lint the rules.";
+    [ "${DEBUG}" = "true" ] && echo "Attempting to --lint the rules.";
     ${SA_LINT} > /dev/null 2>&1;
     LINTFAILED=$?;
     [ "${LINTFAILED}" = "0" ] && LINTFAILED=;    # Unset LINTFAILED if lint didn't fail.
@@ -606,14 +425,14 @@
         [ "${SINGLE_EMAIL_ONLY}" ] && QUEUE_SINGLE_EMAIL="true" || \
             echo -e "${WARNMSG}" | ${MAILCMD} -s "RulesDuJour/`hostname`: lint failed. Updates rolled back." ${MAIL_ADDRESS};
     else
-        [ "${DEBUG}" ] && echo "Restarting SpamAssassin using: sh -c \"${SA_RESTART}\"";
+        [ "${DEBUG}" = "true" ] && echo "Restarting SpamAssassin using: sh -c \"${SA_RESTART}\"";
         sh -c "${SA_RESTART}" > /dev/null 2>&1;
     fi
 }
 
-[ "${DEBUG}" ] && [ ! "${RESTART_REQUIRED}" ] && echo "No files updated; No restart required.";
+[ "${DEBUG}" = "true" ] && [ ! "${RESTART_REQUIRED}" ] && echo "No files updated; No restart required.";
 
-[ "${DEBUG}" ] && echo -e "\n\n\n\n\nRules Du Jour Run Summary:${MESSAGES}";
+[ "${DEBUG}" = "true" ] && echo -e "\n\n\n\n\nRules Du Jour Run Summary:${MESSAGES}";
 
 # Send the single consolidated notification email here.
 if [ "${SINGLE_EMAIL_ONLY}" ] && [ "${QUEUE_SINGLE_EMAIL}" ] ; then
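To illustrate how the pieces fit together after this patch, the kind of hourly wrapper described above might look roughly like the sketch below. The host name, paths, and the idea of fetching the registry with wget are illustrative assumptions, not part of the patch itself:

    #!/bin/sh
    # update-mail-rules.sh -- illustrative wrapper (placeholder URLs and paths)
    # Pull the registry published on a local mirror, then run rules_du_jour,
    # which now sources it from /etc/rulesdujour/registry via the new loop.
    set -e
    wget -q -O /etc/rulesdujour/registry http://mirror.example.net/rdj/registry

    # DEBUG stays "false" unless the config sets it to yes/true/y, so this can
    # run from an interactive session without flooding the terminal; only the
    # "Loading <url>" lines from SOME_DEBUG appear.
    /usr/local/bin/rules_du_jour

    # ...other site-specific mail maintenance could follow here...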