Good morning *,
On 2009-08-04 13:51:24, Jason L Tibbitts III wrote:
> > "DS" == Dan Schaefer writes:
>
> DS> I'm glad to see this SPAM traffic has come to a halt. At least on my
> DS> mail server...
>
> Yes, I haven't seen any of those spams since the morning of the 31st.
> My servers were rejecting them like mad right up until that point in time.
> "DS" == Dan Schaefer writes:
DS> I'm glad to see this SPAM traffic has come to a halt. At least on my
DS> mail server...
Yes, I haven't seen any of those spams since the morning of the 31st.
My servers were rejecting them like mad right up until that point in
time (10:30 CDT), and then nothing.
Hi Dan and *,
On 2009-08-04 14:37:46, Dan Schaefer wrote:
> I'm glad to see this SPAM traffic has come to a halt. At least on my
> mail server...
They have seen that our spamassassin is working very efficiently. I get
only one or two spams per day... which are caught by SA of course.
Thanks
I'm glad to see this SPAM traffic has come to a halt. At least on my
mail server...
--
Dan Schaefer
Web Developer/Systems Analyst
Performance Administration Corp.
(apologies for top posting, but the email software here does not really do
quoting in a way that works out well otherwise)
If your mail contains SpamAssassin headers then it was (obviously) processed
through SpamAssassin. Just because you have BL checks in your MTA does not
necessarily mean the message was rejected before SA saw it.
On Thu, 23 Jul 2009, Dan Schaefer wrote:
> > Are you quite sure that an upstream copy of SA, e.g. in your ISP
> > or at a sender site that scans for outgoing spam, hasn't already
> > added X-* headers to the message?
>
> No. Is that even possible to track down?
There would probably be an X-Spam-Checker-Version header in your inbound mail stream.
On Thu, 2009-07-23 at 12:25 -0400, Dan Schaefer wrote:
> > Are you quite sure that an upstream copy of SA, e.g. in your ISP or at a
> > sender site that scans for outgoing spam, hasn't already added X-*
> > headers to the message?
> >
> > Martin
>
> No. Is that even possible to track down?
On Thu, 23 Jul 2009, Dan Schaefer wrote:
Are you quite sure that an upstream copy of SA, e.g. in your ISP or at
a sender site that scans for outgoing spam, hasn't already added X-*
headers to the message?
No. Is that even possible to track down?
There would probably be an X-Spam-Checker-Version header in your inbound mail stream.
Are you quite sure that an upstream copy of SA, e.g. in your ISP or at
a sender site that scans for outgoing spam, hasn't already added X-*
headers to the message?
No. Is that even possible to track down?
There would probably be an X-Spam-Checker-Version header in your
inbound mail stream.
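As a quick sketch of that check (the header name is the one discussed in this thread; the sample message below is invented), Python's stdlib email parser can spot an upstream SpamAssassin pass:

```python
import email

# Look for evidence of an upstream SpamAssassin pass by checking for its
# X-Spam-Checker-Version header. The raw message here is a made-up sample.
raw = (
    'From: a@example.com\r\n'
    'X-Spam-Checker-Version: SpamAssassin 3.2.5 (2008-06-10) on mx.example.com\r\n'
    'Subject: test\r\n'
    '\r\n'
    'body\r\n')
msg = email.message_from_string(raw)
print('X-Spam-Checker-Version' in msg)   # True -> already scanned upstream
```

Header lookups on a Message object are case-insensitive, so this catches the header however the upstream scanner capitalizes it.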
Are you quite sure that an upstream copy of SA, e.g. in your ISP or at a
sender site that scans for outgoing spam, hasn't already added X-*
headers to the message?
Martin
No. Is that even possible to track down?
--
Dan Schaefer
Web Developer/Systems Analyst
Performance Administration Corp.
Dan Schaefer wrote:
>
> If this is the case, then why does my email have the X-* headers in
> it? I have nothing in my postfix header_checks to discard the BL
> rules. Does anyone have a detailed flow chart of an SA/postfix setup that
> describes blacklisting? Or even a webpage describing the process?
On Wed, 22 Jul 2009, Dan Schaefer wrote:
For those of you that manage these rules,
URI_OBFU_X9_WS, URI_OBFU_WWW, AE_MEDS38, AE_MEDS39 did not mark this
email as spam
http://pastebin.com/m40f7cff4
The URI is not obfuscated, therefore it triggered the URIBL tests
properly (and scored 3 additional points).
Dan Schaefer wrote:
It means that if you were using BL at MTA level your SA might never
have seen the message at all.
No your rule would not be "overlooked" 'because the site is in a
blacklist' *unless* you were using the BL in your MTA and rejected
the transaction from a blacklisted IP address.
It means that if you were using BL at MTA level your SA might never have seen
the message at all.
No your rule would not be "overlooked" 'because the site is in a blacklist'
*unless* you were using the BL in your MTA and rejected the transaction from a
blacklisted IP address and, thus, never saw the message at all.
>For those of you that manage these rules,
>URI_OBFU_X9_WS, URI_OBFU_WWW, AE_MEDS38, AE_MEDS39 did not mark this
email as spam
I'm up to AE_MED45, so I wouldn't expect AE_MEDS38 and 39 to be
hitting anything currently.
>http://pastebin.com/m40f7cff4
This is not an obfuscated domain. You
On Thu, 2009-07-23 at 07:34 +0100, rich...@buzzhost.co.uk wrote:
> It's catching on :-)
this new obfuscation is already caught by AE_MED45, but I can foresee a
variant that might not match...
How about:
body __MED_OB /\bw{2,3}(?:[[:punct:][:space:]]{1,5}|[[:space:][:punct:]]{1,3}dot[[
From: Dan Schaefer [mailto:d...@performanceadmin.com]
>For those of you that manage these rules,
>URI_OBFU_X9_WS, URI_OBFU_WWW, AE_MEDS38, AE_MEDS39 did not mark this email as
>spam
I'm up to AE_MED45, so I wouldn't expect AE_MEDS38 and 39 to be hitting
anything currently.
>http://pastebin.com/m40f7cff4
On Wed, July 22, 2009 21:56, Dan Schaefer wrote:
> Does this mean that if I have a custom rule to search for exactly the
> "via" site, my rule will be overlooked because the site is in a blacklist?
What problem?
--
xpoint
Benny Pedersen wrote:
On Wed, July 22, 2009 21:39, Dan Schaefer wrote:
For those of you that manage these rules,
URI_OBFU_X9_WS, URI_OBFU_WWW, AE_MEDS38, AE_MEDS39 did not mark this email as
spam
http://pastebin.com/m40f7cff4
Reject it with RBL testing in the MTA; it's found in a blacklist.
On Wed, July 22, 2009 21:39, Dan Schaefer wrote:
> For those of you that manage these rules,
> URI_OBFU_X9_WS, URI_OBFU_WWW, AE_MEDS38, AE_MEDS39 did not mark this email as
> spam
> http://pastebin.com/m40f7cff4
Reject it with RBL testing in the MTA; it's found in a blacklist, which is the
reason it's not found
For those of you that manage these rules,
URI_OBFU_X9_WS, URI_OBFU_WWW, AE_MEDS38, AE_MEDS39 did not mark this email as
spam
http://pastebin.com/m40f7cff4
--
Dan Schaefer
Web Developer/Systems Analyst
Performance Administration Corp.
On Wed, July 22, 2009 13:16, twofers wrote:
> "Because we CAN'T."
Obama says "yes we can" :)
> My point exactly. No matter what, with the current system of internet email,
just because mainstream spammers are so clueless that they start using
a recipient equal to the envelope sender says they never g
Charles,
"Because we CAN'T."
My point exactly. No matter what, with the current system of internet email,
SPAM will never be stopped or filtered out completely. A completely new concept
of verifying internet email would be required for that and unfortunately, that
will never happen simply because
Sometimes I wished everyone getting involved in heated discussions and
proposals, also would carefully read any post with a related topic...
I did leak the other day, that I actually am hacking such a beast.
Sorry. Sometimes the mailbox overload is a bit much, and I just have to
delete things w
Sometimes I wished everyone getting involved in heated discussions and
proposals, also would carefully read any post with a related topic...
On Tue, 2009-07-21 at 11:29 -0400, Charles Gregory wrote:
> Further to my original post, I haven't read all of today's mail yet, but
FWIW, neither did I, a
On Tue, 21 Jul 2009, twofers wrote:
so why not let them show us what they've got, show us where we
need to make adjustments and corrections and in turn we will continue to
refine our process, ever so more, squeezing them out...inch by inch.
Because we CAN'T. While the spammers are free
Charles,
Although I understand your reservations, I feel in this case that it's best to
lay it all out there and give it to them, let them do what they do. In my mind
it's nothing more than "Flushing" out the best they can offer and finding the
loopholes, and closing them up.
There are more
On Wed, 15 Jul 2009, MrGibbage wrote:
I wonder if the spammers are reading this forum. That seemed awful fast.
I'm sure they do. But I also suspect that they have a simple 'feedback'
mechanism that lets them know how much of their spew is getting rejected
on their botnets, and when the rejec
On Wed, 15 Jul 2009, MrGibbage wrote:
I wonder if the spammers are reading this forum. That seemed awful fast.
Of course they are.
--
John Hardin KA7OHZ  http://www.impsec.org/~jhardin/
jhar...@impsec.org  FALaholic #11174  pgpk -a jhar...@impsec.org
key: 0xB8732E79
>Which of course means we've long since passed the point where any of
>these are going to do the spammers any good. That's the frustrating
>part.
I thought that the point was that since it cost a spammer the same to send
out a million emails as to send out one, he was happy if only one of th
Chris Owen wrote:
> On Jul 13, 2009, at 2:55 PM, Charles Gregory wrote:
>>>> To answer your next post, I don't use '\b' because the next 'trick' coming
>>>> will likely be something looking like Xwww herenn comX... :)
>>> At that point it can be dealt with.
>> Well, they're getting clos
On Jul 13, 2009, at 2:55 PM, Charles Gregory wrote:
To answer your next post, I don't use '\b' because the next 'trick' coming
will likely be something looking like Xwww herenn comX... :)
At that point it can be dealt with.
Well, they're getting close
On Mon, 13 Jul 2009, John Hardin wrote:
> The + signs are a little risky, it might be better to use {1,3} instead.
(nod) Though without the '/m' option it would be limited to the same line.
body rules work on paragraphs, but you are right, the badness has an upper
limit.
Ugh. Forgot it was '
On Mon, 13 Jul 2009, Charles Gregory wrote:
On Mon, 13 Jul 2009, John Hardin wrote:
Why be restrictive on the domain name?
If a conservative spec is sufficient to match the spam, then we're
helping avoid false positives. I'd rather tweak the rule to
catch the new tricks of the spammer than overgeneralize. :)
On Mon, 13 Jul 2009, John Hardin wrote:
Why be restrictive on the domain name?
If a conservative spec is sufficient to match the spam, then we're
helping avoid false positives. I'd rather tweak the rule to
catch the new tricks of the spammer than overgeneralize. :)
The + signs are a little risky, it might be better to use {1,3} instead.
On Mon, 13 Jul 2009, Charles Gregory wrote:
On Mon, 13 Jul 2009, rich...@buzzhost.co.uk wrote:
On Mon, 2009-07-13 at 10:46 -0400, Charles Gregory wrote:
> (?!www\.[a-z]{2,6}[0-9]{2,6}\.(com|net|org))
> www[^a-z0-9]+[a-z]{2,6}[0-9]{2,6}[^a-z0-9]+(com|net|org)
Does not seem to work with:
www. meds .com
On Mon, 13 Jul 2009, McDonald, Dan wrote:
On Mon, 2009-07-13 at 16:03 +0100, rich...@buzzhost.co.uk wrote:
On Mon, 2009-07-13 at 10:46 -0400, Charles Gregory wrote:
(?!www\.[a-z]{2,6}[0-9]{2,6}\.(com|net|org))
www[^a-z0-9]+[a-z]{2,6}[0-9]{2,6}[^a-z0-9]+(com|net|org)
Does not seem to work with: www. meds .com
On Mon, 13 Jul 2009, rich...@buzzhost.co.uk wrote:
On Mon, 2009-07-13 at 10:46 -0400, Charles Gregory wrote:
(?!www\.[a-z]{2,6}[0-9]{2,6}\.(com|net|org))
www[^a-z0-9]+[a-z]{2,6}[0-9]{2,6}[^a-z0-9]+(com|net|org)
Does not seem to work with;
www. meds .com
Correct. With spaces being one of the
On Mon, 2009-07-13 at 16:03 +0100, rich...@buzzhost.co.uk wrote:
> On Mon, 2009-07-13 at 10:46 -0400, Charles Gregory wrote:
> > (?!www\.[a-z]{2,6}[0-9]{2,6}\.(com|net|org))
> > www[^a-z0-9]+[a-z]{2,6}[0-9]{2,6}[^a-z0-9]+(com|net|org)
>
> Does not seem to work with;
>
> www. meds .com
It shouldn't.
On Mon, 2009-07-13 at 10:46 -0400, Charles Gregory wrote:
> (?!www\.[a-z]{2,6}[0-9]{2,6}\.(com|net|org))
> www[^a-z0-9]+[a-z]{2,6}[0-9]{2,6}[^a-z0-9]+(com|net|org)
Does not seem to work with;
www. meds .com
If I might interject. This seems to be an excellent occasion for
the PerlRE 'negative look-ahead' code (excuse the line wrap):
body =~ /(?!www\.[a-z]{2,6}[0-9]{2,6}\.(com|net|org))
www[^a-z0-9]+[a-z]{2,6}[0-9]{2,6}[^a-z0-9]+(com|net|org)/i
...unless someone can think of an FP for this rule.
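A quick way to sanity-check that look-ahead is to port it to Python's re module (same pattern, /i becoming re.IGNORECASE; the test strings below are invented):

```python
import re

# The negative look-ahead skips well-formed www.nameNN.tld domains, so the
# main pattern only fires on the obfuscated (separator-padded) form.
OBFU = re.compile(
    r'(?!www\.[a-z]{2,6}[0-9]{2,6}\.(?:com|net|org))'  # not a real-looking domain
    r'www[^a-z0-9]+[a-z]{2,6}[0-9]{2,6}[^a-z0-9]+(?:com|net|org)',
    re.IGNORECASE)

print(bool(OBFU.search('order at www. meds12 .com today')))  # True: obfuscated
print(bool(OBFU.search('order at www.meds12.com today')))    # False: legitimate form
```

Without the look-ahead, the second string would also match, since "." satisfies [^a-z0-9]; the look-ahead is what exempts properly-formed domains.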
On Fri, 10 Jul 2009, McDonald, Dan wrote:
They have. They are using underscores, which are a [:punct:], but don't form a
\b break.
New rules:
body __MED_BEG_SP /\bw{2,3}[[:space:]][[:alpha:]]{2,6}\d{2,6}/i
body __MED_BEG_PUNCT /\bw{2,3}[[:punct:]]{1,3}[[:alpha:]]{2,6}\d{2,6}/i
body
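To see the underscore case those rules target, here is a rough Python port of __MED_BEG_PUNCT; note Perl's [[:punct:]] includes "_" (which \W alone would miss), so it is approximated below as (?:[^\w\s]|_), and the sample strings are invented:

```python
import re

# __MED_BEG_PUNCT, roughly: 2-3 w's, 1-3 punctuation characters,
# 2-6 letters, 2-6 digits. (?:[^\w\s]|_) approximates [[:punct:]],
# whose set includes the underscore.
MED_BEG_PUNCT = re.compile(
    r'\bw{2,3}(?:[^\w\s]|_){1,3}[a-z]{2,6}\d{2,6}', re.IGNORECASE)

print(bool(MED_BEG_PUNCT.search('cheap pills at www_meds12_com')))  # True
print(bool(MED_BEG_PUNCT.search('plain text with no domains')))     # False
```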
2009/7/11 Sim :
>> New rules:
>> body __MED_BEG_SP /\bw{2,3}[[:space:]][[:alpha:]]{2,6}\d{2,6}/i
>> body __MED_BEG_PUNCT /\bw{2,3}[[:punct:]]{1,3}[[:alpha:]]{2,6}\d{2,6}/i
>> body __MED_BEG_DOT /\bw{2,3}\.[[:alpha:]]{2,6}\d{2,6}/i
>> body __MED_BEG_BOTH
>> /\bw{2,3}[[:punct:][:spac
On Sat, 2009-07-11 at 07:14 -0500, McDonald, Dan wrote:
> From: rich...@buzzhost.co.uk [mailto:rich...@buzzhost.co.uk]
> >On Fri, 2009-07-10 at 22:46 -0500, McDonald, Dan wrote:
> >> From: Jason L Tibbitts III [mailto:ti...@math.uh.edu]
> >> > "MD" == McDonald, Dan writes:
> >>
> >> MD> They
From: rich...@buzzhost.co.uk [mailto:rich...@buzzhost.co.uk]
>On Fri, 2009-07-10 at 22:46 -0500, McDonald, Dan wrote:
>> From: Jason L Tibbitts III [mailto:ti...@math.uh.edu]
>> > "MD" == McDonald, Dan writes:
>>
>> MD> They are using underscores, which are a [:punct:], but don't form
>> MD>
On Fri, 2009-07-10, at 16:48 -0700, fchan wrote:
> Don't tempt them, I already get enough spam not only from these guys.
> Also they will flood the network with smtp useless connections and
> unless you have good network attack mitigation system so you don't
> have a DDoS, don't tempt them
> New rules:
> body __MED_BEG_SP /\bw{2,3}[[:space:]][[:alpha:]]{2,6}\d{2,6}/i
> body __MED_BEG_PUNCT /\bw{2,3}[[:punct:]]{1,3}[[:alpha:]]{2,6}\d{2,6}/i
> body __MED_BEG_DOT /\bw{2,3}\.[[:alpha:]]{2,6}\d{2,6}/i
> body __MED_BEG_BOTH
> /\bw{2,3}[[:punct:][:space:]]{2,5}[[:alpha:]]{2
On Fri, 2009-07-10 at 22:46 -0500, McDonald, Dan wrote:
> From: Jason L Tibbitts III [mailto:ti...@math.uh.edu]
> > "MD" == McDonald, Dan writes:
>
> MD> They are using underscores, which are a [:punct:], but don't form
> MD> a \b break.
>
> >I'm becoming confused as to what they could possibly hope to accomplish by that.
From: Jason L Tibbitts III [mailto:ti...@math.uh.edu]
> "MD" == McDonald, Dan writes:
MD> They are using underscores, which are a [:punct:], but don't form
MD> a \b break.
>I'm becoming confused as to what they could possibly hope to
>accomplish by that.
right now I think they are sticking
> "MD" == McDonald, Dan writes:
MD> They are using underscores, which are a [:punct:], but don't form
MD> a \b break.
I'm becoming confused as to what they could possibly hope to
accomplish by that. At least when using dots and spaces users could
cut and paste the hostname into a browser (i
From: fchan [mailto:fc...@molsci.org]
>Don't tempt them, I already get enough spam not
>only from these guys. Also they will flood the
>network with smtp useless connections and unless
>you have good network attack mitigation system so
>you don't have a DDoS, don't tempt them.
Pretty soon th
Don't tempt them, I already get enough spam not
only from these guys. Also they will flood the
network with smtp useless connections and unless
you have good network attack mitigation system so
you don't have a DDoS, don't tempt them.
On Sat, 2009-07-11, at 00:18 +0200, Paweł Tęcza wrote:
On Sat, 2009-07-11, at 00:18 +0200, Paweł Tęcza wrote:
> I received very similar spam too. It also includes "www.ma29. net"
> domain. It's probably personal dedication from the spammers to me ;)
> Thank you! I know you're watching that mailing list.
Hey spammers! ;)
It's after midnight
On Fri, 10 Jul 2009, McDonald, Dan wrote:
body __MED_END_BOTH
/\b[[:alpha:]]{2,6}\d{2,6}[[:punct:][:space:]]{2,5}(?:c\s?o\s?m|n\s?e\s?t|o\s?r\s?g)\b/i
Let's see how long it takes them to come up with a workaround for this!
A domain name with 7+ letters? www. goodmeds123. com ? :)
--
J
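Sticking with the same translation caveat (Perl's [[:punct:][:space:]] is approximated as [\W_] in Python, and the sample strings are invented), __MED_END_BOTH can be sanity-checked, including the 7-plus-letter escape hatch joked about above:

```python
import re

# __MED_END_BOTH, roughly: 2-6 letters, 2-6 digits, 2-5 punct/space
# characters, then an optionally spaced-out com/net/org.
MED_END_BOTH = re.compile(
    r'\b[a-z]{2,6}\d{2,6}[\W_]{2,5}(?:c\s?o\s?m|n\s?e\s?t|o\s?r\s?g)\b',
    re.IGNORECASE)

print(bool(MED_END_BOTH.search('buy at meds12 . com now')))       # True
print(bool(MED_END_BOTH.search('buy at goodmeds123 . com now')))  # False: 8 letters
```

The second string escapes because {2,6} caps the letter run at six, which is exactly the workaround being predicted.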
On 2009-07-10 11:39:02, Daniel Schaefer wrote:
> Since we're sharing rules for this recent Spam outbreak, here is my rule:
> body DRUG_SITE /www(\.|\ )*(med|meds|gen|pill|shop|via|cu|co|ba|da|bu|ba)[0-9]{2}(\.|\ )*(net|com)/
> score DRUG_SITE 0.5
> describe DRUG_SITE Test to find spam drug
John Hardin wrote:
On Fri, 10 Jul 2009, Daniel Schaefer wrote:
Doesn't the . (period) need to be escaped in this? [.\s]{1,3}
Nope. "[]" means "explicit set of characters", and "." = "any
character" conflicts with that context.
Thanks for the clarification. I'm still learning REs.
--
Dan Schaefer
On Fri, 10 Jul 2009, Daniel Schaefer wrote:
Doesn't the . (period) need to be escaped in this? [.\s]{1,3}
Nope. "[]" means "explicit set of characters", and "." = "any character"
conflicts with that context.
--
John Hardin KA7OHZ  http://www.impsec.org/~jhardin/
jhar...@impsec
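That character-class point is easy to verify in any PCRE-style engine; for example in Python (test strings invented):

```python
import re

# Inside a character class, "." is literal, so [.\s] needs no escaping:
# it matches exactly a dot or a whitespace character.
cls = re.compile(r'[.\s]{1,3}')

print(bool(cls.match('. ')))   # True: dot and space are both in the set
print(bool(cls.match('x')))    # False: "x" is neither
```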
John Hardin wrote:
On Fri, 10 Jul 2009, Daniel Schaefer wrote:
Gerry Maddock wrote:
> > McDonald, Dan wrote:
> >
> > body DRUG_SITE /www(\.|\ )*(med|meds|gen|pill|shop|via|cu|co|ba|da|bu|ba)[0-9]{2}(\.|\ )*(net|com)/
> > You should avoid the use of *, as it allows spammers to co
On Fri, 10 Jul 2009, Daniel Schaefer wrote:
Gerry Maddock wrote:
> > McDonald, Dan wrote:
> >
> > body DRUG_SITE /www(\.|\ )*(med|meds|gen|pill|shop|via|cu|co|ba|da|bu|ba)[0-9]{2}(\.|\ )*(net|com)/
>
> You should avoid the use of *, as it allows spammers to consume all
> of yo
2009/7/10 John Hardin :
> On Fri, 10 Jul 2009, Sim wrote:
>
>>>
>>> /\bwww(?:\s\W?\s?|\W\s)\w{3,6}\d{2,6}(?:\s\W?\s?|\W\s)(?:c\s?o\s?m|n\s?e\s?t|o\s?r\s?g)\b/i
>>
>> I'm using it without good results for this format:
>>
>> bla bla www. site. net. bla bla
>>
>> Have you any idea?
>
> There are no digits in that URI.
> Yes, remove the outer parentheses.
>
> Here are the rules I am using:
> body AE_MEDS35 /w{2,4}\s(?:meds|shop)\d{1,4}\s(?:net|com|org)/
> describe AE_MEDS35 obfuscated domain seen in spam
> score AE_MEDS35 3.00
>
> body AE_MEDS38
> /\(\s?w{2,4}\s[[:alpha:]]{4}\d{1,4
Gerry Maddock wrote:
McDonald, Dan wrote:
Since we're sharing rules for this recent Spam outbreak, here is my
rule:
body DRUG_SITE /www(\.|\ )*(med|meds|gen|pill|shop|via|cu|co|ba|da|bu|ba)[0-9]{2}(\.|\ )*(net|com)/
You should avoid the use of *, as it allows spammers to consume all of
your memory.
> > McDonald, Dan wrote:
>
> > Since we're sharing rules for this recent Spam outbreak, here is my
rule:
> > body DRUG_SITE /www(\.|\ )*(med|meds|gen|pill|shop|via|cu|co|ba|da|bu|ba)[0-9]{2}(\.|\ )*(net|com)/
>
> You should avoid the use of *, as it allows spammers to consume all of
> your memory.
On Fri, 2009-07-10 at 11:39 -0400, Daniel Schaefer wrote:
> McDonald, Dan wrote:
> Since we're sharing rules for this recent Spam outbreak, here is my rule:
> body DRUG_SITE /www(\.|\ )*(med|meds|gen|pill|shop|via|cu|co|ba|da|bu|ba)[0-9]{2}(\.|\ )*(net|com)/
You should avoid the use of *, as it allows spammers to consume all of your memory.
On Fri, 10 Jul 2009, Sim wrote:
/\bwww(?:\s\W?\s?|\W\s)\w{3,6}\d{2,6}(?:\s\W?\s?|\W\s)(?:c\s?o\s?m|n\s?e\s?t|o\s?r\s?g)\b/i
I'm using it without good results for this format:
bla bla www. site. net. bla bla
Have you any idea?
There are no digits in that URI.
If this becomes common, change
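The digit requirement John points to can be checked directly by porting his rule to Python's re (same pattern, sample strings invented):

```python
import re

# John's rule requires 2-6 digits after the name part, which is why the
# digit-less "www. site. net" sails past it.
RULE = re.compile(
    r'\bwww(?:\s\W?\s?|\W\s)\w{3,6}\d{2,6}'
    r'(?:\s\W?\s?|\W\s)(?:c\s?o\s?m|n\s?e\s?t|o\s?r\s?g)\b',
    re.IGNORECASE)

print(bool(RULE.search('bla bla www. site42. net bla bla')))  # True: has digits
print(bool(RULE.search('bla bla www. site. net bla bla')))    # False: no digits
```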
McDonald, Dan wrote:
Yes, remove the outer parentheses.
Here are the rules I am using:
body AE_MEDS35 /w{2,4}\s(?:meds|shop)\d{1,4}\s(?:net|com|org)/
describe AE_MEDS35 obfuscated domain seen in spam
score AE_MEDS35 3.00
body AE_MEDS38
/\(\s?w{2,4}\s[[:alpha:]]{4
On Fri, 2009-07-10 at 17:11 +0200, Sim wrote:
> >>> /\bwww(?:\s|\s\W|\W\s)\w{3,6}\d{2,6}(?:\s|s\W|\W\s)(?:c\s?o\s?m|n\s?e\s?t|o\s?r\s?g)\b/i
> >>
> >> ^
> >> John,
> >>
> >> Thanks a lot for rule update! It works fine. I can say it's nearly
> >>
>>> /\bwww(?:\s|\s\W|\W\s)\w{3,6}\d{2,6}(?:\s|s\W|\W\s)(?:c\s?o\s?m|n\s?e\s?t|o\s?r\s?g)\b/i
>>
>> ^
>> John,
>>
>> Thanks a lot for rule update! It works fine. I can say it's nearly
>> perfect, because it's missing only one small back-slash :) Please
On 2009-06-30 13:50:09, Yet Another Ninja wrote:
> See RegistrarBoundaries.pm in SA source and
> http://www.rulesemporium.com/rules/90_2tld.cf
I know this list, but these are only domains where you can get a
3rd-level domain, as with
http://tamay.dogan.free.fr/
which was created
On Tue, 30 Jun 2009, John Wilcock wrote:
On 30/06/2009 17:16, John Hardin wrote:
> ... looking at the www peter got an impression of ...
> (-> www.peter.got?)
TLDs are limited and prevent FPs of that particular nature.
Sure, but there are lots of ccTLDs that could be confused with English
words, never mind other languages.
John Wilcock wrote:
... looking at the www peter got an impression of ...
(-> www.peter.got?)
TLDs are limited and prevent FPs of that particular nature.
Sure, but there are lots of ccTLDs that could be confused with English
words, never mind other languages.
Do you really want Spam
On 30/06/2009 17:16, John Hardin wrote:
... looking at the www peter got an impression of ...
(-> www.peter.got?)
TLDs are limited and prevent FPs of that particular nature.
Sure, but there are lots of ccTLDs that could be confused with English
words, never mind other languages.
Do you really want Spam
On Tue, 30 Jun 2009, Jan P. Kessler wrote:
Martin Gregorie schrieb:
... digging through the WWW HE SAW this link ...
Both IMO should be caught and given a positive score. I've never seen
legitimate mail containing URLs written this way.
Maybe I was not clear: The last one is NOT an url. D
> So you want obfuscated urls to be recognised as urls but not treated as
> urls?
>
Of course. It's spam.
> If this is just for a few own pcre body rules, I'd suggest you to
> handle those de-obfuscations in your rules.
>
Guess what I'm doing.
> You can also publish your own plugin, if you think t
Martin Gregorie schrieb:
> What makes you think I'm using URI tests or that any of these would be
> recognised as a URI? My tests are simple body tests with {1,n} limits on
> repetitions to keep things under control.
>
So you want obfuscated urls to be recognised as urls but not treated as
urls?
On Tue, 2009-06-30 at 13:14 +0200, Jan P. Kessler wrote:
> Martin Gregorie schrieb:
> >> ... go to WWW EVIL ORG for new meds ...
> >>
> >> and
> >>
> >> ... digging through the WWW HE SAW this link ...
> >>
> > Both IMO should be caught and given a positive score. I've never seen
> > legitimate mail containing URLs written this way.
On 6/30/2009 1:18 PM, Michelle Konzack wrote:
On 2009-06-30 12:30:14, Jan P. Kessler wrote:
How would you distinguish between
... go to WWW EVIL ORG for new meds ...
and
... digging through the WWW HE SAW this link ...
to prevent SA trying to look up www.he.saw?
Is SAW a valid TOPLEVEL domain?
Michelle Konzack wrote:
> Is SAW a valid TOPLEVEL domain?
>
> SA could use a list of valid TLD's.
>
Ok, let's change that (do not forget that there's more than .com)
the www seems to become the primary source of information these days
(->www.seems.to?)
And I think we agree, that it wo
On 2009-06-30 11:58:20, Martin Gregorie wrote:
> > http:// meds spammer org
> >
> That should be scored positive too, for the same reason.
And in my org this should not happen...
is a valid domain FOR SALE.
Thanks, Greetings and nice Day/Evening
Michelle Konzack
Systemadministrator
On 2009-06-30 12:30:14, Jan P. Kessler wrote:
> How would you distinguish between
>
> ... go to WWW EVIL ORG for new meds ...
>
> and
>
> ... digging through the WWW HE SAW this link ...
>
> to prevent SA trying to look up www.he.saw?
Is SAW a valid TOPLEVEL domain?
SA could use a list of valid TLD's.
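A minimal sketch of that TLD-whitelist idea (the helper name and the tiny TLD subset below are invented for illustration): only de-obfuscate "www X Y" when the trailing token is a known TLD, so "WWW HE SAW" is left alone.

```python
import re

# Collapse "www X Y" into www.x.y only when Y is a plausible TLD.
# VALID_TLDS is an illustrative subset, not a real TLD list.
VALID_TLDS = {'com', 'net', 'org', 'info', 'biz'}

def deobfuscate(text):
    hits = []
    for m in re.finditer(r'\bwww[\s.]+([a-z0-9-]{2,20})[\s.]+([a-z]{2,6})\b',
                         text, re.IGNORECASE):
        name, tld = m.group(1).lower(), m.group(2).lower()
        if tld in VALID_TLDS:
            hits.append(f'www.{name}.{tld}')
    return hits

print(deobfuscate('go to WWW EVIL ORG for new meds'))           # ['www.evil.org']
print(deobfuscate('digging through the WWW HE SAW this link'))  # []
```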
Martin Gregorie schrieb:
>> ... go to WWW EVIL ORG for new meds ...
>>
>> and
>>
>> ... digging through the WWW HE SAW this link ...
>>
> Both IMO should be caught and given a positive score. I've never seen
> legitimate mail containing URLs written this way.
Maybe I was not clear: The last one is
> ... go to WWW EVIL ORG for new meds ...
>
> and
>
> ... digging through the WWW HE SAW this link ...
>
Both IMO should be caught and given a positive score. I've never seen
legitimate mail containing URLs written this way.
> And what about URLs that don't start with WWW, like
>
>
Jason Haar schrieb:
> All this talk about trying to catch urls that contain spaces/etc got me
> thinking: why isn't this a standard SA feature? i.e if SA sees
> "www(whitespace|comma|period)-combo(therest)", then rewrite it as the
> url and process.
How would you distinguish between
... go to
>>> "Benny Pedersen" 06/28/09 12:42 AM >>>
>On Sun, June 28, 2009 05:38, Cory Hawkless wrote:
>> I agree, wouldn't it be easier to uniformly feed all of these type of URL's
>> though the already existing SA filters. As Jason suggested maybe by
>> collapsing whitespaces?
>
>lets redefine how a url
On Sun, June 28, 2009 20:47, Raymond Dijkxhoorn wrote:
> If you have to press the 'SPAM' link you allready have gotten the spam,
> right? So thats too late if you see this black/white.
Bayes also learns from sender IP and more, so it's not that big a problem if one
gets to the end user here, if too high
Hi!
Will users be ringing the helpdesk asking if the antispam system is
broken when all this "www space something dot" ends up in their INBOX?
Answer: you bet they do.
In my webmail there is a "SPAM" and "NOT SPAM" link, so I don't have
this problem.
If you have to press the 'SPAM' link you already have gotten the spam, right?
On Sun, June 28, 2009 10:08, Jason Haar wrote:
> On 06/28/2009 12:18 PM, Benny Pedersen wrote:
>> spammers need to rewrite webbrowsers also :=)
>> will you click on a url that is not click bare ?
> Are you saying that this kind of spam doesn't work, as it requires the
> user to actually edit the l
Hi!
lets redefine how a url is in the first place ?
www localhost localdomain
www.localhost.localdomain
one of them does not work :)
spammers more or less just use the first one, so what ?
It doesn't matter much if it works or not. Spam is not a message with urls
that work. So it's ending up
On Sun, Jun 28, 2009 at 01:08:45PM +0930, Cory Hawkless wrote:
> I agree, wouldn't it be easier to uniformly feed all of these type of URL's
> though the already existing SA filters. As Jason suggested maybe by
> collapsing whitespaces?
>
> Sounds like the obvious solution to me? Any problems with
On 06/28/2009 12:18 PM, Benny Pedersen wrote:
>
> spammers need to rewrite webbrowsers also :=)
>
> will you click on a url that is not click bare ?
>
>
Are you saying that this kind of spam doesn't work, as it requires the
user to actually edit the link to make it work?
I think that's irrelevant.
On Sun, June 28, 2009 05:38, Cory Hawkless wrote:
> I agree, wouldn't it be easier to uniformly feed all of these type of URL's
> though the already existing SA filters. As Jason suggested maybe by
> collapsing whitespaces?
lets redefine how a url is in the first place ?
www localhost localdomain
From: Jason Haar [mailto:jason.h...@trimble.co.nz]
Sent: Sunday, 28 June 2009 9:28 AM
To: users@spamassassin.apache.org
Subject: Re: [NEW SPAM FLOOD] www.shopXX.net
All this talk about trying to catch urls that contain spaces/etc got me
thinking: why isn't this a standard SA feature? i.e
On Sun, June 28, 2009 01:57, Jason Haar wrote:
> All this talk about trying to catch urls that contain spaces/etc got me
> thinking: why isn't this a standard SA feature? i.e if SA sees
> "www(whitespace|comma|period)-combo(therest)", then rewrite it as the
> url and process.
spammers need to rewrite webbrowsers also :=)
All this talk about trying to catch urls that contain spaces/etc got me
thinking: why isn't this a standard SA feature? i.e if SA sees
"www(whitespace|comma|period)-combo(therest)", then rewrite it as the
url and process.
That way you get the whole force of SURBLs/etc onto it? I'm assuming all
the
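A minimal sketch of that rewrite step (the separator set, helper name, and sample text are invented): collapse the junk between the pieces into a plain URI, which could then be handed to the usual SURBL-style checks.

```python
import re

# Rewrite "www<junk>name<junk>tld" as a normal URI so the ordinary
# URI checks can see it. Separator set here is just space/comma/dot.
def normalize(text):
    return re.sub(r'\bwww[\s,.]+([a-z0-9-]{2,20})[\s,.]+(com|net|org)\b',
                  r'www.\1.\2', text, flags=re.IGNORECASE)

print(normalize('buy at www . shop42 , net now'))  # buy at www.shop42.net now
```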
On Sat, 27 Jun 2009, Jeremy Morton wrote:
Why are you bothering with that? It seems unnecessarily complex. Here's my
amended rule:
/\bwww\s?\W?\s?\w{3,6}\d{2,6}s?\W?\s?(?:c\s?o\s?m|n\s?e\s?t|o\s?r\s?g)\b/i
That would match hy11com, which may not be recognized by the mark as a
URI they
Why are you bothering with that? It seems unnecessarily complex.
Here's my amended rule:
/\bwww\s?\W?\s?\w{3,6}\d{2,6}s?\W?\s?(?:c\s?o\s?m|n\s?e\s?t|o\s?r\s?g)\b/i
Best regards,
Jeremy Morton (Jez)
John Hardin wrote:
On Fri, 26 Jun 2009, Paweł Tęcza wrote:
On Tue, 2009-06-23, at 09:39 +0200, Paweł Tęcza wrote:
On Fri, 26 Jun 2009, Paweł Tęcza wrote:
On Fri, 2009-06-26, at 14:15 -0700, John Hardin wrote:
On Fri, 26 Jun 2009, Paweł Tęcza wrote:
On Tue, 2009-06-23, at 09:39 +0200, Paweł Tęcza wrote:
body OBFU_URI_WWDD_2
/\bwww\s(?:\W\s)?\w{3,6}\d{2,6}\s(?:\W\s)?(?:c\s?o\s?m|
On Fri, 2009-06-26, at 14:15 -0700, John Hardin wrote:
> On Fri, 26 Jun 2009, Paweł Tęcza wrote:
>
> > On Tue, 2009-06-23, at 09:39 +0200, Paweł Tęcza wrote:
>
> body OBFU_URI_WWDD_2
> /\bwww\s(?:\W\s)?\w{3,6}\d{2,6}\s(?:\W\s)?(?:c\s?o\s?m|n\s?e\s?t|o\s?r\s?g)\b
On Fri, 26 Jun 2009, Paweł Tęcza wrote:
On Tue, 2009-06-23, at 09:39 +0200, Paweł Tęcza wrote:
body OBFU_URI_WWDD_2
/\bwww\s(?:\W\s)?\w{3,6}\d{2,6}\s(?:\W\s)?(?:c\s?o\s?m|n\s?e\s?t|o\s?r\s?g)\b/i
The spammers strike at the weekend again. Unfortunately the rule above
doesn't work for