[SLUG] remote mail system

2008-10-22 Thread Jonathan
Hi All,

I have quite a neat system set up for my email. Fetchmail downloads it via 
POP3 from my ISP (evil Telstra). Dovecot IMAP along with SquirrelMail allows me 
to access my email (nicely filtered and sorted) over the net.

Problem:
I can't send email using this system. I have to use a workaround from KMail 
(set up like I had it before all this, as a simple POP3 client). This means I 
can't send emails from my computer over the net. I think port 25 is blocked, 
so the system needs to take the email (which does appear in the sent folder) 
and then pump it up to the ISP with SMTP. There seem to be plenty of people 
saying that is what needs to happen, but no one willing to explain how it 
actually works with any recent setup.


Any ideas?

cheers

Jon
-- 
SLUG - Sydney Linux User's Group Mailing List - http://slug.org.au/
Subscription info and FAQs: http://slug.org.au/faq/mailinglists.html


Re: [SLUG] remote mail system

2008-10-22 Thread Robert Thorsby
On 22/10/08 18:17:21, Jonathan wrote:
 Problem:
 I can't send email using this system. I have to use
 a workaround from KMail (set up like I had it before
 all this, as a simple POP3 client). This means I can't
 send emails from my computer over the net. I think
 port 25 is blocked, so the system needs to take the email
 (which does appear in the sent folder) and then pump it
 up to the ISP with SMTP. There seem to be plenty of people
 saying that is what needs to happen, but no one willing
 to explain how it actually works with any recent setup.
 Any ideas?
 Any ideas?

G'day Jon,

What you need is an application that relays to your upstream MX server. 
I have used, and still use, both msmtp and EMail-Relay for this exact 
purpose.

I use msmtp for sending a single mail and I use EMail-Relay for sending 
out batched mail.

Msmtp works in the usual way: message body from stdin, a config file, 
miscellaneous command-line switches and arguments, output to the upstream MX 
server.
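
For example, a minimal ~/.msmtprc only takes a few lines; something like 
this, where host, port, user and password are placeholders for whatever 
your ISP's submission server wants:

account default
host mail.isp.example.com
port 587
auth on
tls on
tls_starttls on
user yourname
password secret
from yourname@example.com
logfile ~/.msmtp.log

A quick test is then just "msmtp someone@example.com < message.txt" with a 
complete message on stdin, and most clients that expect a local 
sendmail-style program can be pointed at the msmtp binary instead.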

EMail-Relay was set up to ease the task of adding corporate disclaimers 
etc (hiss, boo) and is best used to flush an outbound mail queue. It 
has a second executable which adds the mail to the queue.
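
For what it's worth, my EMail-Relay usage boils down to roughly this 
(spool path and smarthost are illustrative only; check the man pages for 
the exact options in your version):

# queue a message; reads a complete message on stdin
emailrelay-submit --spool-dir /var/spool/emailrelay someone@example.com < message.txt

# later, flush the queue up to the ISP's smarthost and exit
emailrelay --as-client mail.isp.example.com:smtp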

Msmtp is one of the better examples of "take a message as input and 
send it on its way", whereas EM-R works well in daemon mode. While there 
are many alternatives to msmtp, the only real alternatives to EM-R are 
nullmailer (which can be a bear to set up) and masqmail (which I have 
not used but once saw recommended on the LKML).

Both msmtp and EM-R will handle the usual methods of authentication 
with the upstream MX server.

HTH,
Robert Thorsby
It is impossible to make anything foolproof
because fools are ingenious.



Re: [SLUG] remote mail system

2008-10-22 Thread Adelle Hartley

Jonathan wrote:

Hi All,

I have quite a neat system set up for my email. Fetchmail downloads it via 
POP3 from my ISP (evil Telstra). Dovecot IMAP along with SquirrelMail allows me 
to access my email (nicely filtered and sorted) over the net.


Problem:
I can't send email using this system. I have to use a workaround from KMail 
(set up like I had it before all this, as a simple POP3 client). This means I 
can't send emails from my computer over the net. I think port 25 is blocked, 
so the system needs to take the email (which does appear in the sent folder) 
and then pump it up to the ISP with SMTP. There seem to be plenty of people 
saying that is what needs to happen, but no one willing to explain how it 
actually works with any recent setup.


Solution 1
Set up your mail client to use your ISP's server for outgoing mail.  I hobbled 
along with that solution for ages, but from my reading of the above, 
you've tried it and it didn't work.


Solution 2
I'm using Postfix for outgoing mail; for a long time I used it only for 
intra-office mail before setting it up to relay mail externally.


I don't know enough to know whether this is likely to work if you've 
never had the first solution working.


Here are my notes for how I got Postfix to send mail externally.

1.  Tried to send some mail; didn't work.
2.  Checked /var/log/mail.log and found an error message that mentioned 
something about not being able to find /etc/postfix/sasl_password.db
3.  Googled for the path mentioned in the error message, and found the 
answer I needed somewhere on this page:

http://www.hypexr.org/linux_mail_server.php
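
For comparison, a typical relayhost setup along those lines ends up looking 
something like this (hostname and credentials are placeholders, and this is 
a sketch rather than my exact config; the map file name just has to match 
whatever main.cf points at):

# /etc/postfix/main.cf
relayhost = [mail.isp.example.com]:25
smtp_sasl_auth_enable = yes
smtp_sasl_password_maps = hash:/etc/postfix/sasl_password
smtp_sasl_security_options = noanonymous

# /etc/postfix/sasl_password
[mail.isp.example.com]:25    username:password

# rebuild the .db file the log was complaining about, then reload
postmap /etc/postfix/sasl_password
postfix reload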


Adelle.



Re: [SLUG] remote mail system

2008-10-22 Thread Kyle
The thing is, these days with so much spam floating around, sending mail 
directly from your own mail server is as much hit and miss as it is trouble 
to set up and maintain. There are so many orgs that blacklist an IP 
for no good reason, and a small organisation has very little chance of 
getting itself removed from one of these blacklists if it does get listed.


Consequently, I have found the simplest way to deal with this is to set 
your mail server up correctly, at the very least so it's not an open 
relay (a sure way to get blacklisted), and then set your mail clients up 
to use your ISP's server as your SMTP server. At least if the ISP gets 
blacklisted, they will have the power to deal with it.


Alternatively set your clients up to use your mail server for SMTP, then 
have your mail server relay through your ISP's SMTP server.


Kyle

Adelle Hartley wrote:


Solution 1
Set up your mail client to use your ISP's server for outgoing mail.  I hobbled 
along with that solution for ages, but from my reading of the above, 
you've tried it and it didn't work.


Solution 2
I'm using Postfix for outgoing mail; for a long time I used it only for 
intra-office mail before setting it up to relay mail externally.







Re: [SLUG] remote mail system

2008-10-22 Thread Andrew Cowie
On Wed, 2008-10-22 at 21:19 +1100, Kyle wrote:

 Alternatively set your clients up to use your mail server for SMTP, then 
 have your mail server relay through your ISP's SMTP server.

Nothing wrong with doing it this way. Hey, email once was a store-and-forward
protocol. :)

AfC
Sydney



Re: [SLUG] Fortress .... err Firewall Australia

2008-10-22 Thread bill

YES!



AUSTRALIA is the pilot! Sounds like Paypal.

Are we so gullible?





[SLUG] Search engine traffic dominates

2008-10-22 Thread Peter Chubb

Hi,
  I'm a little cheesed off.  In the last three months, people have
downloaded 9G per month from our website; search engines have
downloaded 21G per month.  Only Google generated significant traffic
through search engine hits (and it downloaded less than the others,
too --- around 2G per month, as opposed to 10G for Yahoo, and 4G for
MSNbot).  In other words, search engine indexing traffic was double
the actual traffic from www.gelato.unsw.edu.au.

Is there any good reason why I shouldn't block (or at least
significantly slow down) MSNbot, MJ12BOT, and Yahoo! Slurp???
Yahoo is particularly bad, crawling and downloading about
twice what the others do, yet generating 1% of the hits that
Google generated for us.


--
Dr Peter Chubb  http://www.gelato.unsw.edu.au  peterc AT gelato.unsw.edu.au
http://www.ertos.nicta.com.au   ERTOS within National ICT Australia


[SLUG] ls lists numbers, not owner names

2008-10-22 Thread Voytek Eymont
I'm trying to fix my failed ClamAV install, and just noticed that when I list
certain files, I get the owner/group not as names but as numbers.

What is that trying to tell me?

 # ls -al /var/log/clamav
total 188
drwxr-xr-x   2  104  105  4096 Sep  3 02:31 .
drwxr-xr-x  16 root root  4096 Oct 19 04:12 ..
-rw-r-----   1  104  105  3774 Jul 20 04:12 clamd.log.1
-rw-r--r--   1  104  105 0 Jul 27 04:12 freshclam.log



-- 
Voytek



Re: [SLUG] Search engine traffic dominates

2008-10-22 Thread Tony Sceats
Can't you use robots.txt (or the modern equiv, is there anything newer
actually?) to stop mass indexing, perhaps point it to pages you want indexed
and also tell it to exclude images etc etc?

On Thu, Oct 23, 2008 at 10:45 AM, Peter Chubb [EMAIL PROTECTED]wrote:


 Hi,
  I'm a little cheesed off.  In the last three months, people have
 downloaded 9G per month from our website; search engines have
 downloaded 21G per month.  Only Google generated significant traffic
 through search engine hits (and it downloaded less than the others,
 too --- around 2G per month, as opposed to 10G for Yahoo, and 4G for
 MSNbot).  In other words, search engine indexing traffic was double
 the actual traffic from www.gelato.unsw.edu.au.

 Is there any good reason why I shouldn't block (or at least
 significantly slow down) MSNbot, MJ12BOT, and Yahoo! Slurp???
 Yahoo is particularly bad, crawling and downloading about
 twice what the others do, yet generating 1% of the hits that
 Google generated for us.


 --
 Dr Peter Chubb  http://www.gelato.unsw.edu.au  peterc AT
 gelato.unsw.edu.au
 http://www.ertos.nicta.com.au   ERTOS within National ICT
 Australia



Re: [SLUG] ls lists numbers, not owner names

2008-10-22 Thread DaZZa
On Thu, Oct 23, 2008 at 10:48 AM, Voytek Eymont [EMAIL PROTECTED] wrote:
 I'm trying to fix my failed ClamAV install, and just noticed that when I list
 certain files, I get the owner/group not as names but as numbers.

 What is that trying to tell me?

  # ls -al /var/log/clamav
 total 188
 drwxr-xr-x   2  104  105  4096 Sep  3 02:31 .
 drwxr-xr-x  16 root root  4096 Oct 19 04:12 ..
 -rw-r-----   1  104  105  3774 Jul 20 04:12 clamd.log.1
 -rw-r--r--   1  104  105 0 Jul 27 04:12 freshclam.log

The username associated with the UID which created/owned those files
is no longer listed in /etc/passwd. Nor the group in /etc/group.

grep 104 /etc/passwd

should return a line something like this

clamav:x:104:106:User for clamav:/var/run/hal:/bin/false

If it doesn't, then there's your issue.

DaZZa


Re: [SLUG] ls lists numbers, not owner names

2008-10-22 Thread Daniel Pittman
Voytek Eymont [EMAIL PROTECTED] writes:

 I'm trying to fix my failed ClamAV install, and just noticed that when I list
 certain files, I get the owner/group not as names but as numbers.

 What is that trying to tell me?

The entries in /etc/passwd and /etc/group[1] that map the UID 104 and
the GID 105 to names have been removed.

Regards,
Daniel

Footnotes: 
[1]  Technically, the entries returned by nsswitch, which can include
 other sources such as LDAP or NIS, in addition to or instead of the
 traditional files; for the system UID/GID space, which these are
 in, it is almost always local files.
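
(A quick way to ask the same question through nsswitch, rather than
grepping the raw files, is getent:

getent passwd 104
getent group 105

If either prints nothing, the mapping really is gone from every
configured source.)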



Re: [SLUG] Search engine traffic dominates

2008-10-22 Thread Rev Simon Rumble
This one time, at band camp, Peter Chubb wrote:
   I'm a little cheesed off.  In the last three months, people have
 downloaded 9G per month from our website; search engines have
 downloaded 21G per month.  Only Google generated significant traffic
 through search engine hits (and it downloaded less than the others,
 too --- around 2G per month, as opposed to 10G for Yahoo, and 4G for
 MSNbot).  In other words, search engine indexing traffic was double
 the actual traffic from www.gelato.unsw.edu.au.

Ever since Google started being _really_ fast with index updates, all 
the others have been trying to catch up.  By really fast I mean my 
work launched a new campaign on Sunday morning and it was in Google's 
results by Monday afternoon.

 Is there any good reason why I shouldn't block (or at least
 significantly slow down) MSNbot, MJ12BOT, and Yahoo! Slurp???
 Yahoo is particularly bad, crawling and downloading about
 twice what the others do, yet generating 1% of the hits that
 Google generated for us.

Depends how much you care about Yahoo's or MSN's referrals.  In the 
consumer space I work in, Google still accounts for the vast majority of 
hits (85%) and a similar proportion of sales.  MSN + Live are around 
10%, essentially through Microsoft's domination of people who don't know 
how to change their home page let alone install an alternative browser.  
Yahoo gets around 4%.

For us, these are the people we target so it's very important.  For you, 
it might be less important.

You might want to look into the Crawl-delay extension to the robots.txt 
standard, which lets you set a per-robot rate limit:
http://en.wikipedia.org/wiki/Robots.txt#Crawl-delay_directive
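
For example, a couple of lines like these in robots.txt ask Slurp to pause
between fetches (the number of seconds is just an illustration, and not
every crawler honours the directive):

User-agent: Slurp
Crawl-delay: 60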

-- 
Rev Simon Rumble [EMAIL PROTECTED]
www.rumble.net

The Tourist Engineer
Geeks need vacations too.
http://engineer.openguides.org/

 History teaches us that men and nations behave wisely once
  they have exhausted all other alternatives.
- Abba Eban


Re: [SLUG] Search engine traffic dominates

2008-10-22 Thread Mary Gardiner
On Thu, Oct 23, 2008, Tony Sceats wrote:
 Can't you use robots.txt (or the modern equiv, is there anything newer
 actually?) to stop mass indexing, perhaps point it to pages you want indexed
 and also tell it to exclude images etc etc?

As I understand it, robots.txt is still the way to do this. Slurp also
has a specific extension by which you should be able to suggest it
crawls less often:
http://help.yahoo.com/l/us/yahoo/search/webcrawler/slurp-03.html

Double-check also that your pages are cacheable, have an appropriately
long Expires header (although I don't know how much the Expires header
influences crawling rates) and have ETag and/or Last-Modified set, since
all the major robots, finally, seem to do conditional GETs.
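
If the site happens to be running Apache (an assumption on my part), a
minimal mod_expires stanza looks something like this; adjust the types
and lifetimes to suit:

ExpiresActive On
ExpiresDefault "access plus 1 day"
ExpiresByType image/png "access plus 1 month"
ExpiresByType text/css "access plus 1 week"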

All this is still pretty annoying; ideally, "you should be using an order
of magnitude LESS traffic than you refer to me" should be something of a
given for search engine robots.

Of hits on my own website over the last five days, the figures are:
 - Googlebot: 9% of total traffic
 - MSNBot: 4% of total traffic
 - Yahoo! Slurp: 19% of total traffic

They aren't the only crawlers either. Pretty outrageous.

-Mary


Re: [SLUG] Search engine traffic dominates

2008-10-22 Thread Mary Gardiner
On Thu, Oct 23, 2008, Rev Simon Rumble wrote:
 You might want to look into the Crawl-delay extension to the robots.txt 
 standard, which can limit by robot:
 http://en.wikipedia.org/wiki/Robots.txt#Crawl-delay_directive

There's also the Sitemaps protocol, in which you can suggest how
frequently content changes: http://en.wikipedia.org/wiki/Sitemaps
It doesn't seem to come with any guarantees that the robots will respect
that as a refresh frequency, though.
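
A minimal sitemap entry with a change-frequency hint looks like this (the
URL is a placeholder):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/wiki/</loc>
    <changefreq>daily</changefreq>
  </url>
</urlset>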

-Mary


Re: [SLUG] ls lists numbers, not owner names

2008-10-22 Thread Voytek Eymont

On Thu, October 23, 2008 10:55 am, DaZZa wrote:

 -rw-r--r--   1  104  105 0 Jul 27 04:12 freshclam.log
 The username associated with the UID which created/owned those files
 is no longer listed in /etc/passwd. Nor the group in /etc/group.
 clamav:x:104:106:User for clamav:/var/run/hal:/bin/false
 If it doesn't, then there's your issue.

DaZZa, Daniel,

thanks

How do I fix it? Can I recreate the clamav entry with the 'mc' editor,
or do I need to 'adduser'?


# grep clam /etc/passwd /etc/group

# grep amavis /etc/passwd /etc/group
/etc/passwd:amavis:x:105:106:Amavis email scan user:/var/amavis:/bin/sh
/etc/group:amavis:x:106:

# grep 104 /etc/passwd /etc/group

# grep 105 /etc/passwd /etc/group
/etc/passwd:amavis:x:105:106:Amavis email scan user:/var/amavis:/bin/sh

# grep 106 /etc/passwd /etc/group
/etc/passwd:amavis:x:105:106:Amavis email scan user:/var/amavis:/bin/sh
/etc/group:amavis:x:106:


-- 
Voytek



Re: [SLUG] ls lists numbers, not owner names

2008-10-22 Thread DaZZa
On Thu, Oct 23, 2008 at 1:08 PM, Voytek Eymont [EMAIL PROTECTED] wrote:
 On Thu, October 23, 2008 10:55 am, DaZZa wrote:
 DaZZa, Daniel,

 thanks

 how to fix, can I recreate clam entry with 'mc' editor ?
 or do I need to 'adduser' ?

Easiest way is to just use useradd. Editing /etc/passwd manually is
not recommended in these days of shadow passwords, because you have to
remember to edit /etc/shadow as well, and that's a bit tricky,
especially when you have to muck around with crypt to get the
encrypted password. :-)

useradd -c "ClamAV scan user" -d /var/clamav -u 104 -s /bin/sh clamav

and then

passwd clamav

should do it. You can add groups manually later. Of course, you should
enter your own shell and home directory paths where the -s and -d
options are above.
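
If the old GID is also still free (your grep output suggests 105 is), the
group side could be recreated with something like:

groupadd -g 105 clamav
usermod -g clamav clamav

after which the numeric owner/group on the existing log files should show
up as names again.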

DaZZa


Re: [SLUG] Search engine traffic dominates

2008-10-22 Thread Peter Chubb
 Mary == Mary Gardiner [EMAIL PROTECTED] writes:

Mary On Thu, Oct 23, 2008, Tony Sceats wrote:
 Can't you use robots.txt (or the modern equiv, is there anything
 newer actually?) to stop mass indexing, perhaps point it to pages
 you want indexed and also tell it to exclude images etc etc?

Mary As I understand it, robots.txt is still the way to do
Mary this. Slurp also has a specific extension by which you should be
Mary able to suggest it crawls less often:
Mary http://help.yahoo.com/l/us/yahoo/search/webcrawler/slurp-03.html

That's what I figured.  As the content is divided mostly between a
wiki and some mailing list archives (both of which change fairly
often), and our audience is mostly Linux hackers, I'm going to use
robots.txt to stop Yahoo and MSN from indexing.  Yahoo gave us ~130
hits last month, compared to ~32000 from Google; MSN gave us 4 hits.
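
For the record, that amounts to a robots.txt along these lines (the
user-agent strings come from each crawler's documentation):

User-agent: Slurp
Disallow: /

User-agent: msnbot
Disallow: /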

--
Dr Peter Chubb  http://www.gelato.unsw.edu.au  peterc AT gelato.unsw.edu.au
http://www.ertos.nicta.com.au   ERTOS within National ICT Australia