Alejandro Fernandez wrote:
> 
> On Sun, 19 Sep 1999, you wrote:
> >Ale,
> >
> >     Thanks for your help. I have a couple of follow-up questions, if
> >you don't mind.
> >
> >       1.) Re: "The monthly file does the following stuff:" ...
> >
> >           When you say "the monthly file", are you referring to the
> >anamonth.sh referred to in the cron?
> >
> 
> Yes. The line quoted just below says "at 0 minutes past 07:00, on the
> 1st day of every month, run this". You can get more examples in the
> manpage for crontab: either
> 
> man 5 crontab
> 
> or just man crontab should do the trick.
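> 
> For reference, here's how the five fields break down (this is standard
> crontab syntax, so the manpage shows the same thing):
> 
> # min  hour  day-of-month  month  day-of-week  command
> # 0    07    1             *      *            (the pipeline below)
> 
> The two asterisks mean "every month" and "any day of the week".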
> 
> >0 07 1 * * (cd /ale/work/stats/; ./anamonth.sh 2>&1 | mail [EMAIL PROTECTED])
> >
> >               what does the 2>&1 do?
> >
> 
> It redirects standard error (file descriptor 2) onto standard output
> (file descriptor 1), and the pipe then sends both to my email. This
> means I get any error messages as well as the usual analog output in
> the email.
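> 
> A quick sketch of what each stream does (same script and address as
> above, just to see the difference):
> 
> # stdout only goes down the pipe; stderr goes wherever cron sends it
> ./anamonth.sh | mail [EMAIL PROTECTED]
> # stderr is duplicated onto stdout first, so both reach the mail
> ./anamonth.sh 2>&1 | mail [EMAIL PROTECTED]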

I believe that cd /ale/work/stats/; ./anamonth.sh |& mail [EMAIL PROTECTED]

does the same thing, assuming that's csh or tcsh you're using.
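
(One caveat: cron runs crontab entries with /bin/sh unless you set a
SHELL= line in the crontab, so the |& form only works if the pipeline
lives inside a csh script rather than on the crontab line itself.)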

-- [EMAIL PROTECTED]

> 
> >       2.) Could you also elaborate on this last line? What exactly is
> >it doing to automate the process?
> >
> >                echo "<li><a href=\"$TODAY\" title=\"$THISMONTH\">Month of
> >$THISMONTH</a>" >> /software/apache/htdocs/site/stats/incl_monthlist.html
> >
> 
> I group all the reports for a site in a file called /stats/index.html,
> accessible from some part of that site, so that clients can find them
> easily. On that page I've got something like the following for every
> kind of report (you have to make sure you have server-side includes
> working for this):
> 
> <h3>Weekly Stats</h3>
> <ul>
> <!--#include virtual="incl_weeklist.html"-->
> </ul>
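> 
> (If includes aren't already enabled, the Apache side is roughly the
> following -- this assumes an Apache 1.3-style config, and that the
> stats pages use the .shtml extension:
> 
> Options +Includes
> AddType text/html .shtml
> AddHandler server-parsed .shtml
> 
> Adjust accordingly if your pages are plain .html.)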
> 
> So the line that the .sh file appends each time it runs is adding each
> new report to that index page automatically. I just use this as a time
> saver: you could probably write some kind of parsing CGI script to be
> less error-prone (at the moment, if you run the script twice by
> accident, you get double entries in the include file)... but this way I
> get the job done quickly and rarely have to go in and edit HTML by hand.
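> 
> For context, here's a minimal sketch of how that append might sit in
> the monthly script. The thread never shows how $TODAY and $THISMONTH
> get set, so the date formats below are only guesses:
> 
> #!/bin/sh
> # assumed report filename, e.g. 199909.html
> TODAY=`date +%Y%m`.html
> # human-readable month name, e.g. "September 1999"
> THISMONTH=`date +"%B %Y"`
> echo "<li><a href=\"$TODAY\" title=\"$THISMONTH\">Month of $THISMONTH</a>" \
>     >> /software/apache/htdocs/site/stats/incl_monthlist.html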
> 
> >            3.) Do you have a recommendation for how I could compress
> >the log files? Should I do it daily? What would the cron command be for
> >that?
> >
> 
> We do it nightly (someone else handles that stuff, though). It can get
> complicated, but you could try the find command with the -mtime switch
> to pick out just the most recently written log files, and run gzip on
> those alone. Maybe something like (note the quotes around the *, so the
> shell doesn't expand it before find sees it):
> 
> 30 1  * * * find /www/logs/site -name '*' -mtime -1 -exec gzip -v {} \;
> 
> Don't blame me if this command formats your hard disk and sets fire to your
> office. But the find command is quite powerful, and you should be able to do
> some complex stuff with it once you've fiddled around a bit.
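> 
> A slightly safer variant of the same idea (same made-up path; this one
> only touches regular files and skips anything already gzipped, so
> running it twice is harmless):
> 
> 30 1  * * * find /www/logs/site -type f ! -name '*.gz' -mtime -1 -exec gzip -v {} \;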
> 
> The complicated bit comes in when you start dealing with multiple
> sites... moving each gzipped log file to its appropriate storage
> directory... No idea how to do that... Sorry!
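> 
> (For what it's worth, one sketch of the multi-site move, assuming a
> /www/logs/<site> and /www/archive/<site> layout -- both directory names
> are made up here:
> 
> for site in /www/logs/*; do
>     mkdir -p /www/archive/`basename $site`
>     mv $site/*.gz /www/archive/`basename $site`/
> done
> )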
> 
> Ale
> 
> --
> Alejandro Fernandez,
> [EMAIL PROTECTED]
------------------------------------------------------------------------
This is the analog-help mailing list. To unsubscribe from this
mailing list, send mail to [EMAIL PROTECTED]
with "unsubscribe analog-help" in the main BODY OF THE MESSAGE.
List archived at http://www.mail-archive.com/[email protected]/
------------------------------------------------------------------------
