Below is a script that I can't get to work as a Korn shell script. I
would like to get something like it to run in perl. 

What I am attempting to do is search through all of the websites on my
server and build a file that tells me which websites reside in which
directory.

I can get the first part of the script to work, but I can't get the
file cleanup to look nice.

I bet this is really simple. 

Thanks Everyone,

Andy



Sample of current output
---------------------------------------
pwh4
/$PATH?www/ns-admin-new/https-brands-misc/config
05Jun-04:41PM
<Client urlhost="careers.is.COMPANY.com">
<Client urlhost="www.performanceparts.COMPANY.com">
<Client urlhost="www.businesslink.COMPANY.com">
<Client urlhost="www.COMPANY.com">


What I used to get current output
---------------------------------
# Setup environment variables
#
BOXNAME=`hostname`
CURDATE=`date +%d%h-%I:%M%p`
WORKINGDIR=`pwd`
# NOTE: $PATH here is being used as the web-server root, which shadows the
# shell's command search path -- a differently named variable would be safer.
NSWEB=`ls -d $PATH/www/ns-admin*/https*/config/`
# Reuse $CURDATE so the filename and the logged date can't differ.
FILENAME=found_hosts_$CURDATE
#
# Main processing
#
# Read all webserver obj.conf files and get a list of hosts
#
for WEB_SERVER in $NSWEB
do
  cd "$WEB_SERVER" || continue
  CURDIR=`pwd`

  # Output hostname
  echo "$BOXNAME" >> /tmp/$FILENAME

  # Output directory name
  echo "$CURDIR" >> /tmp/$FILENAME

  # Output date
  echo "$CURDATE" >> /tmp/$FILENAME

  # Output host information
  grep urlhost obj.conf >> /tmp/$FILENAME


I tried to clean it up with the commands below, but it hasn't worked so far.
----------------------------------------
done
#
# Clean up of output file
#
# The cleanup has to run after the loop (so it only runs once), read the file
# from /tmp where it was actually written, and save its output somewhere.
# One awk pass does both jobs: on lines with urls, keep only the URL name
# (field 2 when the field separator is a double quote); print other lines as-is.

awk -F\" '/urlhost/ {print $2; next} {print}' /tmp/$FILENAME > /tmp/$FILENAME.clean
