I wrote some shell scripts that do something similar. Here is the sequence
of events: from the host running the control process, use the rsh command to
run a script that resides on the remote server; that script contains the
logic to generate a file with the information you need. The remote script
also contains the logic to rcp the generated file back to the host running
the control process. You will also need to pass the appropriate hostnames as
variables, for two reasons: the script stays dynamic, and the script knows
which host to run the rsh against and which host to rcp the results back to.

Example: On the controlling host:
             rsh $remote_host get_files.sh $this_host
         On the remote host, get_files.sh contains ($1 is the controlling
         host's name, passed in on the rsh command line):
             ls -lt blahblah > file_list.out
             rcp -p file_list.out $1:file_list.out

Also, make sure you code error checking in your routines. 
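As a rough sketch only (the hostnames here are placeholders, not anything
from a real environment), the controlling-host side with some basic error
checking might look like this:

    #!/bin/sh
    # Hypothetical wrapper run on the controlling host.
    remote_host=remote.example.com      # placeholder: server that has the files
    this_host=local.example.com         # placeholder: this (controlling) host

    # Kick off the listing script, which must already exist on the remote
    # server and be on that account's PATH.
    rsh "$remote_host" get_files.sh "$this_host"
    if [ $? -ne 0 ]; then
        echo "rsh to $remote_host failed" >&2
        exit 1
    fi

    # rsh's exit status mostly reflects the connection, not the remote
    # command, so also confirm the listing actually came back non-empty
    # before the load process relies on it.
    if [ ! -s file_list.out ]; then
        echo "file_list.out missing or empty" >&2
        exit 1
    fi

The same idea applies on the remote side: have get_files.sh check that both
the ls and the rcp succeeded before it exits.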

Hope that helps. 

-----Original Message-----
Sent: Friday, March 01, 2002 10:09 AM
To: Multiple recipients of list ORACLE-L


This may be a little off topic so I apologize in advance.  But I have seen
some good Unix scripting help come across lately so I thought I would go
ahead and send this.

I am trying to set up a Unix script that will log on to a remote server, get
the name and size of all the files in a remote directory and write the
results to a local directory.  I need to do this to ensure that all the
source files for loading our data warehouse have been received and are
complete.  The list of files to be received changes nightly and I don't have
the ability to add a record to the files that would provide an "end of file
marker".

I have tested ftp using "ls" but it only gives a short list of file names,
not the long listing with file sizes.  I have also tried nlist, size and dir
from ftp, but those commands don't seem to let me write the results back to
the local server.
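(For the record, the sort of batch ftp session I'm describing looks roughly
like this; the address, login and path are placeholders, and I'm not sure
whether our ftp client supports the optional local-file argument to dir:)

    ftp -n 10.0.0.1 <<EOF
    user loaduser loadpassword
    dir /path/to/source_files file_list.out
    bye
    EOF

Some ftp clients accept that second argument to dir, which names a local
file to receive the long listing; failing that, redirecting the whole
session's standard output to a file might be another option.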

I have tried rsh and rlogin but these commands just sit there with no
response.  I have to use an IP address, not the hostname, so I don't know if
this is the problem or not.  I have the SA looking into this for me.

I could probably set up a cron job on the remote server to do the long
listing of the directory and then ftp the results to my local server, but
then this would be outside the control of my load process.  I would prefer
to do everything from the local server if possible.
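(A sketch of what that remote cron job could look like, with made-up paths,
hosts and logins throughout:)

    #!/bin/sh
    # list_and_send.sh on the remote server - sketch only; every path,
    # host and login below is a placeholder.
    ls -l /path/to/source_files > /tmp/file_list.out

    # Push the listing back to the local server over ftp.
    ftp -n local.warehouse.host <<EOF
    user loaduser loadpassword
    put /tmp/file_list.out file_list.out
    bye
    EOF

driven by a crontab entry on the remote server along the lines of
"0 1 * * * /home/loaduser/list_and_send.sh".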

Any ideas?

Thanks,
Nancy


-- 
Please see the official ORACLE-L FAQ: http://www.orafaq.com
-- 
Author: Nancy McCormick
  INET: [EMAIL PROTECTED]
