Alan,
At one of my customers, the IT departments of several sites, hitherto fairly
independent, have all been brought under a single roof, and as a result a lot of
databases have fallen into the herd we had to manage there.
IMHO the key point with an inventory is automation; if you don't automate it, it will
never stay up-to-date.
First of all, get hold of some platform for scripting.
I don't know perl (and I don't even plan to learn it any time soon), and as I feel
comfortable with ksh, sed, awk and the like I jumped on a Unix platform, but your
choice may be different.
The first challenge in our case was to build an inventory of databases (asking
people is totally unreliable); I have used scripts from Tim Gorman, which you will find
on his site (http://www.evdbt.com) - they come from a security paper - and reworked
them to suit my case. The idea was to probe the network (fortunately all servers are
supposed to follow a special address pattern), check for listeners, and send them the
lsnrctl stat command. This helps you identify servers, listeners, and instances. A
suitable schema was built in a database (Oracle, but see below) to store this; note
that the relationships are sometimes not very simple, since the same instance can be
served by several listeners.
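To give you an idea, the probe boils down to something like the sketch below (the
subnet, port and output file are made up for illustration, and the ping flags are
Linux-style - adapt to your flavour of Unix):

#!/bin/ksh
# Sweep an address range and ask any listener found what it serves.
SUBNET=10.0.1          # assumed server address pattern
PORT=1521
i=1
while [ $i -le 254 ]
do
  HOST=$SUBNET.$i
  i=$((i + 1))
  # Skip hosts which don't answer a ping
  ping -c 1 -w 1 $HOST >/dev/null 2>&1 || continue
  # lsnrctl happily queries a remote listener when given a full address
  lsnrctl status "(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=$HOST)(PORT=$PORT)))" 2>/dev/null |
    awk -v h=$HOST '/Instance "/ { gsub(/[",]/, "", $2); print h, $2 }' >> instances.lst
done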
The next step was to secure a foothold in each database to execute inventory queries
(it was a good opportunity to check security too). DBSNMP/DBSNMP is a good bet.
Actually, we created a special MONITOR account on each database, with only the minimum
rights required.
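For what it's worth, creating the account looks more or less like this (run as SYS;
the password is a placeholder, and remember that grants must go on the underlying V_$
views, the V$ names being mere synonyms):

#!/bin/ksh
# Create the minimal MONITOR account on the current database.
sqlplus -s "/ as sysdba" <<'EOF'
create user monitor identified by "some_password";
grant create session to monitor;
-- V$XXX are public synonyms for SYS views named V_$XXX
grant select on v_$database to monitor;
grant select on v_$instance to monitor;
grant select on v_$license  to monitor;
grant select on v_$version  to monitor;
-- for the storage and database link checks
grant select on dba_data_files to monitor;
grant select on dba_free_space to monitor;
grant select on dba_db_links   to monitor;
EOF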
Every day a script runs which checks V$DATABASE, V$INSTANCE, V$LICENSE and V$VERSION
(BTW, the only place where you find some indication of which OS you are running on),
getting information and updating it if required. Storage is of course checked as well.
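The daily pass itself is nothing clever; for each database in the inventory it runs
queries along these lines (a sketch only - the alias list and MONITOR password are
placeholders, and the loading of results into the repository is left out):

#!/bin/ksh
# Daily inventory queries; databases.lst holds one TNS alias per line.
while read DB
do
  sqlplus -s monitor/some_password@$DB <<'EOF' > /tmp/inv_$DB.lst
set pagesize 0
set feedback off
select name, created, log_mode from v$database;
select instance_name, version, host_name, startup_time from v$instance;
select banner from v$version;          -- the banner also hints at the OS
select sessions_highwater from v$license;
EOF
done < databases.lst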
Database links are collected too. We have a PHP application displaying all the
information (with the refresh date), conveniently crossed (for instance, we list for
each database the dblinks to the database as well as the dblinks from the database).
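Collecting the links is a single query per database, and the 'to'/'from' crossing is
then just a matter of querying the repository from both ends; something like:

#!/bin/ksh
# Dump the database links of one database (TNS alias passed as $1).
DB=$1
sqlplus -s monitor/some_password@$DB <<'EOF' >> dblinks.lst
set pagesize 0
set feedback off
set colsep '|'
select owner, db_link, host from dba_db_links;
EOF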
We have some summary PDF reports (storage, databases per OS, per version, etc.) which
are printed every week. We are also linking to a (static) inventory of applications.
It's still a work in progress. We have recently added a connection test every 15
minutes to check database availability (we try to connect as a non-existent user; if
we don't get ORA-1017, we ping the server and tnsping the listener to pinpoint the
reason for the problem - of course we skip the other databases on the server if we
can't ping it) and we compute some availability percentage figure. We also intend to
collect some metrics at regular intervals to have an idea of the load.
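The availability test is crude; in outline it looks like this (ORA-01017 being what a
logon attempt as a non-existent user should return; grep -o is GNU grep, adapt if
need be):

#!/bin/ksh
# Availability check for one database; $1 = TNS alias, $2 = its server.
DB=$1
HOST=$2
# A non-existent user should be rejected with ORA-01017; anything else
# (or nothing at all) means the database is not properly reachable.
ERR=$(sqlplus -s -L no_such_user/x@$DB < /dev/null 2>&1 | grep -o 'ORA-[0-9]*' | head -1)
if [ "$ERR" = "ORA-01017" ]
then
  echo "$DB UP"
elif ping -c 1 $HOST > /dev/null 2>&1
then
  # Server alive: blame the listener or the instance
  if tnsping $DB > /dev/null 2>&1
  then
    echo "$DB DOWN (listener up, instance not registered?)"
  else
    echo "$DB DOWN (listener unreachable)"
  fi
else
  echo "$HOST UNREACHABLE"   # skip the other databases on this server
fi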
I have nothing against using Access to store the data; in fact, some of the ideas
were borrowed from another customer where the repository is a Sybase database (TCL
scripts do a full inventory of both the Sybase and Oracle databases - several hundred
of them). But, once again, do it AUTOMATICALLY.
HTH
Stephane Faroult
--- Original Message ---
From: [EMAIL PROTECTED]
To: Multiple recipients of list ORACLE-L [EMAIL PROTECTED]
Sent: Tue, 09 Dec 2003 15:34:32
Dan,
That's a good idea for documenting structures inside the database.
However, my database manager wants more high-level info: database name / host, oracle
version, listeners, applications that use it, cron job descriptions and times, main
schemas and what they are used for, lists of developers' names that access the
database, etc...
Alan
From: Daniel Hanks [EMAIL PROTECTED]
To: Multiple recipients of list ORACLE-L [EMAIL PROTECTED]
Sent: 12/09/2003 04:09 PM
Subject: Re: Documenting databases