Well, what is large?

Most of what we have as large reports are in the 10,000 to 20,000 page
range.  Then we have our really "large" reports, ranging from 200,000
to 600,000 pages.  Yep, we do print them.  But they have very little
content on each page, so they compress very well.
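As a rough illustration of why those sparse pages compress so well, here is a
sketch in Python (the sample page layout is made up, but the effect holds for
any mostly-blank fixed-column report):

```python
import gzip

# Hypothetical sample page: 132-column print lines, mostly blank --
# typical of batch reports with very little content per page.
page = ("ACME CORP   PAYROLL REGISTER".ljust(132) + "\n"
        + (" " * 132 + "\n") * 50
        + "TOTAL:  1,234.56".ljust(132) + "\n")
report = page * 1000  # a 1000-page report, roughly 7 MB uncompressed

compressed = gzip.compress(report.encode("ascii"))
ratio = len(report.encode("ascii")) / len(compressed)
print(f"original: {len(report):,} bytes, "
      f"compressed: {len(compressed):,} bytes, "
      f"ratio about {ratio:.0f}:1")
```

With repetitive, mostly-blank pages like these, ratios of 50:1 and better are
common.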

The biggest problem with this project (other than doing it at no cost
<G>) is that we are replacing a microfiche system of manual searching.
That part is easy to do.  But once the limits of manpower are removed,
we can only guess what people will need.

Hence, my leaning is towards a system (using mainframe cycles and
resources) that can be used in the interim to see what people actually
want out of this system.  For that matter, it may take a few years of
saving the reports in electronic form before any kind of search is
meaningful.

I keep thinking that I can write this in Rexx (or Regina) in less time
than it takes to talk about it <G>.

But then I would want a good indexing engine, which htdig perhaps has,
and a web front end for the user interface (which, again, htdig has).
But once I find the documents being searched for, I need to uncompress
the files, pull the page or two requested, and present them to the
user.  If I could write a back-end interface, that should be doable.
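That back end might be little more than a routine that decompresses a report
and slices out the requested pages.  A minimal sketch in Python, assuming the
reports are stored gzip-compressed and that pages are delimited by form-feed
characters (your print stream may use ASA carriage control or some other page
marker instead):

```python
import gzip

def extract_pages(path, first, count=1):
    """Pull a page or two out of a gzip-compressed report.

    Assumes pages are separated by form-feed (\\x0c) characters;
    adjust the delimiter for however your reports mark page breaks.
    """
    with gzip.open(path, "rt", encoding="latin-1") as f:
        pages = f.read().split("\x0c")
    return "\x0c".join(pages[first:first + count])
```

For the 600,000-page monsters you would want to stream the file and stop once
the wanted pages are in hand rather than reading the whole thing, but the idea
is the same.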

But I'll keep it on the back burner for a few months, just to see what
other options may come about.  For that matter, we should have Oracle
up and running on an IFL in a month or two.  That might be a good back
end for report storage...or it might not.

Tom Duerbusch
THD Consulting

>>> [EMAIL PROTECTED] 03/21/05 1:56 PM >>>
A batch report could pretty easily be turned into a web page with a few
simple tags at the beginning and the end.  The tags could be added after
the FTP process.  It's probably not something you want to do with
reports that are really big, though; you need to consider page load
time.  Large reports will take longer to load than users may be willing
to wait.
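Those "few simple tags" can literally be a header and a footer wrapped around
a <pre> block.  A sketch in Python (the function name is hypothetical):

```python
import html

def report_to_html(report_text, title="Batch Report"):
    """Wrap a plain-text batch report in a minimal HTML page.

    The <pre> block preserves the report's fixed-column layout, and
    html.escape keeps any <, >, or & in the data from breaking the page.
    """
    return ("<html><head><title>{}</title></head><body><pre>\n"
            "{}\n"
            "</pre></body></html>").format(html.escape(title),
                                           html.escape(report_text))
```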

Tom Duerbusch wrote:
> Thanks
>
> I've looked at the web site //www.htdig.org for documentation.  Too
> bad, no guide.
>
> But it looks like it is more suited to Web stuff.
>
> I'm looking for something to do batch reports.  For example, we put in
> an employee's name, and a selection comes back with each report in
> which that employee was mentioned (Payroll register, Paycheck report,
> W-2, HRS reports, Dependency reports, etc.).
>
> But I would think it will be more for vendor-related stuff.  As in, if
> a vendor is blacklisted, being able to verify whether they just tried
> to change the company name and start up again (i.e. checks on owner
> info, address info, phone numbers...anything to see if some bad owner
> is trying to resurface under a new name).  This would also be used for
> bad landlords.
>
> But it does give us some ideas of the kinds of problems we may face
> with a more roll-your-own solution.
>
> Tom Duerbusch
> THD Consulting
>
>
>>>>[EMAIL PROTECTED] 03/21/05 1:07 PM >>>
>
> There's always htdig.  From the package description:
> The ht://Dig system is a complete world wide web indexing and
> searching system for a small domain or intranet.  This system is not
> meant to replace the need for powerful internet-wide search systems
> like Lycos, Infoseek, Webcrawler and AltaVista.  Instead it is meant
> to cover the search needs for a single company, campus, or even a
> particular sub section of a web site.
>
>
> While it specifically talks about web pages, it might be worth looking
> at to see if it will work with other things.
>
>
> Mark Post
>
> -----Original Message-----
> From: Linux on 390 Port [mailto:[EMAIL PROTECTED] On Behalf Of Tom Duerbusch
> Sent: Friday, March 18, 2005 3:04 PM
> To: [email protected]
> Subject: Re: Document management software
>
>
> -snip-
> I was hoping for something (most likely from the Linux world, due to
> the rather large software library that exists for that platform) that
> would do some moderate indexing, perhaps migration of reports from
> dasd to CD or DVD for long-term retention.  After all, if you have
> 20-30 GB of reports, you don't want users "scanning" all the reports
> when looking for something, just scanning an index that may be kept up
> with off-hours processing.
>
>
> ----------------------------------------------------------------------
> For LINUX-390 subscribe / signoff / archive access instructions,
> send email to [EMAIL PROTECTED] with the message: INFO LINUX-390
> or visit
> http://www.marist.edu/htbin/wlvindex?LINUX-390
>

--
Rich Smrcina
VM Assist, Inc.
Main: (262)392-2026
Cell: (414)491-6001
Ans Service:  (866)569-7378
rich.smrcina at vmassist.com

Catch the WAVV!  http://www.wavv.org
WAVV 2005 - Colorado Springs - May 20-24, 2005
