What awesome work!  More creative than I imagined.  I hope it makes you some
money as well.

Heman

-----Original Message-----
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Brian Hancock
Sent: Friday, February 16, 2007 6:42 AM
To: Dataperfect Users Discussion Group
Subject: [Dataperf] Update on my main DP web app

Hi Everyone,

This list has been very quiet for a while, so I thought I would
quickly update everyone on where I am up to with my DP web application...

Here is a link to a brochure... 
http://www.brileigh.com/pic/DraftBrochure.pdf

The heart of the program is a DataPerfect application running on a
commercially hosted Linux web server. The interaction between the website
and DataPerfect is handled by a multitude of Perl scripts. Almost
exclusively, data goes from the Perl script into DataPerfect via a
synthetic transaction log and is returned in an XML file, which is
retrieved by the Perl script and either processed in various ways by that
script or delivered back to the calling web client.
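
Stripped right down, each trip looks something like this from the Perl
side (just a sketch - every path, the log format and the launch wrapper
here are placeholders, not my real setup):

#!/usr/bin/perl
use strict;
use warnings;

# Illustrative sketch only: file names, the launch wrapper and the
# transaction-log format below are placeholders, not the real setup.

my $txn_log = '/path/to/app/incoming.txn';   # synthetic transaction log DP will replay
my $xml_out = '/path/to/app/result.xml';     # XML file the DP report writes

# 1. Turn the incoming web data into a synthetic transaction log
my %record = (surname => 'Smith', firstname => 'Mary Anne');
open my $txn, '>', $txn_log or die "Cannot write $txn_log: $!";
print {$txn} "$_\t$record{$_}\n" for sort keys %record;   # placeholder format
close $txn;

# 2. Start DataPerfect, let it replay the log and run the report, then
#    tear it down again (the real invocation is not shown here)
system('/path/to/app/run_dp_report.sh') == 0
    or die "DataPerfect run failed: $?";

# 3. Pick up the XML the report produced and hand it back to the client
#    (or keep processing it in the script)
open my $xml, '<', $xml_out or die "Cannot read $xml_out: $!";
local $/;                                    # slurp mode
my $result = <$xml>;
close $xml;

print "Content-type: text/xml\n\n";
print $result;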

The client web pages are generally straight HTML pages, with either AJAX
or just simple HTTP POSTs or GETs. I have used a little Macromedia (Adobe)
Flash, but I seem to get into strife with security settings on various
versions of Internet Explorer, so to reduce that heartache I have kept it
mostly to vanilla HTML with CSS...

Initially I was very keen on Flash because I could maintain state within
the application, but the application quickly beefed up in size, and with
the IE security issues it became bothersome. I found that with server-side
cookies and session files I could maintain the necessary degree of state...
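
The state handling itself is nothing fancy; it boils down to something
like this sketch with CGI.pm (the session directory, cookie name and file
format are only examples):

#!/usr/bin/perl
use strict;
use warnings;
use CGI;

my $q           = CGI->new;
my $session_dir = '/path/to/sessions';    # example location only

# Re-use an existing session id from the cookie, or mint a new one
my $sid = $q->cookie('DPSESSION') || sprintf("%08x%04x", time, $$);
$sid =~ s/\W//g;                          # keep the id safe to use in a path
my $session_file = "$session_dir/$sid";

# Read whatever state was stashed on the previous trip
my %state;
if (open my $in, '<', $session_file) {
    while (<$in>) {
        chomp;
        my ($k, $v) = split /=/, $_, 2;
        $state{$k} = $v;
    }
    close $in;
}

# ... do the real work, then stash updated state for the next request ...
$state{last_page} = $q->param('page') || '';
open my $out, '>', $session_file or die "Cannot write session: $!";
print {$out} "$_=$state{$_}\n" for keys %state;
close $out;

# Send the cookie back so the next request carries the same session id
my $cookie = $q->cookie(-name => 'DPSESSION', -value => $sid, -path => '/');
print $q->header(-type => 'text/html', -cookie => $cookie);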

One thing which was nice with Flash is that with one trip to the server I
could deliver more data back into the various parts of the application.
With straight HTML POSTs and GETs you are constantly making trips to the
database, and the start-up and tear-down of DataPerfect is quite slow, so
the more data you can retrieve in one trip the better. Some background
AJAX makes for some sleight-of-hand manipulation...
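
So where I can, I have the script bundle several result sets into one XML
envelope per trip, roughly along these lines (paths and section names are
only illustrative):

#!/usr/bin/perl
use strict;
use warnings;

# Sketch: wrap several report outputs in one envelope so a single AJAX
# call can refresh several parts of the page. Paths are placeholders.
my @parts = (
    [ players  => '/path/to/app/players.xml'  ],
    [ fixtures => '/path/to/app/fixtures.xml' ],
    [ ladder   => '/path/to/app/ladder.xml'   ],
);

print "Content-type: text/xml\n\n";
print qq{<?xml version="1.0"?>\n<response>\n};
for my $part (@parts) {
    my ($name, $file) = @$part;
    open my $fh, '<', $file or next;     # skip sections the report didn't produce
    local $/;
    my $xml = <$fh>;
    close $fh;
    $xml =~ s/<\?xml[^>]*\?>\s*//;       # drop the inner XML declaration
    print qq{<section name="$name">\n$xml\n</section>\n};
}
print "</response>\n";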

I have been using a kludgy pilot of the program for over a year, and
DataPerfect stands up extremely well; although its start-up time is a
little slow, the application is rock solid under quite reasonable loads...
My web host has been keeping an eye on utilisation, and even when I have
had as many as 20 concurrent reports running, it has not even blipped up
on their CPU or memory utilisation...  In fact they have said it is
minimal in comparison to other database and middleware applications
running on their servers which are far more resource-hungry.

The approach needed with DataPerfect is very different from any I have
ever taken before. All interaction with the data is via reports, which is
both liberating and, sometimes, ok perhaps often, tedious. Because
DataPerfect can't call subroutines, you end up with massive branching of
reports doing similar things. I have an import routine which includes 45
subreports... Core to running reports is a single-record "dummy" panel
used as the starting point for all reports, with lots and lots of virtual
subreports tied together with Report Variables and matching indexes.

Because a user will not see any activity until the report is totally
complete, there is no luxury of looping through records needlessly; you
need to either make sure you link directly to the required data or use
Skips to the record. Get the data, report it and finish. In some cases
where the report does take too long you have to take more elaborate steps.

For instance, the Import Data report allows a user to take a file of
player information and import it intelligently. The user can take a comma-
or tab-delimited file, with or without a header row, assign fields to the
ordinal positions of data in the file, and import it, either replacing
duplicates, rejecting them, or consolidating duplicates with new but
previously missing data, with the user choosing which fields are used for
duplicate checking. The user can also specify the format for dates
(dd/mm/yy, dd/mm/yyyy, mm/dd/yyyy, yyyymmdd, etc.), and the report even
attempts to parse full names into component parts, so that Mary Anne Smith
would have a first name of "Mary Anne", but Vincent van Gogh's first name
would be just "Vincent". It's not perfect, but it makes a reasonable
attempt. It then allocates a username based on that name and checks it for
uniqueness; if "msmith" were not unique it would assign "msmith02", and so
on...

All this takes a lot of time, about 1 second per record, so if a club were
to import even just 100 players in a file the browser would probably time
out. For these long processes I have had to create webpages which call
back to the same script on JavaScript timers, waiting until the script can
get an exclusive lock on the XML file being generated by DataPerfect so it
can deliver the final results. Sometimes I deliver the final result as an
email, as it can take that long.
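
The "is it done yet" check on the server side comes down to a non-blocking
flock attempt on the XML file, roughly like this sketch (the path and the
little status replies are just for illustration):

#!/usr/bin/perl
use strict;
use warnings;
use Fcntl qw(:flock);

# Sketch of the polling end of a long-running import: the page's
# JavaScript timer keeps calling this script until the report is done.
my $xml_out = '/path/to/app/import_result.xml';   # placeholder path

print "Content-type: text/xml\n\n";

if (open my $fh, '<', $xml_out) {
    # If we can take an exclusive lock without blocking, DataPerfect has
    # finished writing the file and released its lock, so ship the result.
    if (flock($fh, LOCK_EX | LOCK_NB)) {
        local $/;
        my $result = <$fh>;
        flock($fh, LOCK_UN);
        close $fh;
        print $result;
    }
    else {
        close $fh;
        print qq{<status state="running"/>};   # still busy - poll again later
    }
}
else {
    print qq{<status state="pending"/>};       # report has not started writing yet
}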

Because reports are complex, I have made lots of use of <!-- XML
comments --> to help me keep track of what the hell is happening...  I
then have to strip the comments out using XSLT in the script.
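
The stripping itself is just the identity transform minus comments; in the
script it comes down to something like this sketch (assuming XML::LibXSLT
is on the server - names and paths are illustrative):

#!/usr/bin/perl
use strict;
use warnings;
use XML::LibXML;
use XML::LibXSLT;

# Identity transform that copies everything except comments, so the
# <!-- notes to myself --> never reach the client.
my $strip_comments_xsl = <<'XSL';
<?xml version="1.0"?>
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="@*|node()">
    <xsl:copy><xsl:apply-templates select="@*|node()"/></xsl:copy>
  </xsl:template>
  <xsl:template match="comment()"/>
</xsl:stylesheet>
XSL

my $parser     = XML::LibXML->new;
my $xslt       = XML::LibXSLT->new;
my $stylesheet = $xslt->parse_stylesheet($parser->parse_string($strip_comments_xsl));

my $doc    = $parser->parse_file('/path/to/app/result.xml');   # placeholder path
my $result = $stylesheet->transform($doc);

print "Content-type: text/xml\n\n";
print $stylesheet->output_string($result);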

Initially I was using client-side XSLT processing to render HTML pages;
however, I found too much variation between browsers in their ability to
transform large and complex XML documents, so I have been doing
server-side XSLT processing as much as possible... I still have some
client-side XSLT processing, as it allows users to use their own XSLT
templates to create custom reports...
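
The custom-report path is simple enough: hand back the raw XML with an
xml-stylesheet instruction pointing at the user's own template and let the
browser do the transform, something like this sketch (the template
location and parameter name are only examples):

#!/usr/bin/perl
use strict;
use warnings;
use CGI;

# Sketch: deliver raw report XML with a processing instruction that points
# at the user's own XSLT template, so the browser renders the custom report.
my $q        = CGI->new;
my $template = $q->param('template') || 'default.xsl';   # e.g. a template the user supplied earlier
$template =~ s/[^\w.\-]//g;                               # keep the name tame

open my $fh, '<', '/path/to/app/report.xml' or die "No report yet: $!";
local $/;
my $xml = <$fh>;
close $fh;

# Inject the stylesheet reference straight after the XML declaration
my $pi = qq{<?xml-stylesheet type="text/xsl" href="/templates/$template"?>};
$xml =~ s/(<\?xml[^>]*\?>)/$1\n$pi/;

print $q->header(-type => 'text/xml');
print $xml;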

The application is not yet finished...  I still have lots of front end work 
to do...  but most of the backend is in place...  The bullet-proofing for 
user interaction takes a lot more time than I expected...  You have to 
predict so many things a user could do to mess things up...

It has been a fun project... if I am lucky it might make some money for
me...  I like working with DataPerfect because everything is very
predictable...   but you do have far more complexities than in a
stand-alone application...  and I have painstakingly had to document the
minutest details of how each report works, and how report variables are
being used...

I have also not been able to use any :IN incrementing number field, as
this would require quarantining of data when making changes to reports...
By using recursive links for creating unique serial identifiers, or moment
fields, or a combination of both, you can simply upload a new .STR file
when changing reports, which are the application's heart and soul...   I
use flags in the Perl script to temporarily suspend operation of the
database and alert users to the fact whenever I need to upload changes -
unlike an office, on the web you can't shout out to all the users to jump
out of the database while you make a quick change...
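
The flag is nothing more than a file the scripts check before touching the
database, along these lines (path and wording are placeholders):

#!/usr/bin/perl
use strict;
use warnings;

# Sketch: every script checks a maintenance flag before starting
# DataPerfect. Touching the flag file suspends the app; removing it
# brings it back.
my $flag = '/path/to/app/MAINTENANCE';    # placeholder path

if (-e $flag) {
    print "Content-type: text/html\n\n";
    print "<html><body><p>The database is briefly offline while an update ",
          "is loaded. Please try again in a few minutes.</p></body></html>";
    exit;
}

# ... otherwise carry on and run the DataPerfect report as usual ...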

I still feel like a novice; the more I learn, the more I find I don't
know... exponentially so, it feels at times...

I am trying to be pragmatic about things, so I keep looking at whether I
will always keep DP at its heart. I must say that at times .NET middleware
with a SQL backend looks attractive, if only because I would have the
database running as a service without needing to start it up and tear it
down for each trip...  It sort of feels like I am prototyping it with
DataPerfect...  only that is how I first got started using DataPerfect...
I wrote a prototype of a legal billing program, and by the time I got
through with the DP prototype the client was happy to keep using it, and I
was scared to rewrite it in Clipper because I knew that I had gone way
beyond a prototype...  and it kept working...  By choosing XML as the
(almost) sole output of DataPerfect, I know that it can be replaced
relatively easily if necessary...

If I had a wishlist for DataPerfect it would include:
1. The ability to place readable comments in the reports which do not
affect output.
2. Named Report Variables.
3. Chaining or subroutining of one report with another, to make reusable
report blocks.
4. Being able to direct DP output to STDOUT.
5. Some array handling... or indirect referencing...
6. Long string handling could be useful, but basically I do all of that in
the Perl scripts, which have spectacular text-handling ability.

This application has generated a lot of interest for the task it was 
intended to do...  I now just have to get it over the line...

Anyway, that's where things are up to...

Regards
Brian

_______________________________________________
Dataperf mailing list
[email protected]
http://lists.dataperfect.nl/mailman/listinfo/dataperf
