OK, I think I've got a working prototype, but there's a wrinkle... You recommended a 
SQL UDF to escape the characters that are illegal in XML. However, our data contains 
many diacritical marks and special characters that would be quite difficult to 
enumerate in the UDF. The CFML XMLFormat() function does handle them, but adding a 
line to run the results through that function escapes all of the tag markup as well, 
and then running those results through further cleanup lengthens the processing time 
tremendously.
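For what it's worth, accented characters are legal in XML as long as the document declares its encoding; only the five markup characters need escaping, so the UDF doesn't have to enumerate every diacritic. A minimal sketch of that five-entity escape in Python (an illustration, not the CFML XMLFormat() itself):

```python
def xml_escape(s):
    # The ampersand must be replaced first, or the ampersands
    # inside the other entities would get double-escaped.
    for raw, entity in [("&", "&amp;"), ("'", "&apos;"),
                        ('"', "&quot;"), ("<", "&lt;"), (">", "&gt;")]:
        s = s.replace(raw, entity)
    return s

# Diacritics pass through untouched; only markup characters change.
print(xml_escape('Café & "Brontë" <review>'))
# Café &amp; &quot;Brontë&quot; &lt;review&gt;
```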

You are right that SQL does the encoding much, much faster than CFML, so I will be 
re-investigating the other XML functions of SQL Server. In the meantime, does 
anybody have a good SQL XMLFormat() UDF they'd be willing to share?

Thanks, again.
Karen
>>> [EMAIL PROTECTED] 7/24/2003 9:23:58 AM >>>
Even without using FOR XML you can output directly to XML from SQL Server... Where I'm 
working currently, we had a process that had been taking 3 hours to generate XML 
content to send to another outsourcing company.

The reason is this: in the beginning, as you're building the first couple thousand 
records of XML content, CF is pretty quick about it, generating each record in under a 
second. As the XML string becomes longer and longer, it takes CF longer and longer to 
copy the string out to append the next record. (IIRC there's a similar issue with 
cffile action="append".) So once you've processed over 10k records, it's solidly 
taking over 1 second per record and still slowing...
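That slowdown is the classic quadratic-append pattern: each append re-copies the entire string built so far, so total work grows with the square of the record count, while collecting pieces and joining once at the end copies each piece exactly one time. A rough Python illustration that counts the characters copied under each approach:

```python
# Hypothetical records, standing in for the XML fragments being built.
records = ['<record id="%d"/>' % i for i in range(10000)]

# Naive append: xml = xml & rec re-copies all of xml, plus rec, every pass.
chars_copied_append = 0
length_so_far = 0
for rec in records:
    chars_copied_append += length_so_far + len(rec)
    length_so_far += len(rec)

# Build-then-join: each fragment is copied into the result exactly once.
chars_copied_join = sum(len(rec) for rec in records)  # cost of "".join(records)

print(chars_copied_append, chars_copied_join)
```

With 10k small records the append strategy copies thousands of times more characters than the single join, which is why per-record time keeps climbing as the string grows.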

So what'd we do? Generate xml with a view... 

For example: 

CREATE VIEW dbo.v_xml_departments AS
SELECT '<department departmentid="' + dbo.xmlformat(departmentid) + '">'
  + CASE LEN(LTRIM(RTRIM(ISNULL(othercolumn,'')))) WHEN 0 THEN ''
  ELSE '<otherelement>' + dbo.xmlformat(othercolumn) 
  + '</otherelement>' END + '</department>' 
  AS xmlinformation
FROM ourtable

dbo.xmlformat is a SQL2k user-defined function along these lines (the ampersand must 
be replaced first, or the other entities' own ampersands get double-escaped):

CREATE FUNCTION dbo.xmlformat (@xml varchar(8000))
RETURNS varchar(8000) AS
BEGIN
  SET @xml = REPLACE(@xml, '&', '&amp;')
  SET @xml = REPLACE(@xml, '''', '&apos;')
  SET @xml = REPLACE(@xml, '"', '&quot;')
  SET @xml = REPLACE(@xml, '<', '&lt;')
  SET @xml = REPLACE(@xml, '>', '&gt;')
  RETURN @xml
END

This method gives us complete control over the XML returned by the view; then we use 
CF to turn it into a complete XML packet:

<cfquery name="rs" datasource="ourdsn">
  SELECT * FROM v_xml_departments
</cfquery>

<cfsavecontent variable="xmlpacket">
  <?xml version="1.0" encoding="utf-8"?>
  <departments>
    #arraytolist(rs["xmlinformation"],"")#
  </departments>
</cfsavecontent>
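The assembly step above is cheap because every row arrives pre-escaped and pre-tagged; the packet is just a header plus one join. A quick Python sketch of the same idea (the rows are hypothetical stand-ins for the v_xml_departments result set):

```python
# Hypothetical rows, as the view would return them: one finished fragment each.
rows = [
    '<department departmentid="12"><otherelement>Radiology</otherelement></department>',
    '<department departmentid="15"></department>',
]

# One header, one join, one footer -- no per-record re-copying of the packet.
packet = ('<?xml version="1.0" encoding="utf-8"?>\n'
          '<departments>\n'
          + "\n".join(rows)
          + '\n</departments>')
print(packet)
```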

With 65k records in a reasonably complex schema, this method reduced processing time 
to generate the xml from 3hrs to 10 minutes. :) The query takes about 1 minute, the 
xml generation (the arraytolist function call) takes about 9 minutes.

ymmv & hth :)


s. isaac dealey                972-490-6624

new epoch                      http://www.turnkey.to 

lead architect, tapestry cms   http://products.turnkey.to 

tapestry api is opensource     http://www.turnkey.to/tapi 

team macromedia volunteer      http://www.macromedia.com/go/team 

certified advanced coldfusion 5 developer
http://www.macromedia.com/v1/handlers/index.cfm?ID=21816 


> Did you know you can make SQL return a recordset in XML format?
> Simply append the 'FOR XML' clause at the end of your SQL SELECT
> statement and the result is an XML document. The 'FOR XML' clause
> has 3 modes: AUTO, RAW, and EXPLICIT. There are also stipulations on
> what type of SQL SELECT statement can use 'FOR XML'. I would check
> SQL Server Books Online and search for 'FOR XML clause'. This is for
> SQL Server 2000; not sure if SQL Server 7 supports 'FOR XML'.

> HTH - Tom

> -----Original Message-----
> From: Karen Harker
> [mailto:[EMAIL PROTECTED] 
> Sent: Thursday, July 24, 2003 8:58 AM
> To: [EMAIL PROTECTED] 
> Subject: Processing time question

> I'm working on a program that effectively exports data
> from SQL Server
> into an XML file. All I've done is simply hard-code the
> tags, and
> plugged in values based on the appropriate database
> columns.  This data
> is set as a variable and then the variable is saved as a
> file on the
> server.  Eventually, I would like to FTP that file to
> another server.

> My problem is file size and processing time.  I've
> estimated that there
> will be about 9MB of data. I've been successful in
> outputting the data
> into multiple files of up to 100 records each. This
> typically did not
> take very long to process (15-30 seconds).  In the end,
> however, we
> would like to have the data in one large file. When I try
> to process the
> entire dataset all at once, it takes much too long (I've
> timed out at
> 1000 seconds!).  I've tested it at processing 1000 records
> (there are
> approx 4800 records) and I was able to get 3 sets loaded
> before timing
> out at 500 seconds.  So it seems that the processing time
> rises almost
> exponentially with the number of records per file.

> My questions are:
> 1) Why is that?
> and
> 2) What can we do?

> We would like to get this exporting/transfer/importing
> automated as much
> as possible because the data will have to be refreshed at
> least once a
> day, if not more frequently.  Also, importing the multiple
> files could
> cause problems with the target resource.


> Thanks.


> Karen R. Harker, MLS
> UT Southwestern Medical Library
> 5323 Harry Hines Blvd.
> Dallas, TX  75390-9049
> 214-648-8946
> http://www.utsouthwestern.edu/library/ 

> -----------------------------------------------
> To post, send email to [EMAIL PROTECTED] 
> To unsubscribe:
>    Send UNSUBSCRIBE to [EMAIL PROTECTED] 
> To subscribe / unsubscribe: http://www.dfwcfug.org 





