Not sure how much data is in the file, but is it possible to build the
output file in memory, then write it all at once, instead of appending
to it?
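Something like this, roughly (untested sketch; the query name qData and its columns are placeholders, the file path is from your error message):

```cfml
<!--- Build the whole output string in memory, then write it once. --->
<!--- Sketch only: query name "qData" and columns col1..col3 are assumptions. --->
<cfset tab = Chr(9)>
<cfset crlf = Chr(13) & Chr(10)>
<cfset out = "Col1" & tab & "Col2" & tab & "Col3" & crlf>
<cfloop query="qData">
    <cfset out = out & qData.col1 & tab & qData.col2 & tab & qData.col3 & crlf>
</cfloop>
<!--- One WRITE instead of 16,000 APPENDs, so the file is opened only once --->
<cffile action="WRITE"
        file="f:\inetpub\wwwroot\sales\forecasting\temp\cmore_Full_Detail_Export_report.tab"
        output="#out#">
```

String concatenation in a 16,000-iteration loop isn't fast in CF5, but you only touch the file once, so the sharing violation can't occur mid-loop.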
bye
Friday, November 14, 2003, 7:26:50 AM, you wrote:
CM> I am already using an exclusive CFLOCK around the CFFILE and it still does it from time to time.
CM> This is the error I get if I don't use my work-around:
CM> Error: Error processing CFFILE Error attempting to write data to target file 'f:\inetpub\wwwroot\sales\forecasting\temp\cmore_Full_Detail_Export_report.tab'.<P> Error: There was a sharing
CM> violation.<p>The error occurred while processing an element with a general identifier of (CFFILE), occupying document position (23:6) to (23:127) in the template file
CM> C:\CFUSION\CustomTags\write_tab_file_from_query2.cfm.
>>Use <cflock> around the <cffile> command.
>>
>>In MX, I'm not sure what underlying classes are being used (java.io vs.
>>java.nio), so you'll probably have to do the same thing there.
>>
>><cffile> sucks. What CF needs is the ability to read/write file
>>streams.
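The locking advice above would look something like this (sketch only; the lock name and the fileName/rowData variables are placeholders):

```cfml
<!--- Exclusive named lock around each append; --->
<!--- "tabFileWrite", fileName, and rowData are hypothetical names. --->
<cflock name="tabFileWrite" type="EXCLUSIVE" timeout="30">
    <cffile action="APPEND"
            file="#fileName#"
            output="#rowData#">
</cflock>
```

The lock serializes CF requests against each other, but as noted in the original post it doesn't help when CF itself hasn't released the file handle between appends in the same request.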
>>
>>----- Original Message -----
>>From: Chris More <[EMAIL PROTECTED]>
>>Date: Thursday, November 13, 2003 11:12 am
>>Subject: CFFILE BUG CF5.0 (DEADLOCK)
>>
>>> I have been experiencing a CF bug for quite some time now, and I
>>> wrote some code to work around the problem, but I am amazed that
>>> Macromedia never posted a fix other than saying to upgrade to MX.
>>>
>>> I am writing a text file from a very large query. It takes a few
>>> minutes to run and exports on the order of 16,000 rows of data to
>>> a text file. The first CFFILE is a write that adds only one row
>>> of column header descriptions. As I loop through the query, I
>>> append one row of data at a time. On large queries, I have found
>>> that CF doesn't always close the file before the loop continues,
>>> so the next time the CFFILE append executes, an error is generated
>>> because the file is still in use. It happens at random times, and
>>> it takes a random amount of time for CF to release the file and
>>> allow the loop to continue.
>>>
>>> I put a TRY/CATCH statement around the CFFILE append so I could
>>> catch the error. I then enter a loop that retries the append, up
>>> to 500 times. After the first successful retry it breaks out of
>>> the loop and continues reading the query and appending to the
>>> file. 500 retries seems to work; at 300 retries there were still
>>> a few times that the file was still open when the next append
>>> occurred.
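The retry work-around described above can be sketched like this (fileName and rowData are placeholder variables, not names from the original code):

```cfml
<!--- Retry the append until CF has released the file; sketch only. --->
<cfloop from="1" to="500" index="attempt">
    <cftry>
        <cffile action="APPEND"
                file="#fileName#"
                output="#rowData#">
        <!--- Append succeeded; leave the retry loop --->
        <cfbreak>
        <cfcatch type="Any">
            <!--- Sharing violation: the file is still held open, try again --->
        </cfcatch>
    </cftry>
</cfloop>
```

It's brute force, and a tight loop like this burns CPU while waiting, but it papers over the handle not being released between appends.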
>>>
>>> Other people have complained about this problem, but Macromedia
>>> never did anything about it.
>>>
>>> No one else is using the file when the error occurs since the
>>> filename is always dynamic and unique when created.
>>>
>>> Does anyone know if someone wrote a CFX version of CFFILE that
>>> gets around this bug?
>>>
>>> Thanks,
>>> Chris More
>>>
CM>