Jim,

Thanks for the thought vis-à-vis memory. I had considered that as an issue: is it possible that, because these are active files, they tend to stay in memory longer and therefore have more of a chance of getting corrupted? (I know that a hard reboot will almost certainly corrupt these files!)
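For context on the caching concern: at the operating-system level (nothing jBASE-specific), a process can force its own dirty data for a file out to the physical disk with a flush plus fsync. A minimal Python sketch of the general mechanism, with a hypothetical file name:

```python
import os

def durable_write(path, data):
    """Append data and force it to the physical disk before returning.

    Illustrative only, not a jBASE feature: flush() pushes Python's
    buffer to the OS, and os.fsync() asks the OS to push its page
    cache to disk. This narrows the window in which a power cut or
    hard reboot can lose the update, at a real cost in throughput.
    """
    with open(path, "ab") as f:
        f.write(data)
        f.flush()
        os.fsync(f.fileno())

# hypothetical usage:
# durable_write("audit.log", b"record-1\n")
```

On Windows, Python's `os.fsync` is implemented via `_commit()`, so the same pattern applies there; the per-write cost is exactly the performance trade-off mentioned above.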
Is there a way (with jBASE 3.4.10 on Windows Server 2003) of forcing these files to be flushed to disk more regularly? I seem to recall the ability to configure a file in jBASE so that it forces a physical disk write on each update, though this would hamper performance somewhat!

I will make backups of the files before fixing them and see if devsup can advise (they tend to be large, 1GB+ files, but will zip well).

Regards,
Simon

-----Original Message-----
From: Jim Idle <[email protected]>
To: [email protected]
Date: Tue, 26 Apr 2011 13:47:48 -0700
Subject: RE: File Corruption... Causes?

You need to examine hex dumps of the files and see what data is getting overwritten, and with what other data. The other thing you should do is download the latest Memcheck CD image and run a complete memory check on the system. Most PCs do not use ECC memory, and short of crashes or strange behaviour you would not realize that there is a memory issue.

However, you may find that this is something like someone opening the file and editing it with Notepad, or something equally silly. Anyway, you might need jBASE to help you with that, but make copies of the corrupt files before ‘fixing’ them; then you can look for patterns in the corruption. The fact that they are high-activity files just means that they are the most likely to exhibit the problem.

You should do the memcheck overnight as soon as possible, though. Just get the customer to stick the CD in and reboot.

Jim

From: [email protected] [mailto:[email protected]] On Behalf Of Simon Verona
Sent: Tuesday, April 26, 2011 1:13 PM
To: [email protected]
Subject: RE: File Corruption... Causes?

Jim,

I mean physically corrupt... If you do a COUNT [Filename] you crash out with a "Readnext error 2007, file is corrupt" message (or similar). JCHECK with no options confirms the corruption (I double-check by running it multiple times). To correct it, I have to run JCHECK -MS [Filename] with all users logged out.
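The hex-dump comparison Jim suggests is easy to mechanise once you have a known-good backup of the file alongside the corrupt copy. A small Python sketch (file names are hypothetical, and the 512-byte block size is arbitrary; j4 group sizes will differ):

```python
import binascii

BLOCK = 512  # compare in 512-byte chunks; adjust to taste

def diff_blocks(good_path, bad_path):
    """Return offsets of blocks that differ between two copies of a file,
    hex-dumping the start of each differing block for pattern-spotting."""
    offsets = []
    offset = 0
    with open(good_path, "rb") as g, open(bad_path, "rb") as b:
        while True:
            gb, bb = g.read(BLOCK), b.read(BLOCK)
            if not gb and not bb:
                break
            if gb != bb:
                offsets.append(offset)
                print(f"difference at offset {offset:#010x}")
                print("  good:", binascii.hexlify(gb[:32]).decode())
                print("  bad: ", binascii.hexlify(bb[:32]).decode())
            offset += BLOCK
    return offsets

# hypothetical usage, against a backup taken before running JCHECK -MS:
# diff_blocks("SALES.LEDGER.bak", "SALES.LEDGER")
```

If the overwriting data turns out to be recognisable (text from another file, zero runs, repeated stamps), that often points at the culprit: a stray editor, a backup tool, or bad memory flipping the same bits.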
Typically, the files that this happens to are high-activity files containing lots of smallish records. I suspected that size was maybe the issue, so I converted one of the customers' files into a multipart file, but within a month one of the parts had corrupted.

The file is normally discovered as corrupt when reading a record (either atomically, or when running a report). The problem is that it's not a completely random event: whilst I can't predict when and where it's going to happen, I notice that some systems are more prone to the error and that certain files are more likely than others to have the problem.

I've more or less eliminated multi-user writing as a cause: one of the files is only written to by a single program, which sets an execution lock to ensure that only one process can update the file at a time. Ironically, it is this file that statistically corrupts most often.

I'm sorry if I'm a little vague about the issue, but I don't really have a grip on what is going on. I don't know *when* the files are corrupting, only that they are corrupt.

Regards,
Simon

-----Original Message-----
From: Jim Idle <[email protected]>
To: [email protected]
Date: Tue, 26 Apr 2011 12:50:26 -0700
Subject: RE: File Corruption... Causes?

Do you mean logically corrupt (your records are wrong) or physically corrupt (you have to use jcheck)? You cannot physically corrupt a file by writing to it without taking a lock; you will just get trash results in your file. When are you discovering the data is corrupt? There are lots of things that can actually corrupt a file, and some things (such as running jcheck while people are writing to the file) that might make you think it is corrupt.

Jim

From: [email protected] [mailto:[email protected]] On Behalf Of Simon Verona
Sent: Tuesday, April 26, 2011 12:39 PM
To: [email protected]
Subject: File Corruption... Causes?

This issue is generic, and relates to a number of similar jBASE 3.4.10 based systems running Windows Server 2003.
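Jim's distinction between logical and physical corruption can be shown in miniature. Two writers that each read a record, modify it, and write it back without holding a read-lock can silently lose an update: the file stays structurally intact, but the data is wrong. A hedged Python analogue (a plain dict stands in for the file; none of this is jBASE API):

```python
# Two clients, A and B, each do read -> modify -> write on the same
# record without a READU-style lock. The interleaving below loses A's
# update -- "logical" corruption: structure fine, data wrong.
store = {"CUST.100": "balance=50"}

a_copy = store["CUST.100"]               # A reads
b_copy = store["CUST.100"]               # B reads the same stale value
store["CUST.100"] = a_copy + ";note=A"   # A writes
store["CUST.100"] = b_copy + ";note=B"   # B overwrites A's update

print(store["CUST.100"])  # -> balance=50;note=B  (A's note is gone)
```

This is the "trash results" failure mode Jim describes; a file that fails JCHECK is a different class of problem, which is why the hex dumps and memory test are the next step.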
We have an ongoing issue with file corruptions in j4 format files. The problem appears somehow to be application driven; I suspect this because, across a number of systems, the files that corrupt are always the same ones.

So, I'm looking for inspiration at an application level as to what could cause file corruptions. One thought I had was a WRITE without previously doing a READU. I've not managed to duplicate the issue doing this, but it's difficult to simulate a multi-user test that replicates what the application might be doing.

Does anybody know if this *could* be the cause, or know of some other application (data-basic) issue that could cause a j4 file to be corrupted?

Thanks in advance,
Simon Verona

--
Please read the posting guidelines at: http://groups.google.com/group/jBASE/web/Posting%20Guidelines
IMPORTANT: Type T24: at the start of the subject line for questions specific to Globus/T24
To post, send email to [email protected]
To unsubscribe, send email to [email protected]
For more options, visit this group at http://groups.google.com/group/jBASE?hl=en
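Returning to the original question about WRITE without a prior READU: whatever it does or does not do to j4 file structure, the safe update discipline is always lock, read, modify, write, release. A hedged Python analogue of that discipline, with a `threading.Lock` standing in for jBASE's record lock (illustrative only; it says nothing about physical corruption):

```python
import threading
from collections import defaultdict

store = {"CUST.100": 0}
record_locks = defaultdict(threading.Lock)  # stand-in for READU's record lock

def safe_update(key, delta):
    # READU-like: take the record lock, then read, modify, and write
    # before releasing. Without this, concurrent updates are lost.
    with record_locks[key]:
        value = store[key]
        store[key] = value + delta

threads = [threading.Thread(target=safe_update, args=("CUST.100", 1))
           for _ in range(100)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(store["CUST.100"])  # -> 100, no lost updates
```

A harness like this (many workers hammering shared records, with and without the lock) is one cheap way to approximate the multi-user test Simon says is hard to set up, before involving the real application.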
