David,
 
That's a useful list of things to check through; I have to admit that I'm a bit miffed I didn't think of checking different date ranges first.
 
Just trying it now, and my real-time virus alert has popped up saying that one of the files was compromised and has been removed.  That in isolation isn't really an issue, but I'm wondering how the file is stored on the server.  It's running SQL Server; is there something like a binary/large-object field type that stores attachments?
 
Regards
 
Dave

 
On 12/05/06, David Sanders <[EMAIL PROTECTED]> wrote:
Hi Dave
 
Malloc errors (memory allocation) on a 5.1.1 server can be caused by several things.  Have you tried changing the date range for your query to see whether it is caused by one particular record, always the 15,001st record no matter what the record set, or something else?
 
Here are a couple of things I have seen in the past that can cause this error:
 
Attachment fields - an attachment field has been added to a form at some stage in its history, and some of the data pre-dates the new field.  The old records therefore do not have corresponding entries in the B tables.  In this case, export the data in two sets, with the export of the old data excluding the attachment field.
 
Currency fields - I've seen currency fields with null values cause memory problems (v 6)
 
Workflow firing on Get Entry - disable the workflow during the export.
 
Strange characters in particular records - exclude these records from the export.
 
Very large Audit (diary) fields - exclude the fields from the export to test if this is the problem.
 
HTH
 
David Sanders
Remedy Solution Architect
Enterprise Service Suite @ Work
==========================
ARS List Award Winner 2005
Best 3rd party Remedy Application
 
tel +44 1494 468980
mobile +44 7710 377761
 
 
-----Original Message-----
From: Action Request System discussion list(ARSList) [mailto:[email protected]] On Behalf Of Dave Barber
Sent: 12 May 2006 11:02
To: [email protected]
Subject: Malloc failed on server

I'm extracting data from our helpdesk form to test migration to our 6.3 test system.  I've selected a sample of data (tickets created before 2005, about 30,000 records).
 
I'm extracting via the user tool's reporting to an ARX file.  It doesn't matter which client I use (5/6.3/7), I get an ARERR [300] Malloc failed on server.  (I know, it's a server error, so the client won't make much difference.)  It fails at record 15,001.
 
Any suggestions?  The server is running ARS 5.1.1 on Win2K, with 1 GB of RAM (it's an old server).
 
I can take on the data that I have okay, but it's concerning me that, with this restriction, I'm going to have to take on the form's history in 15,000-record chunks, which is a little irritating.
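For what it's worth, the chunking itself is easy to script outside the user tool.  A minimal sketch (the helper name is mine, and nothing here touches ARS or the Remedy API; you'd still run the actual export per batch through the client or a driver program):

```python
def chunk_records(record_ids, chunk_size=15000):
    """Split a list of record IDs into batches no larger than chunk_size.

    Purely illustrative: each batch would then be exported separately,
    staying under the point where the server's malloc fails.
    """
    return [record_ids[i:i + chunk_size]
            for i in range(0, len(record_ids), chunk_size)]

# Example: ~30,000 sample tickets come out as two batches of 15,000.
batches = chunk_records(list(range(30000)))
```

Date-range windows (as suggested elsewhere in this thread) would do the same job without needing the IDs up front.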
 
Thanks all,

Regards

Dave
__20060125_______________________This posting was submitted with HTML in it___