Hi All, In my application I usually create a DB with 10-20 records in it on the fly and delete it after processing them. Record sizes vary, but each usually ranges between 20 KB and 30 KB.
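For context, here is a rough sketch of the allocation path, including the defragmentation workaround I've been considering. The helper name, the retry logic, and the use of MemHeapFreeBytes/MemHeapCompact here are my own assumptions, not something I've verified is supported for storage heaps (I obtain the heap ID via MemPtrHeapID on a locked record from the same database):

```c
#include <PalmOS.h>

/* Sketch (untested): allocate a record; if the allocation fails with
 * dmErrMemError, check whether the problem is fragmentation (plenty of
 * free bytes, but no single free chunk big enough), try compacting the
 * heap, and retry once.  storageHeapID is assumed to be the heap the
 * database lives in, e.g. from MemPtrHeapID() on a locked record. */
static MemHandle NewRecordChecked(DmOpenRef dbP, UInt16 *indexP,
                                  UInt32 size, UInt16 storageHeapID)
{
    MemHandle recH = DmNewRecord(dbP, indexP, size);

    if (recH == NULL && DmGetLastErr() == dmErrMemError) {
        UInt32 freeBytes, maxChunk;

        /* maxChunk receives the size of the largest contiguous free
         * chunk; if it is smaller than the request while freeBytes is
         * much larger, the heap is fragmented rather than full. */
        MemHeapFreeBytes(storageHeapID, &freeBytes, &maxChunk);

        if (maxChunk < size) {
            MemHeapCompact(storageHeapID);   /* coalesce free chunks */
            recH = DmNewRecord(dbP, indexP, size);
        }
    }
    return recH;
}
```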
It works fine 99% of the time, but occasionally DmNewRecord returns NULL and DmGetLastErr returns dmErrMemError. I've observed this on the simulator as well as on the device (a Treo 650), and in both cases free memory was more than 3 MB. I don't think 25 KB is a huge chunk for a DB record allocation to fail on. Is there a way I can instruct the OS to perform some defragmentation before I start my conversion, to ensure it can allocate a 30 KB chunk without issues? I'm using CodeWarrior.

Regards,
Sagar Mody

--
For information on using the ACCESS Developer Forums, or to unsubscribe, please see http://www.access-company.com/developers/forums/
