I've been creating a tool that generates sample data for databases.
The project is compiled and run on 32-bit Linux. To write to the file
I'm just using a TextFile, with which I do pretty standard stuff:

AssignFile(outputFile, OutputFileNameEdit.Text);
ReWrite(outputFile);
try
  ...
  WriteLn(outputFile, currentLine);
  ...
finally
  CloseFile(outputFile);
end;
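In case it helps, here is a self-contained reduction of what I'm doing
(the file name, line contents, and loop count are placeholders I made
up for this example; the real tool builds each line from generated
data):

```pascal
program bigfiletest;

{$mode objfpc}

uses
  SysUtils;

var
  outputFile: TextFile;
  currentLine: string;
  i: LongInt;
begin
  AssignFile(outputFile, 'sample_data.txt');
  ReWrite(outputFile);
  try
    // Write roughly 2.5 GB of data: 2,621,440 lines of 1024 'x'
    // characters plus a newline each. On my 32-bit build this dies
    // with "File size limit exceeded" once the file passes 2 GB.
    currentLine := StringOfChar('x', 1024);
    for i := 1 to 2621440 do
      WriteLn(outputFile, currentLine);
  finally
    CloseFile(outputFile);
  end;
end.
```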

Everything works fine until the file I generate hits 2 gigabytes, at
which point the program dies and the command line prints: "File size
limit exceeded (core dumped)".

I did some googling and found some references to ulimit, but I sort of
don't think that's my problem, given my current ulimit -a output:

/usr/lib$ ulimit -a
core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
max nice                        (-e) 20
file size               (blocks, -f) unlimited
pending signals                 (-i) unlimited
max locked memory       (kbytes, -l) unlimited
max memory size         (kbytes, -m) unlimited
open files                      (-n) 1024
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) unlimited
max rt priority                 (-r) unlimited
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) unlimited
virtual memory          (kbytes, -v) unlimited
file locks                      (-x) unlimited

I suspect it's a limitation of the standard text-file I/O routines on
32-bit systems rather than of the file system itself. My question is:
what do I need to use instead to be able to create files bigger than
2 gigabytes?
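For instance, would something stream-based do the trick? Here is the
kind of thing I had in mind, a completely untested sketch using
TFileStream from the Classes unit (the helper procedure and names are
mine):

```pascal
program streamtest;

{$mode objfpc}

uses
  Classes, SysUtils;

// Append one line plus the platform line ending to a stream.
procedure WriteLineToStream(stream: TStream; const line: string);
var
  buf: string;
begin
  buf := line + LineEnding;
  stream.WriteBuffer(buf[1], Length(buf));
end;

var
  outputStream: TFileStream;
begin
  // TFileStream exposes Int64-based Position/Size, so I'm hoping it
  // can go past 2 GB -- but does the underlying open() on 32-bit
  // Linux still need large-file support for that to actually work?
  outputStream := TFileStream.Create('sample_data.txt', fmCreate);
  try
    WriteLineToStream(outputStream, 'some generated row');
  finally
    outputStream.Free;
  end;
end.
```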

Thanks,

Seth Grover

--
Seth Grover
sethdgrover[at]gmail[dot]com
http://grovers.us/seth

I'm a driver, I'm a winner. Things are going to change, I can feel it.

_________________________________________________________________
    To unsubscribe: mail [EMAIL PROTECTED] with
               "unsubscribe" as the Subject
  archives at http://www.lazarus.freepascal.org/mailarchives
