Derek Martin [EMAIL PROTECTED] writes:
> Both are clocking in at the same time (1m 5sec for 2.6Gb), are there
> any ways I can optimize either solution?

Getting 40+ MB/sec through a file system is pretty impressive.
Sounds like a RAID? That said, due to normal I/O generally involving [...]
On Apr 3, 2008, at 3:03 AM, Paul Rubin <http://phr.cx@NOSPAM.invalid> wrote:
> Derek Martin [EMAIL PROTECTED] writes:
>> Both are clocking in at the same time (1m 5sec for 2.6Gb), are there
>> any ways I can optimize either solution?
>
> Getting 40+ MB/sec through a file system is pretty impressive.

---
Derek Tracy
[EMAIL PROTECTED]
---
On Thu, Apr 03, 2008 at 02:36:02PM -0400, Derek Tracy wrote:
> I am running it on a RAID (striped RAID 5 using fibre channel), but I
> was expecting better performance.

Don't forget that you're reading from and writing to the same
spindles. Writes are slower on RAID 5, and you have to read the [...]
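The truncated point is presumably the classic RAID 5 small-write penalty: updating a block means reading the old data and old parity before the new data and new parity can be written. A back-of-envelope sketch of why writes lag reads (illustrative only, not a measurement of this particular array):

```python
# One logical small write on RAID 5 costs four disk operations:
# read old data and old parity, then write new data and new parity.
ios = {'read old data': 1, 'read old parity': 1,
       'write new data': 1, 'write new parity': 1}
write_penalty = sum(ios.values())  # I/Os per logical small write
print(write_penalty)  # -> 4
```

Large sequential writes can avoid this via full-stripe writes, but a copy that reads and writes the same spindles still contends for the same heads.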
I am trying to write a script that reads in a large binary file (over 2Gb),
saves the header (169088 bytes) into one file, then takes the rest of the
data and dumps it into another file. I generated code that works wonderfully
for files under 2Gb in size but the majority of the files I am dealing [...]
On Wed, Apr 02, 2008 at 10:59:57AM -0400, Derek Tracy wrote:
> I generated code that works wonderfully for files under 2Gb in size
> but the majority of the files I am dealing with are over the 2Gb
> limit
>
>     ary = array.array('H', INPUT.read())

You're trying to read the file all at once. You need [...]
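The rest of that advice is to read in fixed-size chunks rather than all at once. A minimal sketch of the idea (the function name and chunk size are my own, not from the thread):

```python
CHUNK = 1 << 20  # 1 MiB per read; an arbitrary but typical choice

def copy_in_chunks(src, dst, chunk=CHUNK):
    """Copy src to dst without ever holding more than one chunk in memory."""
    while True:
        block = src.read(chunk)
        if not block:  # b'' signals end of file
            break
        dst.write(block)
```

Replacing `INPUT.read()` with a loop like this keeps memory use flat no matter how far past 2Gb the input grows.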
Derek Tracy [EMAIL PROTECTED] wrote:
>     INPUT = open(infile, 'rb')
>     header = INPUT.read(169088)
>     ary = array.array('H', INPUT.read())
>     INPUT.close()
>
>     OUTF1 = open(outfile1, 'wb')
>     OUTF1.write(header)
>
>     OUTF2 = open(outfile2, 'wb')
>     ary.tofile(OUTF2)

When I try to use the above on files [...]
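For reference, a 2Gb-safe rewrite of the quoted script streams the payload instead of pulling it through `array.array`; `shutil.copyfileobj` does the chunked loop internally. A sketch that keeps the post's variable names (HEADER_SIZE comes from the 169088-byte figure in the thread; everything else is my own naming):

```python
import shutil

HEADER_SIZE = 169088  # header length from the original post

def split_file(infile, outfile1, outfile2):
    # Write the fixed-size header to one file, then stream the
    # remainder to the other; memory use stays constant no matter
    # how far past 2Gb the input goes.
    with open(infile, 'rb') as INPUT:
        with open(outfile1, 'wb') as OUTF1:
            OUTF1.write(INPUT.read(HEADER_SIZE))
        with open(outfile2, 'wb') as OUTF2:
            shutil.copyfileobj(INPUT, OUTF2, length=1 << 20)
```

This also drops the `array.array('H', ...)` detour entirely, since the bytes are being copied verbatim and never interpreted as 16-bit values.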
On Wed, Apr 02, 2008 at 02:09:45PM -0400, Derek Tracy wrote:
> Both are clocking in at the same time (1m 5sec for 2.6Gb), are there
> any ways I can optimize either solution?

Buy faster disks? How long do you expect it to take? At 65s, you're
already reading/writing 2.6GB at a sustained [...]
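The arithmetic being gestured at: 2.6 GB in 65 seconds is 40 MB/s, and since the copy both reads and writes through the same array, the spindles are sustaining roughly double that (decimal units, matching the figures in the thread):

```python
size_mb = 2.6 * 1000  # 2.6 GB copied, in decimal MB
seconds = 65          # 1m 5s
one_way = size_mb / seconds  # sustained read rate; the write rate matches it
total = 2 * one_way          # reads and writes hit the same spindles
print(round(one_way), round(total))  # -> 40 80
```

80 MB/s of combined traffic on one RAID 5 set is already respectable for 2008-era hardware, which is the reply's point.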