Re: [Tutor] Memory Error

2017-02-15 Thread Steven D'Aprano
On Wed, Feb 15, 2017 at 09:36:02PM +0330, elham khanchebemehr wrote: > Hi, > I'm trying to write a code in python that takes an array of integers as > sources, an array of integers as sinks, and an array of an array of > integers of capacities, returning the maximum flow. I'm new to python and I >

Re: [Tutor] memory error

2015-07-02 Thread Joshua Valdez
Hi so I figured out my problem, with this code and its working great but its still taking a very long time to process...I was wondering if there was a way to do this with just regular expressions instead of parsing the text with lxml... the idea would be to identify a tag and then move to the nex

Re: [Tutor] memory error

2015-07-02 Thread Danny Yoo
> > So I got my code working now and it looks like this > > TAG = '{http://www.mediawiki.org/xml/export-0.10/}page' > doc = etree.iterparse(wiki) > > for _, node in doc: > if node.tag == TAG: > title = > node.find("{http://www.mediawiki.org/xml/export-0.10/}title").text > if t

Re: [Tutor] memory error

2015-07-02 Thread Danny Yoo
On Thu, Jul 2, 2015 at 9:57 AM, Joshua Valdez wrote: > > Hi so I figured out my problem, with this code and its working great but its > still taking a very long time to process...I was wondering if there was a way > to do this with just regular expressions instead of parsing the text with > lxm

Re: [Tutor] memory error

2015-07-01 Thread Joshua Valdez
Hi Danny, So I got my code working now and it looks like this TAG = '{http://www.mediawiki.org/xml/export-0.10/}page' doc = etree.iterparse(wiki) for _, node in doc: if node.tag == TAG: title = node.find("{http://www.mediawiki.org/xml/export-0.10/}title").text if title in pag
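Joshua's snippet iterates with iterparse but never frees the elements it has finished with, so the whole parsed tree still accumulates in memory. Below is a minimal sketch of the streaming pattern with cleanup, using the stdlib xml.etree.ElementTree (lxml.etree.iterparse, which the thread uses, has the same call shape); the tiny inline document and the wanted_titles set are made up for illustration:

```python
# Sketch of a namespace-aware streaming parse with element cleanup.
# The wiki export namespace matches the one in the thread; the sample
# document and wanted_titles are illustrative only.
import io
import xml.etree.ElementTree as etree

NS = '{http://www.mediawiki.org/xml/export-0.10/}'
TAG = NS + 'page'

sample = io.BytesIO(
    b'<mediawiki xmlns="http://www.mediawiki.org/xml/export-0.10/">'
    b'<page><title>Python</title></page>'
    b'<page><title>Perl</title></page>'
    b'</mediawiki>'
)

wanted_titles = {'Python'}
kept = []
for _, node in etree.iterparse(sample):
    if node.tag == TAG:
        title = node.find(NS + 'title').text
        if title in wanted_titles:
            kept.append(title)
        node.clear()  # release the finished subtree so memory stays flat

print(kept)  # ['Python']
```

The node.clear() call is the piece the quoted script is missing: without it, iterparse still builds the full tree even though you only visit it incrementally.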

Re: [Tutor] memory error

2015-06-30 Thread Danny Yoo
Hi Joshua, The issue you're encountering sounds like XML namespace issues. >> So I tried that code snippet you pointed me too and I'm not getting any >> output. This is probably because the tag names of the XML are being prefixed with namespaces. This would make the original test for node.

Re: [Tutor] memory error

2015-06-30 Thread Danny Yoo
Please use reply to all: I'm currently not in front of a keyboard at the moment. Others on the mailing list should be able to help. On Jun 30, 2015 6:13 PM, "Joshua Valdez" wrote: > Hi Danny, > > So I tried that code snippet you pointed me too and I'm not getting any > output. > > I tried playin

Re: [Tutor] memory error

2015-06-30 Thread Alan Gauld
On 30/06/15 16:10, Joshua Valdez wrote: So I wrote this script to go over a large wiki XML dump and pull out the pages I want. However, every time I run it the kernel displays 'Killed' I'm assuming this is a memory issue after reading around but I'm not sure where the memory problem is in my scri

Re: [Tutor] memory error

2015-06-30 Thread Danny Yoo
On Tue, Jun 30, 2015 at 8:10 AM, Joshua Valdez wrote: > So I wrote this script to go over a large wiki XML dump and pull out the > pages I want. However, every time I run it the kernel displays 'Killed' I'm > assuming this is a memory issue after reading around but I'm not sure where > the memory

Re: [Tutor] Memory error for list creation

2010-08-24 Thread Wayne Werner
On Tue, Aug 24, 2010 at 9:54 AM, Triantafyllos Gkikopoulos < t.gkikopou...@dundee.ac.uk> wrote: > Hi, > > I am looking for an alternative to:
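The code Triantafyllos quoted is lost in the archive preview, so what follows is only a guess at the usual cure: when building a full list raises MemoryError, a lazy iterator (a generator expression here, or xrange in the Python 2 of this 2010 thread) produces one value at a time instead. All names below are illustrative:

```python
# Hypothetical sketch: replace an in-memory list with a generator
# expression, so only one value exists at a time.
total = sum(n * n for n in range(10**6))   # no million-item list is built
small = sum(n * n for n in range(10))      # same idea at a checkable size
print(small)  # 285
```

The generator version trades a small amount of speed for constant memory use, which is usually the right trade once the list no longer fits in RAM.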

Re: [Tutor] memory error files over 100MB

2009-03-17 Thread Kent Johnson
On Tue, Mar 17, 2009 at 6:34 AM, A.T.Hofkamp wrote: >> http://personalpages.tds.net/~kent37/kk/00012.html > > Nice web-page! Thanks! > You can do the above statements also iteratively of course > > for i in ... >  s = read() >  # write s > > but since the loop does nothing with either s or read

Re: [Tutor] memory error files over 100MB

2009-03-17 Thread A.T.Hofkamp
Kent Johnson wrote: On Mon, Mar 16, 2009 at 12:30 PM, A.T.Hofkamp wrote: I don't know what code is executed in an assignment exactly, but **possibly**, first the 'read()' is executed (thus loading a very big string into memory), before assigning the value to the variable (which releases the pr

Re: [Tutor] memory error files over 100MB

2009-03-16 Thread Kent Johnson
On Mon, Mar 16, 2009 at 12:30 PM, A.T.Hofkamp wrote: > I don't know what code is executed in an assignment exactly, but > **possibly**, first the 'read()' is executed (thus loading a very big string > into memory), before assigning the value to the variable (which releases the > previous value of

Re: [Tutor] memory error files over 100MB

2009-03-16 Thread A.T.Hofkamp
Cheetah1000 wrote: I can't speak for Python 2.6, but using Jython 2.1 (Python 2.1 for Java), the code only looks at the file you are trying to extract()/read(). Near the end of the zip archive is a directory of all the files in the archive, with the start position and length of each file. Jytho

Re: [Tutor] memory error files over 100MB

2009-03-16 Thread Cheetah1000
Alan Gauld wrote: > > > "Sander Sweers" wrote > >> ... I would expect zf.read(zfilename) to only read the >> requested file in the zipfile. > > That's a dangerous assumption. You might be right but I'd want to > do some tests first to see. But if even one zipped file was big the > same wou

Re: [Tutor] memory error files over 100MB

2009-03-10 Thread Lie Ryan
Sander Sweers wrote: 2009/3/10 Alan Gauld : newFile.write(zf.read(zfilename)) Remember you are reading the file into memory and then writing it out again in a single operation, that will use twice the space of the uncompressed files - plus some extra for overhead. Question, Do you m

Re: [Tutor] memory error files over 100MB

2009-03-10 Thread Alan Gauld
"Sander Sweers" wrote out again in a single operation, that will use twice the space of the uncompressed files - plus some extra for overhead. Question, Do you mean the file in the zipfile (zfilename) or the whole zipfile (zf)? I would expect zf.read(zfilename) to only read the requested

Re: [Tutor] memory error files over 100MB

2009-03-10 Thread Sander Sweers
2009/3/10 Alan Gauld : >>           newFile.write(zf.read(zfilename)) > > Remember you are reading the file into memory and then writing it > out again in a single operation, that will use twice the space of the > uncompressed files - plus some extra for overhead. Question, Do you mean the file in

Re: [Tutor] memory error files over 100MB

2009-03-10 Thread Alan Gauld
"Harris, Sarah L" wrote However I still have a memory error when I try to run it on three or more files that are over 100 MB? And this surprises you? :-) How much memory do you have free on your computer when you run this? newFile.write(zf.read(zfilename)) Remember you are re

Re: [Tutor] memory error files over 100MB

2009-03-10 Thread Moos Heintzen
On Tue, Mar 10, 2009 at 8:45 AM, Harris, Sarah L wrote: > That looks better, thank you. > However I still have a memory error when I try to run it on three or more > files that are over 100 MB? How big are files in the zip file? It seems that in this line newFile.write(zf.read(zfilename)) the
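The whole-member read() this thread keeps circling is the memory hog: zf.read(zfilename) materialises the entire decompressed file as one string. A hedged sketch of the chunked alternative, using ZipFile.open() (available since Python 2.6) and shutil.copyfileobj; the in-memory archive here is just demo data:

```python
# Sketch: extract a zip member in fixed-size chunks instead of one
# whole-file read().  ZipFile.open() returns a file-like object, and
# shutil.copyfileobj copies it through a small buffer, so memory use
# stays roughly constant regardless of the member's size.
import io
import shutil
import zipfile

# Build a small in-memory archive purely for demonstration.
buf = io.BytesIO()
with zipfile.ZipFile(buf, 'w', zipfile.ZIP_DEFLATED) as zf:
    zf.writestr('big.txt', b'x' * 100_000)

out = io.BytesIO()  # stands in for the real output file
with zipfile.ZipFile(buf) as zf:
    with zf.open('big.txt') as member:
        shutil.copyfileobj(member, out, length=64 * 1024)  # 64 KiB chunks

print(len(out.getvalue()))  # 100000
```

In the real script, out would be open('big.txt', 'wb'), and memory use no longer depends on how large the compressed members are.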

Re: [Tutor] memory error

2009-03-10 Thread Alan Gauld
"Harris, Sarah L" wrote fname=filter(isfile, glob.glob('*.zip')) for fname in fname: This will confuse things. fname starts off as a list of files and then becomes a single filename inside the loop. It's never a good idea to duplicate variable names like that. It also means that after th

Re: [Tutor] memory error

2009-03-09 Thread Moos Heintzen
On Fri, Mar 6, 2009 at 5:03 PM, Harris, Sarah L wrote: > fname=filter(isfile, glob.glob('*.zip')) > for fname in fname: > zipnames=filter(isfile, glob.glob('*.zip')) > for zipname in zipnames: > ... It looks you're using an unnecessary extra loop. Aren't the contents of fname sim

Re: [Tutor] memory error

2009-03-06 Thread Oxymoron
Hello, On Sat, Mar 7, 2009 at 11:03 AM, Harris, Sarah L wrote: > import zipfile, glob, os > os.chdir('E:\\test1') > from os.path import isfile > fname=filter(isfile, glob.glob('*.zip')) > for fname in fname: >     zipnames=filter(isfile, glob.glob('*.zip')) >     for zipname in zipnames: >   
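As the replies above note, the quoted script globs the same '*.zip' pattern twice and reuses fname as both the list and the loop variable. A sketch of the single, unshadowed loop (the directory layout is hypothetical):

```python
# Sketch of the single-loop version: one glob, distinct names for the
# list and the loop variable.  The pattern and working directory are
# illustrative; the real script would chdir first and extract members
# inside the loop.
import glob
from os.path import isfile

zipnames = [p for p in glob.glob('*.zip') if isfile(p)]
for zipname in zipnames:  # no shadowing: zipnames stays a list throughout
    pass  # e.g. open zipfile.ZipFile(zipname) and process its members here

print(isinstance(zipnames, list))  # True
```

Keeping the list and the loop variable distinct also means zipnames is still usable after the loop, which the original shadowed version silently broke.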

Re: [Tutor] Memory error - how to manage large data sets?

2008-07-31 Thread Alan Gauld
Automated testing is "A Good Thing" :-) Alan G. - Original Message - From: "Alan Gauld" <[EMAIL PROTECTED]> To: tutor@python.org Date: Thu, 31 Jul 2008 06:39:32 +0100 Subject: Re: [Tutor] Memory error - how to manage large data sets? "Kepala Pening

Re: [Tutor] Memory error - how to manage large data sets?

2008-07-31 Thread bob gailer
Kepala Pening wrote: def sumEvenFibonacci( limit ): a, b = 1, 1 # don't waste with a = 0 sum = 0 while b < limit: if b%2 == 0: sum += b a, b = b, a + b return sum print sumEvenFibonacci( 200 ) Every 3rd element in the Fibo

Re: [Tutor] Memory error - how to manage large data sets?

2008-07-30 Thread Kepala Pening
, 31 Jul 2008 06:39:32 +0100 Subject: Re: [Tutor] Memory error - how to manage large data sets? > > "Kepala Pening" <[EMAIL PROTECTED]> wrote > > > def sumEvenFibonacci( limit ): > > a, b = 1, 1 # don't waste with a = 0 > > sum = 0 > > w

Re: [Tutor] Memory error - how to manage large data sets?

2008-07-30 Thread Alan Gauld
"Kepala Pening" <[EMAIL PROTECTED]> wrote def sumEvenFibonacci( limit ): a, b = 1, 1 # don't waste with a = 0 sum = 0 while b < limit: if b%2 == 0: sum += b a, b = b, a + b return sum print sumEvenFibonacci( 200 ) Does it work for limit = 2? Alan G. __

Re: [Tutor] Memory error - how to manage large data sets?

2008-07-30 Thread Kepala Pening
Chris Fuller <[EMAIL PROTECTED]> To: tutor@python.org Date: Mon, 28 Jul 2008 12:27:58 -0500 Subject: Re: [Tutor] Memory error - how to manage large data sets? > On Monday 28 July 2008 10:56, Karthik wrote: > > Hi, > > > > > > > > I am new to Python programming, I was tr

Re: [Tutor] Memory error - how to manage large data sets?

2008-07-29 Thread Chris Fuller
The original post was a little ambiguous: "I need to find the sum of all numbers at even positions in the Fibonacci series upto 2 million." But the project euler page (http://projecteuler.net/index.php?section=problems&id=2) is clear: "Find the sum of all the even-valued terms in the sequence

Re: [Tutor] Memory error - how to manage large data sets?

2008-07-28 Thread Daniel Sarmiento
>(the solution, of course, is to avoid storing all those numbers in the >first place) I tried this: fib = {0:0,1:1} sum = 0 for j in xrange (2,100): i = fib[j-1] + fib[j-2] if i % 2 == 0: sum += i fib = {j-1:fib[j-1], j:i} print sum I guess it should come up with the ri

Re: [Tutor] Memory error - how to manage large data sets?

2008-07-28 Thread Alan Gauld
"Alan Gauld" <[EMAIL PROTECTED]> wrote > were infinite using floats! So you need to calculate the > total as you go without saving the values > I got curious so wrote the following function: >>> def fibtot(N): ... f0,f1,tot = 0,1,1 ... for n in range(N): ... f = f0 + f1 ... f0,f1 =

Re: [Tutor] Memory error - how to manage large data sets?

2008-07-28 Thread Chris Fuller
There's no need to keep any lists. The sum can be done on the fly, which is perhaps a bit slower, but takes a constant amount of ram. Even storing every other element (or every third, which is what he's trying to do: the elements that are even numbers, not every other element.. See his exampl

Re: [Tutor] Memory error - how to manage large data sets?

2008-07-28 Thread John Fouhy
On 29/07/2008, Daniel Sarmiento <[EMAIL PROTECTED]> wrote: > I tried to run your code and checked (with top) the memory ussage and > it uses more than 2 Gb of memory. > > I tried to modify the code a little bit to use less memory and came up > with this: > > fib = {0:0,1:1} > > even = [] > >

Re: [Tutor] Memory error - how to manage large data sets?

2008-07-28 Thread Alan Gauld
"Karthik" <[EMAIL PROTECTED]> wrote in message news:[EMAIL PROTECTED] Forgot to include the following information, Platform - win32 Version - 2.5.1 Error message: Traceback (most recent call last): File "C:\Python25\programs\fibo.py", line 10, in if i % 2 == 0: MemoryError OK, It does look

Re: [Tutor] Memory error - how to manage large data sets?

2008-07-28 Thread Daniel Sarmiento
:08 +0530 > From: "Karthik" <[EMAIL PROTECTED]> > Subject: Re: [Tutor] Memory error - how to manage large data sets? > To: > Message-ID: <[EMAIL PROTECTED]> > Content-Type: text/plain; charset="us-ascii" > > Forgot to include the following i

Re: [Tutor] Memory error - how to manage large data sets?

2008-07-28 Thread Danny Yoo
> 1. I need to find the sum of all numbers at even positions in the > Fibonacci series upto 2 million. > > 2. I have used lists to achieve this. I see. You may want to produce a "sequence" or "iterator" of fibonacci numbers rather than an explicit list. As it is, your machine does not ha

Re: [Tutor] Memory error - how to manage large data sets?

2008-07-28 Thread Chris Fuller
On Monday 28 July 2008 10:56, Karthik wrote: > Hi, > > > > I am new to Python programming, I was trying to work out a few problems in > order to grasp the knowledge gained after going through the basic chapters > on Python programming. I got stuck with a memory error. > > > > Following is what I di

Re: [Tutor] Memory error - how to manage large data sets?

2008-07-28 Thread Karthik
Forgot to include the following information, Platform - win32 Version - 2.5.1 Error message: Traceback (most recent call last): File "C:\Python25\programs\fibo.py", line 10, in if i % 2 == 0: MemoryError Code: fib = [] even = [] def fibonacci(x,y): return x+y

Re: [Tutor] Memory error - how to manage large data sets?

2008-07-28 Thread Alan Gauld
"Karthik" <[EMAIL PROTECTED]> wrote I am new to Python programming, I was trying to work out a few problems in order to grasp the knowledge gained after going through the basic chapters on Python programming. I got stuck with a memory error. Always show us the full error text, it contains