Amaury wrote:
>Well, not C only code. There are 140 occurrences of "cast_adr_to_int" in
>RPython code...
OK, this is really helpful. It's exactly the kind of info I was hoping to hear
when starting this thread, and what I was looking for on the web site before
that.
In the spirit of getting al
OK, I reworked (and attached) the patch to nt_threads.[ch] following the suggestions.
You had no comments for dtoa.c.
This is great. From inspection, there don't seem to be any other changes
needed to the C code in rpython/translator/c for Windows 64 support. I am
assuming the 'gcc' folder there
Maciej Fijalkowski wrote:
>It's just a bunch of work. There is nothing special or magic about it, but so
>far no one volunteered to spend enough effort there.
Got it. I'm just trying to understand what the work is because that list hasn't
been captured anywhere yet.
>Essentially, the main probl
Amaury wrote:
>Surely we could have another copy with the largeaddressaware flag?
I agree. That's a smart way to proceed for now.
I'm still wondering whether there is any technical reason against the flag,
particularly the "CFFI extensions want the high bit reserved or not" issue.
Matt wrote:
____
From: Amaury Forgeot d'Arc
To: matti picus
Cc: Roger Flores ; PyPy Developer Mailing List
Sent: Saturday, April 6, 2013 7:36 AM
Subject: Re: [pypy-dev] 2GB Limit on Windows
2013/4/6 matti picus
I was under the impression that that link flag can cause problems for
Is the 2GB limit being lifted on the Windows nightlies? I ask because I keep
bumping, painfully, into it. I found this link:
http://doc.pypy.org/en/latest/windows.html#preping-windows-for-the-large-build
Googling more about "editbin /largeaddressaware"
http://www.velocityreviews.com/forums/t4
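A minimal sketch of how one might verify the flag on a given executable, assuming the standard PE/COFF layout (the Characteristics field sits at PE offset + 22, and IMAGE_FILE_LARGE_ADDRESS_AWARE is 0x0020); the pypy.exe path in the usage comment is only a hypothetical example:
import struct

IMAGE_FILE_LARGE_ADDRESS_AWARE = 0x0020  # PE/COFF characteristic set by editbin /largeaddressaware

def is_large_address_aware(exe_path):
    # Read enough of the file to cover the DOS stub and the PE/COFF header.
    with open(exe_path, 'rb') as f:
        data = f.read(4096)
    # e_lfanew at offset 0x3C gives the offset of the "PE\0\0" signature.
    pe_offset = struct.unpack_from('<I', data, 0x3C)[0]
    assert data[pe_offset:pe_offset + 4] == b'PE\x00\x00'
    # The COFF file header follows the signature; Characteristics is its last field.
    characteristics = struct.unpack_from('<H', data, pe_offset + 22)[0]
    return bool(characteristics & IMAGE_FILE_LARGE_ADDRESS_AWARE)

# Example (hypothetical path):
# print(is_large_address_aware(r'C:\pypy\pypy.exe'))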
On Mon, Mar 4, 2013 at 1:13 AM, Maciej Fijalkowski wrote:
>print sys.maxint
print hex(sys.maxint)
0x7fff
That works. Not so obvious though.
On Monday, March 4, 2013 1:20 AM, Antonio Cuni wrote:
>just a wild guess: is it possible that you generated pyc files with a 32bit
>ve
On March 3, 2013 2:20 AM, Carl Friedrich Bolz wrote:
>Are you *sure* you are running on a 64 bit machine?
Sure? No. I assumed it's 64bit pypy because it was generating x86_64
instructions. How would you check for sure?
uname reports x86_64 on the machine I built pypy on.
$ pypy --version
Pyt
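A quick sketch of ways to confirm the interpreter's bitness directly, rather than inferring it from the generated instructions (the values in the comments are indicative, not taken from the thread):
import sys
import struct
import platform

print(struct.calcsize("P") * 8)    # pointer size in bits: 32 or 64
print(hex(sys.maxint))             # 0x7fffffff on a 32-bit build (Python 2 / PyPy 2)
print(sys.maxint > 2 ** 32)        # True only on a 64-bit build
print(platform.architecture()[0])  # '32bit' or '64bit' for the running interpreter
print(platform.machine())          # the CPU (e.g. x86_64), which says nothing about the build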
Thanks Armin. This explains a lot.
I get it better now.
-Roger
- Original Message -
From: Armin Rigo
To: Roger Flores
Cc: "pypy-dev@python.org"
Sent: Friday, March 1, 2013 2:20 PM
Subject: Re: [pypy-dev] Slow int code
Hi Roger,
On Fri, Mar 1, 2013 at 10:13 PM, Ro
an be happy with only 32 bits, I just
want Pypy to be happy too.
Thanks Armin,
-Roger
From: Armin Rigo
To: Roger Flores
Cc: "pypy-dev@python.org"
Sent: Friday, March 1, 2013 2:34 AM
Subject: Re: [pypy-dev] Slow int code
Hi Roger,
On Thu,
above in jitviewer has the same issues. If I comment out the two calls to
encode(), I save a huge percentage of time (up to 40% in some configurations).
-Roger
From: Alex Gaynor
To: Roger Flores
Cc: "pypy-dev@python.org"
Sent: Wednesday, Febr
Would you like a paste from jitviewer or the source code to run and examine
with jitviewer?
-Roger
From: Alex Gaynor
To: Roger Flores
Cc: "pypy-dev@python.org"
Sent: Wednesday, February 27, 2013 12:23 PM
Subject: Re: [pypy-dev] Slow int code
Hi guys. I've been looking at two simple routines using jitviewer to figure
out why they're so much slower than expected.
I've also noticed that http://pypy.org/performance.html has the line "Bad
examples include doing computations with
large longs – which is performed by unoptimizable support
ory.
OK. It sounds like it already works, just needs a large rpython_reserve.
Again, pick a large number, show it works, optimize later?
-Roger
____
From: Armin Rigo
To: Roger Flores
Cc: "pypy-dev@python.org"
Sent: Wednesday, August 8, 2012 1:
On Tue, Aug 7, 2012 at 6:33 AM, Armin Rigo wrote:
> If you can come up with a more precise scheme, you're welcome. The
> issue is to know when it's ok to reserve from that pool and when we
> should raise an RPython MemoryError instead. A possible answer would
I'm not convinced this MemoryErr
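A rough illustration of the reserve idea, assuming nothing about PyPy's actual GC internals: grab a block of headroom at startup and release it when a MemoryError surfaces, so the error can still be reported cleanly. The 16 MB figure and the helper name run_with_reserve are arbitrary:
RESERVE_BYTES = 16 * 1024 * 1024  # arbitrary amount of headroom

# Claimed up front, while memory is still plentiful.
reserve = bytearray(RESERVE_BYTES)

def run_with_reserve(work):
    """Run work(); on MemoryError, drop the reserve so cleanup can still allocate."""
    global reserve
    try:
        return work()
    except MemoryError:
        reserve = None  # hand the headroom back before propagating the error
        raise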
c_out_of_nursery_nonsmall
Fatal RPython error: MemoryError
Can someone else at least confirm this?
Thanks,
-Roger Flores
exhaust-mem.py:
"""
Code to exhaust memory and generate a MemoryError exception
"""
size = 1
# find the largest memory allocation that succe
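The attachment is cut off above; a plausible completion of the doubling loop its comment describes would be something like this (a reconstruction, not the original script):
size = 1
# find the largest memory allocation that succeeds, doubling until one fails
try:
    while True:
        block = 'x' * size   # hold only the newest block; the previous one is freed
        size *= 2
except MemoryError:
    print('last successful allocation: %d bytes' % (size // 2))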
to break it, along the lines of your thinking.
-Roger
________
From: Armin Rigo
To: Roger Flores ; PyPy Developer Mailing List
Sent: Thursday, April 19, 2012 7:51 AM
Subject: Re: [pypy-dev] pypy MemoryError crash
Hi Roger,
On Tue, Apr 17, 2012 at 18:58, Roger Flores wrote:
> Were either of yo
Hello all. A bug in a python application of mine resulted in this pypy crash:
RPython traceback:
File "translator_goal_targetpypystandalone.c", line 1033, in entry_point
File "interpreter_function.c", line 1017, in funccall__star_1
File "interpreter_function.c", line 1046, in funccall__star
Hello all. I'm getting an OverflowError only when I run my program in
pypy. I've simplified it to a couple of lines:
binasciiproblem.py:
import binascii
value = 0
new_value = 'a'
value = binascii.crc32(new_value, value) & 0x
value = binascii.crc32(new_value, value) & 0x
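The mask is cut off in both lines above; the usual idiom keeps the CRC as an unsigned 32-bit value, so presumably the mask was 0xffffffff (an assumption on my part), along these lines:
import binascii

value = 0
new_value = 'a'  # a Python 2 byte string, as in the original snippet
# Masking keeps the result unsigned and within 32 bits on both 32- and 64-bit builds.
value = binascii.crc32(new_value, value) & 0xffffffff
value = binascii.crc32(new_value, value) & 0xffffffff
print(hex(value))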