Re: [Python-Dev] standard library mimetypes module pathologically broken?

2009-08-16 Thread Jacob Rus
Antoine Pitrou:
> After a fair amount of discussion on Rietveld, I think you should post another
> patch without the deprecations.
> (since the discussion was fairly long, I won't repeat here the reasons I gave
> unless someone asks me to)
> Besides, it would be nice to have the additional tests you were talking about.

I'd guess that if I make another patch, no one else will actually look at the
discussion on the first one (though maybe no one will look at it either
way)... I'd rather get another couple of opinions about it before burying
that conversation.

> This sounds very pie-in-the-sky compared to the original intent of the patch
> (that is, fix the mimetypes module's implementation oddities).

Okay. At least for me, the goals are twofold, because not only is the
implementation odd, but I consider the semantics broken as well. But even
just fixing the obvious implementation problems would be a big improvement.

Cheers,
Jacob Rus



Re: [Python-Dev] functools.compose to chain functions together

2009-08-16 Thread Steven D'Aprano
On Sat, 15 Aug 2009 04:39:03 am Jason R. Coombs wrote:

> I'd like to express additional interest in python patch 1660179,
> discussed here:
>
> http://mail.python.org/pipermail/patches/2007-February/021687.html
[...]
> But to me, a compose function is much easier to read and much more
> consistent with the decorator usage syntax itself.
>
> def meta_decorator(data):
>     return compose(dec_register_function_for_x, dec_alter_docstring,
>                    dec_inject_some_data(data))

Surely that's better written as:

meta_decorator = compose(dec_register_function_for_x,
                         dec_alter_docstring, dec_inject_some_data)


> I admit, I may be a bit biased; my first formal programming course
> was taught in Scheme.

Mine wasn't -- I've never even used Scheme, or Lisp, or any other 
functional language. But I've come to appreciate Python's functional 
tools, and would like to give a +0.5 to compose(). +1 if anyone can 
come up with additional use-cases.



-- 
Steven D'Aprano


[Python-Dev] Updating tests in branches

2009-08-16 Thread Frank Wierzbicki
I plan on updating the Python unit tests with tests from Jython that
turn out to be generic Python tests.  Should I be putting these tests
into trunk and 3k or should I also put them into the 2.6 and 3.1
maintenance branches as well?

Regards,

-Frank


Re: [Python-Dev] Updating tests in branches

2009-08-16 Thread Benjamin Peterson
2009/8/16 Frank Wierzbicki :
> I plan on updating the Python unit tests with tests from Jython that
> turn out to be generic Python tests.  Should I be putting these tests
> into trunk and 3k or should I also put them into the 2.6 and 3.1
> maintenance branches as well?

Great!

Usually, unless the test is for a bug we are backporting, new tests
only go in the trunk and py3k.



-- 
Regards,
Benjamin


Re: [Python-Dev] Updating tests in branches

2009-08-16 Thread Frank Wierzbicki
On Sun, Aug 16, 2009 at 11:45 AM, Benjamin Peterson wrote:
> 2009/8/16 Frank Wierzbicki :
> Usually, unless the test is for a bug we are backporting, new tests
> only go in the trunk and py3k.
Thanks! I'll do that from now on.

-Frank


Re: [Python-Dev] functools.compose to chain functions together

2009-08-16 Thread Jason R. Coombs
Steven D'Aprano wrote:
> Sent: Sunday, 16 August, 2009 08:15
>
> On Sat, 15 Aug 2009 04:39:03 am Jason R. Coombs wrote:
>
> >
> > def meta_decorator(data):
> >     return compose(dec_register_function_for_x, dec_alter_docstring,
> >                    dec_inject_some_data(data))
>
> Surely that's better written as:
>
> meta_decorator = compose(dec_register_function_for_x,
>                          dec_alter_docstring, dec_inject_some_data)

I agree. The former looks unnecessarily complicated.

I purposely chose a non-trivial use case, one which involves a decorator that 
requires a parameter and thus must be called first before the actual decorator 
is returned.  I think for this reason, the former syntax must be used so that 
the meta_decorator also takes the data parameter and constructs the proper 
"inject" decorator.  Put another way, both dec_inject_some_data and 
meta_decorator are more like decorator factories.
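
To make that distinction concrete, here is a rough sketch (the decorator
bodies are placeholders invented purely for illustration, and I assume the
rightmost argument to compose is applied first):

def dec_register_function_for_x(func):
    # another plain decorator (stub)
    return func

def dec_alter_docstring(func):
    # a plain decorator: takes a function, returns a function
    func.__doc__ = (func.__doc__ or "") + " [altered]"
    return func

def dec_inject_some_data(data):
    # a decorator *factory*: it needs data before it can yield a decorator
    def decorator(func):
        func.injected_data = data
        return func
    return decorator

def meta_decorator(data):
    # so meta_decorator is itself a factory; spelled out without compose:
    inject = dec_inject_some_data(data)
    return lambda func: dec_register_function_for_x(
        dec_alter_docstring(inject(func)))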

I suspect a simpler, and more common use-case would be like the one you 
described, where either data is global or the "inject" decorator is not used:

meta_decorator = compose(dec_register_function_for_x, dec_alter_docstring)

>
> Mine wasn't -- I've never even used Scheme, or Lisp, or any other
> functional language. But I've come to appreciate Python's functional
> tools, and would like to give a +0.5 to compose(). +1 if anyone can
> come up with additional use-cases.

Thanks for the interest.  I decided to search through some of my active code 
for lambdas and see if there are areas where I would prefer to be using a 
compose function instead of an explicit lambda/reduce combination.

I only found one such application; I attribute this limited finding to the fact 
that I probably elected for a procedural implementation when the functional 
implementation might have proven difficult to read, esp. with lambda.

1) Multiple string substitutions.  You have a list of functions that operate on 
a string, but you want to collect them into a single operator that can be 
applied to a list of strings.

sub_year = lambda s: s.replace("%Y", "2009")

fix_strings_with_substituted_year = compose(str.strip, textwrap.dedent,
                                            sub_year)
map(fix_strings_with_substituted_year, target_strings)

Moreover, it would be great to be able to accept any number of substitutions.

substitutions = [sub_year, sub_month, ...]
fix_strings_with_substitutions = compose(str.strip, textwrap.dedent,
                                         *substitutions)
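
For completeness, here is a minimal sketch of the compose() I have in mind
(not the patch from the tracker issue; it assumes the rightmost function is
applied first), together with the first example above:

import textwrap
from functools import reduce

def compose(*funcs):
    # compose(f, g, h)(x) == f(g(h(x)))
    def composed(arg):
        return reduce(lambda acc, f: f(acc), reversed(funcs), arg)
    return composed

sub_year = lambda s: s.replace("%Y", "2009")
fix_strings_with_substituted_year = compose(str.strip, textwrap.dedent,
                                            sub_year)
print(fix_strings_with_substituted_year("    Copyright %Y  "))
# -> "Copyright 2009"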



I did conceive of another possibly interesting use case: vector translation.

Consider an application that performs mathematical translations on 
n-dimensional vectors.  While it would be optimal to use optimized matrix 
operations to perform these translations, for the sake of this example, all we 
have are basic Python programming constructs.

At run-time, the user can compose an experiment to be conducted on his series 
of vectors. To do this, he selects from a list of provided translations and can 
provide his own.  These translations can be tagged as named translations and 
thereafter used as translations themselves.  The code might look something like:

translations = selected_translations + custom_translations
meta_translation = compose(*translations)
save_translation(meta_translation, "My New Translation")

def run_experiment(translation, vectors):
    result = map(translation, vectors)
    # do something with result

Then, run_experiment can take a single translation or a meta-translation such 
as the one created above. This use-case highlights that a composed function 
must take and return exactly one value, but that the value need not be a 
primitive scalar.
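
A toy version of the above, with plain tuples standing in for vectors and a
one-line compose() equivalent to the sketch earlier (all names made up):

from functools import reduce
compose = lambda *fs: reduce(lambda f, g: (lambda v: f(g(v))), fs)

scale_by_2 = lambda v: tuple(2 * x for x in v)
shift_by_1 = lambda v: tuple(x + 1 for x in v)

translations = [scale_by_2, shift_by_1]
meta_translation = compose(*translations)        # shift first, then scale
print(list(map(meta_translation, [(0, 0), (1, 2)])))   # [(2, 2), (4, 6)]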



I'm certain there are other, more obscure examples, but I feel these two use 
cases demonstrate some fairly common potential use cases for something like a 
composition function.

Jason


Re: [Python-Dev] functools.compose to chain functions together

2009-08-16 Thread Antoine Pitrou
Jason R. Coombs  jaraco.com> writes:
> 
> I'm certain there are other, more obscure examples, but I feel these two use
> cases demonstrate some fairly common potential use cases for something like
> a composition function.

I also think it would be a nice addition.
(but someone has to propose a patch :-))

Regards

Antoine.




Re: [Python-Dev] functools.compose to chain functions together

2009-08-16 Thread Raymond Hettinger


[Antoine Pitrou]
> I also think it would be a nice addition.
> (but someone has to propose a patch :-))


I agree with Martin's reasons for rejecting the feature request
(see the bug report for his full explanation).  IIRC, the compose() 
idea had come up and been rejected in previous discussions as well.


At best, it will be a little syntactic sugar (though somewhat odd because
the traditional mathematical ordering of a composition operator is the
opposite of what intuition would suggest).  At worst, it will be slower
and less flexible than our normal ways of linking functions together.
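
For instance, a quick sketch of the ordering quirk (assuming the conventional
right-to-left definition):

compose = lambda f, g: lambda x: f(g(x))
add_one = lambda x: x + 1
double = lambda x: x * 2
print(compose(add_one, double)(10))   # 21, not 22: double runs first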

IMO, its only virtue is that people coming from functional languages
are used to having compose.  Otherwise, it's a YAGNI.


Raymond





Re: [Python-Dev] functools.compose to chain functions together

2009-08-16 Thread Jason R. Coombs


> Raymond Hettinger wrote:
> Sent: Sunday, 16 August, 2009 12:42
>
> [Antoine Pitrou]
> > I also think it would be a nice addition.
> > (but someone has to propose a patch :-))

The patch was proposed and rejected here: http://bugs.python.org/issue1660179; 
my reason for mentioning it here is that the functionality isn't YAGNI for 
me; it seems like a fundamental capability when employing a functional 
programming paradigm.


> I agree with Martin's reasons for rejecting the feature request
> (see the bug report for his full explanation).  IIRC, the compose()
> idea had come up and been rejected in previous discussions as well.
>
> At best, it will be a little syntactic sugar (though somewhat odd
> because
> the traditional mathematical ordering of a composition operator is the
> opposite of what intuition would suggest).  At worst, it will be slower
> and less flexible than our normal ways of linking functions together.
>
> IMO, its only virtue is that people coming from functional languages
> are used to having compose.  Otherwise, it's a YAGNI.

Right.  I have great respect for your and Martin's original conclusion.

The reason I came across the old patch was because I was searching for 
something that did exactly what compose does. That is to say, I had a use case 
that was compelling enough that I thought there should be something in 
functools to do what I wanted.  I've encountered this pattern often enough that 
it might be in the stdlib.

As it turns out, it isn't.  For this reason, I wanted to voice my opinion that 
contradicts the conclusion of the previous patch discussion.  Specifically, 
YAGNI doesn't apply to my experiences, and it does seem to have broad, 
fundamental application, especially with respect to functional programming.

I'm not arguing that just because Jason needs it, it should be in the standard 
library.  Rather, I just wanted to express that, like Chris AtLee, I would find 
this function quite useful.

As Steven pointed out, this functionality is desirable even for those without a 
functional programming background.  I'd like to mention also that even though I 
learned to program in Scheme in 1994, I haven't used it since, and I've been 
using Python since 1996, so my affinity for this function is based almost 
entirely on experiences programming in Python and not in a primarily 
functional language.

If the Python community still concurs that 'compose' is YAGNI or otherwise 
undesirable, I understand.  I just wanted to share my experiences and 
motivations as they pertain to the discussion.  If it turns out that it's 
included in the stdlib later, all the better.

Respectfully,
Jason


Re: [Python-Dev] random number generator state

2009-08-16 Thread Scott David Daniels

Raymond Hettinger wrote:
> [Scott David Daniels]
>> I find I have a need in randomized testing for a shorter version
>> of getstate, even if it _is_ slower to restore.  [blah about big state]
>
> Sounds like you could easily wrap the generator to get this.
> It would slow you down but would give the information you want.

Well, I was thinking that this might be generally useful for randomized
testing.


> I think it would be a mistake to complexify the API to accommodate
> short states -- I'm not even sure that they are generally useful
> (recording my initial seed and how many cycles I've run through
> is only helpful for sequences short enough that I'm willing to rerun
> them).

Right, that was what I was asking about.  The complexity of the change
grew on me; I hadn't realized at the outset it would be more than adding
a counter internally.  Consider me officially dissuaded.

> I'm curious what your use case is.  Why not just record the sequence
> as generated -- I don't see any analytic value to just knowing the
> initial seed and cycle count.

I'm building data structures controlled by an rng, and then performing
sequences of (again randomly controlled) operations on those data
structures, checking all invariants at each step.  I then lather, rinse,
repeat, recording the start of each failing experiment.  In the morning I
come in and look for commonality in the cases I see.  Having the short
state means I can easily rebuild the data structure and command
list to see what is going on.  I prune commands, simplify the tree, and
thus isolate the problem I found.

I did enough experimenting to see that if I simply provide access to run
N cycles of the block, I can actually do 2**32 cycles in feasible time,
so I have a pair of counters, and the code should take long enough for
eternity to show up before the wrap.

My short state is:
seed, block_index, cycles_low, cycles_high, floating

(block_index + 625 * (cycles_low + (cycles_high << 32))) is the position,
and could be stored as such; the pieces are split this way to minimize the
performance cost to the rng.  Floating is simply the same final floating
piece that the state keeps now.


> Ability to print out a short state implies that you are using only a
> small subset of possible states (i.e. the ones you can get to with
> a short seed).

Well, as you see above, I do capture the seed.  I realize that the time-
constructed seeds are distinct from identically provided values as small
ints, and I also mark when set_state is called on the rng to indicate
that I then know nothing about the seed.
>> mt[0] = 0x80000000UL; /* MSB is 1; assuring non-zero initial array */
>> but probably should be:
>> mt[0] |= 0x80000000UL; /* MSB is 1; assuring non-zero initial array */
> Please file a bug report for this and assign to me.  Also, our
> tests for MT exactly reproduce their published test sequence.
>
I've been assured it is not a bug, and I filed no report since I had 
just arrived at the point of suspicion.


To summarize, I am officially dissuaded, and will post a recipe if I
get something nice working.

--Scott David Daniels
[email protected]



Re: [Python-Dev] functools.compose to chain functions together

2009-08-16 Thread Antoine Pitrou
Raymond Hettinger  rcn.com> writes:
> 
> IMO, its only virtue is that people coming from functional languages
> are used to having compose.  Otherwise, it's a YAGNI.

Then I wonder how partial() ended up in the stdlib. It seems hardly more
useful than compose().
Either we decide it is useful to have a set of basic "functional" tools
in the stdlib, and both partial() and compose() have their place there,
or we decide functools has no place in the stdlib at all. Providing a
half-assed module is probably frustrating to its potential users.

(not being particularly attached to functional tools, I still think
compose() has its value, and Jason did a good job of presenting
potential use cases)

Regards

Antoine.




Re: [Python-Dev] functools.compose to chain functions together

2009-08-16 Thread Martin v. Löwis
> The reason I came across the old patch was because I was searching
> for something that did exactly what compose does. That is to say, I
> had a use case that was compelling enough that I thought there should
> be something in functools to do what I wanted.  I've encountered this
> pattern often enough that it might be in the stdlib.

Can you kindly give one or two examples of where compose would have been
useful?

Regards,
Martin



Re: [Python-Dev] functools.compose to chain functions together

2009-08-16 Thread Martin v. Löwis
> Then I wonder how partial() ended up in the stdlib. 

PEP 309 was written, discussed, approved, and implemented - that's how
partial ended up in the stdlib. The feature itself might be debatable;
that's what we have the PEP process for.

> Either we decide it is useful to have a set of basic "functional" tools
> in the stdlib, and both partial() and compose() have their place there,
> or we decide functools has no place in the stdlib at all. Providing a
> half-assed module is probably frustrating to its potential users.

So write a PEP and propose to enhance the standard library.

> (not being particularly attached to functional tools, I still think
> compose() has its value, and Jason did a good job of presenting
> potential use cases)

I don't think he did. Comparing it to the one obvious solution (use
a lambda expression), his only reasoning was "it is much easier to
read". I truly cannot believe that a compose function would be easier
to read to the average Python programmer: if you have

  def foo(data):
      return compose(a, b(data), c)

what would you expect that to mean? Please rewrite it as a regular
Python expression, preferably without looking at the patch that
has been proposed first. I bet there is a 50% chance that you get
it wrong (because there are two possible interpretations).

Regards,
Martin


[Python-Dev] another Py_TPFLAGS_HEAPTYPE question

2009-08-16 Thread Joshua Haberman
I wrote to this list a few weeks ago asking about Py_TPFLAGS_HEAPTYPE
(http://thread.gmane.org/gmane.comp.python.devel/105648).  It occurred
to me today that I could probably make object instances INCREF and
DECREF my type appropriately, without setting Py_TPFLAGS_HEAPTYPE, by
writing my own tp_alloc and tp_dealloc functions.  My tp_alloc function
could be:

PyObject *my_tp_alloc(PyTypeObject *type, Py_ssize_t nitems)
{
  PyObject *obj = PyType_GenericAlloc(type, nitems);
  if(obj) Py_INCREF(type);
  return obj;
}

This seems right since it is PyType_GenericAlloc that contains this
excerpt:

  if (type->tp_flags & Py_TPFLAGS_HEAPTYPE)
    Py_INCREF(type);

I don't want to set Py_TPFLAGS_HEAPTYPE, but I want to get that
Py_INCREF(), so far so good.

But I couldn't find the Py_DECREF() in typeobject.c corresponding to
the above Py_INCREF().  Notably, object_dealloc() does not call
Py_DECREF(self->ob_type) even if self->ob_type has the Py_TPFLAGS_HEAPTYPE
flag set.

So where does the Py_DECREF() for the above Py_INCREF() live?  I
expected to find this code snippet somewhere, but couldn't:

  if (type->tp_flags & Py_TPFLAGS_HEAPTYPE)
    Py_DECREF(type);

Josh


Re: [Python-Dev] functools.compose to chain functions together

2009-08-16 Thread Martin v. Löwis
Martin v. Löwis wrote:
>> The reason I came across the old patch was because I was searching
>> for something that did exactly what compose does. That is to say, I
>> had a use case that was compelling enough that I thought there should
>> be something in functools to do what I wanted.  I've encountered this
>> pattern often enough that it might be in the stdlib.
> 
> Can you kindly give one or two examples of where compose would have been
> useful?

I went back in the archives and found your example. What I now don't
understand is why you say that a compose function would be easier to
read than a lambda expression. Can you please elaborate on that?

I deeply believe that it is *harder* to read than a lambda expression,
because the lambda expression makes the evaluation order clear, whereas
the compose function doesn't (of course, function decorators ought to be
commutative, so in this case, lack of clear evaluation order might be
less important).

Regards,
Martin


Re: [Python-Dev] another Py_TPFLAGS_HEAPTYPE question

2009-08-16 Thread Martin v. Löwis
> So where does the Py_DECREF() for the above Py_INCREF() live?  I
> expected to find this code snippet somewhere, but couldn't:
> 
>   if (type->tp_flags & Py_TPFLAGS_HEAPTYPE)
>     Py_DECREF(type);

For a regular heaptype, it's in subtype_dealloc:

/* Can't reference self beyond this point */
Py_DECREF(type);

HTH,
Martin


Re: [Python-Dev] another Py_TPFLAGS_HEAPTYPE question

2009-08-16 Thread Joshua Haberman
On Sun, Aug 16, 2009 at 3:37 PM, "Martin v. Löwis" wrote:
>> So where does the Py_DECREF() for the above Py_INCREF() live?  I
>> expected to find this code snippet somewhere, but couldn't:
>>
>>   if (type->tp_flags & Py_TPFLAGS_HEAPTYPE)
>>     Py_DECREF(type);
>
> For a regular heaptype, it's in subtype_dealloc:
>
>                /* Can't reference self beyond this point */
>                Py_DECREF(type);

Thanks for the pointer.  I noticed that subtype_dealloc is only called for types
that are allocated using type_new().  Does this mean that it is not
safe to create
types in C using just PyType_Ready() and set Py_TPFLAGS_HEAPTYPE on
them?  The documentation is not clear on this point.

Here is what I would like to do when I create my types dynamically:

- implement tp_alloc and tp_dealloc() to INCREF and DECREF the type.
- not set Py_TPFLAGS_HEAPTYPE.
- set Py_TPFLAGS_HAVE_GC (because instances of my obj can create cycles)

Does this seem safe?  I notice that subtype_dealloc() does some funky
GC/trashcan stuff.  Is it safe for me not to call subtype_dealloc?  Can I
safely implement my tp_dealloc function like this?

void my_tp_dealloc(PyObject *obj)
{
  obj->ob_type->tp_free(obj);
  Py_DECREF(obj->ob_type);
}

Thanks,
Josh


Re: [Python-Dev] functools.compose to chain functions together

2009-08-16 Thread Antoine Pitrou

> PEP 309 was written, discussed, approved, and implemented - that's how
> partial ended up in the stdlib.

Ok, I'm surprised that a single addition to a module needed a PEP in
order to be approved.

Interestingly, here's what the summary section in PEP 309 says: 
« A standard library module functional should contain an implementation
of partial, /and any other higher-order functions the community want/. »
(emphasis mine)

> I truly cannot believe that a compose function would be easier
> to read to the average Python programmer: if you have
> 
>   def foo(data):
>       return compose(a, b(data), c)
> 
> what would you expect that to mean? Please rewrite it as a regular
> Python expression, preferably without looking at the patch that
> has been proposed first.

Ok, here's my attempt without looking at the patch:

def foo(data):
    def bar(*args, **kwargs):
        return a(b(data)(c(*args, **kwargs)))
    return bar
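
And, for contrast, the other reading Martin alluded to -- shown with
throwaway stubs just to make the two orderings visible (the stub bodies are
invented; only the call order matters):

a = lambda x: "a(%s)" % x
c = lambda x: "c(%s)" % x
b = lambda data: (lambda x: "b(%r)(%s)" % (data, x))

def foo_rightmost_first(data):          # the reading above
    return lambda x: a(b(data)(c(x)))

def foo_leftmost_first(data):           # the other possible reading
    return lambda x: c(b(data)(a(x)))

print(foo_rightmost_first(1)("x"))      # a(b(1)(c(x)))
print(foo_leftmost_first(1)("x"))       # c(b(1)(a(x)))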

Whether or not it is easier to read to the "average Python programmer"
is not that important I think. We have lots of things that certainly
aren't, and yet still exist (all of the functions in the operator
module, for example; or `partial` itself for that matter). They are
there for advanced programmers.

Regards

Antoine.




Re: [Python-Dev] functools.compose to chain functions together

2009-08-16 Thread Greg Ewing

Jason R. Coombs wrote:

> I had a use case that was compelling enough that I thought there
> should be something in functools to do what I wanted.

I think this is one of those things that a small minority of
people would use frequently, but everyone else would use
very rarely or never. The decision on whether to include
something in the stdlib needs to be based on the wider
picture.

In this case, it's trivial to write your own if you want
it. As they say, "not every one-line function needs to
be in the stdlib".

--
Greg


Re: [Python-Dev] another Py_TPFLAGS_HEAPTYPE question

2009-08-16 Thread Benjamin Peterson
2009/8/16 Joshua Haberman :
> On Sun, Aug 16, 2009 at 3:37 PM, "Martin v. Löwis" wrote:
>>> So where does the Py_DECREF() for the above Py_INCREF() live?  I
>>> expected to find this code snippet somewhere, but couldn't:
>>>
>>>   if (type->tp_flags & Py_TPFLAGS_HEAPTYPE)
>>>     Py_DECREF(type);
>>
>> For a regular heaptype, it's in subtype_dealloc:
>>
>>                /* Can't reference self beyond this point */
>>                Py_DECREF(type);
>
> Thanks for the pointer.  I noticed that subtype_dealloc is only called for 
> types
> that are allocated using type_new().  Does this mean that it is not
> safe to create
> types in C using just PyType_Ready() and set Py_TPFLAGS_HEAPTYPE on
> them?  The documentation is not clear on this point.
>
> Here is what I would like to do when I create my types dynamically:
>
> - implement tp_alloc and tp_dealloc() to INCREF and DECREF the type.
> - not set Py_TPFLAGS_HEAPTYPE.
> - set Py_TPFLAGS_HAVE_GC (because instances of my obj can create cycles)

[Note that this is really starting to get off topic for python-dev.]

Why do you need to set Py_TPFLAGS_HEAPTYPE on your C type? Is a normal
static type not sufficient? The easiest way to create heaptypes is to
simply call PyType_Type.


-- 
Regards,
Benjamin


Re: [Python-Dev] another Py_TPFLAGS_HEAPTYPE question

2009-08-16 Thread Greg Ewing

Benjamin Peterson wrote:

> Why do you need to set Py_TPFLAGS_HEAPTYPE on your C type?

I think he *doesn't* want to set Py_TPFLAGS_HEAPTYPE, but
does want to create the type dynamically.

But I suspect this is actually FUD, and that letting
Py_TPFLAGS_HEAPTYPE be set wouldn't lead to anything
disastrous happening.

Note that by not giving instances a __dict__, they
will be prevented from having arbitrary attributes
set on them, which is the most noticeable distinction
between built-in and user-defined types.

--
Greg


Re: [Python-Dev] random number generator state

2009-08-16 Thread Greg Ewing

Scott David Daniels wrote:

> No, I don't really need MT.  The others would be fine.
> I'd love further details.


The one I've been working with is due to Pierre L'Ecuyer [1]
and is known as MRG32k3a. It's a combined multiple recursive
linear congruential generator with 6 words of state. The
formulas are

r1[i] = (a12 * r1[i-2] + a13 * r1[i-3]) % m1
r2[i] = (a21 * r2[i-1] + a23 * r2[i-3]) % m2
r[i]  = (r1[i] - r2[i]) % m1

where

m1 = 2**32 - 209
m2 = 2**32 - 22853

a12 = 1403580
a13 = -810728
a21 = 527612
a23 = -1370589

If you consider the state to be made up of two 3-word
state vectors, then there are two 3x3 matrices which
map a given state onto the next state. So to jump
ahead n steps in the sequence, you raise these matrices
to the power of n.

I've attached some code implementing this generator
together with the jumping-ahead. (Sorry it's in C++,
I hadn't discovered Python when I wrote it.)

[1] Pierre L'Ecuyer, Good Parameters and Implementations for
Combined Multiple Recursive Random Number Generators,
Operations Research v47 no1 Jan-Feb 1999
http://www.iro.umontreal.ca/~lecuyer/myftp/papers/combmrg2.ps

--
Greg
/*
 *   cmr_random_generator.C
 *   ======================
 *
 *   Combined Multiple Recursive random number generator.
 *
 *   This is an implementation of Pierre L'Ecuyer's
 *   MRG32k3a generator, described in:
 *
 * Pierre L'Ecuyer, Good Parameters and Implementations for
 * Combined Multiple Recursive Random Number Generators,
 * Operations Research v47 no1 Jan-Feb 1999 
 */

#include "cmr_random_generator.H"

static const double
  norm = 2.328306549295728e-10,
  m1   =  4294967087.0,
  m2   =  4294944443.0,
  a12  =     1403580.0,
  a13  =     -810728.0,
  a21  =      527612.0,
  a23  =    -1370589.0;

static double a[2][3][3] = {
  {
    {0.0, 1.0, 0.0},
    {0.0, 0.0, 1.0},
    {a13, a12, 0.0}
  },
  {
    {0.0, 1.0, 0.0},
    {0.0, 0.0, 1.0},
    {a23, 0.0, a21}
  }
};

static double m[2] = {
  m1,
  m2
};

static double init_s[2][3] = {
  {1.0, 1.0, 1.0},
  {1.0, 1.0, 1.0}
};

inline static double mod(double x, double m) {
  long k = (long)(x / m);
  x -= k * m;
  if (x < 0.0)
    x += m;
  return x;
}

/*
 *   Initialisation
 */

CMRRandomGenerator::CMRRandomGenerator() {
  for (int i = 0; i <= 1; i++)
    for (int j = 0; j <= 2; j++)
      s[i][j] = init_s[i][j];
}

/*
 *   Advance CMRG one step and return next number
 */

double CMRRandomGenerator::Next() {
  double p1 = mod(a12 * s[0][1] + a13 * s[0][0], m1);
  s[0][0] = s[0][1];
  s[0][1] = s[0][2];
  s[0][2] = p1;
  double p2 = mod(a21 * s[1][2] + a23 * s[1][0], m2);
  s[1][0] = s[1][1];
  s[1][1] = s[1][2];
  s[1][2] = p2;
  double p = p1 - p2;
  if (p < 0.0)
    p += m1;
  return (p + 1) * norm;
}

typedef unsigned long long Int64;
typedef Int64 CMRG_Vector[3];
typedef Int64 CMRG_Matrix[3][3];

static Int64 ftoi(double x, double m) {
  if (x >= 0.0)
    return Int64(x);
  else
    return Int64((long double)x + (long double)m);
}

static double itof(Int64 i, Int64 m) {
  return i;
}

static void v_ftoi(double u[], CMRG_Vector v, double m) {
  for (int i = 0; i <= 2; i++)
    v[i] = ftoi(u[i], m);
}

static void v_itof(CMRG_Vector u, double v[], Int64 m) {
  for (int i = 0; i <= 2; i++)
    v[i] = itof(u[i], m);
}

static void v_copy(CMRG_Vector u, CMRG_Vector v) {
  for (int i = 0; i <= 2; i++)
    v[i] = u[i];
}

static void m_ftoi(double a[][3], CMRG_Matrix b, double m) {
  for (int i = 0; i <= 2; i++)
    for (int j = 0; j <= 2; j++)
      b[i][j] = ftoi(a[i][j], m);
}

static void m_copy(CMRG_Matrix a, CMRG_Matrix b) {
  for (int i = 0; i <= 2; i++)
    for (int j = 0; j <= 2; j++)
      b[i][j] = a[i][j];
}

static void mv_mul(CMRG_Matrix a, CMRG_Vector u, CMRG_Vector v, Int64 m) {
  CMRG_Vector w;
  int i, j;
  for (i = 0; i <= 2; i++) {
    w[i] = 0;
    for (j = 0; j <= 2; j++)
      w[i] = (a[i][j] * u[j] + w[i]) % m;
  }
  v_copy(w, v);
}

static void mm_mul(CMRG_Matrix a, CMRG_Matrix b, CMRG_Matrix c, Int64 m) {
  CMRG_Matrix d;
  int i, j, k;
  for (i = 0; i <= 2; i++) {
    for (j = 0; j <= 2; j++) {
      d[i][j] = 0;
      for (k = 0; k <= 2; k++)
        d[i][j] = (a[i][k] * b[k][j] + d[i][j]) % m;
    }
  }
  m_copy(d, c);
}

/*
 *   Advance the CMRG by n*2^e steps
 */

void CMRRandomGenerator::Advance(unsigned long n, unsigned int e) {
  CMRG_Matrix B[2];
  CMRG_Vector S[2];
  Int64 M[2];
  int i;
  for (i = 0; i <= 1; i++) {
    m_ftoi(a[i], B[i], m[i]);
    v_ftoi(s[i], S[i], m[i]);
    M[i] = Int64(m[i]);
  }
  while (e--) {
    for (i = 0; i <= 1; i++)
      mm_mul(B[i], B[i], B[i], M[i]);
  }
  while (n) {
    if (n & 1)
      for (i = 0; i <= 1; i++)
        mv_mul(B[i], S[i], S[i], M[i]);
    n >>= 1;
    if (n)
      for (i = 0; i <= 1; i++)
        mm_mul(B[i], B[i], B[i], M[i]);
  }
  for (i = 0; i <= 1; i++)
    v_itof(S[i], s[i], M[i]);
}
/*
 *   cmr_random_generator.H
 *   ======================
 *
 *   Combined Multiple Recursive random number generator.

[Python-Dev] [RELEASED] Python 3.1.1

2009-08-16 Thread Benjamin Peterson
On behalf of the Python development team, I'm happy to announce the first bugfix
release of the Python 3.1 series, Python 3.1.1.

This bug fix release fixes many normal bugs and several critical ones including
potential data corruption in the io library.

Python 3.1 focuses on the stabilization and optimization of the features and
changes that Python 3.0 introduced.  For example, the new I/O system has been
rewritten in C for speed.  File system APIs that use unicode strings now handle
paths with undecodable bytes in them. Other features include an ordered
dictionary implementation, a condensed syntax for nested with statements, and
support for ttk Tile in Tkinter.  For a more extensive list of changes in 3.1,
see http://doc.python.org/3.1/whatsnew/3.1.html or Misc/NEWS in the Python
distribution.

Please note the Windows and Mac binaries are not available yet but
will be in the coming days.

To download Python 3.1.1 visit:

 http://www.python.org/download/releases/3.1.1/

The 3.1 documentation can be found at:

 http://docs.python.org/3.1

Bugs can always be reported to:

 http://bugs.python.org


Enjoy!

--
Benjamin Peterson
Release Manager
benjamin at python.org
(on behalf of the entire python-dev team and 3.1.1's contributors)


Re: [Python-Dev] another Py_TPFLAGS_HEAPTYPE question

2009-08-16 Thread Martin v. Löwis
> Thanks for the pointer.  I noticed that subtype_dealloc is only called for 
> types
> that are allocated using type_new().  Does this mean that it is not
> safe to create
> types in C using just PyType_Ready() and set Py_TPFLAGS_HEAPTYPE on
> them?  The documentation is not clear on this point.

As Benjamin says, this is getting off-topic - python-dev is not a place
to ask for help in your project.

I believe setting flags on a type is inherently unsafe.

> Here is what I would like to do when I create my types dynamically:
> 
> - implement tp_alloc and tp_dealloc() to INCREF and DECREF the type.
> - not set Py_TPFLAGS_HEAPTYPE.
> - set Py_TPFLAGS_HAVE_GC (because instances of my obj can create cycles)
> 
> Does this seem safe?  I notice that subtype_dealloc() does some funky
> GC/trashcan stuff.  Is it safe for me not to call subtype_dealloc?  Can I
> safely implement my tp_dealloc function like this?

If you bypass documented API, you really need to study the code,
understand its motivation, judge whether certain usage is "safe" wrt.
the current implementation, and judge the likelihood of this code
not getting changed in future versions.

Regards,
Martin