Re: Decorators not worth the effort

2012-09-19 Thread Jean-Michel Pichavant


- Original Message -
 Jean-Michel Pichavant jeanmic...@sequans.com writes:
 
  - Original Message -
  Jean-Michel Pichavant wrote:
  [snip]
  One minor note, the style of decorator you are using loses the
  docstring
  (at least) of the original function. I would add the
  @functools.wraps(func)
  decorator inside your decorator.
 
  Is there a way to not lose the function signature as well ?
 
 Look at the decorator module.
 

Great, thank you.

JM
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Decorators not worth the effort

2012-09-18 Thread Thomas Rachel

Am 15.09.2012 16:18 schrieb 8 Dihedral:


The concept of decorators is just a mapping from a function


... or class ...

 to another function

... or any other object ...

 with the same name in python.
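
A tiny illustration of both generalisations (purely illustrative, not part of
the original post): a decorator can be applied to a class, and it can rebind
the name to something that is not a function at all:

registry = []

def register(cls):
    # decorating a class: record it, hand the class back unchanged
    registry.append(cls)
    return cls

def call_it(func):
    # a decorator may return any object; here, the result of calling func
    return func()

@register
class Plugin(object):
    pass

@call_it
def answer():
    return 42

print(registry)  # -> [<class '__main__.Plugin'>]
print(answer)    # -> 42 ; the name "answer" is now bound to an int, not a function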


Thomas
--
http://mail.python.org/mailman/listinfo/python-list


Re: Decorators not worth the effort

2012-09-18 Thread Neil Cerutti
On 2012-09-14, Chris Angelico ros...@gmail.com wrote:
 But then again, who actually ever needs fibonacci numbers?

If it should happen that your question is not facetious:

http://en.wikipedia.org/wiki/Fibonacci_number#Applications

-- 
Neil Cerutti
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Decorators not worth the effort

2012-09-18 Thread Chris Angelico
On Tue, Sep 18, 2012 at 11:19 PM, Neil Cerutti ne...@norwich.edu wrote:
 On 2012-09-14, Chris Angelico ros...@gmail.com wrote:
 But then again, who actually ever needs fibonacci numbers?

 If it should happen that your question is not facetious:

 http://en.wikipedia.org/wiki/Fibonacci_number#Applications

It wasn't entirely facetious. I know there are a few cases where
they're needed, but I think they're calculated far more often to
demonstrate algorithms than because you actually have use of them. :)

Though it's as well to mention these sorts of things now and then. I
remember one time writing up something or other, and my dad was
looking over my shoulder and asked me why I'd written a little
Pascal's Triangle generator. He didn't know that it had direct
application to whatever-it-was. And unfortunately I don't remember
what I was even writing at the time :)

ChrisA
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Decorators not worth the effort

2012-09-18 Thread 88888 Dihedral
Chris Angelico於 2012年9月18日星期二UTC+8下午9時25分04秒寫道:
 On Tue, Sep 18, 2012 at 11:19 PM, Neil Cerutti ne...@norwich.edu wrote:
 
  On 2012-09-14, Chris Angelico ros...@gmail.com wrote:
 
  But then again, who actually ever needs fibonacci numbers?
 
 
 
  If it should happen that your question is not facetious:
 
 
 
  http://en.wikipedia.org/wiki/Fibonacci_number#Applications
 
 
 
 It wasn't entirely facetious. I know there are a few cases where
 
 they're needed, but I think they're calculated far more often to
 
 demonstrate algorithms than because you actually have use of them. :)
 
 
 
 Though it's as well to mention these sorts of things now and then. I
 
 remember one time writing up something or other, and my dad was
 
 looking over my shoulder and asked me why I'd written a little
 
 Pascal's Triangle generator. He didn't know that it had direct
 
 application to whatever-it-was. And unfortunately I don't remember
 
 what I was even writing at the time :)
 
 
 
 ChrisA

I would suggest one should solve Fibonacci(5) first, and fast, in Python.

Then one can think about computing C(n,k) in Python for large n.
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Decorators not worth the effort

2012-09-18 Thread Dieter Maurer
Jean-Michel Pichavant jeanmic...@sequans.com writes:

 - Original Message -
 Jean-Michel Pichavant wrote:
 [snip]
 One minor note, the style of decorator you are using loses the
 docstring
 (at least) of the original function. I would add the
 @functools.wraps(func)
 decorator inside your decorator.

 Is there a way to not lose the function signature as well ?

Look at the decorator module.
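
For reference, a minimal sketch of what that suggestion can look like, assuming
Michele Simionato's third-party decorator package (the names stdApi and spuAgc
are taken from Jean-Michel's later post; everything else is illustrative):

# Assumes: pip install decorator  (third-party package, not the stdlib)
from decorator import decorator
import inspect

@decorator
def stdApi(func, *args, **kwargs):
    # decorator() rebuilds the wrapper with func's own signature and docstring
    return func(*args, **kwargs)

def spuAgc(self, iterations, backoffTarget, step):
    """Original docstring."""

wrapped = stdApi(spuAgc)
print(inspect.getargspec(wrapped))
# expected: the original (self, iterations, backoffTarget, step) signature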

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Decorators not worth the effort

2012-09-18 Thread 88888 Dihedral
Terry Reedy於 2012年9月15日星期六UTC+8上午4時40分32秒寫道:
 2nd try, hit send button by mistake before
 
 
 
 On 9/14/2012 5:28 AM, Jean-Michel Pichavant wrote:
 
 
 
  Decorators are very popular so I kinda already know that the fault is
 
  mine. Now to the reason why I have troubles writing them, I don't
 
  know. Every time I did use decorators, I spent way too much time
 
  writing it (and debugging it).
 
 
 
 You are writing parameterized decorators, which require inverted 
 
 currying of the wrapper maker. Perhaps that is why you have trouble. As 
 
 I showed in response to Cameron, it may be easier to avoid that by using 
 
 a traditional post-def wrapping call instead of decorator syntax.
 
 
  
 -- 
 
 Terry Jan Reedy

I'll give another example to show decorators in Python versions
above 2.4.

# a general function with variable input: def fname(*argc, **argn)
# a deco is a mapping from an input function to another function

def deco(fn, *list_in, **dict_in):  # use list_in and dict_in to modify fn
    """deco wrapper"""  # deco.__doc__
    # print list_in, dict_in, "in the deco"
    def wrapper(*argc, **argn):  # to be returned as a function
        # do things one wants before calling fn
        result = fn(*argc, **argn)  # call the original, save the result
        # do things after calling fn
        return result
    # enhance the wrapper and get info of fn
    wrapper.__doc__ = fn.__doc__
    # enhance wrapper with result, fn, list_in, dict_in
    return wrapper

def f1():
    """doc of f1"""
    print "inside f1"

f2 = deco(f1, 2, 3, 4, 5, 6, MSG="deco f1 to f2")

f2()  # invoke the decorated function built from f1


# A deco that maps a deco to another deco can be done similarly


-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Decorators not worth the effort

2012-09-17 Thread Jean-Michel Pichavant


- Original Message -
 Jean-Michel Pichavant wrote:
[snip]
 One minor note, the style of decorator you are using loses the
 docstring
 (at least) of the original function. I would add the
 @functools.wraps(func)
 decorator inside your decorator.

Is there a way to not lose the function signature as well ?

help (t.api.spuAgc)
 spuAgc(self, iterations, backoffTarget, step) method of ...



But once decorated with this:

def stdApi(func):
    @functools.wraps(func)
    def inner(self, *args, **kwargs):
        rsp = func(self, *args, **kwargs)
        result = TncCmnResult()
        result.returnCode = self._getReturnCode(rsp)
        return result
    return inner

help (t.api.spuAgc)
 t.api.spuAgc(self, *args, **kwargs) method of 

Quite annoying :-/

JM
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Decorators not worth the effort

2012-09-16 Thread alex23
On Sep 15, 6:30 am, Terry Reedy tjre...@udel.edu wrote:
  On 13Sep2012 18:58, alex23 wuwe...@gmail.com wrote:
  | On Sep 14, 3:54 am, Jean-Michel Pichavant jeanmic...@sequans.com
  | wrote:
  |  I don't like decorators, I think they're not worth the mental effort.
  |
  | Because passing a function to a function is a huge cognitive burden?

 For parameterized decorators, there is extra cognitive burden. See below.

I do regret my initial criticism, for exactly this reason.
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Decorators not worth the effort

2012-09-15 Thread Dieter Maurer
Dwight Hutto wrote at 2012-9-14 23:42 -0400:
 ...
Reduce redundancy, is argumentative.

To me, a decorator, is no more than a logging function. Correct me if
I'm wrong.

Well, it depends on how you are using decorators and how complex
your decorators are. If what you are using as the decorating function
is really trivial, as trivial as @decoratorname, then you
do not gain much.

But your decorator functions need not be trivial.
An example: in a recent project,
I have implemented a SOAP webservice where most services depend
on a valid session and must return specified fields even when
(as in the case of an error) there is no meaningful value.
Instead of putting into each of those function implementations
the check "do I have a valid session?" and, at the end,
"add required fields not specified", I opted for the following
decorator:

def valid_session(*fields):
! fields = ("errorcode",) + fields
  @decorator
  def valid_session(f, self, sessionkey, *args, **kw):
!   s = get_session(sessionkey)
!   if not s.get("authenticated", False):
!     rd = {"errorcode": u"1000"}
!   else:
!     rd = f(self, sessionkey, *args, **kw)
!   return tuple(rd.get(field, DEFAULTS.get(field, '')) for field in fields)
  return valid_session

The lines starting with ! represent the logic encapsulated by the
decorator -- the logic, I would have to copy into each function implementation
without it.

I then use it this way:

  @valid_session()
  def logout(self, sessionkey):
    s = get_session(sessionkey)
    s["authenticated"] = False
    return {}

  @valid_session("amountavail")
  def getStock(self, sessionkey, customer, item, amount):
    info = self._get_article(item)
    return {u"amountavail": info["deliverability"] and u"0" or u"1"}

  @valid_session("item", "shortdescription", "pe", "me", "min", "price", "vpe",
                 "stock", "linkpicture", "linkdetail", "linklist", "description", "tax")
  def fetchDetail(self, sessionkey, customer, item):
    return self._get_article(item)
  ...

I hope you can see that at least in this example, the use of the decorator
reduces redundancy and highly improves readability -- because
boilerplate code (check valid session, add default values for unspecified
fields) is not copied over and over again but isolated in a single place.


The example uses a second decorator (@decorator) --
in the decorator definition itself. This decorator comes from the
decorator module, a module facilitating the definition of signature
preserving decorators (important in my context): such a decorator
ensures that the decoration result has the same parameters as the
decorated function. To achieve this, complex Python implementation
details and Python's introspection must be used. And I am very
happy that I do not have to reproduce this logic in my decorator
definitions but just say @decorator :-)


Example 3: In another project, I had to implement a webservice
where most of the functions should return json serialized data
structures. As I like decorators, I chose a @json decorator.
Its definition looks like this:

@decorator
def json(f, self, *args, **kw):
  r = f(self, *args, **kw)
  self.request.response.setHeader(
    'content-type',
    # application/json made problems with the firewall,
    #  try text/json instead
    #'application/json; charset=utf-8'
    'text/json; charset=utf-8'
  )
  return udumps(r)

It calls the decorated function, then adds the correct content-type
header and finally returns the json serialized return value.

The webservice function definitions then look like:

@json
def f1(self, ):
   

@json
def f2(self, ...):
   

The function implementations can concentrate on their primary task.
The json decorator tells us that the result is (by magic specified
elsewhere) turned into a json serialized value.

This example demonstrates the improved maintainability (caused by
the redundancy reduction): the json rpc specification stipulates
the use of the application/json content type. Correspondingly,
I used this content-type header initially. However, many enterprise
firewalls try to protect against viruses by banning application/*
responses -- and in those environments, my initial webservice
implementation did not work. Thus, I changed the content type
to text/json. Thanks to the decorator encapsulation of the
json result logic, I could make my change at a single place -- not littered
all over the webservice implementation.


And a final example: Sometimes you are interested to cache (expensive)
function results. Caching involves non-trivial logic (determine the cache,
determine the key, check whether the cache contains a value for the key;
if not, call the function, cache the result). The package plone.memoize
defines a set of decorators (for different caching policies) with
which caching can be as easy as:

  @memoize
  def f():
  

The complete caching logic is encapsulated in the tiny @memoize prefix.
It tells us: calls to this function are cached. The function implementation
can concentrate on its primary task.

Re: Decorators not worth the effort

2012-09-15 Thread Thomas Rachel

[Sorry, my Firefox destroyed the indent...

Am 14.09.2012 22:29 schrieb Terry Reedy:


In other words

def make_wrapper(func, param):
    def wrapper(*args, **kwds):
        for i in range(param):
            func(*args, **kwds)
    return wrapper

def f(x): print(x)
f = make_wrapper(f, 2)
f('simple')

# is simpler, at least for some people, than the following
# which does essentially the same thing.

def make_outer(param):
    def make_inner(func):
        def wrapper(*args, **kwds):
            for i in range(param):
                func(*args, **kwds)
        return wrapper
    return make_inner

@make_outer(2)
def f(x): print(x)
f('complex')


For this case, my mydeco.py which I use quite often contains a

def indirdeco(ind):
    # Update both the outer as well as the inner wrapper.
    # If we knew the inner one was to be updated with something
    # from *a, **k, we could do it. But not this way...
    @functools.wraps(ind)
    def outer(*a, **k):
        @functools.wraps(ind)
        def inner(f):
            return ind(f, *a, **k)
        return inner
    return outer

so I can do

@indirdeco
def make_wrapper(func, param):
    @functools.wraps(func)
    def wrapper(*args, **kwds):
        for i in range(param):
            func(*args, **kwds)
    return wrapper

and then nevertheless

@make_wrapper(2)
def f(x): print(x)

BTW, I also have a meta-decorator for the other direction:

def wrapfunction(mighty):
    """Wrap a function taking (f, *a, **k) and replace it with a
    function taking (f) and returning a function taking (*a, **k) which
    calls our decorated function.
    Other direction than indirdeco."""
    @functools.wraps(mighty)
    def wrapped_outer(inner):
        @functools.wraps(inner)
        def wrapped_inner(*a, **k):
            return mighty(inner, *a, **k)
        wrapped_inner.func = inner  # keep the wrapped function
        wrapped_inner.wrapper = mighty  # and the replacement
        return wrapped_inner
    wrapped_outer.func = mighty  # keep this as well
    return wrapped_outer

With this, a

@wrapfunction
def twice(func, *a, **k):
    return func(*a, **k), func(*a, **k)

can be used with

@twice
def f(x): print (x); return x

very nicely.
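
For instance (a quick illustration, not part of the original post), the
doubly-wrapped f would behave like this:

result = f("hello")   # the wrapped body runs twice, printing "hello" twice
print(result)         # -> ('hello', 'hello')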


Thomas
--
http://mail.python.org/mailman/listinfo/python-list


Re: Decorators not worth the effort

2012-09-15 Thread 88888 Dihedral
Steven D'Aprano於 2012年9月15日星期六UTC+8上午7時39分28秒寫道:
 On Fri, 14 Sep 2012 15:16:47 -0600, Ian Kelly wrote:
 
 
 
  If only there were a conceptually simpler way to do this.  Actually,
 
  there is.  I give you: metadecorators!
 
 [code snipped but shown below]
 
  Which I think is certainly easier to understand than the nested
 
  functions approach.
 
 
 
 Maybe for you, but to me it is a big ball of mud. I have no idea how this 
 
 is supposed to work! At a quick glance, I would have sworn that it 
 
 *can't* work, since simple_decorator needs to see multiple arguments but 
 
 only receives one, the function to be decorated. And yet it does work:
 
 
 
 py from functools import partial
 
 py def make_wrapper(wrapper):
 
 ... return lambda wrapped: partial(wrapper, wrapped)
 
 ...
 
 py @make_wrapper
 
 ... def simple_decorator(func, *args, **kwargs):
 
 ... print "Entering decorated function"
 
 ... result = func(*args, **kwargs)
 
 ... print "Exiting decorated function"
 
 ... return result
 
 ...
 
 py @simple_decorator
 
 ... def my_function(a, b, c):
 
 ... """Doc string"""
 
 ... return a+b+c
 
 ...
 
 py my_function(1, 2, 3)
 
 Entering decorated function
 
 Exiting decorated function
 
 6
 
 
 
 So to me, this is far more magical than nested functions. If I saw this 
 
 in code, it requires me to hunt through your library for the simple function 
 
 buried in a utility module somewhere (your words), instead of seeing 
 
 everything needed in a single decorator factory function. It requires 
 
 that I understand how partial works, which in my opinion is quite tricky. 
 
 (I never remember how it works or which arguments get curried.)
 
 
 
 And the end result is that the decorated function is less debugging-
 
 friendly than I demand: it is an anonymous partial object instead of a 
 
 named function, and the doc string is lost. And it is far from clear to 
 
 me how to modify your recipe to use functools.wraps in order to keep the 
 
 name and docstring, or even whether I *can* use functools.wraps.
 
 
 
 I dare say I could answer all those questions with some experimentation 
 
 and research. But I don't think that your metadecorator using partial 
 
 is *inherently* more understandable than the standard decorator approach:
 
 
 
 def simple_decorator2(func):
 
 @functools.wraps(func)
 
 def inner(*args, **kwargs):
 
 print "Entering decorated function"
 
 result = func(*args, **kwargs)
 
 print "Exiting decorated function"
 
 return result
 
 return inner
 
 
 
 This is no more complex than yours, and it keeps the function name and 
 
 docstring.
 
 
 
 
 
  Parameterized decorators are not much more
 
  difficult this way.  This function:
 
 [snip code]
 
  And now we have a fancy parameterized decorator that again requires no
 
  thinking about nested functions at all.
 
 
 
 Again, at the cost of throwing away the function name and docstring.
 
 
 
 I realise that a lot of this boils down to personal preference, but I 
 
 just don't think that nested functions are necessarily that hard to 
 
 grasp, so I prefer to see as much of the decorator logic to be in one 
 
 place (a nested decorator function) rather than scattered across two 
 
 separate decorators plus partial.
 
 
 
 
 
 
 
 
 
 -- 
 
 Steven
 

I think the problem is not in the replaced f.__doc__.

def MIGHT_FAIL(f, MSG, *k, **h):
    # use MSG to determine whether to invoke f or not
    # and do an error catch here
    def inner(f):
        ...
        # get the right info of f here for any trapped error
        # return inner, result
    return inner

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Decorators not worth the effort

2012-09-15 Thread Dwight Hutto
On Sat, Sep 15, 2012 at 5:45 AM, 8 Dihedral
dihedral88...@googlemail.com wrote:
 Steven D'Aprano於 2012年9月15日星期六UTC+8上午7時39分28秒寫道:
 On Fri, 14 Sep 2012 15:16:47 -0600, Ian Kelly wrote:



  If only there were a conceptually simpler way to do this.  Actually,
  there is.  I give you: metadecorators!

 [rest of the quoted text snipped -- see Steven D'Aprano's reply above]
Like chi fu, allow decorators to evolve upon themselves. Like simple
moves flow through water and allow memorization of activity through
evidence of existence.


-- 
Best Regards,
David Hutto
CEO: http://www.hitwebdevelopment.com
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Decorators not worth the effort

2012-09-15 Thread 88888 Dihedral
David Hutto於 2012年9月15日星期六UTC+8下午6時04分28秒寫道:
 On Sat, Sep 15, 2012 at 5:45 AM, 8 Dihedral
 
 dihedral88...@googlemail.com wrote:
 
  Steven D'Aprano於 2012年9月15日星期六UTC+8上午7時39分28秒寫道:
   [quoted text snipped -- see Steven D'Aprano's reply above]

  Like chi fu, allow decorators to evolve upon themselves. Like simple
  moves flow through water and allow memorization of activity through
  evidence of existence.

  -- 
  Best Regards,


The concept of decorators is just a mapping from a function to another
function with the same name in python.


It should be easy to grasp for those who have studied real analysis and
functional analysis.
  
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Decorators not worth the effort

2012-09-14 Thread Dieter Maurer
 On Sep 14, 3:54 am, Jean-Michel Pichavant jeanmic...@sequans.com
 wrote:
 I don't like decorators, I think they're not worth the mental effort.

Fine.

I like them because they can vastly improve reusability and drastically
reduce redundancies (which I hate). Improved reusability and
reduced redundancies can make applications more readable, easier
to maintain and faster to develop.

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Decorators not worth the effort

2012-09-14 Thread Jean-Michel Pichavant
- Original Message -
 On Sep 14, 3:54 am, Jean-Michel Pichavant jeanmic...@sequans.com
 wrote:
  I don't like decorators, I think they're not worth the mental
  effort.
 
 Because passing a function to a function is a huge cognitive burden?
 --
 http://mail.python.org/mailman/listinfo/python-list
 

I was expecting that. Decorators are very popular so I kinda already know that 
the fault is mine. Now to the reason why I have troubles writing them, I don't 
know. Every time I did use decorators, I spent way too much time writing it 
(and debugging it).

I wrote the following one, used to decorate any function that accesses an 
equipment; it raises an exception when the timeout expires. The timeout is 
adapted to the platform, ASIC or FPGA, so people don't need to specify 
one timeout per platform every time.

In the end it would replace 

def boot(self, timeout=15):
    if FPGA:
        self.sendCmd("bootMe", timeout=timeout*3)
    else:
        self.sendCmd("bootMe", timeout=timeout)

with

@timeout(15)
def boot(self, timeout=None):
    self.sendCmd("bootMe", timeout)

I wrote nice documentation with sphinx to explain this, how to use it, how it 
can improve code. After spending hours on the decorator + doc, feedback from my 
colleagues: "What the F... !!"

Decorators are very python specific (they probably exist in any dynamic language 
though, I don't know). In some environments where people need to switch from C 
to python every day, decorators add python magic that not everyone is familiar 
with. For example, everyone in the team is able to understand and debug the 
undecorated version of the above boot method. I'm the only one capable of 
reading the decorated version. And don't flame my colleagues, they're amazing 
people (just in case they're reading this :p) who are not python developers, 
more like users.

Hence my original "decorators are not worth the mental effort". Context 
specific, I must admit.

Cheers,

JM

PS: Here's the decorator, just to give you an idea of how it looks. Small 
piece of code, but it took me more than 2 hours to write. I removed some 
sensitive parts so I don't expect it to run.

class timeout(object):
    """Substitute the timeout keyword argument with the appropriate value"""
    FACTORS = {
        IcebergConfig().platform.ASIC : 1,
        IcebergConfig().platform.FPGA : 3,
    }

    def __init__(self, asic, fpga=None, palladium=None):
        self.default = asic
        self.fpga = fpga

    def _getTimeout(self):
        platform = config().platform
        factor = self.FACTORS[platform.value]
        timeout = {
            platform.ASIC : self.default*factor,
            platform.FPGA : self.fpga or self.default*factor,
        }[platform.value]
        return timeout

    def __call__(self, func):
        def decorated(*args, **kwargs):
            names, _, _, defaults = inspect.getargspec(func)
            defaults = defaults or []
            if 'timeout' not in names:
                raise ValueError('A timeout keyword argument is required')
            if 'timeout' not in kwargs:  # the timeout keyword is not in the call
                index = names.index('timeout')
                argsLength = (len(names) - len(defaults))
                if index < argsLength:
                    raise NotImplementedError('This decorator does not '
                        'support a non keyword timeout argument')
                if index > len(args)-1:  # the timeout has not been passed
                                         # using a positional argument
                    timeoutDef = defaults[index-argsLength]
                    if timeoutDef is not None:
                        _log.warning("Decorating a function with a default "
                                     "timeout value != None")
                    kwargs['timeout'] = self._getTimeout()
            else:
                _log.warning('Timeout value specified during the call, '
                             'please check %s @timeout decorator.' % func.__name__)
            ret = func(*args, **kwargs)
            return ret
        return decorated
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Decorators not worth the effort

2012-09-14 Thread Duncan Booth
Jean-Michel Pichavant jeanmic...@sequans.com wrote:

 I wrote the following one, used to decorate any function that access
 an equipment, it raises an exception when the timeout expires. The
 timeout is adapted to the platform, ASIC of FPGA so people don't need
 to specify everytime one timeout per platform. 
 
 In the end it would replace 
 
 def boot(self, timeout=15):
 if FPGA:
 self.sendCmd(bootMe, timeout=timeout*3)
 else:
 self.sendCmd(bootMe, timeout=timeout)
 
 with
 
 @timeout(15)
 def boot(self, timeout=None):
 self.sendCmd(bootMe, timeout)
 
 I wrote a nice documentation with sphinx to explain this, how to use
 it, how it can improve code. After spending hours on the decorator +
 doc, feedback from my colleagues : What the F... !! 
 

I'd agree with your colleagues. How are you going to ensure that all 
relevant functions are decorated and yet no decorated function ever 
calls another decorated one?

From the code you posted it would seem appropriate that the adjustment 
of the timeout parameter happen in the `sendCmd()` method itself and 
nowhere else. Alternatively use named values for different categories of 
timeouts and adjust them on startup so instead of a default of `timeout=
15` you would have a default `timeout=MEDIUM_TIMEOUT` or whatever name 
is appropriate.
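
A rough sketch of that first suggestion (sendCmd, FPGA and the 15-second
default are taken from the thread; everything else is illustrative):

FPGA = False                           # assumed platform flag, as in the thread
PLATFORM_FACTOR = 3 if FPGA else 1     # decided once, at startup

class Equipment(object):
    def sendCmd(self, cmd, timeout=15):
        # adjust the timeout in one place, for every caller
        timeout *= PLATFORM_FACTOR
        print("sending %s with timeout=%s" % (cmd, timeout))

Equipment().sendCmd("bootMe")          # -> sending bootMe with timeout=15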

-- 
Duncan Booth http://kupuguy.blogspot.com
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Decorators not worth the effort

2012-09-14 Thread Steven D'Aprano
On Fri, 14 Sep 2012 11:28:22 +0200, Jean-Michel Pichavant wrote:

 PS : Here's the decorator, just to give you an idea about how it looks.
 Small piece of code, but took me more than 2 hours to write it. I
 removed some sensible parts so I don't expect it to run.

[snip timeout class]

Holy over-engineering Batman!!!

No wonder you don't think much of decorators, if this example of overkill 
is what you consider typical of them. It does much, much more than the 
simple code you were replacing:

def boot(self, timeout=15):
    if FPGA:
        self.sendCmd("bootMe", timeout=timeout*3)
    else:
        self.sendCmd("bootMe", timeout=timeout)

# becomes:

@timeout(15)
def boot(self, timeout=None):
    self.sendCmd("bootMe", timeout)


Most of my decorator functions are under a dozen lines. And that's the 
complicated ones!

Here's my solution to the example you gave:




# Untested!
def timeout(t=15):
    # Decorator factory. Return a decorator to actually do the work.
    if FPGA:
        t *= 3
    def decorator(func):
        @functools.wraps(func)
        def inner(self, timeout):
            self.sendCmd("bootMe", timeout=t)
        return inner
    return decorator


I reckon that will pretty much do what your example showed. Of course, 
once you start adding more and more functionality above the simple code 
shown above (arbitrary platforms, argument checking of the decorated 
function, logging, etc.) you're going to get a much more complex 
decorator. On the other hand, YAGNI.

http://en.wikipedia.org/wiki/You_ain%27t_gonna_need_it

 
-- 
Steven
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Decorators not worth the effort

2012-09-14 Thread Ulrich Eckhardt

Am 14.09.2012 11:28, schrieb Jean-Michel Pichavant:

Decorators are very popular so I kinda already know that the
fault is mine. Now to the reason why I have troubles writing
them, I don't know. Every time I did use decorators, I spent
way too much time writing it (and debugging it).

I wrote the following one, used to decorate any function that access
an equipment, it raises an exception when the timeout expires. The
timeout is adapted to the platform, ASIC of FPGA so people don't need
to specify everytime one timeout per platform.

In the end it would replace

def boot(self, timeout=15):
 if FPGA:
 self.sendCmd(bootMe, timeout=timeout*3)
 else:
 self.sendCmd(bootMe, timeout=timeout)

with

@timeout(15)
def boot(self, timeout=None):
 self.sendCmd(bootMe, timeout)

I wrote a nice documentation with sphinx to explain this, how to use
it, how it can improve code. After spending hours on the decorator +
doc, feedback from my colleagues : What the F... !!


Quite honestly: I think like your colleagues here, and that in 
this case the decorator doesn't improve the code. Instead, I would 
probably have added a _get_timeout() function that takes care of 
adjusting the argument passed to the function according to the 
underlying hardware target.


To be less abstract, the particular problem I have with your approach is 
that I can't even guess what your code means, let alone what parameters 
it actually takes. If you had written


  @default_timeout(15)
  def boot(self, timeout=None):

instead, I would have been able to guess. OTOH, then again I would have 
wondered why you used a decorator to create a default argument when 
there is builtin support for specifying default arguments for functions.


Maybe you could get away with a decorator like this:

  @adjust_timeout
  def boot(self, timeout=2.5):

The idea is that the decorator modifies the timeout value passed to the 
function (or maybe just modifies the default value?) according to the 
underlying hardware.




Decorators are very python specific (probably exists in any dynamic
language though, I don't know), in some environment where people need
to switch from C to python everyday, decorators add python magic that
not everyone is familiar with.


The same could be said for classes, iterators, significant whitespace, 
docstrings, lambdas. I think that this was just a bad example but it 
doesn't prove that decorators are worthless. Decorators are useful tools 
if they do something to a function, like doing something before or after 
the actual code, or modifying the context in which the code is called. 
Just setting a default parameter is possible as you have proved, but 
it's IMHO not a good use case.


A bit more specific to your case, adding a timeout decorator would 
actually make much more sense if it transparently invoked the actual 
function in a second thread and the calling thread stops waiting for 
completion and raises an error after that timeout. This has the distinct 
advantage that the code doing the actual communication doesn't have any 
timeout handling code inside.
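
A rough sketch of that thread-based idea (purely illustrative, not from the
original post), using only the standard library; note that a real
implementation would also need to propagate exceptions raised in the worker
thread:

import functools
import threading

def timeout_in_thread(seconds):
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            result = []
            def target():
                result.append(func(*args, **kwargs))
            worker = threading.Thread(target=target)
            worker.daemon = True        # don't keep the process alive for it
            worker.start()
            worker.join(seconds)
            if worker.is_alive():       # still running: give up waiting
                raise RuntimeError("timed out after %s seconds" % seconds)
            return result[0]
        return wrapper
    return decorator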


I'm currently doing something similar here though I only monitor a TCP 
connection that is used for some telnet-style requests. Every function 
making a request over TCP is decorated with @_check_connection. That 
decorator does two things:

1. It checks for an existing fatal connection error.
2. It runs the request and filters resulting errors for fatal connection 
errors.


The decorator looks like this:

def _check_connection(fn):
    @functools.wraps(fn)
    def wrapper(self, *args, **kwargs):
        # check for sticky connection errors
        if self._connection_error:
            raise self._connection_error
        # run actual function
        try:
            return fn(self, *args, **kwargs)
        except RequestFailed:
            # The other side signalled a failure, but
            # further requests can still succeed.
            raise
        except ConnectionError, e:
            # The connection is broken beyond repair.
            # Store the sticky connection error and forward.
            self._connection_error = e
            raise
    return wrapper

I have had other programmers here write such requests and they blindly 
copied the decorator from existing code. This works because the code 
inside that converts/formats/parses the inputs and outputs is completely 
unaware of the connection monitoring. Otherwise, I don't think anyone 
could explain what this decorator does, but they don't have to 
understand it either. It just works.


I wish you a nice weekend!

Uli

--
http://mail.python.org/mailman/listinfo/python-list


Re: Decorators not worth the effort

2012-09-14 Thread Tim Chase
On 09/14/12 07:01, Steven D'Aprano wrote:
 [snip timeout class]
 
 Holy over-engineering Batman!!!
 
 No wonder you don't think much of decorators,
[snip]

 Most of my decorator functions are under a dozen lines. And that's the 
 complicated ones!


As are mine, and a sizable chunk of those under-a-dozen-lines are
somewhat boilerplate like using @functools.wraps inside, actual def
of the function, and returning that function. :-)

-tkc



-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Decorators not worth the effort

2012-09-14 Thread Jean-Michel Pichavant


- Original Message -
 Jean-Michel Pichavant jeanmic...@sequans.com wrote:
 
  I wrote the following one, used to decorate any function that
  access
  an equipment, it raises an exception when the timeout expires. The
  timeout is adapted to the platform, ASIC of FPGA so people don't
  need
  to specify everytime one timeout per platform.
  
  In the end it would replace
  
  def boot(self, timeout=15):
  if FPGA:
  self.sendCmd(bootMe, timeout=timeout*3)
  else:
  self.sendCmd(bootMe, timeout=timeout)
  
  with
  
  @timeout(15)
  def boot(self, timeout=None):
  self.sendCmd(bootMe, timeout)
  
  I wrote a nice documentation with sphinx to explain this, how to
  use
  it, how it can improve code. After spending hours on the decorator
  +
  doc, feedback from my colleagues : What the F... !!
  
 
 I'd agree with your colleagues. How are you going to ensure that all
 relevant functions are decorated and yet no decorated function ever
 calls another decorated one?
 
 From the code you posted it would seem appropriate that the
 adjustment
 of the timeout parameter happen in the `sendCmd()` method itself and
 nowhere else. Alternatively use named values for different categories
 of
 timeouts and adjust them on startup so instead of a default of
 `timeout=
 15` you would have a default `timeout=MEDIUM_TIMEOUT` or whatever
 name
 is appropriate.
 
 --
 Duncan Booth http://kupuguy.blogspot.com
 --
 http://mail.python.org/mailman/listinfo/python-list

All functions set different timeout values, I cannot use a DEFAULT_VALUE.
All functions are designed in the same way:

def doSomeAction(self, timeout):
    preprocess()
    self.sendCmd('relatedAction', timeout)  # send the command to the device CLI interface
    postprocess()

Ultimately, the goal is to have something like

@timeout(2)
def doAction1

@timeout(4)
def doAction2

@timeout(12)
def doAction3

and so on... (1 second is important, because there's a platform, removed from 
my example since I didn't want to publicly advertise the tech used by the 
company, that runs 1000 times slower)

Additionally, the multiple checks I run within the decorator are for consistency 
and argument checking. I thought it was a good idea because our python engine is 
used by a dozen engineers to control equipment, and any misuse of this new 
decorator would lead to badly configured timeouts with heavy consequences on the 
test sessions. Sometimes an RTFM is not enough; when you need to make this work, 
you slip on your Batman costume like Steven suggested, and you save the day (or 
so I thought :) ) by raising nice exceptions about missing keyword arguments.

But let's forget about my poor example, I end up describing my life which is 
pretty pointless.

Here's Steven example:

# Untested!
def timeout(t=15):
    # Decorator factory. Return a decorator to actually do the work.
    if FPGA:
        t *= 3
    def decorator(func):
        @functools.wraps(func)
        def inner(self, timeout):
            self.sendCmd("bootMe", timeout=t)
        return inner
    return decorator

I can assure you that, for some python users, it's not easy to understand 
what it does, this function returning a function which returns another 
(wrapped) function. It requires some effort.

JM
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Decorators not worth the effort

2012-09-14 Thread andrea crotti
I think one very nice and simple example of how decorators can be used is this:

def memoize(f, cache={}, *args, **kwargs):
    def _memoize(*args, **kwargs):
        key = (args, str(kwargs))
        if not key in cache:
            cache[key] = f(*args, **kwargs)
        return cache[key]
    return _memoize

def fib(n):
    if n <= 1:
        return 1
    return fib(n-1) + fib(n-2)

@memoize
def fib_memoized(n):
    if n <= 1:
        return 1
    return fib_memoized(n-1) + fib_memoized(n-2)


The second fibonacci looks exactly the same but while the first is
very slow and would generate a stack overflow the second doesn't..

I might use this example for the presentation, before explaining what it is..
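
As an aside, on Python 3.2+ the standard library ships a ready-made memoizing
decorator; a minimal sketch of the same toy example with functools.lru_cache:

from functools import lru_cache

@lru_cache(maxsize=None)       # unbounded cache, like the hand-rolled memoize
def fib_cached(n):
    if n <= 1:
        return 1
    return fib_cached(n-1) + fib_cached(n-2)

print(fib_cached(100))         # fast, thanks to the cache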
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Decorators not worth the effort

2012-09-14 Thread Chris Angelico
On Sat, Sep 15, 2012 at 12:12 AM, andrea crotti
andrea.crott...@gmail.com wrote:
 def fib(n):
 if n = 1:
 return 1
 return fib(n-1) + fib(n-2)

 @memoize
 def fib_memoized(n):
 if n = 1:
 return 1
 return fib_memoized(n-1) + fib_memoized(n-2)


 The second fibonacci looks exactly the same but while the first is
 very slow and would generate a stack overflow the second doesn't..

Trouble is, you're starting with a pretty poor algorithm. It's easy to
improve on what's poor. Memoization can still help, but I would start
with a better algorithm, such as:

def fib(n):
    if n<=1: return 1
    a,b=1,1
    for i in range(1,n,2):
        a+=b
        b+=a
    return b if n%2 else a

def fib(n,cache=[1,1]):
    if n<=1: return 1
    while len(cache)<=n:
        cache.append(cache[-1] + cache[-2])
    return cache[n]

Personally, I don't mind (ab)using default arguments for caching, but
you could do the same sort of thing with a decorator if you prefer. I
think the non-decorated non-recursive version is clear and efficient
though.

ChrisA
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Decorators not worth the effort

2012-09-14 Thread Steven D'Aprano
On Fri, 14 Sep 2012 15:22:26 +0200, Jean-Michel Pichavant wrote:

 Here's Steven example:
 
 # Untested!
 def timeout(t=15):
 # Decorator factory. Return a decorator to actually do the work. if
 FPGA:
 t *= 3
 def decorator(func):
 @functools.wraps(func)
 def inner(self, timeout):
 self.sendCmd(bootMe, timeout=t)
 return inner
 return decorator
 
 I can assure you, that for some python users, it's is not easy to
 understand what it does, this function returning a function which
 returns another (wrapped) function. It requires some effort.

Oh I agree. So does learning to tie your shoe-laces, learning to cook, 
and learning to drive.

Learning to be a world-class chess master takes a lot of effort. Learning 
about decorators does not.


-- 
Steven
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Decorators not worth the effort

2012-09-14 Thread Jean-Michel Pichavant
- Original Message -
 On Fri, 14 Sep 2012 15:22:26 +0200, Jean-Michel Pichavant wrote:
 
  Here's Steven example:
  
  # Untested!
  def timeout(t=15):
  # Decorator factory. Return a decorator to actually do the
  work. if
  FPGA:
  t *= 3
  def decorator(func):
  @functools.wraps(func)
  def inner(self, timeout):
  self.sendCmd(bootMe, timeout=t)
  return inner
  return decorator
  
  I can assure you, that for some python users, it's is not easy to
  understand what it does, this function returning a function which
  returns another (wrapped) function. It requires some effort.
 
 Oh I agree. So does learning to tie your shoe-laces, learning to
 cook,
 and learning to drive.
 
 Learning to be a world-class chess master takes a lot of effort.
 Learning
 about decorators does not.
 
 
 --
 Steven

I said some effort, not a lot of effort. Something else that requires some 
effort is to make experts realize that some things they consider trivial and 
easy, aren't actually for a lot of people. 
Returning to my cave.

JM



-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Decorators not worth the effort

2012-09-14 Thread andrea crotti
2012/9/14 Chris Angelico ros...@gmail.com:

 Trouble is, you're starting with a pretty poor algorithm. It's easy to
 improve on what's poor. Memoization can still help, but I would start
 with a better algorithm, such as:

 def fib(n):
 if n=1: return 1
 a,b=1,1
 for i in range(1,n,2):
 a+=b
 b+=a
 return b if n%2 else a

 def fib(n,cache=[1,1]):
 if n=1: return 1
 while len(cache)=n:
 cache.append(cache[-1] + cache[-2])
 return cache[n]

 Personally, I don't mind (ab)using default arguments for caching, but
 you could do the same sort of thing with a decorator if you prefer. I
 think the non-decorated non-recursive version is clear and efficient
 though.

 ChrisA
 --
 http://mail.python.org/mailman/listinfo/python-list


The poor algorithm is much closer to the mathematical definition
than the smarter iterative one.  And in your second version you
include some ugly caching logic inside it, so why not use a
decorator then?

I'm not saying that with the memoization is the good solution, just
that I think it's a very nice example of how to use a decorator, and
maybe a good example to start with a talk on decorators..
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Decorators not worth the effort

2012-09-14 Thread 88888 Dihedral
Chris Angelico於 2012年9月14日星期五UTC+8下午10時41分06秒寫道:
 On Sat, Sep 15, 2012 at 12:12 AM, andrea crotti
 
 andrea.crott...@gmail.com wrote:
 
  def fib(n):
 
  if n = 1:
 
  return 1
 
  return fib(n-1) + fib(n-2)
 
 
 
  @memoize
 
  def fib_memoized(n):
 
  if n = 1:
 
  return 1
 
  return fib_memoized(n-1) + fib_memoized(n-2)
 
 
 
 
 
  The second fibonacci looks exactly the same but while the first is
 
  very slow and would generate a stack overflow the second doesn't..
 
 
 
 Trouble is, you're starting with a pretty poor algorithm. It's easy to
 
 improve on what's poor. Memoization can still help, but I would start
 
 with a better algorithm, such as:
 
 
 
 def fib(n):
 
   if n=1: return 1
 
   a,b=1,1
 
   for i in range(1,n,2):
 
   a+=b
 
   b+=a
 
   return b if n%2 else a
 
 
 
 def fib(n,cache=[1,1]):
 
   if n=1: return 1
 
   while len(cache)=n:
 
   cache.append(cache[-1] + cache[-2])
 
   return cache[n]
 
 
 
 Personally, I don't mind (ab)using default arguments for caching, but
 
 you could do the same sort of thing with a decorator if you prefer. I
 
 think the non-decorated non-recursive version is clear and efficient
 
 though.
 
 
 
 ChrisA

Uhn, the decorator part is good for wrapping functions in python.

For example, a decorator can be used to add a layer of message handling
so that plain functions become iterators which can be used as callback
functions in a more elegant way.
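
One concrete (and purely illustrative, not from the original post) instance of
that idea is the common "prime the generator" decorator, which turns a
generator function into a ready-to-use callback:

import functools

def coroutine(genfunc):
    @functools.wraps(genfunc)
    def start(*args, **kwargs):
        gen = genfunc(*args, **kwargs)
        next(gen)                # advance to the first yield
        return gen
    return start

@coroutine
def printer():
    while True:
        msg = (yield)            # each .send() delivers one message here
        print("got %r" % msg)

cb = printer()
cb.send("hello")                 # -> got 'hello'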



-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Decorators not worth the effort

2012-09-14 Thread Chris Angelico
On Sat, Sep 15, 2012 at 2:15 AM, andrea crotti
andrea.crott...@gmail.com wrote:
 The poor algorithm is much more close to the mathematical definition
 than the smarter iterative one..  And in your second version you
 include some ugly caching logic inside it, so why not using a
 decorator then?

I learned Fibonacci as a sequence, not as a recursive definition. So
the algorithm I coded (the non-caching one) is pretty much how I
learned it in mathematics. But yes, you're right that the caching is
inherent to the second version; and yes, that's where a decorator can
make it a LOT cleaner.

As a demo of recursion and decorators, your original function pair is
definitely the best. But if you want to be able to calculate fib(n)
for any n without blowing your stack, my version will scale much more
safely.

But then again, who actually ever needs fibonacci numbers?

ChrisA
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Decorators not worth the effort

2012-09-14 Thread Terry Reedy

On 9/13/2012 10:12 PM, Cameron Simpson wrote:

On 13Sep2012 18:58, alex23 wuwe...@gmail.com wrote:
| On Sep 14, 3:54 am, Jean-Michel Pichavant jeanmic...@sequans.com
| wrote:
|  I don't like decorators, I think they're not worth the mental effort.
|
| Because passing a function to a function is a huge cognitive burden?


For parameterized decorators, there is extra cognitive burden. See below.



It is for me when I'm _writing_ the decorator:-) But if I get it right
and name it well I find it dramaticly _decreases_ the cognitive burden
of the code using the decorator...


For a simple, unparameterized wrapper, the difficulty is entirely in the 
wrapper maker. It must define the final wrapper as a nested function and 
return it*. It is irrelevant whether the wrapper maker is used with 
pre-def decorator syntax or with an explicit post-def call.


*I am here ignoring the option of a class with __call__ method.

For a parameterized wrapper, using decorator syntax requires passing the 
parameter(s) first and the function to be wrapped later. This requires 
currying the wrapper maker with double nesting. The nesting order may 
seem inside-out to some. For most people, this is extra work compared to 
writing a wrapper that accepts the function and parameters together and 
only has a single level of nesting.


In other words

def make_wrapper(func, param):
    def wrapper(*args, **kwds):
        for i in range(param):
            func(*args, **kwds)
    return wrapper

def f(x): print(x)
f = make_wrapper(f, 2)
f('simple')

# is simpler, at least for some people, than the following
# which does essentially the same thing.

def make_outer(param):
    def make_inner(func):
        def wrapper(*args, **kwds):
            for i in range(param):
                func(*args, **kwds)
        return wrapper
    return make_inner

@make_outer(2)
def f(x): print(x)
f('complex')

Is the gain of not repeating the wrapped function name twice in the 
post-def wrapping call, and the gain of knowing the function will be 
wrapped before reading the def, worth the pain of currying the wrapper 
maker?


--
Terry Jan Reedy

--
http://mail.python.org/mailman/listinfo/python-list


Re: Decorators not worth the effort

2012-09-14 Thread Terry Reedy

On 9/14/2012 5:28 AM, Jean-Michel Pichavant wrote:

Decorators are very popular so I kinda already know that the fault is mine. Now 
to the reason why I have troubles writing them, I don't know. Every time I did 
use decorators, I spent way too much time writing it (and debugging it).



--
Terry Jan Reedy

--
http://mail.python.org/mailman/listinfo/python-list


Re: Decorators not worth the effort

2012-09-14 Thread Terry Reedy

2nd try, hit send button by mistake before

On 9/14/2012 5:28 AM, Jean-Michel Pichavant wrote:


Decorators are very popular so I kinda already know that the fault is
mine. Now to the reason why I have troubles writing them, I don't
know. Every time I did use decorators, I spent way too much time
writing it (and debugging it).


You are writing parameterized decorators, which require inverted 
currying of the wrapper maker. Perhaps that is why you have trouble. As 
I showed in response to Cameron, it may be easier to avoid that by using 
a traditional post-def wrapping call instead of decorator syntax.


--
Terry Jan Reedy

--
http://mail.python.org/mailman/listinfo/python-list


Re: Decorators not worth the effort

2012-09-14 Thread Terry Reedy

On 9/14/2012 4:29 PM, Terry Reedy wrote:

On 9/13/2012 10:12 PM, Cameron Simpson wrote:

[quoted text snipped -- identical to Terry Reedy's post above]




--
Terry Jan Reedy

--
http://mail.python.org/mailman/listinfo/python-list


Re: Decorators not worth the effort

2012-09-14 Thread Ian Kelly
On Fri, Sep 14, 2012 at 2:29 PM, Terry Reedy tjre...@udel.edu wrote:
 For a simple, unparameterized wrapper, the difficulty is entirely in the
 wrapper maker. It must define the final wrapper as a nested function and
 return it*. It is irrelevant whether the wrapper maker is used with pre-def
 decorator syntax or with an explicit post-def call.

 *I am here ignoring the option of a class with __call__ method.

 For a parameterized wrapper, using decorator syntax requires passing the
 parameter(s) first and the function to be wrapped later. This requires
 currying the wrapper maker with double nesting. The nesting order may seem
 inside-out to some. For most people, this is extra work compared to writing
 a wrapper that accepts the function and parameters together and only has a
 single level of nesting.

 In other words

 def make_wrapper(func, param):
 def wrapper(*args, **kwds):
 for i in range(param):
 func(*args, **kwds)
 return wrapper

 def f(x): print(x)
 f = make_wrapper(f, 2)
 f('simple')

 # is simpler, at least for some people, than the following
 # which does essentially the same thing.

 def make_outer(param):
 def make_inner(func):
 def wrapper(*args, **kwds):
 for i in range(param):
 func(*args, **kwds)
 return wrapper
 return make_inner

 @make_outer(2)
 def f(x): print(x)
 f('complex')

 Is the gain of not repeating the wrapped function name twice in the post-def
 wrapping call, and the gain of knowing the function will be wrapped before
 reading the def, worth the pain of currying the wrapper maker?

If only there were a conceptually simpler way to do this.  Actually,
there is.  I give you: metadecorators!

First, the simple, non-parameterized case:

from functools import partial

def make_wrapper(wrapper):
    return lambda wrapped: partial(wrapper, wrapped)

With that simple function buried in a utility module somewhere, we can do:

@make_wrapper
def simple_decorator(func, *args, **kwargs):
    do_stuff()
    result = func(*args, **kwargs)
    do_more_stuff()
    return result

Which I think is certainly easier to understand than the nested
functions approach.  Parameterized decorators are not much more
difficult this way.  This function:

def make_parameterized_wrapper(wrapper):
    return lambda *params: lambda wrapped: partial(wrapper, wrapped, params)

enables us to write:

@make_parameterized_wrapper
def complex_decorator(func, (param1, param2, param3), *args, **kwargs):
    do_stuff(param1, param2)
    result = func(*args, **kwargs)
    do_more_stuff(param2, param3)
    return result

And now we have a fancy parameterized decorator that again requires no
thinking about nested functions at all.  Sadly, that last bit of
syntax will only work in Python 2; tuple parameter unpacking was
removed in Python 3.  It's not a complicated upgrade path, however:

@make_parameterized_wrapper
def complex_decorator(func, params, *args, **kwargs):
    (param1, param2, param3) = params
    do_stuff(param1, param2)
    result = func(*args, **kwargs)
    do_more_stuff(param2, param3)
    return result
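
Usage would then look something like this, I believe (untested; do_stuff
and do_more_stuff stand in for whatever the real decorator needs to do):

@complex_decorator('alpha', 'beta', 'gamma')
def my_function(x, y):
    return x + y

# Calling my_function(1, 2) now runs do_stuff('alpha', 'beta'), then the
# original function, then do_more_stuff('beta', 'gamma'), and returns 3.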

Cheers,
Ian
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Decorators not worth the effort

2012-09-14 Thread Joshua Landau
On 14 September 2012 18:30, Chris Angelico ros...@gmail.com wrote:

 On Sat, Sep 15, 2012 at 2:15 AM, andrea crotti
 andrea.crott...@gmail.com wrote:
  The poor algorithm is much closer to the mathematical definition
  than the smarter iterative one.  And in your second version you
  include some ugly caching logic inside it, so why not use a
  decorator then?

 I learned Fibonacci as a sequence, not as a recursive definition. So
 the algorithm I coded (the non-caching one) is pretty much how I
 learned it in mathematics. But yes, you're right that the caching is
 inherent to the second version; and yes, that's where a decorator can
 make it a LOT cleaner.

 As a demo of recursion and decorators, your original function pair is
 definitely the best. But if you want to be able to calculate fib(n)
 for any n without blowing your stack, my version will scale much more
 safely.

 But then again, who actually ever needs fibonacci numbers?


I thought the example was good, not because a recursive fib is useful but
because memoizing is. There are a lot of times one would like to memoize
a function, not just recursive ones. Thus, the example of the decorator was
valid.
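
For instance, a minimal memoizing decorator is only a few lines (an
untested sketch; Python 3.2+ also ships functools.lru_cache, which does
this properly):

import functools

def memoize(func):
    cache = {}
    @functools.wraps(func)
    def wrapper(*args):
        # Only hashable positional arguments are supported in this sketch.
        if args not in cache:
            cache[args] = func(*args)
        return cache[args]
    return wrapper

@memoize
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)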

[offtopic]
Anyhow, the best method has to be this:

>>> from decimal import Decimal as Dec
>>> def fib(n):
...     rootFive = Dec(5).sqrt()
...     phi = (1 + rootFive) / 2
...     return round(phi**n / rootFive)
>>> fib(100)
354224848179261915075

It's just so obvious why it works.
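
(That closed form is Binet's formula, fib(n) = round(phi**n / sqrt(5));
the rounding is exact because the discarded term (1 - phi)**n / sqrt(5)
always has magnitude below 1/2 -- at least until the fixed Decimal
precision starts to bite for very large n.)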
[/offtopic]
-- 
http://mail.python.org/mailman/listinfo/python-list


RE: Decorators not worth the effort

2012-09-14 Thread Prasad, Ramit
Jean-Michel Pichavant wrote:

[snip]

 Ultimately, the goal is to have something like
 
 @timeout(2)
 def doAction1(): ...
 
 @timeout(4)
 def doAction2(): ...

[snip]

 Here's Steven's example:
 
 # Untested!
 def timeout(t=15):
     # Decorator factory. Return a decorator to actually do the work.
     if FPGA:
         t *= 3
     def decorator(func):
         @functools.wraps(func)
         def inner(self, timeout):
             self.sendCmd("bootMe", timeout=t)
         return inner
     return decorator
 
 I can assure you that for some Python users it is not easy to understand
 what it does, this function returning a function which returns another
 (wrapped) function. It requires some effort.
 

I think it would help if it was renamed to set_timeout. And I would 
not expect the Python user to need to understand how it *works*, just 
to recognize what it *does* when it is used. I may not understand list's 
sort method internals (beyond the use of timsort), but I know how to 
use it to sort a list as I want. That is usually all I need.
 

For example, your colleagues just need to understand that the below
decorator is setting a timeout for the function.

@set_timeout(min=15)
def some_function():
    '''blah'''
    # ... code ...


One minor note, the style of decorator you are using loses the docstring
(at least) of the original function. I would add the @functools.wraps(func) 
decorator inside your decorator.
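
A quick way to see what that buys you -- an untested sketch, where the
decorator body is a trivial stand-in rather than the real timeout logic:

import functools

def set_timeout(t=15):
    def decorator(func):
        @functools.wraps(func)   # copies __name__, __doc__, etc. onto inner
        def inner(*args, **kwargs):
            return func(*args, **kwargs)
        return inner
    return decorator

@set_timeout(2)
def do_action():
    '''This docstring survives because of functools.wraps.'''

print(do_action.__name__)   # prints 'do_action', not 'inner'
print(do_action.__doc__)    # prints the original docstring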

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Decorators not worth the effort

2012-09-14 Thread Steven D'Aprano
On Fri, 14 Sep 2012 15:16:47 -0600, Ian Kelly wrote:

 If only there were a conceptually simpler way to do this.  Actually,
 there is.  I give you: metadecorators!
[code snipped but shown below]
 Which I think is certainly easier to understand than the nested
 functions approach.

Maybe for you, but to me it is a big ball of mud. I have no idea how this 
is supposed to work! At a quick glance, I would have sworn that it 
*can't* work, since simple_decorator needs to see multiple arguments but 
only receives one, the function to be decorated. And yet it does work:

py> from functools import partial
py> def make_wrapper(wrapper):
...     return lambda wrapped: partial(wrapper, wrapped)
...
py> @make_wrapper
... def simple_decorator(func, *args, **kwargs):
...     print "Entering decorated function"
...     result = func(*args, **kwargs)
...     print "Exiting decorated function"
...     return result
...
py> @simple_decorator
... def my_function(a, b, c):
...     "Doc string"
...     return a+b+c
...
py> my_function(1, 2, 3)
Entering decorated function
Exiting decorated function
6

So to me, this is far more magical than nested functions. If I saw this
in unfamiliar code, it would require me to hunt through your library for
the simple function "buried in a utility module somewhere" (your words),
instead of seeing everything needed in a single decorator factory
function. It requires that I understand how partial works, which in my
opinion is quite tricky. (I never remember how it works or which
arguments get curried.)
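
A quick check in the interpreter, mostly to remind myself: positional
arguments are frozen from the left, so partial(f, 1) behaves like
lambda *a, **kw: f(1, *a, **kw).

py> def f(a, b, c):
...     return (a, b, c)
...
py> partial(f, 1)(2, 3)
(1, 2, 3)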

And the end result is that the decorated function is less debugging-
friendly than I demand: it is an anonymous partial object instead of a 
named function, and the doc string is lost. And it is far from clear to 
me how to modify your recipe to use functools.wraps in order to keep the 
name and docstring, or even whether I *can* use functools.wraps.

I dare say I could answer all those questions with some experimentation 
and research. But I don't think that your metadecorator using partial 
is *inherently* more understandable than the standard decorator approach:

import functools

def simple_decorator2(func):
    @functools.wraps(func)
    def inner(*args, **kwargs):
        print "Entering decorated function"
        result = func(*args, **kwargs)
        print "Exiting decorated function"
        return result
    return inner

This is no more complex than yours, and it keeps the function name and 
docstring.


 Parameterized decorators are not much more
 difficult this way.  This function:
[snip code]
 And now we have a fancy parameterized decorator that again requires no
 thinking about nested functions at all.

Again, at the cost of throwing away the function name and docstring.

I realise that a lot of this boils down to personal preference, but I
just don't think that nested functions are necessarily that hard to
grasp, so I prefer to have as much of the decorator logic as possible in
one place (a nested decorator function) rather than scattered across two
separate decorators plus partial.




-- 
Steven
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Decorators not worth the effort

2012-09-14 Thread Steve Howell
On Sep 14, 6:05 am, Tim Chase python.l...@tim.thechases.com wrote:
 On 09/14/12 07:01, Steven D'Aprano wrote: [snip timeout class]

  Holy over-engineering Batman!!!

  No wonder you don't think much of decorators,

 [snip]

  Most of my decorator functions are under a dozen lines. And that's the
  complicated ones!

 As are mine, and a sizable chunk of those under-a-dozen lines are
 somewhat boilerplate, like using @functools.wraps inside, the actual def
 of the function, and returning that function. :-)

 -tkc

For parameterized decorators, I've usually seen the pattern below.
Basically, you have 6 lines of boilerplate, and 2 lines of signal.
The amount of boilerplate is fairly daunting, but I like the
explicitness, and the nature of decorators is that they tend to get a
lot of reuse, so you can amortize the pain of all the boilerplate.

import functools

def hello_world(name): # non-boilerplate signature
    def decorator(f):
        @functools.wraps(f)
        def wrapped(*args, **kw):
            print 'hello', name # non-boilerplate value-add
            f(*args, **kw)
        return wrapped
    return decorator

@hello_world('earth')
def add(x, y):
    print x + y

add(2, 2)
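
(Which, assuming I have the boilerplate right, prints "hello earth"
followed by 4.)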
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Decorators not worth the effort

2012-09-14 Thread Dwight Hutto
On Fri, Sep 14, 2012 at 2:40 AM, Dieter Maurer die...@handshake.de wrote:
 On Sep 14, 3:54 am, Jean-Michel Pichavant jeanmic...@sequans.com
 wrote:
 I don't like decorators, I think they're not worth the mental effort.

 Fine.

 I like them because they can vastly improve reusability and drastically
 reduce redundancies (which I hate). Improved reusability and
 reduced redundancies can make applications more readable, easier
 to maintain and faster to develop.

That decorators reduce redundancy is debatable.

To me, a decorator is no more than a logging function. Correct me if
I'm wrong. It throws things at a function designed to watch other
functions.

The necessity for more than one decorator with if/else statements
seems redundant, but I haven't had to use them that much recently.


 --
 http://mail.python.org/mailman/listinfo/python-list



-- 
Best Regards,
David Hutto
CEO: http://www.hitwebdevelopment.com
-- 
http://mail.python.org/mailman/listinfo/python-list


Decorators not worth the effort

2012-09-13 Thread alex23
On Sep 14, 3:54 am, Jean-Michel Pichavant jeanmic...@sequans.com
wrote:
 I don't like decorators, I think they're not worth the mental effort.

Because passing a function to a function is a huge cognitive burden?
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Decorators not worth the effort

2012-09-13 Thread Cameron Simpson
On 13Sep2012 18:58, alex23 wuwe...@gmail.com wrote:
| On Sep 14, 3:54 am, Jean-Michel Pichavant jeanmic...@sequans.com
| wrote:
|  I don't like decorators, I think they're not worth the mental effort.
| 
| Because passing a function to a function is a huge cognitive burden?

It is for me when I'm _writing_ the decorator :-) But if I get it right
and name it well I find it dramatically _decreases_ the cognitive burden
of the code using the decorator...
-- 
Cameron Simpson c...@zip.com.au

Observing the first balloon ascent in Paris, [Ben] Franklin heard a scoffer
ask, "What good is it?"  He spoke for a generation of scientists in
his retort, "What good is a newly born infant?" - John F. Kasson
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Decorators not worth the effort

2012-09-13 Thread alex23
On Sep 14, 12:12 pm, Cameron Simpson c...@zip.com.au wrote:
 On 13Sep2012 18:58, alex23 wuwe...@gmail.com wrote:
 | On Sep 14, 3:54 am, Jean-Michel Pichavant jeanmic...@sequans.com
 | wrote:
 |  I don't like decorators, I think they're not worth the mental effort.
 |
 | Because passing a function to a function is a huge cognitive burden?

 It is for me when I'm _writing_ the decorator:-) But if I get it right
 and name it well I find it dramatically _decreases_ the cognitive burden
 of the code using the decorator...

Okay, I will concede that point :)
-- 
http://mail.python.org/mailman/listinfo/python-list