Re: Question About When Objects Are Destroyed (continued)

2017-08-07 Thread Grant Edwards
On 2017-08-05, Tim Daneliuk  wrote:
> On 08/05/2017 03:21 PM, Chris Angelico wrote:

>> so the object's lifetime shouldn't matter to you.
>
> I disagree with this most strongly.  That's only true when the
> machine resources being consumed by your Python object are small in
> size.  But when you're dynamically cranking out millions of objects
> of relatively short lifetime, you can easily bump into the real
> world limits of practical machinery.  "Wait until the reference
> count sweep gets rid of it" only works when you have plenty of room
> to squander.

I've been writing Python applications for almost 20 years.  I've never
paid any attention _at_all_ (none, zero) to object lifetimes, and it's
never caused any problems for me.  Admittedly they didn't involve
gigabytes of data, but many of them ran for days at a time...

-- 
Grant Edwards   grant.b.edwards at gmail.com
Yow! Jesuit priests are DATING CAREER DIPLOMATS!!

-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Question About When Objects Are Destroyed (continued)

2017-08-05 Thread Tim Daneliuk
On 08/05/2017 05:36 PM, Ned Batchelder wrote:
> On 8/5/17 5:41 PM, Tim Daneliuk wrote:
>> On 08/05/2017 11:16 AM, Ned Batchelder wrote:
>>> It uses
>>> reference counting, so most objects are reclaimed immediately when their
>>> reference count goes to zero, such as at the end of local scopes. 
>> Given this code:
>>
>> class SomeObject:
>> .
>>
>>
>> for foo in somelist:
>>
>>a = SomeObject(foo)
>>b = SomeObject(foo)
>>c = SomeObject(foo)
>>
>># Do something or other
>>...
>>
>># Bottom of 'for' scope
>>
>>
>> Are you saying that each time a,b,c are reassigned to new instances of
>> SomeObject the old instance counts go to 0 and are immediately - as in
>> synchronously, right now, on the spot - removed from memory?  
> Yes, that is what I am saying.  In CPython, that is.  Other
> implementations can behave differently. Jython and IronPython use the
> garbage collectors from the JVM and .NET; I don't know specifically how
> they behave.
>> My
>> understanding was (and I may well be wrong), that the reference count
>> does get decremented - in this case to 0 - but the *detection* of that
>> fact does not happen until the gc sweep looks through the heap for such
>> stale objects.
> That is how classic garbage collectors worked.  And Python has something
> like that, but it's only used to collect circular structures, where the
> reference counts will never go to zero, but nevertheless the entire
> structure can be unreferenced as a whole.
> 
> --Ned.
> 


Interesting.  I haz a confuzed.  Thanks for clearing that up.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Question About When Objects Are Destroyed (continued)

2017-08-05 Thread Tim Daneliuk
On 08/05/2017 05:58 PM, Chris Angelico wrote:
> On Sun, Aug 6, 2017 at 7:32 AM, Tim Daneliuk  wrote:
>> On 08/05/2017 03:21 PM, Chris Angelico wrote:
>>> After a 'with' block,
>>> the object *still exists*, but it has been "exited" in some way
>>> (usually by closing/releasing an underlying resource).
>>
>> The containing object exists, but the things that the closing
>> logic explicitly released do not.  In some sense, a context
>> acts like a destructor, just not on the object it's associated
>> with.
>>
>>> If there's a resource you need to clean up, you clean that up
>>> explicitly,
>>
>> Such "resources" *are* objects themselves notionally.  You are exactly
>> killing those objects to free the underlying resources they consume.
> 
> Utterly irrelevant. The original post was about the string in memory.
> An "open file" is no more an object than the part of a floating point
> number after the decimal is.
> 
>>> so the object's lifetime shouldn't matter to you.
>>
>> I disagree with this most strongly.  That's only true when the machine
>> resources being consumed by your Python object are small in size.  But
>> when you're dynamically cranking out millions of objects of relatively
>> short lifetime, you can easily bump into the real world limits of
>> practical machinery.  "Wait until the reference count sweep gets rid of
>> it" only works when you have plenty of room to squander.
>>
>> Also, waiting for the reference count/gc to do its thing is
>> nondeterministic in time.  It's going to happen sooner or later, but not
>> at the same or a predictable interval.  If you want to write large,
>> performant code, you don't want this kind of variability.  While I
>> realize that we're not typically writing embedded realtime drivers in
>> Python, the principle remains - where possible make things as
>> predictable and repeatable as you can.
>>
>> For reasons I am not free to discuss here, I can say with some assurance
>> that there are real world applications where managing Python object
>> lifetimes is very much indicated.
> 
> Very VERY few. How often do you actually care about the lifetime of a
> specific Python object, and not (say) about the return of a block of
> memory to the OS? Memory in CPython is allocated in pages, and those
> pages are then suballocated into objects (or other uses). Sometimes
> you care about that block going back to the OS; other times, all you
> care about is that a subsequent allocation won't require more memory
> (which can be handled with free lists). But most of the time, you
> don't need to think about either, because the language *does the right
> thing*. The nondeterminism of the GC is irrelevant to most Python
> programs; in CPython, that GC sweep applies only to reference *cycles*
> (and to weak references, I think??), so unless you frequently create
> those, you shouldn't have to care.
> 
> I've written plenty of large programs in high level languages. Some of
> them in Python, some in Pike (which has the same refcount semantics),
> and some in REXX (which has very different technical semantics but
> comes to the same thing). I've had those programs running for months
> on end; in more than one instance, I've had a program running for over
> a year (over two years, even) without restarting the process or
> anything. Aside from taking care not to create cyclic references, I
> have not needed to care about when the garbage collector runs, with
> the sole exception of an instance where I built my own system on top
> of the base GC (using weak references and an autoloader to emulate a
> lookup table larger than memory). So yes, I maintain that most of the
> time, object lifetimes *should not matter* to a Python programmer.
> Python is not C, and you shouldn't treat it as C.
> 
> ChrisA
> 


OK, noted, and thanks for the clear explanation.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Question About When Objects Are Destroyed (continued)

2017-08-05 Thread Chris Angelico
On Sun, Aug 6, 2017 at 7:32 AM, Tim Daneliuk  wrote:
> On 08/05/2017 03:21 PM, Chris Angelico wrote:
>> After a 'with' block,
>> the object *still exists*, but it has been "exited" in some way
>> (usually by closing/releasing an underlying resource).
>
> The containing object exists, but the things that the closing
> logic explicitly released do not.  In some sense, a context
> acts like a destructor, just not on the object it's associated
> with.
>
>> If there's a resource you need to clean up, you clean that up
>> explicitly,
>
> Such "resources" *are* objects themselves notionally.  You are exactly
> killing those objects to free the underlying resources they consume.

Utterly irrelevant. The original post was about the string in memory.
An "open file" is no more an object than the part of a floating point
number after the decimal is.

>> so the object's lifetime shouldn't matter to you.
>
> I disagree with this most strongly.  That's only true when the machine
> resources being consumed by your Python object are small in size.  But
> when you're dynamically cranking out millions of objects of relatively
> short lifetime, you can easily bump into the real world limits of
> practical machinery.  "Wait until the reference count sweep gets rid of
> it" only works when you have plenty of room to squander.
>
> Also, waiting for the reference count/gc to do its thing is
> nondeterministic in time.  It's going to happen sooner or later, but not
> at the same or a predictable interval.  If you want to write large,
> performant code, you don't want this kind of variability.  While I
> realize that we're not typically writing embedded realtime drivers in
> Python, the principle remains - where possible make things as
> predictable and repeatable as you can.
>
> For reasons I am not free to discuss here, I can say with some assurance
> that there are real world applications where managing Python object
> lifetimes is very much indicated.

Very VERY few. How often do you actually care about the lifetime of a
specific Python object, and not (say) about the return of a block of
memory to the OS? Memory in CPython is allocated in pages, and those
pages are then suballocated into objects (or other uses). Sometimes
you care about that block going back to the OS; other times, all you
care about is that a subsequent allocation won't require more memory
(which can be handled with free lists). But most of the time, you
don't need to think about either, because the language *does the right
thing*. The nondeterminism of the GC is irrelevant to most Python
programs; in CPython, that GC sweep applies only to reference *cycles*
(and to weak references, I think??), so unless you frequently create
those, you shouldn't have to care.
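
If you want to watch that happen, here's a rough CPython-only sketch (the
Node class is just a throwaway placeholder); a weak reference observes an
object without keeping it alive, so it is cleared the instant the last
strong reference goes away:

    import weakref

    class Node:
        pass

    n = Node()
    r = weakref.ref(n)     # weak reference: does not add to the refcount
    print(r())             # <__main__.Node object at 0x...>
    n = None               # last strong reference dropped; CPython reclaims it here
    print(r())             # None -- the weak reference has already been cleared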

I've written plenty of large programs in high level languages. Some of
them in Python, some in Pike (which has the same refcount semantics),
and some in REXX (which has very different technical semantics but
comes to the same thing). I've had those programs running for months
on end; in more than one instance, I've had a program running for over
a year (over two years, even) without restarting the process or
anything. Aside from taking care not to create cyclic references, I
have not needed to care about when the garbage collector runs, with
the sole exception of an instance where I built my own system on top
of the base GC (using weak references and an autoloader to emulate a
lookup table larger than memory). So yes, I maintain that most of the
time, object lifetimes *should not matter* to a Python programmer.
Python is not C, and you shouldn't treat it as C.

ChrisA
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Question About When Objects Are Destroyed (continued)

2017-08-05 Thread Ned Batchelder
On 8/5/17 5:41 PM, Tim Daneliuk wrote:
> On 08/05/2017 11:16 AM, Ned Batchelder wrote:
>> It uses
>> reference counting, so most objects are reclaimed immediately when their
>> reference count goes to zero, such as at the end of local scopes. 
> Given this code:
>
> class SomeObject:
> .
>
>
> for foo in somelist:
>
>a = SomeObject(foo)
>b = SomeObject(foo)
>c = SomeObject(foo)
>
># Do something or other
>...
>
># Bottom of 'for' scope
>
>
> Are you saying that each time a,b,c are reassigned to new instances of
> SomeObject the old instance counts go to 0 and are immediately - as in
> synchronously, right now, on the spot - removed from memory?  
Yes, that is what I am saying.  In CPython, that is.  Other
implementations can behave differently. Jython and IronPython use the
garbage collectors from the JVM and .NET; I don't know specifically how
they behave.
> My
> understanding was (and I may well be wrong), that the reference count
> does get decremented - in this case to 0 - but the *detection* of that
> fact does not happen until the gc sweep looks through the heap for such
> stale objects.
That is how classic garbage collectors worked.  And Python has something
like that, but it's only used to collect circular structures, where the
reference counts will never go to zero, but nevertheless the entire
structure can be unreferenced as a whole.

--Ned.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Question About When Objects Are Destroyed (continued)

2017-08-05 Thread MRAB

On 2017-08-05 22:41, Tim Daneliuk wrote:

On 08/05/2017 11:16 AM, Ned Batchelder wrote:

It uses
reference counting, so most objects are reclaimed immediately when their
reference count goes to zero, such as at the end of local scopes. 


Given this code:

class SomeObject:
 .


for foo in somelist:

a = SomeObject(foo)
b = SomeObject(foo)
c = SomeObject(foo)

# Do something or other
...

# Bottom of 'for' scope


Are you saying that each time a,b,c are reassigned to new instances of
SomeObject the old instance counts go to 0 and are immediately - as in
synchronously, right now, on the spot - removed from memory?  My
understanding was (and I may well be wrong), that the reference count
does get decremented - in this case to 0 - but the *detection* of that
fact does not happen until the gc sweep looks through the heap for such
stale objects.


After this:

a = SomeObject(foo)

the name "a" is bound to an instance of SomeObject and the reference 
count of that instance is 1.


If you then bind "a" to something else:

a = None

the reference count of the instance is decremented to 0, at which point 
the instance is reclaimed.


The GC sweep stuff is for handling cycles where an object is unreachable 
but its reference count is non-zero.
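
A tiny CPython-only sketch (SomeObject here is just a stand-in with a
__del__ hook) makes the timing visible:

    class SomeObject:
        def __del__(self):
            print("reclaimed", id(self))

    a = SomeObject()
    a = SomeObject()   # rebinding drops the first instance's count to 0;
                       # its "reclaimed ..." line prints immediately
    a = None           # and now the second instance is reclaimed too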

--
https://mail.python.org/mailman/listinfo/python-list


Re: Question About When Objects Are Destroyed (continued)

2017-08-05 Thread Marko Rauhamaa
Tim Daneliuk :

> Are you saying that each time a,b,c are reassigned to new instances of
> SomeObject the old instance counts go to 0 and are immediately - as in
> synchronously, right now, on the spot - removed from memory?

That depends on the implementation of Python. CPython employs reference
counting, so the answer to your question is often yes. Objects that
participate in reference cycles cannot be cleared on the spot. They
have to wait for a GC analysis.

> My understanding was (and I may well be wrong), that the reference
> count does get decremented - in this case to 0 - but the *detection*
> of that fact does not happen until the gc sweep looks through the heap
> for such stale objects.

You are confusing two mechanisms. The reference count mechanism kicks in
right away when 0 is reached. The sweep method doesn't make any use of
reference counting.

There are also Python implementations that don't make any use of
reference counting.


Marko
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Question About When Objects Are Destroyed (continued)

2017-08-05 Thread Tim Daneliuk
On 08/05/2017 11:16 AM, Ned Batchelder wrote:
> It uses
> reference counting, so most objects are reclaimed immediately when their
> reference count goes to zero, such as at the end of local scopes. 

Given this code:

class SomeObject:
.


for foo in somelist:

   a = SomeObject(foo)
   b = SomeObject(foo)
   c = SomeObject(foo)

   # Do something or other
   ...

   # Bottom of 'for' scope


Are you saying that each time a,b,c are reassigned to new instances of
SomeObject the old instance counts go to 0 and are immediately - as in
synchronously, right now, on the spot - removed from memory?  My
understanding was (and I may well be wrong), that the reference count
does get decremented - in this case to 0 - but the *detection* of that
fact does not happen until the gc sweep looks through the heap for such
stale objects.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Question About When Objects Are Destroyed (continued)

2017-08-05 Thread Tim Daneliuk
On 08/05/2017 03:21 PM, Chris Angelico wrote:
> After a 'with' block,
> the object *still exists*, but it has been "exited" in some way
> (usually by closing/releasing an underlying resource).

The containing object exists, but the things that the closing
logic explicitly released do not.  In some sense, a context
acts like a destructor, just not on the object it's associated
with.


> If there's a resource you need to clean up, you clean that up
> explicitly,

Such "resources" *are* objects themselves notionally.  You are exactly
killing those objects to free the underlying resources they consume.

> so the object's lifetime shouldn't matter to you.

I disagree with this most strongly.  That's only true when the machine
resources being consumed by your Python object are small in size.  But
when you're dynamically cranking out millions of objects of relatively
short lifetime, you can easily bump into the real world limits of
practical machinery.  "Wait until the reference count sweep gets rid of
it" only works when you have plenty of room to squander.

Also, waiting for the reference count/gc to do its thing is
nondeterministic in time.  It's going to happen sooner or later, but not
at the same or a predictable interval.  If you want to write large,
performant code, you don't want this kind of variability.  While I
realize that we're not typically writing embedded realtime drivers in
Python, the principle remains - where possible make things as
predictable and repeatable as you can.

For reasons I am not free to discuss here, I can say with some assurance
that there are real world applications where managing Python object
lifetimes is very much indicated.

-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Question About When Objects Are Destroyed (continued)

2017-08-05 Thread Chris Angelico
On Sun, Aug 6, 2017 at 1:23 AM, Tim Daneliuk  wrote:
> On 08/04/2017 07:00 PM, Chris Angelico wrote:
>> Again, don't stress about exactly when objects get
>> disposed of; it doesn't matter.
>
>
> Respectfully, I disagree strongly.  Objects get built on the heap and
> persist even when they go out of scope until such time as garbage
> collection takes place.  This is unlike languages that build things in
> stack frames which naturally disappear with an exit of scope.
>
> For small or trivial programs, it does not matter.  But when there is a
> lot of dynamic object construction - say, in very large programs, object
> factories, etc. - it can be important to harvest the space of expired
> objects sooner, rather than later.  This, after all, is one of the
> rationales for Python contexts - to ensure the release of resources no
> matter how the logic ends - correctly or by exception.

By "contexts", you're presumably talking about the way you can use a
'with' block to guarantee resource release. But that is actually
orthogonal to object destruction; in fact, it's specifically because
the object might NOT be destroyed at that point. Before that feature
was implemented, people depended on CPython's reference counting and
consequent object destruction (and __del__) to close files and other
resources. That doesn't work reliably in all Python implementations,
though, so a more dependable system was needed. After a 'with' block,
the object *still exists*, but it has been "exited" in some way
(usually by closing/releasing an underlying resource).
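
For example (a minimal sketch, assuming some ordinary text file
'data.txt' exists -- the name is only a placeholder):

    with open("data.txt") as f:
        first_line = f.readline()

    print(f.closed)   # True: the underlying file handle was released on exit
    print(f)          # but the file object itself still exists as an object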

So, again, you must not concern yourself with when the objects
themselves get destroyed. If there's a resource you need to clean up,
you clean that up explicitly, so the object's lifetime shouldn't
matter to you.

ChrisA
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Question About When Objects Are Destroyed (continued)

2017-08-05 Thread Ned Batchelder
On 8/5/17 11:23 AM, Tim Daneliuk wrote:
> On 08/04/2017 07:00 PM, Chris Angelico wrote:
>> Again, don't stress about exactly when objects get
>> disposed of; it doesn't matter.
>
> Respectfully, I disagree strongly.  Objects get built on the heap and
> persist even when they go out of scope until such time as garbage
> collection takes place.  This is unlike languages that build things in
> stack frames which naturally disappear with an exit of scope.
>
> For small or trivial programs, it does not matter.  But when there is a
> lot of dynamic object construction - say, in very large programs, object
> factories, etc. - it can be important to harvest the space of expired
> objects sooner, rather than later.  This, after all, is one of the
> rationales for Python contexts - to ensure the release of resources no
> matter how the logic ends - correctly or by exception.

You might want to look into how CPython works more closely.  It uses
reference counting, so most objects are reclaimed immediately when their
reference count goes to zero, such as at the end of local scopes.  The
exception is objects that are in circular structures (A references B
which references C which references A, for example).  Those have to wait
until an asynchronous garbage collection takes place.
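
A rough sketch of that exception (CPython-specific; Node and its
'partner' attribute are just placeholders):

    import gc

    class Node:
        pass

    x = Node()
    y = Node()
    x.partner = y         # x references y ...
    y.partner = x         # ... and y references x: a reference cycle
    x = y = None          # no outside references remain, but each refcount
                          # is still 1, so nothing is reclaimed yet
    print(gc.collect())   # the cycle detector finds the pair; prints a non-zero count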

You can run into problems if you accidentally are still referring to
large circular structures. Then it is important to understand where the
references are, and add some code to delete the references.  But this is
unusual, and is not an issue with delayed garbage collection, but with
references keeping unwanted structures.

People in worlds with manually managed memory (such as C) are rightly
anxious about the details of their memory management. But they typically
don't need to bring that anxiety with them to Python.

--Ned.

-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Question About When Objects Are Destroyed

2017-08-05 Thread Ned Batchelder
On 8/4/17 7:42 PM, Jon Forrest wrote:
> On 8/4/2017 4:34 PM, gst wrote:
>> 'two' is a so called constant or literal value .. (of that
>> function).
>>
>> Why not attach it, as a const value/object, to the function itself ?
>> So that a new string object has not to be created each time the
>> function is called. Because anyway strings are immutable. So what
>> would be the point to recreate such object every time the function is
>> called ?
>
> This was just an example program, not meant to do anything
> meaningful. I would think that the same object behavior would
> occur if I dynamically created an object in that function.
>
"The same object behavior" does occur: the object is freed when there
are no more references to it. But if the object is a literal in the
function, then the function keeps a reference to it. On the other hand,
if it's dynamically computed, the function has no reference to it.  That
reference can change the behavior you see.
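
In CPython you can see that hidden reference directly; a small sketch along
the lines of the original example:

    def givemetwo():
        x = 'two'      # a literal, stored with the function's code object
        return id(x)

    # The literal lives in the code object's constants, so it stays referenced
    # for as long as the function itself does:
    print(givemetwo.__code__.co_consts)   # e.g. (None, 'two')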

Others have already mentioned some reasons why you are seeing the
behavior you see. Another I don't think I saw mentioned: perhaps with
ctypes you are examining memory that has been freed, but not cleared,
and in fact the object doesn't still exist.

This is the best way I know to explain a lot of these issues:
https://nedbatchelder.com/text/names1.html

You seem comfortable with C ideas and techniques. You might have to let
go of some habits and let Python do its work. :)

--Ned.

-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Question About When Objects Are Destroyed (continued)

2017-08-05 Thread Marko Rauhamaa
Tim Daneliuk :

> On 08/04/2017 07:00 PM, Chris Angelico wrote:
>> Again, don't stress about exactly when objects get disposed of; it
>> doesn't matter.
>
> Respectfully, I disagree strongly. Objects get built on the heap and
> persist even when they go out of scope until such time as garbage
> collection takes place. This is unlike languages that build things in
> stack frames which naturally disappear with an exit of scope.

Python never has to dispose of a single object. It is allowed to do so
if it doesn't affect the correct behavior of the program.

> For small or trivial programs, it does not matter. But when there is a
> lot of dynamic object construction - say, in very large programs,
> object factories, etc. - it can be important to harvest the space of
> expired objects sooner, rather than later. This, after all, is one of
> the rationales for Python contexts - to ensure the release of
> resources no matter how the logic ends - correctly or by exception.

You are correct that maintaining references to stale objects prevents
Python's garbage collection from reclaiming memory space.

Releasing non-memory resources is a different matter. I suppose Chris
was only referring to RAM usage.


Marko
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Question About When Objects Are Destroyed (continued)

2017-08-05 Thread Tim Daneliuk
On 08/04/2017 07:00 PM, Chris Angelico wrote:
> Again, don't stress about exactly when objects get
> disposed of; it doesn't matter.


Respectfully, I disagree strongly.  Objects get built on the heap and
persist even when they go out of scope until such time as garbage
collection takes place.  This is unlike languages that build things in
stack frames which naturally disappear with an exit of scope.

For small or trivial programs, it does not matter.  But when there is a
lot of dynamic object construction - say, in very large programs, object
factories, etc. - it can be important to harvest the space of expired
objects sooner, rather than later.  This, after all, is one of the
rationales for Python contexts - to ensure the release of resources no
matter how the logic ends - correctly or by exception.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Question About When Objects Are Destroyed

2017-08-04 Thread Terry Reedy

On 8/4/2017 7:11 PM, Jon Forrest wrote:

Consider the following Python shell session (Python 3.6.2, Win64):

 >>> def givemetwo():
... x = 'two'
... print(id(x))
...
 >>> givemetwo()
1578505988392

So far fine. My understanding of object existence made me
think that the object referred to by x would be deleted when
the givemetwo() function returned, like a local variable in C.


Python does not specifically delete objects.  It deletes references to 
objects, such as an association between a name and an object, or between 
a key and an object, or between a sequence index and an object.  When a 
function returns, associations between local names and object are 
deleted. Python on a machine does not necessarily spend time 'cleaning' 
memory by overwriting it with 0s.



However, this isn't true, as shown by the following in the same
session:

 >>> import ctypes


Most everything you see in the manuals (and on this list) has a little * 
after it.


* unless one imports and uses ctypes (or 3rd party modules).

For instance, 'Python code, even if buggy, should not crash the 
interpreter'* and 'If you find such code, it is a bug'*, 'Please open an 
issue on the tracker'*.  If you use ctypes, please don't.



 >>> print (ctypes.cast(1578505988392, ctypes.py_object).value)
two

This shows that the object still exists, which was a surprise.
Will this object ever be deleted? 


What do you mean deleted? At this point, you are not looking at a normal 
Python object, as such, but rather the contents of a segment of machine 
memory at a particular address, extracted into a ctypes.py_object 
object.  I don't think you want that chunk of memory destroyed.


The only public attribute of the ctypes.py_object object is .value.  It 
has a few undocumented private attributes, such as ._b_needsfree_ which, 
when I tried it, is 1.


--
Terry Jan Reedy

--
https://mail.python.org/mailman/listinfo/python-list


Re: Question About When Objects Are Destroyed

2017-08-04 Thread Steve D'Aprano
On Sat, 5 Aug 2017 09:11 am, Jon Forrest wrote:

> Consider the following Python shell session (Python 3.6.2, Win64):
> 
>  >>> def givemetwo():
> ... x = 'two'
> ... print(id(x))
> ...
>  >>> givemetwo()
> 1578505988392
> 
> So far fine. My understanding of object existence made me
> think that the object referred to by x would be deleted when
> the givemetwo() function returned, like a local variable in C.

Not necessarily.

Objects are destroyed when they are no longer referenced by any other object.
That may happen when the function exits, but it may not. For example, if you
return x, and the caller assigns it to a name, then the object will still be
referenced.

However, when you exit the function, what is guaranteed is that all local
variables will go out of scope and clear *their* references to whatever objects
they are bound to. Not necessarily *all* references, but just the ones from
local variables.

Consider the object "spam and eggs", a string. If I say:

s = "spam and eggs"  # first reference

def func():
t = s  # t is a local variable, so now two refs
u = t  # third ref
return None

func()

While func() is executing, there are three references to the object: s, a
global, and t and u, locals. When the function exits, the *names* (variables) t
and u go out of scope and those references cease to exist, but the s reference
still exists and so the object (string "spam and eggs") still exists.

If you then re-bind the name s to something else:

s = "foo bar"

or delete the name:

del s

that will remove the last reference to the object and it can be garbage
collected.


> However, this isn't true, as shown by the following in the same
> session:
> 
>  >>> import ctypes
>  >>> print (ctypes.cast(1578505988392, ctypes.py_object).value)
> two
> 
> This shows that the object still exists, which was a surprise.

You may be right about the object still existing, but for the wrong reasons.

The Python interpreter is free to cache objects that it thinks have a good
chance of being re-created. This is obviously implementation dependent: it will
depend on the specific interpreter (CPython, Jython, IronPython, PyPy,
Stackless, Nuitka), the specific version, and potentially any other factor the
interpreter wishes to take into account, up to and including the phase of the
moon.

In this case, CPython caches short strings that look like identifiers. It does
this because variables are implemented as string keys in dicts, so when you
create a variable

two = 2

the interpreter creates a string object 'two' to use as a key in the globals()
dict. Since object creation is relatively costly, the interpreter caches that
string and keeps it around.

So your test has accidentally hit an implementation-dependent feature of
CPython. If you had used a string that didn't look like an identifier, say

"two not three, okay?"

you may have seen different results.

Or maybe not. This is all implementation dependent.
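
One way to poke at that caching (again, a CPython detail that may vary by
version) is to compare a literal against an equal string built at run time:

    import sys

    a = 'two'
    b = ''.join(['t', 'w', 'o'])   # equal text, but constructed at run time
    print(a == b, a is b)          # True False -- typically: the literal is
                                   # cached/interned, the computed copy is a
                                   # separate object
    b = sys.intern(b)              # asking for the canonical copy...
    print(a is b)                  # True -- ...hands back the cached object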

By using ctypes, you are poking around in the internals of the Python
interpreter, you aren't looking at what *Python the language* guarantees, but
merely whatever this specific version of CPython happens to do, today.

For example, you are already on shaky ground by using the ID of an object, which
is an opaque integer, as if it were a memory address. Object IDs aren't memory
addresses, they are opaque integers.

It just happens that for speed, the CPython interpreter uses the address of the
object as its ID. But it can only do that because the CPython garbage collector
is very simple, and it never relocates objects from place to place. But Jython
and IronPython have more sophisticated garbage collectors which do, so they use
a different scheme for generating IDs which are consecutive integers.

So... you've effectively grabbed an arbitrary address in memory, which may have
previously contained a certain string object. You use ctypes to interpret that
chunk of memory as an object. Since CPython doesn't move memory around, you
might be okay:

- either that address actually does point to a live object, and you're safe;

- or it points to what *was* a live object, but the memory hasn't been used,
  and so all the fields are still correctly allocated, and you're safe;

but you might not be. What if some other object has re-used that piece of
memory? You might now be jumping halfway into some other object, and trying to
interpret that as the start of an object.

You can segfault CPython with ctypes.


> Will this object ever be deleted?

The specific object 'two'? Maybe, maybe not. It might be cached for the lifetime
of this interpreter session. Or there may be circumstances where cached objects
age-out and are deleted. It depends on the implementation of the cache. That
isn't a question about Python the language.


> I'm learning about function 
> decorators which, as my early studies tell me, depend on calling
> a function defined inside another function. This suggests that
> objects 

Re: Question About When Objects Are Destroyed (continued)

2017-08-04 Thread Chris Angelico
On Sat, Aug 5, 2017 at 9:47 AM, Jon Forrest  wrote:
> Perhaps the reason the variable isn't destroyed is
> shown by the following (again, in the same session):
>
> >>> import sys
> >>> sys.getrefcount(1578505988392)
> 3
>
> So, maybe it's not destroyed because there are still
> references to it. But, what are these references?
> Will the reference count ever go to zero?

That's the reference count for the integer object. Nothing to do with
the original. Again, don't stress about exactly when objects get
disposed of; it doesn't matter.
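
For what it's worth, here's roughly what sys.getrefcount reports when you
point it at the object you actually care about (the count includes the
temporary reference held by the call itself):

    import sys

    s = "some throwaway string for counting"
    print(sys.getrefcount(s))   # typically 2: the name 's' plus getrefcount's own argument
    t = s                       # add another reference
    print(sys.getrefcount(s))   # typically 3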

ChrisA
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Question About When Objects Are Destroyed

2017-08-04 Thread Chris Angelico
On Sat, Aug 5, 2017 at 9:42 AM, Jon Forrest  wrote:
> On 8/4/2017 4:34 PM, gst wrote:
>>
>> 'two' is a so called constant or literal value .. (of that
>> function).
>>
>> Why not attach it, as a const value/object, to the function itself ?
>> So that a new string object has not to be created each time the
>> function is called. Because anyway strings are immutable. So what
>> would be the point to recreate such object every time the function is
>> called ?
>
>
> This was just an example program, not meant to do anything
> meaningful. I would think that the same object behavior would
> occur if I dynamically created an object in that function.

Python doesn't have pointers, so what you have is an arbitrary number.
Even if the object had been freed from memory, you could quite
probably do the same shenanigans to get a value out of it; the data
would still be there.

Basically, don't think about object lifetimes. Objects will be flushed
from memory once they're not needed any more, and no sooner; so you
can safely return anything from any function.

ChrisA
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Question About When Objects Are Destroyed

2017-08-04 Thread Jon Forrest

On 8/4/2017 4:34 PM, gst wrote:

'two' is a so called constant or literal value .. (of that
function).

Why not attach it, as a const value/object, to the function itself ?
So that a new string object has not to be created each time the
function is called. Because anyway strings are immutable. So what
would be the point to recreate such object every time the function is
called ?


This was just an example program, not meant to do anything
meaningful. I would think that the same object behavior would
occur if I dynamically created an object in that function.

Jon




--
https://mail.python.org/mailman/listinfo/python-list