[issue41587] Potential memory leak while using shared memory

2020-08-19 Thread 李超然

李超然  added the comment:

Thank you, Tim Peters, for replying. I tried your demo and it is exactly the 
proof I wanted: it shows there is no memory issue. I modified my own code the 
same way and saw the same result. I will close this issue.

--
resolution:  -> not a bug
stage:  -> resolved
status: open -> closed

___
Python tracker 

___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue41587] Potential memory leak while using shared memory

2020-08-19 Thread Tim Peters


Tim Peters  added the comment:

There's no evidence of a Python issue here, so I recommend closing this. It's 
not the Python bug tracker's job to try to make sense of platform-specific 
reporting tools, which, as already explained, can display exceedingly confusing 
numbers. We (the Python project) didn't write them, have no control over what 
they display, and have no special insight into them.

Here's a similar simple program that creates a 4 GiB block of shared memory, 
and passes it to 16 processes. I'm running this on Windows 10, on a machine 
with 4 physical cores and 16 GiB of RAM. If the memory weren't actually being 
shared, Windows would have killed the job (there's no "over allocation" allowed 
at all on Windows), because there's nowhere near 4 * 17 = 68 GiB of RAM 
available.

Instead, the program brings the machine to its knees, but only because it created 
16 processes, each of which is 100% CPU- and memory-bound. The Windows task manager 
detailed view shows less than a MiB of non-shared memory in use by each of the 
worker processes, with the 4 GiB in the "memory in use that can be shared with 
other processes" column for each worker process.  I'm running lots of other 
stuff too, and Windows still says there's over 5 GiB of my 16 GiB of RAM free 
for use.

Windows reporting has its own quirks. For example, the shared memory block 
doesn't show up at all in the workers at first. Instead the amount reported 
steadily increases, as the write loop in each worker process forces the OS to 
materialize shared pages in the worker's virtual address space (that is, 
Windows reports pages actually in use by a process, not virtual address space 
reservations).

from multiprocessing import Process, shared_memory

SHM_SIZE = 2 ** 32  # 4 GiB

def f(name):
    shm = shared_memory.SharedMemory(name=name)
    print(f"shm of size {shm.size:,}")
    buf = shm.buf
    for i in range(shm.size):
        buf[i] = 123
    shm.close()

def main():
    shm = shared_memory.SharedMemory(create=True, size=SHM_SIZE, name='shm')
    ps = []
    for i in range(16):
        p = Process(target=f, args=('shm',))
        p.start()
        ps.append(p)
    for p in ps:
        p.join()
    shm.close()
    shm.unlink()

if __name__ == '__main__':
    main()

--
nosy: +tim.peters

[issue41587] Potential memory leak while using shared memory

2020-08-19 Thread Eric V. Smith


Eric V. Smith  added the comment:

Okay. We'll see if someone else can provide more info.

--

[issue41587] Potential memory leak while using shared memory

2020-08-19 Thread 李超然

李超然  added the comment:

Sorry, I don't know what else I can do. From my perspective, I believe there is 
a memory leak because the monitoring software gives me evidence to believe that. 
I have provided a script that reproduces the issue; that should be enough for 
developers to run further tests and verify whether the issue exists. I don't 
think I can provide any more information, since you can reproduce the problem 
yourselves. 
As a user, I just need advice on how to fix this issue, confirmation that it 
exists, or an explanation that convinces me there is no issue.

--

[issue41587] Potential memory leak while using shared memory

2020-08-19 Thread Eric V. Smith


Eric V. Smith  added the comment:

Sorry, I don't have any particular suggestion other than accounting for all 
virtual, shared, and physical memory of all types, and seeing how they're being 
used and allocated per-process by the various tools.

There are probably guides for this on the internet, but I haven't been able to 
find any with a quick search.

I'm not saying there isn't a problem, but I'm saying it's going to require more 
analysis before we can verify that a problem does exist.
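
On Linux, one place that per-process accounting shows up is /proc/<pid>/status, 
which on reasonably recent kernels splits resident memory into anonymous, 
file-backed, and shared-memory pages (VmRSS = RssAnon + RssFile + RssShmem). A 
minimal sketch of pulling those fields out (the parsing helper is just an 
illustration, not part of any Python API):

```python
def memory_breakdown(status_text):
    """Extract the RSS components (in kB) from the text of /proc/<pid>/status."""
    fields = {}
    for line in status_text.splitlines():
        name, _, rest = line.partition(':')
        if name in ('VmRSS', 'RssAnon', 'RssFile', 'RssShmem'):
            fields[name] = int(rest.split()[0])  # values are reported in kB
    return fields

if __name__ == '__main__':
    # Linux only: inspect the current process.
    try:
        with open('/proc/self/status') as f:
            print(memory_breakdown(f.read()))
    except FileNotFoundError:
        pass  # /proc is not available on this platform
```

Watching RssShmem separately from RssAnon is what distinguishes "this process 
maps a shared block" from "this process allocated its own copy".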

--

[issue41587] Potential memory leak while using shared memory

2020-08-19 Thread 李超然

李超然  added the comment:

Okay. I understand this is complicated. So how can I make sure this is not an 
issue? Can you provide some steps or a script to show that the memory does not 
actually increase? I'm not yet persuaded, because I don't know how to prove 
there is no memory issue. Would the experiment in my last message prove it? Or 
perhaps you could provide a simple example showing there is no memory issue. 
Thank you.

--

[issue41587] Potential memory leak while using shared memory

2020-08-19 Thread Eric V. Smith


Eric V. Smith  added the comment:

You'll have to play with it. I'm just saying that it's a very complicated 
subject, and not as simple as asking how much memory an individual process is 
using. For example, see 
https://www.howtogeek.com/659529/how-to-check-memory-usage-from-the-linux-terminal/ 
for the many statistics that are involved.

--

[issue41587] Potential memory leak while using shared memory

2020-08-19 Thread 李超然

李超然  added the comment:

You mean that if I have a machine with 16 GB of RAM and an 8 GB shared memory 
block, and I create two processes that write to that shared memory, the system 
won't run out of memory. Is that right?
I can try this experiment later. But I can't understand this: I can see the 
memory increase in `htop`. I assume the memory statistics from `htop` are 
correct; otherwise, any system monitoring software could deliver a false 
warning about memory.

--

[issue41587] Potential memory leak while using shared memory

2020-08-19 Thread Eric V. Smith


Eric V. Smith  added the comment:

It's likely that the same memory is being counted in both processes, so the 
output is misleading. Shared memory is notoriously difficult to account for 
per-process. For example, it's definitely true that the shared memory consumes 
virtual address space in both processes, and that's often the value that's 
reported. But since the actual memory is shared, it isn't consuming more 
physical memory than expected.
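
One quick way to sanity-check that the memory really is shared, rather than 
copied into each process, is to write a byte through one process's mapping and 
read it back through another's. A minimal sketch (the small block size and the 
`demo`/`writer` names are just for illustration):

```python
from multiprocessing import Process, shared_memory

def writer(name):
    # Attach to the existing block by name and write a marker byte.
    shm = shared_memory.SharedMemory(name=name)
    shm.buf[0] = 42
    shm.close()

def demo():
    shm = shared_memory.SharedMemory(create=True, size=1024)
    try:
        p = Process(target=writer, args=(shm.name,))
        p.start()
        p.join()
        # The child's write is visible through the parent's mapping:
        # one block of physical memory, two mappings.
        return shm.buf[0]
    finally:
        shm.close()
        shm.unlink()

if __name__ == '__main__':
    print(demo())  # 42
```

Note that per-process RSS (what `htop` shows as RES) includes shared pages in 
every process that has touched them, so summing RSS across processes 
double-counts the block; that's why some tools also report USS/PSS.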

--
nosy: +eric.smith

[issue41587] Potential memory leak while using shared memory

2020-08-19 Thread 李超然

New submission from 李超然 :

We found an issue while using shared memory. When another process is opened to 
write to the shared memory, the reported memory of that process grows by 
roughly the size of the shared memory block. So when N processes read or write 
the shared memory, the reported memory usage is about N times the block size.

```
import os, psutil
import numpy as np
from multiprocessing import Process, shared_memory

SHM_SIZE = 10 * 30 * 20


def f(name):
    print('[Sub] (Before) Used Memory of {}: {} MiB'.format(
        os.getpid(),
        psutil.Process(os.getpid()).memory_info().rss / 1024 / 1024,
    ))
    shm = shared_memory.SharedMemory(name=name)
    b = np.ndarray(shape=(SHM_SIZE, 1), dtype=np.float64, buffer=shm.buf)
    for i in range(SHM_SIZE):
        b[i, 0] = 1
    print('[Sub] (After) Used Memory of {}: {} MiB'.format(
        os.getpid(),
        psutil.Process(os.getpid()).memory_info().rss / 1024 / 1024,
    ))


def main():
    print('[Main] Used Memory of {}: {} MiB'.format(
        os.getpid(),
        psutil.Process(os.getpid()).memory_info().rss / 1024 / 1024,
    ))
    shm = shared_memory.SharedMemory(create=True, size=8*SHM_SIZE, name='shm')
    p = Process(target=f, args=('shm',))
    p.start()
    p.join()
    print('[Main] Used Memory of {}: {} MiB'.format(
        os.getpid(),
        psutil.Process(os.getpid()).memory_info().rss / 1024 / 1024,
    ))
    input()
    shm.close()
    shm.unlink()


if __name__ == '__main__':
    main()
```

--
components: Library (Lib)
messages: 375642
nosy: seraphlivery
priority: normal
severity: normal
status: open
title: Potential memory leak while using shared memory
type: resource usage
versions: Python 3.8
