Memory usage of processes from Python?
Is there a way to find the memory usage of Python processes?
I'm trying to find a portable way of doing this, but so far I think a new module might be needed...
I've got Linux mostly covered, but maybe you know how to do it on FreeBSD, OS X, or Windows (9x through 7)?
So is there something built into Python already? Is there a cross-platform third-party module? Or is there a module available for just one platform?
Update: here's the Linux code I found and cleaned up a bit, memory_usage.py, if anyone is interested: bytes_resident = memory_usage.resident(). It reads /proc/PID/status, i.e. roughly what "$ cat /proc/PID/status | grep VmRSS" would show.
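For reference, here's a minimal sketch of that approach (the real memory_usage.py may differ; resident() and its byte return value follow the snippet above, but the exact parsing shown is an assumption):

    import os

    def resident(pid=None):
        """Return the resident set size (VmRSS) of a process, in bytes.

        Linux-only sketch: parses /proc/<pid>/status, defaulting to the
        current process.
        """
        if pid is None:
            pid = os.getpid()
        with open("/proc/%d/status" % pid) as f:
            for line in f:
                if line.startswith("VmRSS:"):
                    # The line looks like: "VmRSS:     1234 kB"
                    return int(line.split()[1]) * 1024  # kB -> bytes
        raise ValueError("VmRSS not found for pid %d" % pid)

    if __name__ == "__main__":
        print("bytes_resident =", resident())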
pympler: 'Pympler is a development tool to measure, monitor and analyze the memory behavior of Python objects in a running Python application.'
psutil: 'psutil is a module providing an interface for retrieving information on running processes and system utilization (CPU, memory) in a portable way by using Python, implementing many functionalities offered by tools like ps, top and Windows task manager.' (A short usage sketch follows this list.)
dowser: 'Dowser is a CherryPy application that displays sparklines of Python object counts, and allows you to trace their referents. This helps you track memory usage and leaks in any Python program, but especially CherryPy sites.'
syrupy: 'Syrupy is a Python script that regularly takes snapshots of the memory and CPU load of one or more running processes, so as to dynamically build up a profile of their usage of system resources.'
Some non-Python memory tools: valgrind memcheck, massif and cachegrind (Linux), MallocDebug (OS X)
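If psutil does what you need, a cross-platform query can be as short as the following (this uses the current psutil API; older releases spelled some of these methods differently):

    import os
    import psutil  # third-party: pip install psutil

    # Inspect the current process; pass any other PID to look at that one instead.
    proc = psutil.Process(os.getpid())

    mem = proc.memory_info()       # works on Linux, Windows, OS X, FreeBSD, ...
    print("rss bytes:", mem.rss)   # resident set size (physical memory)
    print("vms bytes:", mem.vms)   # virtual memory size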
Comments
http://www.aminus.net/wiki/Dowser
$ cat /proc/PID/status | grep VmRSS
@peter: very easy! That's the file being parsed in the memory_usage module.
@grib: ah, that's quite nice for inspecting a running program.
I found out about /proc/PID/smaps, which has a lot more memory information than /proc/PID/status. There are a few tools that visualise smaps data too.
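For example, summing the Pss ("proportional set size") fields of smaps shares out memory that several processes have mapped, which gives a fairer per-process figure than plain RSS. A rough sketch (Pss only appears in smaps on newer kernels, roughly 2.6.25 and later, so treat that as an assumption):

    import os

    def pss_kilobytes(pid):
        """Sum the Pss fields in /proc/<pid>/smaps and return kilobytes.

        Pss divides each shared page among the processes mapping it, so
        the Pss of all processes adds up to roughly the RAM in use.
        """
        total_kb = 0
        with open("/proc/%d/smaps" % pid) as f:
            for line in f:
                if line.startswith("Pss:"):
                    # The line looks like: "Pss:          123 kB"
                    total_kb += int(line.split()[1])
        return total_kb

    if __name__ == "__main__":
        print("Pss (kB):", pss_kilobytes(os.getpid()))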
exmap: 'Exmap is a memory analysis tool which allows you to accurately determine how much physical memory and swap is used by individual processes and shared libraries on a running system. In particular, it accounts for the sharing of memory and swap between different processes.'
So exmap seems very useful for measuring system-level memory usage with Python.
exmap uses a loadable kernel module, but ps_mem.py only needs the /proc/PID/smaps data available on 2.6.23 and later kernels, so I think ps_mem.py will be more useful overall. As a note in the ps_mem.py source puts it:
# Note per _program_, not per process. So for example this script
# will report RAM used by all httpd process together.
David Malcolm wrote a blog post, 'memory usage: is it worth sharing constant data?', related to the OLPC project, which wants to reduce the memory usage of CPython processes by getting all the different processes to share memory.