Memory usage of processes from python?

Is there a way to find the memory usage of python processes?

Trying to find some portable way of doing this. However, so far I think a new module might be needed...

I've got linux mostly covered, but maybe you know how to do it on freebsd, OSX, or windows (9x-7)?

So is there something built into python already? Is there an X-platform third-party module? Or a module available for just one platform?

update: here's the linux code I found and cleaned up a bit, if anyone is interested: bytes_resident = memory_usage.resident(). It reads /proc/PID/status, much like "$ cat /proc/PID/status | grep VmRSS" would.
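The idea boils down to pulling the VmRSS line out of /proc/PID/status. A minimal sketch of that approach (the function names here are illustrative, not necessarily what the memory_usage module calls them):

```python
import re

def parse_vmrss(status_text):
    """Pull the VmRSS value (resident set size, reported in kB) out of
    the text of a /proc/PID/status file. Returns bytes, or None if the
    line is absent (e.g. for kernel threads)."""
    match = re.search(r"^VmRSS:\s+(\d+)\s+kB", status_text, re.MULTILINE)
    return int(match.group(1)) * 1024 if match else None

def resident(pid="self"):
    """Resident memory of a process in bytes. Linux only; the special
    PID "self" means the current process."""
    with open("/proc/%s/status" % pid) as f:
        return parse_vmrss(f.read())
```

So `resident()` gives you the current process, and `resident(1234)` some other process you can read.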

pympler: 'Pympler is a development tool to measure, monitor and analyze the memory behavior of Python objects in a running Python application.'

psutil: 'psutil is a module providing an interface for retrieving information on running processes and system utilization (CPU, memory) in a portable way by using Python, implementing many functionalities offered by tools like ps, top and Windows task manager.'
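psutil's API has changed a little between releases, but in recent versions getting per-process memory looks roughly like this (assuming psutil is installed; not part of the standard library):

```python
import psutil

proc = psutil.Process()  # defaults to the current process
info = proc.memory_info()
print(info.rss, "bytes resident,", info.vms, "bytes virtual")
```

The nice part is that the same two lines work on linux, freebsd, OSX and windows.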

dowser: 'Dowser is a CherryPy application that displays sparklines of Python object counts, and allows you to trace their referents. This helps you track memory usage and leaks in any Python program, but especially CherryPy sites.'

syrupy: 'Syrupy is a Python script that regularly takes snapshots of the memory and CPU load of one or more running processes, so as to dynamically build up a profile of their usage of system resources.'

Some non-pythony memory tools: valgrind memcheck, massif and cachegrind (linux), MallocDebug (osx)


Alex said…
You might be able to make something like what you want out of pympler.
Lawouach said…
Have you tried psutil?
illume said…
Hi Alex. pympler is very awesome. It looks at the sizes of python objects, whereas I am interested in memory usage at the process level. cu.
illume said…
Lawouach: brilliant! psutil looks just the ticket. Combined with pympler, that's a lot of information :)
grib said…
dowser is also really cool for finding memory leaks and other surprising memory usage stats.
Jeet Sukumaran said…
To fulfill this need, a while back I wrote Syrupy, which works on any POSIX-compliant platform that provides "ps".
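Shelling out to "ps" is a neat portability trick. A rough sketch of that idea (this is hypothetical code illustrating the approach, not Syrupy's actual implementation):

```python
import os
import subprocess

def rss_via_ps(pid):
    """Sample the resident set size of a process in bytes by asking the
    POSIX `ps` utility, which reports rss in kB. Works anywhere a
    POSIX-ish `ps` exists (linux, freebsd, OSX)."""
    out = subprocess.check_output(["ps", "-o", "rss=", "-p", str(pid)])
    return int(out.strip()) * 1024

print(rss_via_ps(os.getpid()))
```

Polling that in a loop and timestamping each sample is essentially a resource profile.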
peterbe said…
We use:

$ cat /proc//status | grep VmRSS
illume said…
@Jeet: cool, syrupy looks very useful. Looks like it would be easy to convert the output to a pretty graph.

@peter: very easy! That's the file being parsed in the memory_usage module.

@grib: ah, that's quite nice for inspecting a running program.
flow said…
you might be interested in RunSnakeRun; there is also a nice video
illume said…
Found a couple more things...

I found out about /proc/PID/smaps which has a lot more memory information compared to /proc/PID/status. There's a few tools which visualise smaps data too.
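For example, smaps has a per-mapping Pss (proportional set size) field on newer kernels, which divides shared pages fairly between the processes sharing them. A sketch of summing a field across all mappings, assuming the usual smaps text layout (illustrative function names):

```python
def sum_smaps_field(smaps_text, field="Pss"):
    """Sum one per-mapping field (Pss, Rss, Swap, ...) across all the
    mappings in a /proc/PID/smaps file's text. Returns bytes; the file
    reports each value in kB."""
    total_kb = sum(int(line.split()[1])
                   for line in smaps_text.splitlines()
                   if line.startswith(field + ":"))
    return total_kb * 1024

def pss(pid="self"):
    """Proportional set size of a process in bytes. Linux only, and
    only on kernels new enough to expose Pss in smaps."""
    with open("/proc/%s/smaps" % pid) as f:
        return sum_smaps_field(f.read())
```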

exmap: 'Exmap is a memory analysis tool which allows you to accurately determine how much physical memory and swap is used by individual processes and shared libraries on a running system. In particular, it accounts for the sharing of memory and swap between different processes.'

So exmap seems very useful for measuring system-level memory usage with python.
illume said…
'Try to determine how much RAM is currently being used per program. Note per _program_, not per process. So for example this script will report RAM used by all httpd processes together.'

exmap uses a loadable kernel module... but on kernels from 2.6.23 onwards it can use /proc/PID/smaps data instead. So I think it will be more useful overall.

David Malcolm wrote a blog post, "memory usage: is it worth sharing constant data?", related to the OLPC project, which wants to reduce the memory usage of CPython processes by getting the different processes to share memory.
