Is there a way to find the memory usage of python processes?
I'm trying to find a portable way of doing this; so far it looks like a new module might be needed.
I've mostly got Linux covered, but do you know how to do it on FreeBSD, OS X, or Windows (9x-7)?
So is there something already built into Python? Is there a cross-platform third-party module, or failing that, separate modules for individual platforms?
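For the "built into Python" part, the stdlib `resource` module works on Unix-like systems (not Windows). One well-known quirk: `ru_maxrss` is reported in kilobytes on Linux but in bytes on OS X. A minimal sketch under those assumptions:

```python
import resource
import sys

def peak_rss_bytes():
    """Peak resident set size of the current process, in bytes (Unix only)."""
    usage = resource.getrusage(resource.RUSAGE_SELF)
    if sys.platform == "darwin":
        # OS X reports ru_maxrss in bytes
        return usage.ru_maxrss
    # Linux (and most BSDs) report it in kilobytes
    return usage.ru_maxrss * 1024

print(peak_rss_bytes())
```

Note this is the *peak* RSS over the life of the process, not the current usage, so it never goes down.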
update: here's the Linux code I found and cleaned up a bit, memory_usage.py, if anyone is interested: bytes_resident = memory_usage.resident(). It reads /proc/PID/status, much like "$ grep VmRSS /proc/PID/status" would.
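The `/proc` approach above can be sketched in a few lines. This is my own minimal version, not the exact contents of memory_usage.py; it parses the VmRSS field and returns None on platforms without /proc:

```python
def resident_bytes(pid="self"):
    """Current resident set size in bytes, read from /proc/<pid>/status.

    Linux only: returns None where /proc is unavailable (OS X, Windows).
    """
    try:
        with open("/proc/%s/status" % pid) as f:
            for line in f:
                # The line looks like: "VmRSS:     1234 kB"
                if line.startswith("VmRSS:"):
                    return int(line.split()[1]) * 1024
    except IOError:
        return None
    return None

print(resident_bytes())
```

Unlike the `resource` approach, this reflects the *current* resident size, and it can inspect other processes by passing their numeric PID.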
pympler: 'Pympler is a development tool to measure, monitor and analyze the memory behavior of Python objects in a running Python application.'
psutil: 'psutil is a module providing an interface for retrieving information on running processes and system utilization (CPU, memory) in a portable way by using Python, implementing many functionalities offered by tools like ps, top and Windows task manager.'
dowser: 'Dowser is a CherryPy application that displays sparklines of Python object counts, and allows you to trace their referents. This helps you track memory usage and leaks in any Python program, but especially CherryPy sites.'
syrupy: 'Syrupy is a Python script that regularly takes snapshots of the memory and CPU load of one or more running processes, so as to dynamically build up a profile of their usage of system resources.'
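Of the tools above, psutil is the one that directly answers the cross-platform question. Assuming psutil is installed, the per-process API looks roughly like this:

```python
import psutil

# Process() with no argument refers to the current process;
# pass a PID to inspect another process.
p = psutil.Process()
info = p.memory_info()

# rss = resident set size, vms = virtual memory size, both in bytes
print(info.rss, info.vms)
```

This works identically on Linux, FreeBSD, OS X, and Windows, which is what the question is after.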
Some non-Python memory tools: Valgrind's memcheck, massif and cachegrind (Linux); MallocDebug (OS X).