sys.getsizeof(object, default) will always return default on PyPy, and raise a TypeError if default is not provided.
First, note that the CPython documentation says this function may raise a TypeError, so if you see one, it means the program you are using does not handle that case correctly.
On PyPy, though, it always raises TypeError. Before looking for alternatives, please take a moment to read the following explanation of why this is the case. What you are looking for may not be possible.
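As a sketch of how a program can cope with this, the two-argument form described above can serve as a portable fallback: on PyPy it simply returns the default, while on CPython it returns the real size for objects that report one. The function name `portable_getsizeof` and the sentinel value -1 are illustrative choices, not an API of either implementation:

```python
import sys

def portable_getsizeof(obj, default=-1):
    # On CPython this returns the actual size for built-in objects;
    # on PyPy sys.getsizeof(obj, default) always returns `default`.
    # Callers must treat `default` as "size unknown", not as a size.
    return sys.getsizeof(obj, default)

size = portable_getsizeof([1, 2, 3])
```

Any code built on this must be prepared for the sentinel on PyPy, since no meaningful number is ever reported there.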
A memory profiler using this function is likely to give results inconsistent with reality on PyPy. It would be possible (with enough work) to have sys.getsizeof() return a number, but that number may or may not represent how much memory the object actually uses. It doesn't even really make sense to ask how much memory one object uses in isolation from the rest of the system. For example, instances have maps, which are often shared across many instances; an implementation of sys.getsizeof() would probably ignore the maps, but their overhead matters when there are many instances with unique maps. Conversely, equal strings may share their internal string data even though they are distinct objects---and empty containers may share parts of their internals as long as they stay empty. Stranger still, some lists create objects as you read them; if you try to estimate the memory footprint of range(10**6) as the sum of all its items' sizes, that operation will itself create one million integer objects that never existed in the first place.
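The range example can be demonstrated on CPython as well: the range object itself is tiny, and summing per-item sizes materializes integers that the range never stored. (The exact byte counts below are CPython implementation details, not guarantees.)

```python
import sys

r = range(10**6)
# The range object stores only start/stop/step, not a million ints,
# so its own reported size is a few dozen bytes on CPython.
container_size = sys.getsizeof(r)

# Iterating to sum per-item sizes creates one million int objects
# on the fly, so this total measures objects the range never held.
total = sum(sys.getsizeof(i) for i in r)
```

On CPython the first number is under a hundred bytes while the second is tens of megabytes, which is exactly the inconsistency the paragraph above warns about.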
In #13, @oldcai writes:
Let's make sure pg_dump_splitsort.py runs with PyPy, and document the performance gain.