per-interpreter: memory allocators #30
Comments
Can we have per-interpreter heaps and allocators, and a flag telling whether we are in the main interpreter? The overhead doesn't sound terrible.
"per interpreter allocators" is exactly what this issue is for. What do you mean by "per interpreter heap"? The C heap wouldn't make sense so I'm guessing you're talking about the Python "heap". If so, that's pretty much unrelated to what we're working on here. Consider starting a thread on the python-ideas or python-dev mailing lists. Where are you looking for a flag about the main interpreter? In the C code all the info is there already. From Python the information is exposed in PEP 554. |
A workaround is to disable pymalloc. It can even be done by setting the PYTHONMALLOC env var to "malloc", for example. libc malloc() is thread-safe.
@vstinner, a workaround for what? Per-interpreter allocators, or the bug for which we rolled back moving the allocators into PyRuntimeState?
@ericsnowcurrently: "@vstinner, a workaround for what?" pymalloc is not thread-safe. If you want to experiment with subinterpreters running in parallel in different threads, you can force the use of libc malloc()/free() by setting the PYTHONMALLOC env var to "malloc".
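For embedders, here is a hedged sketch of the same workaround applied programmatically instead of through the environment, assuming CPython 3.8+ and the PEP 587 preinitialization API (the env-var route above needs no code at all):

```c
#include <Python.h>

int
main(void)
{
    /* Force libc malloc/free for all allocator domains before Python
       initializes -- roughly what PYTHONMALLOC=malloc does, but set
       from C via the PEP 587 preinitialization API. */
    PyPreConfig preconfig;
    PyPreConfig_InitPythonConfig(&preconfig);
    preconfig.allocator = PYMEM_ALLOCATOR_MALLOC;

    PyStatus status = Py_PreInitialize(&preconfig);
    if (PyStatus_Exception(status)) {
        Py_ExitStatusException(status);
    }

    Py_Initialize();
    PyRun_SimpleString("import sys; print(sys.version)");
    if (Py_FinalizeEx() < 0) {
        return 120;
    }
    return 0;
}
```

With pymalloc bypassed this way, allocations go through thread-safe libc routines, which is what makes the parallel-subinterpreter experiments described above workable.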
I proposed the following issue and PR to implement this workaround as a configure build option: |
[maybe split into more granular tasks; a hypothetical sketch follows the list]
- Move PyRuntimeState.obj to PyInterpreterState.obj
- Move PyRuntimeState.mem to PyInterpreterState.mem
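A purely illustrative C sketch of what the list above gestures at; the struct name is hypothetical and not CPython's actual internal layout, and only PyMemAllocatorEx and the allocator-domain names are real public API:

```c
#include <Python.h>

/* Hypothetical illustration only -- not CPython's real internal layout.
   PyMemAllocatorEx holds the malloc/calloc/realloc/free function
   pointers for one allocator domain.  "Per-interpreter allocators"
   would mean each interpreter owning its own PYMEM_DOMAIN_MEM and
   PYMEM_DOMAIN_OBJ slots instead of sharing the runtime-wide ones. */
struct hypothetical_interp_allocators {
    PyMemAllocatorEx mem;   /* would replace PyRuntimeState.mem */
    PyMemAllocatorEx obj;   /* would replace PyRuntimeState.obj */
};
```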
Notable concerns:
- The global state for this was moved back out of Include/internal to fix a bug. That PR encapsulates the relevant global state and C-API pretty well.