read() and read_multiple() block when consuming from python-managed pipes or sockets #354
Comments
My first guess would be that your second snippet is blocking in the C++ code without relinquishing the GIL. That would prevent the other thread from continuing, creating a deadlock. However, the following snippet shows that the GIL has been released since #308 (lines 3872 to 3873 in 1fb1687).

Are you sure you are using a pycapnp version that is new enough?
I finally figured out how to make this approach work using processes, not threads! Most importantly, for the pipe version, the buffer on the write side has to be set to 0 using …

The socket version using … works as well. There are also pitfalls when using a process, in particular using … However, when I run the exact same code using …
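The exact call for the zero write-side buffer was lost from this page. One way to achieve it (my assumption, not confirmed by the thread) is to wrap the raw write descriptor from `os.pipe()` in an unbuffered binary file object with `os.fdopen(..., buffering=0)`:

```python
import os

pipe_read, pipe_write = os.pipe()

# buffering=0 yields an unbuffered binary writer: every write() goes
# straight into the pipe instead of sitting in a userspace buffer,
# so the reader on the other end sees the bytes immediately.
write_file = os.fdopen(pipe_write, "wb", buffering=0)
read_file = os.fdopen(pipe_read, "rb")

write_file.write(b"hello")
write_file.close()  # EOF for the reader
assert read_file.read() == b"hello"
read_file.close()
```

With a buffered writer, small writes may linger in the userspace buffer until a flush, which can look like a hang on the reading side.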
Hi, I know that there is ongoing work to remove or reduce the dependency on raw file descriptors in #283 and #311. However, this issue is with the existing implementation (reference 2.0.0b2) and how it works with pipe (`os.pipe`) and socket (`socket.socket`) objects in Python, which do expose usable raw file descriptors via `fileno()`.

We can construct a pipe using
```python
pipe_read, pipe_write = os.pipe()
```
Data can be generated or read in one thread (or process), and be consumed in another thread.
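The original code snippets did not survive in this page. A minimal sketch of such a producer/consumer pair over an `os.pipe()`, using plain byte records as a stand-in for pycapnp's `MyStruct(...).write(write_file)` and `MyStruct.read_multiple(read_file)` (the threading scaffold is my assumption):

```python
import os
import threading

pipe_read, pipe_write = os.pipe()
read_file = os.fdopen(pipe_read, "rb")
write_file = os.fdopen(pipe_write, "wb")

def produce():
    # Stand-in for serializing MyStruct messages into the pipe.
    for i in range(3):
        write_file.write(b"message-%d\n" % i)
    write_file.close()  # EOF lets the consumer's loop terminate

def consume(out):
    # Stand-in for iterating MyStruct.read_multiple(read_file).
    for record in read_file:
        out.append(record)
    read_file.close()

received = []
producer = threading.Thread(target=produce)
producer.start()
consume(received)
producer.join()
assert received == [b"message-0\n", b"message-1\n", b"message-2\n"]
```

With plain byte records this runs to completion; per the report, substituting the pycapnp read/write calls is where the blocking appears.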
What I observe is that both `MyStruct.read_multiple(read_file)` and `write_file.write(chunk)` block if `chunk_size` is shorter than the serialized struct item. I hypothesize that this has to do with how the reader peeks into the data, which is in fact a stream, without actually consuming it, but I don't know.

Strangely, if a process outside Python generates the stream via `process = subprocess.Popen()` and writes it into a pipe via standard output using `stdout=subprocess.PIPE`, `read_multiple()` can read it without issues from `process.stdout`.

Maybe someone has an idea why this happens and how it could be circumvented or fixed? Happy to hear your thoughts.
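For reference, the working external-process variant can be sketched like this (the child command and the plain `read()` are placeholders; the issue passes `process.stdout` to `MyStruct.read_multiple()` instead):

```python
import subprocess
import sys

# An external process writes to its stdout, which Python exposes as a
# readable file object with a real fileno() via process.stdout.
process = subprocess.Popen(
    [sys.executable, "-c", "import sys; sys.stdout.write('abc')"],
    stdout=subprocess.PIPE,
)
data = process.stdout.read()  # the issue uses MyStruct.read_multiple(process.stdout)
process.wait()
assert data == b"abc"
```

Here the writer is a separate OS process, so the reader can block in C++ without any chance of deadlocking against a Python-side writer thread holding the GIL.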