
Python: Sharing a Network Socket with multiprocessing.Manager

I am currently writing an nginx proxy server module with a request queue in front, so the requests are not dropped when the servers behind nginx can't handle them (nginx …

Solution 1:

You can use multiprocessing.reduction to transfer connection and socket objects between processes.

Example Code

# Main process
# Note: reduce_handle/rebuild_handle are the Python 2 API; they were
# removed in Python 3 in favor of send_handle/recv_handle.
from multiprocessing.reduction import reduce_handle

h = reduce_handle(client_socket.fileno())
pipe_to_worker.send(h)

# Worker process
import socket
from multiprocessing.reduction import rebuild_handle

h = pipe.recv()
fd = rebuild_handle(h)
client_socket = socket.fromfd(fd, socket.AF_INET, socket.SOCK_STREAM)
client_socket.send("hello from the worker process\r\n")
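On Python 3 the equivalents of reduce_handle/rebuild_handle are send_handle and recv_handle from the same module. A minimal runnable sketch of the same handoff (the port, message, and function names here are illustrative, not from the original answer):

```python
# Sketch: pass an accepted TCP connection to a worker process on Python 3,
# using multiprocessing.reduction.send_handle / recv_handle (Unix, fork).
import socket
from multiprocessing import Pipe, Process
from multiprocessing.reduction import recv_handle, send_handle


def worker(pipe):
    # Receive the descriptor and rebuild a socket object around it.
    fd = recv_handle(pipe)
    conn = socket.socket(fileno=fd)  # family/type detected from the fd
    conn.sendall(b"hello from the worker process\r\n")
    conn.close()


def main():
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("127.0.0.1", 0))  # ephemeral port for the demo
    server.listen(1)

    parent_pipe, child_pipe = Pipe()
    p = Process(target=worker, args=(child_pipe,))
    p.start()

    # Connect a client to exercise the handoff.
    client = socket.create_connection(server.getsockname())
    conn, _addr = server.accept()
    send_handle(parent_pipe, conn.fileno(), p.pid)
    conn.close()  # the worker now holds its own copy of the descriptor

    data = client.recv(1024)
    client.close()
    server.close()
    p.join()
    return data


if __name__ == "__main__":
    print(main())
```

send_handle uses SCM_RIGHTS under the hood, so the worker gets a duplicated descriptor and the parent can close its copy immediately after the send.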

Solution 2:

Looks like you need to pass file descriptors between processes (assuming Unix here; no clue about Windows). I've never done this in Python, but here is a link to the python-passfd project that you might want to check out.
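For what it's worth, since Python 3.9 the standard library covers the same ground as python-passfd: socket.send_fds and socket.recv_fds wrap the SCM_RIGHTS ancillary-data mechanism for passing descriptors over AF_UNIX sockets. A self-contained sketch (the socketpair stands in for a parent/worker channel):

```python
# Sketch: pass a file descriptor over an AF_UNIX socket with
# socket.send_fds / socket.recv_fds (Python 3.9+, Unix only).
import os
import socket


def pass_fd_demo():
    # A socketpair stands in for the parent<->worker channel.
    parent, worker = socket.socketpair(socket.AF_UNIX, socket.SOCK_STREAM)

    r, w = os.pipe()                       # descriptor to hand over
    socket.send_fds(parent, [b"fd"], [r])  # SCM_RIGHTS under the hood
    os.close(r)                            # sender's copy no longer needed

    msg, fds, _flags, _addr = socket.recv_fds(worker, 1024, 1)
    os.write(w, b"ping")
    data = os.read(fds[0], 4)              # read via the received descriptor

    for fd in (w, fds[0]):
        os.close(fd)
    parent.close()
    worker.close()
    return msg, data
```

The receiving side gets a fresh descriptor number referring to the same open file description, which is exactly what you want for handing accepted connections to workers.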

Solution 3:

You can look at this code: https://gist.github.com/sunilmallya/4662837, a multiprocessing.reduction socket server in which the parent process accepts connections and passes them to worker processes.
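The overall shape of that gist can be sketched as a parent that pre-forks workers, accepts connections itself, and round-robins them out via send_handle. This is a compressed illustration of the pattern, not the gist's actual code; the worker count, message, and names are arbitrary:

```python
# Sketch: pre-forked workers, parent accepts and dispatches connections
# round-robin via multiprocessing.reduction.send_handle (Unix, fork).
import itertools
import socket
from multiprocessing import Pipe, Process
from multiprocessing.reduction import recv_handle, send_handle


def worker(pipe):
    # Each worker loops forever, serving whatever connections it is handed.
    while True:
        fd = recv_handle(pipe)
        conn = socket.socket(fileno=fd)
        conn.sendall(b"handled\r\n")
        conn.close()


def serve(n_workers=2, n_requests=4):
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("127.0.0.1", 0))
    server.listen(8)

    pipes = []
    for _ in range(n_workers):
        parent_end, child_end = Pipe()
        p = Process(target=worker, args=(child_end,), daemon=True)
        p.start()
        pipes.append((parent_end, p.pid))

    replies = []
    for (pipe, pid), _ in zip(itertools.cycle(pipes), range(n_requests)):
        client = socket.create_connection(server.getsockname())
        conn, _addr = server.accept()
        send_handle(pipe, conn.fileno(), pid)  # hand the connection off
        conn.close()
        replies.append(client.recv(1024))
        client.close()

    server.close()
    return replies
```

The parent stays single-purpose (accept and dispatch) while the workers do the per-connection I/O, which is the same division of labor the question's nginx-with-a-queue setup is after.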
