Dumping a multiprocessing.Queue into a list


Question

I wish to dump a multiprocessing.Queue into a list. For that task I've written the following function:

import Queue  # Python 2 module; renamed "queue" in Python 3

def dump_queue(queue):
    """
    Empties all pending items in a queue and returns them in a list.
    """
    result = []

    # START DEBUG CODE
    initial_size = queue.qsize()
    print("Queue has %s items initially." % initial_size)
    # END DEBUG CODE

    while True:
        try:
            thing = queue.get(block=False)
            result.append(thing)
        except Queue.Empty:

            # START DEBUG CODE
            current_size = queue.qsize()
            total_size = current_size + len(result)
            print("Dumping complete:")
            if current_size == initial_size:
                print("No items were added to the queue.")
            else:
                print("%s items were added to the queue." % 
                      (total_size - initial_size))
            print("Extracted %s items from the queue, queue has %s items 
            left" % (len(result), current_size))
            # END DEBUG CODE

            return result

But for some reason it doesn't work.

Observe the following shell session:

>>> import multiprocessing
>>> q = multiprocessing.Queue()
>>> for i in range(100):
...     q.put([range(200) for j in range(100)])
... 
>>> q.qsize()
100
>>> l=dump_queue(q)
Queue has 100 items initially.
Dumping complete:
0 items were added to the queue.
Extracted 1 items from the queue, queue has 99 items left
>>> l=dump_queue(q)
Queue has 99 items initially.
Dumping complete:
0 items were added to the queue.
Extracted 3 items from the queue, queue has 96 items left
>>> l=dump_queue(q)
Queue has 96 items initially.
Dumping complete:
0 items were added to the queue.
Extracted 1 items from the queue, queue has 95 items left
>>> 

What's happening here? Why aren't all the items being dumped?

Answer

Try this:

import Queue  # Python 2 name; not actually needed for the sentinel approach below
import time

def dump_queue(queue):
    """
    Empties all pending items in a queue and returns them in a list.
    """
    result = []

    for i in iter(queue.get, 'STOP'):
        result.append(i)
    time.sleep(.1)
    return result

import multiprocessing
q = multiprocessing.Queue()
for i in range(100):
    q.put([range(200) for j in range(100)])
q.put('STOP')
l=dump_queue(q)
print len(l)  # expected output: 100

Multiprocessing queues have an internal buffer serviced by a feeder thread, which pulls work off the buffer and flushes it to the pipe. If not all of the objects have been flushed yet, I could see a case where Empty is raised prematurely. Using a sentinel to indicate the end of the queue is safe (and reliable). Also, the iter(get, sentinel) idiom is just better than relying on Empty.

I don't like that the Empty-based approach can fail due to flush timing. The time.sleep(.1) is there to allow a context switch to the feeder thread; you may not need it, and it works without it, but it's a habit of mine to release the GIL.
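For reference, here is a minimal Python 3 sketch of the same sentinel-based drain (the dump_queue_py3 name and the STOP constant are just illustrative choices, not part of the original answer):

import multiprocessing
import time

STOP = 'STOP'  # sentinel marking the end of the queue

def dump_queue_py3(q):
    """Drain every item put before the STOP sentinel into a list."""
    result = []
    # iter(q.get, STOP) keeps calling the blocking q.get() until it
    # returns the sentinel, so there is no reliance on the Empty exception.
    for item in iter(q.get, STOP):
        result.append(item)
    time.sleep(0.1)  # optional, mirrors the original answer's habit
    return result

if __name__ == '__main__':
    q = multiprocessing.Queue()
    for i in range(100):
        q.put([list(range(200)) for j in range(100)])
    q.put(STOP)
    items = dump_queue_py3(q)
    print(len(items))  # 100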
