
        How to increment a shared counter from multiple processes?
                This article covers how to increment a shared counter from multiple processes; the approach below may be a useful reference for anyone facing the same problem.

                  Problem Description

                  I'm having trouble with the multiprocessing module. I'm using a Pool of workers with its map method to concurrently analyze a large number of files. Each time a file has been processed, I'd like to update a counter so that I can keep track of how many files remain to be processed. Here is sample code:

                  import os
                  import multiprocessing
                  
                  counter = 0
                  
                  
                  def analyze(file):
                      # Analyze the file.
                      global counter
                      counter += 1
                      print(counter)
                  
                  
                  if __name__ == '__main__':
                      files = os.listdir('/some/directory')
                      pool = multiprocessing.Pool(4)
                      pool.map(analyze, files)
                  

                  I haven't been able to find a solution to this.

                  Recommended Answer

                  The problem is that the counter variable is not shared between your processes: each separate process creates its own local instance and increments that.

                  See this section of the documentation for some techniques you can use to share state between your processes. In your case, you probably want to share a Value instance between your workers.

                  Here is a working version of your example (with some dummy input data). Note that it uses global values, which I would really try to avoid in practice:

                  from multiprocessing import Pool, Value
                  from time import sleep
                  
                  counter = None
                  
                  def init(args):
                      ''' store the counter for later use '''
                      global counter
                      counter = args
                  
                  def analyze_data(args):
                      ''' increment the global counter, do something with the input '''
                      global counter
                      # += operation is not atomic, so we need to get a lock:
                      with counter.get_lock():
                          counter.value += 1
                      print(counter.value)
                      return args * 10
                  
                  if __name__ == '__main__':
                      #inputs = os.listdir(some_directory)
                  
                      #
                      # initialize a cross-process counter and the input lists
                      #
                      counter = Value('i', 0)
                      inputs = [1, 2, 3, 4]
                  
                      #
                      # create the pool of workers, ensuring each one receives the counter
                      # as it starts.
                      #
                      p = Pool(initializer=init, initargs=(counter,))
                      i = p.map_async(analyze_data, inputs, chunksize=1)
                      i.wait()
                      print(i.get())
                  
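                  If the counter is only needed for progress tracking, an alternative worth considering is to avoid shared state entirely and count completed tasks in the parent process as results arrive, for example with imap_unordered. This is a minimal sketch, not from the original answer; the function names and the `item * 10` dummy workload are placeholders mirroring the example above:

                  ```python
                  from multiprocessing import Pool

                  def analyze(item):
                      # Stand-in for the real per-file analysis; `item * 10`
                      # mirrors the dummy workload in the answer above.
                      return item * 10

                  def run(inputs):
                      done = 0
                      results = []
                      with Pool(4) as pool:
                          # imap_unordered yields each result as soon as a worker
                          # finishes it, so the parent can count completed tasks
                          # without any cross-process counter or lock.
                          for result in pool.imap_unordered(analyze, inputs):
                              done += 1
                              print('%d of %d done' % (done, len(inputs)))
                              results.append(result)
                      return results

                  if __name__ == '__main__':
                      print(sorted(run([1, 2, 3, 4])))
                  ```

                  Because only the parent process touches `done`, no lock is needed; the trade-off is that results arrive in completion order rather than input order.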

