Python Multiprocessing using Queue to write to same file


Problem Description

I know there are many posts on Stack Exchange about writing results from multiprocessing to a single file, and I developed my code after reading those posts. What I am trying to achieve is to run the 'RevMapCoord' function in parallel and write its results to one single file using a multiprocessing.Queue. But I am having a problem queuing my jobs. My code:

    from multiprocessing import Process, Queue

    # workerQueue, writerQueue (multiprocessing.Queue instances), nproc,
    # final_res and sco_inp_extend_geno are defined earlier in the original
    # script (not shown in the question).

    def RevMapCoord(list):
        "Read a file, Find String and Do something"

    def feed(queue, parlist):
        for par in parlist:
            print ('Echo from Feeder: %s' % (par))
            queue.put(par)
        print ('**Feeder finished queing**')

    def calc(queueIn, queueOut):
        print ('Worker function started')
        while True:
            try:
                par = queueIn.get(block = False)
                res = RevMapCoord(final_res)
                queueOut.put((par, res))
            except:
                break

    def write(queue, fname):
        fhandle = open(fname, "w")
        while True:
            try:
                par, res = queue.get(block = False)
                print >>fhandle, par, res
            except:
                break
        fhandle.close()

    feedProc = Process(target = feed , args = (workerQueue, final_res))
    calcProc = [Process(target = calc , args = (workerQueue, writerQueue)) for i in range(nproc)]
    writProc = Process(target = write, args = (writerQueue, sco_inp_extend_geno))

    feedProc.start()
    print ('Feeder is joining')
    feedProc.join ()
    for p in calcProc:
        p.start()
    for p in calcProc:
        p.join()
    writProc.start()
    writProc.join ()
                

When I run this code, the script gets stuck at the 'feedProc.start()' step. The last few lines of output on screen show the print statements from the end of 'feedProc.start()':

    Echo from Feeder: >AK779,AT61680,50948-50968,50959,6,0.406808,Ashley,Dayne
    Echo from Feeder: >AK832,AT30210,1091-1111,1102,7,0.178616,John,Caine
    **Feeder finished queing**
                

But it hangs before executing the next line, 'feedProc.join()'. The code gives no error and keeps running but does nothing (it hangs). Please tell me what mistake I am making.
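
For reference, the hang at 'feedProc.join()' is consistent with a documented multiprocessing pitfall: a process that has put items on a Queue does not finish until those items are drained, so joining the feeder before any consumer is running can block indefinitely; in addition, the workers' 'queueIn.get(block=False)' inside a bare except makes them quit the moment the queue happens to be empty. Below is a minimal sketch (not the original poster's code) of the same feeder/workers/writer layout using blocking get() calls and None sentinels; the names worker, writer, NPROC and out.txt are illustrative stand-ins.

    from multiprocessing import Process, Queue

    NPROC = 4  # illustrative worker count

    def worker(task_q, result_q):
        # Block on get() and stop at a None sentinel instead of polling with
        # block=False, which raises Empty whenever the queue is momentarily empty.
        for par in iter(task_q.get, None):
            result_q.put((par, par.upper()))      # stand-in for RevMapCoord(par)
        result_q.put(None)                        # tell the writer this worker is done

    def writer(result_q, fname, nworkers):
        finished = 0
        with open(fname, "w") as fh:
            while finished < nworkers:
                item = result_q.get()
                if item is None:
                    finished += 1
                    continue
                par, res = item
                fh.write("%s %s\n" % (par, res))

    if __name__ == "__main__":
        task_q, result_q = Queue(), Queue()
        workers = [Process(target=worker, args=(task_q, result_q)) for _ in range(NPROC)]
        writ = Process(target=writer, args=(result_q, "out.txt", NPROC))
        for p in workers:                         # start every consumer first
            p.start()
        writ.start()
        for par in ["a", "b", "c"]:               # stand-in for the real input list
            task_q.put(par)
        for _ in range(NPROC):
            task_q.put(None)                      # one sentinel per worker
        for p in workers:
            p.join()
        writ.join()

Because the consumers are started before anything is fed and the joins happen only after every queue can be drained, none of the join() calls can block on an undrained queue.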

Recommended Answer

I achieved writing results from multiprocessing to a single file by using the 'map_async' function in Python 3. Here is the function I wrote:

    from multiprocessing import Pool

    def PPResults(module, alist):   # parallel processing
        npool = Pool(int(nproc))
        res = npool.map_async(module, alist)
        results = res.get()         # results are returned as a list
        return results
                

So, I provide this function with a list of parameters in 'alist', and 'module' is a function that does the processing and returns the result. The function above collects the results in a list and returns once all the parameters in 'alist' have been processed. The results might not be in the original order, but since order was not important for me this worked well. The 'results' list can then be iterated and the individual results written to a file like this:

    fh_out = open('./TestResults', 'w')
    for i in results:               # write each result from the list to the file
        fh_out.write(i)
    fh_out.close()
                
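For completeness, here is a hedged end-to-end sketch of how the pieces above might be wired together. The stand-in RevMapCoord body, the coord_list contents and the nproc value are assumptions, not taken from the answer, and npool.close()/join() is added only for tidiness.

    from multiprocessing import Pool

    nproc = 4  # assumed worker count

    def PPResults(module, alist):
        # The answer's helper, reproduced here so the sketch runs on its own.
        npool = Pool(int(nproc))
        res = npool.map_async(module, alist)
        results = res.get()          # blocks until every item in alist is processed
        npool.close()
        npool.join()
        return results

    def RevMapCoord(entry):
        # Stand-in for the real worker: read a file, find a string, do something.
        return entry + "\tprocessed\n"

    if __name__ == "__main__":
        coord_list = [">AK779,AT61680,50948-50968", ">AK832,AT30210,1091-1111"]
        results = PPResults(RevMapCoord, coord_list)
        with open("./TestResults", "w") as fh_out:
            for line in results:     # each stand-in result already ends with a newline
                fh_out.write(line)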

To keep the results in order, we would probably need to use queues, similar to what I mentioned in my question (above). Although I was able to fix that code, I don't believe it needs to be included here.
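
Since the answer leaves the ordered variant unspecified, here is one possible queue-based sketch (an assumption, not the poster's fix): each input is tagged with its index, workers put (index, result) pairs, and a single writer buffers out-of-order results and writes them strictly in input order. The names indexed_worker, order_preserving_writer and ordered.txt are illustrative.

    from multiprocessing import Process, Queue

    def indexed_worker(task_q, result_q):
        # Consume (index, item) pairs until the None sentinel arrives.
        for idx, par in iter(task_q.get, None):
            result_q.put((idx, par.upper() + "\n"))   # stand-in for RevMapCoord(par)

    def order_preserving_writer(result_q, fname, ntasks):
        # Buffer (index, result) pairs and write them in input order.
        pending, next_idx = {}, 0
        with open(fname, "w") as fh:
            for _ in range(ntasks):
                idx, res = result_q.get()             # blocking get, no busy polling
                pending[idx] = res
                while next_idx in pending:            # flush whatever is now in order
                    fh.write(pending.pop(next_idx))
                    next_idx += 1

    if __name__ == "__main__":
        tasks = ["b", "a", "c"]                       # hypothetical input list
        nproc = 2
        task_q, result_q = Queue(), Queue()
        workers = [Process(target=indexed_worker, args=(task_q, result_q))
                   for _ in range(nproc)]
        writ = Process(target=order_preserving_writer,
                       args=(result_q, "ordered.txt", len(tasks)))
        for p in workers:
            p.start()
        writ.start()
        for pair in enumerate(tasks):                 # tag each item with its index
            task_q.put(pair)
        for _ in range(nproc):
            task_q.put(None)                          # one sentinel per worker
        for p in workers:
            p.join()
        writ.join()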

Thanks,

AK
