python - Using multiprocessing pool of workers


I have the following code written to put my lazy second CPU core to work. What the code basically does is first find the desired "sea" files in the directory hierarchy, and then execute a set of external scripts that process these binary "sea" files, producing 50 to 100 text and binary files in the process. As the title of the question suggests, I want to do this in a parallel fashion to increase the processing speed.

This question originates from a long discussion we have been having on the IPython users list, starting with my experimentation with IPython's parallel processing functionality.

The issue is that I cannot get this code running correctly. If the folders containing the "sea" files hold nothing but "sea" files, the script finishes its execution without fully performing the external script runs. (Say I have 30-50 external scripts to run; my multiprocessing-enabled script exits after executing only the first script in this chain.) Interestingly, if I run the script on an already-processed folder (where the "sea" files have been processed beforehand and the output files are already in that folder), then it runs, and this time I get a speed-up of about 2.4 to 2.7X with respect to the linear processing time. That is not really expected, since I only have a Core 2 Duo 2.5 GHz CPU in my laptop. (Although I have a CUDA-powered GPU, it has nothing to do with my current parallel computing struggle :)

What do you think might be the source of this issue?

Thank you for all comments and suggestions.

    #!/usr/bin/env python
    from multiprocessing import Pool
    from subprocess import call
    import os

    def find_sea_files():
        # Walk the directory tree and collect every *.sea file
        # together with the absolute path of the folder holding it.
        file_list, path_list = [], []
        init = os.getcwd()
        for root, dirs, files in os.walk('.'):
            dirs.sort()
            for file in files:
                if file.endswith('.sea'):
                    file_list.append(file)
                    os.chdir(root)
                    path_list.append(os.getcwd())
                    os.chdir(init)
        return file_list, path_list

    def process_all(pf):
        # pf is a [path, filename] pair; run the external script there.
        os.chdir(pf[0])
        call(['postprocessing_saudi', pf[1]])

    if __name__ == '__main__':
        pool = Pool(processes=2)              # start 2 worker processes
        files, paths = find_sea_files()
        pathfile = [[paths[i], files[i]] for i in range(len(files))]
        pool.map(process_all, pathfile)

I would start by getting a better feel for what is going on in the worker processes. The multiprocessing module comes with logging for its subprocesses if you need it. Since you have simplified the code to narrow down the problem, I would debug with a few print statements, like the sketch below (or you can pretty-print the pf array):
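A minimal sketch of such debug prints, assuming the process_all worker and pathfile list from the question; the print labels and the explicit chunksize argument are illustrative additions, not the answer's exact code:

    def process_all(pf):
        # Show which worker picked up which job before running it.
        print "PID:", os.getpid()
        print "script dir:", pf[0]
        print "script:", pf[1]
        os.chdir(pf[0])
        call(['postprocessing_saudi', pf[1]])

    if __name__ == '__main__':
        pool = Pool(processes=2)
        files, paths = find_sea_files()
        pathfile = [[paths[i], files[i]] for i in range(len(files))]
        pool.map(process_all, pathfile, 1)   # chunksize 1: hand out one job at a time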

The version of Python that I accomplished this with is 2.6.4.
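As for the subprocess logging mentioned above, the standard library exposes a one-line helper for it; a minimal sketch (log_to_stderr is available in 2.6):

    import logging
    import multiprocessing

    # Route multiprocessing's internal log records (worker start-up,
    # task dispatch, shutdown) to stderr at DEBUG verbosity.
    multiprocessing.log_to_stderr(logging.DEBUG)

    pool = multiprocessing.Pool(processes=2)

With this enabled you can see whether the workers actually start, receive tasks, and exit cleanly, which should help narrow down why the run stops after the first external script.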

