python - Multiprocessing program not executing in parallel
I am scraping info from various sites. The scraper code is stored in modules (x, y, z, a, b), where each module's dump function stores the scraped data to files. Each dump function takes a single argument, 'input'. Note: the dump functions are not the same.

I am trying to run each of these dump functions in parallel. The code below runs fine, but I have noticed that it still follows the serial order x, y, ... during execution.

Is this the right way of going about the problem?

Are multithreading and multiprocessing the native ways of doing parallel programming in Python?
from multiprocessing import Process
import x.x as x
import y.y as y
import z.z as z
import a.a as a
import b.b as b

input = ""
f_list = [x.dump, y.dump, z.dump, a.dump, b.dump]

processes = []
for function in f_list:
    processes.append(Process(target=function, args=(input,)))
for process in processes:
    process.run()
for process in processes:
    process.join()
That's because of run(). That method implements the task itself; you're not meant to call it from outside. You're supposed to call start(), which spawns a new process that calls run() in that other process and then returns control to you so you can do more work (and later join()).
python multiprocessing