

Issue while sharing semaphore objects between processes in parallel python

python,python-2.7,multiprocessing,parallel-python
I am facing an issue while passing semaphore objects or synchronization objects like Events, pipes, queues, etc. to a child process when used with parallel python. I am getting the following error when I passed a Queue to a child process. ' through inheritance' % type(self).__name__ RuntimeError: Queue objects should only...

Python - multiprocessing pool with functions with 2 input arrays

python,arrays,numpy,multiprocessing,pool
I'm trying to use multiprocessing in order to decrease the calculation time of functions that depend on 2D arrays of shape 2000x2000. I have 2 input arrays for the function, but with p.map it doesn't work... (with one it's OK). How can I make that work?...
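
One common workaround is to pack the two arrays into tuples and unpack them inside a one-argument wrapper, since Pool.map only passes a single argument. A minimal sketch, with hypothetical array names and a row-wise product standing in for the real function:

    import numpy as np
    from multiprocessing import Pool

    def compute(args):
        # Unpack the (row_a, row_b) tuple that map() delivers as one argument.
        row_a, row_b = args
        return (row_a * row_b).sum()

    if __name__ == '__main__':
        a = np.random.rand(2000, 2000)
        b = np.random.rand(2000, 2000)
        p = Pool()
        # zip() pairs up corresponding rows so each task sees both inputs.
        results = p.map(compute, zip(a, b))
        p.close()
        p.join()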

Multiprocessing separation of tasks

python,multiprocessing,python-multiprocessing
I have two different tasks that I want to split amongst processes. I have one task that consists of retrieving responses from URLs and writing the responses to a Queue (multiprocessing Queue, not threading) which I would like to have a few processes working on in parallel. I have another...

How to do two requests in parallel

python,multiprocessing,python-requests
I have the following code which requests something from Amazon's API: params = {'Operation': 'GetRequesterStatistic', 'Statistic': 'NumberHITsAssignable', 'TimePeriod': 'LifeToDate'} response = self.conn.make_request(action=None, params=params, path='/', verb='GET') data['ActiveHITs'] = self.conn._process_response(response).LongValue params = {'Operation': 'GetRequesterStatistic', 'Statistic': 'NumberAssignmentsPending', 'TimePeriod': 'LifeToDate'} response = self.conn.make_request(action=None, params=params, path='/', verb='GET')...

multiprocessing Event makes my code slow

python,multiprocessing,finance,python-multiprocessing
I have a main process that is eating data from many different markets. It does some preliminary processing on the message and then passes it into a multiprocessing Queue (each unique market has its own dedicated process, call it Parse, on the other end of the Queue). Then the...

Behavior of multiprocessing module on cluster

python,multiprocessing,cluster-computing,qsub
There are modules which are suited for multiprocessing on clusters, listed here. But I have a script which is already using the multiprocessing module. This answer states that using this module on a cluster will only let it make processes within a node. But what is this behavior like? Let's...

Multithreading is an extension of multiprocessing?

multithreading,multiprocessing
Is multithreading an extension of multiprocessing, but within a process scope? True / false

python multiprocessing zombie processes

python,multiprocessing,zombie-process
I have a simple implementation using Python's multiprocessing module: if __name__ == '__main__': jobs = [] while True: for i in range(40): # fetch one by one from redis queue #item = item from redis queue p = Process(name='worker '+str(i), target=worker, args=(item,)) # if p is not running, start p...
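
Zombies accumulate when finished children are never reaped; join() on completed processes clears them (multiprocessing.active_children() also joins finished children as a side effect). A minimal sketch of the reaping step, with a hypothetical worker:

    import multiprocessing

    def worker(item):
        pass  # stand-in: process one item from the queue

    if __name__ == '__main__':
        jobs = []
        for i in range(40):
            p = multiprocessing.Process(name='worker ' + str(i),
                                        target=worker, args=(i,))
            p.start()
            jobs.append(p)
        # Reap the children so they do not linger as zombies.
        for p in jobs:
            p.join()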

Python multiprocessing example never terminates

python,multiprocessing,terminate
I'm fairly new to Python multiprocessing; I came across a tutorial and tried out its multiprocessing example. Here the processes are not getting terminated; they run forever. What's wrong? I read that when optional arguments are true the process doesn't terminate, hence I've put an empty statement over...

Killing a process spawned by another thread

python,multithreading,multiprocessing
I have a Python process which is spawning another process from a separate thread, e.g. class MyClass(unittest.TestCase): def setup(self): def spawn_proc(): subprocess.call("test_process") thread = threading.Thread(target=spawn_proc, args=(), daemon=True) thread.start() def cleanup(self): # @@@ kill test_process So calling MyClass.setup() means the test_process will be spawned in second thread. What I want is...

Python - Using Streamhandler in multiprocessing environment

python,windows,logging,multiprocessing
I have a CLI script which logs all its processes into a log file. One of the functions of the CLI is to upload a large file by splitting it up into pieces and uploading them in parallel. In Linux, the whole thing works like a charm, but in Windows...

Concurrent requests with MRI Ruby

ruby-on-rails,ruby,multithreading,ruby-on-rails-4,multiprocessing
I put together a simple example trying to demonstrate concurrent requests in Rails. Note that I am using MRI Ruby 2 and Rails 4.2. def api_call sleep(10) render :json => "done" end I then go to 4 different tabs in Chrome on my mac (i7 / 4...

Spawn parallel processes for function and pass several different arguments to function

python,multiprocessing,python-multiprocessing
Hi everybody, I took Jiaaro's solution as a template to convert it from threading to multiprocessing: import multiprocessing from function_repo import run from time import time vitems = ['02','63','25','0'] num_processes = (multiprocessing.cpu_count()/1) threads = [] if __name__ == '__main__': begin = time() print begin # run until all the threads...

multiple files upload using ftplib and multiprocessing

python,multithreading,python-2.7,multiprocessing
I am trying to upload multiple files using FTP. However, instead of uploading multiple different files, it uploads one file multiple times. What's wrong with it? import fnmatch import os from multiprocessing import Pool import ftplib file_extensions = [ '*.mp4', '*wmv' ] matches = [] #match = [] exclude = "/ext_hdd/download/incomplete"...

Keep unified count during multiprocessing?

python,multithreading,multiprocessing
I have a python program that runs a Monte Carlo simulation to find answers to probability questions. I am using multiprocessing; here it is in pseudocode: import multiprocessing def runmycode(result_queue): print "Requested..." while 1==1: iterations +=1 if "result found (for example)": result_queue.put("result!") print "Done" processs = [] result_queue...
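
One way to keep a unified count is a multiprocessing.Value, which carries its own lock; a minimal sketch, not the asker's simulation:

    import multiprocessing

    def runmycode(counter):
        for _ in range(1000):
            with counter.get_lock():   # serialize increments across processes
                counter.value += 1

    if __name__ == '__main__':
        counter = multiprocessing.Value('i', 0)  # shared integer
        procs = [multiprocessing.Process(target=runmycode, args=(counter,))
                 for _ in range(4)]
        for p in procs:
            p.start()
        for p in procs:
            p.join()
        print(counter.value)  # 4000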

Python piped subprocess hanging after join

python,python-2.7,multiprocessing,python-multiprocessing
Hopefully this is a fairly easy question to answer. I am trying to constantly send an array of data to a worker process while my program is executing. The issue is that, when I attempt to join the thread, the program just hangs. I'd thought that perhaps the worker terminated...

Multiprocessing and a global manager

python,dictionary,process,multiprocessing,future
I have import concurrent.futures from multiprocessing import Process,Manager,freeze_support from functools import partial def setup(): freeze_support() global manager manager=Manager() global dict1 dict1=manager.dict() global dict2 dict2=manager.dict() load_stuff() and later: def foo(file): #do some stuff foobar() def foobar(): #look up some stuff in dict1,dict2 def load_stuff(): f=partial(foo,dict1,dict2) with concurrent.futures.ProcessPoolExecutor() as executor: for users,...

C programming bi directional communication

c,unix,multiprocessing,ipc,inter-process-communication
I am trying to make something work here. I have a C program where my parent process creates a pipe so it can listen to requests from child processes. These children are created dynamically; it is never the same number. So far, I managed to send the requests to the parent through...

How do I run multiple subprocesses in parallel and wait for them to finish in Python

python,multiprocessing
I am trying to migrate a bash script to Python. The bash script runs multiple OS commands in parallel then waits for them to finish before resuming, i.e.: command1 & command2 & . commandn & wait command I want to achieve the same using Python subprocess. Is this possible? How...
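
Yes; the bash pattern maps directly onto subprocess.Popen, which starts a command without waiting, plus wait() on each handle. A minimal sketch with placeholder commands:

    import subprocess

    commands = [['sleep', '2'], ['sleep', '3']]  # placeholders for the real OS commands

    # Launch every command without blocking (the shell's `cmd &`).
    procs = [subprocess.Popen(cmd) for cmd in commands]

    # The equivalent of the shell's `wait`: block until every child exits.
    for p in procs:
        p.wait()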

passing arguments and manager.dict to pool in multiprocessing in python 2.7

python,python-2.7,multiprocessing,pool
I want to parallelise a function that will update a shared dictionary using Pool instead of Process, so that I don't allocate too many CPUs. i.e. can I take this def my_function(bar,results): results[bar] = bar*10 def paralell_XL(): from multiprocessing import Pool, Manager, Process manager = Manager() results=manager.dict() jobs = []...

Writing Errors in Multiprocessing Python

python,multithreading,python-2.7,multiprocessing
I am trying to write certain files after editing, using multiprocessing code in Python (2.7). It works like a charm for a small number (<20), but when I try more files (20+), it goes berserk. I am using Python 2.7.5 on CentOS 6.5 with a 4-core processor. import sys, os import multiprocessing...

python multiprocessing/threading cleanup

python,multithreading,multiprocessing
I have a python tool, that has basically this kind of setup: main process (P1) -> spawns a process (P2) that starts a tcp connection -> spawns a thread (T1) that starts a loop to receive messages that are sent from P2 to P1 via a Queue (Q1) server process...

Multi core ZeroMQ?

python,multithreading,multiprocessing,zeromq
ZeroMQ is used for receiving input parameters... def server(): rep = context.socket(zmq.REP) rep.bind('tcp://*:{}'.format(PORT)) while True: data = rep.recv_json() result = calculate(data) rep.send_json(result) The calculation method is called calculate; after it finishes, the result is sent to the client through ZMQ. Based on my tests, it currently uses only 1 core of the...

Can I use a StreamHandler for Logging in a Multiprocessing Environment in Python?

python,logging,multiprocessing,stdout
Is it safe to use a single StreamHandler in a multiprocessing environment? More precisely, can it be problematic to have a just one StreamHandler that simply prints the logging statements of all processes to stdout? Like this, for example: import multiprocessing as mp import logging def do_log(no): # 2nd EDIT,...

Run function with positional and optional arguments in parallel in python (follow up)

python,python-2.7,parallel-processing,multiprocessing
This is a follow up question to: Python: How can I run python functions in parallel? Minimal Working Example: ''' Created on 06.05.2015 http://stackoverflow.com/questions/7207309/python-how-can-i-run-python-functions-in-parallel ''' from multiprocessing import Process import time def runInParallel(*fns): proc = [] for fn in fns: p = Process(target=fn) p.start() proc.append(p) for p in proc: p.join()...

Serialize DictProxy object to JSON

python,dictionary,multiprocessing
I have a DictProxy object created using multiprocessing.Manager().dict() to support concurrency. At the end of the run, I need to serialize the dict to JSON. But it's unclear how to convert the DictProxy to a serializable dict object. When I tried it, I got: TypeError: <DictProxy object, typeid 'dict' at 0x10a240ed0>...
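
The usual fix is to copy the proxy into a plain dict first, since the proxy itself is not JSON-serializable; a minimal sketch:

    import json
    import multiprocessing

    if __name__ == '__main__':
        manager = multiprocessing.Manager()
        shared = manager.dict()
        shared['done'] = 42
        # dict(proxy) (or proxy.copy()) pulls the data into a plain dict.
        print(json.dumps(dict(shared)))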

Properly Designing a Multiprocessing.Manager Custom Object

python,python-2.7,amazon-s3,multiprocessing
I want to use the multiprocessing.Manager() object so I can asynchronously send information from a worker to the manager to send information to a server. What I have is roughly 10 instances writing PDFs to disk. I then wanted to use the manager object in the multiprocessing package to send...

How to get around the pickling error of python multiprocessing without being in the top-level?

python,multiprocessing,pickle,python-multiprocessing
I've researched this question on here multiple times, but I haven't exactly found a workaround that either works in my case, or one that I understand, so please bear with me. Basically I have a hierarchical organization of functions, and that is preventing me from multiprocessing in the top-level. Unfortunately,...

Python multiprocessing.Process object with multiprocessing.Manager creates multiple multiprocessing forks in Windows Task Manager

python,multiprocessing
I am running python 3.4.3 on Windows Standard Embedded 7. I have a class that inherits multiprocessing.Process. In the class's run method I create a thread for the process object to start. While watching Task Manager, specifically the Command Line column, when the process class is instantiated I see a...

python: start a Multiprocessing process on wxbutton click

python,python-2.7,wxpython,multiprocessing,wx
I have a wxpython app running, and I was wondering if I could start another instance of the app as a result of a button click when the previous ones continue to run? something like : Start a wxpython app -> click on a button -> and the event spawns...

When can a Python object be pickled

python,multiprocessing,pickle
I'm doing a fair amount of parallel processing in Python using the multiprocessing module. I know certain objects CAN be pickled (and thus passed as arguments in multiprocessing) and others can't. E.g. class abc(): pass a=abc() pickle.dumps(a) 'ccopy_reg\n_reconstructor\np1\n(c__main__\nabc\np2\nc__builtin__\nobject\np3\nNtRp4\n.' But I have some larger classes in my code (a dozen methods, or...
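
A pragmatic way to answer "can this be pickled?" is simply to try the round trip; a minimal sketch:

    import pickle

    def picklable(obj):
        """Return True if obj survives a pickle round trip."""
        try:
            pickle.loads(pickle.dumps(obj))
            return True
        except Exception:   # pickling can raise several exception types
            return False

    class abc(object):
        pass

    print(picklable(abc()))         # True: plain instances pickle fine
    print(picklable(lambda x: x))   # False: lambdas cannot be pickled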

Multiprocessing in Python 3 parallel processing and waiting on jobs

python,multithreading,python-3.x,multiprocessing
I have a piece of code that queries a DB and returns a set of IDs. For each ID, I need to run a related query to get a dataset. I would like to run the queries in parallel to speed up the processing. Once all the processes are run,...

PyQt SimpleHTTPServer: GUI freezes on starting server

python,multithreading,pyqt,multiprocessing,simplehttpserver
I am trying to create a simple desktop app using PyQt that runs a SimpleHTTPServer on clicking a start server button. I have tried using threads (both Python threads and QThread) and understand that this is not possible as it runs into issues with the GIL. Here's the code def btn_startserver_clicked(self):...

Local variable not updated in a loop in the same way as shared memory objects in Python

python,multiprocessing,python-multiprocessing
In the following Python code, the multiprocessing module starts three processes that print out the values of one local variable and two multiprocessing shared memory objects. import multiprocessing as mp import os,time # local variable count = 0 # shared memory objects (int and array) scalar = mp.Value('i', 0) vector...

Python multiprocess share memory vs using arguments

python,data,pandas,multiprocessing,shared-memory
I'm trying to get my head around what is the most efficient and least memory-consuming way to share the same data source between different processes. Imagine the following code, that simplifies my problem. import pandas as pd import numpy as np from multiprocessing import Pool # method #1 def...

python, multiprocessing and dmtcp: checkpointing one process in Pool?

python,multiprocessing,pool,checkpoint
Is it possible to use python's integration of dmtcp to checkpoint a child process in parallel execution? My situation is as follows: I have a multiprocessing.Pool with several workers receiving async jobs (using apply_async). Certain big jobs require all the resources (cpu cores & memory). When one of these jobs...

Starting celery worker from multiprocessing

python,flask,multiprocessing,celery,elastic-beanstalk
I'm new to celery. All of the examples I've seen start a celery worker from the command line. e.g: $ celery -A proj worker -l info I'm starting a project on elastic beanstalk and thought it would be nice to have the worker be a subprocess of my web app....

Multiprocessing: How to write separate log files for each instance while using pool.map?

python,class,logging,multiprocessing,instance
I want to create a class where each instance writes its own log file. This works fine when I use a function instead of a class (or when I don't use multiprocessing): import multiprocessing, logging def setup_logger(name_logfile, path_logfile): logger = logging.getLogger(name_logfile) formatter = logging.Formatter('%(asctime)s: %(message)s', datefmt='%Y/%m/%d %H:%M:%S') fileHandler = logging.FileHandler(path_logfile,...

Python subclassing process with constructor

python,class,constructor,multiprocessing
I'm trying to create an object as a new process. If I give the class a constructor, the program shows an error. Code import multiprocessing as mp import time class My_class(mp.Process): def __init__(self): self.name = "Hello "+self.name self.num = 20 def run(self): print self.name, "created and waiting for", str(self.num),...
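
The usual cause of that error is overriding __init__ without calling the parent constructor, so Process attributes like self.name are never set up. A minimal sketch of the fix:

    import multiprocessing as mp

    class My_class(mp.Process):
        def __init__(self):
            # Initialize the Process machinery first; only then does
            # self.name exist.
            mp.Process.__init__(self)
            self.name = 'Hello ' + self.name
            self.num = 20

        def run(self):
            print(self.name + ' created and waiting for ' + str(self.num))

    if __name__ == '__main__':
        p = My_class()
        p.start()
        p.join()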

Sharing many queues among processes in Python

python,queue,multiprocessing,python-multiprocessing
I am aware of multiprocessing.Manager() and how it can be used to create shared objects, in particular queues which can be shared between workers. There is this question, this question, this question and even one of my own questions. However, I need to define a great many queues, each of...

Running multiple programs from the for loop

python,multithreading,performance,multiprocessing
This program prints out the number of lines of each text file one by one: files = glob.glob('*.txt') # 5 files for f in files: with open(f,'r') as fi: lines = fi.read().splitlines() print len(lines) How can I write my code so that it runs 5 simultaneous programs and prints number...
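
A minimal sketch of a Pool version, keeping the per-file logic in a worker so the five files are counted concurrently:

    import glob
    from multiprocessing import Pool

    def count_lines(path):
        with open(path, 'r') as fi:
            return path, len(fi.read().splitlines())

    if __name__ == '__main__':
        files = glob.glob('*.txt')   # 5 files
        pool = Pool(len(files))      # one worker per file
        for name, n in pool.map(count_lines, files):
            print('%s: %d' % (name, n))
        pool.close()
        pool.join()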

Synchronizing processes with semaphores and signals in C

c,linux,multiprocessing,signals,semaphore
I have to write a program in C on Linux. It has to have 3 processes - the first reads from STDIN and sends the message through a FIFO to the second process, which counts the length of the received message and sends the result to the third process (also through a FIFO), which displays it on STDOUT. I have to...

Differences between python Queue.Queue and multiprocessing.Queue

python,opencv,pyqt,multiprocessing,pyside
My program does not close cleanly when I use the queue from the multiprocessing module (Python 2.7 on Windows) in place of Queue.Queue. Eventually I want to process the frames in imgbuffer using a multiprocessing.Process and then pull back display data using a second queue. This does not work yet...

multiprocessing.Queue hanging when Process dies

python,multiprocessing
I have a subprocess via multiprocessing.Process and a queue via multiprocessing.Queue. The main process is using multiprocessing.Queue.get() to get some new data. I don't want to have a timeout there and I want it to be blocking. However, when the child process dies for whatever reason (manually killed by user...

Launching nested processes in multiprocessing

python,multiprocessing
I have a main file that launches multiple processes and one of the processes again launches multiple processes. I am having problems launching the nested set of processes. I have the following code in one file: # parallel_test.py import Queue import multiprocessing import time import threading def worker(q): while not...

Multiprocessing not working with cmd2 module, import issue?

python,import,cmd,multiprocessing,importerror
Extending the solution from this question Multiprocessing code works upon import, breaks upon being called, I have injected some multiprocessing code into my project, which has promptly broken. I think there are import issues. I have two modules. test.py looks like: print 'omnia praeclara' import multi4 if __name__ == "__main__":...

Multiprocessing vs running several Python interpreters

python,multiprocessing
Is there any benefit in using multiprocessing versus running several Python interpreters in parallel for long-running, embarrassingly parallel tasks? At the moment, I'm just firing up several Python interpreters that run the analysis over slices of input data, each of them dumping the results into a separate pickle file. It...

fork() in C; which should be parent process which should be child process

c,multiprocessing,fork
This may seem to be a dumb question but I don't really have a good understanding of fork() other than knowing that this is about multi-threading. Child process is like a thread. If a task needs to be processed via fork(), how to correctly assign tasks to parent process and...

Python / OpenCV application lockup issue

python,multithreading,opencv,multiprocessing
My Python application running on a 64-core Linux box normally runs without a problem. Then after some random length of time (around 0.5 to 1.5 days usually) I suddenly start getting frequent pauses/lockups of over 10 seconds! During these lockups the system CPU time (i.e. time in the kernel) can...

Multiprocessing slower than sequential with python

python,multiprocessing
I've invested quite a lot of time in rewriting my code to exploit more cores, but when I benchmarked it I found that all I had achieved was making it 7 times slower than the original code, despite running on 16 cores rather than one! This leads me to believe...

Multiprocessing works in Ubuntu, doesn't in Windows

python,windows,multiprocessing,cherrypy,python-multiprocessing
I am trying to use this example as a template for a queuing system on my cherrypy app. I was able to convert it from Python 2 to Python 3 (changing from Queue import Empty into from queue import Empty) and to execute it in Ubuntu. But when I execute...
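
A frequent cause of exactly this Ubuntu-works/Windows-breaks split: Windows has no fork(), so children re-import the main module, and any spawning code not guarded by if __name__ == '__main__': runs again in every child. A minimal sketch of the required structure:

    from multiprocessing import Process

    def worker():
        print('working')

    if __name__ == '__main__':
        # Without this guard, each spawned child re-executes the spawning
        # code on Windows and the program recurses endlessly.
        procs = [Process(target=worker) for _ in range(4)]
        for p in procs:
            p.start()
        for p in procs:
            p.join()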

Why is this python boto S3 multipart upload code not working?

python,amazon-web-services,amazon-s3,multiprocessing,boto
I am trying to upload a 10 GB file to AWS S3, and someone said to use S3 Multipart Upload, so I stumbled upon someone's github gist: import os import sys import glob import subprocess import contextlib import functools import multiprocessing from multiprocessing.pool import IMapIterator from optparse import OptionParser from...

Exit from multiprocessing pool upon exception or KeyboardInterrupt? [duplicate]

python,multiprocessing,python-multiprocessing
This question already has an answer here: Keyboard Interrupts with python's multiprocessing Pool (7 answers). I would like my program to exit as soon as I press Ctrl+C: import multiprocessing import os import time def sqr(a): time.sleep(0.2) print 'local {}'.format(os.getpid()) #raise Exception() return a * a pool = multiprocessing.Pool(processes=4)...
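
The workaround from the linked duplicate is to use map_async() with a get() timeout, because a bare Pool.map() can block in a way that swallows KeyboardInterrupt; a minimal sketch:

    import multiprocessing
    import time

    def sqr(a):
        time.sleep(0.2)
        return a * a

    if __name__ == '__main__':
        pool = multiprocessing.Pool(processes=4)
        try:
            # get() with a timeout keeps the main process interruptible.
            results = pool.map_async(sqr, range(100)).get(timeout=9999)
        except KeyboardInterrupt:
            pool.terminate()   # kill the workers immediately on Ctrl+C
        else:
            pool.close()
        pool.join()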

Parallel downloads using pySmartDL - python multiprocess

python,multithreading,multiprocessing
I'm hoping to use pySmartDL for my project to handle downloads. But pySmartDL doesn't support parallel downloads, and the execution thread stops until a download is finished. I tried using WorQ to handle the issue, but with that the download doesn't happen after I start it. Is there a better way to...

With the MESI protocol, a write hit also stalls the processor, right?

caching,architecture,multiprocessing,vhdl,mesi
I'm doing a project that is to implement a dual-processor system with some kind of cache coherency (for which I chose MESI) in VHDL. I just want to confirm this one thing: a write-hit on a shared cache line should cause the cache controller to send invalidation messages on the...

Executing a cassandra insert query through Python multiprocessing queue

python,cassandra,queue,multiprocessing
I have a cassandra keyspace sujata. I am connecting to Cassandra using the Python driver cassandra.cluster. The column family of sujata is hello. Following is my code:- from multiprocessing import Process,Queue from cassandra.cluster import Cluster import os queue=Queue() cluster = Cluster(['127.0.0.1']) metadata = cluster.metadata session = cluster.connect("sujata") def hi(): global session global queue...

Java 8 automatically using multicore?

java,multithreading,multiprocessing,java-8,multicore
I did some tests a year ago concerning multicore with Java 7. First I implemented some calculations only in the main thread (CPU usage showed that only one core did all the work) and then I implemented Callable with an ExecutorService instance. While running it, all cores were doing the...

Can I implement a counter for multiprocessing using pool callback?

python,multiprocessing,python-multiprocessing
I googled a bit for how to correctly build a counter to keep track of the progress of work done. So far it seems all answers involved the use of lock and Value. I am wondering if I can achieve it using the callback. It seems that the callback is...
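
In principle yes: apply_async callbacks all run in a single result-handler thread inside the parent process, so a counter updated there needs no multiprocessing lock. A minimal sketch:

    import multiprocessing

    def work(x):
        return x * x

    done = [0]   # mutable counter, touched only in the parent

    def on_result(result):
        # Callbacks execute one at a time in a thread of the parent.
        done[0] += 1
        print('completed %d' % done[0])

    if __name__ == '__main__':
        pool = multiprocessing.Pool(4)
        for x in range(20):
            pool.apply_async(work, (x,), callback=on_result)
        pool.close()
        pool.join()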

Using a shared queue that workers can add tasks to

python,concurrency,multiprocessing,python-multiprocessing
I'm pretty new to python (I mainly write code in Java). I have a python script that's essentially a crawler. It calls phantomjs, which loads up the page, returns its source, and a list of urls that it found in the page. I've been trying to use Python 3's multiprocessing...

Python multiprocessing - Is it possible to introduce a fixed time delay between individual processes?

python,multithreading,batch-file,multiprocessing
I have searched and cannot find an answer to this question elsewhere. Hopefully I haven't missed something. I am trying to use Python multiprocessing to essentially batch run some proprietary models in parallel. I have, say, 200 simulations, and I want to batch run them ~10-20 at a time. My...

multiprocessing.Pool with maxtasksperchild produces equal PIDs

python,python-3.x,multiprocessing,pid
I need to run a function in a process, which is completely isolated from all other memory, several times. I would like to use multiprocessing for that (since I need to serialize a complex output coming from the functions). I set the start_method to 'spawn' and use a pool with...

Using process instead of thread with zeromq

python,multiprocessing,zeromq
I'm reading this code http://zguide.zeromq.org/py:mtserver But when I tried to replace threading.Thread with multiprocessing.Process, I got the error Assertion failed: ok (mailbox.cpp:84) Code is import time import threading import zmq def worker_routine(worker_url, context=None): """Worker routine""" context = context or zmq.Context.instance() # Socket to talk to dispatcher socket = context.socket(zmq.REP) socket.connect(worker_url)...
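
That assertion usually means a zmq.Context (or socket) created before the fork is being reused in the child; contexts are not fork-safe. Creating the context inside each process avoids it. A minimal sketch of the worker side, with a hypothetical URL:

    import multiprocessing
    import zmq

    def worker_routine(worker_url):
        # Create the context *inside* the child process, never before the fork.
        context = zmq.Context()
        socket = context.socket(zmq.REP)
        socket.connect(worker_url)
        while True:
            socket.recv()
            socket.send(b'World')

    if __name__ == '__main__':
        url = 'tcp://127.0.0.1:5560'
        for _ in range(4):
            multiprocessing.Process(target=worker_routine, args=(url,)).start()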

How to specify multiple processors in PBS?

python,multiprocessing,pbs
There are two options in the PBS queuing system (that I know of) that are related to multiple processes. This is the relevant line in the script: #PBS -l nodes=1:ppn=1 I always used just one process, and this went fine. However, to speed things up, I rewrote my script...

How to make Pool sensitive of KeyboardInterrupt [duplicate]

python,multiprocessing
This question already has an answer here: Keyboard Interrupts with python's multiprocessing Pool (7 answers). Is there a way to manually get out of a multithreaded operation? Here is what I'm trying to do: pool = Pool(num_parallel_workers) pool.map(update_completions_in_parallel, list_of_hits) # press ctrl+c in middle of operation to 'exit' and...

Do I need to call pool.terminate manually upon excepton in multiprocessing?

python,multiprocessing,python-multiprocessing
It seems the following 2 snippets have the same behavior: def sqr(a): time.sleep(1.2) print 'local {}'.format(os.getpid()) if a == 20: raise Exception('fff') return a * a pool = Pool(processes=4) A: try: r = [pool.apply_async(sqr, (x,)) for x in range(100)] pool.close() for item in r: item.get(timeout=999999) except: pool.terminate() raise finally: pool.join()...

Breaking down 1M row jsonlines file into individual json files - python

python,memory,queue,multiprocessing
I'm trying to process hundreds of different files that are of .jl.gz format stored in s3. I need to take different parts of each of the 1M json objects in these files and move them over to a sql database, a mongodb, and elasticsearch. This is taking too long. So,...

C - kill system call doesn't wake up a process

c,multithreading,gcc,multiprocessing,signals
I'm writing a program for synchronization between two child processes. The request is: Implement a concurrent program in C language that creates two children: a sender and a receiver. The sender loops reading a string from the standard input and preparing a “message” of a single line in a file,...

Combination of chdir=1 and num_jobs>1 in SCons

python,build,multiprocessing,scons
I have a rather lengthy test task which is automated by SCons and could be parallelized. However, it currently relies on using chdir=1, which at the moment is not trivial to remove. Now, as soon as I use -j2 respectively SetOption('num_jobs', 2) the job fails and the following minimal (non-)...

Shared State in Python multiprocessing Processes

python,multiprocessing
Please consider this code: import time from multiprocessing import Process class Host(object): def __init__(self): self.id = None def callback(self): print "self.id = %s" % self.id def bind(self, event_source): event_source.callback = self.callback class Event(object): def __init__(self): self.callback = None def trigger(self): self.callback() h = Host() h.id = "A" e = Event()...

Scope of Global variables in Python in Windows

python,logging,multiprocessing
I have a CLI script which I am using to push files into an S3 bucket. For larger files I am splitting the files into parts and uploading them in parallel. (Pasting code structure here. I tried to make a minimalist example, but even that is 60 lines long) def...

How could I write data hourly to files named by current hour with multiple processes?

c++,linux,logging,multiprocessing
I have to write some data to a file based on the current hour on my server. For example, write data to a file named like 2015061117.txt. And there are multiple processes writing data to the file simultaneously. How should I design my server to implement this? Do I need to use...

Python subclassing process with parameter

python,object,arguments,multiprocessing
I'm trying to create an object but as a new process. I'm following this guide and came up with this code. import multiprocessing as mp import time class My_class(mp.Process): def run(self): print self.name, "created" time.sleep(10) print self.name, "exiting" self.x() def x(self): print self.name, "X" if __name__ == '__main__': print 'main...

Python 2.7 multiprocessing get process result whithout using pool

python,process,multiprocessing
How can I get the result from my process without using a pool? (I want to keep an eye on the progress: (print "\r",float(done)/total,"%",) which can't be done using a pool as far as I know) def multiprocess(function, argslist, ncpu): total = len(argslist) done = 0 jobs = [] while...
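
Without a Pool, the usual pattern is to hand each Process a Queue and collect results in the parent, which also makes the progress printout straightforward; a minimal sketch:

    import sys
    from multiprocessing import Process, Queue

    def worker(x, out):
        out.put((x, x * x))   # send the result back to the parent

    if __name__ == '__main__':
        argslist = range(8)
        total = len(argslist)
        out = Queue()
        jobs = [Process(target=worker, args=(x, out)) for x in argslist]
        for p in jobs:
            p.start()
        results = []
        for done in range(1, total + 1):
            results.append(out.get())   # blocks until any worker reports
            sys.stdout.write('\r%d %%' % (100 * done // total))
            sys.stdout.flush()
        for p in jobs:
            p.join()
        print('')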

Parallel processing a large number of tasks

python,multiprocessing
I have 10,000 csv files which I have to open in Pandas and manipulate/transform using some of Pandas's functions, saving the new output to csv. Could I use a parallel process (for Windows) to make the work faster? I tried the following but no luck: import pandas as pd...

Can multiple programs write to STDOUT at the same time?

python,multithreading,parallel-processing,multiprocessing,stdout
I'm currently using GNU Parallel to run a Python script on multiple large files simultaneously. I have a master Python script that sets up the files I need to process, and then dispatches Parallel to run the same worker script on these files. I need to get the data back...

Does Java automatically optimize for loops for multi-core processors

java,for-loop,optimization,multiprocessing
A fellow developer told me today that Java (or the JIT) can automatically optimize the execution of a for loop so that it uses all of the available CPUs on the computer, so long as the code in each iteration of the for loop can execute without relying on variables...

append to instance list with multiprocess

python,multiprocessing,python-multiprocessing
I have a class that has a list for one of its attributes. After instantiating the class, I would like to do some calculations, then append the results (which are lists themselves) to the list. I'm trying to use multiprocessing to speed up the calculations/appending, since the order doesn't...

python multiprocessing pool: how can I know when all the workers in the pool have finished?

python,multiprocessing,pool
I am running a multiprocessing pool in python, where I have ~2000 tasks being mapped to 24 workers with the pool. Each task creates a file based on some data analysis and web services. I want to run a new task when all the tasks in the pool have finished. How...
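
The standard signal that every task has completed is the close()/join() pair (a blocking map() call returning means the same thing); a minimal sketch:

    from multiprocessing import Pool

    def task(n):
        return n * n   # stand-in for the analysis/webservice work

    if __name__ == '__main__':
        pool = Pool(24)
        pool.map(task, range(2000))   # map() itself blocks until all tasks finish
        pool.close()                  # no more tasks will be submitted
        pool.join()                   # returns only once every worker is done
        print('all workers finished; start the follow-up task here')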

How to use php pcntl_fork to run function in background

php,mysql,multiprocessing,pcntl
I have two functions. One function I want to run in the background with the MySQL connection, without returning any errors or anything to the browser; the other function I want to run returns data to the browser. I've used PHP's pcntl_fork as follows: $pid = pcntl_fork();...

How can I measure the memory occupancy of Python MPI or multiprocessing program?

python,memory,multiprocessing,mpi,mpi4py
I am doing this on a Cray XE6 machine where I can't log in on compute nodes and there is no possibility for an interactive session, therefore I would need to somehow use the top command: run top in the background and have it take a snapshot at regular intervals and send it to...

How to control what gets imported when you unpickle python object?

python,multiprocessing,pickle
I have the following setup: a.py: class A(object): def __init__(self, name): self.name = name def a(self): print('yow {}!'.format(self.name)) b.py: class B(object): def __init__(self, obj): self.obj = obj sender.py: from a import A from b import B message = pickle.dumps(B(A('Martin'))) receiver.py: my_b = pickle.loads(message) my_a = my_b.obj my_a.a() Output: yow Martin!...
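
The hook for this is Unpickler.find_class(), which is consulted for every global the stream asks to load; overriding it lets you whitelist what may be imported. A minimal Python 3 sketch following the pattern in the pickle docs, with the whitelist entries as placeholders:

    import io
    import pickle

    class RestrictedUnpickler(pickle.Unpickler):
        ALLOWED = {('a', 'A'), ('b', 'B')}   # whitelist of (module, name) pairs

        def find_class(self, module, name):
            if (module, name) in self.ALLOWED:
                return super().find_class(module, name)
            raise pickle.UnpicklingError(
                'forbidden global: %s.%s' % (module, name))

    def restricted_loads(data):
        return RestrictedUnpickler(io.BytesIO(data)).load()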

threading.Thread against multiprocessing.Process

python,multithreading,multiprocessing
So recently I had the following problem: I have to make a server that handles requests in order to update some values while the main process is using those values. So here, the server handling function is in the subprocess, and I can't stop it when I want. In...

How do I detect if a system supports forking of processes in python?

python,multiprocessing,fork
How can I figure out if an operating system supports forking like os.fork() without invoking the command itself? I.e. does import os hasattr(os, 'fork') return False under Windows?...
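
Yes: the attribute simply is not present on platforms without fork, so the hasattr test is the idiomatic check and returns False on Windows. A minimal sketch:

    import os

    if hasattr(os, 'fork'):
        print('fork() is available')   # POSIX systems
    else:
        print('no fork(); multiprocessing will spawn instead')   # e.g. Windows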

Python3.x how to share a database connection between processes?

python,mysql,python-3.x,multiprocessing
I'm running a number of processes using multiprocessing.Pool. Each process has to query my mysql database. I currently connect to the database once and then share the connection between the processes. It works, but occasionally I get strange errors. I've confirmed that the errors are caused when querying the database....

Python: Yield in multiprocessing Pool

python,multiprocessing,yield
I have to parallelize a function which involves a certain "yield". This is only a simple replica of the whole program that I have to work on, but it sums up the problems I'm facing. Here I'm trying to understand multiprocessing, apply_async and yield for my project. In this example I've used a...
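
A live generator cannot be shipped back to the parent (generators do not pickle), so one workaround is to drain it inside the worker and return a plain list; a minimal sketch:

    from multiprocessing import Pool

    def gen(n):
        for i in range(n):
            yield i * i

    def run_gen(n):
        # Materialize the generator inside the worker; lists pickle fine.
        return list(gen(n))

    if __name__ == '__main__':
        pool = Pool(4)
        for chunk in pool.map(run_gen, [3, 4, 5]):
            for value in chunk:
                print(value)
        pool.close()
        pool.join()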

How to apply a function to a 2D numpy array with multiprocessing

python,arrays,numpy,multiprocessing
Suppose I have the following function: def f(x,y): return x*y How do I apply the function to each element in an NxM 2D numpy array using the multiprocessing module? Using serial iteration, the code might look as follows: import numpy as np N = 10 M = 12 results =...
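
One hedged approach: flatten the index space, map a one-argument wrapper over it, and reshape the collected results; a minimal sketch:

    import itertools
    from multiprocessing import Pool

    import numpy as np

    def f(x, y):
        return x * y

    def f_star(args):
        return f(*args)   # unpack the (x, y) pair map() delivers

    if __name__ == '__main__':
        N, M = 10, 12
        pool = Pool()
        flat = pool.map(f_star, itertools.product(range(N), range(M)))
        pool.close()
        pool.join()
        results = np.array(flat).reshape(N, M)
        print(results)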

Why aren't all my processes starting at once?

python,multiprocessing,python-multiprocessing
I have a process that adds up a bunch of numbers: def slow(x): num = 0 for i in xrange(int(1E9)): num += 1 And I start 500 of these. for x in range(500): out.write("Starting slow process - " + str(datetime.now()) + "\n") p = multiprocessing.Process(target = slow, args = (x,...

Progress in a itertools.combinations pool() map , result passed in list

python,multiprocessing
I run a multiprocessing script like this: def intersterer(i): somestuff return x if __name__ == '__main__': pool = Pool() list = pool.map(intersterer, itertools.combinations(xrange(great_number), 2)) I want to know how I can see the progress of the work. I looked around for a shared counter, but it's ugly and doesn't seem to tell the...
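
imap_unordered() yields results as they complete, so a plain enumerate in the parent gives a progress readout with no shared counter at all; a minimal sketch:

    import itertools
    import sys
    from multiprocessing import Pool

    def intersterer(pair):
        return pair   # stand-in for the real work

    if __name__ == '__main__':
        combos = list(itertools.combinations(range(200), 2))
        pool = Pool()
        results = []
        # Results stream back one by one, so the parent can report progress.
        for done, x in enumerate(pool.imap_unordered(intersterer, combos), 1):
            results.append(x)
            sys.stdout.write('\r%d / %d' % (done, len(combos)))
            sys.stdout.flush()
        pool.close()
        pool.join()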

High Kernel CPU when running multiple python progams

python,linux,performance,multiprocessing
I developed a python program that does heavy numerical calculations. I run it on a linux machine with 32 Xeon CPUs, 64GB RAM, and Ubuntu 14.04 64-bit. I launch multiple python instances with different model parameters in parallel to use multiple processes without having to worry about the global interpreter...

Killing a process launched from a process that has ended - Python

python,multiprocessing
I am trying to kill a process in Python that is launched from another process, and I am unable to find the correct place to put my ".terminate()". To explain myself better I will post some example code: from multiprocessing import Process import time def function(): print "Here is...

python for-loop parallelization using multiprocessing.pool

python-2.7,multiprocessing
I have a piece of code that looks like this: def calc_stuff(x,a,b,c): ... return y x = range(N) y = zeros(x.shape) if __name__ == '__main__': p = Pool(nprocs) y = p.map(calc_stuff,x,a,b,c) This does not work, and as I searched online, it is because the map function deals with iterables rather...
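
When only x varies and a, b, c are fixed, functools.partial produces the single-argument callable Pool.map expects; a minimal Python 3 sketch (partial objects pickle there):

    from functools import partial
    from multiprocessing import Pool

    def calc_stuff(x, a, b, c):
        return x * a + b - c

    if __name__ == '__main__':
        xs = range(100)
        pool = Pool(4)
        # Bind the constant arguments; map() supplies x for each task.
        y = pool.map(partial(calc_stuff, a=2, b=3, c=4), xs)
        pool.close()
        pool.join()
        print(y[:5])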

Supress output in multiprocessing process

python,multiprocessing
I have some code which runs in parallel using the multiprocessing Pool class. Unfortunately, some of the functions I use from another library have verbose output. To abstract the problem, have a look at the following example: from multiprocessing import Pool def f(x): print 'hello' return x*x p =...
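
One hedged fix is to silence stdout once per worker via the Pool's initializer, so the library's prints go to the null device while the parent's output is untouched; a minimal sketch:

    import os
    import sys
    from multiprocessing import Pool

    def mute():
        # Runs once in each worker: route its stdout to the null device.
        sys.stdout = open(os.devnull, 'w')

    def f(x):
        print('hello')   # the verbose output to be suppressed
        return x * x

    if __name__ == '__main__':
        p = Pool(4, initializer=mute)
        print(p.map(f, range(8)))   # only the parent still prints
        p.close()
        p.join()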

How can I prevent the inheritance of python loggers and handlers during multiprocessing based on fork?

python,logging,multiprocessing,fork
Suppose I configured logging handlers in the main process. The main process spawns some children and due to os.fork() (in Linux) all loggers and handlers are inherited from the main process. In the example below 'Hello World' would be printed 100 times to the console: import multiprocessing as mp import...

Wrong exit code received from wexitstatus

php,multiprocessing,signals,pcntl,wexitstatus
I'm using PCNTL to multiprocess a big script in PHP on an Ubuntu server. Here is the code (simplified and commented) function signalHandler($signo = null) { $pid = posix_getpid(); switch ($signo) { case SIGTERM: case SIGINT: case SIGKILL: // a process is asked to stop (from user or...

Multiprocessing a python script

python,parallel-processing,multiprocessing,python-multiprocessing
I learnt about the multiprocessing tool in python: https://docs.python.org/2/library/multiprocessing.html. Say I have a python program which is complicated and fleshed out, but it does not use up all my cores when running. So it uses 100% of one core and takes forever to complete. It is hard for me to...

Is an object in a dynamic link library (.dll) shared across processes?

c++,dll,multiprocessing,dynamic-linking
Suppose I have two classes and a .c file in my .dll file, such as class MyClass { private : int id; Context* appContext; static Context* statContext; public: a(){ appContext = NULL; id = -1; } void setId(int a){ id = a; } void setContext(){ statContext = appContext = new...

Prevent code from running at import time

python,multiprocessing,cherrypy,python-multiprocessing
How do I get some code to run when a module is imported when the CherryPy web app starts and not when a new Process is created? My CherryPy app follows this pattern: Main.py from Api1.Api1 import Api1 from Api2.Api2 import Api2 config = {'global': {'server.socket_host': '0.0.0.0'}} class Root(): global...

Does python os.fork uses the same python interpreter?

python,multiprocessing
I understand that threads in Python use the same instance of the Python interpreter. My question: is it the same with processes created by os.fork? Or does each process created by os.fork have its own interpreter?
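
os.fork() duplicates the running interpreter: the child is a copy-on-write clone with the same interpreter state as of the fork, not a freshly started interpreter. A minimal POSIX-only sketch:

    import os

    value = 'set before fork'

    pid = os.fork()
    if pid == 0:
        # Child: a duplicated interpreter, so pre-fork state is visible.
        print('child sees: ' + value + ' (pid %d)' % os.getpid())
        os._exit(0)
    else:
        os.waitpid(pid, 0)   # reap the child
        print('parent pid: %d' % os.getpid())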

Python end parent process from thread

python,multithreading,timer,multiprocessing
My aim is to create a new process which will take a number as input from user in a loop and display the square of it. If the user doesn't enter a number for 10 seconds, the process should end (not the main process). I'm using threading.Timer with multiprocessing. Tried...

Python Apply_async not waiting for other Processes to Finish

python,parallel-processing,multiprocessing
I have the following sample code that I am trying to use the multiprocessing module on. The following statement had been working previously under other applications, but one process (which receives a very small amount of data just due to the breakup) finishes first and causes the program to finish....