About
This page hosts some miscellaneous Python modules I've written, mostly for fun, but which may be useful.
git clone git://blitiri.com.ar/pymisc
You can also get them individually using the links below.
They're licensed under the BOLA (Public Domain), so you can do whatever you want with them.
If you have any questions, suggestions or comments, please let me know.
adatasks.py
Ada is a well-known (but not very popular) programming language. I don't like it very much, but it has a nice concurrency model that might be useful for some purposes. The idea is to define tasks that execute asynchronously, each with an associated channel for sending it messages; a message is a tuple (entry, parameters).
You can launch several identical tasks, which share the message channel. When a message arrives, only one task will pick it up, in a FIFO fashion.
The Python implementation is very simple: it consists of a decorator that converts a function into a task, adding attributes to read messages from the communication channel, to launch tasks, and to wait for a given task's completion.
from adatasks import *

# create a task by defining an annotated function
@task
def f():
    print f.accept("msg1"), f.select("msg2", "msg3")

# launch all tasks
launch()

# send two messages to f
f.msg1("Hello", "world!")
f.msg3("How's it going?")
# here f prints:
#   ('Hello', 'world!') ('msg3', ("How's it going?",))

# wait for all tasks to complete
wait_for_all()

adatasks.py
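The shared-channel behaviour (several identical tasks, each message picked up by exactly one of them, in FIFO order) can be approximated with only the standard library; this is an illustrative sketch, not adatasks itself:

```python
import queue
import threading

channel = queue.Queue()  # shared FIFO channel

def worker(n, results):
    # identical workers sharing one channel; each message is
    # picked up by exactly one of them
    while True:
        entry, params = channel.get()
        if entry == "stop":
            break
        results.append((n, entry, params))

results = []
workers = [threading.Thread(target=worker, args=(i, results))
           for i in range(3)]
for w in workers:
    w.start()

channel.put(("msg1", ("Hello", "world!")))
channel.put(("msg2", ("How's it going?",)))
for _ in workers:
    channel.put(("stop", ()))  # one sentinel per worker
for w in workers:
    w.join()

print(sorted(entry for _, entry, _ in results))  # ['msg1', 'msg2']
```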
contract.py
This module provides a set of three decorators to implement design by contract, using pre- and post-condition functions embedded inside the function body.
Putting them inside the function has some drawbacks: most notably, it's harder to reuse them, and it's not really that different from open-coding them directly in the function. However, it can be more comfortable if you prefer seeing everything inside the same function body.
from contract import *

@contract
def f(x):
    @precond(f)
    def pre(x):
        ...

    @postcond(f)
    def post(ret, x):
        ...

    ...

contract.py
lazy.py
This module implements a decorator for lazy evaluation.
When you invoke a lazy-decorated function, it immediately returns a special object which appears to hold the return value, but doesn't: the function doesn't actually run until you try to access the object.
from lazy import lazy

@lazy
def f(x):
    ...
    return x

r = f(x)   # r has a "lazy object"
...
print r    # f is run here

lazy.py
dataflow.py
This module implements objects with dataflow semantics, and a dataflow decorator to make functions return dataflow objects.
The term dataflow (which is often used for a number of different things) is used here to denote an object that can be used before its value has been computed, with the computation happening asynchronously. When you try to use the object and its computation has not yet concluded, you wait until it has.
The decorator is easier to understand, because it's similar to the lazy decorator described above, but instead of waiting until the object is needed to run the function, it runs it asynchronously right away.
import time
from dataflow import dataflow

@dataflow
def f(x):
    time.sleep(2)
    return x

r1 = f(1)  # these return immediately, but the functions
r2 = f(2)  # begin to run in parallel

print r1, r2  # and here we wait until f has completed

dataflow.py
pybackground is another project that implements this.
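For comparison, modern Python's concurrent.futures in the standard library offers a similar run-now, wait-on-use primitive, with the difference that you wait explicitly via .result() rather than through a transparent proxy object:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def f(x):
    time.sleep(0.5)
    return x

ex = ThreadPoolExecutor()
r1 = ex.submit(f, 1)  # these return immediately, running f
r2 = ex.submit(f, 2)  # in background threads
print(r1.result(), r2.result())  # blocks here until both are done
ex.shutdown()
```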
pcmp.py
This module implements a message-passing channel between two processes, a parent and its child.
It has only one class, which implements the channel and allows you to send and receive Python objects (as long as they're picklable).
You should instantiate it before you fork, and then call .child() or .parent() according to your status. Then you can begin sending and receiving objects with .send() and .recv().
import os
import pcmp

c = pcmp.Channel()
pid = os.fork()

if pid == 0:
    c.child()
    c.send('Hi dad!')
    print c.recv()
else:
    c.parent()
    c.send("Hi son!")
    print c.recv()

pcmp.py
pfunctional.py
This module implements parallel versions of the map() and filter() functions. They work by partitioning the sequence passed to them, and running the function over the items of each part in parallel. The number of threads to use is determined by the 'workers' parameter, which defaults to 2.
Be aware that due to Python's internal threading implementation, purely CPU-bound parallel runs are probably not going to be faster than sequential ones; search for "GIL" (Global Interpreter Lock) for more information. On the other hand, it can be quite useful when the function you map or filter with interacts with the network or the I/O subsystem.
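The partitioning idea can be sketched with only the standard library: split the sequence among `workers` threads, each mapping its share. This is an illustration of the technique, and the names (pmap, workers) are assumptions; the real pfunctional.py API may differ:

```python
import threading

def pmap(func, seq, workers=2):
    # Illustrative parallel map: worker w handles items w, w+workers,
    # w+2*workers, ... so the whole sequence is partitioned among threads.
    results = [None] * len(seq)

    def run(start):
        for i in range(start, len(seq), workers):
            results[i] = func(seq[i])

    threads = [threading.Thread(target=run, args=(w,))
               for w in range(workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results

print(pmap(lambda x: x * x, [1, 2, 3, 4], workers=2))  # [1, 4, 9, 16]
```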
pickle_rpc.py
This module implements a simple TCP-based RPC server and client (similar to xmlrpclib), but using pickle for serialization. This results in much better performance than XML-RPC or SOAP, at the expense of not being able to work between different languages.
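The general idea (pickle a (method, args) request over a TCP connection, dispatch it on the server, pickle the result back) can be sketched with the standard library alone. This is a hypothetical illustration; the class and function names here are made up and the real pickle_rpc.py interface may differ:

```python
import pickle
import socket
import socketserver
import threading

class Handler(socketserver.StreamRequestHandler):
    def handle(self):
        # each connection carries one pickled (method, args) request
        method, args = pickle.load(self.rfile)
        result = getattr(self.server.obj, method)(*args)
        pickle.dump(result, self.wfile)

class Calc:
    def add(self, a, b):
        return a + b

# serve Calc's methods on an ephemeral local port, in the background
server = socketserver.TCPServer(("127.0.0.1", 0), Handler)
server.obj = Calc()
threading.Thread(target=server.serve_forever, daemon=True).start()

def call(addr, method, *args):
    # client side: send the pickled request, read the pickled reply
    with socket.create_connection(addr) as s:
        f = s.makefile("rwb")
        pickle.dump((method, args), f)
        f.flush()
        return pickle.load(f)

result = call(server.server_address, "add", 2, 3)
print(result)  # 5
server.shutdown()
server.server_close()
```

Note that, as with pickle_rpc itself, unpickling data from an untrusted peer is unsafe, so this scheme is only suitable between trusted endpoints.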
Alberto Bertogli (firstname.lastname@example.org) - Last updated 10/Nov/2012