The Occasional Occurrence
Python's Enhanced Generators
June 13, 2008 at 01:38 PM | categories: Python, computing, General

So while I was mowing my grass last night*, I got to thinking about Python 2.5's enhanced generators and how I hadn't tried them out yet. Here is a simple example that uses the consumer/pipeline model described in PEP 342.
__author__ = "Christian Wyglendowski"
__license__ = "MIT"

def consumer(func):
    """Automatically advance generator to 1st yield point - (from PEP 342)"""
    def wrapper(*args, **kw):
        gen = func(*args, **kw)
        gen.next()
        return gen
    wrapper.__name__ = func.__name__
    wrapper.__dict__ = func.__dict__
    wrapper.__doc__ = func.__doc__
    return wrapper

@consumer
def reverser(next=None):
    """A string-reversing consumer"""
    t = yield
    if next:
        yield next.send("".join(reversed(t)))
    else:
        yield "".join(reversed(t))

def transformer(methodname, *args, **params):
    """Creates a consumer for the given string methodname.
    Passes *args and **params to the string method."""
    @consumer
    def _transform(next=None):
        text = yield
        m = getattr(text, methodname)
        transformed = m(*args, **params)
        if next:
            yield next.send(transformed)
        else:
            yield transformed
    _transform.__name__ = methodname
    return _transform

def transform(text, transformers):
    """Passes text through a list of transformers and returns the result."""
    next = None
    for t in reversed(transformers):
        next = t(next)
    return next.send(text)
(Is this too much code for a blog post?)
So what can you do with all of that? Easily perform various combinations of string transformations!
In [1]: title = transformer('title')

In [2]: transform('python', [title, reverser])
Out[2]: 'nohtyP'

In [3]: transform('python', [reverser, title])
Out[3]: 'Nohtyp'
It's a fairly contrived example, but the concept of this pipeline of transformations (described in much more detail in PEP 342) is really pretty cool. An application applying a more advanced version of something like this could process its output using a variety of plugins that a user enabled.
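On Python 3, the only change the `consumer` decorator needs is `next(gen)` in place of `gen.next()`. Picking up on that plugin idea, here is a minimal Python 3 sketch of my own (the `plugin` stage and its names are illustrative, not from the code above) where each stage loops, so a pipeline can be reused instead of being one-shot:

```python
import functools

def consumer(func):
    """Advance a generator to its first yield so it is ready to .send()."""
    @functools.wraps(func)
    def wrapper(*args, **kw):
        gen = func(*args, **kw)
        next(gen)  # Python 3 spelling of gen.next()
        return gen
    return wrapper

@consumer
def plugin(transform, target=None):
    """A reusable (looping) pipeline stage that applies `transform`."""
    result = None
    while True:
        text = yield result
        out = transform(text)
        # Pass the result downstream if there is a next stage.
        result = target.send(out) if target else out

# Build a pipeline: title-case the text, then reverse it.
pipeline = plugin(str.title, plugin(lambda s: s[::-1]))
print(pipeline.send('python'))  # 'nohtyP'
print(pipeline.send('python'))  # reusable: 'nohtyP' again
```

Because each stage loops back to its `yield`, the same pipeline object keeps accepting `.send()` calls.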
Here is one final example:
In [4]: decode = transformer('decode', 'utf-8')

In [5]: replace = transformer('replace', unichr(347), 's')

In [6]: encode = transformer('encode', 'utf-8')

In [7]: transform('Chri\xc5\x9btian', [decode, replace, encode])
Out[7]: 'Christian'
Has anyone else seen any neat examples of PEP 342 in the wild?
cw
* I usually gravitate to thinking about some obscure programming paradigm or feature while I am mowing grass. I think it has to do with how exceptionally boring mowing is.
EDIT: I guess I could have made the generators, um, more generative, so they aren't simply one-shot function-like simpletons. Oh well.
Converting docx Files
April 16, 2008 at 01:34 PM | categories: Python, Software, work, computing, General

I'm working on an OOXML implementation in Python and found this handy utility for converting docx files to rtf.
It seems to open docx files that Word complains about, but at least it lets me know that I am on the right track. Also, it runs under Wine on Linux, so there is no need for a virtual or non-virtual machine running Windows.
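Since a docx file is really just a ZIP archive of XML parts, the standard library alone is enough to peek inside one while debugging. A small sketch; the part contents here are a minimal stand-in, not a real Word document:

```python
import io
import zipfile

# Build a minimal stand-in for a .docx (really just a zip of XML parts).
buf = io.BytesIO()
with zipfile.ZipFile(buf, 'w') as z:
    z.writestr('word/document.xml', '<w:document/>')

# Listing and reading the parts is a quick sanity check when
# hand-rolling OOXML; word/document.xml holds the main body.
with zipfile.ZipFile(buf) as z:
    print(z.namelist())  # ['word/document.xml']
    xml = z.read('word/document.xml').decode('utf-8')
```

The same two `zipfile` calls work unchanged on an actual .docx file on disk.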
cw
Command History
April 12, 2008 at 10:52 AM | categories: Python, work, computing, General

Since all the cool kids are doing it ...
Work laptop
-----------
christian@yga-dowski:~$ history|awk '{a[$2]++ } END{for(i in a){print a[i] " " i}}' |sort -rn|head
82 sudo
68 vim
51 ls
49 cd
48 exit
20 hg
16 rm
16 ipython
14 py.test
10 ping
Apparently I do a lot of exiting. I just started using Mercurial for local revision control, hence the presence of hg.
Dev server
----------
(where I actually do most of my work)
cmw@watson:~/svn/g2wc$ history|awk '{a[$2]++ } END{for(i in a){print a[i] " " i}}' |sort -rn|head
158 vim
81 make
70 svn
55 rm
34 ls
33 cd
15 sudo
11 exit
7 kinit
7 htop
The make commands encapsulate many calls to python and py.test.
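The same tally is easy in Python itself with `collections.Counter`. A sketch over made-up history lines (real input would be the output of `history`, whose second field is the command name, just like `$2` in the awk version):

```python
from collections import Counter

# Stand-in for `history` output; each line is "  <n>  <command> [args]".
history_lines = [
    "  480  vim app.py",
    "  481  hg status",
    "  482  vim app.py",
    "  483  ls",
]

# Field 2 of each line is the command name.
counts = Counter(line.split()[1] for line in history_lines)

# most_common() sorts by count, highest first.
for cmd, n in counts.most_common(10):
    print(n, cmd)
```

Piping real data in is just `history | python count.py` with the list swapped for `sys.stdin`.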
Reading Chunked HTTP/1.1 Responses
April 02, 2008 at 12:35 AM | categories: Python, work, computing, cherrypy, General

For work today I wanted a way to iterate over an HTTP response with chunked transfer-coding on a chunk-for-chunk basis. I didn't see a builtin way to do that with httplib. It supports chunked reads but you have to specify the amount that you want to read if you don't want it to buffer. I just wanted it to read and yield each chunk that it received from the server.
For my first crack at it I really just tried to use the httplib basics:
import httplib

conn = httplib.HTTPConnection('localhost:8080')
conn.request('GET', '/')
r = conn.getresponse()
data = r.read(10)
while data:
    print data
    data = r.read(10)
That worked, but since I won't know the chunk size in real life, I would probably get output similar to this:
Chunk 0
Ch
unk 1
Chun
k 2
Chunk 3
Chunk 4
...
I really wanted that chunk-for-chunk iteration. After taking a look at the very readable httplib source this evening, it wasn't very hard to accomplish. I basically just took the httplib.HTTPResponse._read_chunked method and modified it to be a generator. I subclassed HTTPResponse and stuck my generator in an __iter__ method. Behold: now you can do this sort of thing:
if __name__ == "__main__":
    import httplib
    import iresponse

    conn = httplib.HTTPConnection('localhost:8080')
    conn.response_class = iresponse.IterableResponse
    conn.request('GET', '/')
    r = conn.getresponse()
    for chunk in r:
        print chunk
With nice results like this:
Chunk 0
Chunk 1
Chunk 2
Chunk 3
Chunk 4
...
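For the curious, the core of the idea fits in a few lines. The original iresponse module targeted Python 2's httplib; what follows is my own Python 3 rendering against http.client, reading the raw chunk framing off the response's file object, so treat it as a sketch of the approach rather than the module itself:

```python
import http.client

class IterableResponse(http.client.HTTPResponse):
    """An HTTPResponse that yields each chunk as the server sent it."""

    def __iter__(self):
        if not self.chunked:
            # Not a chunked response: fall back to one big read.
            yield self.read()
            return
        while True:
            # Each chunk is framed as "<size-in-hex>[;ext]\r\n<data>\r\n".
            line = self.fp.readline().strip()
            size = int(line.split(b';', 1)[0], 16)
            if size == 0:
                break  # a zero-size chunk marks the end of the body
            yield self.fp.read(size)
            self.fp.read(2)  # discard the CRLF after the chunk data
```

It plugs in via conn.response_class exactly as in the snippet above, with http.client.HTTPConnection in place of httplib.HTTPConnection.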
You can download the iresponse module from my projects site. There is also a small CherryPy application that serves some data with chunked transfer-coding in case any of you want to fiddle with it.
cw
My PyCon 2008 Post
March 27, 2008 at 10:15 PM | categories: Python, computing, General

WARNING: this is YAPAP (Yet Another Post About PyCon), and a late one at that. Click to read more, else move along, fair netizen.
I enjoyed every part of PyCon 2008. I guess I'm easy to please or something. Maybe I just enjoyed getting out of my home/office for a few days and talking to my coworkers and other online friends face-to-face for a change. Yeah, that was probably it.
Three of us from ClePy (Kevin, Gary and me) and the lone member of "PittPy" (Chad) piled into Chad's dad's CRV and drove for 6+ hours to Chicago on Thursday. Before I left, my wife asked me what in the world we'd all be talking about for 6+ hours to and from Chicago. A valid question since we're a bunch of computer nerds.
Well, funny thing; put an at-home Disney Imagineer, a philosopher-hippie, an opinionated vegetarian and a guy whose typical daytime social outlet is IRC into the same vehicle for a road-trip to PyCon and you have a recipe for some interesting conversation. Of course we talked plenty about Python but we also covered pumping CO2 into underground oil pockets, burping lakes, sagging high-tension lines, farming, quantum mechanics and religion amongst many other topics.
I don't have too much to say about the talks at the conference. I went to a few really good ones and a bunch of O.K. ones. Highlights included the one on "rolling your own persistence system with Python" and the one on writing Trac plugins. I think what made those two talks good was a) a speaker who was excited about the topic and b) the right balance of code and concepts. I walked away from both of those talks feeling like I could hit the ground running if I wanted to dig further into either topic.
As I mentioned earlier, the best thing about the conference was the people. Where else can you go where there are over one thousand other people into Python and doing all sorts of interesting things with it? I got to hang out with the cool group of guys (Jamie, Dave and Michael) that I work with plus the other folks who I have met through Python. Then there was the "web dudes" party, meeting some international Pythonistas, exploring downtown Chicago, hanging out with Bob at the CherryPy BOF and copious amounts of pizza at Gino's East.
I think that one thing I'll do differently next year is get involved with more of the open space talks. I think that smaller, more focused groups with more interaction would make for a better experience. And maybe I'll come up with something to give a lightning talk about.
So yeah, great conference. Thanks to everyone who worked hard to make it happen.
cw
PS
Check out my PyCon photos.