OK, I need to read more of it then. As I said, I only got through maybe half of it before I had to go do stuff. Hopefully the rest will make a better impression.
Edit: The rest of the article made a better impression, but I still feel he's defending the concept of software patents too much. First of all, many processes and algorithms become obvious to a programmer after he spends a while tackling a problem. Chances are he'll happen upon a solution that has already been patented, simply because it is the most efficient way to go. Look at some of the UI shit getting patented. Once upon a time, XOR cursors were patented too. That's straight boolean algebra, very useful (and obvious) for primitive computer displays without separate frame buffers (see the sketch below). Second, the pace of advancement in technology means that even a few years of monopoly will choke independent innovation. By the time a patent expires, the game has moved well beyond it.
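
For anyone who hasn't written one, here's roughly what an XOR cursor amounts to. The framebuffer layout and cursor bitmap are made up for illustration; the point is just that XOR-ing the cursor in twice restores the original pixels, so you don't need a separate buffer to save what was underneath.

/* Draw the cursor by XOR-ing its mask into the framebuffer.
 * Calling this again with the same (x, y) erases it, restoring
 * whatever was there before -- that's the whole trick. */
#include <stdint.h>

#define FB_WIDTH  320
#define FB_HEIGHT 200

static uint8_t framebuffer[FB_HEIGHT][FB_WIDTH];  /* 1 byte per pixel */

/* 4x4 cursor mask: 0xFF pixels get inverted, 0x00 pixels are untouched */
static const uint8_t cursor[4][4] = {
    {0xFF, 0x00, 0x00, 0x00},
    {0xFF, 0xFF, 0x00, 0x00},
    {0xFF, 0xFF, 0xFF, 0x00},
    {0xFF, 0xFF, 0xFF, 0xFF},
};

void xor_cursor(int x, int y)
{
    for (int row = 0; row < 4; row++) {
        for (int col = 0; col < 4; col++) {
            int px = x + col, py = y + row;
            if (px >= 0 && px < FB_WIDTH && py >= 0 && py < FB_HEIGHT)
                framebuffer[py][px] ^= cursor[row][col];
        }
    }
}

That's it. One function, one XOR per pixel, and any programmer staring at a machine without the memory for a save-under buffer would land on it independently.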
I think there's a problem here at the conceptual level, not just the definitional one. At least he acknowledges the problems and proposes (limited) solutions. But yeah, I'm one of the angry mob he mentions. Bring out the torches and pitchforks, some tar and feathers too. Let's head to the patent office . . .