Jan Schaumann gave a talk at Velocity 2012 titled "Down With The Fancy Pants!: how people have been optimizing the wrong things and increased complexity." Not long after, and seemingly unrelated, Jeff Atwood wrote an article titled The PHP Singularity about the long history of pointing out PHP's foibles, in which he suggests that what is needed to escape the gravity of the enormous PHP ecosystem is a revolution in tooling: one that addresses the shortcomings of non-PHP systems in the areas where PHP is strong.

In his slides, Jan suggests that connections between software components increase the complexity of software systems exponentially; that the rules of thumb established by Brooks in The Mythical Man Month regarding communication among teams apply equally to "communication" between software components. The communication that drives this network complexity isn't limited to actual message passing, such as processes or nodes talking over TCP or via queues; it also includes the implicit dependencies software has on its libraries and on the architecture that runs it.
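
For a sense of scale, Brooks's back-of-the-envelope formula for communication channels among n people carries over directly to n components. The sketch below is mine, not Jan's, and strictly speaking the growth is quadratic rather than exponential, but the lesson is the same:

```typescript
// Brooks's intercommunication formula: n people (or, by Jan's analogy,
// n components) have n * (n - 1) / 2 potential pairwise channels.
function channels(n: number): number {
  return (n * (n - 1)) / 2;
}

// Every component added pays a growing marginal cost in connections.
for (const n of [2, 4, 8, 16, 32]) {
  console.log(`${n} components -> ${channels(n)} potential connections`);
}
// 2 -> 1, 4 -> 6, 8 -> 28, 16 -> 120, 32 -> 496
```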

I've only anecdotal evidence to lean on, but in my experience this is true, and the elegance of the theory lends some not inconsiderable weight to its likelihood. There's a pervasive idea that software development is really complexity management, and there's a cross-discipline consensus that less is more when managing complexity. Jan goes on to identify different types of complexity, much in the spirit of Chuck Moore's principle of the "Conservation of Complexity," explained by Elizabeth Rather in this marvelous Usenet post (emphasis mine):

Any given problem has a certain intrinsic level of complexity. In the solution of the problem, this complexity will be conserved: if you dive in with too little advance thought, your solution may become very complex by the time it's done. On the other hand, if you invest more in thought, design, and preparation, you may be able to achieve a very simple solution (the complexity hasn't gone away, it's become embodied in the sophistication of the design).

As the smaller, more modestly funded technical teams of post-bubble startups found, they could go a long way with conceptually simple tools which were nonetheless sophisticated enough to strip away vast incidental complexities previously inherent in building software: interpreted languages reduce (or remove) the incidental complexities of build systems and speed up development cycles, while on the client side, JavaScript libraries paper over the heterogeneous browser market to provide a simpler, more expressive platform.
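
To make that papering-over concrete, here is roughly the kind of branch a client-side library of the era absorbed on every caller's behalf; a sketch in the spirit of jQuery's event binding, not any particular library's source:

```typescript
// A hypothetical cross-browser event helper of the sort libraries
// internalized circa 2012, so that application code never branched on it.
function on(el: Element, type: string, handler: (e: Event) => void): void {
  if (el.addEventListener) {
    // Standards-compliant browsers.
    el.addEventListener(type, handler, false);
  } else if ((el as any).attachEvent) {
    // Legacy IE (8 and earlier); attachEvent is absent from modern DOM
    // typings, hence the cast.
    (el as any).attachEvent("on" + type, handler);
  }
}
```

One function, one import, and a thousand call sites freed from caring which browser they run in.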

This kind of tooling has played a large part in the attempt to reduce software complexity. But, unlike the sculptor who cannot carve an improved chisel or the painter who cannot paint himself a superior brush, software developers live in a world in which the techniques required to improve upon their tools are the same as those employed to develop their products. And I fear we've gone too far.

Simple tools are only worthwhile when the incidental complexity they address outweighs the additional network complexity of their introduction. In project management terms, borrowing the words of Alex Martelli:

Since a little supervision and control make things so much better when compared to total disorder and anarchy, it's a natural fallacy to think that MUCH MORE control and supervision will make everything just perfect. Wrong. When some sort of threshold is exceeded, the attempts towards tighter control take on dynamics of their own and start absorbing unbounded amounts of energy and effort in a self-perpetuating system.

Substituting the concept of "abstraction" for "control", I see this situation whenever the epithet "beautiful" is used where "simple" is meant: small incremental improvements in soft areas like readability, made at the cost of increased network complexity for the whole system. Tools like HAML and CoffeeScript offer syntactic improvements that do very little to reduce incidental complexity, but which nonetheless contribute to the network effect of increased architectural complexity.
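
To make the trade explicit, take one line of CoffeeScript; the source and output below are my paraphrase of typical compiler behavior, not quoted from any compiler. The surface is prettier, the semantics are identical, and the toolchain has gained a compiler, a watch process, and a source-mapping step:

```typescript
// CoffeeScript source (one line):
//
//   square = (x) -> x * x
//
// compiles to roughly the following (type annotations added here only
// to keep the sketch valid TypeScript):
var square = function (x: number): number {
  return x * x;
};
```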

In the spirit of the late Joe Miklojcik III, who coined the phrase Gosling Tarpit, I'll call this The Tooling Tarpit: a situation in which an endless desire for beauty leads to the cyclical creation of tools and abstraction layers whose net effect is to worsen the very incidental complexity they were designed to address. With modern web applications having such a high baseline network complexity, as effectively conveyed in the form of funny images in Jan's talk, is an extra tool or deployment philosophy really enough to counter the PHP singularity?

I suspect that the real advancement that will eventually topple PHP will require a similar level of increased sophistication as accompanied the move from rigid compiled languages to dynamic interpreted ones. In that leap, drastically simplified (or removed) were incidental complexities like memory management, string handling and text encoding, build processes, platform requirements, object serialization; the list goes on. The next great leap just might be a system which manages, reduces, or nullifies the network effects that have led to this tarpit where tools and systems are stacked upon each other with no end in sight. The next great leap just might be backward.

Jun 29 2012