Travelling `light' for the future
I have recently been moved to think more about the efficiency of the technologies we use, triggered by a comparison made between my software Cfengine and its competitor Puppet. The comparison showed, on average, a 25 times greater resource efficiency in Cfengine's favour[1]. It brought back memories of the times Puppet's author would accuse me of "premature optimization" on the Cfengine forum. Resisting an obvious opportunity to gloat, I was reminded that something more fundamental is at work here. Our world has gotten fat on bad technology. And we teach these bad habits in college.
Anyone who has used Windows alongside Linux on the same PC knows about inefficiency from painful experience. Setting aside the annoying fact that Windows thwarts all attempts to do work (interrupting you every five minutes to ask: are you sure you don't want to reboot the machine again?), it also executes similar tasks (compiling code, or running software) at about half the rate of my usual Linux desktop. That is an unscientific number, measured using my intestinal tract as a ruler, but it seems absolutely real to me, and I believe it has an obvious explanation.
This is not just a Windows problem. I have heard the same kinds of complaints from users of software like OpenView and Tivoli, and a dozen other denizens of `classic software engineering'. Add to this the kind of quicksand you expect from a typical Java application, and the culprit is not too hard to find: it is today's design methodology. Software engineering teaches programmers to be resource hogs.
Small but perfectly formed
We've come a long way from Donald Knuth's elegant Art of Computer Programming, with its carefully constructed algorithms at the machine level. Today, software engineering `best practice' is to make each program an obese prevarication of intent, piling layer upon layer of interfaces with so-called "APIs", programs clothed as if for a polar expedition. The result is that it takes data about as long to get out of these programs as it takes light to get out of the sun's core. Is it any wonder that we need faster and faster computers to run more and more bloated software, with very little added functionality?
I still think wistfully of the 8-bit, 1MHz BBC Micro I had in 1983, and the fast and responsive word processor (Wordwise Plus) I used to write my first book[0] - what a lovely memory. It was faster and more responsive than Word or Open Office on a GHz processor today. Sure, I didn't have the same features or the same amount of memory, and disk performance is in a different universe compared to the 300-1200 baud cassette storage I had then, but for perhaps 80% of what I do, I would be just as well off with the BBC Micro as I would with my ThinkPad. This is why I now like the idea of compiled text processing (TeX). First you have a simple, fast editor, and then you have a simple, fast processor. Division of labour eliminates the confusion of intentions that leads to monolithic software like modern word processors, and it means that I am not wasting cycles on unnecessary reformatting in real time.
Too many layers make you hot
As you certainly know, CPU cycles generate heat with very high efficiency. Semiconductor technology works by basically dumping stored energy into the earth, heating up the CPU with every bit of information destroyed. This costs electricity to run, and the efficiency of energy use is very low. This means that every bit of work we do, every API we create, adds to the already prolific global warming effect of the IT industry (IT already surpassed the airline industry in 2006, and its total emissions are expected to dominate by 2020[2][3]).
There are further complications, such as power factors, but let's forget all that for now and do what Einstein called a gedanken experiment. Take an application you use every day, and suppose it runs 365 days per year, costing you an annual power bill of $1000. Let's say it is "badly written", meaning it's old-fashioned linear code without a layered structure. Now we rewrite it in "modern style", full of function calls and constructors and destructors and abstraction. Each call costs the overhead of creating and destroying a stack frame, copying data, and so on. The ratio of code executed per subroutine to the code needed for stack management in the abstraction layers now determines how much of your electricity goes into useful work. Suppose your redesign adds a 1% overhead in the code. That becomes 2% more electricity at 50% power-supply conversion efficiency, and 4% once datacentre cooling overhead is added: an annual increase of at least $40. Surely nothing to worry about.
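To make the arithmetic concrete, here is a back-of-the-envelope sketch in C, using only the figures assumed above (a $1000 annual bill, 1% coding overhead, 50% power-supply conversion, and cooling roughly doubling the draw); the numbers are illustrative, not measurements.

    #include <stdio.h>

    int main(void)
    {
        /* Illustrative figures from the thought experiment above */
        double annual_bill    = 1000.0;  /* dollars per year to run the application */
        double code_overhead  = 0.01;    /* 1% extra cycles from the layered redesign */
        double psu_factor     = 2.0;     /* 50% power-supply conversion efficiency */
        double cooling_factor = 2.0;     /* datacentre cooling roughly doubles the draw */

        double effective_overhead = code_overhead * psu_factor * cooling_factor;
        double extra_cost = annual_bill * effective_overhead;

        printf("Effective overhead: %.0f%%\n", effective_overhead * 100.0);
        printf("Extra annual cost : $%.0f\n", extra_cost);
        return 0;
    }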
Indeed, it turns out that most of the energy used by a computer comes from just switching it on, so if we want to be fair to software writers, we could choose to discount the basic inefficiency of having a computer in the first place. About 80% of the power, in fact, according to studies done by my friends at IBM a few years ago (the reference escapes me for the moment). So we are fighting over the remaining 20%, and thus we could divide the overhead of a program by a fairness factor of 5, for a round figure. A mere $8 a year would be petty cash.
Numbers like 1% would indeed be fine, but this is not what we are talking about. Considering the Windows example earlier, with its possible 50-100% slowdown, and the staggering 25x numbers from the Puppet measurement, it seems to my intestines that the actual overhead could be approaching much higher levels (200%-2500%!). Even if you divide by 5, running something like Puppet instead of Cfengine would lead to a 5x increase in power usage over the times it was running, assuming that the cost of contention is about the same. If it runs for 5 minutes in each half hour, that could still be almost a doubling of the power cost of running the software. It would be very interesting to do a direct study of what running heavyweight software actually costs the world.
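Here is the same kind of back-of-the-envelope sketch, using only the assumptions above (80% of the power for merely having the machine on, the software accounting for the remaining 20% while it runs, a 25x cycle overhead, and 5 minutes of running in each half hour):

    #include <stdio.h>

    int main(void)
    {
        /* Assumptions taken from the text -- purely illustrative */
        double baseline   = 0.8;        /* power fraction from just having the machine on */
        double sw_share   = 0.2;        /* fraction attributable to the software while running */
        double overhead   = 25.0;       /* the heavyweight tool burns ~25x the cycles */
        double duty_cycle = 5.0 / 30.0; /* runs for 5 minutes in each half hour */

        double lean  = baseline + sw_share * duty_cycle;            /* average power, lean tool */
        double heavy = baseline + sw_share * overhead * duty_cycle; /* average power, heavy tool */

        printf("Relative average power: lean %.2f, heavy %.2f (ratio %.1fx)\n",
               lean, heavy, heavy / lean);
        return 0;
    }

On those assumptions the ratio comes out at roughly 2x, which is where the `almost a doubling' above comes from.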
What is going on?
The arguments for this gross loss of efficiency are "better abstraction", "better re-use of code", "more intuitive models" (object orientation), etc. etc. Even after some forty years of re-invention, what Object Orientation actually is remains hotly debated. It "is a" very confusing admixture of what anyone wants it to be on a given day, with a few interesting themes. More worrying, in my view, is the philosophy of hierarchical taxonomic "modelling" that it brought into fashion, which leads to exponential information complexity and expensive layers of overhead, encouraging many small functions with a very low code-to-overhead ratio.
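To see what a low code-to-overhead ratio looks like, here is a contrived fragment of C (my own illustration, not taken from any particular framework): the one-line accessor does almost no useful work of its own, yet every call pays for a stack frame, argument passing and a return, unless the compiler happens to inline it away.

    #include <stdio.h>

    struct account { double balance; };

    /* One line of useful work, wrapped in call overhead */
    static double get_balance(const struct account *a)
    {
        return a->balance;
    }

    int main(void)
    {
        struct account acc = { 100.0 };
        double total = 0.0;
        int i;

        /* Layered style: a function call per access */
        for (i = 0; i < 1000000; i++)
            total += get_balance(&acc);

        /* `Old-fashioned' linear style: touch the data directly */
        for (i = 0; i < 1000000; i++)
            total += acc.balance;

        printf("%f\n", total);
        return 0;
    }

A good optimizing compiler may remove most of this particular overhead, but multiplied across thousands of such layers of virtual calls, constructors and destructors, the ratio of useful work to housekeeping still falls.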
Should we care about "premature optimization"? Should we care that it costs us more and more every year to do the same thing? I think so. From the point of view of economics, it is clearly nonsense that computers do less and less with their cycles as the technology gets better. It's a kind of "Red Queen" effect in reverse, as though all these APIs were antibodies trying to choke the software. Okay, we have become more sophisticated in the windowing systems and desktops we use for our comfort and productivity, but gone are the days when we excused the Rolls Royce or the Hummer their high fuel consumption just for their special quirks. We no longer think it acceptable to tear down forests or ravage natural beauty to acquire resources at any price. The benefits we have acquired from software are not in proportion to its inefficiencies, as a comparison between Windows and Linux, or Puppet and Cfengine, clearly demonstrates.
Swallow the spider to catch the fly, don't ask me why...
Most of the CPU cycles added by modern software engineering methods go into additional overhead. The paradoxical idea is that you can increase coding productivity by accepting this overhead, because it alleviates some of the burden on programmers. It is also claimed that you can more easily re-use the code, so you make an investment in the future too.
Software engineering methods were largely designed for formal design processes, but studies have shown that there is far less code re-use in projects than one might think (apart, presumably, from obvious library code, which has existed from the beginning of time). Thus one eventually moved to the less ambitious notion of design patterns. The Open Source phenomenon, on the other hand, has contributed more to "out of house" code re-use simply by publishing code, with no attempt to dress it up nicely for re-use, e.g. the GNU project, most of which is written in `old-fashioned' languages like C[4].
Software, it seems, has not learned the lesson that "less can be more". Wrapping a process in layers of bureaucracy only slows things down. It never makes society work better, so why would we expect it to make software better? This is the "curse of management". Making re-use efficient is really a question of knowledge management -- keeping things simple and accessible.
So, if we never throw anything away or recycle resources, if we don't strip away the fat of bureaucracy, won't we just consume all the available resources and then collapse? All this interface technology is like fluid in the lungs, or obesity-related apnoea, drowning programs in their own inefficiency. The irony is that as the complexity grows, we typically add more layers to simplify the existing ones, wrapping scripts around software or adding `management layers', like adding a ventilator instead of draining the lungs or losing weight. The Open Source approach is to recycle things in the manner of a car-boot sale: leave it lying around and someone will pick it up. It's cheap and doesn't push the overhead onto someone else.
So I am not a fan of modern programming frameworks like Java and Ruby. I have always distanced myself from researchers who talk about "management layers"; riding on the success of the OSI seven-layer model seems to have driven them mad with its own propaganda. How did adding more ever make something simpler? Well, layers are seductive because they can hide ugliness, but they don't make it go away. Management layering, like make-up, cannot reduce ugliness; it can only help to make the best of what you've got. Traffic lights help to balance the use of a limited shared resource (a crossroads), but they cannot alleviate the basic problem of too much traffic.
When I wrote Cfengine 3, derived from its predecessor, I put considerable effort into cutting out spurious overhead and cancerous growths in the code, as well as in the language. The move led to a leaner, simpler and more powerful result, but it did not pass without criticism. The radical appendectomy, while good for the technology, makes humans uneasy, as no one likes change. It's as if there are hundreds of system administrators trapped in an abusive relationship with their software: they can't fix it, and they're too scared to leave because of what they have already invested. (We inflate the value of the past over the value of the future -- don't ask me why, perhaps we'll die.)
I, personally, have fits of throwing things away and making things better, to counter this tendency. Our CEO at Cfengine, Thomas Ryd, came up with the expression `bloatware' for the feature-bloated software systems of the day, piling feature on feature to correct previous errors. You swallow a spider to catch a fly. Eventually you must swallow a horse, and die of course.
This image of someone trapped by guilt conjures up Walter Benjamin's Angel of History[5]:
"This is how one pictures the angel of history. Where we perceive a chain of events, he sees one single catastrophe which keeps piling wreckage upon wreckage and hurls it in front of his feet. The angel would like to stay... and make whole what has been smashed. But a storm is blowing from Paradise...[and] ... irresistibly propels him into the future to which his back is turned... This storm is what we call progress.[4]"
Even Angelus Novus wanted to `travel light'.
Networks and more
I have only mentioned program code in this essay, but that is the tip of the melting iceberg. There is also the matter of network design, centralized versus decentralized architectures, and lightweight versus heavyweight communication protocols. How many cycles could we save by better design of networks? The whole discussion of scalability stems from the same kind of observations. This is also an area where I have spent a lot of time avoiding complexity in Cfengine.
Well, this essay is already long enough, so this is something for another day.
Oslo 13 Feb 2010
References
[0] http://news.bbc.co.uk/2/hi/technology/7303288.stm
[1] Jarle Bjørgeengen, Puppet and Cfengine Compared, http://www.usenix.org/publications/login/2010-02/index.html
[2] http://www.thewhir.com/blog/Brian_Fry/121109_Greenwashing_Carbon_Offsets
[3] http://www.newscientist.com/article/dn12992
[4] Open Source code re-use
[5] Walter Benjamin, Theses on the Philosophy of History (1940)