Lost Values of Computing

Computing has always been a field laden with implicit values. Arguably the most fundamental of these is efficiency, which is hard-baked into some of the discipline's most foundational theories and insights. This includes efficiency in time (the idea of computational complexity and the fundamental debate around P vs. NP) and in space (optimizing the space requirements of an algorithm or data structure within a given level of complexity). Another is robustness: the ability to recover from failure or disruption, encapsulated in the design of the original Internet networking protocols, which were to some degree inspired by systems intended to withstand a nuclear attack.

Other values have changed over time as the nature of computing technology has evolved. The (until recently, anyway) inevitable churn of Moore's Law, delivering greater storage capacity and more powerful processors at a fraction of the cost of earlier systems, has greatly expanded our notion of what is possible with computing systems. Moreover, the entry of computing into the mainstream economy (in some cases, supplanting it) has led to increased capitalization of the field, allowing us to invest in ideas and infrastructure that would never have been possible even a generation ago.

This trend has certainly had some positive outcomes: putting computing technologies in more people's hands, with more powerful applications, allowing them to accomplish tasks that were previously possible only for experts with sophisticated hardware and software. In short, computing has come a long way from its DIY hacker bricolage past, when it was accessible only to an elite few who happened to have the right hardware and expertise. However, as with all change, this transition has not been without other consequences. In this essay, I reflect on some cultural values that we may have lost along the way, and speculate on how recapturing them might help us achieve a healthier and more sustainable relationship with technology.

Patience and Care: My professors at UW used to tell me about the punch-card days of computing, when they would submit jobs to the mainframe and wait hours (or even days) for the result. If the program had an error (even a trivial one), they would not learn about it until the result came back, which meant some time fixing it, and another long wait before finding out whether the fix worked. This meant that coding was a task to be done with great care and attention, especially around thesis or paper deadlines. On the more positive side, the time spent waiting for your results could also be used productively: at the local campus bar, socializing with friends, for example. I still remember my days as an undergrad working in the Sun Lab at Brown University, where on busy days around programming assignments there was often a waitlist for a workstation. I made some great friends just waiting for a machine to open up.

Frugality: The inherent cost of computing cycles and data storage also led to some elegant (or, some might say, obfuscated) approaches to solving computational problems. I still remember hearing about a program written in assembly that dynamically overwrote its own program memory as a way of reducing the footprint of the running application. The cost (and customizability) of computing hardware meant that rather than replace an outdated machine, you would try to squeeze as much value as possible out of it by upgrading the hard drive, the motherboard, or various peripherals. Even software was expensive: buying a shrink-wrapped computer game at the bookstore meant an investment of months spent learning all of its finer aspects before shifting your attention elsewhere.

Humility: It is hard to remember our field as ever being humble in any real sense of the word. From the beginning, computer scientists have been gripped by fantastic visions of what can be achieved with this technology, whether that has meant creating human-level intelligence, or coordinating the plans and activities of entire nations (or even the world) on the basis of computer-assisted data gathering and analysis. However, the failure of these ambitious projects created room for sincere reflection and recalibration. Refugees from the MIT AI Lab like Terry Winograd and Phil Agre, as well as researchers from other disciplines like Leigh Star and Lucy Suchman, reflected on the limitations of computational approaches to solving complex social problems, and on ways to make computing research and practice more relevant to the human condition. While this critical tradition certainly continues today, much of it focuses on the unbridled power of technology rather than on the inherent limitations of technology and the practices that surround it (with some important exceptions).

Service: Before computing became a trillion-dollar business, it was a hobby, and a way to help others solve their problems. Online fora like bulletin boards and Usenet, as well as open-source hardware and software development communities, embodied this spirit of collaborative problem solving (along with, admittedly, some more pathological and even downright illegal tendencies). While such activity still flourishes on the Internet, it is often overshadowed, in the popular media and in people's attention and time, by stories of well-remunerated software engineers working for companies making private goods, and by the inevitable ascent (and sometimes descent) of these companies as they navigate the chutes and ladders of capitalism. While we all have ample opportunity to serve others with our computing knowledge (especially when visiting our parents' houses), only a few of us do so in any systematic or consistent way.

My point in this nostalgic reverie on the "glory days" of computing is not to paint some picture of a perfect, idyllic Eden to which we must return. Nor do I mean to belittle or dismiss all that has been achieved along the way. Computing has made great strides, not only in its technological capabilities but also in terms of other important values like access and inclusion. While there is still much progress to be made on all these fronts, we can be proud of what we have achieved as a discipline and profession.

But with increased power comes increased responsibility. As computing tightens its hold on our individual attention, and increasingly mediates our social, political, and economic relations, its cultural influence should not be underestimated. By not taking a stand on our core values, we are letting the withering force of global capitalism dictate the direction of this influence. This has had large and disturbing effects on our economy, politics, social relations, and personal well-being.

In the search for appropriate ethical norms, we often seek lessons from other disciplines — like philosophy, law or sociology. While there is definitely much to be learned from this kind of interdisciplinary discussion, my point in this article is to remember that computing itself has had a rich and diverse heritage of cultural norms and values, at least before our newfound riches and deepening interactions with capital.

Perhaps by recapturing some aspects of our shared history and culture, we can once again make computing a more human (and humanizing) influence on the world. More patience and care might help us design and use systems in a way that does not distract us from the parts of life that make it worth living. Frugality might ensure that the systems (and companies) we build do as much as they can, with as little impact as possible. Humility might teach us to be reflective about the kinds of issues that can be solved with technology, and those that require non-technical solutions (or have no "solution" at all). Service might help us create a more human and humane technology industry, where the needs of the rich and privileged do not outweigh those of others who also need systems and tools that work for them.

I'm sure I have captured just some of the values that have defined our history and can inspire and motivate future generations of computer scientists. As our stature grows, I hope that we as a profession take an active role in defining our culture and norms, as fields like medicine, law, and engineering have done. By doing so, we will not only ensure that the best and brightest continue to flock to our courses and programs, but that they do so for the right reasons, contributing to a legacy and impact that we can be proud of.