Wednesday, October 04, 2017

It is science, Jim, but not as we know it.

Roger Needham once said that computing is noteworthy in that the technology often precedes the science[1]. In most sciences, it is the other way around. Scientists invent new building materials, new treatments for disease and so on. Once the scientists have moved on, the technologists move in to productize and commercialize the science.

In computing, we often do things the other way around. The technological tail seems to wag the scientific dog, so to speak. What happens is that application-oriented technologists come up with something new. If it flies in the marketplace, more theory-oriented scientists then move in to figure out how to make it work better or faster, or sometimes to discover why the new thing works in the first place.

The Web, for example, did not come out of a laboratory full of white coats and clipboards. (Well, actually, it did, but they were particle physicists and they were not working on software[2].) The Web was produced by technologists in the first instance. Web scientists came later.

Needham's comments in turn reminded me of an excellent essay by Paul Graham from a Python conference. In that essay, entitled 'The Hundred-Year Language'[3], Graham pointed out that the formal study of literature - a scientific activity in its analytical nature - rarely contributes anything to the creation of literature - which is a more technological activity.

Literature is an extreme example of the phenomenon of the technology preceding, and in fact trumping, the science. I am not suggesting that software can be understood in literary terms. (Although one of my college lecturers was fond of saying that programming was language with some mathematics thrown in.) Software is somewhere in the middle: the science follows the technology, but the science, when it comes, makes very useful contributions. Think, for example, of the useful technologies that have come out of scientific analysis of the Web - things like clever proxy strategies, information retrieval algorithms and so on.

As I wander around the increasingly complex "stacks" of software, I cannot help but conclude that wherever software sits in the spectrum of science versus technology, there is "way" too much technology out there and not enough science.

The plethora of stacks and frameworks and standards is clearly not a situation that can be easily explained away on grounds of scientific innovation alone. It requires a different kind of science. Mathematicians like John Nash, economists like Carl Shapiro and Hal Varian, and political scientists like Robert Axelrod all know what is really going on here.

These scientists, and others like them who study competition and cooperation as phenomena in their own right, would have no trouble explaining what is going on in today's software space. It has only a little to do with computing science per se and everything to do with strategy - commercial strategy. I am guessing that if they were to comment, Nash would talk about equilibria[4], Shapiro and Varian would talk about standards wars[5], and Axelrod would talk about the Prisoner's Dilemma and coalition formation[6].
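To make the Prisoner's Dilemma point concrete, here is a toy sketch of a standards war as a one-shot game. The scenario and the payoff numbers are my own illustration, not drawn from any of the works cited: two vendors each choose between backing a shared standard and pushing a proprietary stack.

# A toy payoff matrix for two vendors deciding whether to back a shared
# standard ("cooperate") or push their own proprietary stack ("defect").
# The payoff numbers are invented purely for illustration.
PAYOFFS = {
    # (vendor_a_choice, vendor_b_choice): (vendor_a_payoff, vendor_b_payoff)
    ("cooperate", "cooperate"): (3, 3),  # one healthy shared ecosystem
    ("cooperate", "defect"):    (0, 5),  # B locks customers in, A is stranded
    ("defect",    "cooperate"): (5, 0),  # A locks customers in, B is stranded
    ("defect",    "defect"):    (1, 1),  # fragmented stacks, everyone loses
}

CHOICES = ("cooperate", "defect")

def best_response(opponent_choice, player):
    """Return the choice that maximises this player's payoff, given the
    opponent's fixed choice (player 0 is vendor A, player 1 is vendor B)."""
    def payoff(my_choice):
        pair = ((my_choice, opponent_choice) if player == 0
                else (opponent_choice, my_choice))
        return PAYOFFS[pair][player]
    return max(CHOICES, key=payoff)

def nash_equilibria():
    """Yield strategy pairs from which neither vendor can gain by
    unilaterally switching."""
    for a in CHOICES:
        for b in CHOICES:
            if best_response(b, 0) == a and best_response(a, 1) == b:
                yield (a, b)

print(list(nash_equilibria()))  # prints [('defect', 'defect')]

The only equilibrium is mutual defection: each vendor's best response is to push its own stack regardless of what the other does, even though both would be better off agreeing on one standard. That, in miniature, is the plethora of stacks and frameworks - and the question of how cooperation can nevertheless emerge is exactly Axelrod's territory.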

All good science Jim, but not really computer science.

[1] http://news.bbc.co.uk/1/hi/technology/2814517.stm

[2] http://public.web.cern.ch/public/

[3] http://www.paulgraham.com/hundred.html

[4] http://www.gametheory.net/Dictionary/NashEquilibrium.html

[5] http://marriottschool.byu.edu/emp/Nile/mba580/handouts/art_of_war.pdf

[6] http://pscs.physics.lsa.umich.edu/Software/ComplexCoop.html
