Technologies That Multiply Inequalities

Gabriel Levy reviews The Bleeding Edge: why technology turns toxic in an unequal world by Bob Hughes (Oxford: New Internationalist, 2016)

How is it that the internet – a technology with such powerful, democratic potential – hovers over us like a monster that intrudes and spies, interferes in our collective interactions and thought processes, force-feeds us corporate garbage and imposes new work disciplines? What happened?

Bob Hughes, by thinking both about how computers work and how society works, offers compelling insights about these and other questions.

One of Hughes’s riffs on technological themes starts with the Forbes web page about Marc Andreessen, one of the world’s richest men, who in 1994 released the first version of the Netscape Navigator browser. Netscape’s stock market flotation the following year made Andreessen an instant multi-millionaire.

Picture of a demonstration: Rainer Jensen/AP.

When Hughes visited the page, he found 196 words of information about Andreessen (last week, when I looked, there were only 81). Hughes found (and so did I) about another 1000 words of promotional and advertising material.

Behind those words lay an HTML page of 8,506 words, or 88,928 characters. The promotional material multiplied the capacity required by the page roughly 88 times, compared with the thousand or so characters needed for the information itself – and that was without counting the graphics. Including these, the page took up about three-quarters of a megabyte of memory.
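To make the arithmetic explicit, here is a rough back-of-the-envelope check; the figure of about five characters per word for the visible information text is my assumption, not Hughes’s:

```python
# Back-of-the-envelope check of the figures quoted above.
info_words = 196                # words of actual information about Andreessen
chars_per_word = 5.2            # assumed average word length, spaces included
info_chars = info_words * chars_per_word

page_chars = 88_928             # characters in the raw HTML page, per Hughes

print(f"information text: ~{info_chars:.0f} characters")
print(f"whole HTML page:   {page_chars} characters")
print(f"ratio:            ~{page_chars / info_chars:.0f}x")  # close to the 88 quoted above
```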

In the mid-1990s, Netscape Navigator made such advertising displays easy. It allowed websites to set “cookies” on users’ machines – small records that track the information users enter and store details used in online commerce. It effectively buried the principle with which Tim Berners-Lee and the other founders of the world wide web began: that the system would rest on a huge mountain of identifiable, text-only documents.
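For readers who have never peeked behind the curtain, a cookie is simply a small name-value record that a site asks the browser to store and send back on every later visit. Here is a minimal sketch using Python’s standard library; the “visitor_id” identifier and its lifetime are made-up illustrations, not anything Netscape actually shipped:

```python
from http.cookies import SimpleCookie

# How a site asks the browser to remember it: the server sends a Set-Cookie
# header, and the browser stores the value and returns it on later visits.
cookie = SimpleCookie()
cookie["visitor_id"] = "abc123"                         # hypothetical identifier
cookie["visitor_id"]["max-age"] = 60 * 60 * 24 * 365    # persist for a year
print(cookie.output())     # Set-Cookie: visitor_id=abc123; Max-Age=31536000

# On the next request the browser sends the cookie back, so the site can
# recognise the same user and link together everything they do there.
returned = SimpleCookie("visitor_id=abc123")
print(returned["visitor_id"].value)                     # abc123
```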

Hughes describes how Netscape Navigator and other developments opened the way for the dotcom bubble on the world’s stock exchanges, in 1997-2000, and a new technological leap, Web 2.0. This made possible live search functions, interactive social media, and the proliferation of online auction, booking, commercial and banking sites.
“Web 2.0 was created under pressure from increasingly powerful e-business interests, against principled resistance from the World Wide Web Consortium, whose founding principles included a commitment to making the Web accessible to all, including the visually handicapped and the blind, and in as many different ways as possible,” Hughes writes (p. 226).


The interactivity of the modern web is bound up with an obsession, among those who control it, with speed – the number-crunching power that allows pages to be updated and re-rendered instantaneously.

The crunching power required has in the last 15 years turned the internet – which is often thought of as a relatively resource-light technology – into a massive user of electricity. Humungous data centres are built in parts of the USA where their owners cut special bulk-purchase deals with the local power companies. (See New York Times report here.)

Greenpeace has published estimates of the data centres’ electricity demand, compiled in the teeth of the internet companies’ maniacal secrecy, ranging from 263 billion kWh/year (including 76 billion kWh in the USA) for data centres only, to 372 billion kWh/year for data centres and telecoms infrastructure. (Another good article here.)

Consumption by the whole “cloud”, including the manufacture and use of devices, has been estimated at 623 billion kWh/year. That is more than the total national consumption of every country except the USA, Russia, Japan and China – more, for example, than India’s total consumption for all purposes.

Hughes discusses all this and more. He argues that a society as unequal as capitalism constantly strangles and misdirects technological potential, instead of developing it.

Hughes points to ways that things could have gone, had it not been for the corporations’ grip – which was tightened not in the first stages of internet development, when the military and optimistic amateurs, by turns, pushed things forward, but later. There is a lengthy passage on the possibilities of analog computer technology, which were left unrealised as digital technology and its corporate controllers gained precedence.

The Moniac computer in the Science Museum in London (see “About the picture of Moniac”, below)


In a section on the incredible, and largely planned, obsolescence of computers, he writes:

“Computers could be used to increase the size of the ‘economic pie’ for everyone while reducing human environmental impact. This was the confident expectation in the 1970s – but it would have required a large-scale determined commitment to mutuality that failed to develop” (p. 133).

Such an attitude would always have struggled to thrive in a society that has “turned inter-personal rivalry into a cardinal virtue”, Hughes argues.

Hughes is attentive to the way that inequality not only provides the context for technology, but pervades the industry itself. Workers at the Foxconn factories in China, where there was a large number of suicides in 2010, were paid $130 per month, about one 31,000th of the salary of Apple’s then CEO, the late Steve Jobs, who was on $48 million a year.

“The inequality embodied in something like an iPhone could not be more different from the egalitarianism that made it possible, but it is not without precedent” (p. 57). Those precedents go at least as far back as the technological developments in the 16th-century textile industry that enabled employers to wreck skilled labour systems and impoverish those who depended on them.

Hughes is no technophobe. The point he hammers home again and again is that technologies – and his examples come not only from computing history but also from, for example, materials manufacture and road transport – take the directions they do because of the way society works. His emphasis is on inequality. At times it was unclear to me how he thought inequality could be overcome, except by going beyond capitalist property relations.

Nevertheless, I found Hughes’s approach a refreshing antidote to the one-sided technological determinism of writers associated with the “left” such as Paul Mason, in Post-Capitalism (see my review here) and Nick Srnicek and Alex Williams, in Inventing the Future (see my comments here).

Hughes writes with knowledge, and feeling, about the egalitarian and liberatory potential of computers and other technology. He considers voices in technology that challenged its social role, such as Norbert Wiener and Stafford Beer, both pioneers of cybernetics. He has a fascinating chapter on Cybersyn, the cybernetics programme used by Salvador Allende’s government in Chile – on which Beer, who was British, worked – before it was overthrown in the bloody CIA-backed military coup of September 1973. (See an article by Eden Medina here and another one here.)


Stafford Beer


I warmed to Hughes’s optimism. In his final chapter, he argues that we need to “take the whole notion of Utopia more seriously”. In a more equal world, “jobs would serve community needs rather than profit, caring roles would be a priority and automation would encourage skilled work rather than eliminate it. But to arrive there we will need to undermine the ‘apparatus of justification’ on which inequality depends” (p. 303). The Bleeding Edge is a fine contribution to a serious conversation about technology and radical social change. GL, 5 December 2017.

About the picture of Moniac (the Monetary National Income Automatic Computer). Moniac was built in 1949 by Bill Phillips, a New Zealand economist, and is now displayed in the Science Museum in London. Hughes writes: “Phillips had realised that the mathematical models John Maynard Keynes and Joan Robinson had used to describe economic systems were very similar to ones used in hydraulics. […] His computer modelled the flows of cash, credit, savings and investments in an economy with a system of plastic tubes, reservoirs, pumps and valves. […] MONIACs were used in teaching and in research at the London School of Economics and several other institutions until Keynesian economics went out of fashion in the 1970s” (pp. 232-233). This is an example of analog computing, which physically models systems you are trying to control or examine, and which has since died out.
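As a very loose illustration of the kind of circular-flow model that Moniac computed with water, here is a toy numerical sketch; the figures and the simple consumption rule are my own illustrative assumptions, not Phillips’s equations:

```python
# Toy circular-flow model in the spirit of Moniac's hydraulics: 'income' is
# the water in the system, a fixed share returns as consumption each period,
# and a constant trickle of investment tops it up.

def simulate(income=50.0, propensity_to_consume=0.8, investment=20.0, steps=12):
    """Next period's income = consumption out of this period's income + investment."""
    history = [income]
    for _ in range(steps):
        income = propensity_to_consume * income + investment
        history.append(income)
    return history

# Income converges towards investment / (1 - propensity_to_consume) = 100.
for period, level in enumerate(simulate()):
    print(f"period {period:2d}: income = {level:6.1f}")
```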


Note (21.12.17). This article has been corrected, to say that Stafford Beer was British, not American.

Gabriel Levy blogs @ People and Nature
