
Digital Media and Society

This book is dedicated to Steve Hutchinson,

who taught me electronic sticklebricks.

Digital Media and Society:
An Introduction

ADRIAN ATHIQUE

polity

65 Bridge Street, Cambridge CB2 1UR, UK

350 Main Street, Malden, MA 02148, USA

Contents

List of Figures and Boxes
Acknowledgements
Introduction
Part I: Digital Histories
  1   Building a Digital Society
  2   The Socio-technical Interface
  3   Typing the User
  4   Audience as Community
Part II: Digital Individuals
  5   Pleasing Bodies
  6   Reality Checks
  7   My Personal Public
  8   Going Mobile
Part III: Digital Economies
  9   The Road to Serverdom
10   Digital Property
11   Consuming Power
12   Information at Work
Part IV: Digital Authorities
13   Virtual Democracy
14   Under Scrutiny
15   Managing Risk
16   Living in a Cloud
Postscript: Towards a Digital Sociology
Bibliography
Index

Figures and Boxes

FIGURES
  3.1   The evolution of the user concept
  4.1   Media systems by diffusion structure
11.1   The ‘old media’ industries
11.2   The ‘new media’ industries
11.1   The ‘long tail’
BOXES
  1.1   Characteristics of an information society
  1.2   Timeline of computerization
  1.3   Characteristics of a network society
  2.1   Social shaping of digital media
  2.2   Post-development social shaping of the Internet
  3.1   The determination of the user
  3.2   Social shaping of the user
  4.1   What is an audience?
  5.1   Digital liberation
  5.2   Visual pleasure and the virtual body
  5.3   Digital degeneration
  6.1   The order of simulation
  6.2   Orders of reality
  6.3   Common sociological concerns with computer games
  6.4   The game algorithm
  6.5   Computer game genres
  7.1   Social benefits of SNS
  7.2   Negative impacts of SNS
  9.1   Neoliberal political economy
  9.2   Neo-Marxist political economy
10.1   The principles of Free and Open Source Software
10.2   Key arguments for copyright regimes
10.3   Arguments for a free-use culture
11.1   Definition of the prosumer
12.1   Identified subsectors of the creative industries
12.2   Internet cultures and managerial roles
13.1   The Internet as a public sphere
13.2   Blogging the question
14.1   Evolution of surveillance techniques
14.2   The argument for digital surveillance
14.3   Arguments against digital surveillance
15.1   Potential cyber-crimes
15.2   Typology of cyber-crime 1: technology
15.3   Typology of cyber-crime 2: legality
15.4   Definition of hacking crimes
15.5   Consequences of digital piracy
15.6   Digital sex crimes
15.7   Typology of cyber-crime 3: occurrence

Acknowledgements

A work of this kind, stemming as it does from many years of teaching and learning in the field of digital media, owes a considerable debt of gratitude to the many colleagues with whom I have worked on the various subject areas covered in this book. Accordingly, I would like to express my gratitude to Doug Clow and James Aczel at the Open University, Geoff Cox, Phil Ellis and Phaedra Stancer at the University of Plymouth, Graham Barwell, Kate Bowles, John Robinson and Chris Moore at the University of Wollongong, and Michael Halewood, Berfin Emre, Richard Davis and Rebecca Ellis at the University of Essex. I would also like to thank other colleagues, past and present, whose work is referenced here, including Mark Andrejevic, Melissa Gregg, Lynne Pettinger, Gareth Schott and Graeme Turner. For their support in providing space for the development of a substantive programme in digital sociology at Essex, I would also like to thank Mike Roper, Rob Stones, Sean Nixon and Eamonn Carrabine. Thanks are due also to Michael Bailey, for taking up the charge. I would also like to recognize the important contributions made to this endeavour by all of the students that I have taught in Australia, New Zealand and the UK. I am confident that their contributions, in terms of testing, challenging and extending the ideas discussed here, will grow in significance over the years. Finally, I would like to express my profound gratitude for the support provided by my editor at Polity Press for this book, Andrea Drugan. Like so many of the authors who have worked with Andrea in recent years, I have been struck by her extensive knowledge of the field, her enthusiasm for critical pedagogy and her capacity to deliver sound practical advice consistently and in real time. I would also like to thank Lauren Mulholland, Clare Ansell and Helen Gray, along with the reviewers of the proposals and drafts and everyone at Polity Press who strived to make this book as good as I wanted it to be.

Introduction

Over the last two decades, our view of mass communication in modern society has been extensively reconfigured by the ‘new media’ applications stemming from the rollout of digital technologies. In so many different ways, the digital media have come to be seen as the definitive technology of our times. The powerful combination of mechanical calculation, electronics, binary code and human language systems touches us in almost every aspect of life. In effect, the digital media have become the ‘operating system’ of almost everything else. In everyday life, our interpersonal relationships are conducted in large part through digital communications. The institutions of work and governance are finely regulated by the same inexorable logics of programmatic management that lie at the heart of the technology. Our access to the vast stores of human knowledge, to cultural expression and to significant events unfolding in societies across the world is overwhelmingly mediated through various forms of digital media ‘content’. Digital media, then, take us close to the rhythm of social life across the broad scale of human affairs. To understand these phenomena fully, we have to discover how a ‘whole way of life’, in all its complexity, becomes infused with the presence of digital systems. As such, the multifaceted relationship between digital media and human actions poses one of the most complex questions facing contemporary sociology, and one that impacts upon almost every academic discipline in one aspect or another.

Naturally, given the enormous breadth and scale of the subject matter, this book cannot encompass every possible instance or example of our encounter with the digital. My objective, instead, is to provide a relatively accessible and succinct account of some of the major areas of sociological concern. Accordingly, this book will examine a broad range of phenomena – from social networking and digital labour to the rise of cybercrime and identity theft, from the utopian ideals of virtual democracy to the Orwellian nightmare of the surveillance society, and from the free software movement to the seductions of online shopping. Many of these topics have been the subjects of recent works in their own right, and I would encourage you to read further in the areas that you find most germane or compelling. It remains important, nonetheless, to understand the intrinsic connectivity between them, and that is why I hope this book can provide a useful primer for those interested in understanding the digital media in depth. In taking this larger view, I will not put forward any single explanation or seek to predict the future evolution of digital society. The overarching aim of this work, instead, is to situate the rise of the digital media within the context of dynamic social interaction and to encourage a critical engagement with our complex and rapidly changing world. To put that more simply, I want you to read this book and make up your own mind about what is going on.

In that light, I have chosen to approach the topics covered here from a range of sociological perspectives that each have something to offer to readers willing to make their own assessments of the claims being made. In each case, this will not be the only possible approach to that topic, and you will also need to make up your own mind about which sets of theories and models make the most sense to you. In the process, we will interrogate many of the classic questions of sociological inquiry in their digital manifestation: the competing forces of structure and agency, the predominance of conflict or consensus, the relationship between action and meaning and the interface of individual and collective experience. This necessarily entails a degree of eclecticism. The role of technological mediation has been at the heart of the major sociological theories of the past thirty years but, at the same time, a good deal of the sociological examination of the digital media has been conducted within other academic disciplines. Once again, the collation of all this work is naturally beyond the scope of this book, but in bringing together what I can in brief, I hope that new light can be shed on areas both familiar and unfamiliar to your studies thus far.

I should also note that ‘digital society’, the term that I employ here, is not a term in common use. I am using it precisely because the other available terms are all in some sense partial or overlapping with the terrain that I wish to explore. Each of them favours the structure, usage or form of digital communication in different ways. Taking the broader notion of a digital society therefore allows us to consider their significance in more relative terms. My intention is also one of synthesis, since various approaches have currency and validity and therefore merit our attention. Arguably, it is also the case that all of these models tend to privilege the broader (macrological) forms of society. At various points in this book, I will do the same. However, I will also take pains to emphasize that society is something that we live in, and to which we all contribute at the level of everyday life (that is, the micrological). Human beings make digital technologies social, and that puts human beings at the heart of any digital society. There is also a notable anglophone bias to my presentation, a charge to which most studies of the ‘information revolution’ must also answer. Human societies are, of course, somewhat distinctive in both character and function, and we should expect their ‘digitization’ to reflect this. However, keeping within the art of the possible, I will not seek to account for the full order of cultural diversity. I will instead ask you respectfully, as a reader, to make sense of what is discussed here within your own social and cultural situation.

Adrian Athique
November 2011

Part I

Digital Histories

CHAPTER 1

Building a Digital Society

It is important to note that many of the structural possibilities of a digital society were foreseen in the early days of electronic computing (and even before). Others were either unanticipated or came more from the pages of fantasy novels than from the rational projection of technologists. Parallel processes were in play. This first chapter casts its eye firmly on the futurism of the past, specifically from the advent of modern computer science in 1936 to the crisis of undue complexity in 2007. In this brief history, I will recall how digital technologies, through a virtuous circle of mediation, came to reinforce and implement particular ways of conceptualizing and ordering society in the late twentieth century. As the application of digital technologies gathered pace from the 1970s onwards, the various imperatives driving technological change were also reshaping the dominant political structures in developed and developing societies. Because of the apparent confluence of technical and social change, it has become commonplace to argue that the impact of digital technologies is comparable to that of the Industrial Revolution (Castells 1996). An ‘information revolution’ is seen to mark an irreversible transition from physical to intangible (untouchable) commodities and actions, and from embodied to mediated social processes (Leadbeater 2000). More cautious scholars have argued that there is a crucial element of continuity with the previous ‘industrial’ age in terms of a progressive technical automation of human functions and social organization (Webster 2006). In that respect, the digital media have a substantial and fascinating history, and it is worth situating these arguments against the past evolution of digital society.

Information Society and the Atomic Age

The digital technology of today finds its origins in the mechanical calculating machines invented during the nineteenth century. The ‘analytical engine’ conceived by Charles Babbage in the 1830s was the first machine intended to process and store information with multiple purposes and flexible configurations of usage (Swade 2001). A prototype of this machine was completed by the beginning of the twentieth century. The storage of information records on ‘punched’ cards was also an innovation of the nineteenth century that would later become a key component of computing in the twentieth century. In 1936, Alan Turing introduced the concept of the ‘Turing machine’, a device that would make calculations based upon a large store of printed information which could be selectively applied for mathematical processing (Petzold 2008). Turing took this concept further to demonstrate the idea of a ‘universal machine’ that could read the description of any computational process (an ‘algorithm’) and then simulate its operation. Turing was one of the most significant figures in modern mathematics and as a consequence, following the outbreak of the Second World War, he was recruited to work at Britain’s secret code-breaking centre at Bletchley Park. Turing famously devised the ‘bombe’ machine in order to decipher the secret codes produced by the German cryptological machine (the ‘enigma’) (Copeland 2004).

The global conflagration that killed between 50 and 70 million people in the mid-twentieth century occurred on the cusp of several major scientific breakthroughs, including not only computational machines, but also modern electronics and nuclear physics. In that respect, the war years (1939–45) were as much a scientific and technological contest as they were a military one. The most technologically advanced nations in the world, Britain, the United States and Germany, effectively conscripted their scientific talents and applied them relentlessly to military applications, culminating in the advent of computers, missiles and the atomic bomb in the 1940s. It was in that context that, in 1941, Konrad Zuse developed the first programmable machine operated through information stored in binary code. The United States built the first electronic computer in 1941 and Britain developed an electronic device with limited programmability (the ‘colossus’) in 1943 (Copeland et al. 2006). In 1942, Britain took the momentous decision to share all of its scientific secrets with the United States, and the collaboration between the two countries enabled them to surpass Germany in the fields of logical computing and atomic weaponry. Needless to say, the atomic bomb, and its use against Japan in 1945, was an epochal moment in human history. The emergence of modern computer science, however, was shrouded in wartime secrecy, and its significance did not become fully apparent until a number of years after the war.

The United States Army built the ENIAC device in 1946 to aid in the successful delivery of missile weapons, whilst Britain built the first programmable electronic computers (the ‘Manchester computers’) between 1948 and 1950. Accordingly, the pursuit of electronic computing – in primitive but strategically important forms – by the major antagonists of the Second World War is commonly seen as heralding what has been called the ‘information age’. The conflict had brought together large numbers of scientists, academics and technicians on an unprecedented scale and had demonstrated how major technical achievements could be made quickly through such systematic collaboration. It was this experience that underpinned the decision to massively expand university and technical education in the post-war decades. In making his assessment of these developments for the future, Vannevar Bush, the director of the US Office of Scientific Research and Development, wrote an essay in 1945 in which he reflected on the growing specialization of knowledge and the new tools for managing information that would become essential in the post-war world. Famously, Bush projected the imminent arrival of a desktop information management machine that he called the ‘Memex’. The Memex would facilitate the storage, retrieval and, most critically, the linkage of information customizable to the needs of each user.

A Memex is a device in which an individual stores all his books, records and communications, and which is mechanized so that it may be consulted with exceeding speed and flexibility. It is an enlarged intimate supplement to his memory. It consists of a desk, and while it can presumably be operated from a distance, it is primarily the piece of furniture at which he works. On the top are slanting translucent screens, on which material can be projected for convenient reading. There is a keyboard, and sets of buttons and levers. Otherwise it looks like an ordinary desk. In one end is the stored material. The matter of bulk is well taken care of by improved microfilm. Only a small part of the interior of the Memex is devoted to storage, the rest to mechanism. Yet if the user inserted 5,000 pages of material a day it would take him hundreds of years to fill the repository, so he can be profligate and enter material freely.

Vannevar Bush (1945) ‘As We May Think’, Atlantic Monthly, 176(1): 101–8

The development of ‘mainframe’ computers in the 1950s and 1960s produced rapid leaps in the application of electronic computing to solving advanced mathematical problems. These machines were far from the desk-based device envisioned by Vannevar Bush, commonly occupying an entire room or more. Mainframes required a massive amount of power and a large team to maintain and operate. Nonetheless, the energies spent upon the development of these machines stemmed from a widespread recognition that the concentration of information in forms that could be processed in any number of ways would open up enormous potentials for scientific development. Computerization would simultaneously solve the problem of memorizing and managing all that information. The speed of electronic processing promised to overcome the time- and scale-based limitations of human thinking. This step-change in efficiency could obviously be applied to scientific experiments, but also to any number of large and complex processes employed in military, bureaucratic and manufacturing applications. ‘Information management’ would no longer be a technique of making and maintaining records, but rather a dynamic process of experimentation that employed digitized records (‘data’) as its raw material.

Cold War and White Heat

The 1950s and 1960s were characterized by the onset of the ‘Cold War’, a period in which the wartime allies of the capitalist West and communist East were pitted against each other in an intense scientific and technological contest to master the new technologies of the age. These (dangerous) rivalries were also expressed in their respective desires to demonstrate the supremacy of their opposing economic systems. As such, the potential of computing to improve the efficiency of industrial production was quickly recognized both by state-owned enterprises in the communist bloc and the private industrial corporations of the Western world, in which the United States had now become predominant. The pursuit of ‘information technology’ was intended to transform the productive process of global industry, with this modernization furnishing a capacity to rapidly develop and commercialize any number of new technologies. In 1963, Harold Wilson, then leader of the British Labour Party and soon to become Prime Minister, referred to the ‘white heat’ of a technological age. The focus of commercial competition was therefore shifting from territorial expansion to the pursuit of more efficient industries and markets via rapid automation. Three years before, US President Dwight D. Eisenhower had already spoken of the new institutional form of scientific research and its co-evolution with what he called the ‘military-industrial complex’ (1961).

It was the machinery of ‘high technology’ that caught the public imagination in the 1960s, via the ‘space race’, nuclear power and the domestication of electronics (notably television). The new centrality of information management, however, subsequently proved to be an equally profound development in the remaking of the modern world. By the 1970s we had entered an era in which vast stores of information appeared to hold greater significance than large volumes of physical resources. All forms of human processes, concepts and activities were being recorded as data that could, in turn, be applied and improved by the machinery of information technology. The perceived outcome of ‘computerization’ was that electronic calculation could increase both the scale and speed of almost any process repeatedly and infinitely. Thus, it was not simply the capacity of the computer to hold inconceivable amounts of information, but its programmatic capacity to select the right bits of data and process them in new combinations that was permanently revolutionary. In the process, computerization promised to make the conduct of almost any complex undertaking vastly more efficient. The Cold War militaries were engaged in a dangerous nuclear stand-off, a game that demanded automation and the elimination of human error. Military applications thus took primacy in the paranoia of the 1950s, but by the end of the 1960s computer processing was also applied with enthusiasm to all the institutions of modern life. Universities, the traditional storehouses of information, were at the forefront of this process and were influential advocates of computerization.

The benefits of greater speed and efficiency in information processing were also immediately obvious to the various branches of modern government and to the commercial corporations of the day. Sociology, as a putative science of social organization and systematic observer of human behaviour, was a natural partner in the process of ‘informationalization’. With the computer serving as a technology of record, the more information that could be collected about every aspect of society, the more efficiently society could be assessed and managed. Equally, there were clear commercial benefits in knowing more about the habits of consumption in mass society. Ever more complex industrial processes became conceivable, making automated mass production bigger, faster and more innovative. It is fair to say, then, that computerization in the era of the mainframe was overwhelmingly corporate in scale and instrumental in purpose. The possession of data and the means to process it was expected to confer advantages that were inherently competitive in intent and managerial in flavour. Efficiency became the watchword of the day. In order to achieve these aims, it was necessary in the first instance to put computer technology to work on itself, rapidly advancing the technology and automating the process of its own development. Thus, information about computing itself became a major objective of scientific research, and algorithms for the computation of information (the ‘software’) became a major constituent in the development of the electronic components (the ‘hardware’) intended to carry out those functions.

It was, without a doubt, the United States that led the charge in computerization. Its Soviet adversaries made huge efforts in developing their own mainframe systems, while Britain, with its diminishing resources, struggled to keep up. Other European countries pooled resources to stay in the ‘high-technology’ game. In the developing world, India was keen to commit to the development of computer science, while Japan (under the post-war tutelage of the United States) pursued this new technological and managerial paradigm with unparalleled enthusiasm. As a consequence, it is hardly surprising that when the information age was given structural expression in the model of an ‘information society’, this took the form of a political economy whose major advocates were American (Drucker 1959; Bell 1973) and Japanese (Masuda 1990). This emerging perspective on social structure has seen many different iterations in theory, but the hallmark of information society theory is social organization via data processing. In Robert Hassan’s definition: ‘At the broadest level of conceptualization, we can begin by saying that the information society is the successor to the industrial society. Information, in the form of ideas, concepts, innovation and run-of-the-mill data on every imaginable subject – and replicated as digital bits and bytes through computerization – has replaced labour and the relatively static logic of fixed plant as the central organizing logic of society’ (2008: 23).

Box 1.1 Characteristics of an information society

  •   Knowledge displaces skills – fundamental importance of guiding processes over physical actions
  •   Mechanical archives – complete automation of informational processes
  •   Social life as data – unprecedented collection and collation of information on human activity
  •   Purposeful knowledge – value is extracted from the application of information rather than its meaning or essence
  •   Continuous innovation – configuration of data in new forms becomes the basis of knowledge production
  •   Competitive velocity – the accelerated speed and efficiency of information techniques constitute an advantage in all fields of activity
  •   Exponential change – the primary goal of the ‘information revolution’ is the total transformation of human affairs

The Flowering of Electronic Revolutions

The co-evolution of computerization and electronics was marked by a series of important hardware developments. The biggest breakthroughs were the transistor and the subsequent development of the integrated circuit (silicon chip), which allowed for the photolithographic production of millions of tiny transistors in cheap, powerful and extremely small computer processors. Many of these innovations had far wider applications than the rapid upscaling of computing machines. The post-war decades were also the era in which the Western democracies pursued the dream of a society in which productive efficiency put mass consumption at the heart of everyday life. As such, the fruits of electronic technology became rapidly available in the form of affordable consumer devices that quickly came to transform the domestic environment. This was an era of new ‘labour-saving’ devices that automated everyday chores and paved the way for women, in particular, to use their increasing free time to enter the paid workforce. This, in turn, increased the spending capacity of the family unit and further accelerated the purchase of devices designed to support a faster and more mobile way of life. The most striking domestic electronic device, however, was the television. Television brought an unprecedented range of information and popular entertainment into living rooms, irrevocably shifting the primary terrain of cultural consumption from public to private spaces. Along with the transistor radio, the electric record player and the telephone, the television quickly came to mediate our cultural environment (McLuhan 1964). Indeed, it was television that popularized the very term ‘media’ in everyday usage, and made media studies a logical proposition for academic study.

The culture industry, which Theodor Adorno and Max Horkheimer famously described in 1944, expanded rapidly to produce a vast array of accessible cultural works which could be enjoyed via the machinery of the home (Adorno and Horkheimer 1993). Thus, the era of electronics was also to be the era of the pop song, the soap opera and, along with them, domestic advertising. For the generation born in the twenty years after the end of the Second World War (the so-called ‘baby boomers’), the new milieu of consumer electronics and mass consumption became an all-encompassing social environment. Since the realization of a computerized society took place within the span of their own lifetimes, the baby boomers became critically important in the shaping of the digital present. By the end of the 1960s, the baby boomers were hitting their twenties. Due to massive public investments by the previous generation, they were far better educated on average, and the long post-war boom in the West gave them ready access to jobs, housing, contraception and medical care. The fruits of the post-war dream thus fell into the hands of those who came to have quite different aspirations from their parents. Overall, they expected more material affluence, more individual freedoms (sexual, financial and cultural) and, by various means, they demanded unprecedented social mobility.

In their youth, the baby boomers also became particularly enraptured by the transformation of popular culture that took place as they were growing up. With the massive expansion of the cultural marketplace and the rising purchasing power of ordinary people, pop music, film, fashion, art and television were all newly accessible domains in which conscious and constant cultural experimentation was taking place. These developments, by their very nature, touched those in all walks of life, but it was the youth demographic in the educated classes that came to demonstrate (quite literally) their impatience to take up the reins of the information age. Their challenge to the old establishment and its values was expressed in the heady radical populism of 1968 (Doggett 2008). It was in the following decade, however, that middle-class baby boomers actually came to replace their predecessors in the professional domain. Nonetheless, the politics and revolutionary ambitions of the 1960s were reflected in the expression of a vague ‘counter-culture’ in the United States and Western Europe. This constituted a new view of social participation that valued self-expression above all else, and eschewed the old certainties and conformities of the 1950s (Roszak 1995). Baby boomers (in very general terms) placed their faith instead in technology, personal ambition and free speech as determinants for a better world. Many of the brightest of the generation, regardless of their wider politics, were infused with these values and it was logical, therefore, that many of them took a keen interest in computers.

What made the information revolution of the 1970s technologically different from the preceding era of the mainframe was that it was achieved in large part by ‘distributed computing’. The twin rise of computerization and consumer electronics had driven a process of miniaturization that had begun with the demands of the space race. Electronic components had been subjected to a continuous process of refinement that made them smaller and therefore suitable for a wide range of new applications. In computing, this led to the recognition of ‘Moore’s law’, which famously predicted that the number of transistors that could be fitted on an integrated circuit (and with it the available processing power) would double roughly every two years, reducing the size of computers and leading to exponential growth in computer power. While many of the larger technology corporations continued to concentrate on the supercomputers that were their stock-in-trade, the younger generation of computer technicians turned their attention to the possibilities of much smaller multipurpose devices, ushering in the era of the ‘microcomputer’. These single-user devices could be assembled from basic components, were fully programmable and easily fitted on a desktop. A number of small companies were founded with the intention of commercializing basic microcomputers, including the now legendary Apple Computer in 1976.
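To give a rough sense of what this doubling implies (an illustrative sketch, not a figure taken from the text), Moore’s law can be written as a simple growth formula:

$$P(t) = P_0 \times 2^{t/2}$$

where P_0 is the processing power available at some starting point and t is the number of years elapsed. On this schedule, a single decade (t = 10) multiplies the available capacity by 2^5 = 32, which is why growth of this kind is described as exponential rather than merely incremental.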

Initially, microcomputers were sold as build-at-home kits for the technically minded and required knowledge of electronics and software programming. These were rare skills outside of the scientific community, but there was sufficient interest from the public to encourage the world’s biggest computer corporation, IBM, to develop its own microcomputer design. In 1980, IBM commissioned a tiny company of microcomputer enthusiasts to write the operating software for this machine. This company, Microsoft, had the vision of putting a computer ‘on every desk and in every home’ (Gates 1995). The logic of this was that the concentration of computing power in large ‘timeshared’ devices would soon give way to a vast number of smaller machines that effectively distributed computing capacity amongst individual employees by giving them their own personal machine with software oriented towards their basic everyday tasks. The development of computer programs for ‘word processing’, ‘spreadsheets’ and basic ‘databases’ made microcomputers ideal for the modernizing office, and the affordability and versatility of microcomputers made them quick to deploy. Along with IBM, new companies such as Apple Computer marketed mass-produced ‘personal computers’ (PCs) that came straight out of the box with pre-loaded software. Before long, PCs were augmented with ‘ethernet’ technologies that allowed them to send messages to each other across an organization’s internal network. This made electronic mail (demonstrated in 1972) available on a mass scale and further accelerated the ‘computer revolution’ in the workplace.

Even as the counter-culture generation shouted at the establishment in the West, the industrialized nations were shaken by the oil crisis of 1973, which definitively ended the era of cheap energy on which their post-war economies had been founded. Throughout the 1960s and 1970s, the upheavals stemming from the decolonization of the old European empires also saw the antagonisms of the Cold War spread across the continents of Africa and Asia. Western capitalism was forced into a new era where costs rose, markets closed and profits inevitably declined. Simultaneously, Germany and Japan (having rebuilt themselves with astounding speed after their wartime defeat) had quickly put into place rapid advances in electronic manufacturing that introduced much more intensive competition into the global manufacturing economy. To respond to these twin challenges, American and British capitalism simply had to become more efficient. Given that information technology had already established itself as an available solution, its application throughout the economy was accelerated. In order to streamline day-to-day operations, the business community and public administration turned to personal distributed computing with vigour. IBM, having neglected to secure exclusive rights to Microsoft’s operating system (initially MS-DOS, later Windows), saw its PC design widely copied by manufacturers across the globe, whilst Microsoft’s founder, Bill Gates, went on to become the richest man in the world.

The era of personal distributed computing was the moment in which software surpassed hardware in importance, and well-designed programs running on fairly simple computers could generate enormous revenues. Bill Gates was much more than a skilled programmer, however. He was also an ardent believer in, and advocate for, the information revolution. Gates believed that the efficiency savings of personal computers would overcome the contradictions and crises of capitalism and lead to a world where business worked well, and worked well for everybody (1995). He found a receptive audience, since the early fruits of distributed computing coincided with an equally revolutionary economic doctrine coming to the fore, which sought above all else to reduce workforces, streamline administration and invest in entrepreneurial projects. The new information technology (IT) companies fitted this bill quite nicely, and their products also made these objectives immediately plausible. Generational change in the political and business establishment brought another group of revolutionaries to prominence, in this case committed to the triptych of efficient capitalism, small government and computer science. Thus, in many different ways, the older institutional form of the information revolution became inflected with the various cultural and commercial revolutions of the day.

Revolutions, of course, require wide public participation and, throughout the 1980s, people from all walks of life embraced computerization with suitable fervour. In the home, computer gaming consoles and ‘pocket money’ machines such as the ZX81 and Commodore 64 encouraged children to learn computer skills from a young age. The cultural industries, always on the lookout for something new, ‘got into computers’ in a big way, with digital music and digital animation infusing the popular culture of the 1980s. Thus, it was machines of almost exactly the kind envisioned by Vannevar Bush that quickly came to dominate our vision of the future. It had taken thirty years, which, of course, is a very small space of time in the context of science and technology. During those intervening years, computers (real and imagined) had become part of the fabric of popular culture. From Captain Kirk’s handheld communicator on Star Trek (1966) to the pathological computer in Kubrick’s 2001: A Space Odyssey (1968), to ‘R2-D2’ in Star Wars (1977), ‘Orac’ in Blake’s 7 (1978) and countless others, the socialization of the information machine was widely reflected in our visions of the future. In the economic domain, fortunes were made and economies were remade. In the university system, the classicism of the past was replaced by a restless venture futurism, as a new generation of students went into business administration and computer science. All of this, however, was far from being the endpoint in the making of a digital society.

Box 1.2 Timeline of computerization

  •   Mechanical computers conceived in the nineteenth century
  •   Electronic computing developed during the Second World War (1939–45)
  •   Nuclear technology and the Space Race encourage development of massive ‘mainframe’ computers in the 1950s
  •   Computers spread beyond military applications to wider uses in universities and commercial corporations in the 1960s
  •   Less powerful ‘microcomputers’ are developed in the 1970s, with commercial build-at-home machines
  •   Mass-produced personal computers see widespread application in the workplace during the 1980s, with the coming of ‘spreadsheet’ and ‘email’ software
  •   Apple Macintosh computers (1984) corner the commercial design market with advanced graphical capabilities
  •   Home computer ownership grows amongst the middle classes from the mid-1980s
  •   The advent of the World Wide Web and ‘multimedia’ capability in the mid-1990s heralds the era of ‘new media’
  •   Domestic computer ownership and Internet connection becomes common across all social classes in developed countries during the 2000s
  •   In the 2010s, the personal handheld computer is an everyday accessory, along with a wide range of specialized digital devices

The New Media Age

By the 1990s, everything was in place for the next phase of ‘informationalization’. A gigantic apparatus of cheap offshore manufacture was being constructed (mostly in Asia), there was a commercial standard in operating software (primarily via Microsoft’s Windows platform) and there was an unprecedented boom in telecommunications (brought about by high-speed ‘fibre-optics’ and privatization). There were also boots on the ground, as basic operating skills became widespread and the personal computer became a fixture of everyday life in most developed societies. What brought all of these things together in a powerful new configuration was the development of the hypertext transfer protocol (http) by British computer scientist Tim Berners-Lee. Working at the European Organization for Nuclear Research (CERN), Berners-Lee had been concerned with improving the efficiency of information exchange in the scientific community. From the 1980s onwards, he worked extensively on the development of the ‘hypertext’ envisioned by Ted Nelson and Douglas Engelbart in the United States during the late 1960s. In a hypertext, related information in one electronic document is ‘linked’ to relevant information in other documents, allowing the reader to click through related sequences of information. What Berners-Lee envisioned in 1989 was the creation of a standardized ‘hypertext mark-up language’ (HTML) and the use of the ‘Internet’ to create a universal, global hypertext system.
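To make the linking principle concrete, the sketch below (written in Python, with invented document names and contents, offered only as an illustration rather than anything drawn from Berners-Lee’s own software) treats a hypertext as a small collection of documents, each holding some text and a list of links naming other documents that a reader can follow:

  # A minimal sketch of the hypertext idea: each document stores some text plus
  # a list of links naming other documents, and the reader 'clicks through' the
  # collection by following those links. All names and contents are invented.
  documents = {
      "memex": {"text": "Bush imagines associative trails of information.",
                "links": ["hypertext"]},
      "hypertext": {"text": "Nelson and Engelbart propose linked electronic documents.",
                    "links": ["web"]},
      "web": {"text": "Berners-Lee standardizes links via HTML and http addresses.",
              "links": ["memex"]},
  }

  def follow(name, link_index=0):
      """Follow one of a document's links and return the target document and its text."""
      target = documents[name]["links"][link_index]
      return target, documents[target]["text"]

  # 'Clicking through' a sequence of related documents:
  current = "memex"
  for _ in range(3):
      current, text = follow(current)
      print(current, "->", text)

Berners-Lee’s contribution, in these terms, was to agree a common format for writing such documents (HTML) and a common addressing scheme (http) so that any document held on any connected network could become the target of a link.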

The ‘Internet’ itself is a series of telecommunication linkages between the world’s major computer networks. It began with the ARPANET in the United States in the late 1960s (which was initially designed to allow remote and dispersed command of a nuclear war) and was subsequently extended to integrate the computer networks being developed in America’s top universities. As the operators of these networks became able to talk to each other in the early 1970s, email and live ‘chat’ were invented. The system became international with the connection to University College London in 1973, which collaborated with Stanford University and the US Department of Defense in developing the ‘transmission control protocol’ and ‘Internet protocol’ (TCP/IP) that provided a standard mechanism for computer networks to ‘talk’ to each other using the telephone system (Cerf and Kahn 1974). Between 1975 and 1984, a number of large-scale networks were developed for civilian applications (Telenet, USENET and NSFNET). What effectively limited the capacity of users to work across these linkages was the lack of a software standard (both between networks and individual machines). Tim Berners-Lee’s HTML provided this standard, along with his WorldWideWeb software for translating HTML into a visual display (the web ‘browser’) and the global address system (http:) that allowed users to easily ‘visit’ documents on any number of different networks and to ‘browse’ through information across the entire system of interlinked computer networks (the Internet). In combination with the TCP/IP telecommunications standard, this meant that anyone with a telephone connection and a ‘modem’ device could dial into the Internet system and start ‘browsing’ the web (Chandler 2005).

By choosing to put this software in the public domain, Berners-Lee also gave every user the opportunity to publish their own documents via the Internet. The impact of this was quite extraordinary, with the rapid development of a browser that could display pictures as well as text (1993) and the incorporation of a web browser into the predominant Windows operating system (1995). Within the space of five years, this experimental global hypertext system was being used by millions of people to both read and publish materials that could be accessed by computer terminals all over the world. These developments were seen as being so significant that a ‘second media age’ (Poster 1995) was proclaimed, the age of ‘New Media’ (Flew 2002; Giddings and Lister 2011). In many respects, this was not entirely true, because few of the technologies that were used in the World Wide Web were actually new. Rather, this was a new configuration of technologies. Similarly, most of the materials that people were accessing over the Internet were digital renditions of existing media of communication: text, photographs, video, telephone calls. What was entirely new was the sensory experience of using the Internet, since materials in a range of audiovisual formats became accessible in an ‘interactive’ digital state where their sequence of occurrence (via hyperlinks) and their form (the data itself) could be readily altered by the user (Negroponte 1995). Even more significant was the fact that the World Wide Web was not simply a medium of reception (for reading). It was also a system for communication between individuals, making it an entirely new form of social network (Scott and Carrington 2011). Equally astounding was the sheer scale of the World Wide Web, since the Internet gave it global reach, and the number of users grew exponentially to include a third of the world’s population, some 2 billion people, by 2011.

The internet is above all a decentralized communication system. As with the telephone network, anyone hooked up to the internet may initiate a call, send a message that he or she has composed to one or multiple recipients and receive messages in return. The internet is also decentralized at the basic level of organization since, as a network of networks, new networks can be added so long as they conform to certain communications protocols … [it is] fascinating that this unique structure should emerge from a confluence of cultural communities which appear to have little in common: the Cold War Defense Department, which sought to insure survival against nuclear attack by promoting decentralization; the counter-cultural ethos of computer programming engineers, which showed a deep distaste for all forms of censorship or active restraint, and the world of university research … Added to this is a technological stratum of digital electronics which unifies all symbolic forms in a single system of codes … If the technological structure of the internet institutes costless reproduction, instantaneous dissemination and radical decentralization, what might be its effects upon the society, the culture and the political institutions?

Mark Poster (1997) ‘Cyber Democracy – Internet and the Public Sphere’, in D. Porter (ed.) Internet Culture, London and New York: Routledge, pp. 201–17

In the social sciences, the Internet is generally seen as intrinsically different from earlier ‘mass media’ technologies primarily because of its individualized interface and its decentred structure (see Cavanagh 2007; Fuchs 2008). Since the Web is a mediated social network, it has also been seen as heralding an era of entirely new social relationships, between individuals, between groups, between humans and machines, and between citizens and states. In the commercial domain, the aesthetics of the World Wide Web have been characterized by the predominance of popular culture modelled on the earlier media of print, television and computer games. Thus, having proclaimed the era of ‘new media’, a large number of ‘old media’ concerns have made extensive use of the system for delivering entertainment, providing information and communication services and promoting home shopping. The blank page of the ‘electronic frontier’ during the 1990s also inspired a vast array of entrepreneurial schemes to exploit this public system, making the Internet economy big enough to cause a global stock-market bubble that collapsed in 2000. The incipient era of ‘new media’, then, was primarily a gold rush, but the Internet itself has proved to be an enduring system. It is indisputable that the Internet has popularized computerized entertainment, dispersed unprecedented amounts of information amongst the (global) general public and created entirely new forms of sociability. The Internet and Web therefore constitute the major architectures behind contemporary personal computing.