Cover Page

Table of Contents


Introduction – All Set for an E-journey

Chapter 1: The First Information Revolution

1.1. Information: the catalyst for the development of the human community

1.2. Writing

1.3. Counting

1.4. Sorting: Hollerith’s tabulating machines

1.5. Europe lagging behind

Chapter 2: From Electromechanics to Electronics

2.1. The NCR crucible

2.2. A company named CTR

2.3. IT: a product of World War II

2.4. IT: a complex, precious and expensive commodity

2.5. The trials and tribulations of IT in Europe

2.6. Centralization of IT power and work organization

Chapter 3: The Dawn of the Digital Era

3.1. The quest for new freedom

3.2. The colorful saga of major firsts

3.3. The internet explosion

Chapter 4: Light and Shade in the Digital World

4.1. The family tree of the digital world

4.2. The slippery slope

4.3. The engines powering progress in the digital era

Chapter 5: The Promise and Reality of New Technology

5.1. IT effectiveness called into question

5.2. The value of IT

5.3. The IT sector set up as a model

5.4. Telecommunications in the eye of the storm

5.5. Shifting boundaries and extended companies

5.6. Corporate network players

5.7. New opportunities and new competition

5.8. The new time/space framework

Chapter 6: IT Policies in Efficient Enterprises

6.1. Reduce the shortfall between promises and reality

6.2. Shedding light on IT and information systems

6.3. Information governance

6.4. Making choices

6.5. Structuring

6.6. Realization

6.7. Measurements and monitoring

6.8. To do it oneself or ask someone else to do it?

6.9. Sisyphus and security

Chapter 7: New Instructions for CIOs

7.1. Lessons of the past

7.2. The CIO's missions

Chapter 8: New Vision(s)?

8.1. Gurus and a New Economy in a perfect world

8.2. The technological outlook

8.3. Citizenship and economic development

8.4. Developments in the Third World

8.5. Security and freedom: what are the real threats?

8.6. Press, media and culture

8.7. Health and education


References and Bibliography




Reality cannot be judged or properly appreciated if there are no bearings. One of the major strengths of this book is to put IT into perspective and trace its development through time.

Jean-Pierre Corniou’s viewpoint gives rise to a first observation and a first surprise. Information Technology, which is said to be such a recent development, is in fact almost as old as the automobile. More than a hundred years of bustling history is hardly negligible! Many of us continue to blame system failures and computer bugs on youthful indiscretions. This could not be further from the truth.

IT is therefore more than a century old, and it has seeped into all the fields of activity on which each and every one of us has come to depend. And yet, for many company directors – myself included – it resembles a black box which has great difficulty in revealing its secrets.

This dependence can sometimes be difficult to bear because, unlike in other fields of activity, we do not have a set of intuitive parameters at our disposal that might enable us to appreciate the value and efficiency of the IT investments we make.

What we do know instinctively is that IT is a tool to aid productivity, and today, productivity levels – however little and however badly they are assessed – remain the primary justification for IT investment. Unfortunately, this productivity remains tied to the image of factory automation or the mechanization of accounting operations, which cut hours of work down to mere seconds of processing. This simplistic vision does not suffice, because IT is also a tool which creates its own demand. Measuring productivity is thus rendered difficult, if not impossible, most notably because of this continuous need, and because of the recurring dynamics of needs being created and satisfied. When a factory is automated, there is a “before” and an “after”, and the two relatively stable states can be compared. With IT, there is no stable final state.

IT is thus an incredibly versatile object. Things have changed somewhat with the arrival of ERPs, which are more systemic tools, but so far we have always built systems that are made-to-measure. This flexibility is further emphasized by the continuous creativity that is constantly pushing back the barriers of what is possible. The end result is that IT always generates more hunger than it can satisfy. How then can actual, permanent productivity levels be measured?

The computerization of accounting systems is a fine example. Granted, processing has gained in speed, but that is not the most important point because, beyond speed, we have come to expect details of cost prices, expenses, analyses, etc. How can these be measured? Faced with this observation, it is best to acknowledge that quantitative measures are, in some cases, insufficient, unless there is an absolute desire to create them artificially. Indeed, the fact that it is so difficult to assess productivity has led us to manage IT from an expenditure point of view, by carrying out benchmark analyses with other businesses and optimizing the use of the expenditure rather than its size.

These difficulties are also encountered in the technological choices we make. With a lack of distance and a lack of measurements, our IT choices are based on trust in the recommendations of our advisors. Perhaps other businesses have already used the product and can vouch for its effectiveness. Inevitably, there is a natural degree of uncertainty with this kind of procedure. However, this acceptable air of uncertainty is coupled with an unbearable element: agreements drawn up with the vendor through contracts which are far from perfect. From that point onwards, there is the distinct feeling that the vendor is taking you for a ride and that, at the same time, the client is acting in an irresponsible manner. If contracts have not been clearly formalized, the client can be tempted to review demands, and we are only too aware of the fact that reviewed demands bring about additional expenditure. In this area, we should be as thorough with IT as we are with the provision of other goods and services.

Choices and decisions: a manager’s major responsibility. However, anyone who purchases a black box but lacks the necessary skills is bound to fail. This is applicable to the IT and automotive sectors alike. Today, 75% of a vehicle’s cost price can be attributed to its bought-in components. If the manufacturer does not fully understand and master these various parts, efficient vehicles will not be produced. The same goes for IT. It is essential to master the architecture and the system, and to maintain a full understanding of the subject over time – which is not necessarily the case for every car manufacturer. The only rule is that production in a major corporation is a complex system; if a number of elements can be bought in from outside, that system is simultaneously rendered simpler and more efficient.

Obviously, our understanding of IT has evolved, as has its organization within a corporate environment. At one time, it was thought that the computerization of a company defined a certain kind of internal organization, and that organization- and computerization-related tasks were naturally intertwined. Optimal IT system issues even became defined as optimal organizational issues. However, today, with the aid of the perspective proposed by Jean-Pierre Corniou, it is clear that all of this has no meaning.

In effect, the division between those who use IT and everyone else no longer exists. We all use one or more computers and thus all contribute to IT on a general level. The IT manager is therefore no longer in charge of a single sector, but is responsible for the technology which acts as the liaison between all company workers.

Jean-Pierre Corniou quite justifiably uses the image of IT bilingualism. It is a case of being competent in IT whilst simultaneously understanding and being familiar with the company’s various business lines. A further difficulty is linked with the ambivalence towards the notion of in-house clients. The clients of CIOs are other company players, but these are not clients who are spending their own money. Within a company, the real money is that which comes in from outside clients. CIOs are unlike other suppliers because they also play an advisory role. Parallels can be drawn between the CIO-client relationship and that which exists between doctors and their patients: just because patients are prepared to pay their doctors a lot of money does not mean the latter can prescribe any old remedy! The resulting relationship is thus one of bilateral authority … and this can indeed be rather complicated!

The idea of a new economy did not solve anything. I have always been annoyed by the opposition between the old and new economies. The lion’s share of the mistakes made by non-IT experts – and sometimes by IT experts – could be traced back to their tendency to draw upward curves and build future scenarios without enough of a history behind them. This led to all the economic bubbles and disillusionment. With my experience gained as a company director, I am in a position to say that strategic thinking must be founded on long periods which provide more solid reference points, in order to assess the various possible scenarios. The erratic ups and downs of technological stocks in recent months have been a clear illustration of how short track records are judged. The growth rates experienced over an extremely short term had been excessively projected over too long a period. Failure was inevitable. However, there is another reason behind all this. The instigators of the so-called New Economy had forgotten the customers.

Let us take a look at what is happening in the automotive industry, and at Renault in particular. The e-vehicle on which we are currently working – featuring a mobile interface with the outside world and which is rich in information – is but in its early stages. The human/machine interface is the focal point of this e-vehicle, and the person on board is in most cases the driver, most of whose energy must be devoted to driving the car. Simple human/machine interfaces must therefore be designed, making use of the sole means available, namely speaking and hearing. Alas, we have a long way to go before achieving efficient voice control in a noisy environment. Furthermore, we do not yet have the slightest idea of what the true solvent market will represent – in other words, what the customer really needs. Whatever the case, the product must be of genuine value if it is to attract solvent demand. Once again, this book clearly portrays the distinction between the dazzling rise of mobile phones, with a simple interface produced in response to a solvent need, and WAP systems, with their non-user-friendly interface and for which a solvent demand has yet to be identified.

We are aware of the fact that a company’s efficiency rests on its ability to manage useful information. One of the issues in my eyes is the ability to provide access to data which has been sorted and prioritized. In this area, today’s systems are a very long way from perfect. We have an unprecedented wealth of information at our fingertips but do not have the means of sorting it and providing it to the person who needs it at the time they need it. In this field, I have been struck by, for instance, the fact that the web has now become an accumulated mass of information that takes longer to read than traditional paper publications.

Therefore, Jean-Pierre Corniou’s book is a fine starting point which can be read with interest and understood by all. However, what is most important is that it puts its subject matter in perspective. Different elements are put back into context allowing the surrounding scenery to emerge, and objectives and decisions to be redefined. This book does not provide answers to questions and does not say what should or should not be done. It is a reference book, and that is where its major strength and greatest usefulness lie.


Chairman and Chief Executive Officer, Renault

Introduction – All Set for an E-journey

As dawn broke over the 21st century, Information Technology officially celebrated its golden anniversary … it might be thought that the 50-year mark would go hand-in-hand with the age of reason, but nothing could be further from the truth. In the extended family of innovation, IT is still the rebellious teenager: immature, incomplete and frustrating. Frequent failures have left users irritated, while corporate board members continue to be annoyed by the sheer unpredictability of results, despite the massive investments being made. So often in the world of IT, arx tarpeia Capitoli proxima: high praise is swiftly followed by a sharp fall! The lack of foresight on the part of programmers led to record turnover being generated by the need to correct the Y2K bug and its actual or presumed effects. The overly-hyped stock market triumphs of the New Economy were followed by a nosedive which left investors in a state of shock; the very investors who had ploughed their funds into and placed their unlimited trust in high-yield ventures which would seemingly be devoid of setbacks. Nevertheless, IT remains fertile ground for dreams of greater things, and is still behind many a success story. It continues to embody a new intellectual frontier, offering a world of ubiquity and unlimited exploits.

However, what do we mean by “Information Technology”? The term may suggest semantic singularity, when in fact IT now encompasses a wide and growing field of activities, competencies and products. The traditional “hardware” and “software” pigeon-holes – the former referring to physical equipment, the latter to programs – now fall short of accurately illustrating the types of product at stake. What really matters for today’s end-users is the quality of service provided by the diverse objects on offer. As such, IT is constantly evolving, fading into the history of innovative technology only to re-emerge on a far more widespread basis, taking on new and hitherto unimaginable forms.

As the 21st century began, a new leaf was turned as the Y2K bug was laid to rest, and along with it one of the blemishes on the skin of IT’s difficult early days. IT has entered a period of vast technical and social change, in line with the new perspectives faced by the world economy. Indeed, two major shock-waves hit the world at the end of the 20th century: the fall of the Communist regime in 1989 and the bursting onto the scene of the web in 1993. Despite being the result of extremely different logical processes, the two were concomitant, and freed up convergent forces which, over a short period, have shaped a worldwide economic system in which rules seem to be laid down without any form of counterbalance. The economy, and now society as a whole, has come to be characterized by the free circulation of goods, individuals and ideas, instant and simultaneous access to unlimited information, and the bringing down of barriers between sectors, businesses and scientific disciplines. The movement, steeped in chaos and controversy, somewhere between Davos and Porto Alegre, appears to be irrepressible. But is it really? Is the model of this new economy – entirely focused as it is on short-term performance – the indisputable product of the third technological revolution?

For the world of IT, which had already undergone major transformations in the 1980s with the unharnessed development of micro-computing, the web has brought about a major upheaval, shaking in its wake players across the IT sector, whether manufacturers, software publishers, service vendors or corporate IT departments. After the financial world had built them up with fever-pitch enthusiasm, internet stocks were also the first to suffer a violent change of fortune. The IT world experiences the ups and downs of the business world with the accelerator full on; it is a stage on which all the passion and drama of human activity is played out faster than elsewhere.

From its earliest days, IT was the affair of specialists who would sort problems out amongst themselves, using their own specific language. Nowadays, IT has made headway into ordinary, everyday life, the general public and mass-distribution media. Developments are dictated as much by mass-marketing policies as by technical evolution. All areas of the social and the corporate worlds now make use of IT tools. With the diversification of equipment, micro-computers have taken on multiple and hidden forms. Children and teenagers are no longer inhibited by the machines, which they make their own with unerring ease. Much like wireless telegraphy, cinematography and aircraft, which were fairground attractions before becoming mass phenomena, IT is now commonplace. However, the process is far from complete. There is still a mythical side to IT, which periodically continues to inspire when the boundaries are pushed further back.

The sole ambition of this book is to present the keys to understanding the technological world in which we live, and which has already deeply transformed our environment. Technological innovation, which shows no sign of letting up, will have an ever-stronger influence on our lives in the coming decades. History must help us to understand how and why we have reached this stage of development. Which opportunities have we drawn benefits from, and which have we wasted? Who and what economic forces can be found at the forefront of the movement? The advent of the information society is not the result of “spontaneous generation”, but has been the fruit of many contributors over more than half a century. An historical approach – hitherto under-exploited in this field, which tends to shelter behind a modern, instantaneous image – offers a new viewpoint. By tracking the paths of those who contributed to the design of modern IT, by going back over the dreams and analyzing the failures, an in-depth view of this complex sector will be forthcoming. By going back to the roots of IT, we can attempt to comprehend what lies ahead in the years to come, with a view to minimizing the risks and maximizing the potential of future progress.

This book is as much a result of my impassioned personal experience as of analysis in the cold light of day, and will not fail to take sides and make choices, however unfair or excessive they may at times appear to be. The aim is to find a way through the thick undergrowth and shed some light on the subject. Future progress can only be achieved if a clear vision is provided. Action is enabled by a jargon- and mystery-free understanding. Investigative efforts of this kind are justified by the subject at hand. Excess and passion have surrounded the history of IT. Technical adventures often resemble fictional novels. IT entrepreneurs have always been ambitious individuals on a quest for glory and fortune. Market growth has offered them opportunities which have never existed in any other sector. No-one is indifferent to IT, which as a discipline is at the crossroads between science and technique, rationality and humanity, culture and business. Feared and ignored by previous generations, but entirely taken on board by today’s children and teenagers who choose not to burden themselves with seeking a global vision, IT is now very much of our time: both the product and driving force of our era. In all fields, the 21st century will be marked by progress made in the deployment of information and knowledge processing techniques.

At what cost? In what conditions? With whom? Who will reap the benefits? Will IT cease to be studied and questioned? Or will transformations continue, bringing about new concerns as well as new opportunities? We will attempt to understand yesterday’s challenges in order to picture those we will face tomorrow, by providing a lucid answer to the following question: will IT die or be transfigured?

Chapter 1

The First Information Revolution

1.1. Information: the catalyst for the development of the human community

To be able to comprehend the social and economic impact of a nascent industry destined to gradually turn the world upside down, it is essential to go back to the sources of the history of IT: beyond the simple yet fascinating story of inventors and the objects they invented. However, this does not imply doing the work of an historian, but simply involves putting the expansion of this technical universe into perspective. Particular political and social contexts – and major shifts in history such as World War II – have always precipitated innovative processes, invariably led by a handful of ambitious and creative individuals, seeking, whatever the price, to automate one of humankind’s identity-forming activities: intellectual creation. Granted, what they created is essential. Without their efforts and their failures, we would not have the machines of today, which have become such a familiar part of our environment. Our attention will be focused on the uses that others then made of their discoveries. How did the use of calculating machines and intellectual production tools mould evolutions across society? A machine equates to nothing without the relevant education, training and work processes which render it operable. This socio-technical system will be considered, and, whenever necessary, ethical judgments as to the choices made will not be evaded.

Information is at the heart of all human and economic activity. It conditions all forms of exchange. From Sumer through Gutenberg, the cultivation of the rich layer of soil giving rise to the development of organizational progress and social construction has slowly resulted from the mastering of the signs and tools used to produce, disseminate and store information. The milestones along the road are familiar to all: writing appeared around 3200 B.C., numbers and the first alphabet in Phoenicia (1100 B.C.), and then printing in 1450. Communication was made easier with each step forward. There was an upsurge in trade and commerce, as man learned to count and keep track of transactions. Knowledge continued to develop, as man began formalizing thoughts which had been enhanced by the ever-increasing number of viewpoints now available. There is a dialectical relationship between progress made in the field of material media and the ever-richer content being carried. The Phoenicians created their alphabet because trade could only be developed if faster information processing was available, hieroglyphs being deemed too slow and too little understood. Ease of distribution of the Bible multiplied with the invention of the printing press, far surpassing the low productivity of scriptoria in monasteries, and the development of literacy drove the diffusion of the very culture which would feed the Renaissance. If this determinism seems overly basic, it might nevertheless be acknowledged that innovation succeeds when it meets a real latent need, even if the latter has yet to be – or cannot be – expressed.

Let us travel through time to the second half of the 19th century. During this period, the tools destined to shape the tertiary world and those about to lay the foundations of the information society appeared simultaneously. The early ambition of the handful of men behind this movement was straightforward: to make the execution of basic functions learnt by children from the moment they attend school – writing, counting, sorting, classifying – faster, more reliable, and, if possible, automatic. The same process of innovation was then repeated for each product. Inventors would freely venture down every path opened up before them by the techniques and materials of their particular era, hand-crafting “their” machines, whether for calculating, typing or data processing, etc. With the aim of reproducing texts and figures, the rotary press was invented, as was … carbon paper! Inventors would then seek to industrialize their prototypes with the hope of striking gold. The necessary resources – funds and a pioneering spirit – were forthcoming, because the inventors in question were part of a young, up-and-coming nation that was eager to grow and craved discoveries: the United States of America. Indeed, even as early as 1890, this wave of research was being nurtured Stateside. The new machines were seized upon by establishments in such diverse sectors as banking, insurance, railways, trade and even the public authorities, leading to corporate activities being reviewed and new services being developed. It was not until the 1920s that such tools started to develop in Europe, as in the meantime clerks and managers alike had taken a great deal of convincing. The resistance of the middle ranks, as well as cultural misgivings, were already holding back the deployment of technology aimed at boosting performance and productivity.

The sources of the IT industry can thus be found at the end of the 19th century, at the time of the emergence, over a short period, of the first machines aimed at increasing the ability of individuals to conceive and diffuse thoughts.

1.2. Writing

Typewriters both spearheaded and symbolized the new office world. They made a major mark on the 20th century, shaping the social organization model and its hierarchical layers, archetypes and images. They also enabled the large-scale arrival of women in the workplace. More than a century down the line, our modern computers still incorporate the fundamental characteristic of typewriters: the keyboard, which conveys the bygone image of the typist, with the now pointlessly complex QWERTY or AZERTY arrangements of keys, which modern IT has yet to shake off.

One name is synonymous with typewriters: Remington. Its inventor, the Milwaukee publisher Christopher Sholes, was not the first to dream up a machine which would do away with the tedious manual chores associated with elaborating and reproducing documents, but he was the first to break the 25 words per minute barrier achieved by the best clerks. The chosen sequence of letters prevented the type-bars from getting stuck against each other in the machine, and writing became a swifter process. The next step was to find a manufacturer capable of overcoming the technical complexity of the machine within acceptable economic conditions, and after some unsuccessful attempts, light weapons manufacturer Philo Remington was responsible for putting the first thousand typewriters on the market in 1873. Sales were slow to pick up, and it took five years to shift the first batch of machines. In 1878, a second model was released which included a major innovation for its time, one we now take for granted: the possibility of shifting between upper-case and lower-case characters. By 1890, annual sales had hit the 20,000 mark, and a distribution and maintenance network taking in major cities was set up, as well as a beginners’ and in-service training system, essential if the market penetration of Remingtons was to be ensured, given the fact that bona fide qualifications were needed in order to master the equipment. In 1900, the United States boasted 112,000 qualified typists, 76% of whom were female.

The fundamental activity of the office world is to produce documents. The essential complementary functions are the ability to classify, sort, retrieve and archive the said documents. While these secondary activities have never aroused as much passion as typewriters, many inventions have nevertheless greatly helped the rise of office work. In the early 1890s, the first vertical filing folders were introduced. However, with the escalating number of files, a means of rapidly identifying the desired documents became indispensable. Hence, the many inventions that emerged, including bank clerk James Rand’s handy system of colored labels and strips. Spurred on by the fast-expanding economic climate, he soon set up his own company, Rand Ledger Co., which grew rapidly, becoming a multinational force in 1908. His son, James Rand Jr., invented a card-based information storage system, going under the name of Kardex. It was to be widely used for the best part of a century. Rand Kardex Co. went on to become the undisputed leader in document classification and archiving and – thanks to a daring mergers and acquisitions strategy – to master all office-work related techniques. Come 1925, Rand had 219 branches in the United States, and a further 115 offices overseas. After that fine start, the United States witnessed a whole host of considerable “information technology” developments in the 1890s, across a market which was hungry for innovations to feed its growth. American society was devoid of the inhibitions affecting European society around that time, and the US enthusiastically embraced the technical innovations which rapidly transformed the tertiary world.

1.3. Counting

The history of calculating machines stretches further back in time than that of the typewriter. It officially began with Blaise Pascal in 1642, even though, long before then, Leonardo da Vinci had outlined the basic principle, and in 1623 the German Wilhelm Schickard had built two prototypes. The “Arithmometer”, the first mass-produced calculator – manufactured at a rate of one or two per month – appeared in France through the initiative of Thomas de Colmar. The second generation of machines can be attributed to the Swede, Willgodt T. Odhner, who perfected a “pin-wheel” system. This particular innovation was later taken up by many other manufacturers. However, with the exception of sectors such as insurance, investing in calculating machines could not be justified, given that accountants were well-trained in the art of calculating by hand, without making mistakes and far more quickly than with a manually operated machine that was tedious to use. The market would only be convinced if data could be input quickly and, above all, if the machine could provide written results.

Two men, Dorr E. Felt and William S. Burroughs, tackled these critical issues. The 1885 invention of the then 22-year-old Felt was a keyboard, which ensured that the dials of the Arithmometer or its rivals were no longer subjected to the hazardous handling of a stylus. A seasoned user could enter a 10-digit number in a single operation using the keyboard, made up of nine columns of eight figures. In 1887, Felt joined forces with a local Chicago-based manufacturer, leading to the founding of Felt & Tarrant Manufacturing Co., and the production of 1,000 “Comptometers” in 1900. This invention met with considerable success from 1915 onwards, and millions were manufactured by Felt & Tarrant, who also developed a youth training scheme aimed at increasing the market penetration of their product. The same machine underwent constant improvement, and continued to be manufactured until the end of the 1950s. F&T disappeared upon their 1961 merger with Victor Adding Machine Co., having failed to come to terms with the onslaught of computerization.

Around the same time, in 1886, the American Arithmometer Co. was founded, with a view to manufacturing and selling the first-ever calculating machine to feature a printing function, 28-year-old William Seward Burroughs’ 1885 invention, the “Adder-lister”. The company was re-named Burroughs Adding Machine Co. after the premature death of its founder in 1905, before becoming Burroughs Corporation in 1953 and going on to become a major name in the history of IT. By the end of the 19th century, Burroughs was shifting 8,000 units per year, thanks to an effective sales organization, and in the years leading up to World War I, production hit 13,000 per annum. By then, the various models on offer were being specifically designed for the sectors in which they were to be used. In 1906, the car manufacturer Ford produced a model which featured a rack large enough to carry an adding machine: the “Burroughs Special”. By 1920, 800,000 Burroughs machines had been sold worldwide, and the one million mark was reached in 1926. These figures show the extent to which innovative products were already being met with growth rates that would send sales figures soaring. Burroughs also made headway abroad, and as early as 1925 were present in 60 countries, including Canada in 1917, Brazil, Argentina and Mexico in 1924, Belgium in 1925, followed by Germany in 1926. Burroughs extended operations by taking over a number of competitors and integrating their techniques. 1911 saw the introduction of the subtracting machine. The 1921 takeover of Moon-Hopkins brought about the launch of the first machine to combine an electric typewriter with a calculating machine, which could genuinely be deemed to be the ancestor of subsequent office calculators. Finally, a portable calculator, weighing less than 18 kilograms (40 pounds), was put on the market in 1925, and was an instant success.

In the US, Felt & Tarrant and Burroughs were joined in the race by dozens of other companies, entering this constantly developing market which was sustained at least as much by the sheer complexity of the fiscal system as by economic growth.

Date Event
1898 Radio invented by Marconi.
1900 Paper clip patented.
1901 First transatlantic telegraphic radio transmission. First numerical keyboard for punching cards.
1902 Arthur Pitney receives a U.S. patent on the world's first postage meter approved for use by the U.S. Postal Service in 1920.
First electric typewriter to be sold worldwide: the Blickensderfer Electric.
1903 Clipper Manufacturing Company founded, the first company to manufacture paper-fastening devices. L. C. Smith and Brothers Typewriter Company formed (became Smith Corona Company in 1926).
1904 Three-ring binder patented by Irving Pitt Manufacturing Company.
1905 Star Furniture Company founded in Zeeland, Michigan (renamed Herman Miller in 1923).
1906 Stenotype machine invented by Ward Stone Ireland.
The Haloid Company founded to manufacture and sell photographic paper (name changed to Xerox Corporation in 1961).
Vacuum valve invented by De Forest.
1907 Telephotography inaugurated when Arthur Korn telegraphs a photograph from Munich to Berlin, Germany.
1908 Olivetti founded in Italy by Camillo Olivetti.
1909 Bakelite, the first totally synthetic plastic, patented.
1910 Lefax loose-leaf personal organizer invented by J. C. Parker.
1911 Computing Tabulating Record Co. founded (became International Business Machines – IBM – in 1924).
1912 Corona makes a portable manual typewriter.
1913 Edouard Belin invents the Belinograph, a portable facsimile machine capable of using ordinary telephone lines.
1915 First North American transcontinental telephone call between Thomas A. Watson in San Francisco and Alexander Graham Bell in New York City.
1947 Bardeen, Brattain and Shockley invent the transistor at Bell Laboratories.

Major milestones in the first information revolution1

1.4. Sorting: Hollerith’s tabulating machines

The first research aimed at developing machines to automate calculations took place in fields of activity that relied on specific calculation tables and formulae, such as shipping and navigation, astronomy and architecture. The tables they used were produced by networks of individuals and were not error-proof, despite systematic double-checking procedures. Indeed, it is worth noting that the word “computer” originally referred to the person whose specialty it was to compute! It was not until the 1960s that the term took on its current meaning. One of the major projects of the early 19th century highlighted the need for better means of calculation: Napoleon’s decision to overhaul the land registration system in France. He called on Gaspard de Prony to handle the assignment, which involved converting the measurements made under the monarchy into decimal units. To achieve this, de Prony set up what amounted to a veritable calculation factory, based on the division of the tasks at hand.

British mathematician Charles Babbage found the experiment particularly appealing, and it inspired him to conceive the “Difference Engine”, aimed at reducing the inaccuracy of mathematical tables and enabling them to be printed directly. Unfortunately, the machine designed by this whimsical authoritarian was somewhat in advance of what 1820s technology would permit. Babbage never achieved his goals, and succeeded only in squandering a considerable amount of money … although in 1991 engineers at London’s Science Museum did manage to build a working machine from one of his plans. Another Englishman, George Boole, a contemporary of Babbage who once crossed paths with him in 1861, was to play a major role in the history of IT. Boole’s “The Mathematical Analysis of Logic” formed the theoretical basis of binary language.
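The principle behind the Difference Engine can be sketched in a few lines: a polynomial of degree n has a constant n-th finite difference, so once the initial differences are computed, every further table entry requires only additions, exactly the operation a mechanical engine could perform reliably. A minimal sketch in Python (the polynomial below is an arbitrary illustration, not one of Babbage’s actual tables):

```python
# The Difference Engine's principle: a degree-n polynomial has a constant
# n-th finite difference, so once the machine is seeded with the initial
# differences, every further table entry needs only repeated addition.
# (Illustrative sketch; the example polynomial is not one of Babbage's tables.)

def seed_registers(f, x0, degree):
    """Compute the initial column of differences f, Δf, Δ²f, ... at x0."""
    values = [f(x0 + i) for i in range(degree + 1)]
    registers = []
    while values:
        registers.append(values[0])
        values = [b - a for a, b in zip(values, values[1:])]
    return registers

def crank(registers, count):
    """Each turn of the crank adds every register into the one above it,
    emitting the next table value using additions only."""
    table = []
    regs = list(registers)
    for _ in range(count):
        table.append(regs[0])
        for i in range(len(regs) - 1):
            regs[i] += regs[i + 1]
    return table

f = lambda x: x**2 + x + 41                # degree-2 example polynomial
print(crank(seed_registers(f, 0, 2), 6))   # [41, 43, 47, 53, 61, 71]
```

The bottom register holds the constant difference, and each turn ripples the additions upward to produce the next printed value; no multiplication mechanism is needed at all.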
The quest for techniques enabling tools and automatons to be programmed can be traced back to Antiquity and the camshafts used in hydraulic organs by engineers in Alexandria. Later, the technique was perfected in automatons designed for the purposes of entertainment, and, later still, French silk-weaver Joseph-Marie Jacquard used punched cards to control the warp and weft threads on a silk loom.

The first large-scale use of mechanical tools to process data occurred in the United States during the 1890 census. Demographic growth in the US had taken off to such a degree that the census could no longer be carried out using manual means of calculation: between 1840 and 1860 alone, the population had risen from 17.1 million to 31.4 million. The mechanical punch-card machine which enabled the census forms to be processed was the invention of one Herman Hollerith, an engineer born in 1860 to German immigrant parents, who had been heavily influenced by Jacquard’s system. In August 1890, after a mere six weeks, the Census Bureau proudly announced that the American population totaled 62,622,250. This feat was a source of much pride for the young American nation, now ranking among the world’s leading powers. More importantly, the invention of the punch card and its large-scale use to produce concrete results, rather than simply to assess laboratory experiments, paved the way for today’s automatic data-processing tools. In terms of productivity and measurable gain, the Census Bureau saved $5m by making use of the Hollerith machine. The worlds of banking, insurance and industry then took up Hollerith’s offer to utilize his machines, which would otherwise have remained unused until the next census.
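The tabulating principle itself is simple enough to sketch in code: each card carries punched positions encoding field values, and the machine advances a counter for every hole it senses. A minimal sketch in Python, with illustrative fields rather than the actual 1890 census card layout:

```python
# Tabulating punched cards, in the spirit of Hollerith's 1890 machine:
# each card carries punched positions encoding field values, and a counter
# advances for every hole sensed. (The fields below are illustrative only,
# not the actual 1890 census card layout.)
from collections import Counter

def tabulate(cards):
    """Advance one counter per punched (field, value) position on each card."""
    counters = Counter()
    for card in cards:
        for field, value in card.items():
            counters[(field, value)] += 1
    return counters

cards = [
    {"sex": "F", "state": "NY"},
    {"sex": "M", "state": "NY"},
    {"sex": "F", "state": "IL"},
]
totals = tabulate(cards)
print(totals[("state", "NY")], totals[("sex", "F")])  # 2 2
```

The dictionary of counters plays the role of the machine’s banks of dials; Hollerith’s companion sorter extended the same idea by routing each card into a pocket according to a chosen hole, so the deck could be re-counted along different dimensions.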

Not only did Americans take to the tools which made everyday, repetitive chores easier, but they also produced a whole new breed of company worker who would mark the corporate world in the years to come: “systematizers”, whose mission it was to restructure office tasks, following the example of the work carried out by Frederick Taylor in the industrial world. Systematizers introduced typewriters, calculators and filing systems, thus increasing levels of productivity in banks, insurance companies and corporate head offices. They were the forerunners of today’s organizers and consultants.

1.5. Europe lagging behind …

In Europe, the sheer weight of cultural habits and social structures, together with slow-moving state administrations, meant that office techniques failed to evolve at the same pace as in the US. Moreover, American companies were rapidly setting up subsidiaries in Europe, leaving little or no room for European innovators. Hollerith’s company, Tabulating Machine Co. (TMC), expanded throughout the continent. The British Tabulating Machine Co. was founded in the UK in 1907. The German branch, Deutsche Hollerith Maschinen Gesellschaft (Dehomag), followed in 1910, and, in the name of national integration, became controversially involved with the Nazi regime2. On 15th July 1914, TMC created “International Time Recording”, its French subsidiary. Prior to that, 1913 had seen the arrival on European soil of Powers, TMC’s major competitor. It was in Germany that the typewriter really began to develop, with the 1904 launch of AEG’s legendary Mignon typewriter. The work carried out by the Italian Camillo Olivetti also deserves recognition. His M1 machine was put on the market in 1911, and was an innovative step forward both in terms of its design and the mass-production processes used in its manufacture. The M40, launched in 1930, was a direct result of Olivetti’s concern for technical perfection, and already traced the shape of modern machines. Swedish calculator manufacturer Facit should also be saluted: founded in 1918, the company went on to dominate its market until the end of the 1950s.

World War I left Europe listless, while the US basked in triumph. The war gave women the opportunity to obtain work in factories and offices, but the Taylorization of the tertiary world limited them to menial duties. “We will recruit female operators aged between 15 and 20, and basic primary-level education will suffice. They will be drafted in from outside the company. Existing staff, however, will be employed either for codification or graphical purposes, in other words before and after but never during machine operations.”3 The quaint stereotypical image of the cultured, piano-playing lady was long gone! The office world was split down the middle between the everyday members of the typing pool, whose mission it was to increase the rate at which documents were produced without trying to understand what it was they were typing, and shorthand typists, the elite among secretaries. The office world thus left a durable mark on the perceived roles of workers, and contributed substantially to stifling the potential transformations brought about by technical innovation.

While the mechanization of office tasks struggled to progress in Europe, the thinking that surrounded the organization of tertiary work met with relative success, albeit limited to some circles. In France, Henri Fayol began reflecting upon methods for corporate management, and declared before an audience of fellow company directors that “we must strive to discover the laws that will make the organization and workings of management machines as flawless as possible”. Fayol, an experimentation and measurement enthusiast who was intent on pinpointing the rules for command and organizational principles in companies, published “General and Industrial Management” in 1916, and three years later founded the Centre d'Études Administratives (Center for Management Studies), which remained operational until 1925. This center was entirely given over to experimental research into corporate organization, and ensured that Fayol’s thinking exerted its influence over the up-and-coming generation of 1920s company directors.

By the end of World War I, the various players had taken up their positions, and the balance of power was not to shift throughout the 1920s: the United States dominating the field of data-processing, gaining a foothold all over the world, and holding a position of leadership that nobody could contest.

1 Sources: 'From Carbons to Computers', Smithsonian Institution.

2 Edwin Black, IBM and the Holocaust, Time Warner Paperbacks, 2001.

3 Revue du Bureau, No. 294, dated August 1935.