
Publisher’s Acknowledgements

VP Consumer and Technology Publishing Director:
Michelle Leet

Professional Technology & Strategy Director:
Barry Pruett

Marketing Manager:
Lorna Mein

Acquisitions Editor:
Jody Lefevere

Project Editor:
Charlotte Kughen

Copy Editor:
Grace Fairley

Technical Editor:
Omer Kilic

Editorial Manager:
Mary Beth Wakefield

Editorial Assistant:
Matthew Lowe

About the Authors

EBEN UPTON is a founder of the Raspberry Pi Foundation, serves as the CEO of Raspberry Pi (Trading) Ltd, its trading arm, and is the co-author, with Gareth Halfacree, of the Raspberry Pi User Guide. In an earlier life, Eben founded two successful mobile games and middleware companies (Ideaworks3D and Podfun), held the post of Director of Studies for Computer Science at St John’s College, Cambridge, and wrote the Oxford Rhyming Dictionary with his father, Professor Clive Upton. He holds a BA in Physics and Engineering, a PhD in Computer Science and an Executive MBA from the University of Cambridge.

JEFF DUNTEMANN has been professionally published in both technical nonfiction and science fiction since 1974. He worked as a programmer for Xerox Corporation and as a technical editor for Ziff-Davis Publishing and Borland International. He launched and edited two print magazines for programmers and has 20 technical books to his credit, including the best-selling Assembly Language Step By Step. He wrote the “Structured Programming” column in Dr. Dobb’s Journal for four years and has published dozens of technical articles in many magazines. With fellow writer Keith Weiskamp, Jeff launched The Coriolis Group in 1989, which went on to become Arizona’s largest book publisher by 1998. He has a longstanding interest in “strong” artificial intelligence, and most of his fiction (including his two novels, The Cunning Blood and Ten Gentle Opportunities) explores the consequences of strong AI. His other interests include electronics and amateur radio (callsign K7JPD), telescopes and kites. Jeff lives in Phoenix, Arizona, with Carol, his wife of 40 years, and four bichon frise dogs.

RALPH ROBERTS is a decorated Vietnam Veteran who worked with NASA during the Apollo moon-landing program and has been writing about computers and software continuously since his first sale to Creative Computing magazine in 1979. Roberts has written more than 100 books for national publishers and thousands of articles and short stories. In all, he’s sold more than 20 million words professionally. His best sellers include the first U.S. book on computer viruses (which resulted in several appearances on national TV) and Classic Cooking with Coca-Cola®, a cookbook that has been in print for the past 21 years and has sold 500,000 copies.

TIM MAMTORA works as a master engineer in IC design for Broadcom Limited and is currently the technical lead for the internal GPU hardware team. He has worked in mobile computer graphics for nearly seven years and previously held roles developing internal IP for analogue TV and custom DSP hardware. Tim holds a Master’s in Engineering from the University of Cambridge, and he spent his third year at the Massachusetts Institute of Technology, which sparked his interest in digital hardware design. He is passionate about promoting engineering and has dedicated time to supervising undergraduates at the University of Cambridge and giving talks about opportunities in engineering to his old school. Outside of work he enjoys a variety of sports, photography and seeing the world.

BEN EVERARD is a writer and podcaster who spends his days tinkering with Linux and playing with robots. This is his second book; he also wrote Learning Python with Raspberry Pi (Wiley, 2014). You can find him on Twitter at @ben_everard.

About the Technical Editor

OMER KILIC is an embedded systems engineer who enjoys working with small connected computers of all shapes and sizes. He works at the various intersections of hardware and software engineering practices, product development and manufacturing.

In memory of Alan Drew, without whom I would have stopped before I got started.
—Eben Upton

To the eternal memory of Steve Ostruszka 1917-1990, who gave me his daughter’s hand and honored me with his friendship.
—Jeff Duntemann

Learning Computer Architecture with Raspberry Pi®

Introduction

WHEN I WAS 10 years old, one of my teachers sat me down in front of a computer at school. Now, this isn’t what you think. I wasn’t about to be inducted into the mysteries of computer programming, even though it was a BBC Micro (the most programmable and arguably the most architecturally sophisticated of the British 8-bit microcomputers, on which I would subsequently cut my teeth in BASIC and assembly language). Instead, I was faced with a half-hour barrage of multiple-choice questions about my academic interests, hobbies and ambitions, after which the miraculous machine spat out a diagnosis of my ideal future career: microelectronic chip designer.

This was a bit of a puzzler, not least because what I really wanted to be was a computer game programmer (okay, okay, astronaut) and there was nobody in my immediate environment who had any idea what a 10-year-old should do to set him on the path to the sunlit uplands of microelectronic chip design. Over the next few years, I studied a lot of maths and science at school, learned to program (games) at home, first on the BBC Micro and then the Commodore Amiga, and made repeated, not particularly successful, forays into electronics. As it turned out, and more by luck than judgment, I’d happened on a plausible road to my destination, but it wasn’t until I arrived at Cambridge at the age of 18 that I started to figure out where the gaps were in my understanding.

Cambridge

Cambridge occupies a special place in the history of computer science, and particularly in the history of practical or applied computing. In the late 1930s, the young Cambridge academic Alan Turing demonstrated that the halting problem (the question “Will this computer program ever terminate, or halt?”) was not computable; in essence, you can’t write a computer program that will analyse another arbitrary computer program and determine if it will halt. At the same time, working independently, Alonzo Church proved an equivalent result, and the correspondence between their two approaches underpins what we now call the Church–Turing thesis. But it is telling that while Church took a purely mathematical approach to his proof, based on recursive functions, Turing’s proof cast computation in terms of sequential operations performed by what we now know as Turing machines: simple gadgets that walk up and down an infinite tape, reading symbols, changing their internal state and direction of travel in response, and writing new symbols. While most such machines are specialised to a single purpose, Turing introduced the concept of the universal machine, which could be configured via commands written on the tape to emulate the action of any other special-purpose machine. This was the first appearance of a now commonplace idea: the general-purpose programmable computer.
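To make the idea concrete, here is a minimal Turing machine simulator sketched in Python (the rule-table format and the example machine are illustrative inventions, not anything drawn from Turing’s paper). The example machine adds one to a binary number written on the tape:

```python
# A minimal Turing machine simulator (an illustrative sketch).
# The rule table maps (state, symbol) -> (symbol to write, move, next state).

def run_turing_machine(tape, rules, state="start", pos=0, halt="halt"):
    cells = dict(enumerate(tape))         # sparse tape: position -> symbol
    while state != halt:
        symbol = cells.get(pos, "_")      # "_" stands for a blank cell
        write, move, state = rules[(state, symbol)]
        cells[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# A machine that adds 1 to a binary number: walk right to the end of the
# number, then propagate the carry back towards the left.
increment = {
    ("start", "0"): ("0", "R", "start"),
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("_", "L", "carry"),
    ("carry", "1"): ("0", "L", "carry"),  # 1 plus carry: write 0, keep carrying
    ("carry", "0"): ("1", "L", "done"),   # 0 plus carry: write 1, stop carrying
    ("carry", "_"): ("1", "L", "done"),   # carry past the left end of the number
    ("done",  "0"): ("0", "L", "done"),
    ("done",  "1"): ("1", "L", "done"),
    ("done",  "_"): ("_", "R", "halt"),
}

print(run_turing_machine("1011", increment))  # prints 1100 (11 + 1 = 12)
```

A universal machine is then nothing more than a rule table clever enough to read another machine’s rule table from the tape and act it out, which is exactly what an interpreter does with a program today.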

After the outbreak of the Second World War, Turing would go on to play a central role in the Allied code-breaking effort at Bletchley Park, where he was involved (as a member of a team—don’t believe everything you see at the movies) in the development of a number of pieces of special-purpose hardware, including the electromechanical bombe, which automated the process of breaking the German Enigma cipher. None of these machines used the specific “finite state automaton plus infinite tape” architecture of Turing’s original thought experiment; this turned out to be better suited to mathematical analysis than to actual implementation. And not even the purely electronic Colossus—which did to the formidably sophisticated Lorenz stream cipher what the bombe had done to Enigma—crossed the line into general-purpose programmability. Nonetheless, the experience of developing large-scale electronic systems for code-breaking, radar and gunnery, and of implementing digital logic circuits using thermionic valves, would prove transformative for a generation of academic engineers as they returned to civilian life.

One group of these engineers, under Maurice Wilkes at the University of Cambridge’s Mathematical Laboratory, set about building what would become the Electronic Delay Storage Automatic Calculator, or EDSAC. When it first became operational in 1949, it boasted a 500kHz clock speed and 32 mercury delay lines, housed in two temperature-controlled water baths, providing a total of 2 kilobytes of volatile storage. Programs and data could be read from, and written to, paper tape. Many institutions in the U.S. and UK can advance narrow claims to having produced the first general-purpose digital computer, for a particular value of “first”. EDSAC’s strongest claim is that it was the first computer to see widespread use outside the team that developed it: academics in other disciplines could request time on the machine to run their own programs, introducing the concept of computing as a service. EDSAC was followed by EDSAC II, and then Titan. It was only in the mid-1960s that the University stopped building its own computers from scratch and started buying them from commercial vendors. This practical emphasis is even reflected in the current name of the computer department: Cambridge doesn’t have a computer science faculty; it has a computer laboratory, the direct descendant of Wilkes’ original mathematical laboratory.

This focus on the practical elements of computer engineering has made Cambridge fertile ground for high-technology startups, many of them spun out of the computer laboratory, the engineering department or the various maths and science faculties (even our mathematicians know how to hack), and has made it a magnet for multinational firms seeking engineering talent. Variously referred to as the Cambridge Cluster, the Cambridge Phenomenon or just Silicon Fen, the network of firms that has grown up around the University represents one of the few bona fide technology clusters outside of Silicon Valley. The BBC Microcomputer that told me I should become a chip designer was a Cambridge product, as was its perennial rival, the Sinclair Spectrum. Your cell phone (and your Raspberry Pi) contains several processors designed by the Cambridge-based chip firm ARM. Nearly seventy years after EDSAC, Cambridge remains the home of high technology in the UK.

Cut to the Chase

One of the biggest missing pieces from my haphazard computing education was an idea of how, underneath it all, my computer worked. While I’d graduated downwards from BASIC to assembly language, I’d become “stuck” at that level of abstraction. I could poke my Amiga’s hardware registers to move sprites around the screen, but I had no idea how I might go about building a computer of my own. It took another decade, a couple of degrees and a move out of academia to work for Broadcom (a U.S. semiconductor company that came to Cambridge for the startups and stayed for the engineering talent) for me to get to the point where I woke up one morning with “microelectronic chip designer” (in fact the fancier equivalent, “ASIC architect”) on my business card. During this time, I’ve had the privilege of working with, and learning from, a number of vastly more accomplished practitioners in the field, including Sophie Wilson, architect (with Steve Furber) of the BBC Micro and the original ARM processor, and Tim Mamtora of Broadcom’s 3D graphics hardware engineering team, who has graciously provided the chapter on graphics processing units (GPUs) for this book.

To a great degree, my goal in writing this book was to produce the “how it works” title that I wish I’d had when I was 18. We’ve attempted to cover each major component of a modern computing system, from the CPU to volatile random-access storage, persistent storage, networking and interfacing, at a level that should be accessible to an interested secondary school student or first-year undergraduate. Alongside a discussion of the current state of the art, we’ve attempted to provide a little historical context; it’s remarkable that most of the topics covered (though not, obviously, the fine technical details) would have been of relevance to Wilkes’ EDSAC engineering team in 1949. You should reach the end with at least a little understanding of the principles that underpin the operation of your computer. I firmly believe that you will find this understanding valuable even if you’re destined for a career as a software engineer and never plan to design a computer of your own. If you don’t know what a cache is, you’ll be surprised that your program’s performance drops off a cliff when your working set ends up larger than your cache, or when you align your buffers so that they exhaust the cache’s associativity. If you don’t know a little about how Ethernet works, you’ll struggle to build a performant network for your datacentre.
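A quick experiment makes that cache cliff visible. The following sketch (in Python; the sizes and timings are machine-dependent, and Python’s interpreter overhead blunts the effect that the same loop written in C would show starkly) times memory accesses as the working set grows, touching one byte per 64-byte cache line:

```python
# Rough illustration of cache behaviour: time per memory access as the
# working set grows past the cache. Figures vary from machine to machine.
import array
import time

def ns_per_access(buf_size, touches=2**21):
    buf = array.array("B", bytes(buf_size))  # zero-filled working set
    step = 64                                # one access per 64-byte cache line
    start = time.perf_counter()
    checksum = pos = 0
    for _ in range(touches):
        checksum += buf[pos]
        pos += step
        if pos >= buf_size:
            pos = 0                          # wrap and sweep the buffer again
    return (time.perf_counter() - start) / touches * 1e9

for size in (2**14, 2**20, 2**26):           # 16KB, 1MB, 64MB working sets
    print(f"{size >> 10:>6} KB working set: {ns_per_access(size):.1f} ns/access")
```

On a typical machine the smallest working set stays resident in the innermost cache, the middle one spills into the outer cache levels, and the largest forces a trip to main memory on a large fraction of accesses.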

It’s worth dwelling for a moment on what this book isn’t, and what it won’t tell you. It isn’t a comprehensive technical reference for any of the topics covered. You could write (and people have written) whole volumes on the design of caches, CPU pipelines, compilers and network stacks. Instead, we try to provide a primer for each topic, and some suggestions for further study. This book is concerned primarily with the architecture of conventional general-purpose computers (in essence, PCs). There is limited coverage of topics like digital signal processing (DSP) and field-programmable gate arrays (FPGAs), which are primarily of interest in special-purpose, application-specific domains. Finally, there is little coverage of the quantitative decision-making process that is the heart of good computer architecture: how do you trade off the size of your cache against access time, or decide whether to allow one subsystem coherent access to a cache that forms part of another component? We can’t teach you to think like an architect. For the advanced reader, Hennessy and Patterson’s Computer Architecture: A Quantitative Approach remains an indispensable reference on this front.

The Knee in the Curve

With that last disclaimer in mind, I’d like to share a couple of guiding principles that I have found useful over the years.

In computer architecture, as in many things, there is a law of diminishing returns. There are, of course, hard limits to what can be accomplished at any given moment, whether in terms of raw CPU performance, CPU performance normalised to power consumption, storage density, transistor size, or network bandwidth over a medium. But it is often the case that well before we reach these theoretical limits we encounter diminishing returns to the application of engineering effort: each incremental improvement is increasingly hard won and exacts a growing toll in terms of cost and, critically, schedule. If you were to graph development effort, system complexity (and thus vulnerability to bugs) or cash spent against performance, the curve would bend sharply upward at some point. To the left of this “knee”, performance would respond in a predictable (even linear!) fashion to increasing expenditure of effort; to the right, performance would increase only slowly with added effort, asymptotically approaching the “wall” imposed by fundamental technical limitations.

Sometimes there is no substitute for performance. The Apollo lunar project, for example, was an amazing example of engineering that was so far to the right of the “knee” (powered by the expenditure of several percent of the GDP of the world’s largest economy) that it fundamentally misled onlookers about the maturity of aerospace technology. It is only now—after 50 years of incremental advances in rocketry, avionics and material science—that the knee has moved far enough to permit access to space, and maybe even a return to the Moon, at reasonable cost. Nonetheless, I have observed that teams that have the humility to accurately locate the knee, bringing simple, conservatively engineered systems to market in a timely fashion and then iterating rapidly, tend to win over moon-shot engineering.

Conservatism and iteration are at the heart of my own approach to architecture. The three generations of Raspberry Pi chips that we’ve produced to date use exactly the same system infrastructure, memory controller and multimedia subsystem, with changes confined to the ARM core complex, a small number of critical bug fixes and an increase in clock speed. There is a tension here: engineers (myself included) are enthusiasts and want to push the boundaries. The job of a good architect is to accurately assign a cost to the risks associated with radical change, and to weigh this against the purported benefits.

Forward the Foundation

We founded the Raspberry Pi Foundation in 2008, initially with the simple aim of addressing a collapse in the number of students applying to study Computer Science at Cambridge. We’re seeing encouraging signs of recovery, both at Cambridge and elsewhere, and applicant numbers are now higher than they were at the height of the dotcom boom in the late 1990s.

Perhaps the most striking aspect of the change we’ve witnessed is that the new generation of young people is far more interested in hardware than we were in the 1980s. Writing an assembly language routine to move a sprite around on the screen clearly isn’t quite as much fun as it used to be, but moving a robot around the floor is much more exciting. We see 12-year-olds today building control and sensing projects that I would have been proud of in my mid-20s. My hope is that when some of these young people sit down in front of the distant descendants of the BBC Micro careers program of my childhood, they will be told that they’d make great microelectronic chip designers, and that this book might help one or two of them make that journey.

—Eben Upton, Cambridge, May 2016

Chapter 1

The Shape of a Computer Phenomenon

THAT OLD SAYING about good things coming in small packages describes the Raspberry Pi perfectly. It also highlights an advance in computer architecture—the system-on-a-chip (SoC), a tiny package with a rather large collection of ready-to-use features. The SoC isn’t so new—it’s been around a long time—but the Raspberry Pi’s designers have put it into a small, powerful package that is readily available to students and adults alike. All for a very low price.

A tiny piece of electronics about the size of a credit card, the Raspberry Pi single-board computer packs very respectable computing power into a small space. It provides tons of fun and many, many possibilities for building and controlling all sorts of fascinating gizmos. When something is small, after all, it fits just about anywhere. The Raspberry Pi does things conventional computers just can’t do in terms of both portability and connectivity, and you’ll find plenty here to inspire your creativity. Fun things!

What’s not to like? Get ready for some truly exciting computer architecture.

In this chapter introducing the truly phenomenal Raspberry Pi line of computer boards, we look first at the Raspberry Pi’s goals and history, including the visionary people at the Raspberry Pi Foundation who dreamed up the concept and achieved the reality. We also look at the advantages this tiny one-board computer has over much larger computers. We then take a tour of the Raspberry Pi board.

Growing Delicious, Juicy Raspberries

As significant advances in computing go, the Raspberry Pi’s primary innovation was the lowering of the entry barrier into the world of embedded Linux. The barrier was twofold—price and complexity. The Raspberry Pi’s low price solved the price problem (cheap is good!) and the SoC reduced circuit complexity rather dramatically, making a much smaller package possible.

The road to the development of the Raspberry Pi began at a surprising point: a registered charity in the UK, which continues to operate today.

The Raspberry Pi Foundation, registered with the Charity Commission for England and Wales, first opened its doors in 2009 in Caldecote, Cambridgeshire. It was founded for the express purpose of promoting the study of computer science in schools. A major impetus for its creation came from a team consisting of Eben Upton, Rob Mullins, Jack Lang and Alan Mycroft. At the University of Cambridge’s Computer Laboratory, they had noted a year-on-year decline in both the numbers and the skill levels of student applicants. They came to the conclusion that a small, affordable computer was needed to teach basic skills in schools and to instill enthusiasm for computing and programming.

Major support for the Foundation’s goals came from the University of Cambridge Computer Laboratory and Broadcom, the company that manufactures the SoC—the Broadcom BCM2835, BCM2836 or BCM2837, depending on the model—that enables the Raspberry Pi’s power and success. Later in this chapter you will read more about that component, which is the heart and soul of the Raspberry Pi.

The founders of the Raspberry Pi had identified and acted on the perceived need for a tiny, affordable computer. By 2012, the Model B had been released at a price of about £25. The fact that this represented great value for money was recognised immediately, and first-day sales blasted over 100,000 units. In less than two years of production, more than two million boards were sold.

The Raspberry Pi continued to enjoy good sales and wide acceptance following the highly successful release of the Model B+ in July 2014. And in 2015, the fast, data-crunching Raspberry Pi 2 Model B, with its quad-core ARM processor and additional onboard memory, sold more than 500,000 units in its first two weeks of release. Most recently, the Raspberry Pi Zero, a complete computer system on a board for £4—yes, £4—was released. It’s an awesome deal if you can get one—the first batch sold out almost immediately.

In 2016, the Raspberry Pi 3 Model B arrived. It sports a 1.2GHz 64-bit quad-core ARMv8 CPU, 1GB of RAM, and built-in Wi-Fi and Bluetooth! All for the same low price.

The original founders of the Raspberry Pi Foundation included:

- Eben Upton
- Rob Mullins
- Jack Lang
- Alan Mycroft
- Pete Lomas
- David Braben

The organisation now consists of two parts:

- The Raspberry Pi Foundation, the registered charity that leads the educational mission
- Raspberry Pi (Trading) Ltd, the trading arm that handles engineering and sales of the boards

The Raspberry Pi Foundation’s website at www.raspberrypi.org (see Figure 1-1) presents the impetus that resulted in the Raspberry Pi. This is what they say on the About Us page:

The idea behind a tiny and affordable computer for kids came in 2006, when Eben Upton, Rob Mullins, Jack Lang and Alan Mycroft, based at the University of Cambridge’s Computer Laboratory, became concerned about the year-on-year decline in the numbers and skills levels of the A Level students applying to read Computer Science. From a situation in the 1990s where most of the kids applying were coming to interview as experienced hobbyist programmers, the landscape in the 2000s was very different; a typical applicant might only have done a little web design.


FIGURE 1-1: The Raspberry Pi official website

As a result, the founders’ stated goal was “to advance the education of adults and children, particularly in the field of computers, computer science and related subjects”.

Their answer to the problem, of course, was the Raspberry Pi, which was designed to emulate in concept the hands-on appeal of computers from the previous decade (the 1990s). The Raspberry Pi was intended to act as a “catalyst”, inspiring students by providing affordable, programmable computers everywhere.

The Raspberry Pi is well on its way to achieving the Foundation’s goal of bettering computer education for students. However, another significant thing has happened: a lot of us older people have found the Raspberry Pi exciting too. It’s been adopted by generations of hobbyists, experimenters and many others, which has driven sales into the millions of units.

While the sheer compactness of the Raspberry Pi excites, resonates and inspires adults as well as youngsters, what truly prompted its success was its low price and the scope it offers for development. Embedded Linux has always been a painful subject to learn, but the Pi makes it simple and inexpensive. Continuing education in computers gets just as big a boost as initial education in schools.

System-on-a-Chip

An SoC, or system-on-a-chip, is an integrated circuit (IC) that combines the major components of a computer, or of any other electronic system, on a single chip: a central processing unit (CPU), a graphics processing unit (GPU) and various digital, analogue and mixed-signal circuits.

This SoC component makes highly dense computing possible, such as all the power that is shoehorned into the Raspberry Pi. Figure 1-2 shows the Broadcom chip on the Raspberry Pi 2 Model B. It’s a game-changing advance in computer architecture, enabling single-card computers that rival and often exceed the capabilities of machines that are many times their size. Chapter 4, “ARM Processors and Systems-on-a-Chip”, covers these small but mighty chips in detail.


FIGURE 1-2: Broadcom chip on the Raspberry Pi 2 Model B

The Raspberry Pi features chips that are developed and manufactured by Broadcom Limited. Specifically, the older models, as well as the £4 Raspberry Pi Zero, come with the Broadcom BCM2835; the Raspberry Pi 2 has the Broadcom BCM2836; and the new Model 3 uses the Broadcom BCM2837. The biggest difference among these SoCs is the CPU: the single-core ARM11 in the BCM2835 gives way to a quad-core Cortex-A7 in the BCM2836 and a 64-bit quad-core Cortex-A53 in the BCM2837. Otherwise, they have essentially the same architecture.

Here’s a taste of the low-level components, peripherals and protocols provided by the Broadcom SoCs:

- An ARM CPU and a VideoCore GPU
- On-chip memory and a memory controller
- General-purpose input/output (GPIO) pins
- Serial protocols such as UART, SPI and I2C
- Pulse-width modulation (PWM) and I2S audio interfaces
- USB, DMA controllers, timers and an interrupt controller

An Exciting Credit Card-Sized Computer

Just how powerful is the Raspberry Pi compared to a desktop PC? Certainly, it has far more computational ability, memory and storage than the first personal computers. That said, the Raspberry Pi cannot match the speed, high-end displays, built-in power supplies and hard-drive capacity of the desktop boxes and laptops of today.

However, you can easily overcome any disadvantages by hanging the appropriate peripherals on your Raspberry Pi. You can add large hard drives, 42-inch HDMI screens, high-end sound systems and much more. Simply plug your peripherals into the USB receptacles on the board or via the other interfaces that are provided, and you’re good to go. Finish it off by clicking an Ethernet cable into the jack on the Raspberry Pi or sliding in a wireless USB dongle, and worldwide connectivity goes live.

You can duplicate most features of conventional computers when you attach peripherals to a Raspberry Pi, such as in Figure 1-3, and you also gain some distinct advantages over large computers, including:

- Very low cost
- Small size and easy portability
- Modest power consumption
- GPIO pins that let the board sense and control real-world devices, as shown in Figure 1-4


FIGURE 1-3: Peripherals attached to a Raspberry Pi 2 Model B


FIGURE 1-4: GPIO pins enable control of real-world devices.
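As a small taste of what those pins make possible, here is a sketch that blinks an LED (assuming the RPi.GPIO Python library that ships with Raspbian, and an LED wired through a resistor to BCM pin 18; both the pin choice and the wiring are illustrative):

```python
# Blink an LED attached (via a current-limiting resistor) to BCM pin 18.
# Assumes the RPi.GPIO library available on Raspbian; run on the Pi itself.
import time
import RPi.GPIO as GPIO

GPIO.setmode(GPIO.BCM)               # address pins by Broadcom SoC numbering
GPIO.setup(18, GPIO.OUT)             # configure pin 18 as a digital output

try:
    for _ in range(10):
        GPIO.output(18, GPIO.HIGH)   # LED on
        time.sleep(0.5)
        GPIO.output(18, GPIO.LOW)    # LED off
        time.sleep(0.5)
finally:
    GPIO.cleanup()                   # release the pins however the loop exits
```

Depending on how the operating system is configured, GPIO access may require running the script with elevated privileges.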

What Does the Raspberry Pi Do?

The Raspberry Pi excels as the brains for all sorts of projects. Here are some examples randomly picked from the many thousands of documented projects on the Internet. This list may inspire you in choosing some projects of your own:

- Home media centres and retro gaming consoles
- Weather stations and environmental data loggers
- Home automation and security controllers
- Robots, drones and other remotely controlled gadgets
- Low-cost web, file and print servers

This list just scratches the surface of possible uses for the Raspberry Pi. There’s not enough room to list everything you could do, but this book gives you the information you need to come up with your own ideas. Let your own desires, interests and imagination guide you. The Raspberry Pi does the rest.