Go To

by Steve Lohr
  • ISBN13: 9780465042258
  • ISBN10: 0465042252
  • Format: Hardcover
  • Copyright: 2001-10-17
  • Publisher: Basic Books

Note: Supplemental materials are not guaranteed with Rental or Used book purchases.

List Price: $27.50 (save up to $6.87)

  • Buy Used: $20.63
    Usually ships in 2-4 business days

Summary

The co-author of "U.S. vs. Microsoft" offers the remarkable story of the scientific revolution that made the new economy possible--software--told through the unsung heroes of programming and their achievements.

Author Biography

Steve Lohr is a senior writer and technology correspondent for the New York Times and co-author of U.S. vs. Microsoft. He lives in New York City.

Table of Contents

Acknowledgments, p. ix
Introduction: The Rise of Software and the Programming Art, p. 1
Fortran: The Early "Turning Point", p. 11
The Hard Lessons of the Sixties: From Exuberance to the Realities of COBOL and the IBM 360 Project, p. 35
Breaking Big Iron's Grip: Unix and C, p. 63
Programming for the Millions: The BASIC Story from Dartmouth to Visual Basic, p. 81
The European Influence: From Algol to Pascal to C++, p. 99
A Computer of My Own: The Beginning of the PC Industry and the Story of Word, p. 115
Computing for the Masses: The Long Road to "Gooey" and the Macintosh, p. 139
Programming for Everyman: Just Let the Users Do It, p. 159
Java: The Messy Birth of a New Language, p. 181
There Has To Be a Better Way: Apache and the Open Source Movement, p. 203
Afterword, p. 221
Notes, p. 223
References, p. 239
Index, p. 243
Table of Contents provided by Syndetics. All Rights Reserved.

Supplemental Materials

What is included with this book?

The New copy of this book will include any supplemental materials advertised. Please check the title of the book to determine if it should include any access cards, study guides, lab manuals, CDs, etc.

The Used, Rental and eBook copies of this book are not guaranteed to include any supplemental materials. Typically, only the book itself is included. This is true even if the title states it includes any access cards, study guides, lab manuals, CDs, etc.

Excerpts


Chapter One

Introduction: The Rise of Software and the Programming Art

A lone sailboat in the distance makes its way across the rippled surface of Lake Washington in the crisp autumn dusk, framed on the horizon by the skyline of Seattle. The view is from the lakeside home of Charles Simonyi, who was a 17-year-old computer programming prodigy when he left Budapest for good in 1966. Since then, he has come a remarkable distance, in every sense. His house, though all but invisible from the road, sweeps down the hillside toward the water's edge, covers more than 20,000 square feet and includes a library, computer lab, fitness center, and swimming pool. Made of glass, wood, and steel, the home is a work of high modernism, outside and in. The black floors of polished stone glisten, and visitors are asked to remove their shoes. The walls are bare except for works of modern art by Roy Lichtenstein, Jasper Johns, and Victor Vasarely. Besides art, Simonyi collects jets. He has two, including a retired NATO fighter, which he flies. His multimillion-dollar philanthropic donations have placed his name on an endowed chair at Oxford University and on the mathematics building at the Institute for Advanced Study in Princeton. Simonyi fled Hungary as a teenager with nothing, but he now regards money with the nonchalance of the billionaire he has become. "I have no mercenary reasons for things anymore," he said.

    Simonyi owes it all to software, and his uncanny facility with computer code -- aided, of course, by good timing, good luck, and the whimsy of capitalism. His career began at Hungary's Central Statistical Office in the mid-1960s, where he was a kind of communist version of an American teenage computer hacker. He hung around, made himself useful, and taught himself how to program on a Russian-made Ural II. In computing time, the Budapest center was living in the early 1950s, generations behind the West. Over the years, advances in software have allowed programmers to lift their gaze up further and further from the level of binary digits, or bits -- the 1's and 0's that are the natural vernacular of the machine. But Simonyi learned to talk to the computer almost entirely on the machine's terms. "It was Stone Age programming," he recalled. "I've been through a time warp."

    After immigrating to the United States, Simonyi changed his name from Karoly to Charles. He attended the University of California at Berkeley and Stanford University, and later joined the Xerox Palo Alto Research Center. Simonyi was at Xerox PARC during the glory years of the 1970s, when the team there did so much of the research and development that has shaped how people use personal computers. At Xerox PARC, Simonyi was the principal developer of Bravo, an innovative program for writing and editing text that allowed a person to display words on a computer screen as if plucked from the imagination of a skilled typesetter. It was a capability that became known as WYSIWYG -- "What You See Is What You Get" -- and it opened the door to the desktop publishing industry, and helped define the personal computer as a tool for enhancing individual creativity.

    When it became clear that Xerox did not really grasp the significance of the work of its Palo Alto lab, Simonyi looked for work elsewhere. In the summer of 1980, he made an unannounced call on a little company outside Seattle trying to make its way in the fledgling personal computer industry -- Microsoft. The startup had only 40 employees, but Simonyi sniffed the future there. He and Bill Gates hit it off immediately, and Simonyi went to Microsoft.

    Microsoft's Word text editor is one of the most widely used software programs in the world, and Simonyi is the "father of Word," the commercial descendant of Bravo. To him, the personal computer is a kind of delivery vehicle for software, empowering users and magnifying the power of the programmer. "You write a few lines of code and suddenly life is better for a hundred million people," he said. "That's software."

    For the last several years, Simonyi has been working on an ambitious research project with the goal of greatly improving the productivity of computer programmers. He believes that the tools and methods programmers use are still fairly crude, limiting the amount of human intelligence that can be transmitted in software and thus slowing progress. Despite the constraints, Simonyi cannot help but marvel at the rise of software during his lifetime. "It shows how powerful software is. Even with the primitive tools we still use, look at how much software can do. It's amazing."

The ascent of software in the postwar years -- as a field of endeavor, as an industry and as a medium of communication and commerce -- has been rapid, remarkable, and almost surreptitious. The ancestry of what we now call computer programming goes back at least to the nineteenth century, when the English mathematician Charles Babbage struggled with how to handle calculations in his Analytical Engine, a conceptual forerunner of the modern computer. What he was trying to do we would now call programming. The most fundamental concept in programming is the algorithm -- simply put, a set of instructions for doing something, a recipe for calculation. The algorithm apparently traces its roots to the Babylonians, and the word is a distortion of al-Khwarizmi, the family name of a Persian scholar, Muhammad ibn Musa al-Khwarizmi, who wrote a treatise on algebraic methods.
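
To make the idea concrete (an illustration of the concept, not an example from the book): Euclid's ancient procedure for finding the greatest common divisor of two numbers is an algorithm in exactly this sense, a short recipe for a calculation. In a modern language such as Python it might be sketched as:

    # Euclid's algorithm: a "recipe for calculation" that finds the greatest
    # common divisor of two whole numbers by repeated division with remainder.
    def gcd(a, b):
        while b != 0:
            a, b = b, a % b  # replace (a, b) with (b, remainder of a divided by b)
        return a

    print(gcd(1071, 462))  # prints 21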

    Yet it was not until World War II that electronics had advanced to the point that building useful computers became a real possibility. In those early days, programming was an afterthought. It was considered more a technician's chore, usually referred to as "setting up" or "coding" the machine. The glamour was all in the hardware -- that was deemed real science and engineering. The ENIAC, for Electronic Numerical Integrator and Computer, was the machine generally credited with starting the era of digital electronic computing. That computer, at the University of Pennsylvania, did not have software. Its handlers had to set up the machine by hand, plugging and unplugging a maze of wires and properly positioning row upon row of switches. It was as if the machine had to be rebuilt for each new problem. It was hard-wired programming. To do it, the government hired a handful of young women with math skills as trainees. These early women programmers were known, literally, as "computers," a throwback to the eighteenth century use of the term to refer to the human computers who prepared statistical tables used in map-making and ocean navigation.

    Programming the ENIAC to calculate the trajectory of artillery shells -- its Pentagon-assigned mission -- was painstaking and difficult work, and the women devised some innovative techniques for simplifying the process. They would draw elaborate charts on paper, mapping out how the problem could most efficiently navigate its way through the machine. Then, they would set up the machine by hand. "We knew how every wire and every switch was to be set," recalled Jean Bartik. That could take weeks. Yet, thanks to their efforts, the ENIAC's public demonstration was a great success. It could calculate a firing trajectory faster than a shell flew. "Fabulous," Bartik recalled, "one of the most exciting days of my life," though it was in the spring of 1946, after the war was over.

    The term used to describe the practitioners of the new profession evolved quickly. A human "computer" became a "coder." And "programmer" would soon irresistibly supplant the more quotidian label -- apparently a contribution from English members of the craft, who were perhaps both more status-conscious and more literary. Grace Hopper, a software pioneer who began computing equations for the war effort on the Harvard Mark I in 1944, always felt that programming was too lofty a term for the early work. "The word 'programming' didn't appear until it came over from England," she recalled. "Actually I think what we were writing when we wrote machine code was coding. We should have reserved the word programming for a higher level. But it came over from England, and it sounded better than being a coder so everyone wanted to be a programmer."

    Higher-level programming, however, would soon be possible because of a breakthrough in computer design. The idea came out of the ENIAC group, and was articulated in a June 1945 paper, "A First Draft of a Report on the EDVAC," written by John von Neumann. A renowned mathematician and game theorist, von Neumann was a consultant to the Manhattan Project that developed the atomic bomb. Designing the bomb required thousands of computations, mostly done by battalions of clerks with desktop calculating machines. So von Neumann, intrigued by the potential of computers, became a consultant to the ENIAC project in 1944. The EDVAC, for Electronic Discrete Variable Automatic Computer, was to be the successor to the ENIAC. Others were involved in the EDVAC planning, notably the ENIAC project leaders, J. Presper Eckert and John Mauchly, but von Neumann wrote the report and he got the credit for designing the "stored-program computer," which later became known as the von Neumann architecture. Virtually every computer today employs the von Neumann architecture.

    The early stored-program computers began appearing after the war. The stored-program design meant that not only the computer's data -- typically, numbers to be calculated in those days -- but also its programming instructions could be stored in the machine. At one level, there was a straightforward efficiency benefit to this, enabling a measure of automation. The hand work of setting switches and wires could be eliminated because the programming instructions could be placed onto punched cards or tapes and fed into the computer, along with the data to be processed.

    Yet there was a much deeper implication to the stored-program concept. It would make building software an engineering discipline that, in the phrase of the computer scientist Butler Lampson, is "uniquely self-referential" in that all the machinery of computing could be applied to itself. That is, a stored-program computer could be used to have programs modify other programs or to create new ones. And it is this computer-mediated interaction of programming code -- a digital ecology inside the machine, one piece of code scooting off, modifying another piece, which loops back to mingle with yet another -- that made possible the development of programming languages that are far more understandable to humans than binary 1's and 0's. This ability of code to assemble, reassemble, and modify itself constantly is behind everything from computer games to the Internet to artificial intelligence.
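
A small illustrative sketch of that self-referential idea (written in Python for convenience, not drawn from the book): in a stored-program machine, one piece of code can write the text of another program, hand it to the same machine, and run it.

    # One program generates the source text of another function, then asks the
    # interpreter to turn that text into runnable code and execute it.
    def make_adder_source(n):
        return "def add_{0}(x):\n    return x + {0}\n".format(n)

    namespace = {}
    exec(make_adder_source(5), namespace)  # compile and load the generated code
    add_5 = namespace["add_5"]
    print(add_5(10))  # prints 15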

    The developers of the early stored-program computers were also the first to get a real taste of the intricate, often unforeseen complexity of programming. The first stored-program computer to get up and running was built by a team led by Maurice Wilkes at Cambridge University. The machine was called the EDSAC, for Electronic Delay Storage Automatic Calculator. In his memoir, Wilkes recalled precisely when he first grasped that "bugs" were destined to be the programmer's eternal nemesis. "By June 1949," Wilkes wrote, "people had begun to realize that it was not so easy to get a program right as had at one time appeared." Wilkes was laboring to get his first "non-trivial program" to work and as he was about to mount a flight of stairs at Cambridge, he remembered, "the realization came over me with full force that a good part of the remainder of my life was going to be spent in finding errors in my own programs."

The word "software" arrived on the scene long after computers were in use, suggesting a grudging recognition of this troublesome technology. The first published use of "software" as a computing term was in 1958, in the American Mathematical Monthly . John Tukey, a mathematician at Princeton University, wrote, "Today the `software' comprising the carefully planned interpretive routines, compilers, and other aspects of automative programming are at least as important to the modern electronic calculator as its `hardware' of tubes, transistors, wires, tapes and the like." Such sentiments were not necessarily the prevailing view at the time.

    In the engineering culture of computing, programmers were long regarded askance by the hardware crowd; hardware was the real discipline, while programmers were the unruly bohemians of computing. The hardware people tended to come from the more established field of electrical engineering. There were EE departments in universities, and hardware behaved according to the no-nonsense rules of the "hard sciences" like physics and chemistry. Some mathematicians were fascinated by computers and programming, but their perspective was often from the high ground of theory, not wrestling with code and debugging programs. It was not until the 1960s, with the formation of computer science departments, that programming began to be taken seriously in academia, and then only slowly.

    The recruiting and hiring of programmers in the 1950s, and beyond, was scarcely a science. Programming skills were much in demand: new people had to be trained, but there was no sure test for ability. "Early programming is where the story originated that if you looked in one ear and couldn't see daylight you could hire the person," said Robert Bemer, who was a manager in IBM's programming research department in the late 1950s. "It seemed we were just taking personnel in off the streets." Lois Haibt joined IBM in 1955 as a freshly minted graduate of Vassar College, becoming a member of the 10-person team that developed the Fortran programming language. "They took anyone who seemed to have an aptitude for problem-solving skills -- bridge players, chess players, even women," she recalled. As an IBM manager, Bemer cast his recruiting net broadly. "I once decided to advertise for chess players because I thought they would be pretty good programmers. It worked very well. We even hired the US chess champion, Arthur Bisguier. He mostly played chess and didn't do that much programming." Lesser chess players, however, proved to be more productive. The ads in 1957, which appeared in The New York Times, The Los Angeles Times, and Scientific American, yielded four or five hires -- a good catch, Bemer figured, at a time when there were an estimated 15,000 professional programmers in the United States, roughly 80 percent of the world's code writers.

    Today, much has changed. The software industry is huge, employing nearly 9 million professional programmers worldwide. Computer science is a respected field in academia; fine minds and research funding are dedicated to plumbing the mysteries of software. For good reason, since it is software that animates not only our personal computers and the Internet, but also our telephones, credit-card networks, airline reservations systems, automobile fuel injectors, kitchen appliances, and on and on. A presidential advisory group on technology observed in 1999 that software is "the new physical infrastructure of the information age" -- a critical raw material that is "fundamental to economic success, scientific and technical research, and national security."

    Indeed, the modern economy is built on software, and that dependence will only grow. Business cycles and Wall Street enthusiasms will come and go, but someone will have to build all the needed software. Programmers are the artisans, craftsmen, bricklayers, and architects of the Information Age. None of this could have been imagined in the early days, because no one could foresee what the pace of technological change would make possible -- the ever-expanding horizons of computing, thanks to advances in hardware and software. John von Neumann and Herman Goldstine, leading computer visionaries of their day, wrote in 1946 that about 1,000 lines of programming instructions were "a reasonable upper limit for the complexity of problems now envisioned." An electric toothbrush may now have 3,000 lines of code, while personal computer programs have millions of lines of code.

    Despite its importance, computer programming remains a black art to most people, and that is hardly surprising. Software, after all, is almost totally invisible. It cannot be touched, felt, heard, smelled, or tasted. But software is what makes a computer do anything useful, interesting or entertaining. Computers are very powerful, but very dumb, machines. Their view of the world is all 1's and 0's, switches ON or OFF. The simple computer that ran the "Pong" video game of the 1970s -- two lines of light for "paddles" tapping a cursor-like "ball" -- saw the world like this:

0011101010101000011100011010101000

And IBM's Deep Blue supercomputer, which defeated the world chess champion Garry Kasparov in 1997, saw the world like this:

0011101010101000011100011010101000

    There were, fundamentally, only two differences between those two computers: the superior speed and power of the turbocharged bit-processing engine in Deep Blue, and the software. Software is the embodiment of human intelligence -- the mediator between man and machine -- conveying our questions or orders to the computers that surround us.

As a profession, programming is a curious blend of art, science, and engineering. The task of making software is still a remarkably painstaking, step-by-step endeavor -- more handcraftsmanship than machine magic, a form of creativity in the medium of software. Chefs work with food, artists with oil paint, programmers with code. Yet programming is a very practical art form, and the people who are pulled to it have the engineering fascination with how things work and the itch to build things.

    As a child, Grace Hopper would tear apart and rebuild clocks. Ken Thompson, creator of the Unix operating system, built backyard rockets. Dan Bricklin, co-creator of the electronic spreadsheet, built the family television from a Heathkit set. James Gosling, creator of the Java programming language, rebuilt old farm machinery in his grandfather's yard in Calgary. Building things, it seems, is the real thrill for those naturally drawn to programming -- especially so since software is a medium without the constraints of matter. The programmer can build simulated cities without needing steel, glass, or concrete; simulated airplanes without aluminum, jet engines, or tires; simulated weather without light, heat or water. At a computer, the programmer can make ideas real -- at least visually real -- and test them in a virtual world of his or her own creation.

    Much of the history of computer programming can be seen as the effort to extend the franchise -- to make it easier for more and more people to program. FORTRAN, the first real programming language, was intended to make it easier for scientists and engineers to program. COBOL was designed to make it easier for business people to program. Over the years there has been a succession of advances in programming to make things less difficult. But the idealistic vision of making programming accessible to everyone -- a notion that first surfaced in the 1960s -- has remained out of reach, although there have been significant strides. Nearly everyone can use a computer these days, and many thousands, even millions, of people can do the basic programming required to create a Web page or set up a financial model on a spreadsheet.

    Yet more serious, and seriously useful, programming remains a fairly elite activity. By now, research has been done on skilled programmers, and it has found that, yes, they share certain intellectual traits. They are the kind of people who have deep, particular interests outside work as well as professionally. An interest in science fiction, for example, will tend to be focused on one or two authors. The same would be true of music, recreational pursuits, whatever. It is the kind of intellectual intensity and deep focus required in programming. In psychology, academics have looked at software programmers when studying what is called flow -- a state of deep concentration, total absorption, and intellectual peak performance that is the mental equivalent of what athletes describe as being in the "zone."

    Still, such study only hints at what it takes, and who has the potential, to be a gifted programmer. "Some people are three to four times better as programmers, astonishingly better than others with similar education and IQ," said Ken Kennedy, a computer science professor at Rice University. "And that is a phenomenon that is not really understood" -- further evidence, it seems, that programming is as much art as science.

    Donald Knuth has spent his career teaching the craft. Knuth, a professor emeritus at Stanford, helped create the field of computer science as an academic discipline. He is best-known as the author of the defining treatise on writing software, The Art of Computer Programming, a project he began in 1962 and that now runs to three volumes, and counting. In the book-lined, second-floor study of his home in the hills behind Stanford, Knuth observed, "There are a certain percentage of undergraduates -- perhaps two percent or so -- who have the mental quirks that make them good at computer programming. They are good at it, and it just flows out of them.... The two percent are the only ones who are really going to make these machines do amazing things. I wish it weren't so, but that is the way it has always been."

    This book is about a comparative handful of those people with the requisite mental quirks to build amazing things in code. It is intended as a representative -- by no means definitive -- history of computer programming, told mainly through the stories of some of the remarkable people who made it happen.

Excerpted from Go To by STEVE LOHR. Copyright © 2001 by Steve Lohr. Excerpted by permission. All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.
