The Knowledge Web: From Electronic Agents to Stonehenge and Back -- And Other Journeys Through Knowledge


  • Format: Paperback
  • Copyright: 2000-06-22
  • Publisher: Simon & Schuster




In The Knowledge Web, James Burke, the bestselling author and host of television's Connections series, takes us on a fascinating tour through the interlocking threads of knowledge running through Western history. Displaying mesmerizing flights of fancy, he shows how seemingly unrelated ideas and innovations bounce off one another, spinning a vast, interactive web on which everything is connected to everything else: Carmen leads to the theory of relativity, champagne bottling links to wallpaper design, Joan of Arc connects through vaudeville to Buffalo Bill. Illustrating his open, connective theme in the form of a journey across a web, Burke breaks down complex concepts, offering information in a manner accessible to anybody -- high school graduates and Ph.D. holders alike. The journey touches almost two hundred interlinked points in the history of knowledge, ultimately ending where it begins. At once amusing and instructing, The Knowledge Web heightens our awareness of our interdependence -- with one another and with the past. Only by understanding the interrelated nature of the modern world can we hope to identify complex patterns of change and direct the process of innovation to the common good.

Author Biography

James Burke has written seven books, including the bestselling Connections and The Day the Universe Changed. He contributes a monthly column to Scientific American and serves as director, writer and host of the television series Connections 3, which airs on the Learning Channel. He lives in England, France and airplanes.

Table of Contents

How to Use This Book
What's in a Name?
Drop the Apple
An Invisible Object
Life Is No Picnic
Elementary Stuff
A Special Place
Fire from the Sky
Hit the Water
In Touch



Change comes so fast these days that the reaction of the average person recalls the depressive who takes some time off work and heads for the beach. A couple of days later his psychiatrist gets a postcard from him. The message on the card reads: "Having a wonderful time. Why?"

Innovation is so often surprising and unexpected because the process by which new ideas emerge is serendipitous and interactive. Even those directly involved may be unaware of the outcome of their work. How, for instance, could a nineteenth-century perfume-spray manufacturer and the chemist who discovered how to crack gasoline from oil have foreseen that their products would come together to create the carburetor? In the 1880s, without the accidental spillage of some of the recently invented artificial colorant onto a petri-dish culture that revealed to a German researcher named Ehrlich that the dye preferentially killed certain bacilli, would Ehrlich have become the first chemotherapist? If the Romantic movement's concept of "nature-philosophy" had not suggested that nature evolves through the reconciliation of opposing forces, would Oersted have sought to "reconcile" electricity and magnetism and discovered the electromagnetic force that made possible modern telecommunications?

Small wonder, then, that the man and woman in the street are left behind in all this, if the researchers themselves don't get the point. But given the conditions under which science and technology work, how else could it be? At last count there were more than twenty thousand different disciplines, each of them staffed by researchers straining to replace what they produced yesterday.

These noodling world-changers are spurred on by at least two powerful motivators. The first is that you are more likely to achieve recognition if you make your particular research niche so specialist that there's only room in it for you. So the aim of most scientists is to know more and more about less and less, and to describe what it is they know in terms of such precision as to be virtually incomprehensible to their colleagues, let alone the general public.

The second motivator is the CEO. Corporations survive in a changing world only by encouraging their specialists to generate change before somebody else does. Winning in the marketplace means catching the competition by surprise. Not surprisingly, this process also surprises the consumer, and nowhere so frequently today as in the world of electronics, where by the time the user gets around to reading the manual, the gizmo to which it refers is obsolete.

We live in this permanently off-balance manner because of the way knowledge has been generated and disseminated for the last 120,000 years. In early Neolithic times the requirement to teach the highly precise, sequential skills of stone-tool manufacture demanded a similarly precise, sequential use of sounds and is thought to have given rise to language. The sequential nature of language facilitated description of the world in similarly precise terms, and in due course a process originally developed for chipping pieces off stone became a tool for chipping pieces off the universe. This reduction of reality to its constituent parts is at the root of the view of knowledge known as "reductionism," from which science sprang in the seventeenth-century West. Simply put, scientific knowledge comes as the result of taking things apart to see how they work.

Over millennia, this way of doing things has tended to subdivide knowledge into smaller and more specialist segments. For example, in the past hundred years or so, the ancient discipline of botany has fragmented and diversified to become biology, organic chemistry, histology, embryology, evolutionary biology, physiology, cytology, pathology, bacteriology, urology, ecology, population genetics and zoology.

There is no reason to suppose that this process of proliferation and fragmentation will lessen or cease. It is at the heart of what, since Darwin's time, has been called "progress." If we live today in the best of all possible materialist worlds, it is because of the tremendous strides made by specialist research that have given us everything from more absorbent diapers to linear accelerators. We in the technologically advanced nations are healthier, wealthier, more mobile, better-informed individuals than ever before in history, thanks to myriad specialists and the products of their pencil-chewing efforts.

However, the corollary to a small minority knowing more and more about less and less is a large majority knowing less and less about more and more. In the past this has been a relatively unimportant matter principally because for most of history the illiterate majority (hard-pressed enough just to survive) has been unaware that the problem existed at all. Technology was in such limited supply that there was only enough to share it among a few elite decision-makers.

It is true that over time, as the technology diversified, knowledge slowly diffused outward into the community via information media such as the alphabet, paper, the printing press and telecommunications. But at the same time these systems also served to increase the overall amount of specialist knowledge. What reached the general public was usually either out-of-date or no longer vital to the interests of the elite. And as specialist knowledge expanded, so did the gulf between those who had information and those who did not.

Each time there was a major advance in the ability to generate, store or disseminate knowledge, it was followed by an "information surge" and with it a sudden acceleration in the level of innovation that dramatically enhanced the power of the elites. But sooner or later the same technology reached enough people to undermine the status quo. The arrival of paper in thirteenth-century Europe strengthened the hand of church and throne, but at the same time created a merchant class that would ultimately question their authority. The printing press gave Rome the means to enforce obedience and conformity, then Luther used it to wage a propaganda war that ended with the emergence of Protestantism. In the late nineteenth century, when military technology made possible conflicts in which hundreds of thousands died, and manufacturing technology generated untenable working and living conditions for millions of factory workers, radicals and reformers were aided in their efforts by new printing techniques cheap enough to spread their message of protest in newspapers and pamphlets.

By the mid-twentieth century scientific and technological knowledge far outstripped the ability of most people, even the averagely well-informed, to comprehend it. The stimulus of the Cold War brought advances in computer technology that seemed likely to place unprecedented power in the hands of economic and political power blocs. There was talk of "Big Brother" government, rule by multinational corporations, central databases that would hold personal files on every individual, and the creeping homogenization of the human race into one giant "global village." Unchecked state and corporate industrialization finally began to generate the first visible signs of global warming, runaway pollution decimated the animal population and the tropical forests went down before fire and axe at an alarming rate.

However, at the same time, the falling cost of computer and telecommunications technology also began to make it possible for these developments to be discussed in an unprecedentedly large public forum. And the more we learned about the world through television and radio, the more it became clear that urgent measures were needed to preserve its fragile ecosystems and its even more fragile cultural diversity. At the end of the twentieth century the emergence of the ubiquitous Internet and affordable wireless technology offered the opportunity for millions of individuals to think of becoming involved.

However, the culture of scarcity with which we have lived for millennia has not prepared us well for the responsibilities technology will force on us in the next few decades. Reductionism, representative democracy and the division of labor have tended to leave such matters in the hands of specialists who are, increasingly, no more aware of the ramifications of their work than anybody else.

The result is that national and international institutions are coming under unprecedented stress as they try to apply their obsolete mechanisms to twenty-first-century problems. In Britain recently a case was brought against an individual which rested on the fifteenth-century meaning of the word "obscene." Medical etiquette has changed little since 1800. In some places science and religion are in conflict over the definition of life.

Western institutions function as if the world had not changed since they were established to deal with the specific problems of the time. Fifteenth-century nation-states, emerging into a world without telecommunications, developed representative democracy; seventeenth-century explorers in need of financial backing invented the stock market; in the eleventh century the influx of Arab knowledge triggered the invention of universities to process the new data for student priests.

In the coming decades it is likely that many social institutions will attempt to adapt by becoming virtual, bringing their services directly to the individual much in the way that banks have already begun to. But their new accessibility will in turn likely subject them to proliferating and diversifying demands that will change how they work and make them redefine their purpose. In education, the old reductionist reliance on specialism and testing by repetition will have to give way to a much more flexible definition of ability. As machines increasingly take over the tasks that once occupied a human lifetime, specialist skills may take on a merely antiquarian value. New ways will have to be found to assess intelligence in a world in which memory and experience seem no longer of value (again, this is nothing new: the alphabet and later the printing press both presented the same perceived threat).

When a corporate workforce becomes scattered across the country, or the globe, in thousands of individual homes or groups, and deals direct with millions of customers, the value of communication skills is likely to outweigh that of most others. Such ability may be possessed by people who would previously have been thought unqualified to work for the corporation, because in the old world they would have been too young, or too old, or too distant, for example. A virtual education system will have to deal with problems such as a multicultural global student body bringing very diverse experience, attitudes and aims to the class. In terms of international law, recent cases involving copyright or pornography reveal how complex such legal problems are likely to become.

This book does not attempt directly to address any of these problems. Rather, it suggests an approach to knowledge perhaps more attuned to the needs of the twenty-first century as described above. Some readers will no doubt see this approach as more evidence of the "dumbing-down" of recent years. But the same was said about the first printing press, newspapers, calculators and the removal of mandatory Latin from the curriculum.

In its fully developed form, the "webbed" knowledge system introduced here would be inclusive, not exclusive. Modern interactive networked communications systems married to astronomically large data storage capability ought to ensure that at times of change nothing need be lost. No subject or skill will be too arcane for its practitioners to pursue when the marketplace for their skills is planetwide.

Also, no external memory device from alphabet to laptop seems to have degraded human mental abilities by its introduction. Rather these abilities have been augmented each time by the new tools. Some skills, such as rote memory, become less widely used, but there seems to be no evidence that the capability for them disappears. In many cases machines also take over routine work, freeing individuals to use their skills at higher levels.

The latest interactive, semi-intelligent technologies seem likely to make this possible on an unprecedented scale. They also bring to an end a period of history in which the human brain was constrained by limited technology to operate in a less-than-optimal way, since the brain appears not to be designed to work best in the linear, discrete way promoted by reductionism. The average healthy brain has more than a hundred billion neurons, each communicating with others via thousands of dendrites. The number of potential ways for signals to go in the system is said to be greater than the number of atoms in the universe. In matters as fundamental as recognition it seems that the brain uses some of its massive interconnectedness to call on many different processes at once to deal with events in the outside world, so as quickly to identify a potentially dangerous pattern of inputs.

It is this pattern-recognition capability that might prove to be the most useful attribute of a webbed knowledge system driven by the semi-intelligent interactive systems now being developed. As this book hopes to show, learning to identify the pattern of connections between ideas, people and events is the first step toward understanding the context and relevance of information. So the social implications of webbed knowledge systems are exciting, since they will make it easier for the average citizen to become informed of the relative value of innovation. After all, it is not necessary to understand the mathematics of radioactive decay to make a decision about where to site a nuclear power plant. As I hope you will see, this approach to knowledge may be one way to enfranchise those millions who lack what used to be called formal education and to move us toward more participatory forms of government.

I would not pretend that what follows is more than a first exercise, a number of linked storylines intended to introduce the reader to the kind of information infrastructures we may begin to use in the next few decades. But I hope they will introduce the reader to a new, more relevant way of looking at the world, because in one way or another, we're all connected.

James Burke

London 1999

Copyright © 1999 by London Writers

Chapter 1


This book takes a journey across the vast, interconnected web of knowledge to offer a glimpse of what a learning experience might be like in the twenty-first century once we have solved the problem of information overload.

In the past when technology generated information overload the contemporary reaction was much the same as it is today. On the first appearance of paper in the medieval West, the English bishop Samson of St. Alban's complained that because paper would be cheaper than animal-skin parchment people would use paper to write too many words of too little value, and since paper was not as durable as parchment, paper-based knowledge would in the long run decay and be lost. When the printing press was developed in the fifteenth century it was said that printed books would make reading and writing "the infatuation of people who have no business reading and writing." Samuel Morse's development of the telegraph promised to link places as far apart as Maine and Texas, triggering the reaction: "What have Maine and Texas to say to each other?" The twentieth-century proliferation of television channels has led to concerns about "dumbing-down."

The past perception that new information technologies would have a destabilizing social effect led to the imposition of controls on their use. Only a few ancient Egyptian administrators were permitted to learn the skills of penmanship. Medieval European paper manufacture was strictly licensed. The output of sixteenth-century printing presses was subject to official censorship by both church and state. The new seventeenth-century libraries were not open to the public. Nineteenth-century European telegraphs and telephones came under the control of government ministries.

The problem of past information overload has generally been of concern only to a small number of literate administrators and their semiliterate masters. In contrast, twenty-first-century petabyte laptops and virtually free access to the Internet may bring destabilizing effects of information overload that will operate on a scale and at a rate well beyond anything that has happened before. In the next few decades hundreds of millions of new users will have no experience in searching the immense amount of available data and very little training in what to do with it. Information abundance will stress society in ways for which it has not been prepared and damage centralized social systems designed to function in a nineteenth-century world.

Part of the answer to the problem may be an information-filtering system customized to suit the individual. The most promising of the systems now being developed will guide users through the complex and exciting world of information without their getting lost. This book provides an opportunity for the reader to take a practice run on such a journey. The journey (the book) begins and ends with the invention of the guidance system itself -- the semi-intelligent agent.

There are several types of agent already in existence, acting like personal secretaries in a variety of simple ways: filtering genuine e-mail from spam, running a diary, paying bills and selecting entertainment. In the near future agents will organize and conduct almost every aspect of the individual's life. Above all they will journey across the knowledge webs to retrieve information, then process and present it in ways customized to suit the user. In time they will act on their user's behalf, because they will have learned his or her preferences from the user's daily requirements.

In the search to develop semi-intelligent agents, one of the most promising systems (and the one which starts this journey) may be the neural network. Such a network consists of a number of cells each reacting to signals from a number of other cells that in turn fire their signals in reaction to input from yet other cells. If input signals cause one cell to fire more frequently than others, its input to the next cell in the series will be given greater weighting. Since cells are programmed to react preferentially to input from cells that fire frequently rather than from those that fire rarely, the system "learns" from experience. This is thought to be similar to the way learning operates in the human brain, where the repetition of a signal generated in response to a specific experience can cause enlargement in the brain cell's synapses.
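The weighting rule described here can be sketched in a few lines of code. What follows is a minimal illustration of the idea, not a reconstruction of any actual agent or historical network; the threshold, the learning rate and the input pattern are all assumptions chosen for the example.

```python
# A minimal sketch (illustrative assumptions only) of the weighting rule
# described above: inputs from frequently firing cells are given greater
# weight, so with repetition the receiving cell "learns" the pattern.

THRESHOLD = 1.0       # net input needed for the receiving cell to fire
LEARNING_RATE = 0.05  # how much each repetition strengthens a connection

def fires(weights, inputs):
    """True if the weighted sum of incoming signals crosses the threshold."""
    return sum(w * x for w, x in zip(weights, inputs)) >= THRESHOLD

def reinforce(weights, inputs):
    """Hebbian-style update: strengthen the weight on each active input."""
    return [w + LEARNING_RATE * x for w, x in zip(weights, inputs)]

weights = [0.1, 0.1, 0.1]
pattern = [1, 0, 1]              # the "specific experience," repeated below

print(fires(weights, pattern))   # False: the untrained cell stays silent
for _ in range(10):              # repetition, as with the brain's synapses
    weights = reinforce(weights, pattern)
print(fires(weights, pattern))   # True: repeated input now triggers firing
```

After ten repetitions the weights on the two active inputs have grown from 0.1 to 0.6, enough for their combined signal to cross the threshold: the system has "learned" from experience in exactly the frequency-weighted sense described above.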

The synapse is the part of the cell that releases transmitter chemicals that cross the gap to the next cell. If sufficient chemicals arrive on the other side, they generate an impulse. If enough of these signals are generated in the target cell, they cause its synapses to release chemicals in turn, and "pass the message on." A cell with larger synapses, releasing larger amounts of chemical, is therefore more likely to cause another cell to fire. Networks of such frequently firing cells may constitute the building blocks of memory.

This theory of neuronal interaction was first proposed in 1943 by two American researchers, Walter Pitts and Warren McCulloch. They also suggested that such a feedback process, linking the senses with the brain and muscles, might result in purposive behavior if the interaction caused the muscles to act to reduce the difference between a condition in the real world as perceived by the senses and the condition desired by the brain.
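What Pitts and McCulloch were describing amounts to what engineers would now call a negative-feedback loop. Here is a minimal sketch of that error-reducing cycle, with the gain and the starting values chosen purely for illustration:

```python
# Sketch of the purposive feedback loop described above: the "muscles"
# act on each cycle to reduce the difference between the condition the
# senses report and the condition the brain desires. Values illustrative.

desired = 10.0   # the condition the brain wants
actual = 0.0     # the condition the senses currently perceive
GAIN = 0.5       # fraction of the remaining error corrected per cycle

for step in range(8):
    error = desired - actual    # the sensed difference
    actual += GAIN * error      # "muscular" action shrinks the difference
    print(f"step {step}: actual = {actual:.3f}")
# actual converges on desired: goal-seeking behavior from pure feedback
```

Each pass through the loop halves the remaining error, so the system homes in on its goal without ever being given the route in advance: the purposive behavior Pitts and McCulloch proposed.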

Pitts and McCulloch belonged to a small group of researchers calling itself the "Teleological Society," another of whose members was the man who invented the name for this neural feedback process. He was Norbert Wiener, and he was the first to see the way in which feedback might work in a machine, during his research on antiaircraft artillery systems during World War II. Wiener was a rotund, irascible, cigar-chomping MIT professor of math who prowled what he described as the "frontier areas" between the scientific disciplines. Between biology and engineering, Wiener developed a new discipline to deal with feedback processes. He called the new discipline "cybernetics." Wiener recognized that feedback devices are information-processing systems receiving information and acting upon it. When applied to the brain this new information-oriented view was a fundamental shift away from the entirely biological paradigm that had ruled neurophysiology since Freud, and it was to affect all artificial-intelligence work from then on.

Wiener first applied his feedback theory early in World War II, when he and a young engineer named Julian Bigelow were asked to improve the artillery hit rate. At the beginning of the war the problem facing antiaircraft gunners was that as the speed of targets increased (thanks to advances in engine and airframe technology) it became necessary to be able to fire a shell some distance ahead of a fast-moving target in order to hit it. Automating this process involved a large number of variables: wind, temperature, humidity, gunpowder charge, length of gun barrel, speed and height of target, and many others. Wiener used continuous input from radar tracking systems to establish the recent path of the target and used that path to predict what the target's likely position would be in the immediate future. This information would then be fed to the gun-moving mechanisms so that aiming-off was continually updated.
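The core of the predictor can be caricatured in a few lines of code: estimate the target's velocity from its recent radar fixes and extrapolate its position forward by the shell's time of flight. This is a deliberately crude linear sketch; the wartime system applied far more sophisticated statistical smoothing to the whole tracked path, and every name and number below is invented for the illustration.

```python
# Crude linear sketch of "aiming off": estimate velocity from the two
# most recent radar fixes, then extrapolate the target's position
# forward by the shell's flight time. All values are illustrative.

def predict_aim_point(track, shell_flight_time):
    """track: chronological list of (time, x, y) radar fixes."""
    (t0, x0, y0), (t1, x1, y1) = track[-2], track[-1]
    dt = t1 - t0
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt    # estimated velocity
    return (x1 + vx * shell_flight_time,       # aim ahead of the target,
            y1 + vy * shell_flight_time)       # not where it is now

# A target tracked at one-second intervals, flying steadily east:
fixes = [(0.0, 0.0, 5000.0), (1.0, 150.0, 5000.0), (2.0, 300.0, 5000.0)]
print(predict_aim_point(fixes, shell_flight_time=8.0))  # (1500.0, 5000.0)
```

Feeding the predicted point continuously to the gun-laying mechanism is the feedback step: each new radar fix updates the estimate, so the aim-off is corrected on every cycle.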

The system had its most outstanding successes in 1944, when British and American gunners shot down German flying bombs with fewer than one hundred rounds per hit. This was an extraordinary advance over previous performance, estimated at one hit per twenty-five hundred rounds. In 1944, during the last four weeks of German V-1 missile attacks on England, the success rate improved dramatically. In the first week, 24 percent of targets were destroyed; in the second, 46 percent; in the third, 67 percent; and in the fourth, 79 percent. On the last day on which a large number of V-1s were launched at Britain, 104 of the missiles were detected by early-warning radar, but only four reached London. Antiaircraft artillery destroyed sixty-eight of them.

Early in his work on the artillery project Wiener had frequent discussions with a young physiologist named Arturo Rosenblueth, who was interested in human feedback mechanisms that act to ensure precision in bodily movement. For the previous fifteen years Rosenblueth had worked closely with Walter Cannon, professor of physiology at Harvard. Earlier in the century Cannon had invented the barium meal, which was opaque to X-rays. When ingested by a goose the barium revealed the peristaltic waves that occurred in the bird's stomach when it was hungry. Cannon observed that hunger seemed to precipitate the onset of these waves. He then observed that when a hungry animal was frightened the waves stopped.

This led to Cannon's ground-breaking studies of the physical effects of emotion. He discovered that when an animal was disturbed its sympathetic nervous system secreted into the bloodstream a chemical that Cannon named "sympathin." This chemical counteracted the effects of the disturbance and returned the animal's body systems to a state of balance. Cannon named the balancing process "homeostasis." In 1915 Cannon discovered that the principal body changes effected by the sympathetic system were those involved in fight, sexual activity or flight. In such situations sugar flowed from the liver to provide emergency energy and blood shifted from the abdomen to the heart, lungs and limbs. If the body were wounded, blood clotting occurred more rapidly than usual. In 1932 Cannon published a full-scale account of his research titled The Wisdom of the Body.

What had initially triggered Cannon's interest in homeostatic mechanisms was the work of the man to whom Cannon dedicated the French edition of his book. He was an unprepossessing but eminent French physiologist named Claude Bernard, who had started his working life as a pharmacist's assistant in Beaujolais, where his father owned a small vineyard. After being forced to give up his early schooling for lack of funds, Bernard took up writing plays. He produced first a comedy and then a five-act play, which he took to Paris in 1834 with the intention of making a career in the theater. Fortunately for the future health of humankind Bernard was introduced to an eminent theatrical critic, Saint-Marc Girardin, who read the play and advised Bernard to take up medicine.

At first Bernard planned to be a surgeon, but becoming dissatisfied with the general lack of physiological data he began to gather his own data by experimenting on animals. By 1839 his dexterity in dissection had brought him to the attention of the great physiologist François Magendie, who appointed him as assistant. One winter morning in 1846, some rabbits were brought to Magendie's lab for dissection and Bernard noticed that their urine was clear and acidic. As every nineteenth-century French winemaker knew, the urine of rabbits is usually turbid and alkaline. Bernard realized that the rabbits had not been fed and theorized that since the urine of carnivores is clear, the hungry, herbivorous rabbits must have been living on their fat. When he fed grass to the rabbits their urine returned to its normal alkaline turbidity. He double-checked with an experiment on himself. After twenty-four hours subsisting only on potatoes, cauliflower, carrots, green peas, salad, and fruit, Bernard's own urine went turbid and alkaline. Bernard then starved the rabbits, fed them boiled beef and dissected them to find out what had happened. He saw a milklike substance (he took it to be emulsified fat) that had formed at the point where the rabbit's pancreatic juice was pouring into the stomach: there was clearly some link between the juice and the emulsification of the fats.

Two years later he discovered the glycogenic function of the liver, which injects glucose into the blood. It was this discovery that led to Bernard's greatest contribution to the sum of human knowledge, because he saw that the function of the liver and the pancreas (and perhaps other systems, too) was to maintain the body's equilibrium. He summed up his research: "All the vital mechanisms, however varied they may be, have only one object, that of preserving constant the conditions of life in the inner environment." Follow-up research on the pancreas led an English researcher, William Bayliss, to coin the phrase that Cannon would use as his book title: "the wisdom of the body."

Not everybody was happy with Bernard's work, especially when he designed an oven in which to cook animals alive. An American doctor, Francis Donaldson, who attended Bernard's lectures in 1851, wrote: "It was curious to see walking about the amphitheater of the College of France dogs and rabbits, unconscious contributors to science, with five or six orifices in their bodies from which at a moment's warning, there could be produced any secretion of the body, including that of the several salivary glands, the stomach, the liver, and the pancreas."

Bernard was well aware of public opposition to vivisection but defended it: "The science of life is like a superb salon resplendent with light which one can enter only through a long and ghastly kitchen." Alas, Bernard's wife was unable to take the heat. After leaving him in 1869 she went in search of the antivivisection activists to whom she had been sending regular contributions.

She did not have far to go. In Paris a fanatical young vegetarian Englishwoman named Anna Kingsford, the owner of the Lady's Own Paper, had come to France to study medicine. Kingsford became well-known at the medical school for refusing to let her professors vivisect during the lessons she attended and for demonstrating against the practice. Kingsford's lecture halls were close to Bernard's labs, and she became so obsessed by his work that she set about directing all her energies toward killing him with thought waves. Bernard died only a few weeks after she had begun to concentrate her mental energies on him, convincing her that she had been the instrument of divine will. Kingsford also claimed to have been responsible for the death of another vivisector, Paul Bert. However, her efforts to do the same to Louis Pasteur failed.

Legislation to protect animals from ill treatment took a long time to reach the statute books, even in England, where the first such laws were passed. In 1800 the first bill to outlaw bull-baiting had ignominiously failed in its passage through the Houses of Parliament, opposed by George Canning (later prime minister), who claimed that bull-baiting "inspired courage and produced a nobleness of sentiment and elevation of mind....Putting a stop to bull-baiting was legislating against the spirit and genius of almost every country and age." However, in 1821 Dick Martin, MP for Galway, forced through a bill to protect horses and cattle against ill treatment. It was the first law of its kind in any country. In 1824 the Society for the Prevention of Cruelty to Animals was formed at the unfortunately named Old Slaughter Coffee House in London. The publication in 1859 of Darwin's Origin of Species seemed to strengthen the relationship between humans and animals and support the animal-defense argument. In 1876 the Victoria Street Society against Vivisection was formed with Lord Shaftesbury as chairman. The same year a bill was passed to prevent the vivisection of dogs, cats, mules, horses and asses. By the late nineteenth century the animal-defense movement had spread throughout the Western world and given birth to hundreds of local groups known as Humane Societies, in spite of the fact that the name more properly belonged to earlier humanitarian work of an entirely different nature.

The Royal Humane Society was founded in London in 1774 largely as the result of the efforts of Dr. William Hawes to promote knowledge of artificial-respiration techniques. Hawes based his ideas on the translation of a paper by the Amsterdam Society for the Recovery of the Apparently Drowned. The society had been founded in 1767 after several cases of successful resuscitation had been reported in Switzerland. In the nineteenth century interest in drowning became acute with the spectacular increase in cargo tonnage and passenger traffic on the high seas following the spread of industrialization. As the number of ships rose so did the number of shipwrecks and deaths.

From time to time the Royal Humane Society awarded a gold medallion for outstanding feats of bravery, and in 1838 the recipient hit the front pages because she was a slightly built, twenty-two-year-old woman. On the night of February 6, a paddle steamer, the Forfarshire, battling through a gale en route from Hull to Dundee with a full cargo and sixty-three passengers, sprang a leak in her boiler. The captain decided to take shelter among the Farne Islands off the coast of Northumberland. During this maneuver the ship hit the rocks and broke in two, and all but thirteen passengers and crew were drowned. The survivors, exposed to the full force of the storm, included a mother and two children. Overnight the two children and an adult died. At five o'clock the next morning Grace Darling, daughter of the local lighthouse keeper, caught sight of the wreck and the survivors clinging to the rocks. Grace and her father rowed to the rescue, struggling through mountainous sea in a small open boat. The drama was reported in the newspapers and Grace became an instant national hero. Alas, she was to die four years later from tuberculosis. Meantime she had inspired the public to offer massive financial and political support for the eventual establishment of the Royal National Lifeboat Institution, in 1854.

That same year came another highly publicized loss at sea. The USS San Francisco, an American troopship carrying hundreds of soldiers, foundered in an Atlantic hurricane. The secretary of the navy sent for Matthew Maury, the only man in America who would be able to tell where to look for survivors. After studying his wind and current charts, Maury pinpointed the spot and the survivors were found in the water.

Maury was the fourth son of a Huguenot-English family long settled in Virginia (his grandfather had taught Thomas Jefferson), and he had joined the U.S. Navy in 1825. It was during a voyage to South America that Maury became interested in finding faster ways to cross the ocean. On his return in 1834 he took leave and wrote his first work on navigation. In 1839 Maury published a series of articles in the Southern Literary Messenger, one of which advocated the establishment of a naval school. It would become the U.S. Naval Academy at Annapolis.

In 1847 Maury issued the first of several charts and then, in 1851, Explanations and Sailing Directions to Accompany the Wind and Current Charts. At the instigation of the U.S. government, copies of the charts and Sailing Directions were distributed free to all masters of vessels on the understanding that they would keep a full log of journeys and forward these logs to Maury, in Washington. Logs were to include temperature of air and water, direction of wind and currents, and air pressure. Captains were also required to throw overboard (at given intervals) a bottle containing a piece of paper carrying the ship's position and the date. They were also to pick up any such bottles they came across and note all details in their logs. In return for these services masters would receive free copies of Maury's further work. Over eight years Maury collected and processed data on many millions of observations, as a result of which he was able to identify faster sailing routes. One ship's master following Maury's suggested route from New York to Rio de Janeiro halved the usual journey time. It was reckoned that Maury's "Path-of-Minimum-Time" routes saved American shipping forty million dollars a year.

In 1853 Maury crowned his career when he persuaded sixteen countries (among them the United States, Britain, Belgium, Holland, Russia, France, Norway, Denmark and Portugal) to meet in Brussels for the first International Meteorological Congress "to plan an uniform system of meteorological observation at sea, and to agree a plan for the observation of the winds and currents of the oceans with a view to improving navigation and to enrich our knowledge of the laws which govern those elements." Not long after he had returned from Brussels, Maury received a letter from a retired paper-manufacturing millionaire named Cyrus W. Field, who was seeking advice on the ideal route for a transatlantic submarine telegraph cable.

Submarine cables had already been laid successfully in the relatively shallow waters between England and Holland, Scotland and Ireland, but the Atlantic represented a formidable challenge. Field had managed to get a favorable charter from the British government for a fifty-year monopoly on any cable laid between Newfoundland and Ireland. The British also offered to provide a cable-laying ship as well as a generous advance on income from telegraph messages. Field then spent two years laying a cable between Newfoundland and the North American mainland (stockholders in the company included such luminaries as Lady Byron and Thackeray). When the link was completed Field wrote to Maury to solicit his views on the best route out of Newfoundland toward Europe.

Maury reported that soundings revealed a shallow "telegraph plateau" across much of the North Atlantic, and in 1857 work began on laying the cable. After a few hundred miles had been laid the cable snapped. Three more attempts were made and on August 5, 1858, 1,850 miles of copper wire connected Valencia, Ireland, with Trinity Bay, Newfoundland; and traffic began with an inaugural message from Queen Victoria to President Buchanan. At the celebration dinner in New York Field said modestly: "Maury furnished the brains, England gave the money, and I did the work." Then the cable failed again. In 1865 they found the parted ends, spliced them and the work was done. The U.S. Congress voted Field a gold medal.

Field had also written to the man whose work had inspired the whole venture: Samuel Morse, inventor of the most successful form of telegraph. Morse's advantages over other telegraphers were his key and the Morse Code, which he demonstrated before Congress in 1844. The idea had come to him in the autumn of 1832 during a voyage back to the United States from France. Morse first learned what he needed to know about the principles of electricity and one of his friends, Alfred Vail, provided the finance and hardware (Vail's father had a machine shop in New Jersey). Vail also suggested what would later become known as the Morse Code.

At this time Morse was a well-known artist, professor of art at New York University, and had just spent three years in Europe studying and painting. Morse was a strange man given to apocalyptic patriotic views. He had been brought up as a strict Calvinist by his father Jedidiah, America's foremost geography scholar, who had earlier led the Old Calvinist "Great Awakening" crusade against liberal theology. Like his father, Morse looked forward to the triumph of American culture and believed that only an elite could lead the country to salvation. Morse was also extremely xenophobic. At one point he painted a picture of the pope conspiring to arm American Catholics, provoke disorder, rig elections and elect foreigners to public office. Morse also helped publish a book about Maria Monk, a woman who claimed to have been a nun in Montreal, where she also claimed to have witnessed unnatural sexual acts performed by clergy and to have seen crypts filled with the corpses of illegitimate children. In the end it was revealed that Monk (rumored to have had a romantic affair with Morse) had escaped from a mental institution.

Morse believed that art was a tool placed in his hands by God to be used to save Protestant America. He believed that the millennium was imminent, and that when it came America would carry the empire of peace to the world. It was therefore essential to prepare American art for the great day. Morse founded the National Academy of Arts and Design in 1826 and was its president until 1845. The aim of the academy was to foster American artistic talent so that American genius could take its rightful place in the world and inculcate true Protestant virtues in other Americans.

In 1829 Morse decided to visit Europe to study artistic masterworks in preparation for what he hoped would be his greatest triumph, the commission to paint the four remaining murals for the Rotunda of the Capitol Building in Washington, D.C. To this end, while in Paris in 1831, he painted the giant Gallery of the Louvre. The painting reproduced in miniature thirty-eight Louvre masterpieces. Morse's aim was to show that while the classical past was worthy of study it should not be the subject of slavish emulation by American artists, who, like the artist shown in the Louvre painting (Morse himself), could learn from the Old Masters and then develop their own distinctively American style. On his return the Louvre painting was put on exhibition in New York and was a disastrous flop. The commission for the Rotunda murals went to other artists. Morse turned to the telegraph as an alternative tool with which to make Protestant America great. Communications technology would be an instrument of Divine Will, redeeming America by transmitting messages of peace and love. At his demonstration in Congress in 1844 Morse's first transmitted message echoed these beliefs: "What hath God wrought!"

Morse had learned his art at the feet of Washington Allston, the most completely Romantic American painter, whom he had met in Boston in 1810 and with whom he became lifelong friends. Only a year after their meeting Allston inspired Morse to attempt the first of his grand historical American scenes, Landing of the Pilgrims at Plymouth. That same year Morse joined Allston and his wife on his first trip to Europe. Allston was a good-looking Harvard-educated gentleman from South Carolina who on the death of his stepfather in 1801 had sold the family property to finance a career in painting. On his earlier visit to London Allston had studied with Benjamin West, the president of the Royal Academy, and then in 1804 moved on via Paris to Rome. There he met Washington Irving, who later wrote: "I do not think I have ever been more completely captivated on a first acquaintance. He was of a light and graceful form, with large blue eyes, and black, silken hair waving and curling around a pale, expressive countenance. A young man's intimacy took place immediately between us, and we were much together during my brief sojourn at Rome....We visited together some of the finest collections of paintings, and he taught me how to visit them to the most advantage, guiding me always to the masterpieces, and passing by the others without notice." Allston's Italian Landscape shows the profound effect of Italy on his work. His fresh New England eye was overwhelmed by the light, the color, the ancient ruins, the landscape dotted with hilltop villages, the rich mingling of Renaissance, medieval and classical architecture and the pastoral nature of Italian peasant life.

In 1805 Allston met and painted the English Romantic poet Samuel Taylor Coleridge, whom he would later recognize as his greatest intellectual mentor. At the time of their meeting Coleridge was suffering the aftereffects of his failure to give up opium. Aged thirty-three, with "Kubla Khan" and the "Ancient Mariner" poems behind him, Coleridge was already famous. He was also an alcoholic, deeply in debt, unhappily married with three children, had failed in a venture to set up a utopian settlement on the banks of the Susquehanna in Pennsylvania, and was extremely hypochondriac (he coined the word "psychosomatic").

It was partly to try to wean himself from opium (and his penchant for taking it in brandy), and partly to get away from his wife, that in 1804 Coleridge had run away to Malta. There, thanks to an influential acquaintance, he landed the job of secretary to the British civil commissioner, Alexander Ball. The post also included free food and accommodation in the commissioner's palace in Valetta, the island's capital. Coleridge's workload was light and consisted principally of rewriting Ball's dispatches to London. Although Coleridge complained incessantly about his health, the nightmare of withdrawal symptoms, the dull company and his inability to write new poems, he enjoyed the climate and the countryside and managed to produce some of his best prose. He also began to feel the first stirrings of mortality: "I had felt the Truth; but never saw it before clearly; it came upon me at Malta, under the melancholy dreadful feeling of finding myself to be Man, by a distinct division from Boyhood, Youth, 'Young Man.' Dreadful was the feeling -- before that, life had flown on so that I had always been a Boy, as it were -- and this sensation had blended in all my conduct." When his friends William and Mary Wordsworth saw him on his return to England, they were to remark that he had changed for the worse.

Coleridge saw from the dispatches he was editing that he had arrived in Malta at a critical time. Ball was arguing the strategic importance of the island now that Napoleon had given up Louisiana, lost Santo Domingo and would inevitably turn his attention to the Mediterranean. Ball also suggested to the British government that Algiers, Tunis and Tripoli were ripe for colonization and "they are capable of growing all Our colonial produce." He also argued that though both Russia and France wanted Malta they should not be allowed to take it. At the time, the island was a hotbed of intrigue. The Maltese were agitating for independence, Russian and French spies were imagined to be everywhere and there was an American naval squadron on station commanded by Commodore Edward Preble. Among Preble's officers was the young Stephen Decatur, leader of the daring and successful 1804 raid on Tripoli harbor to destroy the American frigate Philadelphia, which had run aground and been captured during the American-Tripolitanian War. During a brief trip to Sicily Coleridge met and dined with both intrepid Americans, and for years afterward regaled friends with tales of their exploits.

Coleridge's employer, Rear Admiral Alexander Ball, had joined the British Navy at the age of twelve. This event, he told Coleridge, had been inspired by reading Robinson Crusoe. Ball had the air more of an academic than a sailor, bookish and thoughtful. After serving in the Caribbean, America and Newfoundland, in 1783 he took a year off and went to France to study the language. At one point there, during a visit to St. Omer, he met another young captain with whom his fate was to be bound up, in spite of the fact that on this occasion each expected the other to make the required formal call, so neither did so. Ball then served in the English Channel, went again to Newfoundland, was stationed off the French coast and in 1798 was posted to the Mediterranean, where he was to meet the young captain with whom he had failed to exchange courtesies in St. Omer. At this time Britain was expecting to be invaded by Napoleon, and much of the British fleet was patrolling outside French harbors in the English Channel and on the French Atlantic coast. Hearing a rumor that Napoleon was assembling a Mediterranean fleet in Toulon, the British also sent a fleet to blockade that port.

In April the Toulon blockade fleet flee

Excerpted from The Knowledge Web: From Electronic Agents to Stonehenge and Back--And Other Journeys Through Knowledge by James Burke
All rights reserved by the original copyright owners. Excerpts are provided for display purposes only and may not be reproduced, reprinted or distributed without the written permission of the publisher.
