Inventing the Future – Encyclopaedia Britannica

EB First Edition

         Replica of 1st Edition Encyclopaedia Britannica 1768

Only an Artifact of the Scottish Enlightenment?

Who would have guessed that at the end of the 20th Century it would be a company founded in Scotland in 1768 that would invent a key part of the mechanics that would let people intuitively navigate the electronic flood of text, sound and images soon to drench the planet from the internet?

In 1989, 221 years after the company’s founding in Edinburgh during the Scottish Enlightenment, Chicago-based Encyclopaedia Britannica, Inc., publisher of the eponymous Encyclopædia Britannica reference work, had not only solved this puzzle for the first time but had also filed for a patent on the solution. While it may be incongruous that a legacy print reference publisher would be the party to make the discovery, this is exactly what happened.

Patents on inventions today normally have a revenue-producing life of 20 years. The patents Britannica filed for in 1989 were issued by the U.S. Patent and Trademark Office in 1993. They were immediately controversial, and software industry opposition prompted the Commissioner of Patents to order a reexamination by the Patent Office. Following that reexamination, the Office cancelled the patent a year after it had issued. After more years of litigation by Britannica, another court finally reversed the Patent Office, and in 2002 the patent was reissued. It was then up to Britannica to enforce the patent against infringers. The family of Compton’s Patents was unusual not only in its long and controversial history, but also in that it never earned a nickel. In 2011, the U.S. Court of Appeals for the Federal Circuit found that there had been a technical and procedural error in the original filing papers, and in 2015, after years of lawsuits in multiple venues, the patents were finally held to have been improperly issued.

The technical defects meant that a court never reached a detailed ruling on whether then-commonplace GPS navigation systems infringed the patents covering Britannica’s invention. When Britannica later sued its outside patent law firm for legal malpractice over the technical error, another court denied the claim, reasoning that if the patent should never have been issued by the Patent Office in the first place, Britannica could not have been harmed by the law firm’s mistake.

Even though Encyclopaedia Britannica never benefited financially from the extraordinary human/machine interface it had been the first to build, it had reason to be proud of its fundamental achievement. The public filing of its patent application provided the roadmap for others to follow in quickly developing many other complex software applications besides encyclopedias. The Britannica human/machine interface provided, for the first time, seamless navigational paths into and through complex databases of mixed media including text, graphics, maps, videos and audio elements. When it was developed, the goal had been to have even a nine-year-old master the navigation. Of course, today some four-year-old children play with computers in a way unthinkable in 1989, when the Compton’s Patent application was filed.

Britannica’s landmark invention had partly to do with the evolution of the personal computer in the mid-1980s. But it also had to do with a small group of encyclopedists who had been struggling for many years before to define what an electronic encyclopedia would look like. The culmination of their work happened to coincide with the coming of age of the personal computer in the nascent consumer market. This was the secret sauce that made the breakthrough in the human/machine interface possible.

This fortuitous combination produced a remarkable cultural result. It meant that for the first time, children, as well as adults, could easily and quickly access and navigate complex and media-rich stores of digital information. It also created a plumbing roadmap for the software design that in later years would prove essential in making user friendly such diverse applications as automobile GPS navigation systems and websites on the internet.

Four pioneers in the development of computer interfaces stand out: Vannevar Bush, Ted Nelson, Douglas Engelbart, and Alan Kay. Each made exceptional contributions to the developing field of how humans interact with machines, and each helped set the stage for Encyclopaedia Britannica’s unique invention in the 1980s. Two of the four, Bush and Kay, even directly applied their thinking to the problem of building an electronic encyclopedia.

Vannevar Bush

The scientist with the most penetrating early vision of the machine’s potential role in helping us easily access the growing storehouse of human knowledge was Vannevar Bush. After he received a joint doctorate in electrical engineering from the Massachusetts Institute of Technology and Harvard in 1916, Bush showed a bent for military applications by inventing a submarine detection device during World War I. Then in the 1920s at MIT, he began to design and build analog computers. These early machines used voltage variances to reflect different numeric values.

These machines were the precursors to today’s digital computers, which use the binary language of zeros and ones to represent data. In 1928, Bush was issued a pioneering patent for one of his computers, and by 1935 his Rockefeller Differential Analyzer was the most powerful computer of its day. It was quickly put to the task of solving problems associated with the development of long-distance power lines. Then, in World War II, it was turned to the task of producing artillery ballistics tables to assist the military.

At the beginning of World War II, Bush made recommendations to President Franklin Roosevelt about how to organize scientific research to keep the military abreast of new technologies. Then, during the war, Bush headed the federal government’s Office of Scientific Research and Development. It has been said that radar (from the acronym for “radio detection and ranging”) won the war, and the atomic bomb ended it. Bush and his Office had played a crucial role in both developments.

Towards the end of the war, Bush gave considerable thought to the potential application of computers to peacetime requirements and their likely evolution in the post-war era. He came to believe computers could play an important peacetime role in managing the increasing store of humanity’s accumulated knowledge.

The Atlantic Monthly’s As We May Think Article 1945

In a now-landmark article published in the Atlantic Monthly in July 1945, entitled “As We May Think,” Bush laid out a vision of a world in which computers would be central to our social and business life. The article remains to this day stunning in the accuracy of its perceptions regarding the likely evolution of computing. In an introduction describing the thrust of Bush’s article, the Atlantic Monthly’s editor wrote, “Now, says Doctor Bush, instruments are at hand which, if properly developed, will give men access to and command over the inherited knowledge of the ages.” No small step for Mankind, that.

In the article, Bush looked at recent advances, such as the cathode ray tube, dry photography and microphotography and pondered how logical extensions of these technologies might be applied to create a future miniaturized Encyclopædia Britannica:

The Encyclopædia Britannica could be reduced to the volume of a matchbox. A library of a million volumes could be compressed into one end of a desk. If the human race has produced since the invention of movable type a total record, in the form of magazines, newspapers, books, tracts, advertising blurbs, correspondence having a volume corresponding to a billion books, the whole affair, assembled and compressed, could be lugged off in a moving van.

Although Bush thought in terms of microfilm rather than magnetic drives, optical discs or silicon wafers for data storage, he conjured up a likely playback machine for a high-capacity storage medium that closely resembles the personal computer of today.

Bush called it a Memex and described it this way:

Consider a future device for individual use, which is a sort of mechanized private file and library. It needs a name, and, to coin one at random, ‘memex’, will do. A memex is a device in which an individual stores all his books, records, and communications, and which is mechanized so that it may be consulted with exceeding speed and flexibility. It is an enlarged intimate supplement to his memory. … On the top are slanting translucent screens, on which material can be projected for convenient reading. There is a keyboard and sets of buttons and levers. Otherwise, it looks like an ordinary desk. In one end is the stored material . . . Wholly new forms of encyclopedias will appear ready made with a mesh of associative trails running through them, ready to be dropped into the memex and there amplified.

Bush did a prescient job of describing in 1945 what a personal computer of the future might look like, particularly given the era’s required reliance on vacuum tubes, which greatly limited the computers of the time. Though the transistor was invented in 1947 by physicists at Bell Telephone Laboratories, the shift from slow, heat-producing vacuum tubes that often burned out to cooler, more powerful and reliable transistors did not unfold overnight.

For instance, when the March 1949 issue of Popular Mechanics surveyed the then state-of-the-art ENIAC computer (from “Electronic Numerical Integrator And Computer”), the potential impact of the transistor, let alone the microprocessor chip, was entirely missing:

Where a calculator on the ENIAC is equipped with 18,000 vacuum tubes and weighs 30 tons, computers in the future may have only 1,000 vacuum tubes and perhaps weigh 1.5 tons.

Ted Nelson – Hypertext Envisioned and Pursued

 Another element of the information management challenge that Bush understood was the fact that quickly finding information through data compression and advanced displays didn’t solve the need to move with ease from one type of pertinent information to different, but related, information. He recognized that there remained a need for a human/machine interface that more realistically mirrored the way people thought.

And, so, in one final burst of creative insight, Vannevar Bush dreamed up what we call today “hypertext” or “hyperlinking.” That’s the highlighted text or interactive graphic on a computer screen that, when clicked upon with a mouse, takes the user to related information stored in a different location.

Bush saw that static indices were an imperfect way to search for and access information, and that what was needed was a more direct way of moving from one thought to a related one. He understood that a major limitation in quickly accessing desired information was the absence of ways to access that information associatively. In short, he saw the need for a random-access mechanism that would also provide quick connections to related information in different locations: hyperlinks, as we now refer to them. As Bush put it:

Mere compression, of course, is not enough; one needs not only to make and store a record but also be able to consult it. Our ineptitude in getting at the record is largely caused by the artificiality of systems of indexing. When data of any sort are placed in storage, they are filed alphabetically or numerically, and information is found (when it is) by tracing it down from subclass to subclass. It can be in only one place, unless duplicates are used; one has to have rules as to which path will locate it, and the rules are cumbersome. Having found one item, moreover, one has to emerge from the system and re-enter on a new path. The human mind does not work that way. It operates by association. With one item in its grasp, it snaps instantly to the next that is suggested by the association of thoughts, in accordance with some intricate web of trails carried by the cells of the brain. Selection by association, rather than indexing, may yet be mechanized.
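Bush’s “selection by association” maps naturally onto the data structure behind every modern hyperlink system: a graph of documents in which each node points directly at related nodes, so the reader jumps straight from one item to the next without re-entering an index. A minimal sketch in Python (all class and variable names here are illustrative, not drawn from any real system):

```python
# A minimal sketch of Bush's "selection by association": documents as
# nodes in a graph, each with direct trails (hyperlinks) to related nodes.
# All names are hypothetical, for illustration only.

class Document:
    def __init__(self, title, text):
        self.title = title
        self.text = text
        self.links = {}          # anchor phrase -> target Document

    def link(self, anchor, target):
        """Create an associative trail from an anchor phrase to another document."""
        self.links[anchor] = target

    def follow(self, anchor):
        """Jump straight to the related document, with no index lookup."""
        return self.links[anchor]

# Build a tiny "mesh of associative trails".
memex = Document("Memex", "A device in which an individual stores all his books...")
trails = Document("Trails", "Associative trails link items directly, as the mind does.")
memex.link("associative trails", trails)

# Following the link lands directly on the related material.
assert memex.follow("associative trails") is trails
```

Contrast this with the alphabetical filing Bush criticized: there, finding a second item means leaving the first and tracing down from subclass to subclass again, whereas here each document carries its own trails outward.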

While Ted Nelson’s software concept, named Project Xanadu, was never reduced to practice notwithstanding decades of fitful development, researchers today look back on Nelson’s ideas about hypertext as influential in how people thought about computer interface concepts and the potentially revolutionary nature of hyperlinks.

Showing Changing Drafts of the Declaration of Independence

Ted Nelson’s parents were Hollywood royalty. His father, Ralph Nelson, directed the 1963 movie Lilies of the Field, which led to Sidney Poitier winning the Academy Award for Best Actor. His mother was actress Celeste Holm, who was nominated for her performance in the 1950 movie All About Eve. With a BA in philosophy from Swarthmore College, a small liberal arts school in Pennsylvania founded by Quakers, Nelson started graduate school in sociology in 1959 at the University of Chicago. Moving on to Harvard, he received his MA in 1962.

It was at Harvard that he began to work on a “writing system” that would let people store what they had written, change it, and print it out. His concept included being able to see alterations in a side-by-side format that would also retain the train of changes. As Project Xanadu evolved through the decades of the unsuccessful effort to produce a useful and commercial software product, hints of what could be in store were evident, but never made workable.

Nelson used the term “hypertext” in several papers he published in 1965. Though the code that would make the Xanadu dream come true could never be written, the search went on for a workable way to usefully connect non-sequential text. Nelson published his ideas in a paper submitted to the Association for Computing Machinery in 1965 and developed them further in his books Computer Lib/Dream Machines (1974) and Literary Machines (1981).

In the 1950s and 1960s, the utility foreseen in the musings of Bush and Nelson was still far away, given the state of computer development at the time. This was the era of big iron, as IBM and other mainframe computers were known. Even with their growing power and scale, they could not yet manage the facile integration of images with text, let alone couple them with sound and video. While continuing to develop, the speed and processing power of big iron’s central processing units remained limited in their utility.

On the storage end of things, magnetic drum memory devices had come to market in 1950. They functioned by storing information on the outside of a rotating cylinder coated with ferromagnetic material, circled by read and write heads that remained in a fixed position.

Douglas Engelbart – The Mouse and Graphical User Interface

Douglas Engelbart was born in Portland, Oregon, in 1925. When he died in 2013, Ted Nelson gave an impassioned eulogy at his memorial service. You get a good view of Nelson’s charismatic personality as he rails against the forces he believes held himself and Engelbart back during their lives.

Engelbart had been drafted into the Navy in World War II, where he served as a radar technician. Perhaps his familiarity with cathode ray tubes prepared him for the role he was to play later in the evolution of the visually centric human/computer interface. While awaiting discharge in the Philippines at the end of the war, he read Bush’s article, “As We May Think.” As it turned out, Bush’s precepts remained at the center of Engelbart’s later career in computer science. When he got home, he pursued an education in electrical engineering, receiving a B.S. from Oregon State University in 1948 and a Ph.D. from the University of California, Berkeley, in 1955.

After 1957, when the Soviet Union launched Sputnik, the first earth-orbiting satellite, the U.S. government, through the Department of Defense’s Advanced Research Projects Agency (ARPA) and the Air Force Office of Scientific Research, made funds available to further research in computer science. Engelbart had joined a group at the Stanford Research Institute (SRI) in Menlo Park, California, and in 1962, under a contract with the Air Force Office of Scientific Research, wrote a seminal paper building on Vannevar Bush’s earlier concepts. In the paper, “Augmenting Human Intellect: A Conceptual Framework,” he sketched out the basis of his advanced thinking on the development of a human/machine interface.

The paper cites Bush’s Memex as important in thinking about next steps: not in building a better computer, but in building a better way for humans to interact with machines, leveraging the unique powers of human intellect so they can be efficiently applied to analyzing the vastly increasing body of mankind’s knowledge. Engelbart writes in the paper:

The Memex adds a factor of speed and convenience to ordinary filing-system (symbol-structuring) processes that would encourage new methods of work by the user, and it also adds speed and convenience for processes not generally used before. Making it easy to establish and follow the associative trails makes practical a new symbol-structuring process whose use can make a significant difference in the concept structuring and basic methods of work.

It is also probable that clever usage of associative-trail manipulation can augment the human’s process structuring and executing capabilities so that he could successfully make use of even more powerful symbol-structure manipulation processes utilizing The Memex capabilities. An example of this general sort of thing was given by Bush, where he points out that the file index can be called to view at the push of a button, which implicitly provides greater capability to work within more sophisticated and complex indexing systems.

Later in the 1960s, Engelbart and his colleagues at SRI, particularly William K. English and John F. Rulifson, created what they called the “Online System (NLS).” They also developed a graphical user interface (GUI, pronounced “gooey”) to facilitate operating it.

In the 1960s, in corporate America, universities and the government, “big iron” IBM mainframe computers ruled. Input into computers was still done largely through punch cards. Output was typically paper as well. Standard computer output to a visual device was still a printout.

These machines were not for ordinary folk, as they were almost entirely devoted to a triad of commercial, scientific, and number-crunching users. It was quite a departure for Engelbart and his band of software engineers to focus on a highly visual interface, one that even lay people might master. Their unique approach to GUIs and computing led to the development of basic tools such as the mouse, hypertext linking, and word processing in a windows environment.

On December 9, 1968, Engelbart demonstrated his NLS at the Fall Joint Computer Conference in San Francisco. Those who witnessed his use of a keyboard, display screen and mouse knew they were present at an unusual moment. It’s not surprising that footage from this event was later put on display at the Smithsonian Museum’s exhibit on the Information Age. The combination of the mouse as a tool to interact with the display screen was a giant home run for those present and for the generations of computer users to follow.

Alan Kay


The Department of Defense’s Advanced Research Projects Agency funding of SRI’s work dried up in the early 1970s. When Engelbart’s Stanford Research Institute activity center closed in 1977, a number of its computer researchers moved on to Xerox Corporation’s Palo Alto Research Center (PARC) to carry on their work on human/computer interfaces.

PARC researchers, including notably Alan Kay, continued to focus on marrying graphics and animation to computer systems. They also thought about simpler interfaces that even children could interact with. Pertinent to Britannica, Kay would also focus later on the likely nature of an electronic encyclopedia.

Kay’s early education had had a lot to do with computers. After a tour in the Air Force working on IBM computers, Kay had enrolled at the University of Colorado, receiving his undergraduate degree in mathematics and molecular biology in 1966. In 1969, he received his PhD in computer science from the University of Utah. His thesis was about graphical object orientation. After teaching two years at the Stanford Artificial Intelligence Laboratory, Kay moved on to PARC, where he focused on bitmap displays, windowing, and the point-click-and-drag user interface.

When Steve Jobs and his colleagues at Apple visited PARC in 1979, they saw the future of computing in what Kay and his colleagues had been working on. Apple’s later unique graphical user interface reflected PARC’s cutting edge approach to interface design. Not surprisingly, Kay later served Apple directly as a research Fellow, before serving in a similar capacity for The Walt Disney Company, and, beginning in November 2002, for Hewlett-Packard. Through the work of Nelson, Engelbart, Kay and many others, Bush’s early ideas about advances in computing technology evolved and, by the early 1980s, computing machines had begun to enter the consumer mainstream.

However, the prevailing operating system displays of the day were still arid and text-centric. There were no high-resolution or color displays. Also missing was the much larger local storage capacity required to play the game of dynamic knowledge management.

As a result, dreaming up a theoretical machine with an interface for ordinary folk and filled with programs rich in data and loaded with computer-based hyperlinks remained much easier to do than actually building one. Many such as Ted Nelson and Alan Kay had begun to think the interface side of things through. Kay in particular gave extended thought to construction of a complex encyclopedic database.

However, the stage had been set for the big breakthrough: computers built for the consumer market. The old campaign promise of “a chicken in every pot” became “a computer in every home” for Apple and IBM in the 1980s. Steve Jobs of Apple made the cover of Time magazine in early 1982, and Time then named the computer its “Machine of the Year” for that year. This was testament to the fact that the computer was finally moving out of its prior confines of big government, big business, and big universities and into the home.

The Proper Study of Mankind

Although a print publisher throughout its long life, Encyclopaedia Britannica had been following these computer developments closely. When the first CD-ROM (for Compact Disc-Read Only Memory) storage discs came out in 1985, Britannica had just put the finishing touches on its massive, multi-decade rewriting of its 1928 14th Edition. The 15th Edition had originally been published in 1974 as a 30-volume set, and it was structurally rounded out in 1985 with the addition of a separate, two-volume index.

This redesign of the Encyclopaedia Britannica in the several decades before the CD-ROM-based Compton’s Encyclopedia launch was a critical precursor to EB’s invention. The Britannica multimedia search system patent would not have been possible without the specialized learning that grew out of the computer-assisted design of the 15th Edition print set. When the Compton’s Patent was reissued by the Patent Office in 2002 after a lengthy reexamination, the stage was set for Britannica to exploit its achievement monetarily.

English poet Alexander Pope began the second epistle of his 1733 work An Essay on Man with this couplet:

Know then thyself, presume not God to scan;
The proper study of Mankind is Man.

His reference to our genome-embedded drive to understand ourselves and catalog our knowledge is symbolized and given tangible shape by the encyclopedic form. The long, continuous history of the encyclopedia in our civilization is evidence that our collective need for self-examination is hard-wired into our brains.

Thus, the presence of a reference publisher at the center of a critical human/machine interface development in the 1980s was not entirely an accident. It stemmed in part from the very nature of encyclopedias in modern society.

The word “encyclopedia” comes from the Greek words enkyklios, meaning general, and paideia, meaning education. The effort to create a system of knowledge or circle of learning in the form of an “encyclopedia” spanning humankind’s knowledge has been with us for over 2,000 years, although it hasn’t always been called this. Speusippus, who died in 339 BC, recorded his uncle Plato’s thinking on natural history, mathematics, and philosophy. Speusippus also apparently attempted to record detailed descriptions of different species of plants and animals.

However, it was Denis Diderot’s Encyclopédie, ou dictionnaire raisonné des sciences, des arts et des métiers, first published in 1751 in Paris, that popularized the use of the term encyclopedia to describe works containing a broad compendium of knowledge. Shortly thereafter, in 1768, the first edition of the Encyclopædia Britannica, the oldest and most comprehensive English-language encyclopedia, was published in Edinburgh, Scotland.

The Encyclopaedia Britannica First Edition

The three-volume First Edition of the Encyclopædia Britannica paid homage to its classical roots in two conspicuous ways. One was a departure from the conventional spelling of encyclopedia.

The use of the æ ligature preserved an ancient bequest of Greek and Roman scribes used to denote diphthongal pronunciation. Even by 1768 this device had fallen out of use except in the most rarefied of contexts.

The other nod to antiquity was the Latinate title itself. It could easily have been called the British Encyclopedia, since Latin had long ceased to be the lingua franca of the educated. In the more than two and a half centuries since that first edition, Britannica’s stewards have continually changed everything else about the work, but they have always left its unusual title untouched.

The current 15th Edition was first published in 1974. The last print set bore the 2010 year on its copyright and the permanent cessation of printing the Encyclopaedia Britannica was announced in 2012.

Although regular revisions of the print editions were published, since the 1930s readers typically kept their sets up to date by annually buying yearbooks that reviewed recent developments.

Today, the Encyclopaedia Britannica is available to a global audience never dreamed of in the history of the print set. In the current era, the online version of Encyclopaedia Britannica receives over 7 billion annual page views, in more than 150 countries, with in excess of 150 million students using it in more than 20 languages.

The Encyclopedist’s Art


  EB Editor Philip W. (“Tom”) Goetz

In the twentieth century, encyclopedists were not the only people to worry about how to facilitate access to an ever-growing sum of knowledge. The problem arising from the information explosion of modern times was also noticed by those who helped create it. In particular, the scientists and mathematicians who had created whole new disciplines of knowledge, such as atomic physics and computing machines, had also begun to think about how to increase efficient access by their colleagues and lay people to growing domains of information.

Since the mission of an encyclopedia is to encompass in an abbreviated and accessible form all of our knowledge about everything, the editorial investments needed to create encyclopedias have always been substantial. As a result, the number of encyclopedias has always been relatively few. Also, while there are several thousand distinguished outside contributors asked to write articles for an encyclopedia such as the Britannica (more than 4,000), there is a much smaller number of career encyclopedists charged with the actual design and creation of the work and its ongoing revision.

In the modern era, professional encyclopedists around the world working continuously in the English language have mostly numbered in the hundreds rather than the thousands. And for over two centuries, the encyclopedists at Britannica have remained the most skilled and respected of their breed. The task of an encyclopedist is an odd one. There are not many of these folks around, and the few who are tend to spend their days in single-minded thought on how best to organize a brief, narrative summary of our cumulative understandings of history, art, literature, science, religion, philosophy, and culture.

The encyclopedist’s art has traditionally been more of what to leave out, rather than what to put in.

During my 28-year tenure at Britannica, I had the privilege of working frequently with EB’s Editor for much of that time, Philip W. (“Tom”) Goetz.