Stone. Papyrus. Movable type. Telephone. Television. World Wide Web.
Revolutions in communications pave the way for revolutions in human life. Try to picture the Protestants breaking off from Catholic orthodoxy without the Gutenberg Bible. Or liberal democracies replacing kings without newspapers and pamphlets. Or McDonald’s and Levi’s conquering the world without TV and movies. Or the Tiananmen Square uprising without faxes.
So what will come of the 1990s’ most astonishing communication development, the World Wide Web—brainchild of a Massachusetts UU, Tim Berners-Lee?
To approach such a question, we must first understand what the Web is. Four components made the World Wide Web possible: first, a hardware network of cables and computers; second and third, two software breakthroughs; and fourth, an ongoing worldwide collaboration to ensure that all the hardware and software can communicate.
The first component, the physical network on which information travels, consists of computers and cable links and is constantly changing (for example, as computers log on and off via phone lines). The second component was added in the 1970s, when a pair of computer researchers invented a method of electronically “addressing” packets of information so that, once launched onto the hardware network, they would arrive at a particular destination. When you send an e-mail message, you are using these first two components, which make up what we’ve come to call the Internet.
The next two components come from Tim Berners-Lee. He came up
with the second software breakthrough, hypertext,
which allows your computer to jump from one document to another when you
click on a blue underlined “link.” But more important than this
technological breakthrough was a conceptual one—Berners-Lee’s vision of
a World Wide Web. Berners-Lee imagined using hypertext plus the Internet
to create a global “public square” where anyone, anywhere, any time, could
communicate anything. It’s because of this visionary concept—and because
he persuaded other techies worldwide to volunteer their expertise and time
to turn his vision into reality—that today you and I can use the cyberspace
information bazaar known as the World Wide Web.
Understated yet enthusiastic, with muscular good looks and thinning blond hair, Berners-Lee, 43, was born in England and now lives with his wife and children in a Boston suburb. His mathematician parents met while working together on the Ferranti Mark I, the first computer sold commercially. Growing up, Berners-Lee solved math and computer problems the way other children watched TV. He went on to study physics at Oxford University—where, as a hobby, he built his first computer using a soldering iron, some early computer processors, and an old television. After finishing at Oxford in 1976, he worked as a computer programmer, arriving in 1980 at CERN, the European Particle Physics Laboratory in Geneva, Switzerland, where he wrote programs that supported the physicists’ work.
There Berners-Lee invented the precursor to hypertext. Frustrated by what he thought of as his poor memory, he drew on some ideas that had been bandied about in computer journals to write a program that let him jump, via links, from one bit of information to another. For instance, in his memory program he might have a standard address-book-type listing for a CERN member—but he could also, from there, link to papers they’d written, projects they were involved in, memos they’d sent, names of their spouses and children, and other potentially useful information.
Soon Berners-Lee realized that his personal memory aid could also solve
a problem in group collaboration: while colleagues held a universe of useful
information in their minds and computers, they weren’t sharing it efficiently.
Often, they had to pass the same information about themselves and their
work to their collaborators over and over again—with the danger, Berners-Lee
wrote, of conveying “half-understandings verbally, duplicating effort through
ignorance, and reversing good decisions from misunderstanding.” Using Berners-Lee’s
program, CERN scientists could not only make a new particle physics paper
available to other researchers but could also link it to every other paper
referenced in it, as well as, if they chose, to their favorite Beethoven
symphony or directions to their offices. The program eventually evolved
into hypertext, the software that now lets you “click” on those highlighted
words on a World Wide Web site and be transported to an entirely different document.
What was new about hypertext: the links were direct, random, and flexible. That requires some explanation. Earlier computer programs let you reach a variety of other documents—but to get there, your computer’s request for those documents had to travel, via a series of menus, up and down a ladder of information, checking in at the top with a central network authority that could either be a person, like the moderator of an e-mail news group, or an automatic computer function. It’s as if every time you wanted to walk to your neighborhood ice cream stand, you had to go to the town square and get directions and approval from the local cop. Berners-Lee invented a program that entirely bypassed that central authority, letting you walk directly from your house to the ice cream stand (or the local porn shop or an ACLU meeting) without telling anyone where you were going. The downside is that, now, there’s no central authority to inform you that the ice cream stand has been torn down—so that sometimes you may click on a link and wait patiently for the results, only to get that annoying message “404: File Not Found.”
What’s more, without a central authority, not only could you make direct links, you could also make random links—meaning they didn’t have to make sense to anyone but the creator of the website. No central computer checked to ensure that a source document and the destination document were related to each other: a hypertext document could let users click on a physics paper, a photo album, and a chili recipe.
Finally, the links were flexible, meaning that a website designer could
change them at whim, without registering the change anywhere but on the website itself.
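Those three properties can be sketched in a few lines of Python. This is a toy illustration under invented names, not anything resembling Berners-Lee's actual code: each "page" simply lists the names of other pages it links to, no central index ever checks those names, and so a link may lead anywhere, including to a page that no longer exists.

```python
# A hypothetical miniature "web": each page is a dictionary entry holding
# a list of links naming other pages. No central authority validates the
# links, so a link may point at any page at all -- or at nothing.
pages = {
    "physics-paper": {"links": ["chili-recipe", "beethoven"]},
    "chili-recipe": {"links": []},
}

def follow(link):
    """Jump straight to the linked page, or report it missing."""
    if link in pages:
        return pages[link]       # direct jump, no authority consulted
    return "404: File Not Found"  # the dead-link case described above

print(follow("chili-recipe"))  # the link works: we land on the page
print(follow("beethoven"))     # nobody ever checked this link: 404
```

Adding, removing, or retargeting a link means editing only the one dictionary entry, which is the "flexible" property: the change is registered nowhere else.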
But Berners-Lee’s most revolutionary idea—first proposed in 1989—was a global hypertext project to be known as the World Wide Web, which would let Internet users anywhere share their knowledge with all other users. Berners-Lee conceived of hypertext tools that would enable any user to view any document, no matter what software or operating system had been used to create it. Just as important, he envisioned this network as free and open to anyone with Internet access, and not just confined to, say, CERN employees. This way, Berners-Lee imagined, physicists and programmers worldwide could collaborate and argue about each other’s ideas without being hindered by differences in their computers or software.
Encouraged by his CERN colleagues, he created the first tools that enabled
you to read and write hypertext, and he wrote early specifications for
URLs (uniform resource locators, the “addresses” that you type
in your Web browser’s window to get where you want to go), HTTP (hypertext
transfer protocol, the rules by which computers request and deliver those
documents), and HTML (hypertext markup language, which your Web browser
reads to allow you to view a document). All these he envisioned as just
a start, encouraging programmers worldwide—in keeping with the computer
hackers’ culture—to join in on their own time to discuss, improve, and
refine the tools until they were so good that building a website and surfing
the Web were possible not only for experts but even for those of us who
have no idea how our computers work.
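The division of labor among those early specifications can be made concrete with a small sketch. Using Python's standard urllib.parse (a modern illustration, not any tool of Berners-Lee's), a URL such as this article's own address splits into the parts a browser needs: which protocol to speak, which machine to contact, and which document to request from it.

```python
# A URL packs a document's entire "address" into one string. The
# standard library can split it into the pieces a browser acts on.
from urllib.parse import urlparse

url = "http://www.uua.org/world/0599feat4.html"
parts = urlparse(url)

print(parts.scheme)  # the protocol to speak: "http"
print(parts.netloc)  # the server to contact: "www.uua.org"
print(parts.path)    # the document to request: "/world/0599feat4.html"
```

The browser then uses HTTP to ask that server for that path, and HTML to turn the reply into the formatted page you read.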
Berners-Lee’s initial World Wide Web site was completed in 1991. This
would have been the moment for Berners-Lee to copyright hypertext and develop
the Web as a private system. He did not, and Berners-Lee is regularly asked
why the Web hasn’t made him a multimillionaire. The question brings a slight
but distinct wince. “The only way the World Wide Web would have taken off
was as a totally open, free system,” answers Berners-Lee. “Any attempt
to claim intellectual property would have killed it and has killed other
projects in the past and will kill other projects in the future.” Keeping
the World Wide Web software free suited a technology developed
“by individuals thinking a global hypertext system would be a neat idea,
working on it without management approval, on lunch breaks, after hours—installing
a little software or writing a piece of code or putting a server up or
installing a browser,” Berners-Lee adds.
After developing the CERN website, Berners-Lee continued working on the design of the Web, coordinating feedback from volunteers and users worldwide. As the number of websites grew exponentially, users around the world were wondering how its further development would be coordinated and overseen so as to keep it open and free to all. And so in 1994, Berners-Lee joined the Massachusetts Institute of Technology Laboratory for Computer Science as director of the newly formed W3C, or World Wide Web Consortium, which works to ensure that all new Web technologies can communicate freely with each other. Toward that end, Berners-Lee and his W3C colleagues encourage and moderate discussions among the various competing interests trying to capitalize on the Web—convening ongoing talks among software companies like Microsoft and Netscape, cell-phone innovators Nokia and Motorola, inventors of security cameras that communicate via the Web, and others. The W3C discussions help people and organizations like these work toward a constantly evolving set of specifications and standards and prevent the wide-open Web from being cut up into warring Balkan states. Without W3C’s ongoing discussion about emerging standards, sooner or later, you wouldn’t be able to travel or “link” from one site to another without buying the particular software or hardware used in each website. Berners-Lee functions as a kind of anti-authority in charge of technological consensus among competitors, a sort of cyber-peacekeeper.
Being Web peacekeeper and making sure that we can all move freely across
cyberspace doesn’t pay as well as being Microsoft’s Bill Gates. And so
no secretary guards Berners-Lee’s door, and Berners-Lee’s modest office
is at the end of a fluorescent-lit hallway crowded with PCs, printers,
plastic chairs, and all-purpose tables. It overlooks a parking lot and
a maze-like intersection of faceless MIT buildings. Berners-Lee’s reward
for his innovations may not be monetary, but it’s a reward nonetheless:
the thrill of having helped build, guide, and oversee the constant evolution
of the Web.
But what are the social consequences of Berners-Lee’s invention? Berners-Lee explains that, as with any new technology, the Web’s relationship to society developed in three stages. In the first, the Internet was both invented and used by what Berners-Lee calls a countercultural “group of long-haired hackers” who believed in keeping the Internet (and later the Web) “a society of total freedom in which there were no laws.” In the second stage, as Internet and Web use spread to the mainstream, the usual social conventions, values, and laws started to be applied. Now, in the third stage, people are realizing that these conventions, values, and laws will have to be adapted or reconceived if they are to work for the Web—especially those concerning privacy, individual ownership of intellectual property, and the limits placed by national borders on both commerce and concepts.
For example, it’s now possible to invade people’s lives in new and startling ways: no law prevents someone from installing a video camera in her own front window, pointing it at your door, and broadcasting your comings and goings over the Web. Nor did existing laws prevent Microsoft from embedding an identifying number in the latest version of its Windows software so that each document your computer created or handled could be traced back to you. (The company has promised to release an update that removes this function.) Unscrupulous hackers, meanwhile, can engage in new varieties of criminal activity by investigating and selling your medical history or duplicating your online “identity” to, say, send nasty e-mail to your boss that appears to come from you—or even to steal from your company.
Even more significant to many people is the way the Web disarms the old enforcers of intellectual supervision—from totalitarian governments, who may still be able to restrict the information publicized in newspapers or on the radio but who have a harder time controlling access to the Internet, to parents who wish to monitor their children’s reading. Finally, in the Web Age no one knows exactly what happens to copyright, the idea that you own what you create: If anyone can paste or type this article into an e-mail message or post it on a Web page—a much easier method of distribution than, say, photocopying it and passing it out—what value does one’s copyright retain? Questions like these proliferate. Does the e-mail you send at work belong to you or to your employer? Should corporate e-mail be as private as cafeteria conversations, or can it be treated like paper documents—as when, in its antitrust case against Microsoft, the US government subpoenaed Microsoft employees’ e-mail messages?
Berners-Lee does not pretend to have the answers. Rather, he reminds
us that it’s only appropriate that we are now struggling over how to apply
our habits and beliefs to this new technology. New norms and laws will
be worked out by a wide variety of interests, from individual computer
users to the chip-maker Intel to various legislatures.
Berners-Lee envisions our most common worries about the potential harm the Web might do as lying at two ends of a continuum—the dystopic extremes of too much individualism and too much homogeneity. In the US especially, we worry about formerly isolated hatemongers who use the Web to find each other, publicize their vicious philosophies, and exchange, say, bomb-making recipes or abortion providers’ home addresses. Berners-Lee places these kinds of extremists on the individualistic end of his continuum, dryly commenting that we fear such people will dig themselves into “a cultural pothole so deep and steep and slippery-sided that when they stagger out of the computer den into the street and find someone different from themselves, the only thing they can do is shoot him.”
Europeans, on the other hand, are especially concerned that the Web could promote a uniform corporate world culture, where national, ethnic, and individual variety is erased, and we all think the same Disney thoughts and eat the same McFood. Certainly the Web is one more in an array of media, like movies, video, telephone, fax machines, and satellites, that can spread what Berners-Lee calls “the lowest common denominator—the McDonald’s American culture—so that everyone’s vocabulary will shrink and the diversity of cultures as we know it will vanish.”
In the face of these two fears—too many differences, too much similarity—Berners-Lee exudes an almost physical optimism, a kind of healthy person’s immunity to dark imaginings. His temperamental poise expresses itself in his frequent use of the word “balance,” and he has a humanist’s faith in progress and reason. He believes that, in the end, humanity will avoid the dangers of the Web. According to him, societies and people have always balanced these two extreme tugs. Each of us in his or her daily life must balance impulses toward individualism and homogeneity. If we spent our entire lives alone, at one extreme, or hopping between world conferences, at the other, most of us would lose our minds—either creating a private, unintelligible monoculture or talking only in “CNN-speak” and only about global issues. Yet most of us manage simultaneously to lead private, group, and world lives, argues Berners-Lee. Put all of us together, and things inevitably balance out, he believes, for the greater social good.
Furthermore, like so many people in computing and high tech, whose work
depends on the free flow of ideas for inspiration, Berners-Lee is a fundamentalist
libertarian, believing that ideas and information should move without hindrance.
Thus when you find a hate site on the Web, “it is right to be horrified,”
he says. “Being horrified and taking some action is part of the process.”
Freedom of speech, in this view, can be countered only by more freedom
of speech—whether on real-world or on cyberspace street corners.
Berners-Lee sees parallels between his high-tech libertarianism and his Unitarian Universalist faith. He rejected the Anglican Church of his childhood early on, and rejected as well the doctrinaire Christianity he encountered in Geneva. When he arrived in Massachusetts in 1994 to work at MIT and found a UU church in his neighborhood, he says, it was “like a breath of fresh air, like coming home—somewhere to talk about spiritual matters and ethical matters and things that count, a place to think and talk and listen without being required to swallow some dogma.”
He appreciates an environment where everyone is free to think and debate,
whether it’s the Web or his congregation. “In a Web structure each person
behaves like a neuron in a brain,” he says, “trying to figure out how best
to live in the world and play their part. Systems like that tend to be
more resilient and more able to evolve”—toward a better computer system,
a better community, or both at once. UUs, Berners-Lee notes with self-mocking
amusement, “are all searching for the truth, but we get very suspicious
when anyone says they’ve found it.”
Berners-Lee isn’t so arrogant as to make predictions about the Web’s future: what he finds exciting is being there at the center, watching for “some other group of people to get together and come up with things we can’t even imagine yet,” he says. He’s looking forward to being surprised. And he adds that he’s sure that whatever is coming will make the world a better place.
To hear such faith from someone at the end of the bloody 20th century, with its vicious and evil wars and purges, is both astonishing and heartening. When pressed to explain such optimism, Berners-Lee points to the Web itself as an example of what can result from human beings’ collaboration, hard work, and goodwill. “The fact that something so global can be set up by the grass-roots efforts of people in a lot of different countries in totally different situations is itself a tremendous message of hope—a message about the nature of human beings,” he says, leaning forward enthusiastically in his corduroys, brown vest, and blue-striped oxford shirt. “Allowing everybody to write their own Web page and express some vision of what is true and right—whether that’s a car accident yesterday or how the universe is run—gives us this decentralized, distributed quest for truth, harmony, and understanding.”
E.J. Graff is an affiliated scholar at Radcliffe College's Schlesinger Library. Her book What Is Marriage For? will be published by Beacon Press in June 1999.
All material copyright © 1999, Unitarian Universalist Association