A Brief History of the Internet

© 1998 Walt Howe
(Last updated October 24, 1998)

The Internet was started in 1969 under a contract let by the Advanced Research Projects Agency (ARPA) which connected four major computers at research institutions in the western US (UCLA, Stanford Research Institute, UCSB, and the University of Utah). The contract was carried out by BBN of Cambridge, MA, and the network went online in December 1969. By June 1970, MIT, Harvard, BBN, and Systems Development Corp. (SDC) in Santa Monica, California, were added. By January 1971, Stanford, MIT's Lincoln Labs, Carnegie-Mellon, and Case Western Reserve University were added. In the months that followed, NASA/Ames, Mitre, Burroughs, RAND, and the University of Illinois plugged in. After that, there were far too many to keep listing here.

When Senator Ted Kennedy heard in 1968 that BBN had won the ARPA contract for an "interface message processor (IMP)," he sent a congratulatory telegram to BBN for their ecumenical spirit in winning the "interfaith message processor" contract.

The Internet was designed to provide a communications network that would work even if some of the sites were destroyed by nuclear attack. If the most direct route was not available, routers would direct traffic around the network via alternate routes.
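The idea can be sketched in a few lines of code. The toy network and node names below are invented for illustration, and the real ARPANET used a distributed routing algorithm rather than this centralized search, but the principle is the same: when a node goes down, traffic finds another path.

```python
from collections import deque

# Toy network for illustration: each node lists its directly connected
# neighbors. The topology and names are invented, not the real ARPANET.
network = {
    "A": ["B", "C"],
    "B": ["A", "D"],
    "C": ["A", "D"],
    "D": ["B", "C"],
}

def find_route(net, start, goal, down=()):
    """Breadth-first search for any working path, skipping downed nodes."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in net[path[-1]]:
            if nxt not in seen and nxt not in down:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no surviving route

print(find_route(network, "A", "D"))              # ['A', 'B', 'D']
print(find_route(network, "A", "D", down={"B"}))  # node B destroyed: ['A', 'C', 'D']
```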

The early Internet was used by computer experts, engineers, and scientists. There was nothing friendly about it. There were no home or office personal computers in those days, and anyone who used it, computer professional or not, had to learn to use a very complex system.

Ethernet, a protocol for many local networks, appeared in 1974, an outgrowth of Harvard student Bob Metcalfe's dissertation on "Packet Networks." The dissertation was initially rejected by the University for not being analytical enough. It later won acceptance when he added some more equations to it.

The Internet matured in the 1970s as a result of the TCP/IP architecture, first proposed by Bob Kahn at BBN and further developed by Kahn, by Vint Cerf at Stanford, and by others throughout the decade. It was adopted by the Defense Department in 1980 and universally adopted by 1983.

The Unix-to-Unix Copy Protocol (UUCP) was invented in 1978 at Bell Labs. Usenet was started in 1979, based on UUCP. Newsgroups, discussion groups each focusing on a topic, followed, providing a means of exchanging information throughout the world. While Usenet is not considered part of the Internet, since it does not share the use of TCP/IP, it linked Unix systems around the world, and many Internet sites took advantage of the availability of newsgroups. It was a significant part of the community building that took place on the networks.

Similarly, BITNET (Because It's Time Network) connected IBM mainframes around the educational community and the world to provide mail services beginning in 1981. Listserv software was developed for this network and, later, for others. Gateways were developed to connect BITNET with the Internet and allowed the exchange of e-mail, particularly through e-mail discussion lists. These listservs and other forms of e-mail discussion lists formed another major element in the community building that was taking place.

As the commands for e-mail, FTP, and telnet were standardized, it became a lot easier for non-technical people to learn to use the nets. It was not easy by today's standards by any means, but it did open up use of the Internet to many more people, in universities in particular. Departments beyond computer science, physics, and engineering found ways to make good use of the nets--to communicate with colleagues around the world and to share files and resources. Libraries, which had been automating their catalogs, went a step further and made those catalogs available to the world.

While the number of sites on the Internet was small, it was fairly easy to keep track of the resources of interest that were available. But as more and more universities and organizations connected, the Internet became harder and harder to track, and the need grew for tools to index the available resources.

The first effort to index the Internet came in 1989, when Peter Deutsch and his team at McGill University in Montreal created an archiver for FTP sites, which they named Archie. This software would periodically reach out to all known openly available FTP sites, list their files, and build a searchable index of the software. The commands to search Archie were Unix commands, and it took some knowledge of Unix to use it to its full capability.
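In outline, an Archie-style crawler is simple, and Python's standard ftplib is enough to sketch it. The host names below are placeholders, and the real Archie gathered full recursive listings rather than a single directory, but the crawl-then-index shape is the same.

```python
from ftplib import FTP, all_errors

# Placeholder site names; Archie kept a list of known anonymous FTP sites.
SITES = ["ftp.example.edu", "ftp.example.org"]

def list_files(host):
    """Log in anonymously and return the file names in the root directory."""
    with FTP(host, timeout=30) as ftp:
        ftp.login()        # anonymous login
        return ftp.nlst()  # name list of the current directory

# Build the index: file name -> hosts that carry it.
index = {}
for host in SITES:
    try:
        for name in list_files(host):
            index.setdefault(name, []).append(host)
    except all_errors:
        pass  # skip unreachable sites, as a periodic crawler would

def search(term):
    """Case-insensitive substring search over the indexed file names."""
    term = term.lower()
    return {name: hosts for name, hosts in index.items() if term in name.lower()}
```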

McGill University, which hosted the first Archie, found out one day that half the Internet traffic going into Canada from the United States was accessing Archie. Administrators were concerned that the University was subsidizing such a volume of traffic, so they closed down Archie to outside access. Fortunately, by that time, there were many more Archies available.

At about the same time, Brewster Kahle, then at Thinking Machines Corp., developed his Wide Area Information Server (WAIS), which would index the full text of files in a database and allow searches of the files. Several versions, with varying degrees of complexity and capability, were developed, but the simplest of these were made available to everyone on the nets. At its peak, Thinking Machines maintained pointers to over 600 databases around the world which had been indexed by WAIS. They included such things as the full set of Usenet Frequently Asked Questions files, the full documentation of working papers by those developing the Internet's standards, and much more. Like Archie, its interface was far from intuitive, and it took some effort to learn to use it well.
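The core of such a full-text index is easy to picture. The sketch below builds a minimal inverted index over two made-up documents; the real WAIS layered ranked retrieval, its own client-server protocol, and hundreds of remote databases on top of this basic idea.

```python
import re

# Two made-up documents standing in for files in a WAIS database.
docs = {
    "usenet-faq.txt": "Frequently asked questions about Usenet newsgroups",
    "standards.txt": "Working papers by those developing the Internet standards",
}

# Inverted index: word -> set of documents containing it.
index = {}
for name, text in docs.items():
    for word in re.findall(r"[a-z]+", text.lower()):
        index.setdefault(word, set()).add(name)

def search(*words):
    """Return the documents that contain every query word (AND search)."""
    hits = [index.get(w.lower(), set()) for w in words]
    return set.intersection(*hits) if hits else set()

print(search("usenet", "questions"))  # {'usenet-faq.txt'}
```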

In 1991, the first friendly interface to the Internet was developed at the University of Minnesota. The University wanted to develop a simple menu system to access files and information on campus through their local network. A debate followed between mainframe adherents and those who believed in smaller systems with client-server architecture. The mainframe adherents "won" the debate initially, but since the client-server advocates said they could put up a prototype very quickly, they were given the go-ahead to do a demonstration system. The demonstration system was called a gopher after the U of Minnesota mascot--the golden gopher. The gopher proved to be very prolific, and within a few years there were over 10,000 gophers around the world. It took no knowledge of Unix or computer architecture to use: in a gopher system, you typed or clicked on a number to select the menu item you wanted. You can use the U of Minnesota gopher today to pick gophers from all over the world.
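The gopher protocol itself was nearly as plain as its menus: a client opens a TCP connection to port 70, sends a selector string, and reads back tab-separated menu lines ending with a lone dot. The sketch below follows that wire format; the host name is a placeholder, not a real server.

```python
import socket

def gopher_menu(host, selector="", port=70):
    """Fetch one gopher menu: send the selector, parse tab-separated lines."""
    with socket.create_connection((host, port), timeout=30) as sock:
        sock.sendall(selector.encode() + b"\r\n")
        data = b""
        while chunk := sock.recv(4096):
            data += chunk
    items = []
    for line in data.decode(errors="replace").splitlines():
        if line == ".":              # a lone dot ends the menu
            break
        fields = line.split("\t")
        if len(fields) >= 4:
            # First character is the item type (0 = file, 1 = menu, ...),
            # then display text, selector, host, and port.
            items.append((fields[0][:1], fields[0][1:],
                          fields[1], fields[2], fields[3]))
    return items

# Numbered menu, much as the original gopher clients presented it.
for n, item in enumerate(gopher_menu("gopher.example.edu"), start=1):
    print(n, item[1])  # type a number to follow that entry
```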

Gopher's usability was enhanced much more when the University of Nevada, Reno developed the VERONICA searchable index of gopher menus. It was purported to be an acronym for Very Easy Rodent-Oriented Netwide Index to Computerized Archives. A spider crawled gopher menus around the world, collecting links and retrieving them for the index. It was so popular that it was very hard to connect to, even though a number of other VERONICA sites were developed to ease the load. Similar indexing software, called JUGHEAD (Jonzy's Universal Gopher Hierarchy Excavation And Display), was developed for single sites.

Peter Deutsch, who developed Archie, always insisted that Archie was short for Archiver, and had nothing to do with the comic strip. He was disgusted when VERONICA and JUGHEAD appeared.

In 1989 another significant event took place in making the nets easier to use. Tim Berners-Lee and others at the European Laboratory for Particle Physics, more popularly known as CERN, proposed a new protocol for information distribution. This protocol, which became the World Wide Web in 1991, was based on hypertext--a system of embedding links in text that connect to other text, which you have been using every time you selected a text link while reading these pages. Although it started before gopher, it was slower to develop.

The development in 1993 of the graphical browser Mosaic by Marc Andreessen and his team at the National Center for Supercomputing Applications (NCSA) gave the protocol its big boost. Today, Andreessen is the brains behind Netscape Corp., which produces what has so far been the most successful graphical browser and server, a position Microsoft is trying very hard to overcome.

Since the Internet was initially funded by the government, it was originally limited to research, education, and government uses. Commercial uses were prohibited unless they directly served the goals of research and education. This policy continued until the early 1990s, when independent commercial networks began to grow. It then became possible to route traffic across the country from one commercial site to another without passing through the government-funded Internet backbone.

Delphi was the first national commercial online service to offer Internet access to its subscribers. It opened up an e-mail connection in July 1992 and full Internet service in November 1992. All pretenses of limitations on commercial use disappeared in May 1995, when the National Science Foundation ended its sponsorship of the Internet backbone and all traffic came to rely on commercial networks. AOL, Prodigy, and CompuServe also came online. Since commercial usage was so widespread by this time and educational institutions had been paying their own way for some time, the loss of NSF funding had no appreciable effect on costs.

Today, NSF funding has moved beyond supporting the backbone and higher educational institutions to building out K-12 and local public library access on the one hand, and research on massive, high-volume connections on the other.

Microsoft's full-scale entry into the browser, server, and Internet Service Provider market completed the major shift to a commercially based Internet. The release of Windows 98 in June 1998, with the Microsoft browser well integrated into the desktop, shows Bill Gates' determination to capitalize on the enormous growth of the Internet. Microsoft's success over the past few years has brought court challenges to its dominance. We'll leave it up to you whether you think these battles should be played out in the courts or the marketplace.

A current trend with major implications for the future is the growth of high-speed connections. 56K modems and the providers who support them are spreading widely, but this is just a small step compared to what will follow. 56K is not fast enough to carry multimedia such as sound and video, except in low quality; even at its full rated speed, a three-minute song encoded at 128 kbps takes roughly seven minutes to download. But new technologies many times faster, such as cable modems, digital subscriber lines (DSL), and satellite broadcast, are available in limited locations now and will become widely available in the next few years. These technologies present problems, not just in the user's connection, but in maintaining high-speed data flow reliably from source to user. Those problems are being worked on, too.

During this period of enormous growth, businesses entering the Internet arena are scrambling to find economic models that work. Free services supported by advertising have shifted some of the direct costs away from the consumer. Services such as Delphi are now offering free web pages, chat rooms, and message boards. Online sales are growing rapidly for products such as books, music CDs, and computers, but the profit margins are slim when price comparisons are so easy, and public trust in online security is still shaky.

We live in interesting times!
