Tim Berners-Lee (TBL), in his first blog post, reminds us of a very important bit of web history. He writes: "The first browser was actually a browser/editor, which allowed one to edit any page, and save it back to the web..." TBL might also have noted that the Enquire program that he wrote in 1980 (10 years before the WWW) supported an edit mode.
The idea of a read/write web had been motivating the work of many hypertext developers like TBL long before the web was born. But the last 10 years' experience with the largely "read-only" web has caused many people to forget that the original idea was to create a writeable, creative space -- not just a network of things to be read. Fortunately, the growth of blogging is finally causing a renaissance of the read/write web. What we don't understand, I think, is how the original idea of the read/write web could have been "lost" -- even temporarily.
Of course, when TBL refers to the "first browser", he's talking about the first browser for his wonder-filled creation, the World Wide Web. There were many hypertext "browsers" before TBL's, and many supported read/write operation -- although virtually none were networked. (e.g. see the pre-WWW Hypertext 1987 and Hypertext 1989 conferences, which had many papers that assumed read/write and collaborative hypertext use.) The odd thing about TBL's WWW is that while it has done a fantastic job of popularizing Vannevar Bush's idea of associative links (later renamed "Hypertext" by Ted Nelson), it took us far away from the idea of the read/write collaborative web that we've only recently begun to re-discover through the "invention" of Blogs and Wikis. As TBL says: The "web took off very much as a publishing medium, in which people edited offline". Clearly, while the web is something much grander than anything of which TBL could have dreamed, it is also still somewhat less than what he and other early hypertext developers hoped it would be.
It is something of a mystery how, why and when we could have temporarily lost such a powerful idea as the read/write web. I'm sure that in the future, students of the growth of ideas will come up with some definitive answers; my personal feeling, however, is that the decisive moments came when second-generation WWW browsers like NCSA's Mosaic, Thomas Bruce's Cello, Lynx and others were released as "read-only" browsers. These early, pattern- and expectation-setting browsers did us all a great service by vastly improving the web reading experience and bringing the web within the reach of less technical users. But the then-unrecognized cost of the widespread and rapid adoption of these read-only browsers was the loss of the assumption that a hypertext tool was just as much a tool for the writer as it was a tool for the reader.
The fact that the second generation of web browsers was read-only can be explained, I think, in two ways. First, writing an editor for HTML (a variant of Goldfarb's SGML) was, in those days, quite a black art and much more difficult than writing a viewer. But more importantly, I think we need to consider the organizational context within which these browsers were developed. We need to think about the problems that the developers were trying to solve. For instance, Lynx was developed as part of an effort to provide an easier-to-use CWIS or "Campus Wide Information System." It was conceived as the solution to a publishing problem - not a creation problem. Lynx was also probably somewhat influenced by the "publish only" attributes of the then-popular DEC VTX product that was the foundation for one of Lynx's earliest versions. (Note: 10 of the 27 CWISs listed in 1991's RFC1290 relied on the now-forgotten DEC VTX product.) Similarly, Cello was conceived of as a tool to provide students of law and others in the legal profession with "access to the myriad information resources of the Internet" -- information creation was not a problem that Cello's developer was trying to address. The motivations for NCSA's Mosaic appear to have been similarly focused on "information access," not creation.
Thus, while TBL and others may have seen hypertext and the web as a solution to a wide range of problems that included not only information access but also its creation, the implementers of second-generation browsers were able to see within the product of TBL's thinking a solution to a smaller but more immediately pressing problem set. As is often the case, those who copy an idea tend to be selective about what they copy, and in the inevitable process of this conceptual editing, some of the original concept is lost even if its form is maintained. The developers of second-generation browsers focused on implementing those parts of the idea that addressed the specific publishing and access problems they were trying to solve. Left on the cutting-room floor was the immediate path to solving the problems that had motivated earlier developers and thinkers.
Just as TBL's efforts were blunted by the "editing" of later implementers who had simpler goals, I had similar problems with my own attempts to get Hypertext built at Digital Equipment Corporation (DEC) during the 80's -- and I had similar "competitors".
As product manager for ALL-IN-1, DEC's office automation system, I had worked closely with the DEC VTX developers in the early 80's as they adapted the ideas of VideoText to the world of character-cell VT100 terminals supported by VAX mini-computers. (For instance, I worked closely with Paul Dickson in defining the very BEEP-like DASL protocol that VTX relied on for inter-machine communications over DECnet.) As a character-cell application, VTX inevitably looked very much like TBL's Enquire, even though we were unaware of Enquire (it was never a released "product"), and it looked somewhat like the much later Gopher system. VTX was widely distributed (for the time) and it, like Gopher after it, was used primarily in publisher-dominated information "access" applications like Campus Wide Information Systems.
For me, working on VTX and seeing it enter the DEC Office product line was an exhilarating experience, since I was only in the computer business because I had read Vannevar Bush's 1945 article presaging hypertext while in college in 1974. By the time I worked with the VTX folk, I had already been dreaming of hypertext for years... But, VTX wasn't enough. It assumed a highly centralized network and was primarily a publishing platform. I wanted to do more and better -- to better approach what I thought Bush had envisioned.
With the initial help of Marios Cleovoulou, and later others including Per Hamnqvist (Per now works with me at PubSub.com), I got the chance to build what I thought was a proper HyperInformation system while with Digital in Valbonne, France in 1986-1989. The system, named "Memex Prototype 1" after Bush's Memex, was a plug-in to the DECWindows (graphical interface) version of TPU (Text Processing Utility) that shipped on VAX/VMS. It was later integrated with VaxNotes, which was built on TPU. Memex, by leveraging the DECnet (network) integrated RMS filing system of VMS, was able to transparently access and edit files anywhere on Digital's worldwide network - the "EasyNet." (We didn't need to create something like "HTTP" since it was effectively already built into RMS...) Thus, I believe Memex was the first "wide-area-network enabled" and distributed hypertext system. Memex, as an editor-integrated tool, also focused just as much on the problem of content creation as it did on content display...
While Memex was what I considered a vastly superior approach to DEC VTX, and while it enabled read/write use, the problem I had in getting support for the product was that there were very, very few networks of any substantial size in the 80's and thus not very many people who could use the product. Additionally, most of DEC's network customers had already been exposed to the much more controlled environment supported by VTX and saw the VTX approach as a reasonable solution to their existing, limited information-publishing needs. Thus, while I usually presented Memex as the start of a whole new revolution in read/write networked interaction, most folks who took the time to evaluate it considered it only as a VTX-like content-publishing tool. They heard my words but "edited" them down to address their specific needs and prior ideas. In fact, I was able to find only two large networks in which the need for collaboration and creation support was high enough to get them to consider Memex. These were ITT Europe and the CERN nuclear physics lab on the French-Swiss border. We installed Memex experimentally at ITT and CERN in 1987-1988 but weren't able to get much traction with other customers. (Note: No, I don't know if TBL was aware of Memex being installed at CERN. He doesn't mention it in his book, although he does say that we were responsible for introducing him to Vannevar Bush's writings. VaxNotes is mentioned in TBL's 1989 original proposal for the Web.)
Eventually, the Memex project morphed into a formal product: DEClinks (later renamed LinkWorks) and was bundled as part of the VMS operating system as a system service to enable any application to be HyperInformation-enabled. This made VMS the first major operating system to have "HyperInformation" or "HyperText" integrated into the base offering. However, by the time that happened, the industry had already begun to move away from VMS/DECnet and towards Unix, TCP/IP and the second generation World Wide Web browsers. DEClinks/LinkWorks was eventually retired from OpenVMS around 2000...
The same dynamics of "concept editing" that occurred with TBL's Web and in the reaction to my own Memex project are actually fairly common in the software business. (I will, however, spare you by not documenting other obvious examples.) Typically, those who first work on an idea or concept will see that it has a very wide scope of application; however, they are almost always forced to focus on one specific, limited scope or another in order to present their idea in a relevant fashion to potential adopters who have specific issues addressed by the innovation. Often, in the process of this focused positioning to convince early adopters to accept an idea or process, much of the full richness of the vision is lost or put aside. The cost of acceptance is thus often the loss of very important, even essential, elements of the vision. Sometimes the loss is permanent, and only the original thinker knows what paths were not traveled... Sometimes the loss is temporary, as others eventually re-discover the lost facets or the original visionaries eventually find a means to get others to understand more of the full vision. This re-discovery of the original vision's depth and breadth is hopefully what we are seeing happen on the Web today, as Blogging, Wikis and other read/write tools resurrect the original idea of the read/write Web.