November 23, 2024

Interoperability, and the Birth of the Web

Tim Berners-Lee was here yesterday, and he told some interesting stories about the birth and growth of the Web.

I was particularly intrigued by his description of the environment at CERN, where he worked during the relevant years. CERN was (and still is) the European particle physics research lab. It had a permanent staff, but there was also a constant flow, in and out, of researchers and groups from various countries and institutes. These people generally brought their own computers, or used the same kinds of computers as at their home institutions. This meant that the CERN computer network was a constantly changing hodgepodge of different systems.

In this environment, interoperability – the ability to make all of these systems work together by using the same protocols and data formats – was necessary to accomplish much of anything. And so the computing people, including Tim B-L, were constantly working to design software that would allow disparate systems to work together.
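(To make this concrete: the web protocols that came out of this effort were deliberately minimal – just lines of text over a TCP connection – so that any machine on that hodgepodge network could implement them. Here's a minimal sketch in Python of a complete HTTP/1.0 exchange; example.com is just an illustrative host, and any HTTP server would do.)

```python
import socket

HOST = "example.com"  # illustrative host; any HTTP server would do

# HTTP is just lines of ASCII text over a TCP connection, which is why
# wildly different systems can all speak it to one another.
request = "GET / HTTP/1.0\r\nHost: {}\r\n\r\n".format(HOST)

with socket.create_connection((HOST, 80)) as sock:
    sock.sendall(request.encode("ascii"))
    response = b""
    # recv() returns b"" when the server closes the connection
    while chunk := sock.recv(4096):
        response += chunk

# Print just the status line and headers -- plain text, readable anywhere.
print(response.split(b"\r\n\r\n", 1)[0].decode("ascii", "replace"))
```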

This was, in some respects, the ideal environment for developing something like the web. You had a constant flow of people in and out, so institutional memory couldn’t just live in people’s heads but had to be written down. These people were scientists, so they wanted to write down what they knew in a way that would be accessible to many others. You had a diverse and constantly changing network, so your technical solution would have to be simple and workable across a range of architectures. And you had a clever technical staff.

One wonders where the equivalent place is today. Perhaps there is a place with the right ingredients to catalyze the growth of the next generation of online communication/collaboration tools. Perhaps CERN is still that place. Or perhaps our tools have evolved to the point where there doesn’t have to be a single place, but this can happen via some Wiki/chat/CVS site.

Comments

  1. What happened at CERN is that people who cared about an issue were not afraid to step on the toes of existing industries and paradigms. That the resulting web technologies were adopted so quickly by non-academic communities is evidence that existing technologies and industries weren’t being responsive and/or innovative enough.

    Turning to today, if you pay any attention to today’s “web 2.0” evangelists, you become aware of the emerging importance of “social networking” tools as a major force in web-based application development and group communications. Today it’s easy for groups of people to form and communicate about topics they care about, thanks to the availability of tools that let them organize themselves, and the records of their communications, into meaningful clusters. Traditional institutions may be fearful of this, as I have found in my own research, but I see no way of stemming the tide.

    What we see emerging today, with such wide availability of web-based communications and applications, is that the physical location of creative people in a single institution such as CERN is less important than it once was. Add to that the fact that “virtual” communities can form, regroup, and re-form at the drop of a hat, and you have a situation where creative solutions can emerge much more easily, unencumbered by traditional institutional boundaries, and be adopted by large numbers of people. Look at the rapid adoption of tools such as Facebook.

    Other than time and money, what are the limiting factors? And why are we starting to hear complaints about the over-blogging of society? My answer: intelligence and creativity are the limiting factors. Maybe what we’re seeing with the explosion of web-based communications and publishing is just Sturgeon’s Law: “Ninety percent of everything is crap.”

    All these virtual communities that form and re-form at the drop of a hat just don’t have the same brainpower as what we saw at CERN.

  2. Who says the next huge IT innovation will be about interoperable communication and collaboration? As I recall, before the Web, the conventional wisdom in the commercial IT world was that proprietary networks (remember Interactive TV?) would be the Next Big Thing. There’s probably a similarly widespread assumption out there today that’s about to be proven spectacularly wrong. Maybe it’s the currently fashionable “interoperability/Web Services/XML rules” assumption. Or maybe it’s something else entirely: something that everyone takes so much for granted that it doesn’t even come up in discussions about likely future directions of innovation. Who knows?

  3. Maybe Connie Willis was right in that novel “Bellwether”…the most fruitful source of innovation is a sufficiently chaotic academic campus. 🙂

  4. It seems to me that major research universities (such as Princeton and the University of Washington, to name a couple) are still the sorts of places characterized by large amounts of heterogeneous technology, largely not controlled centrally, and by very smart, technically savvy staff and users. I wouldn’t be surprised to see major developments continue to come out of our institutions.

  5. Speaking of collaboration, I heard today that the first thousand-author paper is expected to appear once CERN’s new machine, the Large Hadron Collider, gets going. So there will continue to be an urgent need for better tools at such labs.

  6. I’m a strong believer in having real problems to solve coupled with a strong need to solve them.

    I may be biased (I’m a physicist), but physics – particularly particle physics – pushes the envelope for data collection and analysis, as well as for collaboration (some experiments have hundreds of physicists and students; a friend who does CSCW says he can’t think of another group that tends to solve its collaboration problems as well). It is an ideal breeding ground for things like the invention of the web.

  7. Todd Jonz says

    Ed writes:

    > One wonders where the equivalent place is today….perhaps our tools have
    > evolved to the point where there doesn’t have to be a single place, but this
    > can happen via some Wiki/chat/CVS site.

    Remember the old Linux aphorism about scratching an itch? The way I see it, Tim Berners-Lee had an itch at CERN, so he and his associates there scratched it and inadvertently gave the rest of us the World Wide Web as a result. Isn’t this the principal driving force behind the entire FOSS movement?

    Remember the original HTTP server from NCSA? How many of us had our hands in that code to some extent or other? How many of us shared our patches with what would become the Apache project? Most of the commercial servers available in the mid-90s were derived from this same code base, but today, some thirteen years later, the commercial products all lag far behind Apache in terms of the number of deployments. At least in this particular instance the open source world seems to have gotten things right, while commercial interests have not. And back in ’93 the NCSA server only served HTTP requests. Today the Apache server is so extensible that it often serves as the base of an application stack that includes the likes of PHP, MySQL, etc., which can easily be assembled into something useful by your average system administrator. None of the commercial products today is this versatile and flexible.

    Perhaps the non-commercial nature of the original WWW components out of CERN had more to do with their evolution than did the technical or social environment at CERN. Perhaps things unfolded as they did because CERN was in the business of doing particle physics, not software development, and was free to share its work without trying to monetize it. I think the bigger question is whether this could happen again today. If some young Turk at CERN (or someplace like it) were to develop the Next Big Thing today, would it be released unencumbered to the outside world, or would an effort be made to commercialize it?