In Praise of Evolvable Systems
Why something as poorly designed as the Web became The Next Big
Thing, and what that means for the future.
If it were April Fool's Day, the Net's only official holiday,
and you wanted to design a 'Novelty Protocol' to slip by the
Internet Engineering Task Force as a joke, it might look something
like the Web:
-
The server would use neither a persistent connection nor a store-and-forward
model, thus giving it all the worst features of both telnet
and e-mail.
-
The server's primary method of extensibility would require spawning
external processes, thus ensuring both security risks and unpredictable
load.
-
The server would have no built-in mechanism for gracefully apportioning
resources, refusing or delaying heavy traffic, or load-balancing.
It would, however, be relatively easy to crash.
-
Multiple files traveling together from one server to one client
would each incur the entire overhead of a new session call.
-
The hypertext model would ignore all serious theoretical work
on hypertext to date. In particular, all hypertext links would
be one-directional, thus making it impossible to move or delete
a piece of data without ensuring that some unknown number of
pointers around the world would silently fail.
-
The tag set would be absurdly polluted and user-extensible with
no central coordination and no consistency in implementation.
As a bonus, many elements would perform conflicting functions
as logical and visual layout elements, as the sketch after this
list suggests.
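To make that last complaint concrete, here is a minimal,
hypothetical scrap of mid-'90s HTML (the page and its wording are
invented for illustration): logical markup like <h1> and <em> sits
side by side with purely visual markup like <font>, <b> and
<center>, and nothing in the tag set distinguishes the two roles.

    <h1><font color="red" face="Comic Sans MS">Welcome to my page!</font></h1>
    <center>
      <b>Note:</b> that heading is both a structural element and a
      <em>decoration</em>, and <font size="+2">nothing stops an author
      from using it as either.</font>
    </center>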
HTTP and HTML are the Whoopee Cushion and Joy Buzzer of Internet
protocols, only comprehensible as elaborate practical jokes.
For anyone who has tried to accomplish anything serious on the
Web, it's pretty obvious that of the various implementations
of a worldwide hypertext protocol, we have the worst one possible.
Except, of course, for all the others.
|
MAMMALS VS. DINOSAURS
The problem with that list of deficiencies is that it is also
a list of necessities -- the Web has flourished in a way that
no other networking protocol has except e-mail, not despite
many of these qualities but because of them. The very weaknesses
that make the Web so infuriating to serious practitioners also
make it possible in the first place. In fact, had the Web been
a strong and well-designed entity from its inception, it would
have gone nowhere. As it enters its adolescence, showing both
flashes of maturity and infuriating unreliability, it is worth
recalling what the network was like before the Web.
In the early '90s, Internet population was doubling annually,
and the most serious work on new protocols was being done to
solve the biggest problem of the day, the growth of available
information resources at a rate that outstripped anyone's ability
to catalog or index them. The two big meta-indexing efforts
of the time were Gopher, the anonymous ftp index; and the heavy-hitter,
Thinking Machines' Wide Area Information Server (WAIS). Each
of these protocols was strong -- carefully thought-out, painstakingly
implemented, self-consistent and centrally designed. Each had
the backing of serious academic research, and each was rapidly
gaining adherents.
The electronic world in other quarters was filled with similar
visions of strong, well-designed protocols -- CD-ROMs, interactive
TV, online services. Like Gopher and WAIS, each of these had
the backing of significant industry players, including computer
manufacturers, media powerhouses and outside investors, as well
as a growing user base that seemed to presage a future of different
protocols for different functions, particularly when it came
to multimedia.
These various protocols and services shared two important characteristics:
Each was pursuing a design that was internally cohesive, and
each operated in a kind of hermetically sealed environment where
it interacted not at all with its neighbors. These characteristics
are really flip sides of the same coin -- the strong internal
cohesion of their design contributed directly to their lack
of interoperability. CompuServe and AOL, two of the top online
services, couldn't even share resources with one another, much
less somehow interoperate with interactive TV or CD-ROMs.
|
THE STRENGTH OF WEAKNESS AND EVOLVABILITY
In other words, every contender for becoming an "industry standard"
for handling information was too strong and too well-designed
to succeed outside its own narrow confines. So how did the Web
manage to damage and, in some cases, destroy those contenders
for the title of The Next Big Thing? Weakness, coupled with
an ability to improve exponentially.
The Web, in its earliest conception, was nothing more than a
series of pointers. It grew not out of a desire to be an electronic
encyclopedia so much as an electronic Post-it note. The idea
of keeping pointers to ftp sites, Gopher indices, Veronica search
engines and so forth all in one place doesn't seem so remarkable
now, but in fact it was the one thing missing from the growing
welter of different protocols, each of which was too strong to
interoperate well with the others.
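As a hypothetical illustration of that electronic Post-it note (the
addresses below are invented), an early Web page could collect
one-way pointers to several entirely different protocols, with no
cooperation required from any of the servers being pointed to:

    <ul>
      <li><a href="ftp://ftp.example.edu/pub/papers/">An anonymous ftp archive</a></li>
      <li><a href="gopher://gopher.example.edu/11/catalog">A Gopher index</a></li>
      <li><a href="http://www.example.org/hotlist.html">Somebody else's list of links</a></li>
    </ul>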
Considered in this light, the Web's poorer engineering qualities
seem not merely desirable but essential. Despite all strong
theoretical models of hypertext requiring bi-directional links,
in any heterogeneous system links have to be one-directional,
because bi-directional links would require massive coordination
in a way that would limit the Web's scope. Despite the obvious advantages
of persistent connections in terms of state-tracking and lowering
overhead, a server designed to connect to various types of network
resources can't require persistent connections, because that
would limit the protocols that could be pointed to by the Web.
The server must accommodate external processes, or its extensibility
would be limited to whatever the server's designers could put into
any given release, and so on.
Furthermore, the Web's almost babyish SGML syntax, so far from
any serious computational framework (Where are the conditionals?
Why is the Document Type Definition so inconsistent? Why is
the browsers' enforcement of conformity so lax?), made it possible
for anyone wanting a Web page to write one. The effects of this
ease of implementation, as opposed to the difficulties of launching
a Gopher index or making a CD-ROM, are twofold: a huge increase
in truly pointless and stupid content soaking up bandwidth;
and, as a direct result, a rush to find ways to compete with
all the noise through the creation of interesting work. The
quality of the best work on the Web today has not happened in
spite of the mass of garbage out there, but in part because
of it.
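To give a sense of how low that barrier was, consider a fragment
like the following, invented here but typical of hand-written pages
of the period; it has no document type declaration, no <head>, no
<body> and almost no closing tags, and the browsers of the day would
render it anyway:

    <title>My first home page</title>
    <h1>Hello, world
    <p>Things I like:
    <ul>
      <li>My cat
      <li>Pictures of other people's cats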
In the space of a few years, the Web took over indexing from
Gopher, rendered CompuServe irrelevant, undermined CD-ROMs, and
now seems poised to take on the features of interactive TV,
not because of its initial excellence but because of its consistent
evolvability. It's easy for central planning to outperform weak
but evolvable systems in the short run, but in the long run
evolution always has the edge. The Web, jujitsu-like, initially
took on the power of other network protocols by simply pointing
to them, and then slowly subsumed their functions.
Despite the Web's ability to usurp the advantages of existing
services, this is a story of inevitability, not of perfection.
Yahoo and Lycos have taken over from Gopher and WAIS as our
meta-indices, but the search engines themselves, as has been
widely noted, are pretty lousy ways to find things. The problem
that Gopher and WAIS set out to solve has not only not been
solved by the Web, it has been made worse. Furthermore, this
kind of problem is intractable because of the nature of evolvable
systems.
|
THREE RULES FOR EVOLVABLE SYSTEMS
Evolvable systems -- those that proceed not under the sole direction
of one centralized design authority but by being adapted and
extended in a thousand small ways in a thousand places at once
-- have three main characteristics that are germane to their
eventual victories over strong, centrally designed protocols.
-
Only solutions that produce partial results when partially implemented
can succeed. The network is littered with ideas that would have
worked had everybody adopted them. Evolvable systems begin partially
working right away and then grow, rather than needing to be
perfected and frozen. Think VMS vs. Unix, cc:Mail vs. RFC-822,
Token Ring vs. Ethernet.
-
What is, is wrong. Because evolvable systems have always been
adapted to earlier conditions and are always being further adapted
to present conditions, they are always behind the times. No evolving
protocol is ever perfectly in sync with the challenges it faces.
-
Finally, Orgel's Rule, named for the evolutionary biologist
Leslie Orgel -- "Evolution is cleverer than you are". As with
the list of the Web's obvious deficiencies above, it is easy
to point out what is wrong with any evolvable system at any
point in its life. No one seeing Lotus Notes and the NCSA server
side-by-side in 1994 could doubt that Lotus had the superior
technology; ditto ActiveX vs. Java or Marimba vs. HTTP. However,
the ability to understand what is missing at any given moment
does not mean that one person or a small central group can design
a better system in the long haul.
Centrally designed protocols start out strong and improve logarithmically.
Evolvable protocols start out weak and improve exponentially.
It's dinosaurs vs. mammals, and the mammals win every time.
The Web is not the perfect hypertext protocol, just the best
one that's also currently practical. Infrastructure built on
evolvable protocols will always be partially incomplete, partially
wrong and ultimately better designed than its competition.
|
LESSONS FOR THE FUTURE
And the Web is just a dress rehearsal. In the next five years,
three enormous media -- telephone, television and movies -- are
migrating to digital formats: Voice Over IP, High-Definition
TV and Digital Video Disc, respectively. As with the Internet
of the early '90s, there is little coordination between these
efforts, and a great deal of effort on the part of some of the
companies involved to intentionally build in incompatibilities
to maintain a cartel-like ability to avoid competition, such
as DVD's mutually incompatible standards for different continents.
And, like the early '90s, there isn't going to be any strong
meta-protocol that pushes Voice Over IP, HDTV and DVD together.
Instead, there will almost certainly be some weak 'glue' or
'scaffold' protocol, perhaps SMIL (Synchronized Multimedia Integration
Language) or another XML variant, to allow anyone to put multimedia
elements together and synch them up without asking anyone else's
permission. Think of a Web page with South Park in one window
and a chat session in another, or The Horse Whisperer running
on top with a simultaneous translation into Serbo-Croatian underneath,
or clickable pictures of merchandise integrated with a salesperson
using a Voice Over IP connection, ready to offer explanations
or take orders.
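A minimal sketch of what such a glue document might look like,
written in SMIL 1.0 syntax; the elements (<par> for parallel
playback, <region> for layout) are real SMIL, but the sources and
addresses are invented and the details would certainly differ in
practice:

    <smil>
      <head>
        <layout>
          <region id="show" width="320" height="240"/>
          <region id="chat" left="320" width="320" height="240"/>
        </layout>
      </head>
      <body>
        <par>
          <video src="rtsp://media.example.com/south-park/ep12" region="show"/>
          <text src="http://chat.example.com/room42" region="chat"/>
        </par>
      </body>
    </smil>

Nothing about the video stream or the chat server has to change for
such a page to exist; the document simply points at both and declares
that they run together.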
In those cases, the creator of such a page hasn't really done
anything 'new', as all the content on those pages already exists
in separate protocols. As with the early Web, the 'glue' protocol subsumes
the other protocols and produces a kind of weak integration,
but weak integration is better than no integration at all, and
it is far easier to move from weak integration to strong integration
than from none to some. In five years, DVD, HDTV, Voice Over IP
and Java will all be able to interoperate because of some new
set of protocols which, like HTTP and HTML, is going to be weak,
relatively uncoordinated, imperfectly implemented and, in the
end, invincible.