We appreciate Paul Strassman's passion for minimizing the risks to the world's computer systems and networks, and we are confident that his interest is in improving the security of computers and networks. However, the analysis in his paper suffers both from a fundamentally flawed assumption and from a series of errors in assessing the state of computer security and its causes.
The fundamental error in Strassman's paper is confusing computers with potatoes. Potatoes are natural organisms that hybridize and evolve over time. If a farmer is unhappy with the attributes of a particular variety of potato -- such as its resistance to pests or disease -- he or she must find and plant another variety or breed a new one over time. There is no option to alter the attributes of the potato itself to resist the new threat.
In contrast, computers and networks are not organic systems and do not evolve or manifest new vulnerabilities to attack on their own. Instead, they are artifacts that are built and maintained by teams of people. Those teams apply expertise in a variety of specialties, including security, to the task of developing, operating, and maintaining the computers and networks in a potentially hostile world. If a system or network manifests undesirable properties such as vulnerability to a particular form of attack, the developers or maintainers can intervene rapidly and change those properties to suit the new realities.
If an analogy must be used, a better one is to man-made artifacts such as fleets of trucks or aircraft. Fleet operators standardize on a few types of vehicles based on functional requirements such as capacity and fuel efficiency. Such standardization allows them to select the "best of breed" -- the product of the best design engineers and factories -- to build a high level of expertise and experience in their operations and maintenance staffs, and to standardize facilities and supporting equipment to maximize the efficiency and effectiveness of their operations.
The fleet analogy is certainly more apt than an organic one when considering computers and networks. Organizations want the products of the best development teams, and they want to know that those development teams will be there to back them up if a problem is discovered. They want to standardize on the best antivirus and intrusion detection software -- products that are unlikely to be available for a wide variety of truly different platforms. They want to employ operators and support staffs that are highly trained, not staffs that are equally inexpert on a variety of platforms. Finally, they want security management personnel who know the systems they are monitoring and managing rather than personnel who are equally inept at a "biodiverse" variety of unfamiliar systems.
While the potato analogy is flawed, the assumption of a Microsoft-dominated "monoculture" is no better. In fact, the information infrastructure of network components and servers that Strassman refers to is not Microsoft-dominated. (More on this point later.) An attack on the nation's or the planet's information infrastructure would have to target not only Microsoft's products but also those of a variety of competitors. Even where Microsoft products are used, the markets for security components such as firewalls, virus scanners, and intrusion detection systems are very competitive.
Strassman seems to draw his assumption of a Microsoft monoculture solely from the desktop market. Desktops are indeed the locus of viruses, and his analysis may well be drawn solely from widely publicized virus incidents; but our experience with viruses demonstrates the flaw in his implicit assumption that Microsoft is indifferent to customers' security needs.
After the "I Love You" virus incident, Microsoft developed and released to its Web site the Outlook E-mail Security Update, which fundamentally changes the ability of viruses to propagate using Microsoft software. The decision to release the update was not taken lightly, because limiting virus propagation requires, for fundamental reasons of computer science, limiting the flexibility available to users. But the need became clear and the development was undertaken. And because computers are not organic systems, millions of users were free to install the update and protect themselves the day after it was posted to Microsoft's Web site.
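The principle behind that update -- restricting the attachment types a user can open or run -- can be sketched in a few lines. The following is a minimal illustration of the general idea only, assuming a hypothetical deny-list of executable extensions; it is not Microsoft's actual implementation.

```python
# Minimal sketch of extension-based attachment blocking, the general
# principle behind restricting executable attachments. Illustrative
# only: the extension list and names here are hypothetical, not
# Microsoft's actual implementation.

import os

# Hypothetical deny-list: attachment types treated as executable and
# therefore blocked from being opened or saved by the user.
BLOCKED_EXTENSIONS = {".exe", ".vbs", ".js", ".bat", ".cmd", ".pif", ".scr"}

def is_blocked(filename: str) -> bool:
    """Return True if the attachment's final extension marks it as executable."""
    _, ext = os.path.splitext(filename.lower())
    return ext in BLOCKED_EXTENSIONS

# The "I Love You" worm spread as LOVE-LETTER-FOR-YOU.TXT.vbs, relying
# on the .TXT in the name to make a script look like harmless text.
print(is_blocked("LOVE-LETTER-FOR-YOU.TXT.vbs"))  # True
print(is_blocked("quarterly-report.txt"))         # False
```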
Microsoft has invested heavily in enhancing the security of its products. The list of investments is too long for this brief column, but four examples are illustrative:
- Microsoft has integrated advanced authentication and encryption technologies into its Windows 2000 products, and done so in such a way that they can be deployed and managed securely even by administrators who are not security or encryption specialists.
- Microsoft has made a major investment in developing and applying automated tools that can detect and eliminate security flaws in programs (a simple sketch of this kind of check follows this list). By applying these tools in the development phase, Microsoft is able to eliminate vulnerabilities from even rich, highly functional products.
- Microsoft has made a major investment in the industry's leading security response process to ensure that every customer report of a product vulnerability is evaluated and that, if necessary, the availability of software security updates is communicated to customers in an open and public way.
- Microsoft has licensed its source code to academic and research organizations for their security review, and for their use in researching new security techniques.
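As a rough illustration of the automated checking described in the second item above, consider a sketch that flags calls to C library functions with a history of buffer-overflow misuse. The deny-list here is hypothetical, and real analysis tools go far deeper than pattern matching, but the sketch conveys the idea.

```python
# Minimal sketch of an automated source check: flag calls to C library
# functions commonly associated with buffer overflows. Illustrative
# only; production tools perform much deeper analysis.

import re

# Hypothetical deny-list of C functions frequently flagged as unsafe.
UNSAFE_CALLS = {"gets", "strcpy", "strcat", "sprintf"}

PATTERN = re.compile(r"\b(" + "|".join(UNSAFE_CALLS) + r")\s*\(")

def scan(source: str) -> list[tuple[int, str]]:
    """Return (line number, function name) for each flagged call."""
    hits = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for match in PATTERN.finditer(line):
            hits.append((lineno, match.group(1)))
    return hits

example = "char buf[16];\nstrcpy(buf, user_input);  /* overflow risk */\n"
print(scan(example))  # [(2, 'strcpy')]
```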
As a major and successful company, Microsoft makes a ready target for the accusations made in Strassman's paper, but a more careful analysis would focus attention elsewhere. Unix and Linux systems from a variety of apparently independent sources share common components -- and common vulnerabilities. A cursory review of the CERT Coordination Center's advisory database reveals repeated references to BIND, Sendmail, and ftpd -- all common components that are released (in source code form) for integration into Unix and Linux systems.
A vulnerability in such a component jeopardizes the security of customers across a wide range of suppliers, and many of those customers may not realize that their security depends on code that the supplier didn't develop and may not have examined. The widespread assumption that availability of source code equates to security is disproved in practice: when everyone can review the code, everyone assumes that "someone else" will review it for security. The shared code has led to shared vulnerabilities that were exploited to attack individual systems and to execute distributed denial-of-service attacks on Internet Web sites.
We submit that the interests of security are best served by the presence of strong suppliers, such as Microsoft, whose businesses and reputations are tied to the security of their products -- and which have the resources to invest in building security into their products, learning from operational experience, and continually improving security over time.
Microsoft is committed to continuous improvement of both the security features in its products and the processes used to eliminate vulnerabilities. Achieving security in the real world is difficult, and customers insist on products that offer rich functionality as well as security, so this is not an easy commitment to keep. We often refer to security as "a journey rather than a destination," but our obligation to our customers requires that we stay on that journey and continue to address their needs for more and better security.