U.S. Computer Insecurity Redux
The United States continues to face serious challenges in protecting computer systems and communications from unauthorized use and manipulation. In terms of computer security, the situation is worse than ever because of the nation’s dramatically increased dependence on computers, the widespread growth of the Internet, the steady creation of pervasively popular applications, and the growing dependence on the integrity of others.
There is a seemingly never-ending stream of old and new security flaws, as well as markedly increased security threats and risks, such as viruses, Trojan horses, penetrations, insider misuse, identity theft, and fraud. Today’s systems, applications, and networking tend largely to ignore security concerns, including such issues as integrity, confidentiality, availability, authentication, authorization, and accountability, as well as defenses against malicious code and e-mail spam, while would-be attackers and misusers have significantly wider knowledge and experience than before. Moreover, there is a general naiveté whereby many people seem to believe that technology is the answer to all security questions, irrespective of what the questions are.
In addition to security concerns, there are serious problems relating to system dependability in the face of a wide range of adversities. Such adversities include not only misuse but also hardware malfunctions, software flaws, power disruptions, environmental hazards, so-called “acts of God,” and human errors. The nation seems to have developed a rather blind faith in technologies that are often misunderstood or misapplied, and to place trust in systems and the people involved with them, even though neither has proven entirely worthy of that trust.
Solutions, but few takers
The irony is that many solutions to these problems are already available or can be developed in fairly short order, but they are not receiving the attention they deserve.
Indeed, one of the most striking factors relating to computer security and system dependability involves the widening gap between what has been done in the research community and what is practiced in the commercial proprietary software marketplace. Over the past four decades, there have been some major research advances (as well as some major paradigm shifts) toward achieving dependably trustworthy systems. These advances include novel distributed system architectures; software engineering techniques for development and analysis; stronger security and reliability; and the use of cryptography for authentication, integrity, and secrecy. But relatively few of those advances have found their way into the mass-market commercial mainstream, which has been primarily concerned with remarkable advances in hardware speed and capacity and with new whiz-bang software features. The resulting lowest-common-denominator software is often risky to use in critical applications, whether critical in terms of lives, missions, or financial operations.
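To make the cryptographic point slightly more concrete, the following is a minimal sketch (in Python, purely illustrative) of using a keyed hash to provide message authentication and integrity. The key handling, message, and function names are assumptions chosen for the example rather than a depiction of any particular research system.

    import hashlib
    import hmac
    import secrets

    # Hypothetical shared secret; in practice it would be provisioned and
    # protected by a key-management mechanism, not generated inline.
    key = secrets.token_bytes(32)

    def sign(message: bytes, key: bytes) -> bytes:
        """Compute an HMAC-SHA256 tag providing authentication and integrity."""
        return hmac.new(key, message, hashlib.sha256).digest()

    def verify(message: bytes, tag: bytes, key: bytes) -> bool:
        """Check the tag in constant time; any tampering changes the tag."""
        return hmac.compare_digest(sign(message, key), tag)

    msg = b"transfer $100 to account 42"
    tag = sign(msg, key)
    assert verify(msg, tag, key)             # untouched message verifies
    assert not verify(msg + b"0", tag, key)  # modified message is rejected

The constant-time comparison avoids leaking timing information during verification; secrecy would additionally require encryption, which this sketch deliberately omits.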
Another notable factor is that many of the system development problems that were recognized 40 years ago are still current. For example, software developments are typically late, over budget, and unsatisfactory in their compliance with requirements, which themselves are often ill-stated and incomplete. Certain historically ubiquitous types of software flaws remain commonplace, such as those that permit the execution of arbitrarily nasty code on the computers of unsuspecting victims or that cause systems to crash. Many of these problems can easily be avoided by the consistent use of hardware protections, carefully designed system interfaces, programming languages that are inherently less error-prone, and good software engineering practices. Having more precise requirements in the first place would also help. However, most of the nation’s academic institutions now employ curricula that pay insufficient attention to the elements of disciplined system development and to system architectures that encompass the necessary attributes of dependability.
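As a hedged illustration of how a carefully designed interface avoids one historically ubiquitous flaw class, the following Python sketch contrasts a command-injection-prone call with a safer one; the log-file path, function names, and validation rule are hypothetical and chosen only for the example.

    import subprocess

    def fetch_log_unsafe(filename: str) -> str:
        # Flawed interface: untrusted input is spliced into a shell command,
        # so an argument such as "x; rm -rf /" would run arbitrary code.
        return subprocess.run("cat /var/log/" + filename,
                              shell=True, capture_output=True, text=True).stdout

    def fetch_log_safer(filename: str) -> str:
        # Safer interface: the argument is validated against a simple rule
        # and passed as data, never parsed by a shell.
        if not filename.isalnum():
            raise ValueError("unexpected log name")
        return subprocess.run(["cat", "/var/log/" + filename],
                              capture_output=True, text=True).stdout

The safer version treats the user-supplied name as data rather than as something to be interpreted, which is the essence of the argument for less error-prone interfaces and languages.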
One of the hopes for the future involves what I call “open-box software”: a category that includes what other people have called “open-source software” or “free software,” where “free” is interpreted not in terms of cost, but rather in terms of use and reuse. This concept stands in contradistinction to conventional closed-box software, which typically is proprietary. In essence, open-box software implies that it is possible to examine the source code, to modify it as deemed necessary, to use the code, and to incorporate it into larger systems. In some cases, certain constraints may be constructively imposed.
Open-box software is not a panacea when it comes to developing and operating dependable systems. (For example, the need to ensure disciplined system development is equally relevant to open-box systems.) Experience shows, however, that a community of developers of open-box software can greatly enhance interoperability and long-term evolvability, as well as security and reliability. If nothing else, the potential of open-box software is putting competitive pressure on the proprietary software marketplace to do much better. But the potential is much greater than that. For example, the Defense Advanced Research Projects Agency’s CHATS program (the acronym stands for Composable High Assurance Trusted Systems, although “Trustworthy” would be far preferable to “Trusted”) and other efforts, such as SELinux, have provided significantly greater security, reliability, and dependability in several popular open-box systems.
Multidimensional effort needed
Looking ahead, one of the major challenges will be developing systems that are self-diagnosing, self-repairing, self-reconfiguring, and generally much more self-maintaining. Critical applications, in particular, will require system technologies far in excess of what is generally available in the commercial marketplace today. Once again, although the research community has a plethora of approaches to that end, the commercial marketplace has not been adequately responsive, with the notable exception of IBM, which seems to be devoting considerable effort to autonomic computing.
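As a rough sketch of what “self-diagnosing” and “self-repairing” can mean at the smallest scale, the following Python supervisor loop restarts a failed component. The managed command, polling interval, and health criterion are assumptions made for illustration; production autonomic systems extend this pattern with far richer health models, reconfiguration policies, and escalation to human operators.

    import subprocess
    import time

    SERVICE_CMD = ["python3", "worker.py"]   # hypothetical managed component
    CHECK_INTERVAL = 5                       # seconds between health probes

    def start() -> subprocess.Popen:
        return subprocess.Popen(SERVICE_CMD)

    def healthy(proc: subprocess.Popen) -> bool:
        # Minimal "self-diagnosis": the component is considered healthy if
        # its process is still running; a real monitor would also probe
        # application-level state.
        return proc.poll() is None

    def supervise() -> None:
        proc = start()
        while True:
            time.sleep(CHECK_INTERVAL)
            if not healthy(proc):
                # Minimal "self-repair": restart the failed component and
                # record the event for later diagnosis.
                print("worker exited; restarting")
                proc = start()

    if __name__ == "__main__":
        supervise()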
A particular class of systems in which these problems come to the fore is represented by voting machines that are completely electronic, such as those that use “touch-screen” systems. Ideally, these machines should satisfy stringent requirements for reliability, accuracy, system integrity, tamper resistance, privacy, and resistance to denial-of-service attacks, to list but a few. (They also should have all the other more general traits of security noted above.) In practice, current Federal Election Commission standards and requirements are fundamentally incomplete. Furthermore, the most widely used all-electronic systems provide essentially no genuine assurances that votes are correctly recorded and counted; instead, they provide various opportunities for undetected accidents and insider fraud. They also masquerade behind the cloak of closed-source proprietary code. Even so, extensive certification (against incomplete standards) and code review cannot defend against undetected accidents and fraud in these systems. This is an intolerable situation.
Digging a way out of today’s security and dependability morass will require a truly multidimensional effort. One important step will be to improve undergraduate and graduate education. There also needs to be much greater technological awareness of security and privacy issues on the part of researchers, development managers, system designers and programmers, system administrators, government funding agencies, procurement officers, system evaluators, corporate leaders, legislators, law enforcement communities, and even users.
There are no easy fixes, and responsibility is widely distributed. Much greater vision is necessary to recognize and accommodate long-term needs, rather than merely to respond to short-term problems. The risks of not finding solutions to these growing problems also must be clearly recognized. But the importance of protecting the nation’s critical computer and communications infrastructures is so great that these issues must be addressed with a much greater sense of urgency.