
Introduction

This article addresses the question of how best to develop standards for the information technology industry. While it may be possible to generalize our conclusions to other industry groups, the focus here is on the issues faced by the information technology industries.

Marvin Minsky once said ``Anything that you hear about computers and AI should be ignored, because we're in the Dark Ages. We're in the thousand years between no technology and all technology. . . ''[6, p. 104] While information technology is probably in somewhat better shape than Minsky suggests is the case for AI, it is true that we are in a period of very rapid and often unpredictable change. Bibliophiles will be familiar with the term ``incunabula,'' which is used to refer to books produced between 1450 and 1500. The term more generally refers to any art or industry in the early stages of development. Information technology is currently in the early stages of development -- in its period of incunabula. Developments are both rapid and pervasive. There are many indications of the rate and extent of change.

All of these are indicators of a fundamental shift. As Lucky[18] and Negroponte[20] have pointed out, we appear to be moving from an economy based on physical commodities to an economy based on information. A number of authors have addressed this transformation as it pertains to reengineering or new organizational structures.

In this milieu, a number of organizations are setting standards for information technology. As has been the case over the years, these standards are one of the important factors in bringing about stability in the technology. Some estimates suggest that more than 50% of all new standards pages being developed today are related to information technology. This number is of great interest in that the information technology industries represent only a small portion of the industries engaged in setting standards. (If all industries set standards at the same rate, one might expect fewer than 1% of standards pages to relate to information technology.) Another measure of the importance of this effort is the estimated cost of developing these standards. Spring and Weiss[30] make a conservative estimate of the cost of only one of the Ethernet standards at about $10,000,000. Some have suggested that the OSI effort, which many now feel is doomed to obsolescence before implementation, may have cost the governments and corporations that contributed to its development more than half a billion dollars. While these estimates have focused on standards developed in the traditional standards development organizations, the fact that the most significant cost factor is the contribution of employee time suggests that these cost estimates are likely to hold for standards developed in other kinds of environments as well.

We are inventing the information technology future at a rapid pace, and the extent of invention is vast. In this environment, a great deal is in flux. The challenge to develop a National Information Infrastructure is the latest addition to this mix, and, like it or not, it shifts the focus of technology development from one that is very much market driven to one that may be more influenced by public policy.

During the last quarter of 1994, Tony Rutkowski of the Internet Society and Steve Oksala of Unisys began a discussion about the criteria for recognizing organizations to be involved in this standards development process -- specifically, as a part of the Information Infrastructure Standards Panel effort set up by ANSI. This discussion has practical, structural, and legal ramifications at the national and international levels. In this article the discussion is expanded and carried forward with an eye toward what needs to be done to make things better. Each of the authors has been involved in the development, management, and/or study of standards. The article endeavors to take a first-hand and dispassionate view of the developments over the last decade and asks how information technology standards might best be set over the coming decade.

To set the stage, we begin with a few important assumptions that underlie our thinking:

Standards are critical; methodology is a means to an end.
This is not to say that the methods of setting standards are unimportant. As is indicated below, the people who develop a standard and the process by which it is developed influence what emerges. It is to say that once a standard is accepted, the method by which it emerged is less important than the fact that an accepted standard exists. For example, if we accept that TCP/IP, Microsoft Windows, SQL, and Ethernet are all standards at some level, they are not more or less standards because of the radically different methodologies by which they were developed. A corollary to this assumption is that great, mediocre, or lousy standards can emerge from any of these processes.

Standards shape the marketplace.
When a standard is accepted, the marketplace is affected in some significant ways. On the downside, inertia and stagnation can set in. On the upside, there is increased competition in value-added products that conform to the standard and increase network externalities. While the arguments pro and con are too complex to settle in general, it may be safe to say the ideal standard stabilizes the market and allows for a variety of interoperable products that reduce costs through competition.

A good standard is invisible.
In a sense, a perfect standard is absolute. To the extent that we have a choice about standards, they are not perfect. When there is no choice, we tend to forget that the standard actually exists. For example, there are many different sizes of paper, but the overwhelming tendency in the U.S., with the notable exception of the legal profession, is to view standard business paper size as 8.5 by 11 inches -- in Europe it is A4. We don't have to specify the filing cabinet capacity, or the file folder size, or the binder size, or the dimensions of the three-hole punch, or the copier, fax, or laser printer size. All of these choices assume a standard size for business correspondence. Similarly, normal electrical devices for the home run on a standard voltage and frequency: we can plug a light, PC, or table saw into any outlet in the house without having to check at the time of purchase to make sure it will work. On the other hand, people are sometimes surprised that the PostScript file they just printed on their local office printer does not print across town at another office that uses the exact same hardware configured for HPGL. So long as we have to ask long and complicated questions about what standards a given device is compliant with, the standards are not invisible, and one might argue they are not standards.

Standards are not ends but means.
While many could point to situations in which standards were developed because the members of the committee wanted a standard, most would agree that the goal of standardization is more than a document. A standard serves as a mechanism to achieve economic and/or public-good goals. The economic goals, traditional to the US standards setting process, have to do with network externality, economy of scale, and interchangeability. The public-good goals, less prominent but nonetheless visible in the US standards setting process, have to do with governmental efforts to assure public access, public safety, and public welfare. Historically, these are reflected in safety standards, but they are also prominent in access to communications. Thus, our assumption here is that while there is a history of standards development as an end in itself, the rational -- professional, corporate, and national -- view is that standards are means to an end.

IT standards are tied to competitive issues.
Most simply, when technical standards are developed, it is sometimes the case (in the IT industry, perhaps frequently) that the competing choices are more or less equivalent at a technical level. In this situation, the choice of one approach over another can provide a clear competitive advantage to the organization that has the largest stake, investment, and lead in the particular approach. This may have significant implications when the competing industries are tied to particular nations and the standard being accepted is one that will affect the international market. (The HDTV standards are a good example. While most would now agree that the US position advocating a digital standard was technically correct, the controversy was as much motivated by international competition as by technical issues.)

Standards are developed within a cultural milieu.
While several of the assumptions above focus on the fact that standards are a means to an end, it is equally clear that the means is far from value free. Standards are developed by a community that has a set of values and a particular perspective on information technology. For example, if asked to guess the average age of committee members in X3, IEEE, and IETF working groups, few would guess that the X3 committee members would be younger than the IETF members. If asked about average length of hair or political affiliation, many would suggest that the members of these organizations are likely to differ. These differences might or might not be real in any given situation. Standards are developed by people, and oftentimes the people developing a standard have a long involvement in their particular field, so the paradigm of the field may work as a kind of selection mechanism.

On the pages that follow, the opening discussion between Tony Rutkowski and Steve Oksala has been summarized. (The actual mail notes are contained in Appendix A.)





