On the Near Future of Business Computing

Espen Andersen, January 1991

Abstract:

In the next decade, the main influence of computers on organizations will come not so much from technological breakthroughs as from the proliferation of computers in itself. Keywords for this development will be standardization, electronic means of extending the business enterprise, object oriented systems, and the computer as a commodity.
 
"The fact that COBOL is around and going strong - even though it was obsolete by 1965 or so [...] says something about the power of the familiar."
Alan Kay1

Introduction

There is a tendency to see history as a string of discrete events2, which are viewed as having had an impact almost as they occurred. This may be because it often is the survivors of an era that write its annals, or simply because it is easier (and more entertaining) to chronicle the life and times of a few historical figures (whether they be living persons or other entities, such as the Spinning Jenny) than to tediously record all the small advances and ventures up blind alleys that, taken together, make up a revolution.

 The history of computers is often seen as a history of such events, typically compiled into timelines3. The events most often cited may be the ENIAC, the PDP-11, the Apple II, the VISICALC spreadsheet. One must, however, remember that these events more often than not were manifestations of what was happening at the time; if Apple had not become successful with the first personal computer, someone else would have - maybe Cromemco, maybe Texas Instruments, maybe Lee Felsenstein with the SOL.

 To find out what will happen in the near future, one could go to the research laboratories, look at what is generating interest there, and then with some certainty say when these inventions will be applied to real-world use and marketed commercially4. A roundup of the technologies in the labs as of today may give us neural networks, broadband ISDN, chaos theory, gallium arsenide microchips, groupware and massively parallel computers. All of these technologies will be available before 2000; in fact, most of them are available today in one form or another. Determining which of them will have the largest impact on organizations is informed guesswork rather than science, since the number of unidentifiable variables is enormous and the influence of future unrelated events large and unpredictable. Predictions are therefore often based on extrapolations of previous technical developments, limited by today's technological hurdles, such as the "speed limit" on microprocessors caused by the departure from the statistically based behavior of electrons as processor speeds go up.

 Yet the factors most influencing organizational structure and behavior are often determined outside the research laboratories: more often than not, the technically sophisticated loses out to the standardized, the understandable, the available. In the world of information technology, Herbert Simon's administrative man lives and decides. Moreover, technical breakthroughs are like earthquakes: we know they will come, but the timing is a problem.

 I believe that in the next three to five years, maybe even through the next decade, the main influence from technology on organizations will be the proliferation of computers in itself: the availability of more powerful and cheaper computers which can be connected together will itself spawn new ways to use them. I do not foresee any new "spreadsheet"; the main influence computers will have on organizations will be their ubiquity. Computers will be even cheaper, even smaller, easier to use, faster, and, above all, the need for specialized deployment knowledge will virtually disappear. The main directions will be towards standardization, towards systems extending beyond the traditional boundaries of the business enterprise, towards object orientation, and, on the consumer side, towards the computer as a commodity, a household item. The computer will, in most families, eventually occupy a place somewhere between the car, the telephone, the TV set and the stereo equipment.

Standardization

The first computers were large machines, run by specialists. Through remote I/O devices, ever increasing processing power, timesharing and a more widespread knowledge of Fortran and other high-level languages, the computer became a tool for scientists and accountants alike. The minicomputer introduced departmental computing, with less need for a specialized processing department. Companies like Wang, Digital, Prime and Data General thrived on providing computing power to various niches in the market. Users were born and reared in the environment of a single computer manufacturer, whose architectures and user interfaces were pointedly different from one another.

 The personal computer started out around 1974 both as a revolt against the traditional world of computers, aiming to bring "computers to the people"5, and as a hobby for computer professionals. Various companies (Apple with the Apple II, Tandy with the TRS-80, Commodore with the PET and later the VIC-20, Atari with the 400 and 800) made and successfully sold computers, but it took VISICALC and later the IBM PC and Lotus 1-2-3 to really get the message through to the business community. From then on it has largely been a battle between the IBM-compatible camp and the Apple Macintosh, with the minicomputers losing out to ever more powerful PCs and workstations. The mainframe is gradually being relegated to server status, and the main headache of the CTO is how to tie all these little things together again (and to hope that at least some of the users back up their data every once in a while).

The main issue in the technical debate for the last three years has been standardization. Standards come in two flavors: de jure standards, laid down by official committees, and de facto standards, established through marketplace dominance.

Rapidly developed and well communicated standards are of paramount importance in the adoption of technology6. Examples of successes where standardization was important may be the C programming language, the IBM-compatible PC market, and the Macintosh user interface. Examples of failures may be ISDN, where the standards have taken too long, or CD-ROM drives, where a plethora of relatively small manufacturers and information vendors, each holding on to their own retrieval software, have led to relatively slow adoption of a technology that should hold great potential.

 I see a dramatic awakening to standards in the information technology industry. Manufacturers see that there is a need for them to adhere to standards, as witnessed by the scramble towards UNIX by all the previously proprietary-minded minicomputer manufacturers. There is also an understanding that complete standardization is not possible. IBM's SAA, which was intended to be a focusing standard for IBM developments, has been expanded under pressure from the installed customer base to a point where it now includes almost everything IBM does (and has ever done). A more realistic strategy is to lower the ambitions a bit and aim for interoperability, where standardization is achieved by having translation functions in the systems, or by having standards for data formats rather than executables. A well-known example of interoperability is the Microsoft Word word processor and the Excel spreadsheet: documents from these applications can be copied between IBM-compatible PCs and Macintoshes without conversion. There is a need for interoperability on a lower level (binary compatibility) as networks proliferate, but this has not to my knowledge been achieved except within applications from a single vendor (an example may be the Smalltalk development environment, which can exchange images among technological platforms without any conversion). Another form of interoperability lies in the definition of interfaces between applications, as demonstrated by Apple HyperCard front-ends to databases residing on mainframes. Hardware forms of interoperability are found in the SCSI interface for computer peripherals, now finally adopted on a large scale by IBM7, and the 1.44 MB 3.5'' diskette, now adopted by both Apple and NeXT.
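
 To make the data-format idea concrete, consider a minimal sketch in present-day Python (a language chosen here purely for illustration; the pipe-separated format and field names are hypothetical stand-ins for a real interchange standard). Two applications on different platforms share no executables, only an agreed format:

    # Interoperability through a standard data format rather than
    # binary compatibility. The format below is invented for illustration.

    def export_document(title, body):
        """Application A saves its document in the agreed interchange format."""
        return f"TITLE|{title}\nBODY|{body}\n"

    def import_document(text):
        """Application B, on another platform, reads the same format back."""
        fields = dict(line.split("|", 1) for line in text.strip().splitlines())
        return fields["TITLE"], fields["BODY"]

    exported = export_document("Q1 Report", "Sales rose in January.")
    assert import_document(exported) == ("Q1 Report", "Sales rose in January.")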

 The most powerful standards may be the ones proposed and developed by institutions that do not have any direct commercial interest in them. Examples here may be the X windowing standard or the Kerberos authentication system, both developed at MIT. The "official" standards committees often take so long getting a standard out that technological evolution has surpassed it by the time it arrives (as seen with ISDN), and the intervendor organizations often get lost in competitive bickering over details (as seen in the case of the OSF versus the SUN/AT&T consortium). On the other hand, independently defined standards may be incomplete in areas of little interest to their developers.

The Electronic Extension of the Business Enterprise

Businesses increasingly find that their interaction with the environment is electronically based. EDI (Electronic Data Interchange) takes care of the transactions with their suppliers, financial services and major customers; electronically based brokers may handle contact with their more sporadic organizational customers, and electronic mail-order services the contact with individuals. Communications services such as Compuserve or MCI Mail may keep them in touch with their travelling representatives, be they sales personnel, service personnel or executives. Information partnerships may effectively add value to their products by linking them to other products and services from cooperating companies8.

 Interorganizational systems may be looked upon as a form of standardization. By agreeing on a standard way to conduct business, organizations connect electronically with each other, saving time and money by designing their systems for the normal, uncomplicated business transaction and leaving the exceptions to be handled by humans. Such systems may be used to gain a tremendous (and, some would argue, anti-competitive) advantage, as seen in the air travel industry9. Interorganizational systems may be initiated by either party to the business transaction, by an organization in the middle, by mutual consent, or, as is increasingly common, by a market provider organized specifically to provide a business arena, an electronic marketplace. Standards for the exchange of documents are emerging, notably the EDIFACT standard, which now dominates the EDI market.
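
 As a sketch of this "automate the normal case" principle, consider the following hypothetical order handler in Python (the field names, part numbers and quantity limit are invented; no real EDI syntax is implied). Well-formed, ordinary orders are confirmed automatically, while anything unusual is routed to a human:

    # Standard transactions are processed automatically; exceptions go to a clerk.
    KNOWN_PARTS = {"A100", "B200"}          # hypothetical product catalog

    def handle_order(order):
        """order is a dict such as {'part': 'A100', 'quantity': 50}."""
        part, quantity = order.get("part"), order.get("quantity", 0)
        if part in KNOWN_PARTS and 0 < quantity <= 1000:
            return f"auto-confirmed: {quantity} x {part}"
        return "routed to a clerk for manual handling"

    print(handle_order({"part": "A100", "quantity": 50}))   # the normal case
    print(handle_order({"part": "Z999", "quantity": 50}))   # the exception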

 I believe that the move towards an electronic extension of the enterprise will find a number of companies in dire straits, not so much because of technical inability as because of lack of imagination. Marketing electronically requires a different way of thinking about the customer than before: the successful companies will be the ones that exploit the technology's ability to add new value for the customer: value through specialized configuration, tailored products and procedures, and an extended range of products offered based on how the customer behaves. There is much to be learned here from the otherwise staid mail-order business. A mail-order company always keeps track of which advertising in which magazines gives what kind of response; this knowledge of the distribution channels is the core of the business. As we see electronic services being extended to the home, companies mastering this form of marketing will find new markets, at a time when access to products becomes an increasingly important part of the purchasing decision.
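
 The bookkeeping behind that mail-order discipline is simple enough to sketch in a few lines (the channel names and data below are invented for illustration): tally every response against the channel that produced it, and the comparison of distribution channels falls out directly:

    from collections import defaultdict

    # Count responses per advertising channel (hypothetical data).
    responses = defaultdict(int)
    for channel in ["BYTE ad", "Sunday insert", "BYTE ad", "Catalog mailing"]:
        responses[channel] += 1

    # Rank the channels by the response they generated.
    for channel, total in sorted(responses.items(), key=lambda kv: -kv[1]):
        print(f"{channel}: {total} responses")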

Software Mimics Life: Object Orientation

Object oriented systems development is systems development done the way organizations are developed: by making self-contained entities that do their specialized jobs, communicating with other entities (objects) as the need arises10. Object oriented programming stems from the SIMULA simulation language11, developed in the sixties. The idea was picked up by Alan Kay and used in the development of Smalltalk, a graphically based object oriented development environment created at Xerox PARC12. Object oriented programming permits a prototyping approach to systems development: one takes standardized building blocks and modifies them by creating special varieties (subclasses) as the need arises. The developer no longer needs to maintain a central structure of environment variable states; each object holds information about its own state, and the ability to respond to messages from other objects to change or display it.
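
 A minimal sketch in present-day Python (the Account class is a hypothetical example, not taken from any of the systems mentioned above) shows the idea of state living inside the object, changed and reported only through messages:

    class Account:
        """Each object carries its own state; no central table of variables."""

        def __init__(self, balance):
            self.balance = balance          # state lives inside the object

        def deposit(self, amount):          # a message that changes the state
            self.balance += amount

        def display(self):                  # a message that reports the state
            return f"balance: {self.balance}"

    a = Account(100)
    a.deposit(50)
    print(a.display())                      # balance: 150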

 Object oriented programming is especially suited for the complex world of graphical user interfaces. As many a Macintosh programmer has experienced, programming in an event-driven, graphically based environment is non-trivial because of the number of environmental variables that constantly have to be monitored. Object oriented programming organizes the whole environment in a hierarchy, where each object responds to a message either by executing it itself, or by borrowing methods to do so from objects higher up in the hierarchy (inheritance). In this way, each object stores only the methods (that is, code) that differ from those of the objects above it in the hierarchy; for everything else, the code from the higher objects is used (code reusability). This makes object oriented development more a question of finding a pertinent existing class of objects and modifying it than of actually writing new code. As most object oriented programming systems come with an extensive set of standard objects, actually having to write a new object from scratch is relatively rare.
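
 Continuing the hypothetical Account sketch above, a subclass stores only the code that differs; the rest is borrowed, unchanged, from the class above it in the hierarchy:

    class SavingsAccount(Account):          # Account as defined in the sketch above
        def __init__(self, balance, rate):
            super().__init__(balance)       # reuse the inherited initialization
            self.rate = rate

        def add_interest(self):             # the only genuinely new code
            self.deposit(self.balance * self.rate)   # deposit() is inherited

    s = SavingsAccount(100, 0.05)
    s.add_interest()
    print(s.display())                      # display() is inherited unchanged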

 Object orientation is beginning to show up in user interfaces as well. Bill Gates, founder of Microsoft, has long been a proponent of what he calls "document-driven" (as opposed to "application-driven") computer usage. Under this paradigm, the user groups his files according to their content, not according to the applications that created them. A document may consist of text, drawings, calculations, pictures or data from a database. The idea is that the user shall concentrate only on the content of the document, and that the document itself will know which application to invoke according to what the user wants to change. Moreover, the user can create groups of files, by content, which will be dynamically linked to each other. Elements of this idea can be seen in the Macintosh file system, where the file knows which application created it, and in the Lotus Magellan file management program for DOS-based machines, where files are grouped across directories according to content.
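
 The mechanism can be hinted at with a small dispatch table (the content types and application names below are invented for illustration): the document records what each of its parts is, and the right application follows from that, not from the user's choice:

    # A document knows, part by part, which application handles it.
    HANDLERS = {"text": "word processor",
                "chart": "spreadsheet",
                "picture": "drawing program"}   # hypothetical application names

    document = [("text", "Quarterly summary..."),
                ("chart", "sales by region")]

    for kind, content in document:
        print(f"editing '{content}' invokes the {HANDLERS[kind]}")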

 Today's computers have hardware capable of running incredibly complex software--software that has not been written yet, a situation referred to as the "software gap". The complexity of new software lies not so much in more complex algorithms or more advanced functions--at least not for the business community. Instead, the powerful new hardware will be used to let the computer take over more of the complexity of the interaction between man and computer: graphical user interfaces, alternative input devices, transparent communication. I think object orientation, both in development and in user interfaces, gives an answer to the question of how to handle the incredible complexity arising from this shift in cognitive responsibilities.

Standardized usage: The computer as a commodity

In his rambling novel Cannery Row, John Steinbeck describes how the Ford Model T changed Americans' attitude towards technology. The innards of motor vehicles became familiar to a vast number of people who suddenly had the financial means to own a car. A monkey wrench ceased to be a specialist's tool, instead becoming a commodity found and used (and frequently stolen) everywhere. The Model T had its technical idiosyncrasies: a gas tank hidden under the back seat, an inadequate fuel supply system, and brakes that would wear out fast. The owners of Model Ts learned to live with them, inventing tricks like backing up steep hills to keep the gasoline flowing to the engine, or using the reverse gear as a brake when the regular brakes no longer functioned.

 It is tempting to view the IBM personal computer, introduced in 1981, as the Model T of the information age. It certainly had its idiosyncrasies (a crippled operating system, a multitude of screen standards of varying quality, and a bulky and heavy case which had the switches controlling the system configuration inside, secured with 5 screws). Just like the Model T, it made the regular user technically literate: as the need for upgrades arose, people bought expansion cards and installed them themselves. Few original IBM PC users have not been inside their machines.

 As more and more people get experience in the use of computers, we may expect the computer to turn into a standardized household item, almost a commodity, much like the automobile of today. The computer will become part of everyday life in the household to a much larger extent than it is today; we will see systems for paying bills (such as CheckFree) and communication services (like Compuserve or Prodigy) becoming ubiquitous. In France, the Minitel system has been relatively successful, mainly because a) getting the terminal was very cheap (initially it was freely distributed), and b) the telephone directory of France was offered electronically, which instantly gave the user an application of real value. Other attempts to establish a public communication system have not been as successful, although Compuserve now has 500,000 members and recently (January 1, 1991) added a complete telephone directory for the US to its services. Certainly IBM has seen the market possibilities with its PS/1 home computer--still, there is a need for a product that spans more than the traditional uses of a personal computer: a multi-media machine capable of something more than word processing and remote bank account management.

 There is an issue of accessibility, or "readiness-to-hand"13; for the computer to achieve the place the car now has in society (as pictured in Apple's Knowledge Navigator video), certain things must happen. Firstly, the computer must be portable. There is a powerful move towards portable computers gradually replacing the deskbound ones; even the newspaper trend journalists have understood that portable computers are IN14. Alan Kay has envisioned a computer called the Dynabook: something so versatile that the user would write the grocery shopping list on it, and so handy that it could be carried together with two bags of groceries15. Secondly, new user interfaces such as voice or handwriting recognition must become practical. The popularity of cellular phones may, in my opinion, in part be credited to their simple user interface. Thirdly, access to information and communication services must be seamless and cheap enough to get the "Model T" effect rolling: the consumer electronics industry needs to do what was done in the VCR market--initially take low profits to establish a standard and a distribution system, then earn money through large volumes and spin-off business. (Another issue is that most consumer electronics items are far too complex for the average user; the design of programmable VCRs can leave even experienced computer programmers helpless16.)

 Accessibility is also a question of previous knowledge: through education of the users, the need for a specialist is reduced or disappears. This has taken place in most commercialized technologies; around the turn of the century, having a car also meant having a chauffeur. I think the element of user education has been pushed a bit too far: to effectively use and understand a computer, the user must have an understanding of "why" as opposed to "how". Many users of computers, especially in production- or transaction-oriented environments, have neither the capability nor the interest to develop this understanding. Moreover, the people training them have from the beginning had a "how" orientation; the result can be seen in the many handwritten lists of key sequences found taped to computer terminals. To overcome this, software developers, including those building the large business systems, will have to adopt not only new user interfaces, but also create metaphorical systems in which the user can deduce what to do by referring to what he or she would have done in "real life".

The future is what you believe it to be

In their 15th anniversary issue, BYTE magazine asked a number of well-known computer experts which technologies would have the largest impact on the way we compute in the future, and what the next "killer application" would be. The answers17 were all over the map, which just goes to prove that predictions about the future come in as many flavors as there are people predicting.

My personal dream of the computer of the future is not one, but two systems: one is a portable computer much like the Knowledge Navigator depicted by Apple. The other is a desk-size LCD screen, tilted towards the user like an engineer's drawing table. On this desktop, everything normally found in an office would be in digital form: the file cabinet, the telephone, the documents floating around. Applications would be available as the documents are accessed. Information repositories would be instantly and transparently on-line. The voice and video-based electronic mail system would have a filter shutting out sales pitches. In fact, the only physical things in this office would be the chair, the coffee mug--and the dust, since the owner would have better things to do than sit tied to a computer all day....

Endnotes:

  1. BYTE Magazine, September 1990, p. 335.
  2. Carr, Edward Hallett: What Is History?, New York: Vintage Books 1961.
  3. See for instance "The Computer Age" and "The Precomputer Age", Computerworld, November 3, 1986.
  4. Beniger, James R.: The Control Revolution, Cambridge, Massachusetts: Harvard University Press 1986.
  5. Levy, Stephen: Hackers, Anchor Press/Doubleday 1984.
  6. Kosnik, Thomas J.: "Perennial Renaissance: The Marketing Challenge in High Tech Settings", in Von Glinow, Mary Ann and Mohrman, Susan Albers (editors): Managing Complexity in High Technology Organizations, New York: Oxford University Press Inc., 1990. Also Rogers, Everett M.: Diffusion of Innovations, New York: Free Press 1983.
  7. Baran, Nick: "IBM in the Nineties", BYTE Magazine, IBM Special Edition, Fall 1990, pp. 63-70.
  8. Konsynski, Benn R. and McFarlan, Warren: "Information Partnerships--Shared Data, Shared Scale", Harvard Business Review, September-October 1990.
  9. See for instance the debate in Harvard Business Review following Max D. Hopper's article "Rattling SABRE--New Ways to Compete on Information" in the May-June 1990 issue.
  10. Andersen, E. and Konsynski, B.R.: "What the Hell is OOPS, anyway?", Technical note, Harvard Business School, 1990.
  11. Dahl, Ole-Johan, Myhrhaug, B. and Nygaard, Kristen: SIMULA 67: Common Base Language, Norwegian Computer Center, 1970.
  12. Goldberg, Adele (ed.): A History of Personal Workstations, Reading, Massachusetts: Addison-Wesley 1988. Also Krasner, Glenn: Smalltalk-80: Bits of History, Words of Advice, Reading, Massachusetts: Addison-Wesley 1983.
  13. Winograd, T. and Flores, F.: Understanding Computers and Cognition, Reading, Mass.: Addison-Wesley 1985.
  14. Stocker, Carol: "1991: What's IN and What's OUT", Boston Globe, January 1, 1991.
  15. Alan Kay, speaking at the Norwegian School of Management, December 1989.
  16. Norman, Donald A.: The Psychology of Everyday Things, New York: Basic Books 1988.
  17. "The BYTE Summit", BYTE Magazine, September 1990.

Copyright © 1991 Espen Andersen.