Critical IT Issues: The Next Ten Years

It’s a Monday morning in the year 2000. Executive Joanne Smith gets in her car and voice-activates her remote telecommunications access workstation. She requests all voice and mail messages, open and pending, as well as her schedule for the day. Her workstation consolidates the items from home and office databases, and her “message ordering knowbot,” a program she has instructed, delivers the accumulated messages in the order she prefers. By the time Joanne gets to the office she has sent the necessary messages, revised her day’s schedule, and completed a to-do list for the week, all of which have been filed in her “virtual database” by her “personal organizer knowbot.”

The “virtual database” has made Joanne’s use of information technology (IT) much easier. No longer does she have to be concerned about the physical location of data. She is working on a large proposal for the Acme Corporation today, and although segments of the Acme file physically exist on her home database, her office database, and her company’s marketing database, she can access the data from her portable workstation, wherever she is. To help her manage this information resource, Joanne uses an information visualizer that enables her to create and manage dynamic relationships among data collections. This information visualizer has extended the windows metaphor (graphical user interface) of the early 1990s to three-dimensional graphic constructs.

Papers that predict the form of IT in the year 2000 and how it will affect people, organizations, and markets are in plentiful supply. Scientific American has devoted a whole issue to this subject, describing how the computing and communications technologies of the year 2000 will profoundly change our institutions and the way we work.1 What is missing is a vision of what the IT function in a large organization must become in order to enable this progress. With some trepidation, we will attempt to fill this gap.

In the early 1980s, one of us published a paper that forecasted the IT environment in 1990.2 In this paper, we revisit those predictions and apply the same methodology to a view of the IT environment in the year 2000. We describe the fundamental technology and business assumptions that drive our predictions. Scenarios illustrate how the IT function will evolve in terms of applications, application architectures, application development, management of IT-based change, and economics.

References

1. Scientific American, September 1991. This issue is devoted to a series of articles on how computers and telecommunications are changing the way we live and work.

2. R.I. Benjamin, “Information Technology in the 1990s: A Long-Range Planning Scenario,” MIS Quarterly, June 1982, pp. 11–31.

3. M.L. Dertouzos, “Communications, Computers, and Networks,” Scientific American, September 1991, pp. 30–37.

4. T.W. Malone, J. Yates, and R. Benjamin, “The Logic of Electronic Markets,” Harvard Business Review, May–June 1989, pp. 166–172.

5. “A Talk with INTEL,” Byte, April 1990, pp. 131–140.

6. J. Yates and R.I. Benjamin, “The Past and Present as a Window on the Future,” in The Corporation of the 1990s, M.S. Scott Morton, ed. (New York: Oxford University Press, 1991), pp. 61–92.

7. V.G. Cerf, “Networks,” Scientific American, September 1991, pp. 42–51.

8. Unix, developed by Bell Labs in the early 1970s, is an “operating system, a religion, a political movement, and a mass of committees,” according to Peter Keen. “It has been a favorite operating system of technical experts . . . owing to its ‘portability’ across different operating environments and hardware, its support of ‘multitasking’ (running a number of different programs at the same time), and its building-block philosophy of systems development (building libraries of small ‘blocks’ from which complex systems can be built).” See

P.G.W. Keen, Every Manager’s Guide to Information Technology (Boston: Harvard Business School Press, 1991), pp. 156–157.

9. J.C. Emery, “Editor’s Comments,” MIS Quarterly, December 1991, pp. xxi–xxiii.

10. M.J. Piore and C.F. Sabel, The Second Industrial Divide: Possibilities for Prosperity (New York: Basic Books, 1984); and

J.P. Womack, D.T. Jones, and D. Roos, The Machine That Changed the World (New York: Rawson Associates, 1990).

11. T.W. Malone and J.F. Rockart, “Computers, Networks, and the Corporation,” Scientific American, September 1991, pp. 92–99.

12. S. Zuboff, In the Age of the Smart Machine: The Future of Work and Power (New York: Basic Books, 1988).

13. “Billing Systems Improve Accuracy, Billing Cycle,” Modern Office Technology, February 1990; and

C.A. Plesums and R.W. Bartels, “Large-Scale Image Systems: USAA Case Study,” IBM Systems Journal 23 (1990): 343–355.

14. M. Weiser, “The Computer for the Twenty-First Century,” Scientific American, September 1991, pp. 66–75.

15. Object request brokers are technologies that allow the user to access programs developed by other companies or groups, much as the telephone directory allows a user to speak with someone. These tools give more people access to pre-existing solutions. See:

H.M. Osher, “Object Request Brokers,” Byte, January 1991, p. 172.

16. K. Swanson, D. McComb, J. Smith, and D. McCubbrey, “The Application Software Factory: Applying Total Quality Techniques to Systems Development,” MIS Quarterly, December 1991, pp. 567–579.

17. M.S. Scott Morton, ed., The Corporation of the 1990s (New York: Oxford University Press, 1991), pp. 13–23.

18. K. Laudon, A General Model for Understanding the Relationship between Information Technology and Organizations (New York: New York University, Center for Research on Information Systems, January 1989).

19. See E.H. Schein, Innovative Cultures and Organizations (Cambridge, Massachusetts: MIT Sloan School of Management, Working Paper No. 88-064, November 1988); and

E.H. Schein, Planning and Managing Change (Cambridge, Massachusetts: MIT Sloan School of Management, Working Paper No. 88-056, October 1988).

20. J.F. Rockart and R. Benjamin, The Information Technology Function of the 1990s: A Unique Hybrid (Cambridge, Massachusetts: MIT Sloan School of Management, Center for Information Systems Research, Working Paper No. 225, June 1991); and

E.M. Von Simson, “The ‘Centrally Decentralized’ IS Organization,” Harvard Business Review, July–August 1990, pp. 158–162.

21. P.J. Dixon and D.A. John, “Technology Issues Facing Corporate Management in the 1990s,” MIS Quarterly, September 1989, pp. 247–255.

Reprint #: 3341
