Managing Codified Knowledge
Leading management and organization theorists have popularized the concept of treating organizational knowledge as a valuable strategic asset.1 They advise that to remain competitive, an organization must efficiently and effectively create, locate, capture, and share knowledge and expertise in order to apply that knowledge to solve problems and exploit opportunities. As more firms begin to incorporate knowledge management into their overall business strategy, many are showing tremendous interest in implementing knowledge management processes and technologies.
Although knowledge management is gaining wider acceptance, few organizations today are fully capable of developing and leveraging critical organizational knowledge to improve their performance.2 Many organizations are so complex that knowledge is fragmented, difficult to locate and share, and therefore redundant, inconsistent, or not used at all. In today’s environment of rapid change and technological discontinuity, even knowledge and expertise that can be shared often quickly becomes obsolete. However, while the popular press calls for effectively managing knowledge, almost no research has been done regarding how to do it.
This article focuses on how to configure a firm’s resources and capabilities to leverage its codified knowledge. I refer to this broadly as a knowledge management architecture. I based this framework on research that was motivated by several questions:
- What are the characteristics of explicitly codified knowledge and how should organizations think about managing it?
- What role should information technology play?
- How are organizational capabilities and information technology best integrated and applied to managing knowledge?
- What lessons have companies learned in these endeavors?
To address these questions, I first describe the characteristics of explicit knowledge and its relationship to competitive advantage. Building on research and knowledge about the design of information products,3 I describe an architecture for managing explicit knowledge. I use that framework to derive two fundamental and complementary approaches, each of which is illustrated by a case study. I conclude with a summary of key issues and the lessons learned.
What Is Knowledge?
Knowledge is commonly distinguished from data and information. Data represent observations or facts out of context that are, therefore, not directly meaningful. Information results from placing data within some meaningful context, often in the form of a message. Knowledge is that which we come to believe and value on the basis of the meaningfully organized accumulation of information (messages) through experience, communication, or inference.4 Knowledge can be viewed both as a thing to be stored and manipulated and as a process of simultaneously knowing and acting — that is, applying expertise.5 As a practical matter, organizations need to manage knowledge both as object and process.
Knowledge can be tacit or explicit.6 Tacit knowledge is subconsciously understood and applied, difficult to articulate, developed from direct experience and action, and usually shared through highly interactive conversation, storytelling, and shared experience. In contrast, explicit knowledge is more precisely and formally articulated, although removed from the original context of creation or use (e.g., an abstract mathematical formula derived from physical experiments or a training manual describing how to close a sale).
Explicit knowledge plays an increasingly large role in organizations, and many consider it the most important factor of production in the knowledge economy. (Imagine an organization without procedure manuals, product literature, or computer software.)
Knowledge may be of several types,7 all of which can be made explicit:
- Declarative knowledge is about describing something. A shared, explicit understanding of concepts, categories, and descriptors lays the foundation for effective communication and knowledge sharing in organizations.
- Procedural knowledge is about how something occurs or is performed. Shared explicit procedural knowledge lays a foundation for efficiently coordinated action in organizations.
- Causal knowledge is about why something occurs. Shared explicit causal knowledge, often in the form of organizational stories, enables organizations to coordinate strategy for achieving goals or outcomes.
Knowledge also may range from the general to the specific:8
- General knowledge is broad, often publicly available, and independent of particular events. Because the context of general knowledge is commonly shared, firms can more easily and meaningfully codify and exchange it — especially among different knowledge or practice communities.
- Specific knowledge, in contrast, is context-specific. Codifying specific knowledge so that it is meaningful across an organization requires that a firm describe its context along with the focal knowledge. This, in turn, requires explicitly defining contextual categories and relationships that are meaningful across knowledge communities. To see how difficult (and important) this may be, ask people from different parts of your organization to define a customer, an order, or even your major lines of business, and observe how much the responses differ.9
Explicating Knowledge
Effective performance and growth in knowledge-intensive organizations requires integrating and sharing highly distributed knowledge.10 However, appropriately explicating tacit knowledge so it can be efficiently and meaningfully shared and reapplied, especially outside the originating community, is one of the least understood aspects of knowledge management. Yet organizations must not shy away from attempting to explicate, share, and leverage tacit, specific knowledge. This suggests a more fundamental challenge, namely, determining which knowledge an organization should make explicit and which it should leave tacit — a balance that can affect competitive performance.
Knowledge may be inherently tacit or seem tacit because no one has yet articulated it, usually because of social constraints.11 Articulating particular types of knowledge may not be culturally legitimate — that is, challenging what the firm knows may not be socially or politically correct, or the organization may be unable to see beyond its habits and customary practices. And, of course, making private knowledge publicly accessible may result in power redistribution that certain organizational cultures may strongly resist. In addition, intellectual constraints — that is, the lack of a formal language or model for articulating tacit knowledge — may impede efforts to make it explicit.
When one compares the potential explicability of a body of knowledge to whether a firm has articulated that knowledge, four outcomes are possible (see Figure 1). If left unarticulated, potentially explicable knowledge represents a lost opportunity to efficiently share and, thus, leverage that knowledge. A competitor, by articulating a similar body of knowledge so it can be routinely integrated and applied, may gain a competitive advantage in the marketplace. However, attempting to make inherently inarticulable knowledge explicit may result in losing the essence of that knowledge, causing performance to suffer. Determining when to make articulable knowledge explicit (i.e., exploiting an opportunity) and when to leave inarticulable knowledge in its “native” form (respecting both the inherent strengths and limits of tacit knowledge) is central to managing an appropriate balance between tacit and explicit knowledge.
Organizations often do not challenge the way they store, treat, or pass on knowledge, which may result in managers blindly accepting the apparent tacitness of some types of knowledge. Mrs. Fields Original Cookies developed process knowledge (i.e., cookie baking) to an explicable level and articulated the process in recipes that result in cookies of consistently high quality throughout the franchise network.12 Ray Kroc, founder of McDonald’s, gained tremendous leverage in articulating and routinizing the process of hamburger making to produce a consistent (if not gourmet) level of quality. But when imagination and flexibility are important, knowledge routinization may be inappropriate. It is the manager’s responsibility to know the difference.
Thus far, I have defined explicit knowledge, discussed some of its characteristics, and made a case for explicating knowledge. Although explicit knowledge represents only part of an organization’s intellectual landscape, it is crucial in a firm’s overall knowledge strategy. Next, I describe the frameworks and architectures needed for managing explicit knowledge.
Knowledge Management Architecture
The management of explicit knowledge utilizes four primary resources (see Figure 2):13
- Repositories of explicit knowledge.
- Refineries for accumulating, refining, managing, and distributing the knowledge.
- Organization roles to execute and manage the refining process.
- Information technologies to support the repositories and processes.
Knowledge Repository
Design of a knowledge repository reflects two basic components of knowledge as an object: structure and content.14 Knowledge structures provide the context for interpreting accumulated content. If a repository were conceived as a “knowledge platform,” a firm could derive many views of the content from a particular repository structure. Each view of the repository may differ on the basis of its content, format, and presentation context.15 A high degree of viewing flexibility enables users to dynamically alter and interactively combine views to more easily apply the knowledge to new contexts and circumstances. Knowledge-as-object becomes knowledge-as-process.
The basic structural element is the knowledge unit, a formally defined, atomic packet of knowledge content that can be labeled, indexed, stored, retrieved, and manipulated. The format, size, and content of knowledge units may vary, depending on the type of explicit knowledge being stored and the context of its use. The repository structure also includes schemes for linking and cross-referencing knowledge units. These links may represent conceptual associations, ordered sequences, causality, or other relationships, depending on the type of knowledge being stored.
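To make these structural ideas concrete, the following sketch shows one way to represent knowledge units, their index categories, and their links within a simple repository. It is a minimal illustration in Python; the names (KnowledgeUnit, Repository, by_category) are my own and are not prescribed by the architecture.

```python
from dataclasses import dataclass, field

@dataclass
class KnowledgeUnit:
    """An atomic, formally defined packet of knowledge content."""
    unit_id: str                                                # label used to store and retrieve the unit
    content: str                                                # the content itself (text, in this sketch)
    categories: list[str] = field(default_factory=list)         # index terms for retrieval
    links: dict[str, list[str]] = field(default_factory=dict)   # relationship type -> linked unit ids

class Repository:
    """Pairs accumulated content with the structure used to interpret it."""

    def __init__(self) -> None:
        self.units: dict[str, KnowledgeUnit] = {}

    def add(self, unit: KnowledgeUnit) -> None:
        self.units[unit.unit_id] = unit

    def by_category(self, category: str) -> list[KnowledgeUnit]:
        # One possible "view": every unit indexed under a given category.
        return [u for u in self.units.values() if category in u.categories]

    def linked(self, unit_id: str, relation: str) -> list[KnowledgeUnit]:
        # Follow links of one type (e.g., "causes", "next-step", "see-also").
        return [self.units[i]
                for i in self.units[unit_id].links.get(relation, [])
                if i in self.units]
```

In this sketch, different views are derived by varying the categories queried and the links followed, without changing the stored units themselves.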
To reflect a full range of explicit organizational knowledge, firms must strive to record in their repositories:
- Meaningful concepts, categories, and definitions (declarative knowledge).
- Processes, actions, and sequences of events (procedural knowledge).
- Rationale for actions or conclusions (causal knowledge).
- Circumstances and intentions of knowledge development and application (specific contextual knowledge).
- Linkages among the various types of knowledge.
Such a repository, when indexed using appropriate concepts and categories, can provide the organization with meaningful access to its content. The repository should accommodate changes or additions to the firm’s knowledge (e.g., by linking annotations), as subsequent authors and creators adapt the knowledge for use in additional contexts.
A knowledge platform may consist of several repositories, each with a structure appropriate to a particular type of knowledge or content. These repositories may be logically linked to form a composite or “virtual” repository, the content of each providing context for interpreting the content of the others (see Figure 3). For example, product literature, best-sales practices, and competitor intelligence for a particular market might be stored separately but viewed as though contained in one repository.
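Building on the repository sketch above, a composite or "virtual" repository can be pictured as a thin layer that applies the same view across several member repositories and merges the results. The class and repository names below are illustrative assumptions only.

```python
class CompositeRepository:
    """Logically links several repositories so they can be queried as if they were one."""

    def __init__(self, repositories: dict[str, Repository]):
        # e.g., {"product-literature": ..., "best-sales-practices": ..., "competitor-intelligence": ...}
        self.repositories = repositories

    def by_category(self, category: str) -> dict[str, list[KnowledgeUnit]]:
        # The same view drawn from every member repository, keyed by its source,
        # so each repository's content provides context for interpreting the others.
        return {name: repo.by_category(category)
                for name, repo in self.repositories.items()}
```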
Knowledge Refinery
The refinery represents the process for creating and distributing the knowledge contained in a repository. This process includes five stages (see Figure 2):
- Acquisition. An organization either creates information and knowledge or acquires it from various internal and external sources.
- Refinement. Before adding captured knowledge to a repository, an organization subjects it to value-adding processes (refining), such as cleansing, labeling, indexing, sorting, abstracting, standardizing, integrating, and recategorizing.
- Storage and retrieval. This stage bridges upstream repository creation and downstream knowledge distribution.
- Distribution. This stage comprises the mechanisms an organization uses to make repository content accessible.
- Presentation. The context in which an organization uses knowledge pervasively influences its value. Firms must develop capabilities that enable flexibility in arranging, selecting, and integrating knowledge content.
Acquisition, refinement, and storage create and update the knowledge platform, whereas retrieval, distribution, and presentation derive various views of that knowledge.
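As a rough illustration, the five stages can be pictured as a pipeline of functions, each handing its output to the next. The implementations below are placeholders chosen for brevity, not a description of any particular firm's refinery.

```python
# A minimal, hypothetical refinery pipeline: acquire -> refine -> store, then
# retrieve/distribute -> present to derive a view of the stored knowledge.

def acquire(sources):
    # Acquisition: gather raw content from internal and external sources.
    return [item for source in sources for item in source()]

def refine(raw_items):
    # Refinement: cleanse and label the content (here, trivially, by trimming
    # whitespace and tagging the first few words as keywords).
    return [{"content": text.strip(), "keywords": text.strip().lower().split()[:5]}
            for text in raw_items]

def store(repository, refined_items):
    # Storage: add the refined items to the knowledge platform (a list here).
    repository.extend(refined_items)

def distribute(repository, keyword):
    # Retrieval and distribution: select content relevant to a request.
    return [item for item in repository if keyword.lower() in item["keywords"]]

def present(items):
    # Presentation: arrange the selected content into a single view.
    return "\n---\n".join(item["content"] for item in items)

# Example use with a single, invented source:
repository = []
store(repository, refine(acquire([lambda: ["Pricing guidance for the Q3 campaign ..."]])))
print(present(distribute(repository, "pricing")))
```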
Knowledge Management Roles
Knowledge management programs often overemphasize information technology at the expense of well-defined knowledge management roles and responsibilities. Traditional organizational roles typically do not address knowledge management or the cross-functional, cross-organizational process that a firm uses to create, share, and apply knowledge. The architecture therefore suggests a set of organizational roles that a firm should define explicitly. First, some organizations assign a chief knowledge officer to manage knowledge comprehensively as a cross-organizational process; this person is responsible for the organization’s knowledge management architecture. Many organizations also cluster those responsible for knowledge management into knowledge or expertise centers, each responsible for a particular body of knowledge. Their responsibilities typically include championing knowledge management, educating the organization, mapping knowledge, and integrating the organizational and technological resources critical to the knowledge management architecture. In addition, firms must assign explicit responsibility for each stage of the refinery and for the interfaces between stages. Making someone accountable for the seamless movement of knowledge from acquisition through use helps to ensure that knowledge repositories are meaningfully created and effectively used.
Role of Information Technologies
The information technology infrastructure provides a seamless “pipeline” for the flow of explicit knowledge through the five stages of the refining process to enable:
- Capturing knowledge.
- Defining, storing, categorizing, indexing, and linking digital objects that correspond to knowledge units.
- Searching for (“pulling”) and subscribing to (“pushing”) relevant content.
- Presenting content with sufficient flexibility to render it meaningful and applicable across multiple contexts of use.
Using information technologies — for example, the World Wide Web and groupware — a firm can build a multimedia repository for rich, explicit knowledge. Organizations capture and store units of knowledge using forms that assign labels, categories, and indexes to the input. A flexible structure creates knowledge units, indexed and linked by category, that reflect both the structure of the organization’s contextual knowledge and the content of its factual knowledge and that can be displayed as flexible subsets through dynamically customizable views.
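One way to picture the "pull" and "push" modes is a small distribution service layered over the repository sketch introduced earlier: searching retrieves units by category on demand, while subscriptions deliver new units as they are published. The service and its method names are illustrative assumptions, not part of any particular product.

```python
from typing import Callable

class DistributionService:
    """Supports searching for ("pulling") and subscribing to ("pushing") repository content."""

    def __init__(self, repository: Repository):
        self.repository = repository
        # Map of category -> handlers to notify when matching content is published.
        self.subscribers: dict[str, list[Callable[[KnowledgeUnit], None]]] = {}

    def search(self, category: str) -> list[KnowledgeUnit]:
        # Pull: a user asks for content indexed under a category.
        return self.repository.by_category(category)

    def subscribe(self, category: str, handler: Callable[[KnowledgeUnit], None]) -> None:
        # Push: a user registers interest in a category.
        self.subscribers.setdefault(category, []).append(handler)

    def publish(self, unit: KnowledgeUnit) -> None:
        # New content is stored, then pushed to every subscriber whose category matches.
        self.repository.add(unit)
        for category in unit.categories:
            for handler in self.subscribers.get(category, []):
                handler(unit)
```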
Effective use of information technology to communicate knowledge requires that an organization share an interpretive context. When communicators share similar knowledge, background, and experience, they can more effectively communicate knowledge via electronically mediated channels.16 For example, by means of a central electronic repository, an organization can disseminate explicit, factual knowledge within a stable community having a high degree of shared contextual knowledge. However, when communicators share an interpretive context only to a moderate degree, when they exchange knowledge that is less explicit, or when a community is loosely affiliated, more interactive modes — such as e-mail or discussion databases — are appropriate. When context is not well shared and knowledge is primarily tacit, firms can best support communication and narrated experience with the richest and most interactive modes, such as videoconferencing or face-to-face conversation.
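The matching of communication mode to shared context described above can be summarized, purely for illustration, as a simple lookup; the labels are mine and the mapping is only a coarse restatement of the guidance in the text.

```python
# Coarse, illustrative restatement of the guidance above: the less context is
# shared and the more tacit the knowledge, the richer and more interactive the
# appropriate communication mode.
CHANNEL_BY_SITUATION = {
    ("high shared context", "explicit knowledge"): "central electronic repository",
    ("moderate shared context", "less explicit knowledge"): "e-mail or discussion databases",
    ("low shared context", "tacit knowledge"): "videoconferencing or face-to-face conversation",
}

def recommend_channel(shared_context: str, knowledge_type: str) -> str:
    # Default to a richer, more interactive mode when the situation is unclear.
    return CHANNEL_BY_SITUATION.get((shared_context, knowledge_type),
                                    "richer, more interactive modes")
```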
Classifying Knowledge Management Applications
On the basis of this concept of knowledge management architecture, a firm can segment knowledge processing into two broad classes: integrative and interactive, each addressing different knowledge management objectives. Together, these approaches provide a broad set of knowledge-processing capabilities. They support well-structured repositories for managing explicit knowledge, while enabling interaction to integrate tacit knowledge.
Integrative Applications
Integrative applications exhibit a sequential flow of explicit knowledge into and out of a repository. Producers and consumers interact with the repository rather than with each other directly. The repository becomes the primary medium for knowledge exchange, providing a place for members of a knowledge community to contribute their knowledge and views. The primary focus tends to be on the repository and the explicit knowledge it contains, rather than on the contributors, users, or the tacit knowledge they may hold.
Integrative applications vary in the extent to which knowledge producers and consumers come from the same knowledge community. At one extreme, which I label electronic publishing, the consumers (readers) neither directly engage in the same work nor belong to the same practice community as the producers (authors). Once published, the content tends to be stable, and the few updates required usually originate with the authors. The consumer accepts the content as is, and active user feedback or modification is not anticipated (although it could be allowed). For example, the organization may produce a newsletter, or its human resources department may publish its policies or a directory of employee skills and experience.
At the other extreme, the producers and consumers are members of the same practice community or organizational unit. While still exhibiting a sequential flow, the repository provides a means to integrate and build on their collective knowledge. I label this an integrated knowledge base. A best-practices database is the most common example of this type of application. Practices are collected, integrated, and shared among people confronting similar problems.
Regarding the organizational roles for managing integrative applications, acquisition requires knowledge creators, finders, and collectors. Capturing orally conveyed knowledge requires interviewers and transcribers. Documenting observed experiences requires organizational “reporters.” Identifying and interpreting deeply held cultural and social knowledge may require corporate anthropologists. Refining requires analysts, interpreters, abstractors, classifiers, editors, and integrators. A librarian or “knowledge curator” must manage the repository. Others must take responsibility for access, distribution, and presentation. Finally, organizations may need people to train users to critically interpret, evaluate, and adapt knowledge to new contexts.
Interactive Applications
Interactive applications focus primarily on supporting interaction among those people with tacit knowledge. In contrast to integrative applications, the repository is a by-product of interaction and collaboration rather than the primary focus of the application. Its content is dynamic and emergent.
Interactive applications vary according to the expertise level of producers and consumers and the degree of structure imposed on their interaction. When formal training or knowledge transfer is the objective, the interaction tends to be primarily between instructor and student (or expert and novice) and structured around a discrete problem, assignment, or lesson plan.17 I refer to these applications as distributed learning.
In contrast, interaction among those performing common practices or tasks tends to be more ad hoc or emergent. I broadly refer to these applications as forums. They may take the form of a knowledge brokerage — an electronic discussion space where people may either search for knowledge (e.g., “Does anyone know. . . .”) or advertise their expertise. Highly interactive forums support ongoing, collaborative discussions among the producers and consumers as one group, continually responding to and building on each individual’s additions to the discussion. The flow continually loops back from presentation to acquisition. With the appropriate structuring and indexing of the content, a knowledge repository emerges. A standard categorization scheme indexes contributions so the firm can reapply that knowledge across the enterprise.
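As a sketch, a forum contribution can be represented as a threaded record indexed by topic, author, and date; the field names below are illustrative, and the traversal simply gathers every contribution under its topic so the emergent repository can be searched later.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Contribution:
    """One forum posting, threaded in conversational sequence."""
    author: str
    posted: date
    topic: str                   # drawn from the standard categorization scheme
    text: str
    replies: list["Contribution"] = field(default_factory=list)

def index_by_topic(threads: list[Contribution]) -> dict[str, list[Contribution]]:
    """Gather every contribution (including replies) under its topic so the
    emergent repository can be searched and reapplied across the enterprise."""
    index: dict[str, list[Contribution]] = {}
    stack = list(threads)
    while stack:
        contribution = stack.pop()
        index.setdefault(contribution.topic, []).append(contribution)
        stack.extend(contribution.replies)
    return index
```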
Interactive applications play a major role in supporting integrative applications.18 For example, a forum may be linked to an electronic-publishing application so that editors can discuss the quality of the contributions or provide a place for readers to react to and discuss the publication. Best-practice databases typically require some degree of forum interaction, so that those attempting to adopt a practice have an opportunity to discuss its reapplication with its creators. Regarding the organizational roles for managing interactive applications, acquisition requires recruiters and facilitators to encourage and manage participation in forums so that those with appropriate expertise contribute. The communicators often refine, structure, and index the content, using guidelines and categories built into the application and supplemented by a conference moderator. Assuring the quality of the knowledge may require quality-assurance personnel, such as subject-matter experts and reputation brokers. Usually a conference moderator manages a conference repository throughout its life cycle. Initially, others may need to work with users until they are comfortable with gaining access to and using the application.
Two Case Studies
I present two case studies of managing explicit knowledge. One is an example of an integrative architecture for the electronic publishing of knowledge gleaned by industry research analysts.19 The second illustrates the effective use of an interactive architecture for discussion forums to support servicing customers.20
Integrative Architecture
Technology Research Inc. (TRI)21 is a leading international provider of market information and industry analysis to information technology vendors and purchasers. TRI employs more than 300 analysts and annually publishes more than 15,000 research reports that address more than fifty distinct subject areas (called research programs). (See Figure 4 for TRI’s knowledge management architecture.)
The on-line knowledge repository comprises a standard set of knowledge units consisting of the executive summaries, abstracts, main text, graphics, tables, and charts from TRI research reports. The company updates its research reports continuously, so the repository is, in this sense, dynamic. Knowledge units are indexed and linked for flexible access, and users may sequentially navigate from one unit to the next within a report, access similar units across reports (e.g., executive summaries only), or access particular units directly. This standardization enables TRI to integrate analysts’ explicated knowledge across research programs for meta-analysis, creating new knowledge not possessed by any single analyst. As technology changes, new research areas emerge that cut across TRI’s traditional research programs and internal organizational boundaries. Building repositories based on a flexible yet standard structure enables TRI to respond by integrating those repositories into composite platforms to support virtual research programs. From its repositories, TRI derives standard monthly reports and more frequent ad hoc bulletins for each research program, distributed in several electronic formats (Web, CD, fax, e-mail).
TRI’s refinery encompasses two stages: analysis and publishing. Analysis involves collecting, evaluating, and interpreting market information, and reporting the results. The analysts’ tacit knowledge of their particular industry is applied to this information to produce an explicitly reported interpretation. The process is similar to investigative reporting, in that analysts try to get “the story behind the numbers.”
In the publishing stage, editors convert analysts’ reports to a standard format and decompose them into knowledge units, assigning standard document identifiers and keywords and creating links among knowledge units. While perhaps less efficient than having all analysts initially write to a standard format, TRI’s approach preserves the analysts’ autonomy and creative, entrepreneurial spirit. TRI manages this trade-off to foster a balance between the efficiency and speed of knowledge management and knowledge-worker morale, commitment, and performance quality. TRI distributes on-line documents primarily via Web-enabled Lotus Notes™.
TRI’s experiences illustrate that digitizing content alone is not adequate to exploit the opportunities for flexibility and innovation in the design and delivery of explicit knowledge. Digitized documents must be structured as knowledge units within a modular and flexible repository from which multiple knowledge views can be rapidly and efficiently created as new user needs arise in new contexts. In addition, a robust, seamless, and scalable technology infrastructure is key to enabling the flexibility required for an integrative knowledge management refinery: it supports a multitude of user-defined views of rich, multimedia documents, embeds hyperlinks, and provides an efficient yet flexible distribution channel.
Implementing this new architecture has been as much an organizational and social intervention as a technical one. TRI assigned and then trained people to perform new roles to shepherd the movement of knowledge from raw input to usable product; this human-resource investment was instrumental in the company’s success. At the same time, the existing roles and responsibilities of TRI analysts, editors, and IT professionals have changed. The move to institute process and content standards reduced analysts’ autonomy and discretion over writing format and style, placing many decisions in the hands of editors and production staff. Ultimately, success in electronic publishing was based at least as much on effectively managing organizational change as on implementing a sound product architecture and electronic-publishing technology.
Interactive Architecture
Buckman Laboratories (BL), a $300 million international specialty chemicals company with more than 1,200 employees (called associates) operating in more than 80 countries, is a recognized leader in knowledge management.22
The basis for competition in BL’s industry has changed from merely selling products to solving customers’ chemical-treatment problems. This requires not only knowledge of products and their underlying chemistry, but also knowledge of how to apply them in various contexts. While many BL associates have college degrees in chemistry and related fields, selling and applying BL products requires practical field experience in solving customer problems. This knowledge is tacit, residing primarily with the field associates scattered worldwide. Field-based knowledge is complex in that it has to account for, often subconsciously, many interacting variables and can be specific to a geographical region, a mill, or even a particular machine. It is dynamic, emergent, and continually evolving. BL management believes that in this type of competitive environment, strategic advantage results primarily from applying the most recent practical knowledge and experience of all associates to each customer problem.
To accomplish this, Bob Buckman, chairman of BL Holdings (the BL parent company), envisioned an online knowledge management capability that BL implemented as K’Netix®, The Buckman Knowledge Network. It was founded on several key principles:
- Direct exchange of knowledge among employees.
- Universal, unconstrained ability to contribute to and gain access to the firm’s knowledge without regard for time zone, physical location, language, or level of computer proficiency.
- Preservation of conversations, interactions, contributions, and exchanges.
- Easy accessibility — that is, searchable by all BL associates.
BL has placed much of its explicit knowledge about customers, products, and technologies into on-line electronic repositories comprising a set of integrative knowledge management applications. However, BL has progressed well beyond integrative knowledge management. Its on-line interactive Tech Forum supports the core of BL’s knowledge strategy (see Figure 5). Any associate can use Tech Forum to locate, capture, distribute, share, and integrate the practical, applied knowledge and experience of all other BL associates in support of the customer.23 The forum uses a standard structure; comments are “threaded” in conversational sequence and indexed by topic, author, and date. The content typically comprises questions, responses, and field observations.
Knowledge management roles at BL are well defined and specifically assigned, and they fall into two broad classes: those that facilitate the direct and emergent exchange of knowledge through the forum (the interactive aspect of the architecture) and those that support refining and archiving the record of those exchanges for future use (the integrative aspect). BL has successfully integrated the two in terms of organization structure and knowledge flow.
BL organizes several knowledge management roles under the Knowledge Transfer Department (KTD). Subject experts assigned throughout the company take the lead in guiding discussions about their area of expertise and provide a measure of quality assurance regarding the advice given by others. With the support of KTD personnel, they periodically review Tech Forum to identify useful threads for storage in an on-line repository. The threads are extracted, edited, summarized, and assigned keywords. Thus, valuable emergent content is collected and integrated so that it is widely accessible, easily distributed, and profitably reused. KTD personnel continually monitor Tech Forum, encourage participation, and provide end-user support and training. The most technically qualified person at each operating company worldwide is available to offer advice via Tech Forum. Product development managers use the forum to offer on-line technical advice to field personnel and to stay current with applications issues arising in the field. Research librarians assigned to particular industries search for publicly available information about their industries. An information technology group maintains the technical infrastructure.
Customers stated that BL’s ability to leverage its collective knowledge via Tech Forum was instrumental in making a sale to them. However, the technology is not proprietary or leading edge; the process is not complex. The true source of BL’s advantage is not in the technology or the process, which are easily imitated, but in the culture and structure of the organization. The organization’s willingness to create, share, and reapply knowledge provides the context for successfully executing BL’s knowledge strategy and architecture.
Another reason for the forum’s success is that it has become part of the ongoing habits and practices of the organization.24 Everyone expects his or her coworkers to read the forum regularly; to post problems, replies, and observations there; and to contribute whenever possible. Consistent, collective compliance creates and continually reinforces the perception of the forum as a reliable and efficient means for sharing knowledge and solving problems. Its use, supported by active management of the architecture, has become self-sustaining. BL management understands that the confluence of culture, roles, norms, habits, and practices leading to this success is difficult to imitate and, therefore, together with associates’ knowledge and the technology infrastructure, provides a true competitive advantage.
Context of Knowledge Management
I have described explicit knowledge, proposed an architectural framework for its management, and presented two examples of its application. This framework offers a coherent approach for beginning to design a capability for managing explicit knowledge. Next, I discuss several key issues concerning the broader organizational context for knowledge management, the design and management of knowledge-processing applications, and the benefits that must accrue for these efforts to succeed.
Knowledge architectures exist within four primary contexts that influence how knowledge management affects an organization’s performance.
Strategic context addresses an organization’s intent and ability to exploit its knowledge and learning capabilities better than the competition.25 It includes the extent to which the members of an organization believe that superior knowledge is a competitive advantage and how they explicitly link strategy, knowledge, and performance. The successful firms I have studied are able to articulate the link between the strategy of their organization and what members at all levels of that organization need to know, share, and learn to execute that strategy. This articulation guides how they deploy organizational and technological resources and capabilities for explicating and leveraging knowledge, which increases the probability of their adding value.
Knowledge context addresses the competitiveness of an organization’s knowledge. Existing knowledge can be compared to what an organization must know to execute its strategy. Where there are current or future gaps, knowledge management efforts should be directed toward closing them, assuring a strategic focus. An organization also must assess the quality and strategic value of its knowledge relative to the competition. To the extent that the bulk of a firm’s knowledge is common and basic, that knowledge will provide less competitive advantage than if the firm’s knowledge is unique and innovative. Explicating and leveraging that innovative knowledge can provide the greatest competitive benefit.
Organizational context reflects the organizational roles and structure — formal and informal — as well as the sociocultural factors affecting knowledge management, such as culture, power relations, norms, reward systems, and management philosophy. Beyond the knowledge management roles proposed earlier, effective knowledge creation, sharing, and leveraging requires an organizational climate and reward system that values and encourages cooperation, trust, learning, and innovation and provides incentives for engaging in those knowledge-based roles, activities, and processes.26 I have consistently observed this aspect to be a major obstacle to effective knowledge management.
Technology context addresses the existing information technology infrastructure and capabilities supporting the knowledge management architecture. One adage states that knowledge management is 10 percent technology and 90 percent people. However, without the ability to seamlessly collect, index, store, and distribute explicit knowledge electronically whenever and wherever needed, an organization will not fully exploit its capabilities and incentives. As the BL and TRI examples illustrate, the technology need not be complex or leading edge to provide significant benefit. Its absence, however, would seriously impinge on the efforts of these companies to effectively manage their knowledge assets.
New Organizational Roles
The successful firms that I observed have explicitly defined and rewarded roles that facilitate knowledge capture, refinement, retrieval, interpretation, and use. Perhaps the most important role is that of the subject-matter expert, who functions as an editor assuring the quality of content and as a repository manager assuring the quality of context through thoughtful abstracting and indexing. In converting to on-line knowledge management, TRI found the need for a much greater investment in editors to perform these roles. BL showed its commitment by assigning some of its most knowledgeable people to these roles.
Managing Knowledge-Processing Applications
Knowledge management applications form a continuum from low to high interaction complexity. Forums are the most interactive and complex application because they tend to span the entire tacit/explicit knowledge-processing cycle. Establishing a well-defined social community and shared context to support the use of the technology plays a key role in an application’s success. Electronic publishing, in contrast, is perhaps the most straightforward. It is one-way distribution of explicit knowledge to a user community that may be loosely affiliated and related only by its need for access to the same knowledge repository, but not necessarily supported by a social community. The greater the interaction complexity, the more that challenges become social, cognitive, and behavioral in nature rather than technical and, thus, require well-managed organizational change programs.
Knowledge repositories have a life cycle that firms must manage. Once created, repositories tend to grow, reaching a point at which they begin to collapse under their own weight, requiring major reorganization.27 Their rejuvenation requires deleting obsolete content, archiving less active but potentially useful content, and reorganizing what remains. Content or topic areas may become fragmented or redundant. Reorganizing requires eliminating those redundancies, combining similar contributions, generalizing content for easier reapplication, and restructuring categories as needed. Successful knowledge management organizations proactively manage and reorganize their repositories as an ongoing activity rather than waiting for decline to set in before acting.
Complex knowledge management problems typically require multiple repositories segmented by degree of interactivity, volatility of content, or the structure of the knowledge itself. Each repository may have a different set of processes and roles by which its content is created, refined, and stored. Long-lived, archival knowledge may have a more formal review and approval process, whereas best practices may undergo expedited editing, and discussion databases for rapid exchange may have no review process other than after-the-fact monitoring by a forum moderator. Furthermore, the use of knowledge repositories typically causes knowledge creation and knowledge application to become separated in time and space. Therefore, firms must continually evaluate the knowledge to ensure that it applies to current context and circumstances. Firms may need to segment their repositories and their underlying management processes on the basis of the volatility of their context as well as content. For example, the storage structures and processes for managing product knowledge in rapidly changing markets may differ significantly from managing that knowledge in stable markets. Segmenting these repositories and identifying any significant differences in their refinery processes are crucial for successful application, as is their integration to provide seamless access to their knowledge.
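Such segmentation can be pictured as a small configuration that assigns each repository its own volatility profile and review process. The repository names and rules below are hypothetical and are meant only to illustrate how refinery processes might differ by segment.

```python
# Hypothetical segmentation: each repository carries its own refining and review rules.
REPOSITORY_SEGMENTS = {
    "archival-reference": {
        "content_volatility": "low",
        "review_process": "formal review and approval before storage",
        "retention": "long-lived",
    },
    "best-practices": {
        "content_volatility": "medium",
        "review_process": "expedited editing by a subject-matter expert",
        "retention": "reviewed periodically for continued relevance",
    },
    "discussion-forum": {
        "content_volatility": "high",
        "review_process": "after-the-fact monitoring by the forum moderator",
        "retention": "periodically distilled into the best-practices repository",
    },
}
```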
For knowledge repositories to be meaningful, their structure must reflect the structure of shared mental models or contextual knowledge tacitly held by the organization. In most organizations, those structures are neither well defined nor widely shared. Yet their explication is essential for effectively managing explicitly encoded organizational knowledge. This requires that a firm define what a knowledge unit means and how to meaningfully index and categorize a collection of knowledge units for ease of access, retrieval, exchange, and integration. Creating “semantic consensus” even within common practice communities is often a difficult task, let alone across an entire organization. TRI found developing standards to be a particularly difficult challenge, yet one that had to be addressed for the publishing process to function. For example, when TRI first migrated to online publishing, it had no standard spellings for vendor names, technology keywords, or even research programs — all essential for effective repository management. TRI even struggled to create a standard and consistent definition of a knowledge unit. BL had more flexibility within its forums, yet also found that developing a meaningful indexing scheme for its file library was critical for its use. These experiences are not unusual. Different lexicons naturally emerge from different parts of an organization. In many ways, standards are not compatible with the culture of many organizations. However, the ability to integrate and share knowledge depends on some broadly meaningful scheme for its structure.
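The lexical standardization TRI wrestled with amounts to mapping free-text variants onto a controlled vocabulary of canonical index terms. The vocabulary entries below are invented for illustration; a real scheme would have to be negotiated across the organization's knowledge communities.

```python
# Hypothetical controlled vocabulary mapping free-text variants to canonical index terms.
CANONICAL_TERMS = {
    "ibm": "IBM",
    "i.b.m.": "IBM",
    "international business machines": "IBM",
    "groupware": "Groupware",
    "group ware": "Groupware",
}

def standardize(term: str) -> str:
    """Return the canonical form of a vendor name or keyword, or the original if unknown."""
    return CANONICAL_TERMS.get(term.strip().lower(), term.strip())

# e.g., standardize("International Business Machines") -> "IBM"
```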
Integration of knowledge across different contexts opens an organization to new insights. A practice community’s exposure to how its knowledge can be applied in other contexts increases the scope and value of that knowledge. Often the variety of experiences within a local community of practice is not expansive enough to fully understand some phenomena. By being able to combine experiences across communities, the scope of experience is broadened, as is the ability to learn from those experiences. For example, I worked with a leading imaging firm that created a standard way to capture and share sales techniques among its market segments. By sharing knowledge of how customers in different market segments used a particular product, salespeople in each territory were exposed to patterns, insights, and selling opportunities they might not have perceived on their own.
Benefits Depend on Application
The nature of the benefits gained from managing explicit knowledge depends on the type of application. Electronic publishing and other low-interactivity, high-structure applications tend to provide significant cost savings or increased efficiency. Publishing electronically is much less expensive than distributing on paper. In the case of distributed learning, electronically distributing prepackaged knowledge (e.g., electronic textbooks and course notes) can save significant travel expenses. In contrast, the more interactive or emergent-content applications tend to provide support for solving problems, innovating, and leveraging opportunities. The greatest impact, however, comes from combining the two.
For example, BL is adding a distance-learning capability to its other applications, rounding out its portfolio. The company is poised to reap the greatest benefit by integrating the capabilities of all its applications. BL will be able to archive its emergent knowledge (developed through the Tech Forum), make it available for searching by associates in the field, and also edit and repackage the knowledge as training materials by means of the distance-learning application. Thus, training will have more of a “real world” feel and focus. Students will be able to review actual problems and, after deliberating independently, find real-life solutions. Formal training will take place in the field, giving students the ability to directly apply or integrate the training materials with their own day-to-day problems. In this way, those materials become more relevant and interwoven into the student’s tacit experience and the learning more meaningful and lasting. By integrating the interactive, emergent forums with the structured content and distribution of formal training, a firm encourages a continual cycle of knowledge creation and application. Tacit knowledge is made explicit via forums, formally transferred via distance learning, and tacitly reapplied in context. New tacit knowledge becomes available for sharing with others via the same cycle. Each turn of the cycle increases the knowledge of the organization,28 providing potentially greater competitive advantage.
In summary, organizations that are managing knowledge effectively:
- Understand their strategic knowledge requirements.
- Devise a knowledge strategy appropriate to the firm’s business strategy.
- Implement an organizational and technical architecture appropriate to the organization’s knowledge-processing needs.
These factors enable the organization to apply maximum effort and commitment to creating, explicating, sharing, applying, and improving its knowledge.
Some view knowledge management as merely the current business fad. Yet knowledge is the essence of humans as individuals and collectivities. Respecting and institutionalizing the role of knowledge and learning may be the most effective approach to building a solid and enduring competitive foundation for business organizations. Firms can derive significant benefits from consciously, proactively, and aggressively managing their explicit and explicable knowledge. Doing this in a coherent manner requires aligning a firm’s organizational and technical resources and capabilities with its knowledge strategy. This requires mapping the firm’s organizational and technical capabilities and constraints to its knowledge-processing requirements. It may require significant organizational and technical interventions. The knowledge management architecture provides a framework for guiding this effort.
References
1. For example, see:
J.S. Brown and P. Duguid, “Organizational Learning and Communities-of-Practice: Toward a Unified View of Working, Learning and Innovation,” Organization Science, volume 2, February 1991, pp. 40–57;
T. Davenport, S. Jarvenpaa, and M. Beers, “Improving Knowledge Work Processes,” Sloan Management Review, volume 37, Summer 1996, pp. 53–66;
P.F. Drucker, “The New Productivity Challenge,” Harvard Business Review, volume 69, November–December 1991, pp. 69–76;
B. Kogut and U. Zander, “Knowledge of the Firm, Combinative Capabilities, and the Replication of Technology,” Organization Science, volume 3, August 1992, pp. 383–397;
I. Nonaka, “A Dynamic Theory of Organizational Knowledge Creation,” Organization Science, volume 5, February 1994, pp. 14–37;
J.B. Quinn, P. Anderson, and S. Finkelstein, “Managing Professional Intellect: Making the Most of the Best,” Harvard Business Review, volume 74, March 1996, pp. 71–82; and
S.G. Winter, “Knowledge and Competence as Strategic Assets,” in D.J. Teece, ed., The Competitive Challenge: Strategies for Industrial Innovation and Renewal (Cambridge, Massachusetts: Ballinger, 1987), pp. 159–184.
2. R.J. Heibeler, “Benchmarking Knowledge Management,” Strategy & Leadership, volume 24, March–April 1996, pp. 22–29; and
L.W. Payne, “Unlocking an Organization’s Ultimate Potential Through Knowledge Management,” Knowledge Management in Practice (American Productivity & Quality Center), volume 1, April–May 1996.
3. M.H. Meyer and M.H. Zack, “The Design of Information Products,” Sloan Management Review, volume 37, Spring 1996, pp. 43–59;
M.H. Zack, “Electronic Publishing: A Product Architecture Perspective,” Information & Management, volume 31, 1996, pp. 75–86; and
M.H. Zack and M.H. Meyer, “Product Architecture and Strategic Positioning in Information Products Firms,” in M.K. Ahuja, D.F. Galletta, and H.J. Watson, eds., Proceedings of the First Americas Conference on Information Systems (Pittsburgh: Association for Information Systems, August 1995), pp. 199–201.
4. D.G. Bobrow and A. Collins, eds., Representation and Understanding: Studies in Cognitive Science (New York: Academic Press, 1975);
J.S. Bruner, Beyond the Information Given, J.M. Anglin, ed. (New York: Norton, 1973);
C.W. Churchman, The Design of Inquiring Systems: Basic Concepts of Systems and Organization (New York: Basic Books, 1971);
F.I. Dretske, Knowledge and the Flow of Information (Cambridge, Massachusetts: MIT Press, 1981);
F. Machlup, Knowledge: Its Creation, Distribution and Economic Significance. Volume 1: Knowledge and Knowledge Production (Princeton, New Jersey: Princeton University Press, 1980); and
D.M. MacKay, Information, Mechanism and Meaning (Cambridge, Massachusetts: MIT Press, 1969).
5. F. Blackler, “Knowledge, Knowledge Work and Organizations: An Overview and Interpretation,” Organization Studies, volume 16, number 6, 1995, pp. 1021–1046;
Kogut and Zander (1992); Dretske (1981); and J. Lave, Cognition in Practice (Cambridge, England: Cambridge University Press, 1988).
6. Brown and Duguid (1991);
J. Lave and E. Wenger, Situated Learning: Legitimate Peripheral Participation (Cambridge, England: Cambridge University Press, 1991);
Nonaka (1994); M. Polanyi, The Tacit Dimension (Garden City, New York: Doubleday, 1966); and
P. Romer, “Beyond the Knowledge Worker,” World Link, January–February 1995, pp. 56–60.
7. J.R. Anderson, Cognitive Psychology and Its Implications (New York: Freeman, 1985); and
R.C. Schank, “The Structure of Episodes in Memory,” in D.G. Bobrow and A. Collins, eds., Representation and Understanding: Studies in Cognitive Science (New York: Academic Press, 1975), pp. 237–272.
8. H. Demsetz, “The Theory of the Firm Revisited,” Journal of Law, Economics and Organization, volume 4, Spring 1988, pp. 141–161; and
R.M. Grant, “Toward a Knowledge-Based Theory of the Firm,” Strategic Management Journal, volume 17, Winter 1996, pp. 109–122.
9. See, for example, Zack (1996).
10. This line of reasoning is addressed in: Demsetz (1988); R.M. Grant, “Prospering in Dynamically Competitive Environments: Organizational Capability as Knowledge Integration,” Organization Science, volume 7, number 4, July 1996, pp. 375–387;
Kogut and Zander (1992); and E.T. Penrose, The Theory of the Growth of the Firm (New York: Wiley, 1959).
11. C. Argyris and D.A. Schon, Organizational Learning: A Theory of Action Perspective (Reading, Massachusetts: Addison-Wesley, 1978);
T.H. Davenport, R.G. Eccles, and L. Prusak, “Information Politics,” Sloan Management Review, volume 34, Fall 1992, pp. 53–65;
E.H. Schein, Organizational Culture and Leadership (San Francisco: Jossey-Bass, 1992);
C.J.G. Gersick, “Habitual Routines in Task-Performing Groups,” Organizational Behavior and Human Decision Processes, volume 47, October 1990, pp. 65–97; and
R. Nelson and S. Winter, An Evolutionary Theory of Economic Change (Cambridge, Massachusetts: Belknap, 1982).
12. R.E. Bohn, An Informal Note on Knowledge and How to Manage It (Boston: Harvard Business School, 1986); and
J. Schember, “Mrs. Fields’ Secret Weapon,” Personnel Journal, volume 70, September 1991, pp. 56–58.
13. For an extended discussion of information product architectures, see:
Meyer and Zack (1996); and, for an extended discussion of the refinery aspect, see:
M.H. Zack, “An Information Infrastructure Model for Systems Planning,” Journal of Systems Management, volume 43, August 1992, pp. 16–19 and 38–40.
14. MacKay (1969).
15. Meyer and Zack (1996).
16. M.H. Zack, “Electronic Messaging and Communication Effectiveness in an Ongoing Work Group,” Information & Management, volume 26, April 1994, pp. 231–241.
17. Although distributed learning applications are typically supplemented with electronically published course materials and assignments (an integrative application), distributed learning refers primarily to the student/instructor interaction (an interactive application).
18. While these approaches are conceptually distinct, they could be implemented within the same software platform, and, in fact, common technology will enable smoother integration.
19. I obtained this information during twelve hours of interviews with the senior vice president responsible for information and consulting services, the director of information systems strategy responsible for the electronic-publishing project, the lead application architect, and a senior analyst/consultant to the project. I also reviewed archival documentation that included design documents, a discussion database used to support the project team, and related e-mail messages.
20. I obtained this information during approximately 100 hours of interviews and focus-group sessions with senior executives and managers of various departments at Buckman Labs.
21. This company name is a pseudonym.
22. Buckman Labs has won several awards for its knowledge management infrastructure, including the 1996 Arthur Andersen Enterprise Award for Sharing Knowledge and the 1997 ComputerWorld/Smithsonian Award, Manufacturing Section.
23. Buckman Labs produces a version of the Tech Forum for Latin America called Foro Latino and is translating its forums, Web pages, and other knowledge repositories into several languages.
24. Zack (1994).
25. M.H. Zack, “Developing a Knowledge Strategy,” California Management Review, volume 41, Spring 1999, pp. 127–145.
26. Nonaka (1994); and
M.H. Zack and J.L. McKenney, “Social Context and Interaction In Ongoing Computer-Supported Management Groups,” Organization Science, volume 6, July–August 1995, pp. 394–422.
27. C.C. Marshall, F.M. Shipman III, R.J. McCall, “Making Large-Scale Information Resources Serve Communities of Practice,” Journal of Management Information Systems, volume 11, Spring 1995, pp. 65–86.
28. Nonaka (1994).