February 14th, 2009 · Comments Off on The Problem Solving Pattern Matters: Part Six, Enhancing Developing Solutions: Coming Up With New Ideas

(Co-Authored with Steven A. Cavaleri)
Enhancing problem seeking, recognition, and formulation alone will move an organization some way toward the Open PSP. But enhancing the development of new solutions is equally necessary to get there. There are two important stages in developing new solutions: 1) coming up with new ideas; and 2) evaluating them before communicating them to others as surviving knowledge and implementing them.
Coming up with New Ideas
There is no methodology that can guarantee producing both new and “good” ideas (i.e., ones that merit serious evaluation against their competitors), but there are things we can do to make their production more likely. Let’s consider the resources available when we need to come up with new ideas. First, there’s the world outside the enterprise and its various sources of information. We can acquire information helpful to us using a variety of methods that everyone is familiar with, including both interpersonal and technological methods, among the most recent being Internet searching and browsing, and “Web 2.0” tools. The extent and quality of our access to external interpersonal networks and cultural knowledge stores influence our ability to come up with new ideas.
Second, there’s also “our” previous knowledge, including
— our mental models, and
— the cultural knowledge we find in enterprise media, databases, and content stores.
All new knowledge we create, or discover, somehow leverages previous mental knowledge formed, in part, through interacting with cultural knowledge. However original our thinking, we always have some presuppositions. We always leverage something old in making something new. Our ability to create new ideas is based in part on the extent and quality of our access to cultural knowledge, and also on the extent and quality of our ability to remember previous mental models. In addition, it is based on our ability to participate in social networks, communities, and collectives that allow us to participate in and contribute to the social aspects of the process of coming up with new ideas. The first of these is based, in turn, on the degree of transparency in our organizations. The second, the ability to remember, is, in part, biological; but it is also, in part, a skill that can be learned and taught. The third is based, in great part, on the degree of inclusiveness characterizing the networks, communities, and other collectives we wish to participate in.
Third, we also use inputs from other individuals, groups, communities, teams, and other organizational structures generating knowledge claims. How good we are at coming up with new ideas depends, in part, on the social networks in which we’re embedded and on the information that we can retrieve from our networks, or that is broadcast to us, or shared with us by our families, friends, communities, teams, and organizations. But our ability to participate in networks, and in collective problem solving efforts is based, once again, on the extent of inclusiveness in our organizations and also on the extent of interpersonal trust existing in these networks.
And fourth, we have our own mind and its ability to create ideas and models. This ability and our acts of creation are influenced by our brain functioning and also by our social, cultural, and ecological environments. We can do things to increase our ability to create new ideas. But, our ideas are not reducible to some combination of the foregoing factors. There is a creative element, something “emergent”, which we, as living systems, produce, which is beyond our ability to predict and control.
Looking at these four categories, we can see that discussing all the important measures we might implement in each of them to enhance the process of coming up with new ideas would require a lengthy work. Here, however, is a list emphasizing a few very important ways of enhancing this process. These include:
— implementing policy allowing knowledge workers to access Web 2.0, 3.0, and future generations of web technology for interactions with others outside the firewall;
— introducing a comprehensive organizational support system for openness to new ideas;
— constructing “knowledge bases” that actually distinguish knowledge from information; and
— implementing training in using social technologies, and implementing IT applications that enhance the capabilities of individuals to create new ideas.
Here are some comments expanding on this list.
First, organizations should implement a policy of continuing to acquire both advancing search technology and new applications in 2.0 and 3.0 technology. There’s a debate right now about whether Enterprise 2.0 technology is really “productive,” or wastes the time of knowledge workers. For us, however, bringing social computing and social software into the enterprise is purely a “no-brainer.” It’s not a question of direct productivity. That aspect of things has to do with the Operational Pattern (OP). But Enterprise 2.0 and the coming later generations of social software are about “exception handling” and problem solving. That is, they enhance virtual connectivity in organizations and carry with them the potential both to draw on the learning of others and to enhance one’s own learning, provided that the patterns of connectivity are open. Social software is about the PSP more than it is about the OP, and any organization that doesn’t commit to these technological advances will soon find the effectiveness of its PSP lagging behind those of organizations that have introduced these newer applications.
To Be Continued
Tags: KM Software Tools · Knowledge Making · Knowledge Management
February 13th, 2009 · 5 Comments

This post continues the discussion begun in Part Three of this series with an analysis of problems with the underlying conceptual model of the presentation on the Federal KM Initiative. Here’s a graphic of that model as I’ve reconstructed it from the text of the presentation.

Reconstruction of Neil Olonoff’s Conceptual Model
Let’s work backwards from the benefits. These are enhanced by the “knowledge” resulting from increased sharing. But again, what is this “knowledge?” Is it “beliefs” in the sense of psychological orientations to situations that have been shared? Is it “attitudes” that have been shared? Is it the content we find in documents in both electronic and other formats? Is it all three? Further, will any kinds of beliefs, attitudes, or document content enhance these benefits, or only beliefs, attitudes, or content of a particular sort? If the latter, then what sort of beliefs, attitudes, or content are we talking about? If one says beliefs, attitudes, or content constituting knowledge, then the question arises: what differentiates beliefs, attitudes, or content constituting knowledge from other beliefs, attitudes, or content? And even if we could answer these questions, there’s the further issue of whether all shared “knowledge” will enhance the four benefit outcomes. And if not, how do we distinguish “good knowledge” that will enhance the four benefits from “bad knowledge” that detracts from them?
Some might say that all these questions are beside the point since the claim being made is that “shared knowledge” must enhance these benefits, because if we really have “knowledge,” that knowledge can’t be wrong, so anything based on it, like the four benefits, has to be enhanced since it is the result of applying knowledge that is “true.” Well, if one defines knowledge as beliefs, attitudes, or perhaps content that is “true,” that argument will hold. But then there’s another problem, and that is that “knowledge sharing” occurs by means of sharing experiences and sharing linguistic and other cultural content. Unfortunately, both experience and cultural content embed much more than “knowledge,” especially if we define it as some form of “true” content. Some of what we get from engaging with experience and content may be “knowledge,” in this sense of the term, but much more of what we label “knowledge” may be beliefs, attitudes, and content that misleads us about both ourselves and the external world. So, how will we know whether what has been shared is “knowledge” or error? And if it is error, then why should we expect it to enhance the four benefits as required by the model?
Let’s look at the next link in the model sequence, connecting enhanced knowledge sharing and “knowledge.” This is another link that seems transparent on the surface, but that is problematic. The reason why it seems transparent is that one interpretation of it sounds like a tautology. “Of course, if we enhance knowledge sharing, the outcome will be a greater amount of shared knowledge. Isn’t that a consequence of the definition of knowledge sharing?” Right. But tautologies don’t provide useful knowledge about the world. And those who take refuge in them to avoid being wrong can learn nothing, and can’t grow their knowledge. So, let’s not look at this link as a tautology. Let’s look at it instead as a link between efforts to increase knowledge sharing and the actual results of those efforts, whether or not those results increase shared knowledge.
What do efforts at knowledge sharing actually do? I think the answer is that, if successful, they get people to increase their frequency of expression of knowledge claims to each other in electronic or other media formats, and also in interpersonal social contexts. It’s important to realize, however, that from the viewpoint of those on the receiving end to whom these knowledge claims are new, the knowledge claims themselves are just information, not knowledge. They may or may not be knowledge to those sending them. Some of them may even be claims that have been accepted by the organization as “knowledge.” But from the standpoint of those receiving them they are just information until they’re accepted by the recipient.
The significance of this is that both “information” and “knowledge” are the outcome of “knowledge sharing” efforts. Still further, insofar as people receiving knowledge claims don’t find them acceptable, “knowledge sharing” may give rise to “problems,” rather than to enhanced knowledge. If this is right, then the question arises: how can we distinguish among the information, knowledge, and problem outcomes of knowledge sharing activities, and furthermore, do all of these outcomes add to the four benefits discussed above? Again, what if the outcomes we label “knowledge” are in error? What are the arguments suggesting that the information, problem, and knowledge outcomes of knowledge sharing are likely, in fact, to produce the four benefits named by Neil? Whatever these arguments are, they are not covered in the presentation, and they are very important, because it is likely that much of the content we label “knowledge” is in error, and that it is therefore false information.
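The distinction drawn above, that a knowledge claim is merely information to its recipients until they accept it, can be made concrete with a small data-model sketch. This is purely illustrative: the class and method names below are hypothetical inventions for this post, not anything proposed in Neil’s presentation or in any existing KM tool.

```python
from dataclasses import dataclass, field

@dataclass
class KnowledgeClaim:
    """A claim expressed by a sender; 'knowledge' only per recipient, on acceptance."""
    content: str
    sender: str
    accepted_by: set = field(default_factory=set)  # recipients who have accepted it

    def accept(self, recipient: str) -> None:
        # Acceptance is the act that turns information into knowledge
        # for this particular recipient.
        self.accepted_by.add(recipient)

    def status_for(self, recipient: str) -> str:
        # The same claim can be knowledge for one recipient and
        # information for another at the same time.
        return "knowledge" if recipient in self.accepted_by else "information"

claim = KnowledgeClaim(content="Process X reduces cycle time", sender="alice")
print(claim.status_for("bob"))   # → information (broadcast, not yet accepted)
claim.accept("bob")
print(claim.status_for("bob"))   # → knowledge (accepted by the recipient)
```

Tracking acceptance per recipient, rather than treating transmission itself as "sharing knowledge," is what would let a system distinguish the information, knowledge, and problem outcomes of sharing activities discussed here.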
Moving to the first link in the model sequence, between the achieved goals and higher quality “knowledge sharing” activities: Neil’s presentation specifies 9 goals that, if achieved, will enhance knowledge sharing activities. Let’s look at each one and ask whether accomplishing it, alone or in combination with the others, is likely to enhance knowledge sharing.
The first two goals, establishing a Federal KM Center and creating a Federal CKO position, do not, by themselves, enhance knowledge sharing activities. Rather, we have to assume that these entities will act in ways that are effective for doing so. Neil mentions that the Center and the CKO would provide a resource for Federal KM, consultation on KM, and coordination with colleagues outside of the Center, and would demonstrate the benefits of sharing and collaborating across agencies. These activities might enhance “knowledge sharing,” but, as the saying goes, the devil is in the missing details. The third goal, enacting policies, standards, and practices for KM Governance, won’t enhance knowledge sharing directly, but will enhance the efficiency of KM activities. Goal 4, establishing a web presence and KM awareness, again will not directly enhance knowledge sharing, but will, at most, make the Federal KM Center and the CKO a hub for supporting efforts to enhance knowledge sharing. Goal 5, training Federal workers in KM skills, doesn’t directly enhance knowledge sharing, but it does help to create more Knowledge Managers in the Federal system. However, slide 21 makes clear that Neil isn’t really referring to KM skills, but rather to “knowledge work competencies,” which is not the same thing. What’s meant by “knowledge work competencies” is not made clear in the presentation. But Neil says that these bring a greater understanding and appreciation of the value of “knowledge sharing.” That may be, but isn’t “knowledge sharing” itself one of the knowledge work competencies covered in the training? And, if so, why not just say so?
Goal 6: build a knowledge sharing culture would, on the surface, clearly enhance “knowledge sharing.” But what does this idea really mean? That is, how do I know that I’ve achieved a knowledge sharing culture? If the only way I know that is that knowledge sharing activity is actually enhanced, then the idea that there is a causal relationship between these two things has been replaced by the idea that a knowledge sharing culture involves knowledge sharing behavior by definition, and therefore Neil is not talking about an intermediate independent goal at all, but rather the more fundamental requirement of enhancing knowledge sharing itself. On the other hand, if by a “knowledge sharing culture” Neil means something other than knowledge sharing behavior, then that is not made clear in the presentation, and we cannot evaluate whether or not the causal connection between the two things is plausible.
In slide 22, Neil lays out five measures for building a knowledge sharing culture: leadership support, explicit knowledge sharing and collaboration policies, proactive support and consultation, KM competency training, and building awareness of KM resources. I don’t know that the last two will directly build a knowledge sharing culture. The first three might, depending on the details. But again we are faced with a certain vagueness in the presentation that makes it very difficult to evaluate the connections between these measures and the outcome, a knowledge sharing culture. Things would certainly be clearer if we knew what kinds of leadership support, explicit knowledge sharing and collaboration policies, and proactive support and consultation are involved. But this is another important gap in the underlying argument of the presentation.
Goal 7: age wave retirement and recruiting programs may or may not enhance knowledge sharing activity. Their purpose is to facilitate enhanced knowledge sharing between those at the end of their careers and others. But they are not so much an answer to the problem of knowledge sharing as they are programs that must use KM intervention procedures, techniques, and tools to facilitate knowledge sharing. The important question here is what sorts of procedures, techniques, and tools would be effective, in the context of age wave retirement programs, for enhancing knowledge sharing in them. The presentation is silent about this. Of course, any such techniques would have to take into account the problem of distinguishing knowledge from information I discussed earlier.
Goal 8: establish Web 2.0 and social computing policies under the auspices of the CKO. I think this goal is both too limited and too expansive. First, it’s too limited in that it should make reference to all IT tools, since tools other than Web 2.0 tools may be very important for knowledge processing and knowledge sharing. In particular, the imminent appearance of Web 3.0 tools, and the coming of Web 4.0 tools in the next 5 years or sooner, should also be mentioned if any software generations are mentioned at all. I think that if software is to be mentioned as a factor making enhanced knowledge sharing more likely, then this should only be done with the proper qualifications about the background social conditions required for successful knowledge sharing.
Web 2.0 tools can facilitate information sharing among those who have a disposition to use them. But that willingness is key, and must be facilitated by something other than the tools themselves. Also, even where the willingness is there, enhancing information sharing is not the same as enhancing knowledge sharing. There’s nothing in either previous generation IT tools, or in Web 2.0, that inherently supports making distinctions between information and knowledge. You won’t be able to tell whether the use of Web 2.0 tools really does enhance knowledge sharing without some value-added to the tools specifically designed to make the distinction between information and knowledge. I’ve recently finished an extensive blog series on KM 2.0 and related issues for anyone interested in the relationships among Web 2.0, Enterprise 2.0, KM 2.0, social computing, social media, and KM. Links to the various blogs are here.
Second, I think this goal is too expansive because I don’t think the CKO will ever be able to have the primary responsibility for establishing social computing policies. Too many other interests apart from the KM interest are involved, including, of course, IT interests important to the CIO, and also the decentralized interests of management in each agency. The CKO may be able to recommend those social computing policies that support and enable knowledge sharing, but it will never be able to command those policies except perhaps within the Federal KM Center itself, for the simple reason that social computing is not KM’s property. It belongs to everyone.
Goal 9: recruiting Generation Y employees is based on the idea that Generation Y individuals have a special capability to use Web 2.0 and social computing that individuals of other generations don’t have, and also on the accompanying idea that Web 2.0 and social computing capability is a special requirement for enhancing knowledge sharing. Even if this were true, I really wonder about this goal because it seems, on the surface, to discriminate in hiring based on age cohort, an illegal hiring practice. Assuming that Web 2.0 and social computing capabilities are needed in the Federal Government, hiring campaigns can require those capabilities of new recruits. What they cannot do is require birth in the Generation Y cohort as a qualification for a Federal Civil Service or contract position.
In sum, I think there are many difficulties of vagueness and ambiguity in the model underlying the presentation’s argument for a Federal KM initiative enhancing knowledge sharing. The argument is ill-defined or unpersuasive at each of the critical links between the major nodes of the model. The result is that the argument taken as a whole is unpersuasive. It’s not clear that the action plan will produce enhanced knowledge sharing, or that enhanced knowledge sharing will produce enhanced shared knowledge, or that enhanced shared knowledge will produce the four benefits mentioned in the presentation. In my view, this argument needs a very strong effort at recasting and tightening up. As it is now, it will only be persuasive in “preaching to the choir.”
In this post, and in Part Three of this series, I’ve attempted a comprehensive critique of Neil Olonoff’s presentation calling for a new Federal KM Initiative. In a previous comment, Neil had indicated that his presentation is a very preliminary “straw man” rather than anything close to a final proposal for this initiative. So much the better, then, to critique it at this early stage and to join with the 40 active participants who are offering suggestions to improve the proposal. The problems I’ve found with the presentation include: a) using too narrow a conception of KM and therefore understating its importance for Federal activities; b) failing to commit to an account of “knowledge,” thus creating a lack of clarity in one of the primary arguments in the presentation; c) proposing that the Federal CKO be located in the Federal CTO’s office; d) failing to make a more positive case for KM; e) a variety of loose formulations raising conceptual questions and lowering the credibility of the argument; and f) problems with the model at the core of the presentation.
In Neil’s comment on my first critical blog, he’s indicated that the proposal to locate the Federal CKO under the CTO has been withdrawn, since it was based on an error in the Administration’s documents. The other areas of criticism remain, however, and, I think, raise serious questions about the viability of the Federal KM Working Group proposal. I hope the FKMWG will try to meet the criticisms I’ve raised and even seriously consider my own proposal for a Federal KM Center, which uses a much broader conception of KM, commits to a specific view of knowledge, locates the CKO within a Knowledge Accountability Office responsible to the Congress, and is, I think, conceptually tighter than the present FKMWG alternatives, since it’s grounded, in part, in complexity theory. In future blogs I’ll be talking more about my own proposal.
Tags: Complexity · KM Software Tools · KM Techniques · Knowledge Integration · Knowledge Making · Knowledge Management
February 12th, 2009 · Comments Off on Keep the Faith, Not the Filibuster

In a recent blog called “Keep the Filibuster, Say Dem Senators,” Ryan Grim writes about the pushback from Democratic Senators against current calls from Progressives to get rid of the filibuster. Since this pushback was coming from four of my favorite Democratic Senators: Claire McCaskill, Amy Klobuchar, Barbara Boxer, and Patrick Leahy, I thought I ought to reply to the position taken in the article, a position that I think represents a loss of long-term perspective resulting from too close involvement with everyday experience.
Of course, it’s true that progressives may need the filibuster in the future, just as the conservatives need it now. And, of course, some centrist Democrats, looking to maintain disproportionate influence that has nothing to do with their ideas or capabilities, want to maintain the filibuster. But none of this addresses the central point. That point is: how much Democracy, in the sense of rule by majority, should we have in our Government?
Frankly, I think the United States has much too little Democracy in its Government right now. Majority Rule doesn’t apply in Presidential Elections. Monetary Policy is under control of the “independent” unelected Federal Reserve. The Senate needs 60 votes to pass anything important. The will of the Congress can be thwarted by Presidential veto. The popular will is subject to International Trade Agreements. The Supreme Court is unelected. The House is elected based on unfairly apportioned districts designed to maintain a system of “rotten boroughs.” Right now, there are all sorts of arrangements in place to thwart majority opinion; but there is very little in our Governance arrangements to facilitate it. I think the extent to which majority rule is frustrated in America is detrimental to societal adaptation. We can’t change fast to meet crises, because all sorts of arrangements are in place to allow those who are against change to frustrate us. If we’re not careful, this incapacity to adapt to a fast moving world will result in the decline of the United States. It may have already done so, during the past 40 years.
We can’t and shouldn’t forget that elections are about the results of societal learning. They are our way of making operative the lessons we have learned about the failures of our society, and our way of getting new decision makers into power who embody those lessons. If these decision makers can’t apply those lessons because of barriers of process, we cannot learn whether those new lessons are right or wrong, the velocity of our adaptation suffers, and we are less adaptive than we should be.
It’s fine for Barbara Boxer (one of my favorite politicians, by the way) to point out that if the filibuster hadn’t existed during the 1990s, the crazy Republicans would have put through a whole mess of ridiculous and harmful legislation; but we have to recognize at least two things when evaluating her anecdote. First, Bill Clinton was President at the time, and the Republicans didn’t have enough votes to overturn a veto in most instances. And second, and more importantly, so what if the Republicans had been able to go crazy? Had that happened, the Congress would have been turned back to the Democrats that much sooner, and with no filibuster in place those same Democrats would have quickly been able to undo any Republican damage. The point is that there is a choice between preventing change with the filibuster procedure, or removing this barrier to change in order to allow people to learn from the consequences of bad legislation. A real Democracy has more faith in the latter method than in the former. Are we a real Democracy, or not?
Tags: Politics
February 12th, 2009 · 5 Comments

A few days back Neil Olonoff was kind enough to send me a note on Linkedin alerting me to a webinar presentation he had given to members of the Federal Knowledge Management Working Group. Neil says: “It has resulted in an amazing amount of new energy and action in this group. Our Initiative to implement knowledge management in the Federal government is now well underway!” I downloaded the presentation from slideshare and have been thinking about it on and off since.
I was glad to see a proposal for a National KM Center, especially since I proposed one myself some time ago in some earlier posts, and I’m also happy that the Federal KM Working Group is pushing to get something to happen. However, I’m afraid I was disappointed in the argument of the presentation for a number of reasons. These include: a) using too narrow a conception of KM and therefore understating its importance for Federal activities; b) failing to commit to an account of “knowledge,” thus creating a lack of clarity in one of the primary arguments in the presentation; c) proposing that the Federal CKO be located in the Federal CTO’s office; d) failing to make a more positive case for KM; e) a variety of loose formulations raising conceptual questions and lowering the credibility of the argument; and f) problems with the model at the core of the presentation. I’ll expand on each of these points below and in a future post.
Using too narrow a conception of KM: Neil’s presentation seems to avoid actually defining KM, perhaps in the hope that by avoiding this very contentious issue in the field, his proposal could aggregate the interests of many practitioners who might all agree on what he’s proposing even if they can’t agree on how to define KM. However, the problem of definition can’t be avoided quite that easily, since his argument seems to suggest that KM, for him, is activity intended to enhance the performance of knowledge sharing in the Federal Government.
This conception of KM was characterized by Mark McElroy, nearly ten years ago now, as First Generation Knowledge Management, and it may be contrasted with the more expansive Second Generation KM view he and I have articulated in many places, including in discussing National Governmental Knowledge Management last summer. In that discussion, I characterized KM as activity intended to enhance problem seeking, recognition, and formulation, knowledge production, and knowledge integration (which includes knowledge sharing, as well as other types of knowledge integration).
Of course, if the Second Generation conception of KM is used in proposals for a Federal KM Center, then suddenly we’re talking about a KM that is much more broadly relevant to the issue of Governmental adaptiveness to challenges, since we’re not just talking about enhancing “knowledge sharing,” but are also talking about seeing problems in Government operations and processes and developing solutions to those problems, as well as sharing those solutions with those who need them. Since KM is manifestly a much more significant discipline for decision making when construed in the Second rather than the First Generation manner, why would anyone prefer the First Generation conception?
This question is even more important to ask now that the Obama Administration has taken office. The new administration has a surfeit of problems and challenges it must meet. It does need better capabilities for sharing already existing knowledge; but it also needs better capabilities for seeing problems before their effects become unmanageable, and better capabilities for coming up with and severely evaluating and testing new ideas before they have to be applied. So, this Administration especially, since it is open-minded, pragmatic, and cares about reality, needs Second Generation KM and not just First Generation KM.
Second Generation KM is not only preferable to First Generation KM because its scope is broader; it is also preferable because First Generation KM has a foundational problem: specifically, its inability to clearly distinguish knowledge from information. I’ll write more later about the problem of defining “knowledge” in Neil’s presentation, but the main point here is that a knowledge sharing approach assumes that we have knowledge, can identify it, can distinguish it from “just information,” and then can communicate it to others. However, there is nothing in the First Generation approach that allows us to identify “knowledge” when we see it. All the talk about “knowledge sharing” comes down to “information sharing,” given this approach’s inherent inability to distinguish knowledge from information.
In contrast to First Generation KM, Second Generation KM includes knowledge production (creation, discovery, etc.). So, we know when we have made new knowledge. We can track knowledge as a cultural product if we wish, and therefore we can tell when we are sharing knowledge and when we are sharing information. Thus, paradoxically, it is the broader Second Generation approach to KM that allows one to actually track knowledge sharing activities and results, rather than simply tracking information sharing.
In proposing the Second Generation KM conception, I want to acknowledge that I’m not really suggesting anything original. Even before KMCI’s work developing the distinction between First and Second Generation KM nearly ten years ago, major works in Second Generation KM had already appeared. The foremost of these is Nonaka and Takeuchi’s The Knowledge-Creating Company (Oxford University Press, 1995), a book read widely in KM, which makes it quite clear that our discipline is about much more than knowledge sharing. And, increasingly, 21st Century KM practitioners are concerned about knowledge production as well as knowledge sharing. We see this in cases such as the Halliburton and Partners HealthCare cases, in graduate programs such as those at the University of Technology, Sydney and The George Washington University, in popular KM publications such as Inside Knowledge and KM World, in the Ark Group’s book series, and in a host of Elsevier book publications, including books by McElroy, myself, and Alex and David Bennet (Alex was a former head of the Federal KM Working Group). In light of all this, I have to ask why Neil and the current Federal KM Working Group apparently chose to represent KM in First, rather than Second, Generation terms. Why diminish KM, when the transition to Second Generation is so clear?
Failing to commit to a clear account of “knowledge”: Again, the motive for this is probably to avoid those interminable arguments over definitions of “knowledge.” I empathize. However, we need to know what Neil means by “knowledge,” because he says the Government has had “knowledge failures” and has “knowledge gaps.” He also says that “it” is an “enabler” and a “multiplier,” that it is “sticky,” and that sharing it leads to four benefits: “synergy and efficiency;” “quality and value enhancement;” “innovation;” and “morale.” How can we evaluate these claims without knowing what he means by “knowledge?” If he means one thing by “knowledge,” then “knowledge failures” would not be surprising, and “knowledge” might not be such a good “enabler” and “multiplier,” and might not lead to “synergy” and “efficiency.” On the other hand, if he means something else by “knowledge,” it might be quite a good enabler and might fit his other descriptions as well, but it might also be very hard to identify when we encounter it, and very hard to ensure that any KM activities can lead to its enhancement.
Nor do I mean to give the impression that there are only two possible construals of “knowledge.” Many different definitions are found in KM (see Ch. 1 of my Key Issues book). What I’m saying is that if Neil doesn’t state what conception he is using, it becomes much harder to evaluate the validity of his claims about the characteristics and benefits flowing from “knowledge.” Instead of clarity, his presentation relies on the good feelings we all have about “knowledge” to justify his claims about its utility and the significance of KM. That kind of approach may have been effective in the early days of KM in the 1990s, but we now have a 20-year, very mixed record of disputed successes and failures, and many current claims that KM is dying or dead. In this environment, we have to lay out arguments for KM that will survive moderately close analysis, and that means being clear about what we mean when we use the terms “knowledge” and “KM.” We may have no agreement among us on such terms. But the need for survivable arguments and proposals demands that our readers be able to analyze what any one of us is proposing and to be clear about what our proposals really mean.
Proposing that the Federal KM Center and the Federal CKO be located in the Federal CTO’s office: This presents two problems. First, it supports the idea that KM is primarily about technology, and therefore should be subordinate to the CTO and the CTO’s strategy. However, KM is not primarily about technology. It certainly uses technology as a handmaiden; but its focus is on enhancing knowledge processing, which, even if we limit that to “knowledge sharing” (which I would not do, for reasons given earlier), is a much broader focus than mere technology applications.
Second, and more importantly, locating the Federal KM Center in the CTO’s office subordinates it not only to the CTO’s strategy, but also to the President’s strategy, whatever that may be. Why shouldn’t the Federal KM Center be subordinate to strategy? Because there is a potential contradiction between the organizational function of enhancing a system’s adaptive capability and the operational strategy of either the CTO or the President. I’ve outlined the steps in my argument specifying this contradiction here, and I encourage readers to look at what I think is a careful development of this position. In a nutshell, however, the overall point of the argument is that the adaptive functions of organizations, including problem solving and KM, are about more than just serving the variety of goals or strategies of organizations. Rather, they are about change and an organization’s capacity to change itself, and so they must transcend and check the other executive functions of the organization, lest they freeze its operational pattern in a way that makes it too rigid to withstand the winds of change. This suggests that KM as a function should have autonomy relative to the Executive, and therefore should be subordinate to the Legislative and not the Executive branch of Government, as is the Government Accountability Office (GAO). So, I envision a Knowledge Accountability Office (KAO) as the Federal KM Center, as I’ve proposed in an earlier post in this series.
Failing to make a more positive case for KM: Neil’s argument for a National KM Initiative uses the need for knowledge sharing in averting errors as the justification for the initiative. But what about the past performance of KM in the Federal Government? Why not use previous Federal KM activities and results to support a proposal for a National KM Center? In an earlier post, I’ve pointed out that a lot of KM work is done without using the label KM. Why not research previous efforts of this kind in the Federal Government? A lot of Quality work falls into this category, as does a lot of work in the intelligence arena and in the Sciences. In general, any work in the Federal Government that creates, and then gets people to adopt, new methodologies for helping to see problems, solve them, and integrate the resulting solutions, is KM work. It’s done every day in the Federal Government, and some of it is successful. Wouldn’t the case for a Federal KM Center be that much stronger if one could point to the full range of successful KM work and talk about the need for a Center to coordinate that work, aggregate it, integrate it, encourage it, enhance it, and evaluate it as part of a systematic effort to enhance the adaptive capability of all agencies in the Federal Government?
A variety of loose formulations raising conceptual questions: On slide 2, Neil equates our “vast reservoir of information” with “these huge knowledge assets,” thus suggesting that “knowledge” and “information” are synonyms. This is something that no KM practitioner should ever do unless one is definitely committed to that view, simply because if one equates “knowledge” and “information,” one is also saying that “Information Management” is the same as “Knowledge Management,” and that there is no need for an autonomous field of Knowledge Management.
Slides 3–8 discuss a number of cases that illustrate the need for enhanced knowledge sharing, including the sinking of the Titanic, the 9/11 attacks, the space shuttle Challenger disaster, and Hurricane Katrina. On slide 8, Neil claims that “a word to the wise” could have averted these disasters. However, this ignores the possibility that a word to the wise can quite easily be ignored by those in authority, and that “knowledge sharing” may be ineffective in averting disasters much of the time. Thus, we know from the Rogers investigation that in the Challenger case, when the joint eventually compromised by the O-ring failure behaved in unexpected ways on previous flights, NASA failed to test the joint even though it was aware of deviations from specifications. Also, NASA was warned of faulty seals, but saw the problem as “an acceptable flight risk,” and, according to the Rogers Commission, required proof that it was not safe to launch Challenger, rather than vice versa.
This NASA Challenger failure wasn’t just a failure of “knowledge sharing.” Rather, it was a failure to look for and recognize problems where they existed. Enhancing “knowledge sharing” would not have averted the Challenger disaster. As Steven Spear has made clear in his book, only a more intense attitude toward seeking out, recognizing, and formulating problems would have done that. Further, NASA’s weakness in seeking, recognizing, and formulating problems persisted after Challenger, and was a major factor in the Columbia disaster of 2003. There, too, there was plenty of shared information, as well as shared knowledge, that might have averted the disaster; but there was also a resistance to interpreting warnings and deviations of behavior from expectations as problems that had to be solved, first, before Columbia was launched, and second, before it was allowed to re-enter the atmosphere.
While failures to share knowledge certainly existed in the Katrina and 9/11 cases, it is also clear that other factors may have been more important for averting both disasters. In Katrina, scientists had actually forecast the likelihood of a Katrina-like hurricane hitting New Orleans, but there was a disposition to ignore these warnings, i.e. a failure of problem recognition. Further, much evidence was provided to the 9/11 Commission showing that both the possibility of a 9/11 type of event and the intention of al-Qaeda to attack the US were known; and that had these been taken seriously by the Administration, there might have been much more disposition to “connect the dots” in time to avert the disaster.
Even the relatively unambiguous case of the Titanic may be problematic as a case from which we can draw a “lesson learned” about “knowledge sharing.” First, when someone gets “a word to the wise,” they may receive it with an open mind and evaluate it carefully. But what if they think that their ship is “unsinkable,” or think that there can’t be icebergs where the “shared knowledge” says there are? My point here is that, to the recipient, “shared knowledge” is always just information. The recipient will always need to evaluate it before accepting it as knowledge. If their gut reaction to the “shared knowledge” is to ignore or discount it, then sharing won’t avert disaster, even when the knowledge itself corresponds to reality. Of course, there’s no guarantee that “shared knowledge” will correspond to reality, and this is part of the reason why it is often ignored and doesn’t have the desired effect in averting disaster.
On slide 9, KM is characterized as “a ten year old discipline.” Actually, it’s at least a 20-year-old discipline if it’s dated from 1989, when Karl Wiig used the term “Knowledge Management” and Wiig, Karl-Erik Sveiby, and Bob Buckman performed the first work identified by that name. Many in KM are aware that the discipline is much more than 10 years old, that KM work was going on all over the globe in 1999, and that KM Conferences bigger than any we have now were being held then, because KM was “the hot ticket.” In a post coming soon I’ll take up the question of problems in the model underlying the presentation.
To Be Continued
Tags: Epistemology/Ontology/Value Theory · Knowledge Integration · Knowledge Making · Knowledge Management · Politics
February 11th, 2009 · 1 Comment

When the Senate cut Education Funding, Aid to the States, Aid to Low Income Families, Renewable Energy Investments, Health Information Technology, and Science Funding, all of which would have produced $.57 for each dollar invested, and instead increased wasteful tax cuts, which will produce an additional $.02 gain on each dollar invested, I began to receive e-mails from associations dedicated to one or another of these interests. Basically, the e-mails all express outrage over the Senate’s actions and want me to sign a petition, call or fax my Senators, and donate funds to “fight” to restore funding for their favored interest. I think this is nonsense, because it feeds into the divide and conquer strategy of the obstructionists.
As I said a few days ago, what the Senate does or does not do in relation to the Recovery Act is determined by its maintenance of the procedural rule enabling the filibuster. If there were no filibuster, none of these cuts would have occurred. The wasteful tax cuts would not be in the bill, we would not be looking at 438,000 – 530,000 fewer jobs resulting from the Recovery legislation than we were expecting from the House Bill, and the “gang of four” Senators who framed this foolish “compromise” would have had no power to damage the House Bill. Furthermore, without the Senate filibuster and the House’s anticipation of the Senate’s reaction, the House might even have constructed, and the Senate might have acquiesced to, a bill that could be expected to produce more than the 3 – 4 million jobs expected from the House Bill.
In short, the key to easing the dissatisfaction of everyone who wants to have their needs addressed by the Recovery Act is to join together and to end, once and for all, the power of the Senate to obstruct, and the power of a few Senators to damage needed legislation by exacting a high price in return for their willingness to break a filibuster. All those who are disappointed and angry about the way the Recovery Act has been shaping up need to join together and pressure Harry Reid and the Obama Administration to exercise the “nuclear option,” and free us from the filibuster’s yoke. They all ought to gift us with a single e-mail in our inboxes tomorrow, and that e-mail should call for the “nuclear option” and the immediate end of the filibuster. The Constitution provided for majority rule in the Senate, not for the rule of a 60-vote super-majority. Those who want change should not have to cope any longer with the tyranny of an obsessive, ideological, and arrogant minority, who never met a tax cut they didn’t like.
Tags: Politics
February 10th, 2009 · Comments Off on The Problem Solving Pattern Matters: Part Five, More Ways of Enhancing Problem Seeking, Recognition, and Formulation

(Co-Authored with Steven A. Cavaleri)
Here are some other ways one can enhance problem seeking, recognition, and formulation in organizations. First, Management can assist in moderating the natural fears of people by offering Problem Seeking, Recognition, and Communication “boot camps” to employees. The objective of these boot camps is to train people in:
— specifying standards and expected outcomes of strategies, business processes, parts of processes, and activities;
— looking for places where expectations and actual events in operational processes and activities have diverged;
— understanding why problem recognition, in the sense of pointing to knowledge gaps, is important for competitive advantage, organizational effectiveness, and job performance;
— self-evaluating the results of their activities;
— recognizing when outcomes are, in fact, falling short of their expectations;
— recognizing what type of knowledge and capability they need to overcome the performance shortfall; and
— communicating about the problems they recognize.
The boot camps should use case study, knowledge café, and narrative elicitation techniques since an important goal is to provide participants with a variety of interpersonal perspectives on the areas to be covered. It is also important that sharing perspectives in a boot camp environment can begin to create a community that will reinforce the idea that problem recognition is important. This community may then be organized as a Community of Practice (CoP) after the boot camp is over.
Second, an important barrier to problem recognition is the failure to get “feedback” on the results of their activities to the people who performed them, so they can do a good job of monitoring and evaluating the consequences of their decisions. In organizations with active Quality Management, Balanced Scorecard, or other Performance Monitoring programs, there is a great emphasis on measuring outcomes and on reporting, and this may provide a good foundation for recognizing problems (knowledge gaps) where they exist.
But, since metrics from performance monitoring systems often aren’t relevant to testing expectations about how strategies, business processes, and activities will work, Management should also support metrics development and implementation activities throughout the organization, and should develop a metrics program covering the various aspects of the PSP. Management can support metrics development generally by performing research and development (if necessary) on methodologies for developing and implementing metrics and by sharing this knowledge with staff performing other business processes. In addition, metrics development can be supported by insisting that metrics are necessary to evaluate whether the results of activities meet or diverge from expectations. Further training initiatives may also be a good way of supporting metrics development throughout the organization.
Third, another aspect of providing feedback so people can recognize problems is to use Information Technology to provide relevant information (and sometimes knowledge) that is “baked into the jobs” of knowledge workers. In the Partners HealthCare case study, Tom Davenport and John Glaser report on such an application. There, the order entry system tracks each Doctor’s order and, in cases where the order appears to the system to differ from what would be prescribed by a knowledge base produced by an expert committee, reports the knowledge base recommendation to the Doctor, alerting him or her to the conflict between the Doctor’s prescription and the System’s. This timely feedback, an active integration of the knowledge base into the decision process of Doctors, tied to the Doctor’s role of ordering prescriptions, obviates the need for problem seeking and directly stimulates problem recognition and an individual-level effort at problem solving by the Doctor involved.
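A minimal sketch of how such an order-entry check might work may make the pattern concrete. The rule format, names, and thresholds below are purely illustrative assumptions, not details of the Partners HealthCare system:

```python
# Hypothetical sketch of a decision-support check at order entry.
# The knowledge-base structure and names here are illustrative only.

RECOMMENDATIONS = {
    # condition -> (recommended drug, maximum daily dose in mg)
    "hypertension": ("lisinopril", 40),
}

def check_order(condition, drug, daily_dose_mg):
    """Compare a physician's order against the expert knowledge base.

    Returns None if the order agrees with the recommendation; otherwise
    returns an alert string surfacing the conflict to the physician.
    """
    recommendation = RECOMMENDATIONS.get(condition)
    if recommendation is None:
        return None  # no guidance for this condition, so no alert
    rec_drug, rec_max_dose = recommendation
    if drug != rec_drug:
        return (f"Alert: knowledge base recommends {rec_drug} "
                f"for {condition}, not {drug}.")
    if daily_dose_mg > rec_max_dose:
        return (f"Alert: {daily_dose_mg} mg/day exceeds the "
                f"recommended maximum of {rec_max_dose} mg/day.")
    return None  # order agrees with the knowledge base

print(check_order("hypertension", "lisinopril", 20))  # no conflict
print(check_order("hypertension", "atenolol", 50))    # alert on drug choice
```

The point of the design is the one made above: the check runs inside the ordering workflow itself, so problem recognition is triggered by the worker’s routine action rather than by a separate problem seeking effort.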
Fourth, the most important way for Management to enhance the problem recognition capacity of an organization is for it to initiate and maintain a policy of “openness” in problem recognition. That is, a policy of maintaining freedom for all participants in business processes to state that a knowledge gap affecting performance exists, and to communicate that view to as many others in the organization as they care to, without fear of reprisal. Openness here will support the objective of distributed problem recognition and greatly increase the probability that problems will be addressed in the PSP. Again, the example of Toyota is instructive here. It encourages highly distributed problem seeking, recognition, and formulation, and not only are there no reprisals for such activity, but it is viewed as part of everyone’s job.
Increasingly, a policy of openness in problem recognition requires allocating resources for an Information Technology infrastructure that will empower staff to exercise this freedom. In practical terms, that means implementing software that will allow the free publication of newly identified knowledge gaps in the context of the business processes, work flows, and types of decisions that generated them. Enterprise 2.0, social media, social software, and social computing tools are related terms that describe current technology for implementing increases in transparency, increased participation, and content aggregation inside organizational firewalls.
To Be Continued
Tags: Knowledge Making · Knowledge Management
February 8th, 2009 · 2 Comments

(From http://www.obamamites.com)
In some previous political blogs, I’ve talked about getting bipartisanship the wrong way around, and how and why to get rid of the filibuster. In this blog, I want to intensify my message on these issues and also direct it to Barack Obama. Today, Paul Krugman weighed in with his evaluation of the Senate’s compromise “stimulus” bill. He said:
“The short answer: to appease the centrists, a plan that was already too small and too focused on ineffective tax cuts has been made significantly smaller, and even more focused on tax cuts.
“According to the CBO’s estimates, we’re facing an output shortfall of almost 14% of GDP over the next two years, or around $2 trillion. Others, such as Goldman Sachs, are even more pessimistic. So the original $800 billion plan was too small, especially because a substantial share consisted of tax cuts that probably would have added little to demand. The plan should have been at least 50% larger.
“Now the centrists have shaved off $86 billion in spending — much of it among the most effective and most needed parts of the plan. In particular, aid to state governments, which are in desperate straits, is both fast — because it prevents spending cuts rather than having to start up new projects — and effective, because it would in fact be spent; plus state and local governments are cutting back on essentials, so the social value of this spending would be high. But in the name of mighty centrism, $40 billion of that aid has been cut out.
“My first cut says that the changes to the Senate bill will ensure that we have at least 600,000 fewer Americans employed over the next two years.”
In my last post, I said that retaining the Senate’s filibuster isn’t worth the job of a single laid-off American. And, it’s certainly not worth the jobs of 600,000 Americans. Nor is it worth a shortfall in GDP of $2 trillion over the next two years. But worse than that, it’s not worth the cost of breaking a filibuster when Barack’s energy program comes up for passage. It’s not worth the cost of breaking the filibuster, when National Health Care comes up for a vote. It’s not worth the cost of breaking filibusters when we try finally to fix the damage wrought by Katrina. Or to fix OSHA. Or to get the SEC, the FDA, the EPA, and the Consumer Product Safety Commission working again. Or to fix all the other agencies of Government gutted by the Republicans and short-changed by the Clinton Administration. It’s not worth the cost we’ll pay when we finally move to ensure that well-to-do people again pay their fair share of taxes; or when we try, once again, to restore the American Public School system to its one-time glory, or when we try to ensure that every young American gets their shot at a good University education.
No, I don’t want to attempt to build bipartisanship and keep the Senate filibuster and pay these heavy costs. I didn’t vote for that. I voted for change. If bipartisanship is part of this change, then fine. But, bipartisanship is process, and I want results more than I want process; especially extra-constitutional process like the filibuster and bipartisanship.
I don’t think there are very many Americans who will think this is a good trade-off, either. Bipartisanship and the filibuster just ain’t worth the American Dream. Maintaining the power of Susan Collins, and Olympia Snowe, and Joe Lieberman, and Ben Nelson, and Arlen Specter, and George Voinovich just ain’t worth the American Dream, or even a month’s postponement of it.
So, I say to Barack Obama: first things first. You want change? Real change? Change in Washington? Then getting rid of the filibuster is the first change you need. After that, all the other changes will be a hundred times easier, because you’ll put together programs that can and will be successful in putting the country back to work and making it strong again. Tell Senator Reid this:
Harry we need to get that filibuster, and we need to do it, because we were elected to deliver change and we are going to be accountable if we don’t deliver “good change,” “effective change.” The Republicans aren’t going to be blamed if things don’t work. I am. You are. We are. This is our chance, and we can’t let an extra-constitutional undemocratic process stand in the way of our getting results. We’ve got to do all we can to enact our best program. Then if we fail the country, the people can try something else. But let us not fail, because we allowed an anti-democratic and extra-constitutional process enshrined by previous congresses, to stop us from taking our best shot. Let us be accountable, and then if we fail, let the Republicans be accountable for what they do. That’s what Democracy is about. Let’s, at long last, have some Democracy. Let’s give the public what they really voted for. Not bipartisanship. Not the filibuster. But a second New Deal. And this time, one that won’t stop at the water’s edge; but that will deliver the American Dream.
To Be Continued
Tags: Knowledge Making · Politics
February 7th, 2009 · 8 Comments

The Republican Tax Cut wingnut, Steve Forbes, once said of the IRS: “The only thing we can do with this hideous beast is kill it, drive a stake through its heart, bury it, and hope it never rises again to terrorize the American people!” While I don’t share this view in relation to the IRS, I do think the sentiment is perfectly crafted to express my feelings about the Senate filibuster.
The filibuster is an extra-constitutional travesty that has too often undermined the power of the US Congress to express the will of the people. It has worked to require super-majorities whenever the United States has to get anything important done. The need for super-majorities, in turn, a) has stopped action favored by a majority in many areas; b) where action is possible, has often watered down or gutted its effectiveness, because the need to compromise with minority opponents of legislation has required the majority to agree to loopholes, “fine print,” and exceptions; and c) perhaps most important of all, has prevented later adjustments by the majority to correct errors in legislation and its unanticipated effects, because, very often, an administration may get only one bite at the apple in each major area of concern.
The “one bite at the apple” problem is made much worse by the need for super-majorities. Legislatures can’t follow a continuous improvement/learning-based approach to legislating. They need to get it right the first time. But, that’s a virtual impossibility, because politics and economics deal with complex systems and none of us know enough about such systems to do it right the first time without pure, blind luck. The current stimulus package, the coming health care bill, the future energy and environmental legislation, all are sure to be flawed and to require continuous improvement, just as we’re finding with the TARP legislation. But we won’t be able to do that improvement, because the lack of a super majority won’t allow it, and because if there is a need for improvement, that very fact will ensure that the minority’s political interest will impel them toward preventing it.
How do we handle complex systems in non-legislative environments in order to be successful? The best method we know is to develop a solution to a problem by comparing alternatives and selecting what appears to be the best, monitor the results closely, and if those don’t meet our expectations, then recognize another problem and go back for a second or third or fourth bite at the apple, in order to continuously improve our results until we meet some standard we’ve had in mind from the beginning. The biggest problem with the need for super-majorities in Congress is that they make legislating a “crap shoot,” because they shut the door on any realistic possibility of proceeding along the path of correcting errors. Instead, super-majorities only allow us to pass an inferior solution to a problem in the first place, and when its results are unsatisfying to everyone, to blame the “ins” for failing, get the “outs” in, and give them a chance to try to get their own solutions through the same obstructionist process.
In the twenty-first century, a society that can’t adapt to error, which is, after all, the human condition, cannot long survive. And the United States is in for a very sharp decline unless we can do something about a legislative process that is incapable of continuously evaluating and improving the results of its previous decisions. That something is getting rid of the filibuster and returning to the constitutional requirement of a simple majority in each house of Congress to pass new legislation.
Getting rid of the filibuster is easy to do, if we have the will and are willing to abandon the mythology of the desirability of immobilist government that thwarts the will of the majority. The instrument for doing it is a maneuver that’s been given the name of “the nuclear option.” It was proposed by the Republican Senate in 2005 to overcome Democratic filibusters intended to block Senate confirmation of Presidential judicial nominees. When Bill Frist, the Senate Majority Leader at the time, got ready to “trigger” the option, which would have had the consequence of eliminating the rule or precedent underlying the filibuster, a bi-partisan so-called “gang of fourteen” (7 Democratic and 7 Republican) Senators arrived at a compromise which got the Republicans what they wanted, and saved the filibuster for posterity. The compromise was to avert a vote on “the nuclear option,” give up the filibuster on some of the nominees, table the consideration of others, and save the filibuster for “extraordinary circumstances.”
The 2005 conflict wasn’t the first time the nuclear option was attempted. It was moved on 10 previous occasions by various people, but each time it was attempted, it was either defeated, or a compromise was worked out to save it for future use. The procedure for implementing the nuclear option isn’t difficult. Here are the steps involved in exercising it.
1) During a filibuster, a Senator makes a point of order calling for a vote on the measure being considered by the Senate.
2) The presiding officer of the Senate, most often the Vice President of the United States, makes a parliamentary ruling upholding the point of order, citing the Constitution of the United States rather than previous Senate rules (which uphold the right of unlimited debate) as the precedent supporting the ruling.
3) A supporter of the filibuster will then “appeal from the chair” by asking whether the Chair’s decision will stand as the judgment of the Senate.
4) An opponent of the filibuster then must move to table the appeal.
5) Since motions to table are not debatable, the Senate immediately votes on the tabling and decides by simple majority vote.
6) If a majority votes to table the appeal, the ruling of the Chair (that the filibuster is unconstitutional, and that a majority vote is enough to bring a bill to a vote and to pass it) is upheld.
7) By its action in upholding the Chair, the Senate will have established a new precedent, namely that filibusters are unconstitutional, and that all legislation thenceforth may be passed by majority vote, following a point of order calling for a vote.
In the last national election in the United States, I, like so many others, voted for change in both economic and foreign policy, which means that I voted for Democratic candidates for office right down the line. I wanted the Democrats to have their fair shot at fixing the American Economy and ending the foreign policy debacles of the Bush Administration. I didn’t vote for more of the abysmally failed Republican thinking in either of these two areas. And since I view any input from them as clueless, reality-denying, and sure to result in more people losing their jobs and their dreams, I certainly didn’t vote for that political party, whose policies have failed, to have any serious inputs into the Recovery Package.
Now, I ask myself, why are they having serious inputs into the Recovery legislation? Why are they capable of persuading people to limit the overall size of the stimulus to under One Trillion Dollars, and to eliminate or reduce funding in the Recovery Act for Head Start, Education for the Disadvantaged, School improvement, Child Nutrition, Firefighters, Transportation Security Administration, Coast Guard, Prisons, COPS Hiring, Violence Against Women, NASA, NSF, Western Area Power Administration, CDC, Food Stamps, Public Transit, and School Construction? In short, why do they have the continuing influence that I and a majority of Americans voted against them having? Why haven’t we been able to get the “ins,” “out”?
The simple answer is the existence of the filibuster. Now, I’m well aware of all the arguments out there defending the filibuster on grounds that it is an important element protecting the minority against the tyranny of the majority in the United States. I don’t buy that nonsense at all. None of the other major Democracies in the world have anything like the filibuster, and I don’t see tyrannies in any of them. Also, the United States has a surfeit of anti-democratic elements in its political system protecting minorities. We don’t need an extra-constitutional institution like the filibuster. We have too little Democracy in the United States, anyway. Not too much. And we need to redress the balance if we’re to adapt to the challenges that face us.
What is the filibuster worth? The filibuster is not worth the job of a single laid-off American.
So, let’s use “the nuclear option.” Let’s use it this week. Let’s use it for the sake of the Recovery Bill. Let’s use it for the sake of all the legislation the Obama Administration has yet to pass. Let’s use it for the sake of all the changes our country will need in this very challenging century. And finally, let’s use it to drive a stake through the filibuster’s heart, and prevent that relic of a simpler and slower moving age from continuing to sap the life-sustaining energy of political innovation out of our Republic.
To Be Continued
Tags: Knowledge Making · Politics
February 6th, 2009 · 2 Comments

(Co-Authored with Steven A. Cavaleri)
Enhancing the power of an organization’s PSP is a matter of moving it toward the Open PSP from whatever position in phase space it currently occupies.

The Vision: Moving Toward the Open PSP
Moving an organization’s PSP is driven fundamentally by re-focusing the attention of employees from implementing existing solutions to improving those solutions by replacing them with newly created ones. We might call the general process of enhancing an organization’s PSP, Problem Solving Pattern Management (or PSP Management). PSP Management activities include all initiatives directed at enhancing the abilities of an organization’s employees and collectives to perform:
— seeking, recognizing and formulating problems,
— solving problems by developing new solutions, and
— communicating solutions to people who may need them.
Enhancing Seeking, Recognizing, and Formulating Problems: The Importance of “Looking for Trouble”
Problem seeking activities cross the boundary between the OP and the PSP. They are vital to the PSP because, without them, initiating it depends on passive problem recognition alone, reducing the frequency of PSPs and slowing the growth of knowledge in an organization.
An organization’s problem seeking activities may be dormant, or function only at a minimal level, because of a shared belief that finding new solutions is unnecessary, or that any needed solutions can be found by hiring consultants. More often, they are dormant because corporate cultures include predispositions that discourage employees from seeing problems and from “looking for trouble.”
One of the first things an organization’s executives must do to enhance problem seeking is to forcefully contradict and campaign against the shared belief that equates problem seeking with ‘rocking the boat,’ and then legitimize employee efforts aimed at “looking for trouble.” Six Sigma and Lean thinking have had the beneficial side effect of institutionalizing looking for trouble in business processes in certain areas of organizations. In Toyota’s kaizen-based lean manufacturing system, standard-setting by workers followed by problem seeking is viewed as one of the key drivers of organizational learning and continuous improvement. As Takeuchi, Osono, and Shimizu, making reference to Toyota President Katsuaki Watanabe, point out (p. 3):
“Voicing contrarian opinions, exposing problems, not blindly following bosses’ orders—these are all permissible employee behaviors. Watanabe, who recounts how he fought with his bosses as he rose through the ranks, often says, “Pick a friendly fight.” We were surprised to hear criticism about the company and senior management in our interviews, but employees didn’t seem worried. They felt they were doing the right thing by offering executives constructive criticism.”
Work groups are the focal point of ongoing problem seeking and problem solving efforts at Toyota. The company’s management structure and responsibilities are all oriented around helping team members become highly effective problem finders, a key aspect of PSP management in open PSPs. Managers at Toyota often view their primary responsibility as teaching problem seeking and problem solving methods to operational employees. As a result, spans of control at Toyota are small, and the organizational hierarchy is relatively flat, which, in turn, contributes to effectiveness in communicating problems up the hierarchy.
A good example of the differences between Toyota and American car manufacturers in the depth of their hierarchies is provided by the joint venture of General Motors and Toyota formed in 1983 called New United Motor Manufacturing Inc. (NUMMI) (see Matt May’s The Elegant Solution, pp. 61-65). Amidst comprehensive changes introduced by Toyota at NUMMI, including its introduction of a problem seeking, finding, and solving culture, Toyota replaced 101 job line descriptions with only one, team member, and reduced GM’s 14-level hierarchy to three levels of management: Plant Manager, Group Leader, and Team Leader.
Developing standards, including ideas about acceptable defect ratios, enhances problem seeking and problem recognition because standards support a process of monitoring outcomes to see whether they exceed acceptable levels. If they do, that may signal a problem: specifically, that knowledge about the operational process involved is flawed and needs to be improved, which is another way of saying that the problem must be solved and, equivalently, that new knowledge must be created.
Organizations like Toyota, Alcoa, Navy Reactors, Aisin, Pratt & Whitney, Massachusetts General Hospital, and Avenue A (see Steven Spear’s Chasing the Rabbit), that institutionalize continuous process improvement and either kaizen or kaizen-like procedures, “look for trouble” in an even more radical way. They assume that only zero defect levels are acceptable in the long run. Therefore, they are always monitoring deviations from perfection and are looking to continuously improve process knowledge, i.e. to continuously close knowledge gaps, to get defect levels lower and lower.
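The monitoring logic described above can be made concrete with a toy sketch. This is not any company’s actual system; the names (`ProcessStandard`, `signals_problem`) and the numbers are illustrative assumptions. The point is only to show the pattern: a standard defines an acceptable defect ratio, outcomes are compared against it, and an excess signals a possible knowledge gap in the underlying process.

```python
from dataclasses import dataclass

@dataclass
class ProcessStandard:
    """A standard against which process outcomes are monitored.

    All names and thresholds here are illustrative, not drawn from
    any real manufacturing system.
    """
    name: str
    acceptable_defect_ratio: float  # e.g. 0.01 means 1% defects tolerated

def signals_problem(standard: ProcessStandard, defects: int, total: int) -> bool:
    """Return True when the observed defect ratio exceeds the standard,
    i.e. when a knowledge gap in the underlying process may exist."""
    if total == 0:
        return False
    return (defects / total) > standard.acceptable_defect_ratio

# A kaizen-style regime would ratchet the acceptable ratio toward zero
# over time, so that ever-smaller deviations trigger problem seeking.
welding = ProcessStandard("welding", acceptable_defect_ratio=0.01)
print(signals_problem(welding, defects=3, total=100))   # 0.03 > 0.01 -> True
print(signals_problem(welding, defects=1, total=200))   # 0.005 <= 0.01 -> False
```

Under the zero-defect assumption the paragraph describes, one would simply drive `acceptable_defect_ratio` toward 0, so that any deviation from perfection signals a problem to be solved.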
To Be Continued
Tags: Complexity · Knowledge Making
February 5th, 2009 · Comments Off on The Wolf Cryer

Another Political Blog, I’m afraid. These may subside after awhile. But right now the dynamics of American Politics are awfully interesting, and I do think that this piece has some connection with problem solving and KM.
Once again, Dick Cheney has disturbed the peace. In a transparent attempt to “salt the mine,” he has delivered a dire warning that abandoning the practices of the Bush Administration with respect to Guantanamo and torture will leave us in grave danger of another terrorist attack on US soil, this one during the Obama Administration. Cheney claims he knows this because he has had access to secret information that the rest of us don’t have. Like those in the Kennedy, Johnson, and Nixon Administrations, who claimed to know things the rest of us could not know about Vietnam because we had no access to the classified information they had, he relishes using this sort of “reinforced dogmatism,” because he fancies that there is no way to argue successfully against him, since the “factual” grounding “justifying” his conclusion is available only to him and not to us.
Of course, he is quite mistaken in this. There are some very good arguments against his views. They’re ad hominem, of course, and don’t really confront the substance of his argument. But since he’s using a reinforced dogmatism anyway, there really is no substance to confront, and that’s his choice, not ours. In any event, here are two such replies.
First, there’s the conjecture that when US Government officials offer a conclusion and claim to know that something is certainly true based on their special access to classified information, that conclusion generally turns out to be false. We have had much experience with this in relation to facts about the Vietnam War, the Domino Theory, the supposed intentions of the USSR and China, the Gulf of Tonkin Incident, the WMD in Iraq argument, the prior presence of al Qaeda in Iraq, and countless others. It is, in fact, hard to think of even one counter-example where a public conclusion based on classified information which the US Government could not reveal turned out to be true. Perhaps this is too harsh, and I’m certainly open to counter-examples. But the track record of the Government in asserting conclusions of this kind has been very poor since at least the Eisenhower Administration, and the track record of “Cassandra” Cheney himself has been highly questionable on matters ranging from WMD to al Qaeda in Iraq to “yellow cake” sales to Iraq.
So, given the track record of the US Government in such matters, and Cheney’s own track record of knowledge claims based on secrecy, what likelihood of being true should we give this latest assertion, particularly the part of it suggesting that we can avert an attack by continuing to imprison people in Guantanamo and torture them? I think that likelihood is quite low, not so much in relation to the part of his assertion predicting another attack, which, unfortunately may be quite likely, but, rather in relation to his claim that it can be prevented by maintaining Bush Administration practices.
Second, of course, the hysterical Press has fallen all over itself to cover this “news story” of Cheney’s latest “grave” warning, as if it were really news. But consider: here we have a man whose reputation is ruined in most circles, and whose standing in the polls is currently lower than that of either Jeremiah Wright or Bill Ayers. Now, he has basically three choices in reacting to the changes Obama is making in Bush Administration policy on Guantanamo and torture. He can be silent. He can express approval of what Obama is doing. Or he can strongly disapprove, while issuing a dire warning.
If he’s silent, that does nothing for his reputation. If he approves of what Obama is doing, his remaining very small base will crucify him. If he does what he just did and he’s wrong, well, things can’t really get any worse, because he’s at the bottom now. On the other hand, if there is an attack on American soil again, he is once again a wise man in at least some circles. In short, giving a dire warning is the one thing one would expect him to do, because it’s the only way to gain something from this issue for himself and his legacy.
The failure of the Bush Administration to do anything substantial about protecting the US’s ports and borders, while it conducted a Foreign Policy that by all accounts is a wonderful recruiting tool for the terrorist movement, has greatly increased the likelihood that there will be an attack on the United States. So it would not actually be surprising if there were another in the next few years; in fact, such an attack is probably what we should expect. However, a successful attack would be unlikely to have very much to do with President Obama’s abandonment of Bush’s policies, and everything to do instead with Bush and Cheney’s incompetent and mindless response to September 11th, a response and a legacy that, unfortunately, both President Obama and the rest of us will have to suffer.
To Be Continued
Tags: Epistemology/Ontology/Value Theory · Politics