Lessons Learned


Throughout this chapter, we have tried to note our observations about the process of conducting a causal mapping study. We'd like to highlight lessons we've learned in three major areas in this endeavor: the mechanics of conducting a data-driven study, the value of planning, and looking for consistency and gaps.

The Mechanics of Conducting a Data-Driven Study

The mechanics of conducting this sort of study are significant — you can't let them become overwhelming. In our case, these mechanics included a major detour of time and effort to satisfy the requirements of the institutional review board (a.k.a., the human subjects committee). In the end, the effort needed to pass muster for human subjects research forced us into some early planning efforts that paid off. On the other hand, the overhead of record keeping and of drafting privacy warning statements for participants at times seemed highly disproportionate to the small risk and small probability of any harm coming to a study participant. Other mechanical difficulties included the extraordinary time it took to produce and check transcriptions; the manual coding and mapping (though we don't know how much confidence we would have in any software program that claimed to provide coding or mapping at a semantic or meaningful level); and, in our case, coordinating the work of researchers each residing on a different continent.

The Value of Planning

Steps taken early in the process can make life easier (or more difficult) in later phases. In particular, we would note the value of planning for causal mapping while creating the interview protocol. It can be argued that questions should be developed pertaining to the domain of investigation without regard to the particular method of analysis. In essence, this is what we did for this study. However, as a result we probably had to do much more inferring of relationships than if we had designed the questions with causal mapping in mind. Because we were looking for explicit illumination of relationships, it would have been very helpful to fill the interview protocol with “why” questions, particularly following from open-ended observations, to elicit these types of answers. It would also be interesting to see where respondents base conclusions on their own observations or extrapolations without any underlying causal model at play.

However, directly eliciting information about causal relationships, drivers, and outcomes may be problematic. People may not necessarily think about their experiences in those terms. Raising the issue may make people overtly cognizant of these matters at the expense of other cognitive patterns — for example, unrelated events, events forming networks with no clear causal relationships, or events perceived as hierarchies. The argument that causal relationships emerging in conversations without this focus may be more trusted than those expressed because the respondent was cued along these lines has merit and warrants explicit consideration.

Looking for Consistency and Gaps

We looked for consistency in cause and effect within as well as across interview transcripts, and we found that causal mapping clearly documents gaps. We raise two issues. First, people might simply not have a unified view of UML. For example, the lack of an implemented organizational and project UML strategy will quite likely reinforce this result: people would be left to make their own decisions. Second, the gaps may indicate a lack of data or inconsistent data. The number of interviewees in our research is only 11, which may not be sufficient for convergence of findings. Our subjects were spread among projects and organizational units and among organizations. Therefore, gaps in the causal maps may be used to determine where additional data collection is needed: what roles should additional subjects have, and how many within each role would be needed to increase the validity and reliability of the research findings? Hence, causal mapping may be used as a vehicle for detecting inconsistencies in practice as well as in research.
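Gap detection of this sort lends itself to simple tooling once the maps have been coded. The sketch below is illustrative only (the respondents, constructs, and links are hypothetical, not drawn from the study's transcripts): each revealed causal map is held as a set of (cause, effect) links, links asserted by more than one respondent are treated as consistent, and constructs absent from a respondent's map are flagged as gaps that might guide further data collection.

```python
# Illustrative sketch only: respondents, constructs, and links are
# hypothetical examples, not taken from the study's actual transcripts.
from collections import Counter

# Each interview's revealed causal map as a set of (cause, effect) links.
maps = {
    "respondent_1": {("UML training", "UML use"),
                     ("UML use", "on-time delivery")},
    "respondent_2": {("UML training", "UML use"),
                     ("management support", "UML training")},
    "respondent_3": {("UML use", "on-time delivery")},
}

# Consistency: links asserted by more than one respondent.
link_counts = Counter(link for m in maps.values() for link in m)
consistent = {link for link, n in link_counts.items() if n > 1}

# Gaps: constructs that appear somewhere in the data but are missing
# from a given respondent's map, suggesting where to probe further.
constructs_per_map = {r: {c for link in m for c in link}
                      for r, m in maps.items()}
all_constructs = set().union(*constructs_per_map.values())
gaps = {r: all_constructs - cs for r, cs in constructs_per_map.items()}

print("Consistent links:", consistent)
print("Per-respondent gaps:", gaps)
```

Even this toy version surfaces both kinds of signal discussed above: a link mentioned by only one respondent, and a construct (here, “management support”) that two of three respondents never connect to anything.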

We believe that our approach to data collection was appropriate. We did not use causal mapping principles as the basis for interview guide design. The objective of data collection was to allow the voice of “you” to speak in a fairly free manner. The interview guide therefore focused on UML issues to allow the respondent to express her or his own views. Causal mapping was employed as a method of analysis of the interview transcripts. We conclude that although causal mapping may be looked upon as a tool within the deductive school of research, it is also well suited for qualitative and exploratory purposes.

Conclusion

Our research objective was formulated as understanding representation in systems development in an organizational context. The approach to representation investigated was UML. We did not find a theory that would serve as a lens for defining issues, constructs, or relationships. As an example, the often-used diffusion theory (Rogers, 1995) was found to be inappropriate relative to our research needs. We decided that a data-driven approach, allowing practitioners to a large degree to speak in the voice of “you,” should be employed.

Because we think that practitioners will be concerned about UML-related issues and the impact of one issue on another — for example, that successful use of UML in system development would result in projects being completed on time and within budget — we turned to causal mapping as the method for data analysis. Although our research isn't yet complete, we find that causal mapping served us well in documenting constructs, variables, and relationships that practitioners deem relevant. Moreover, causal maps, as we hoped, resulted in the documentation of surprising elements, as well as gaps.

This chapter has focused on describing the methods utilized in a causal mapping study. The emphasis has been on presenting and discussing the decisions that arose during the process. It is not difficult to explain our thinking regarding the directions we selected at various choice points. It is more difficult to propose that these were “the right” or even good decisions. On the whole, many of these decisions represent trade-offs: getting a reasonable job done versus holding out for theoretical perfection; maintaining a usable audit trail versus not getting bogged down in our own detailed documentation; understanding in detail the views of our respondents versus emphasizing more general emergent themes that no individual may have directly expressed.

From our personal point of view, the study has created significant value in providing us with a much richer understanding of the role of UML in IT development — lessons that are extremely helpful in the classroom and in working with recruiters of our students. It is our hope that the ultimate observations presented in a final report will be viewed as combining elements that “confirm,” or add weight to, commonly held views that remain basically anecdotal in nature with elements of some surprising and new relationships and possibilities.

Acknowledgments

The authors wish to acknowledge Saint Louis University for providing a summer grant to the second author for work on this chapter.

References

Benbasat, I., & Zmud, R.W. (1999). Issues and opinions – Empirical research in information systems: The practice of relevance. MIS Quarterly, 23(1), 3-16.

Bohman, J. (2000). The importance of the second person: Interpretation, practical knowledge, and normative attitudes. In H.H. Kögler & K.R. Stueber (Eds.), Empathy & agency: The problem of understanding in the human sciences. Boulder, CO: Westview.

Booch, G., Rumbaugh, J., & Jacobson, I. (1999). The unified modeling language user guide. Boston, MA: Addison-Wesley.

Brancheau, J.C., & Wetherbe, J.C. (1990). The adoption of spreadsheet software: Testing innovation diffusion theory in the context of end-user computing. Information Systems Research, 1(2), 115-143.

Brown, D.W. (2002). An introduction to object-oriented analysis: Objects and UML in plain English. New York: Wiley.

Checkland, P., & Scholes, J. (1990). Soft systems methodology in action. Chichester, UK: Wiley.

Chen, P.P.S. (1976). The entity-relationship model – Toward a unified view of data. ACM Transactions on Database Systems, 1(1), 9-36.

Cho, I., & Kim, Y.-G. (2001-2002). Critical factors for assimilation of object-oriented programming languages. Journal of Management Information Systems, 18(3), 125-156.

De Marco, T. (1978). Structured analysis and system specification. New York: Yourdon.

Fahey, L., & Narayanan, V.K. (1989). Linking changes in revealed causal maps and environment: An empirical study. Journal of Management Studies, 26(4), 361-378.

Fichman, R.G. (2000). The diffusion and assimilation of information technology innovations. In R.W. Zmud (Ed.), Framing the domains of IT management: Projecting the future…through the past (pp. 105-127). Cincinnati, OH: Pinnaflex Educational Resources.

Fichman, R.G., & Kemerer, C.F. (1999). The illusory diffusion of innovation: An examination of assimilation gaps. Information Systems Research, 10(3), 255-275.

Goles, T., & Hirschheim, R. (2000). The paradigm is dead … long live the paradigm: The legacy of Burrell and Morgan. Omega, 28(3), 249-268.

Johnson, R.A. (2002). Object-oriented analysis and design – What does the research say? Journal of Computer Information Systems, 42(3), 11-15.

Klein, K., & Kozlowski, S.W. (Eds.) (2000). Multilevel theory, research, and methods in organizations: Foundations, extensions, and new directions. San Francisco, CA: Jossey-Bass.

Larsen, T.J. (2001). The phenomenon of diffusion: Red herrings and future promise. In M.A. Ardis & B.L. Marcolin (Eds.), Proceedings of the IFIP TC8 WG8.6 Fourth Working Conference on Diffusing Software Products and Process Innovation, April 7-10 (pp. 35-50), Banff, Canada. Boston, MA: Kluwer Academic Publishers.

Lyytinen, K., & Damsgaard, J. (2001). What's wrong with the diffusion of innovation theory? In M.A. Ardis & B.L. Marcolin (Eds.), Proceedings of the IFIP TC8 WG8.6 Fourth Working Conference on Diffusing Software Products and Process Innovation, April 7-10 (pp. 173-190), Banff, Canada. Boston, MA: Kluwer Academic Publishers.

Moore, G.C., & Benbasat, I. (1991). Development of an instrument to measure the perceptions of adopting an information technology innovation. Information Systems Research, 2(3), 192-222.

Nelson, K.M., Nadkarni, S., Narayanan, V.K., & Ghods, M. (2000). Understanding software operations support expertise: A revealed causal mapping approach. MIS Quarterly, 24(3), 475-507.

Nurminen, M.I., & Eriksson, I.V. (1999). Research notes – Information systems research: The ‘Infurgic’ perspective. International Journal of Information Management, 19, 87-94.

Prescott, M.B., & Conger, S.A. (1995). Information technology innovations: A classification by IT locus of impact and research approach. DATA BASE Advances, 26(2&3), 20-40.

Robey, D., & Markus, M.L. (1998). Beyond rigor and relevance: Producing consumable research about information systems. Information Resources Management Journal, 11(1), 7-15.

Rogers, E.M. (1995). Diffusion of innovations. New York: The Free Press.

Sauer, C. (1999). Deciding the future for IS failures: Not the choice you might think. In W. Currie & B. Galliers (Eds.), Rethinking management information systems: An interdisciplinary perspective. Oxford: Oxford University Press.

Sim, E.R., & Wright, G. (2001-2002). The difficulties of learning object-oriented analysis and design: An exploratory study. Journal of Computer Information Systems, 42(2), 95-100.

Van de Ven, A.H., Polley, D.E., Garud, R., & Venkataraman, S. (1999). The innovation journey. New York: Oxford University Press.

Van de Ven, A.H., & Poole, M.S. (1995). Explaining development and change in organizations. Academy of Management Review, 20(3), 510-540.

Weber, R. (2003). Editor’s comment – Still desperately seeking the IT artifact. MIS Quarterly, 27(2), iii-xi.

Yin, R.K. (1994). Case study research: Design and methods. Thousand Oaks, CA: Sage.

Endnote

1 Authors are listed alphabetically and contributed equally to this chapter.