Dissertation Research Workflow Diagram

Recently I’ve been finding that whenever I’m stuck in my odyssey towards writing up my dissertation, modelling my process flow with concept-mapping software (such as VUE) usually helps. In this (hopefully) final stage of my PhD project there are so many resources scattered around in various applications and folders on my computer that I need a formal “concept map” (if that’s the right term) to pull them all together and work out the relationships and interactions between them.

Here, for example, is the concept map I knocked up recently when I was unsure how to proceed with writing up the first four chapters of my dissertation. There is nothing particularly scientific about this map and it probably doesn’t follow any of the conventions of process workflow modelling. But who cares: it did the trick and allowed me to plan out the next stages of what I need to do.

Actually, at least two or three days of deliberation are captured in this chart. First, I needed to decide whether I was going to use ConnectedText or something else for the actual writing. Through trial and error I established that it’s better to use a different application because, however much I love working in CT, it does have some limitations. One is that you can only have one instance of CT running, with only one edit/view window open. Since I’ve decided to use CT as the database for my reading notes, I need a separate writing tool, so I can write on one monitor while referring to my CT notes on the other. Also, there isn’t an easy way to track the word count of your document while writing in CT.

I had considered WhizFolders briefly as an alternative, but I find its interface too busy to be able to concentrate on the actual writing. So I settled on Scrivener for Windows, which works well both as a two-pane outliner and as a writing tool with decent word-count tracking.

As the sequence of the process flow is not apparent from the chart, let me describe it briefly. I start by importing my master outline with inline notes from Outline 4D (via Word). The reason I created my outline in Outline 4D is that it is a single-pane outliner that allows you to have inline notes, which you can also view as index cards on a corkboard. Then I use Scrivener to break up the imported document into a two-pane outline using Scrivener’s handy “Split with Selection as Title” command. As I start writing the actual text (I’m working on the first four chapters of my thesis, which need to be contextualised within their respective literatures, namely the Introduction, the Literature Review, the Conceptual Framework, and the Methodology), I begin to review my existing reading notes.

Over the years I have read all kinds of things that are no longer relevant. Therefore I need some kind of filtering process to select the most important notes, as well as to identify any new reading that still needs to be done. To consolidate my final reading list (a list of bibliographic references), I use a Natara Bonsai outline. First I import into Bonsai an existing outline document containing some of my selected references, which I have kept on my iPod/iPad in CarbonFin Outliner. Then I go through my old conference papers and other writings, kept in Word files and an old Scrivener project, to extract references that are still relevant.

In parallel with this process I have also designed a ConnectedText project for keeping my final reading notes and quotes, using a model similar to the one I developed for my empirical analysis. As my old reading notes and quotes are kept in a WhizFolders database, I will need to review them and transfer them one by one to the CT database (I deliberately don’t want to import them en masse, as I need to separate the wheat from the chaff). I will also use the CT project for recording any new reading I still need to do. I am designing this CT database not just for this writing project: very likely it will become my main database for all my future reading for years to come. This is simply an opportune moment to get started with it, as I no longer want to use WhizFolders for this purpose.

Getting back to the chart, there are two important elements to it: 1) the big blue Scrivener rectangle, which represents my writing, and 2) the big green rectangle below it, which represents the CT reading notes database. The arrows pointing to the latter mostly show data that needs to be transferred (by careful sifting) from my old files, as well as new reading notes that will be created on the iPad.

As for the arrows coming in or out of the Scrivener project, those have to do mostly with referring to external sources. In the end I won’t need Excel for planning the word count because Scrivener has good enough tools for that. I will also use Dragon NaturallySpeaking for dictating, whenever I feel the need. Sometimes it’s easier to write without it, other times it speeds things up. As for EndNote, it is simply the central database for my references, which are linked to the PDFs that may need to be read for the first time or reviewed.

But my main point here is that it was creating this concept map that was crucial for getting me started with the whole writing stage. Without it I would probably have sat in front of a blank page with writer’s block for days. Now I feel fairly confident that I know what I need to do next.



Chapter 3 - Research Method

The method evolved during initial research, following both my investigation of the literature and new learning in qualitative research and descriptive methods. The literature enlightened my understanding of practical and effective methods for researching psychological phenomena in organizational life. New learning in qualitative and interpretive methods enabled my adaptation of appropriate methods for the unique focus of this study. I describe my experience with research methodology below, and then summarize the approach.

The Research Problem

The purpose of this study was to uncover and describe the relationship of individual and organizational values to the design of software products. The research started with the question “how do different values influence software products?” This inquiry led toward understanding the influences of organizational and individual values on the performance, relationships, and effectiveness of system design teams and their products. The particular focus was oriented toward values in conflict, since conflicts expose differing values within a context and enable observation and critical reflection. I addressed three research questions:

What values conflicts arise within software development teams?

To what factors do product designers and managers attribute values conflict?

How do innovation processes used in product development embed organizational values?

Instead of a single-focus research problem, overlapping “problems” were found in this inductive study. The overarching social concern was for a humane perspective in system and software product design, as a significant proportion of the working population now uses information systems and software tools. Only a decade ago, custom software was largely the tool of back-office administrators; bad design now affects the daily work and personal satisfaction of millions. The neglect of a socially oriented, humane perspective has arisen with the history of information systems in industry. The information systems disciplines, traditionally led by engineers, accountants, and managers, have produced systems reflecting their engineering and management cultures and values.

A guiding thesis asserts that these value systems shape the design of software products and information systems. In countless user interactions with systems I have noticed that the user interface, use of language, tasks afforded, and design possibilities are unwittingly oriented toward these values to a greater or lesser extent. Without reflection and evaluation of the impact of design values, products become conditioned by the perspectives of designers and their managers, based in the assumptions of business and engineering cultures.

The problem addressed by this research has been pointed to by research and thinking in system design and especially participatory design. Greenbaum (1991) describes how European trade unions that first explored participatory design approaches explicitly rejected the rationalistic, systems development perspectives in favor of a more socially-oriented approach from their own culture. However, in North America the business-based rational approach predominates.

The North American information systems industry embraces the technological imperative as a dominant style of operation, what Kling (1980) refers to as the rational style of systems organization. This approach focuses on “clear goals, tasks, specialized jobs, and procedures,” and is dominated by concerns for organizational efficiency. Because a technical approach is the only acceptable organizational style, information systems managers and engineers enjoy wide decision-making power. In my research and experience with large companies and government organizations, information systems groups often determine the requirements and design not only for systems but for business processes and work practices, based on their “ownership” of the technological tools for development. The predominant value system holds that technology solutions are sufficient to the needs of complex work. Technology is also assumed to solve perceived problems of productivity, communication, and production. These underlying assumptions are important, in that they indirectly coordinate work design and organizational communication. The rational operating assumptions arise from tacit value systems or theories-in-use that reward the easy fix, the technological “solution,” and a shared perception of progress.

The case studies developed in this research evaluated software projects from the rational model, which continues as the most pervasive approach in the information technology industry. These drew from information systems fitting the rational model and from commercial software products intended for use by rational organizations.

Research Approach

Habermas (1972) makes the claim that research is motivated by interests and values (discussed by Braa and Vidgen (1997), Flood and Jackson (1991), and Dahlbom and Mathiassen (1993)). Social, personal, and political interests that are part of the researcher’s context are folded into their research. Values problems encountered in research include the social and human conflicts issuing from applications, exemplified by the commercialization or military appropriation of basic science. Further, professional and technical values are confronted in research practice, such as professional values, codes of ethics or practice, and conflicts over the validity of claims. Since this research concerns values in an organizational context, the value-laden nature of the research should be admitted. This discussion presents the research values and interests, and offers a context for their appreciation or critique.

Qualitative and Interpretivist Methods

In organizational research, the search for methods that attempt to gain a holistic view of organizational issues has led to greater use of interpretivist methods. As more organizational research adopts interpretive approaches, their credibility has grown in the literatures of organizational psychology, information systems, and interdisciplinary social sciences. These literatures show an increased acceptance of interpretive case study and naturalistic observation methods in recent years, as cited in the literature review.

This summary of the research method discusses the inductive approach used and illustrates the sequence of research activities. A “pure qualitative” approach (Patton, 1981) was adopted, using a naturalistic inquiry method, qualitative data collection, and an interpretive hermeneutic approach to content and case analysis. Overall, the research approach fits with the class of social science research methods described as interpretive field studies (Klein and Myers, 1999). Other qualitative research adopting these methodologies in organizational settings includes the studies discussed by Weick (1989, 1993), Curtis (1988, 1994), and Argyris (1992), which are primarily based on case study research.

Case studies are particularly valuable for understanding complex phenomena in context, especially, according to Yin (1989), when “users’ intentions, technology use patterns, and social impacts – cannot be clearly separated from the social, technological, and organizational contexts in which they occur.” Interpretive field studies are often based in turn on the “soft case” study approach, described by Braa and Vidgen (1997) as a research framework for organizational study in information systems research. They demarcate between methods appropriate for prediction, understanding, and change; and soft cases are adopted when the research intent warrants understanding phenomena. Recognizing that many studies address more than one of these intents in varying degree, research approaches are mapped to the outcomes desired by the research intents. For predictive outcomes, reduction approaches are used; understanding necessitates an interpretive approach; and for institutional change, intervention approaches are employed. The research intention was to develop understanding, with an orientation toward social change. An interpretive approach was found suitable for this research.

The specific interpretive methodology adopted fits with the class of methods known as processual research (Hinings, 1997; Orton, 1997; Weick, 1993). Processual research studies organizational phenomena, specifically issues such as patterns of behavior among groups in context and the meaning of organizational behaviors. It draws on the methods of inductive grounded theory (Strauss and Corbin, 1990; Hinings, 1997), using a series of research methods to acquire data inductively, evaluate it in context, and generate initial theory. Processual research attempts to resolve the tension between inductive organizational studies, which are “data-rich, theory-poor,” and deductive studies, typically “data-poor, theory-rich” (Orton, 1997). It also bridges the gap between qualitative and quantitative research, as it can accommodate methods from both orientations without conflict. Interpretive studies generate insights and theory from the inherent richness of data acquired from naturalistic observation and semi-structured in-depth interviews.

Other researchers support processual research for use in iterative grounded theory (Orton, 1997), and for comparative case studies in advancing theory (Fox-Wolfgramm, 1997). The processual research approach allows for iterations between theory and analysis, affording variations of the research questions to form from the data analysis itself. “An iterative process researcher suspects that researchers who can predict the appropriate research design in advance might not be asking difficult enough questions.” (Fox-Wolfgramm, 1997, p. 432)

This study drew together three approaches – interpretive case research, grounded theory, and hermeneutics – to explore the data in depth and to triangulate methods to strengthen the interpretations. Since the study also developed initial theory, both inductive and deductive research methods were selected to support theory development from the data collection. The research was limited to initial theory only, suggesting significant further study in this area.

Grounding of the Research Methodology

The research literature of the three core disciplines and their specializations (participatory design – information systems, organizational psychology – organizational process, and design studies – product design) contains numerous discussions of values conflicts and the ethical concerns in organizations. I built on key studies from these literatures to establish the issues informing this work. Critical reviews and analyses were drawn from the literature as a starting point to highlight the social concerns of information systems design. Case studies were identified and evaluated for their applicability to these social and organizational issues. The critique of the case study sample also provided a literature basis for specifying criteria for developing the research instruments. These data gathering and analysis instruments were employed for interview sessions and hermeneutic analysis of content. Emergent themes and relationships among data and models were analyzed and documented.

The deductive phase used this body of data and the models as points from which to hypothesize and present cases for bolstering interpretations. As the data was interpreted, and various models developed, a validation approach was framed to test the models. A single detailed case study drawn from the interview sample was selected and observational and document details were evaluated against the models and the interview data.

The current study focused primarily on understanding values conflict in system design, suggesting an interpretivist case study approach. The theoretical development of values embedded in organizational process also required some development of a prediction-oriented model. The following steps characterize the path of inquiry.

Phase                     | Method                                                                                      | Tools
Initial (deductive)       | Development of heuristic model of product and process values, based on initial theory       | Theory development, model construction
Initial (inductive)       | Literature review and development of initial research questions                             | Online search, library research, following references
Investigation (inductive) | Analysis of seven case studies from literature                                              | Case study evaluation
Summary                   | Development of initial values framework                                                     | Synthesis, model-building
Inductive analysis        | Design and evaluation of interview guide, and interpretive field study of 10 project cases  | Hermeneutic (semi-structured) interview, case analysis, and content analysis
Inductive                 | Interpretation of field cases and cross-case analysis                                       | Transcript analysis, hermeneutic content analysis
Inductive                 | Development of categories and interpretive models                                           | Synthesis, model-building
Summary                   | Initial summarization of transcript and case study data                                     | Synthesis, model-building
Inductive                 | Design of process interview guide, and interpretive field study of 2 organizations          | Hermeneutic interview and content analysis
Inductive                 | Integration of case study and interview data toward development of theoretical model        | Transcript analysis, hermeneutic content analysis
Deductive                 | Development of theoretical categories and interpretive models                               | Synthesis, model-building
Deductive                 | Design of initial theoretical model                                                         | Synthesis, model-building, theory construction

Table 3-1. Inquiry Methods and Reasoning Models.

Research Methodology Overview

The complete research methodology is illustrated in Figure 3-1, which displays the flow and relationship of these activities.

Figure 3-1. Research Method.

The methodology is described in detail in the following section, Description of Methodology.

Description of Methodology

Each of the techniques employed in the research method is described below to explain its purpose and application to the study. Techniques are listed as steps in the order they were conducted, as shown in the flow diagram in Figure 3-1.

1. Background and Initial Theoretical Model

Prior to conducting PDE research in earnest, an initial theoretical model was developed using heuristics from prior research and experience. This model (included as a separate section at the end of this chapter) integrated the ideas and constructions generated during pre-research. This model served as the basis for testing process and values concepts during the inductive phase of research, but was not used to guide a deductive research process.

2. Literature review

The initial research activity reviewed the literature and developed research questions. The literature review spanned the body of journals, abstracts, relevant book sections, and references from articles across the works of systems engineering, organizational psychology, and industrial design disciplines.

Methodology review was conducted across the qualitative research literature, starting from phenomenological and hermeneutic studies (Gadamer, 1976, Ricoeur, 1981, van Maanen, 1991) and qualitative research and evaluation (Patton, 1990, Strauss and Corbin, 1990, Denzin and Lincoln, 1994). The methods of case study, grounded theory, and action research were investigated and documented.

The initial research questions were drawn from exploring the research in personal and organizational values conflicts in system development throughout these literatures. Three rounds of questions were developed before defining the research question and sequences of questions for interviews. Peers and faculty reviewed each round for applicability, unbiased presentation, and independent contribution to the research question.

3. Literature case analysis

Case studies were identified through the literature review, drawing from journals in the areas of computer science and organizational studies. Twenty cases were evaluated and interpreted for their exposition of values and moral conflict in systems design work and in design decisions, resulting in seven studies selected for analysis of values issues.

Cross-case analysis was conducted on these case studies, specifically focusing on values issues in organizational design process. The analysis involved:

Literature review of cases. Although four of these studies dealt explicitly with software product development (Curtis, Poltrock, Tang, and Walz), the other three (Orlikowski, Robey, and Sachs) were from the information systems literature, oriented toward organizational approaches. All of the cases demonstrated results with specific implications for organizational process and behavior.

Selection of criteria for case candidates. My criteria for evaluating and selecting cases were reviewed with my doctoral committee.

Analysis of candidate case articles from relevant literature. Twenty candidate articles were reviewed for their fit against the specified criteria. Ten studies were selected based on the initial criteria.

Identification of values dimensions and issues in the articles. Case studies were analyzed to identify supporting patterns and issues relevant to developing theory. The next section in this chapter identifies and discusses the case studies and their analysis.

In both the initial and final reviews of literatures across disciplines for this research, these seven cases were found the most applicable and detailed, apart from discussion papers and derivations of the same cases in other published reports. Several other research reports and cases were used as supplementary models and theoretical guidance (Kling, 1996; Kumar and Bjorn-Anderson, 1990; Friedman, 1997), but these did not meet the criteria for descriptive foundation cases for the purposes of this study.

4. Case-based model development

Based on the case study analysis, the initial values framework for systems development organizations was designed. Values issues, specific values dimensions, and their relationships to organizations and individual behavior were organized into a framework and analyzed for general consistency, applicability, and relevance to the data collection materials. This framework, with the research questions, was used in preparing the interview guide.

The interview guide was developed from the combined initial research, using the research questions developed in the literature review, and supplemented by the values framework developed in the analysis of research models. An initial interview guide was designed, and subsequently reviewed and critiqued by committee peer Dr. Linda Tobey. This guide was evaluated in a pilot interview, and based on evaluating this pilot session, was modified before being used consistently with all participants. See Appendix A for the interview guide.

5. Individual data collection

Interviews were scheduled with participants who fit the background requirements, based on a purposive sampling approach, specifically operational construct sampling (Patton, 1990). The unit of analysis for this data collection was the project, so project experience was required. Although originally only 5-7 in-depth interviews were planned, participation was expanded to 10 to include more participants from other product organizations and to better balance the sample between software developers/designers and business/product managers. The interviews were conducted using a printed, standardized instrument as an interview guide for semi-structured interviews. Not every question was asked of each participant, but each question that was asked was presented in the same way to each participant to minimize bias.

Within the interviews, participants were encouraged to reflect on and pursue their own interpretations of their experience. A hermeneutic phenomenological approach (Douglass, 1997) enabled participants to reflect on the meaning of their experiences during the interviews. This approach engaged participants in a deeper exploration of their ascribed meaning of organizational behaviors and interactions with teams in development projects.

The interviewing process involved: 1) A pilot interview to refine the instrument and questions, 2) final instrument review with committee members, 3) final instrument designed, 4) interviews scheduled and conducted, 5) interviews transcribed verbatim from audiotapes, and 6) analysis of interview data.

6. Data coding and integration

The interviews were transcribed and coded according to initial categories. Content analysis of the interview transcripts followed procedures drawn from Patton (1990). The structure of analysis followed the questions represented in the interview instrument. The unit of analysis was the project case, and interview questions and dialogue focused on values conflicts in software development projects. Cross-case analysis was used to draw forth common and recurring experiences and concepts.

Several analysis passes over the data yielded different sets of findings. This iteration across the transcript data is a typical procedure in grounded theory approaches, using open coding and axial coding of data (Strauss and Corbin, 1990). Open coding enabled the derivation of categories as suggested by the data, a preferred approach given the inductive orientation of this phase of the study. Possible categories were defined through iterative analyses of the text data, with refinements to the coding made as new data were evaluated for their addition to the category scheme. Each pass analyzed the data for features of a specific research question, developing the categories based on recurring themes. Axial coding was then used to stabilize the set of categories in each dimension, allowing the linking of findings and the development of connecting findings. The resulting scheme of the open coding is documented in the Interpretation Matrix, Appendix C.
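The coding in this study was an interpretive act performed by the researcher, not an automated procedure. Still, the bookkeeping side of an open-coding pass can be sketched in a few lines: coded transcript segments are grouped under their emergent category labels, and recurring themes are those supported by more than one participant. The participants, excerpts, and category names below are entirely hypothetical, for illustration only.

```python
# Illustrative sketch of the bookkeeping behind an open-coding pass.
# All segments and category labels here are hypothetical examples,
# not data from the study.

from collections import defaultdict

# Hypothetical coded segments: (participant, excerpt, assigned categories).
segments = [
    ("P1", "Management pushed the deadline regardless of quality concerns.",
     ["schedule_pressure", "quality_conflict"]),
    ("P2", "We never asked users what they actually needed.",
     ["user_involvement"]),
    ("P3", "Shipping on time mattered more than fixing known defects.",
     ["schedule_pressure", "quality_conflict"]),
]

def build_category_index(coded_segments):
    """Group excerpts under each category label (the open-coding scheme)."""
    index = defaultdict(list)
    for participant, excerpt, categories in coded_segments:
        for category in categories:
            index[category].append((participant, excerpt))
    return dict(index)

index = build_category_index(segments)

# Recurring themes: categories supported by more than one coded segment.
recurring = {c: items for c, items in index.items() if len(items) > 1}
print(sorted(recurring))  # ['quality_conflict', 'schedule_pressure']
```

In practice each analysis pass would refine the category labels themselves (merging, splitting, renaming) before an axial-coding step relates the stabilized categories to one another; the sketch captures only the grouping step.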

Analysis and coding of the data transcript resulted in several matrices, spreadsheets, and summaries used to visualize and represent the data, enabling further discovery of patterns in the issues raised by the participants. Finally, the comprehensive findings developed from the analysis were presented as the analyses and summaries in the Findings chapter. This completed the inductive research, evaluating the data to understand its content and meaning.

Following this analysis, descriptions of software projects were organized by associating patterns of statements and observations with the concepts from the values framework. This activity initiated the deductive phase of inquiry, or the initial development of theory, following a grounded theory approach. These models are documented in Chapter 3.

7. Organizational data collection

After analyzing the initial interview data and frameworks, findings emerged pointing to the effects of organizational culture on values conflicts in projects. These attributions were significant enough to support gathering data on organizational processes and the perceived values implicitly associated with product management and development. Most of the original ten participants had independently discussed the problems of organizational culture, so follow-up interviews were arranged with different participants to explore this dimension within the study.

Five in-depth interviews were conducted to follow up on these emergent findings. Participants were drawn from two of the firms represented by the original 10 participants. These participants were selected for their broader experience across many projects in their organization to which they could refer for questions about organizational processes and development process. These interviews inquired into the values in organizational processes and their impact on the organization; therefore a different unit of analysis (organization) was specified. The data collection resulted in annotated transcripts and a matrix of relationships among organizational process factors.

8. Hermeneutic interpretation - Integration of inductive data

The conclusion of inductive research required integrating all data sources and interpreting the findings, essentially a hermeneutic analysis of consolidated data. In evaluating transcripts from project cases and organizations, I focused on the organizational process questions and consolidated responses from all sources. The hermeneutic analysis was designed as part of an integrated research method – the interview data was gathered with the hermeneutic circle in mind, the data was analyzed based on the historicity and context of the organizations involved, and the interpretations were drawn from considerations of multiple voices.

The hermeneutic interpretation of the research was developed from Gadamer’s (1976) approach, which calls for self-reflection and critical analysis of the interests at stake in both the research and the scientific methods used in conducting the research. Further developing Gadamer’s hermeneutics into research methods applicable for information systems, Klein and Myers (1999) proposed seven principles from which research should draw. These principles and their application in my study are described as follows.

1.      The Hermeneutic Circle - The hermeneutic circle is considered fundamental to the interpretation process. This principle suggests that understanding is achieved through iterations in a dialogical reflection. The researcher iterates between considering the interdependent meaning of parts and the whole that they form. This principle underlies the other interpretive principles.

2.      Contextualization – The research must critically reflect upon a social and historical background of the field of the participants, taking into account the historicity of events and foregoing interactions that shaped the environment of the researched phenomena.

3.      Interaction between researcher and participants – The research process must support reciprocal dialogue between the researcher and participants, wherein the contributions of participants are allowed to affect the co-construction of ideas. This principle calls on the researcher to acknowledge and reflect on the social construction of the data derived from the interaction.

4.      Abstraction and generalization – Hermeneutic interpretation cannot be generalized directly from the findings; generalization must be tempered by an abstraction process. General findings are abstracted from their idiographic details and applied to the appropriate level of understanding.

5.      Dialogical reasoning – The researcher is required to adjust (and iterate) among contradictions between initial theoretical preconceptions and the findings emerging from the data. It is incumbent upon the researcher to allow the data to tell the story, not to fit the findings within a predetermined theory.

6.      Multiple interpretations – Each participant in the research may offer differing and novel interpretations of the issues studied and questioned. The multiple voices should be supported in the research by specifying where individual differences among participants affected the findings. The voices should be represented in the actual words of the participants. 

7.      Suspicion and sensitivity – The researcher must be sensitive to their own biases, and must practice “suspicion” of their own systematic distortions. While suspicion begins with the researcher’s adoption of epoché to clear the field of analysis of prejudice, the notion of suspicion carries this freedom from bias throughout the hermeneutic analysis.

The intent of the hermeneutic interpretation was to develop a thorough, multi-level description of the primary issues found in the study relating to organizational processes and embedded values. Focusing on the organization instead of the project, the hermeneutic analysis generated representations that informed understanding of how organizational patterns embody values and meaning in official processes, routines, and practices.

The hermeneutic analysis drew from the complete verbal content transcribed from all interview protocols. This analysis was performed on text from the ten values interviews and the six process interviews, enabling interpretation of patterns of interaction and descriptions of values systems. This hermeneutic content analysis reviewed each interview case for the unique voice and issues raised by the participant for their situation. Each project case was initially considered separately, and evaluated with respect to their context and experience of organizational processes. After this analysis, claims from across the sources were integrated into common themes. Repeated themes and similar meanings were drawn from the voices, and organized into a qualitative description for each of the two primary organizational environments studied in the research.
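The consolidation step described in this paragraph, grouping repeated claims from individual voices into common themes, can be sketched as follows. All theme labels, participant IDs, and excerpts below are hypothetical placeholders, not the study’s actual codes:

```python
from collections import defaultdict

# Each coded claim pairs a participant ID with a theme label and a
# supporting verbatim excerpt (all examples here are hypothetical).
coded_claims = [
    ("D1", "process as control", "the process was used to gate decisions"),
    ("M2", "process as control", "sign-offs became a way to steer direction"),
    ("D3", "shared design values", "we agreed the user came first"),
]

# Consolidate claims from across the sources into common themes,
# preserving each participant's voice under the theme it supports.
themes = defaultdict(list)
for participant, theme, excerpt in coded_claims:
    themes[theme].append((participant, excerpt))

for theme, voices in themes.items():
    print(theme, "-", len(voices), "voice(s)")
```

Keeping the participant ID attached to each excerpt preserves the individual voice behind every pooled theme, which supports the multiple-interpretations principle above.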

The organizational cases were evaluated as single-case hermeneutic analyses, drawn from the personal experience of each individual. This approach also acknowledged rival interpretations, or voices that differed from the interpreted recurring trends. One of the strengths of a hermeneutic research approach is its ability to identify how individual differences show up in a participant’s discourse. It functions as a check against biased or opportunistic abstraction of a participant’s specific meaning into the pooled interpretations, and forces the researcher to notice when the meaning of findings differs among individuals.

For example, one recurring claim, “process management appropriated for individual influence” must also be interpreted to allow for the legitimate organizational uses of process management. The interpretation must include and balance differences among interests for using the process.

The narrative interpretations portray the themes and meanings from two of the five individuals interviewed in depth for each of the organizations represented. Although these are not in themselves complete and sufficient descriptions of the phenomena, they are abstracted from the grounding of extensive contextual knowledge represented throughout the findings. The two individuals’ representations were balanced by the contributions of the other three interviews. However, these other interviews did not elicit themes from direct experience of working with product lifecycle management processes, as the interviews with the two primary participants did.

9. Activity Theory analysis

The theoretical analysis extends the findings beyond the hermeneutic interpretation, starting the deductive reasoning cycle for theory development. Applying a theoretical interpretive model (Activity Theory) maps the findings to its constructs. Where appropriate, new theoretical and explanatory constructs were developed for an emerging theoretical model. This step was experimental, and was designed to let the activity theory framework fit together the individual and organizational activities described in the data. However, this analysis was considered a separate research activity, and was not included as part of the grounded theory Findings. The organizational activity analysis and theoretical discussion are included in Chapter 5, Development of Initial Theory.

10. Initial theory development

The final phase involves developing initial theoretical models of the phenomena, to the extent the interpretations allow. As the final analysis of the study, the results of evaluating the models against the case study are used to further refine and describe the models. This deductive analysis completes the PDE research, but provides a platform for further description and research. The refinement and final development of the organizational values process theory is documented in Chapter 5.

Structure of the Study

Participants

Two separate rounds of interviews were scheduled to collect data on the two dimensions of the study. For the first round, semi-structured interviews were conducted with 10 participants, nine male and one female, ranging in age from approximately 28 to 55. Diverse project experience was required of the participants, and all had worked on at least three different projects involving the development of software products. Of the 10 participants, six were selected from the same company, although each had worked on different projects, and none had worked together on any of the projects described in the case reports. The remaining four participants worked in two other companies, for a total sample spanning three large organizations.

A purposive sampling method, operational construct sampling (Patton, 1990), was used to select the participants. The study’s intent was to explore organizational patterns in product design projects involving both managers and designers, so these constructs had to be elicited from the participants’ experience. Five of the 10 participants were software designers or developers, and five were product or business managers for software projects. One trial participant was also employed to test the protocols and procedure, and had experience as both a designer and a product manager. Eight of the managers and designers were from the same firm (referred to as Data Online Corporation, not the company’s real name). Three participants, all designers meeting the sampling criteria, worked for three computer software companies other than Data Online. Their representations offer a balance to the seven cases from DOC.

Supplementary in-depth semi-structured interviews were conducted with six additional participants matching the characteristics of the initial group, to investigate organizational process phenomena. These participants were from both DOC and one of the other firms selected for the initial project interviews, and they represented the same professional practices within those organizations. Four males and two females were interviewed, all with management or design experience.

The participants in these interviews had over five years’ tenure with their companies and a variety of professional and organizational backgrounds. Three designers were interviewed: one software engineer and two interaction designers. A key participant in this group was a software engineer with research interests and publications in organizational studies. Three product managers were also interviewed, including a senior product manager with 10 years’ experience who had held positions in product and resource management, and two with prior experience in human factors and design.

Materials

Materials used for recruiting and screening participants included the invitation-to-participate letter and the informed consent form, which all participants signed. Approval of the human subjects protocol (invitation letter and informed consent) was obtained via email from all committee members. Copies of these forms are included in Appendix G.

To conduct the initial interview sessions, a 21-question interview guide was prepared and tested with both peers and a trial participant. This interview guide minimized bias by providing a basis for a consistent sequence and approach to interviews, and by adopting consistent wording of the applicable questions. It also served as the form for collecting participants’ personal information and for taking specific notes during the interview. Interviews were audiotaped and transcribed verbatim; the transcripts are available for review. The interview guide also included two scaled survey questionnaires, one to assess product-oriented values and one to assess process values, which were administered to each participant following the interview. The interview guide is included as Appendix A.

The second round of interviews used a 15-question interview guide to elicit discussion of values in the organizational innovation process. This interview guide is also included in Appendix A.

Procedure

The interviews were conducted as follows. Participants were scheduled for a 90-minute to two-hour session in a private location, typically a conference room. They were asked to read and sign the informed consent, and asked whether they had any questions for the researcher.

Interview Procedure – Product Design Values

The following procedure was then used to conduct the initial semi-structured interviews, based on the instructions specified on the interview guide for the researcher to follow:

The description of the research was read, allowing the participant to ask questions to clarify the nature of the study or the expectations for participation.

An opening exercise was used to set the orientation for inquiry about values systems. Participants were asked to take three minutes to reflect on their personal values and write them down. This exercise served two purposes. It allowed participants to reflect on their most important life values as they currently hold them, which they could then draw upon as questions were asked in the interview. Also, since many of the interviews were held at the work location, the exercise enabled participants to consider the various contexts of values in their lives, and not just the work or organizational values that are normally acted upon in the work environment. Participants were not asked to reveal these values as part of the interview. When they freely offered them during the interview, this was treated as self-disclosure and as evidence of their strength as personal values.

Participants were asked to reflect on their work history and to identify a project in which they had a significant role and in which conflict arose among team members during the course of the project. They were then asked to describe the project and any conflict that occurred. In most cases, this first project identified was the case described throughout their interview.

Following this project description, participants were asked a series of open-ended questions in a semi-structured format from the interview guide. To minimize bias, each question was asked in a similar voice and manner for all participants, and minimal clarification was given when requested. If it was obvious that a question would not apply to the participant’s situation, it was skipped and the next applicable question was asked. Participants were encouraged to describe situations in significant detail, and were asked follow-up questions (as part of the hermeneutic circle) to draw forth emerging meaning.

As the interview concluded, participants filled out two Likert-scaled questionnaires. The first instrument posed six questions about the values associated with a software product of their selection. The second posed six questions about values associated with the design process of their case project. Each values question used a five-point scale differentiating degrees on a specific values dimension.
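A minimal sketch of how such instrument responses can be scored, assuming simple averaging of the six five-point items per instrument; the response values here are hypothetical, and the study’s actual scoring method is not specified in this section:

```python
from statistics import mean

# Hypothetical five-point responses to the two six-item instruments;
# the actual item wording and data appear in Appendix A, not here.
product_values_responses = [4, 5, 3, 4, 4, 5]   # product-oriented values
process_values_responses = [2, 3, 2, 4, 3, 2]   # process values

def instrument_score(responses, points=5):
    """Average a participant's ratings on a five-point Likert instrument."""
    assert all(1 <= r <= points for r in responses), "rating out of range"
    return mean(responses)

print(round(instrument_score(product_values_responses), 2))  # 4.17
print(round(instrument_score(process_values_responses), 2))  # 2.67
```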

Interview Procedure – Organizational Innovation Process

A similar procedure was followed for the additional interviews on organizational and development processes. Informed consent was provided, and participants had the opportunity to stop at any time. A research description was read, and any clarifying questions were answered. I then proceeded with the 15 questions, using the interview guide to keep the sequence and discussion on track. Participants in these interviews were more inclined to relate wide-ranging stories from their experience, making the interviews much longer than expected based on the initial round. This suggested a significant depth of understanding and consideration drawn from their experience, perhaps more than was elicited by the original interviews focused on values conflicts in project experience.

Project Interviews

Ten participants contributed to the interviews, responding to questions selected from the interview guide (Appendix B) based on their experience with a specific project case. Of the ten participants, five were product managers or held a similar managerial role, and five were software interface designers or developers. All participants had Internet product design experience, eight with extensive experience in this area. All case projects shared a similar work team structure: a project manager, product managers representing the business interests, software engineers, and human factors/interface designers.

All participants had over five years’ experience in their profession, and they represented five different organizations from three large companies with traditional corporate cultures. A weak matrix organizational structure was employed to some extent in each organization, with only one case representing a more traditional hierarchical structure. The organizations for nine of the cases were representative of the product organizational context (Jones, 1998); the single infrastructure project can be classified as a systems context project, though performed for a product context organization. All product teams sampled were from large organizations, each containing 50 to 300 individuals within the software development organization. Team sizes ranged from 6 to 120, with an average of 21 members and a mode of about 20.

The 10 cases represented a cross-section of software development projects, either commercial product development or product infrastructure; none of the cases were drawn from corporate IT or government systems. Products represented in the sample included Web search and retrieval products, CAD-CAM engineering software, enterprise retail sales, and legal research software. Products in the sample can be classified as either innovative (discontinuous) or redesigned (continuous) products, with the exception of a single infrastructure project supporting a family of products.

 

The following table briefly describes the project cases and identifies the roles of participants. The identifier key was defined for participants as a “D” for Designer/Developer or an “M” for Manager, typically a product manager.

Participant | Firm – Role | Project Case
D1 – Don | DOC – Senior Engineer | API for product infrastructure
D2 – Lynne | ADS – Product Designer | SalesView – Sales contact management product
D3 – Brian | DOC – Interface Designer | Web Enterprise – Web-based integrated news search product
D4 – Mike | CAD – Interface Designer | CAD-CAM software package
D5 – Kent | UT – Quality Engineer | CAD-CAM design tools
M1 – Paul | DOC – Product Manager | SearchBuddy – Web-based search products
M2 – Jack | DOC – Product Manager | Reference Citation Infrastructure
M3 – Hal | DOC – Product Manager | Web-based data enhancements
M4 – Karen | DOC – Product Manager | Web Product – Seven Web-based search engine products
M5 – Lloyd | DOC – Sr. Product Manager | Online Science – Web-based online scientific journals

Table 3-2. Participant Summary.

Case Interpretation Matrix

Responses were coded and summarized in a tabular format to compare effectively across dimensions of interest. A case interpretation matrix (Appendix C) enabled a summary evaluation, based on open coding, of each case specified by the participants. This format represented the coded form of personal values, roles and conflicts, and organizational factors raised in the interviews. Organizational factors identified included process values, both consonant (in accord with personal values) and dissonant. I also noted impacts of project conflict, and organizational values espoused and in-use.

Analysis of this summary data revealed patterns in the transcripts useful both for understanding a holistic view and for targeting specific phenomena of interest. These patterns and phenomena are described in more detail in the hermeneutic analysis below. The data were summarized into the categories shown in Table 3-3.

Category | Description
Participant | Identified participant ID, job title, context of work practice, and customer orientation.
Individual Values | Listed individual’s personal values statements if disclosed.
Roles and conflicts in roles | Types of conflicts described in statements (e.g., control of product direction, information sharing) and roles in conflict with theirs.
Process values – positive and negative values | Positive, shared values and negative values issues associated with work process – organizational, team, design, or development.
Organizational values | Organizational values identified, both espoused and practiced.
Product or organizational impacts | Impact of conflicts on product or organization.

Table 3-3. Case Interpretation Matrix Categories.
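One row of the case interpretation matrix can be sketched as a simple record type; the field names follow the Table 3-3 categories, and the sample values are hypothetical, not taken from the study’s data:

```python
from dataclasses import dataclass, field

# A minimal sketch of one row of the case interpretation matrix.
# Field names mirror the categories in Table 3-3.
@dataclass
class CaseRow:
    participant: str                                         # ID, job title, work context
    individual_values: list = field(default_factory=list)    # if disclosed
    role_conflicts: list = field(default_factory=list)       # conflict types and roles
    process_values_positive: list = field(default_factory=list)
    process_values_negative: list = field(default_factory=list)
    organizational_values: list = field(default_factory=list)  # espoused and practiced
    impacts: list = field(default_factory=list)              # product/organizational impacts

# Hypothetical example row.
row = CaseRow(
    participant="D1 - Senior Engineer, infrastructure",
    role_conflicts=["control of product direction"],
    process_values_negative=["schedule pressure over quality"],
)
```

Representing each case as a uniform record is what makes the cross-case comparison described above straightforward: every case exposes the same categories, filled or empty.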

 

Process Values Analysis Method

A survey format was used following the interviews to quickly collect data on the perceived values in force within the organizations represented by the case projects. Analysis of the survey data supported the interpretations of values conflicts drawn from the interviews. Participants gave subjective ratings to their case projects along values dimensions generated from the composite model. The instrument’s values dimensions were derived from the institutional values model described above, which structures a large set of design values inherent in organizational processes.

Evaluation of Case Studies

Cross-case analysis was conducted to inform the research problems in advance of data collection. As the literature review proceeded, candidate cases were identified and set aside for subsequent evaluation. Twenty case studies were screened for applicability to the analysis, and of these candidates, seven were selected that fit the criteria. The following criteria were used to evaluate the case studies.

1.      The case study describes work from the domain of information systems or software product development.

2.      The cases describe a diverse constituency: users, system designers, and managers, preferably documented with verbatim protocols from these roles.

3.      Examples of team interaction and professional conflict or differences are described. Preferably, conflicts were noted over interpretations of requirements, project activities, or how a design biases or advantages an interest group.

4.      Sufficient detail is provided in the case to support interpretation and content analysis.

Not all of these criteria applied evenly across the selected studies. Although each study fit the domain of information systems and described most of the types of constituents of interest, some criteria were met more definitively in some studies than in others. A recent article (Klein and Myers, 1999) presented a foundation for evaluating interpretive field studies in the IS literature. Interestingly, all of these case studies also meet most of those criteria, even though only one study (Sachs, 1995) used a hermeneutic approach. Since my goal was to elicit values constructs from the presentation and interpretation of field-based study data, the Klein and Myers approach further anchors these selected cases in the hermeneutic method.

The values systems or scales adopted for analysis of the content and statements in these studies were: engineering-technical, organizational-managerial, personal-individual, social-political, and human values. Specific values constructs (specific attributions of value) elicited from the cases were identified by using these top-level categories as points of reference and assigning each values dimension to a category.
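The assignment of elicited constructs to the five top-level categories can be sketched as a simple lookup with a consistency check; the example constructs below are hypothetical illustrations, not constructs from the actual case studies:

```python
# The five top-level values categories named above serve as the
# fixed points of reference.
CATEGORIES = {
    "engineering-technical",
    "organizational-managerial",
    "personal-individual",
    "social-political",
    "human values",
}

# Hypothetical construct-to-category assignments for illustration.
construct_assignments = {
    "elegance of implementation": "engineering-technical",
    "schedule adherence": "organizational-managerial",
    "professional autonomy": "personal-individual",
}

# Every elicited construct must land in one of the reference categories.
assert all(c in CATEGORIES for c in construct_assignments.values())
```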
