Resource papers in action research
 

Qualitative action research: improving the rigour and economy

 

This is a resource file which supports the regular public program "areol" (action research and evaluation on line) offered twice a year beginning in mid-February and mid-July.  For details email Bob Dick  bdick@scu.edu.au  or  bd@uq.net.au

...  in which I offer some suggestions about ways in which action research can be made more attractive to practitioners by adopting methods for improving both the rigour and the economy in conducting and reporting their studies

 


 

Abstract.  In many of its common forms, action research is time-consuming and complex both to conduct and to report.  This is a disincentive to its wider use by practitioners and others.  It can be remedied by the deliberate use of dialectic methods -- multiple data sources, and constant effort to test apparent agreements and emerging interpretations.  If you do this at all phases of the action research you can increase efficiency at the same time that you improve the rigour of your study.

 

Action and research

As its name implies, action research can be viewed as having two main outcomes -- action and research.  It therefore requires two sets of procedures, one to achieve each of the outcomes.  In terms of the distinction between content and process, these are both instances of process.

(Content consists of the "what": the data.  Process consists of the "how": for example the processes for participation or for data collection.)

The first set of procedures might be called the intervention methodology.  Its eventual function is to bring about future change.  In the short term, therefore, its key function is to involve those who are most affected by the change in a way that secures their commitment.

The second set of procedures might be called the research methodology.  It has the task of adding to understanding.  In other words, it produces research outcomes.  This it does by generating valid data in a way that casts light upon the functioning of the client system or the action research process itself, or both.

This, I think, is consistent with the common use of the term (for example see most of the authors in Zuber-Skerritt, 1991).  Like Michael, Luthans, Odiorne and Burke (1981), some would limit the use of the term "action research" to participative methods.  That is the emphasis in this paper, though I would be reluctant to adopt participation as a necessary defining characteristic.

In most formulations, including the present one, action research is also explicitly cyclic in application.  Carr and Kemmis (1986), for example, conceive each action research cycle as comprising planning, action, observation and reflection.

Another way of describing its cyclic nature is as follows.  It often begins with only a rough or fuzzy research question, and perhaps a fuzzy methodology.  It is no surprise, then, that the early cycles usually yield fuzzy answers.  The answers can be used to refine both questions and methodology, however.  The process and content therefore are successively refined at each cycle.  You are more likely to achieve the action outcomes if you do this participatively.
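If it helps to see that cyclic structure laid out schematically, here is a minimal sketch in Python, offered purely as an illustration.  The step names, and the representation of "refinement" as annotated strings, are my own assumptions rather than any prescribed procedure.

    # A purely illustrative sketch of the cyclic structure described above.
    # The step names and the string-based "refinement" are assumptions made
    # for the sake of illustration, not a prescribed procedure.

    def run_study(question: str, method: str, cycles: int = 3) -> None:
        for i in range(1, cycles + 1):
            # plan, act and observe: in a real study this is fieldwork;
            # here we simply note that some (initially fuzzy) answer emerges
            answer = f"cycle {i}: applying '{method}' to '{question}'"
            print(answer)
            # reflect: the fuzzy answer is used to sharpen both the question
            # and the methodology before the next cycle begins
            question = f"{question} (sharpened after cycle {i})"
            method = f"{method} (sharpened after cycle {i})"

    run_study("How can participation be improved?", "open-ended group interviews")

The point of the sketch is only that question, method and interpretation are all revisited at every cycle; nothing is fixed at the outset beyond the intention to improve the situation.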

For some purposes it is also useful to distinguish a third component which manages the content.  This might consist of some techniques for data analysis; or models (such as categorisations or taxonomies) may be used to make sense of the data.  For example in soft systems methodology (Checkland and Scholes, 1990) the concepts of general systems theory are used so as to provide the content description.

As you might expect, action research activities vary in their emphasis on action or on research.  Some, such as those described in French and Bell (1990), focus on the action outcomes.  Others such as Heller (1976) clearly favour the research.  Whatever the balance, both can be at least partly achieved.

Further, the action and the research can enhance one another.  This is one of the advantages of the approach.

My intention here is to offer ways of increasing the research component in action research which has change as its primary goal.  If this can be done, I think that some of the present neglect of action research can be overcome.  Practitioners, who seem to do very little research, would find it more rewarding.  Those of us who work as consultants or practitioners might find it easier to publish the results of our work.

 

The strange neglect of action research

A tradition going back to Kurt Lewin (1946, 1951) and perhaps beyond (see French and Bell, 1990) hasn't led to widespread use of action research.  Some pockets of education have provided fertile ground (e.g. in Australia, Carr and Kemmis, 1986; Kemmis and McTaggart, 1988).  In many other disciplines it has gone almost unnoticed.  In recent years, however, the health sciences have made more use of it.  Nursing research using an action research approach is becoming common.

Although perhaps an extreme example, my own discipline of psychology will serve as an illustration.  Even at a 1986 conference with the theme "Bridging the gap between theory, research and practice", action research barely gained a mention despite its obvious potential to bridge the gap.

The teaching of psychology in universities makes the point as clearly.  Most psychology departments are engaged in practitioner training, yet action research rarely forms even a small part of the syllabus.

North American evidence is that conventional research teaching does not result in practitioners who do, or even read, research (Barlow, Hayes & Nelson, 1984).  We now know (Martin, 1989) that it is little different here in Australia.

On the other hand, action research appears particularly suited to practitioner training.  How is one to explain the neglect?

Part of the reason is no doubt to be found in the traditionalism of our universities.  But I think there is a more compelling reason for action research being a fringe methodology.  Its relative lack of economy in both conduct and reporting is a disincentive.  So is its perceived lack of rigour.  If these could be improved there might be more interest among practitioners and practitioner-academics in using it.

 

Gaining a focus: dialectic

At the heart of the approach I advocate is a notion of dialectic.  By this I mean something very similar to what Jick (1979) and others call triangulation, though more general.  Its effectiveness depends upon using brief and therefore multiple action research cycles (often by having cycles within cycles).  At each cycle you pursue multiple sources of information.

You can collect these in several ways, including different methods (that is, "triangulation" as the term is often used), different informants, different researchers, overlapping data from a single informant, and so on.

Within each cycle, you limit your attention to agreements and disagreements within the two or more data sets (Figure 1).  Between cycles, you seek data which challenge or disconfirm the interpretation already reached.  Each cycle begins by refining the questions and the methodology in the light of the previous cycle.

Figure 1.  Two overlapping data sets.  Only agreements and disagreements need be pursued, to test the agreements and explain the disagreements.

You collect multiple data sets.  You then interpret the data by focussing on the agreements and disagreements within them and comparing these to your prior interpretation.  You will see (below) that this provides the economy during the conduct of the study.  It also facilitates later economy in reporting.
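To make the comparison in Figure 1 a little more concrete, here is a small sketch, again in Python and again only as an illustration.  The representation of each data set as coded topic-and-statement pairs, and the wording of the follow-up questions, are my own assumptions; they are not part of the method itself.

    # Partition two coded data sets into agreements and disagreements.
    # The dictionary representation (topic -> what that source said about it)
    # is an assumption made for illustration only.

    from __future__ import annotations

    def compare(data_a: dict[str, str], data_b: dict[str, str]):
        shared = data_a.keys() & data_b.keys()
        agreements = {t: data_a[t] for t in shared if data_a[t] == data_b[t]}
        disagreements = {t: (data_a[t], data_b[t])
                         for t in shared if data_a[t] != data_b[t]}
        # topics raised by only one source are idiosyncratic and are set aside
        return agreements, disagreements

    def next_cycle_questions(agreements, disagreements):
        # test the apparent agreements; seek explanations for the disagreements
        questions = [f"Can anyone think of an exception to '{topic}: {view}'?"
                     for topic, view in agreements.items()]
        questions += [f"Why might one group describe {topic} as '{a}' and another as '{b}'?"
                      for topic, (a, b) in disagreements.items()]
        return questions

    # hypothetical themes coded from two informant groups in one cycle
    group_1 = {"workload": "too heavy", "feedback": "prompt", "tutorials": "helpful"}
    group_2 = {"workload": "too heavy", "feedback": "slow"}

    agreements, disagreements = compare(group_1, group_2)
    for q in next_cycle_questions(agreements, disagreements):
        print(q)

In practice, of course, the comparison and the follow-up questions are developed with the participants rather than mechanically; the sketch only shows where the attention goes -- to the overlap and the conflicts, not to everything that was said.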

 

Economy in conduct

The economy in the conduct of this style of research is derived from a number of sources, two in particular.

First, the interpretation takes place gradually.  It converges towards a final interpretation over the multiple cycles.  There is less need, therefore, to record the copious amounts of data that some qualitative methods require.  At each cycle you record only the data directly relevant to your current interpretation.

Second, you use the multiple sources of information to select the data to be recorded.  You focus on agreements and disagreements and ignore the more idiosyncratic data in any data set.

Space limitations prevent me from giving more detail here.  In essence, during the interpretation phase of each cycle you develop more specific questions to test any emerging agreement and to explain any emerging disagreement.  You can find a more extensive discussion in my Rigour without numbers (Dick, 1999), which explains the source of the rigour in such an approach.

You will note, too, that your interpretations of the data emerge slowly and are constantly tested.  For the participants this increases the credibility of the information, especially if you use highly participative approaches.  As researcher, you can have more faith in the research outcomes.

Models such as those of action theories (Argyris, Putnam and Smith, 1985) or systems theory can be used, but are not necessary.  If you prefer, you can derive the concepts and models from the data, as in grounded theory (Strauss and Corbin, 1990).

You don't need a research question or hypothesis at the start of the study beyond a wish to know how to improve the situation.  A general action research model such as this can therefore be used as a formative evaluation process (such as the one Adelle Bish and I reported at the conference on reflective practices; see note 2).

So there is economy and rigour for both the action methodology and the research methodology.

Provided the process is well managed, for instance by using processes similar to conflict resolution, the various participants educate each other.  The final plans for change can be more consensual, and based on better information.

At the same time, the research methodology is strengthened and made efficient because of the increasing detail of the information and its careful interpretation.  In the early stages the questions and answers are general.  In pursuing the disagreements and testing the agreements (Dick, 1990) the research becomes quite focussed.  You sacrifice some of the richness of other approaches but gain substantial economies, and precision, in return.

 

Economy in reporting

On occasion I have examined, or otherwise read, some distressingly long action research reports.  Redundancy is often high.  The amount of data reported is often massive.  To make matters worse, if you don't use traditional methodologies you have to take more care to justify your approach.

Reducing the amount of data to be reported would often help.  So would any savings in the blow-by-blow description of the methodology.  If you report only the conclusions the reader is unable to verify your interpretation or check its adequacy.  So how can you be economical?

The answer is to be found, first, in being clear what the report is about.  During the conduct of the study the use of dialectic allows the interpretation to emerge gradually and to increase in precision.  It is therefore easier for you to know what special contribution the report makes to knowledge or understanding.  This may be knowledge about the methodology or the client system or occasionally both.

The contribution guides you in deciding the conclusions.  The conclusions in turn become the organising principle around which you can construct your report.  In general, you report only the conclusions and the methodology and evidence relevant to those conclusions.

Your methodological description need provide only enough detail for someone else to replicate the essential features.  The directly relevant information consists of the information within the dialectic -- that is, the agreements and disagreements relevant to the particular conclusion.

 

Conclusions

In summary, action research can be conducted and reported more economically without sacrificing rigour.  In fact, the rigour can be enhanced.  The use of triangulation or other dialectic approaches gives better data for intervention and for understanding as well as allowing efficiency to be improved.  Dialectic also provides the economy in reporting.  You can focus on the contribution to understanding which the study makes, and report only the conclusions, the dialectic and the methodology which relate directly to it.

 

Notes

  1. A revision of a conference paper in which some methods are offered for improving both the rigour and the economy of qualitative research in general, and action research in particular.
     
    Modified from a paper prepared for the Second World Congress on Action Learning, University of Queensland, Australia, 1992.  This earlier paper is available as Qualitative action research: improving the rigour and economy.  In Christine S.  Bruce and Anne L.  Russell, eds., Transforming tomorrow today: 2nd World Congress on Action Learning.  Brisbane: Action Learning, Action Research and Process Management Association.
     
  2. Briefly, the evaluation began in a very open-ended fashion.  In individual and group interviews, people were encouraged to talk about the class.  "Tell me about the class...".

     

References

Argyris, C., Putnam, R., & Smith, D.McL.  (1985) Action science: concepts, methods and skills for research and intervention.  San Francisco, Ca.: Jossey-Bass.

Barlow, D.H., Hayes, S.C., & Nelson, R.O.  (1984) The scientist practitioner: research and accountability in clinical and educational settings.  New York: Pergamon.

Bish, A.  & Dick, B.  (1992) Reflection for everyone: catering for individual differences.  Paper presented at the Reflective Practice in Higher Education Conference, Brisbane, July 1992.

Carr, W.  & Kemmis, S.  (1986) Becoming critical: education, knowledge and action research.  London: Falmer Press.

Checkland, P.  & Scholes, J.  (1990) Soft systems methodology in action.  Chichester: Wiley.

Dick, B.  (1999) Rigour without numbers: the potential of dialectical processes as qualitative research tools, second edition.  Brisbane: Interchange.

French, W.  & Bell, C.H.  (1990) Organisation development: behavioural science interventions for organisational improvement, fourth edition.  Englewood Cliffs, NJ: Prentice-Hall.

Heller, F.A.  (1976) Group feedback analysis as a method of action research.  In A.W. Clark, ed., Experimenting with organisational life.  New York: Plenum.

Jick, T.D.  (1979) Mixing qualitative and quantitative methods: triangulation in action.  Administrative Science Quarterly, 24, 602-611.

Kemmis, S.  & McTaggart, R., eds.  (1988) The action research planner, third edition.  Victoria: Deakin University.

Lewin, K.  (1946) Action research and minority problems.  Journal of Social Issues, 2, 34-46.

Lewin, K.  (1951) Field theory in social science.  New York: Harper.

Martin, P.R.  (1989) The scientist practitioner model and clinical psychology: time for change?  Australian Psychologist, 24(1), 71-92.

Michael, S.R., Luthans, F., Odiorne, G.  & Burke, W.W.  (1981) Techniques of organisational change.  New York: McGraw-Hill.

Strauss, A.  & Corbin, J.  (1990) Basics of qualitative research: grounded theory procedures and techniques.  Newbury Park: Sage.

Zuber-Skerritt, O., ed.  (1991) Action research for change and development.  Aldershot: Gower.

_____

 

Copyright (c) Bob Dick 1992-2005.  This document may be copied if it is not included in documents sold at a profit, and this and the following notice are included.

This document can be cited as follows:

Dick, B.  (1999) Qualitative action research: improving the rigour and economy [On line].  Available at http://www.uq.net.au/action_research/arp/rigour2.html

 


 

 


 
Maintained by Bob Dick; this version 2.04w last revised 20050817

A text version is available at URL ftp://ftp.scu.edu.au/www/arr/rigour2.txt