#[p190] Chapter 11. Method after the paper "Notes from a method".



Given the nature of this project, an exploration which evolved as my ideas on what and how to study evolved, it is perhaps no surprise that the course of inquiry was far from linear. I pursued many side issues (the off-shoots described earlier), some of which stayed just that: side issues. Some fizzled out, collapsed under their own weight, strayed too far from the main path or otherwise came to be abandoned. One became the main path - "how to research". My problem then was what to do with those "flowers" which never properly bloomed. I could have simply buried them, as irrelevant to the study, a kind of distraction - like other glitches, just noise in the system. My commitment to honesty, however, made me uneasy about this. In addition, I felt they made an important contribution to providing a full picture of the research.

As confidence in the method developed I began to see these excursions as hallmarks of this kind of work: "Research is the process of going up blind alleys to see if they are blind" (Bates quoted in Green 1982 p.217). The tale of the research would be lop-sided without them. But were they of any use? I gradually conceived two ideas: (a) the abandoned themes might still act as triggers for other researchers and so perhaps deserved at least a place in an appendix, and (b) more importantly, they could add to the believability of the account (see below).

Throughout the study I was stalked by worries over "validity"; contemplating the need to provide accounts of off-shoots or blind alleys did little to dispel these. I will not recount older approaches to validity arising from psychological test construction (see Williams 1974). Miles and Huberman (1994) provide a clear account of internal and external validity in modern qualitative research. This includes discussion of ideas such as triangulation and sample selection which seem of less relevance to the present study. My approach to validity is outlined later.

#[p191] The term method seemed to me (certainly at the beginning of the project) to carry with it certain overtones such as "methodical", "standard" and "scientific". Carr (1997) traces the idea of method to ancient Greece where adhering to an agreed procedure was seen as an alternative to the difficulties of philosophising in seeking truth. Centuries later matters appear to have moved on to the stage where Gergen (1985), for instance, has needed to point out that:

the sciences have been enchanted by the myth that the assiduous application of rigorous method will yield sound fact - as if empirical methodology were some form of meat grinder from which truth could be turned out like so many sausages. (p.273)

"Methodology" also presents me with some difficulties, if we take a definition such as "the science of method" (Concise Oxford Dictionary) in the context of my doubts over what constitutes science and Gouldner’s (1982) discussion of the "background assumptions" which we import into (sociological) research, such that methodology becomes ideology:

[methodology] is commonly infused with ideologically resonant assumptions about what the social world is, who the sociologist is, and what the nature of the relation between them is. (p.344)

I will thus use the term method in a more general sense, i.e. simply as "a way of proceeding or doing something" (Collins English Dictionary) and methodology as "the study of how ... [researchers] go about their work" (Concise Oxford Dictionary of Sociology).

The "messy method" I describe seems far from any conventional formulation of a "scientific method" and, for me, raises pressing problems of validity. To highlight my concerns, I would like to begin with two positions which critique the whole notion of validity; look at some alternatives; then try to bring about some resolution appropriate for this inquiry, involving issues such as honesty, "testing while protecting", generalisation and communication. I conclude with some simplified suggestions for assessing research "strength", as an alternative to seeking "validity".

1. Two critiques of the notion of validity.

#[p192] The two, to me provocative, articles discussed below presented challenges which brought into sharp focus my growing uneasiness over the topic of validity. I recount them to illustrate my concerns, as a convenient, if uncomfortable, starting point. Rather than continuing to pursue the positions of these authors, and those they draw on such as Mishler and Lather, however, I later turn to practitioner research for a resolution of my difficulties.

(A) Scheurich in his article "The masks of validity: a deconstructive account" (Scheurich 1996) describes how the idea of validity became "one of the necessary truth criteria of conventional social science research: no validity no truth" (p.50) although its roots were "almost exclusively [in] testing" (Wolcott 1990 p.122). He goes on to explain how "conventional social scientists asserted that they could establish value-free, objective truth ... if the proper scientific methods were followed" (Scheurich 1996 p.50) and "if a research study had the appropriate validity, the results could be trusted" (ibid p.50). This led to a divide "one side of the map was research that passed the test of validity ... on the other side of the map was research that had not passed the test" (ibid p.50).

Although "postpositivists"[1] according to Scheurich have been "quite willing to dump conventional science, the nomological net from which validity derived its meaning" (ibid p.50), they continue to cherish validity - they will "not leave home without it" (ibid p.50). His explanation for this (which has a certain face validity) is that postpositivists "initially had to survive in conventional science-dominated settings" (ibid p.50). As Koch argues, in the field of psychology:

The ‘scientism’ that many see and decry in recent psychology was ... with it from the start ... From the earliest days of the experimental pioneers, man’s (sic) stipulation that psychology be adequate to science outweighed his commitment that it be adequate to man. (Koch 1959 p.783)

Recent, more radical re-shapings of the concept, the "successor validities" (ibid p.51) such as those outlined in Mishler (1990) and Lather (1993), take a variety of lines. Mishler (1990) seeks the warrant given by a "community of scientists" (p.422), i.e. an arrived-at agreement "which may change with time" (ibid p.420). Lather (1993) argues that "legitimation depends on a researcher’s ability to explore ... inquiry problematics" (ibid p.676). She offers several validities which can be summed up as "transgressive validity". #[p193] Simplifying her position extremely, this represents an attempt to "work against constraints of authority" (ibid p.686) while exposing the limitations of language and promoting justice. In other words, if I read her correctly, validity arises from our attempts to free ourselves and others from established habits of thought and action.

Scheurich, however, criticises these viewpoints as both, in their own way, continuing to support a divide between the valid and the invalid. He sees this division as arising from power plays: "validity is the name of the policing practices that divide good research from bad" (p.53); these embody "ideological power alignments" (ibid p.53). He goes on to explain how qualitative methods generally were originally opposed by elements within "the academy" (ibid p.53) (as feminist approaches were originally, and to some extent still are). Opposition, in the light of Mishler and Lather’s position, however, still remains. This is opposition to "the coarse, untheorized, polyvocal Other" (p.54), which "cannot be accepted as knowledge" (ibid p.54).

As I understand him, Scheurich sees all these alternative approaches to validity as adding to "an imperial project ... drawn from the heart of Western darkness" (ibid p.55). And this project employs (unconscious) practices which seek to dominate and subvert, or exclude, certain voices. He is "troubled by the anonymous imperial violence that slips quietly [even] into our ... best intentions [and] ... transformational yearnings" (ibid p.58) and advocates instead a "proliferation of marginalized voices" (ibid p.58) unsullied by the researcher’s agenda: a "carnival, a loud clamour" (ibid p.58) of alternative "voices of difference" (ibid p.58) which can challenge "the Western knowledge project" (ibid p.58).

(B) Wolcott (1990) begins, conventionally enough, with a well-presented piece on the history and definition of validity and claims that it "serves most often as a gloss for scientific accuracy among those who identify closely with science and for correctness or credibility among those who do not" (p.126 emphasis in original) and that "validity haunts qualitative researchers as a specter" (p.127). He outlines the measures he takes in his own fieldwork: "talk little, listen a lot", "record accurately", "begin writing early", "include primary data", "report fully", "be candid", "seek feedback", "try to achieve balance (by re-reading for ‘disciplined subjectivity’)" and "write accurately" (p.127-134). #[p194] He then moves on to a gripping story where he poses the question "when it really matters, does validity really matter?" (ibid p.135).

The second half of the paper gives an account of his friendship with a young man, Brad, which later turns into a study of Brad, published as an article. The friendship becomes a sexual relationship. The young man disappears then returns some time later to kill Wolcott and is almost successful. At the time of writing, Wolcott is awaiting, with great fear, Brad’s release from prison and his likely further attempts to kill him. He explains "I find no counsel or direction in questions prompted by a concern for validity. There is ... nothing scientific to measure that tells us anything important" (p.144). Wolcott describes how he "felt safe only so long as Brad remained institutionalized" (ibid p.145).

Under these circumstances, concerns over validity paled into insignificance: "I do not for a minute believe that validity points the way to saving my life or soul or suggests how to come to grips intellectually with a case study that really matters" (ibid p.146). In this context, "any understanding I may achieve will occur largely in answer to questions that are not matters of fact" (ibid p.147 emphasis in original). Emphasising "understanding" (a perspective expanded by Maxwell 1992), Wolcott goes on to explain how "we sometime learn from poorly reported studies and poorly analyzed ones, while seemingly truthful, or correct, or neatly analyzed accounts may have no impact or provoke no further thought" (Wolcott 1990 p.148) and in the light of his experience he will no longer accept "validity as a valid criterion for guiding or judging his work" (ibid p.148).

What can we take from these two positions? Scheurich I believe can readily encompass Wolcott as part of his "carnival" of voices. But can that carnival really include "shoddy" work, slip-shod or half-hearted?[2] At the very least, accepting what Wolcott says about learning from "poor" studies, it seems reasonable to claim we need to know what is in the studies. We need an account sufficiently full and careful to help us decide how to judge the work.

It is worthy of note, in addition, that Scheurich himself is not averse to writing in elegant academic-ese with appropriate references. Wolcott also provides a very well written #[p195] account. Presumably they both have in mind some "standards" (albeit different) which colleagues might reasonably expect. Their rejection of validity does not appear to include the rejection of certain relatively commonplace criteria by which their work might be judged. I later offer a broader selection of potential criteria, in the form of a set of questions with which to interrogate a study, in leaving behind the valid/ invalid distinction.

Returning to Wolcott, he raises important issues but perhaps in a way which is unfair. Yes, if I am in the middle of war, famine, flood, earthquake or some other life-threatening situation, I am not usually going to be arguing over the validity of my current research. Validity (and the research itself) in those circumstances is an irrelevance; simple survival is paramount. However, Wolcott still wants to write, presumably to some purpose. Perhaps to challenge, to perplex, to give others some understanding. Whatever the point of his writing, the question I would raise is, apart from entertaining himself, how can he be sure he has made contact with even one other being? This aspect, the influence on others, I will also return to later.

Leaving these two accounts to one side, as galvanising triggers, I will set out the approach to handling concerns over validity which I settled upon.

2. Part of one practitioner researcher’s approach to the topic of validity.

Throughout the project I was tormented by the feeling that what I was attempting did not match up to the canon of good research. It was not "scientific". The question I felt I had to address was: in view of the nature of this investigation (a reflective self study, following its own, ever-changing method) what was the most appropriate way to deal with the concept of validity? The answer derived partly from my concerns for honesty and my role as a practitioner. I will look at the honesty issue first.

3. Honesty in research.

#[p196] In an earlier chapter I gave brief summaries of the various by-ways of the project, what I called "off-shoots". Throughout the whole of the present account I have been at pains to point out the "faltering reality" (Byng-Hall 1988 p.175), the "complex and messy" (Staw 1981 p.227) process of the enterprise. My contention is that an honest account of such "mistakes"/ "errors" / "failures" can enhance the believability of a story. The reasoning behind this is drawn from a number of disparate sources. Unfortunately there is little opportunity in the time scale of a work of this nature to seek wider confirmation of this position, thus the suggestion is more an act of faith than a fully defendable position at this juncture.

4. "Errors" and believability.

In this section I want to explore the notion that including "errors" [3] may actually render an account more believable. An interesting starting place, which, I feel, helps to make the point, is an argument which initially seems to undermine the very position I am proposing. This argument revolves around a rather Machiavellian approach: to enhance credibility in a tale of dubious merit it may be useful to deliberately insert errors. The point of raising these devious examples is to underline the powerful impact which such pieces of information can carry. A more positive view is presented later.

Robert Graves catches the Machiavellian position well, in his "Devil’s Advice to Story Tellers":

(“for copyright reasons the poem is not included here, please contact me for details”)

(Graves 1965 p.89)

Deception in warfare can employ similar devices. To hide the true invasion site in the Mediterranean during World War II, a plot was cooked up to use a dead body planted with information pointing to a different site. The story is told in "The Man Who Never #[p197] Was" (Montagu 1996) and has been the subject of a popular film. Montagu explains how, to create a believable persona for the body of "Major Martin" on which the documents were to be found, matters of his identity could not be made too tidy; that would have been suspicious:

Major Martin was a rather brilliant officer and was trusted by his superiors: his only visible lapses were the all too common ones of having lost his identity card and having recently let his pass to combined operations H.Q. run out of date. (Montagu 1996 p.66)

Informal contact with historians suggests that they also may be cautious of tales which are too consistent. Partner (1995) reveals how disagreements in accounts have been part of the historian’s lot since Herodotus:

This is how the Persians say it happened ... but about Io herself the Phoenicians disagree with the Persians. For they say ... These are the stories of the Persians and the Phoenicians. (Herodotus cited in Partner 1995 p.26)

The historian must "construct a meganarrative out of hundreds of already incompatible segments ... [the] conflicting, overlapping record of memory" (ibid p.26). A present day example continues this theme. Two recent biographers of Roger Casement, the Irish patriot/traitor (depending on one’s point of view), who was hanged in 1916, disagree about the status of his pornographic diaries - were they part of a ‘dirty tricks’ campaign designed to smear him? Angus Mitchell and Roger Sawyer, the biographers, take issue with each other over the status of the errors in his diaries when compared to other sources:

Mitchell ... claims that the diaries don’t tally with Casement’s published journals. Sawyer says such errors are actual proof of the diaries’ authenticity: "A forger would have edited them out". (Jones 1998 p.6)

These hints then, from a number of perspectives[4], lead me to a position that an open and honest account of the "mistakes" / "blind alleys" / "by-ways" of a project does not detract from its believability; in fact it can serve to enhance it (leaving aside the faint possibility that such accounts could be Machiavellian implants, designed deliberately to foster that very impression[5]).

The "deviations" outlined earlier (see chapter 7) are some of the "mistakes" and "blind alleys" of this study which I came to think of in a more constructive way as "probes" or "off-shoots". As noted in chapter 6, Measor and Woods (1991) maintain that such

#[p198] evidence helps us to judge the study and Devereux (1967) also views admission of such "blind spots" as useful, not demeaning (see chapter 11). This additional evidence, together with the questions raised, particularly in chapters 8 onwards, is thus offered in a positive light to help towards the evaluation of the research.

A paper by Lenzo (1995), however, discusses some of the difficulties inherent in honest self-criticism. Citing Alcoff (1991) she argues:

The desire to find an absolute means to avoid making errors comes perhaps not from a desire to advance collective goals but a desire for personal mastery, to establish a privileged discursive position wherein one cannot be undermined or challenged. (Alcoff p.22 quoted in Lenzo 1995 p.18)

This fear of attack is one I can relate to. However, for me, apart from counselling on the stresses of this, and holding on to personal values such as honesty, the solution appears to lie in embracing a modification to a position which Lenzo introduces later in her paper.

At that point, further into the article, Lenzo discusses the difficulties of doctoral students wishing to challenge "traditional forms of closed narratives" (Lenzo 1995 p.19) and introduce doubts into their writing. She asks "What kind of textual authority can admit to uncertainty ...?" (ibid p.19). She goes on to explore an example of what she calls one student’s "variable self referencing" (ibid p.21) as one way of dealing with this, in replying to her own question. However, from my position, if I follow her argument correctly, Lenzo misses an important point. I would want to offer the (tentative) argument that an alternative resolution is to not accept the terms of the dilemma, but to turn the problem on its head and assert that the admission of errors and uncertainty becomes a source of authority. Errors and uncertainty I feel, in the light of the discussion above, render the text more believable, not less; insofar as believability is a measure of authority, they lend authority. I am now more suspicious of hygienic accounts[6].

This position I reached was valuable to me in developing my confidence over certain aspects of the method but left unsettled the issue of validity. I began to wonder if, in my concerns over validity, I was asking the right question.

5. Is "validity" the right question?

#[p199] Struggles with anxieties over validity coincided with a crumbling of my long-standing "scientific" stance. No single, defining moment in my questioning of science and where "knowledge" lay marked the transition, although a few fragments stand out. I recall, for instance, Phillips’ (1992) comment that:

The worry about the warrant for conclusions drawn from a qualitative inquiry will not wane, largely because the worry about the warrant for conclusions drawn from any inquiry will not wane. (p.118)

An observation from Cixous ...

[W]hen you look at the TV [news] the truth simply disappears. I see massacres on TV and I do not cry. I have to go to the theatre for that. There I can receive that subjective and poetical expression. (quoted in Jeffries 1997 p.4)

... latched on to a personal memory of the power of Shelley’s poem "Ozymandias". There, no amount of "valid", "objective" history would capture, for me, in quite the same way, that essence of human vanity that Shelley encapsulates in a few lines about lifeless stones in the desert: the truth from lies. This in its turn made connections back into the academic world through Wolcott’s (1990) gripping story and the validity/relevance dilemma he raises.

Whatever the causes, at about the same time that my understanding of what actually constituted "science", as practised by real scientists, and what constituted "scientific knowledge" both began to shift, so my stance on validity shifted. This crystallised very late, during the write-up stage. Rendering the event somewhat poetically, I found I had "stepped through":

Validity is the wrong word (11.11.98 9.00 p.m. in park)

There was something in all this still worth struggling for, but what was it? I needed another term. "Worthwhileness" (House 1980 cited in Dadds 1995 p.112) came close. None seemed quite right: valuation, judgement, good/bad. As a provisional resting place I settled on "strength" (while still hoping for something better): "validus", strength, being the Latin root of validity (Concise Oxford Dictionary).

6. Research strength and practitioner research.

#[p200] Several writers propose frameworks for the evaluation of inquiries; several apply to action research, which was one of my starting points (Altrichter et al 1993, Clarke et al 1993, Dadds 1995, Lomax 1994, Norris 1997 and Tickle 1995). Within practitioner research more generally there are recent texts such as Robson (1993), Fuller and Petch (1995) and Reed and Procter (1995). Fuller and Petch has less depth, and Robson fewer case details, than Reed and Procter. It is the latter book I draw on, particularly the chapter by Reed and Biott. Although the various contributors focus on health care, their views appear immediately translatable to other fields.

Reed and Biott (1995) see strength as a continuum, not as an either/or. They explore a number of factors which they feel characterise the "strong" pole of practitioner research (in the health field), while recognising that their list need not apply to all studies. They mention, for instance, that the process should be:

integral with the practice of health care

a social process...

educative for all participants

imbued with an integral development dimension

focused upon aspects of practice... the researcher can change

able to ... explore sociopolitical and historical factors

able to open up value issues...

designed to give a say to all participants

able to ... enhance the capacity of participants to interpret everyday action in the work setting

able to integrate personal and professional learning

likely to yield insights ... of interest to a wider audience (p.194)

Incorporating some of the action researchers’ views above[7], Tickle would add, for instance: "clarity", "incorporation of revisions", "a self-critical stance", "multiple perspectives", "convincing evidence", "transferability", "raising new questions and challenges", "a continuing venture" (p.233-4). Clarke et al would add "bringing the situation to life", "a reflexive account", "a range of resources", being clear how "data were selected, collected and analysed" and how the work is to be judged (p.490-1).

7. Drawing out some of the more relevant points raised above.

The current inquiry, being a particular version of self-study, really only has one main participant: myself, although a host of others were drawn in, in one way or another. So, #[p201] apart from the spin-offs from (hopefully) interesting and challenging conversations and, lately, a range of publications, my impact on their development, and those of my clients, outside of the effects of case work, is unclear (though, I trust, not negative). Thus, the "participant" dimension, mentioned above, seems to me, less relevant to the present project.

Other aspects, however, do appear to apply. Reed and Biott (1995) for instance argue that strong research should be "part of the job". It "does not require that the person stops practising in order to carry out a project" (p.195). It should be "an extension of good practice ... not ... alien ... to practice" (ibid p.196). In my case I went to great lengths, with an enormous amount of soul-searching, to develop a form of inquiry which allowed me to do just that: carry on with my job and make the research fit round it, "scientifically" or not.

Reed and Biott emphasise the importance of concentrating on those aspects of practice which we have power to change, as "the concern is about improving practice" (p.197). My position on this, as explained earlier, was a little more complex. Yes, as soon as I began to explore practice, I wanted to change it; but, no, I still maintain my starting point could be a desire simply to understand at least some aspect of practice. As Reed and Biott argue themselves, a starting point may be "a sense of unease ... [a] question What is happening here? [rather than] questions about effectiveness" (ibid p.199). In fact I made many changes (see chapter 4) and opened up practice more to scrutiny, through my writing and for instance the enclosed tape on which I have sought critical comment while developing a publication (see appendix A).

The study concerned both practice and research. I explore my uncertainties about these areas in Mellor (1998 a and b) and this exploration engages with my "taken-for-granted and habituated customs" (Reed and Biott p.198) of both practice and research. To the extent that work on attention seeking presents a new (or at least newly consolidated) view on aspects of emotional and behavioural difficulties and reinforces an interpersonal, non-medical view of such problems, then this also helps challenge taken-for-granted thinking and adds to a "close scrutiny of key concepts and values which underpin and shape

#[p202] practice" (ibid p.198). The study overall, however, has lacked an avowedly "critical" dimension, in, for instance, not meeting Reed and Biott’s call for increasing "understanding of the sociopolitical factors" (p.198) around both. Recent writing (Mellor forthcoming a), building in part on reviews of the book "Attention Seeking", will, however, include some examination of this aspect.

Many of the other aspects mentioned above (see also footnote 7), such as "challenging assumptions", "ethics", "clarity", "incorporation of revisions", "a self-critical stance", "multiple perspectives", "convincing evidence", "transferability", "raising new questions (and knowledge) and challenges", "a continuing venture", "development", "bringing the situation to life", "a reflexive account", "a range of resources", and being clear how "data were selected, collected and analysed" and how the work is to be judged, I trust are demonstrated in preceding chapters (transferability aside - a topic which I address separately).

I return to a simplified and tentative position on these multiple facets later in offering a reduced set of questions to aid the assessment of research strength, while recognising that this can only be a limited and personal view. However, at this point, a short digression is necessary into another aspect of the evolving method: the inclusion of on-going "testing". To explain my attempts to test out my developing ideas while, at the same time, protecting them from being demolished at an early stage, I turn to some ideas drawn from accounts of the work of natural scientists.

8. Testing while protecting.

Looking back over the last few years I can see little in my inquiry that fits clearly into either a simple inductivist or a simple falsificationist model of investigation. As Chalmers (1982) points out, "science" suffers from similar difficulties. Simply because at one time we induce from repeated observation that all coelacanths are fossils, there is nothing to prevent the next sighting being alive and well and living in warm tropical waters. As to falsification:

#[p203] An embarrassing historical fact for falsificationists is that if their methodology had been strictly adhered to ... those theories generally regarded as being among the best examples ... would have been rejected in their infancy...

Newton’s gravitational theory was falsified by observation of the moon’s orbit. It took almost fifty years to deflect this falsification onto [other] causes. (ibid p.66)

Chalmers gives further examples of falsification being ignored in the development of Bohr’s theory of the atom, the kinetic theory of gases and in the Copernican revolution. He goes on to explain how: "[e]arly formulations of the new theory, involving imperfectly formulated novel concepts, were persevered with and developed in spite of apparent falsifications" (ibid p.75)[8]. Drawing from Lakatos[9], Chalmers describes how in the early stages a "research programme" must be protected: "[it] must be given a chance to realise its potential" (ibid p.83). It must not be allowed to crumble under the weight of immediate criticism; all new positions are bound to be fallible.

Without delving much further into the methodology of science (and Chalmers concludes at the end of his "What is this thing called science?" that "there is no timeless and universal conception of science or scientific method" p.169), I can see some parallels which help me to understand my own working practices, without making any claim as to their "scientific" status.

In the early stages of developing ideas, I sought encouragement as I dredged up notions from the confusing research process I had set underway. I took support where I could, in constant dialogue with an enormous range of colleagues, while at the same time using this dialogue to refine my ideas. To begin with, I made only tentative sorties into subjecting them to the more stringent testing of the public forum of conference papers and publication, after trying them out with local research groups. I was "testing while protecting". My first attempt at serious testing was nearly my last but, interestingly, it set the seeds of what was to be an important later theme: the role of emotions in research.

9. The first test: CARN 1995.

In September 1995, just two years into the project, I presented a paper to the Collaborative Action Research Network (CARN) international conference in Nottingham. #[p204] I had, I think, forgotten how to teach. I certainly had no clear picture of what constituted "presenting a paper". Did you just stand and talk? My current experience with adults (apart from delivering in-service training at work) was much more in the counselling world, where I felt comfortable with shared exposure of anxieties in a supportive group of like-minded people. I thus had, I imagine, an ambivalent view as to what I was about to attempt.

The material contained in the paper was challenging for me on two levels. First, it was an explanation of an embryonic model of "messy" research to academics engaged in a conventional (action research) approach. Second, it began with a personal account, of me and my "baggage", but for an audience of unknown disposition. I felt myself falling between two stools: the academic world and the counselling world. The audience was not to blame. Had I set it up as purely a counselling experience (and had they agreed to participate in that way) I would have been on surer ground. Had I stuck to an "academic" account, I might have scraped through. I did neither. My diary records later reflections:

I felt emotional. Given a bit more time I would have been in tears... So one issue is not simply how and whether to discuss emotions in an academic setting (this can be done in a dry as dust psychology book where no emotions touch either reader or writer) but how, or whether, to be emotional in an academic setting. (17.9.95 8.15 a.m. emphasis in original)

It was some time before I resolved this issue to my own satisfaction (see discussion of emotions in chapters 9 and 10) although Bill C., whom I met at CARN and who was to act as critical correspondent for a while until his own PhD write-up took over his life, gave great support: "You allowed the real you to be seen in all the messiness of your work situation". He also, however, provided criticism: "I still think for what its worth ... [the] systematic approach of [action research] is useful" (brief extracts from correspondence with Bill C. 12.9.95 which carried on at some length throughout 1995/6).

One further spin-off of the CARN episode was some warm support from two researchers in Birmingham. As a side issue, "critical friends" of various hues came and went throughout the project, a perhaps lesser known aspect of the topic. I had to be a kind of "bricoleur", which in this context meant taking support from whatever came to hand, #[p205] whenever it came to hand. Luckily, I found a steady stream of willing volunteers amongst the dozens of souls with whom I made contact.

Following another (more successful) paper to CARN on the same topic in October 1996 I felt confident enough to submit to a refereed journal, welcome the ensuing critique and amend the piece accordingly (Mellor 1998b). I had moved by now from mainly protecting my ideas to mainly testing them (while still seeking constant support). I had, in a sense, moved from the context of discovery to the context of justification.

Feyerabend (1975) denies any clear distinction in procedures here: "science ... could not exist without a frequent overruling of the context of justification" (p.167); but as my study has mainly concerned a context of discovery, I am not in a position to address this issue in any depth. What I did attempt, however, apart from seeking confirmations and disconfirmations of the ideas through informal discussions (see below), was a kind of internal test: could I apply the "messy method" to other topics? The story of these attempts, and their resulting modifications to the "model", is told in the mini-projects earlier, concerning identity and making sense. These were, in the end, apparently successful applications which led to some modifications to, and further understanding of, the method.

10. Later tests arising through discussion[*].

Initially I had an inkling that what I was describing was a universal method: if we are honest, we all work this way. Quoting from the sixteenth-century thinker Montaigne: "this life will reveal as much as any other because ... ‘every man (sic) beareth the whole stampe of humane condition’ " (Taylor 1989 p.179). The reality was unpredictably different, however.

My main counsellor, Mike, who in many respects could have been a prime candidate for such concepts (being very, what I can loosely call "alternative", in his views generally) denied emphatically that he proceeded in a "messy" way, in one of our non-counselling discussions (see later). A relative by marriage, Celia, once a post-doctoral [scientist] from #[p206] [a prestigious university, now ... a] manager with a giant multi-national, but still a scientist in outlook, agreed strongly that she did. A colleague, Rene E., who read a draft of the thesis in early 1999, was also very positive in her support; I visit her comments in chapter 12.

There were other mini "tests" in the form of brief discussions with colleagues. I select these two because of their length and their influence on me. The extracts below have not been subjected to the kind of "analysis" or "making sense" of the diary notes as they are not the main data source. I have simply picked out what seemed to be the relevant points concerning "mess". I will begin with Celia.

Interview with Celia 20.9.96

We explored Celia’s management group’s attempts to look at the company’s financial processes, later with the aid of some management consultants (see appendix D for the full transcript). I have extracted below those comments which reinforced my view of the messiness of her process. I was careful in the interview not to "push" my own views, beyond my initial comment that I was interested in mess. Given her scientific background, I did not expect support.

Celia: I start with a vague idea ...

Nigel: So what did you do?

C: Brainstorming - we know what they [in accounts] do, how do you figure out the best things? [We got] off-the-wall ideas, wild and wacky, e.g. pay everybody once a year [then sorted out the nuggets]

We didn't really know what we were doing at the start...

... We were totally lost. [We had feelings like] Worried. What have I signed up for? Somebody knows the answer and they're not telling us. We're a smoke screen [for the leader] his hobby horse is "out-sourcing". There was lots of non-trust. What the hell are we doing here?

N. Tell me some more about how you actually did this.

C. For days we didn't have a clue what we were up to. Where do we start? How do we get into the project? Why hasn't someone defined this - it's half-baked?

...

N. Did you see a structure in this mess?

#[p207] C. Most of these conversations were in breaks, evenings and at the bar.

Interview with Mike 24.9.96.

Given Mike’s dealings with emotions in counselling I expected him to support my position, and initially he seemed to: "I was doing it without a plan and working it out by doing it". As the discussion progressed, however, he made it clear that he did not. I decided to throw away any attempts to elicit his general position and simply probe him on this one issue, but despite efforts to get Mike to describe a messy process he resisted, as the following short extract from the end of the interview demonstrates:

Mike: At work I try to be systematic...

Nigel: It's not muddling through?

Mike : In some ways I see the department muddling through and I'm quite critical of that. In one sense I could use it [but] they're incredibly amateur - unsystematic and lacking in theory. You could also be muddling through - be very brilliant and do a good job because you know what you needed to do through experience. When I talked about being disciplined it seems to lack warmth. I sometimes think I'm too disciplined. Perhaps it would be nice to do things well, just know things well and get on with it.

So I had both (unexpected) confirmation and disconfirmation of the existence of "messy approaches". In reconciling these I settle for the following position, between the two extremes. I do not try to claim that everyone works the way I have tried to describe. Neither do I try to claim that my own approach is only relevant to me. I contend instead that some people, some of the time, may in part adopt a "messy method" such as I have described. This position leads to a discussion of "generalisation" or "relatability".

11. Generalisation / relatability.

Much of the research time was taken up in convincing myself that the approach and the effort were worthwhile. I felt the need to give myself voice, to be my own judge. I could have left it at that, but I wanted to do more: to share these ideas with (like-minded) colleagues, indeed, to see if colleagues were like-minded. Thus the two interviews above

#[p208] highlighted a problem: what constituted sufficient grounds to say I could "generalise" my ideas (if that was what I wanted to do). To begin with I was a little down-hearted:

Initially I was disappointed [with Mike’s response]. I had a half-conscious hope that, Piaget-like, from studying one or two children (or in my case, just myself) I could have uncovered a universal, a grand narrative (my "muddling through" method of everyday life and research). Post-modernism would of course reject such a notion. (24.9.96).

Then I began to see Mike’s response in a much more positive light. Here was some useful feedback which made me examine more closely notions around generalisability and some of its alternatives. I was drawn to Bassey’s (1981) "relatability":

an important criterion for judging the merit of a case-study is the extent to which the details are sufficient and appropriate for a teacher working in a similar situation to relate his (sic) decision making to that described in the case-study. (p.85)

However, while attractive, this seemed to disguise a problem it shared with generalisability: the question of numbers.

The positive thing which came out of this interview [with Mike] was to do with generalisability. Bassey for instance talks about "relatability", and others talk about a work having "resonance". But with how many people? Is one person feeling empathy, seeing a resonance, relating the work to their own situation, enough? Is two? What is the right number? Does getting an article published count? Would a sample survey of a cohort of academics be necessary? The literature is not clear on this point. That's the problem in using this kind of concept. (24.9.96)

Leaving to one side temporarily the question of numbers, I will address an issue which seems to be buried in the above discussion: communication.

12. Communication.

Beyond a desire for the personal development of the practitioner researcher in question, what value is there in research? Reed and Biott (1995) point to the "usefulness of research in informing practice" and the capacity in "generating debate rather than solving it" (p.200). But what use is inquiry to other practitioners unless they read about it? Practitioners appear to rely much more on tacit knowledge (Usher and Edwards 1994 refer to "subjugated knowledge" p.54) than that from the academy: "most research writing is not memorable and much of it is not easy to get hold of" (Bassey 1998a p.20).

#[p209] While we cannot be sure that anything we write will ever be read[10] there is, I feel, some duty to attempt to make what we write as accessible as possible: to tell a good story, although that, in itself, is far from sufficient a criterion on which to judge (see Miles and Huberman’s 1990 discussion of qualitative inquiry : "We need some backstage information, not just the text" p.349; see also Phillips below).

Excellent material may at times be excavated from impenetrable prose. Good ideas do not always come cheap and the committed researcher must be expected to work for his or her insights. Not all bad writing is bad research. Phillips (1992), however, criticises the notion that "good writing" is by itself a sufficient warrant for inquiry. Citing Miles and Huberman he explains that "qualitative analyses can be evocative, illuminating, masterful, and downright wrong" (p.114 emphasis in original). I have briefly examined Phillips’ "god’s eye" view of truth earlier (see footnotes to chapter 8), but the point he makes is a fair one. We need more than good prose to be assured of the value of a piece of work: "a swindler’s story is coherent and convincing" (ibid p.114). He argues "[c]redibility is a scandalously weak and inappropriate surrogate for truth ... under appropriate circumstances any nonsense at all can be judged as ‘credible’ " (p.117).

Phillips also opposes a simple consensus view: a community might well believe the earth is flat. He dismisses the search for some recognised "method" as a guarantee for research and cautions against the idea that "true belief is simply a matter of finding, and following, certain analytic procedures" (p.117 emphasis in original). The conclusion of his chapter is, for me, however, somewhat unsatisfying. He holds up "truth" as a "regulative ideal" (p.119 emphasis in original), much like Popper’s famous piece on the search for the "mountain peak [of truth] permanently, or almost permanently, wrapped in clouds" (Popper 1968 p.226). This unfortunately gives little guidance on what one is actually supposed to do in the research setting.

Similarly, post-modernists such as Scheurich (1996), who dismiss the valid/ non-valid distinction, leave me unsatisfied. Can I write anything and it will find a place somewhere in the great garden of knowledge, in his "Bakhtinian ... carnival" of "marginalized voices" (p.58)? As pointed out earlier, the form, if not the content, of his writing, at least in this #[p210] example, betrays Scheurich’s allegiance to a very conventional voice in the refereed world of academic publication. I am left with the feeling that there are still some (uncertain) "standards" to strive for in judging the merit of a piece of research, even if notions of validity are rejected.

My current "solution" to some of the problems raised here is explored in section 14 below, where I discuss communication and critique. First, however, I would like to consider some issues around what I will call my "writing voice".

13. Writing voice.

At the end of March 1999, reading a final version of my final chapter, Colin Biott raised a question which I had not until then considered, to do with the style of the writing of the thesis as a whole. He mentioned (not in any unkind way) my "voracious quoting of the literature" and what place that had in my view of practitioner research. That set me thinking.

I wanted to communicate to other practitioner researchers and set great store by this. My earlier concerns as a "writer" wishing to create an accessible text are presented in chapter 10. Had my referencing become a barrier? Was the writing "voice" too "academic", too intimidating for practitioners entering the field? I thought, in the end, not. But the unanswered question was, why? Why had I chosen to write this way?

In many cases researchers appear to begin with a literature survey. Some, perhaps practitioner researchers in particular, may begin with their practice (see the discussion in Rowland 1993 where he quotes one professional as saying: "[i]t was only after completing my enquiry that I went to the literature" p.121). In writing there may be an "academic game" of "quoting the authorities" (ibid p.122). What was my position?

From the outset I had an image of "scholarly" work and some of the motivation for referencing was admittedly defensive in such a challenging and unknown arena of constructing a PhD text; particularly when I felt myself to be in such a risky area, "mess". #[p211] I wanted as far as possible to write with certainty about uncertainty; to present the most powerful case I could, attacking my received notion of scientific knowledge, partly with evidence from the "scientists" themselves. But that was not the whole story, nor even the main one. I interacted with the literature. The literature (and all the other sources I drew upon) genuinely helped me to see. To learn, but also to unlearn. This was not an academic game.

Right from the start of the project I was caught in a "cloud of unknowing". I began with a focus on practice, but, as I explained earlier, had no settled notion of how to study this practice nor what I wanted to get out of such a study. I was in the thick of practice and researching practice from day one. The world would not go away. But I was not happy with exploring practice till I knew how to explore. And I set about finding out how to do that, by doing it. It is only now, some six years later, that I feel I could, if I so wished, return to examining my day to day work[11], but with confidence in the methods I would employ: my own style of inquiry, the "messy method".

During the construction of this "messy method", however, other authors’ works were vital. Re-visiting the identities discussed in chapter 10, I would want to clarify "researcher", in this light, as a particular kind of practitioner researcher, drawing not just on practice but on a wide range of influences, academic material in particular. I will explain in which ways these authors’ works were so important to me.

My diary has an entry "Brecht and references" (26.3.99 6.00 in pub) following the discussion with Colin mentioned above. It concerns a memory of a lengthy period of poetry writing some years ago. I had developed a way of writing to which I felt very committed (bald, not rhyming, without similes, using simple words and often inconsistent, raggy rhythms). It put across the political message of the poems in what I felt was an effective way - the style echoed the tense political era of Thatcherism. But I did not "understand" it. I thought it was poetry, but it did not match any of the usual canon. I could not "see" what I was trying to do, although I was doing it. It was not until I came across a collection of poems by Brecht (whom I was familiar with as a playwright, not a #[p212] poet) that I understood what the poems were[12]. I learnt to "see" my own work only with the help of others, and a similar process occurred in the current project.

But there was also a process of unlearning. My struggles with "science" should by now be evident. I had different struggles with critiques of science arising from post-modern perspectives. The prose was often immensely off-putting. I know I am not alone in this: "post-modernism has often intimidated me ... by its concepts and the language used" (Converey 1996 p.273). I was surprised, therefore, to find some of the (to me extremely difficult) writings of Foucault an influence I could not have managed without. Parts of Foucault’s "Archaeology" helped me to see the barriers in the way of my seeing, and the need to question and un-learn the habits of thought I brought to research and my job. I might have got there without him, but he gave a powerful impetus. I am not convinced that simply reflecting on my practice, with no reading, would have taken me there.

So, for me, the literature served many roles. I interacted with a vast range of sources throughout the study, to understand, and at the same time to create, my growing understanding. The present text reflects this in its "voracious referencing". I trust the resulting "voice" is not one practitioner colleagues find intimidating, however, and that it communicates effectively.

I turn now, finally, to a tentative resolution of the difficulties with validity promised earlier, involving a blend of communication and critique.

14. Communication and critique.

The solution I have come to is, I suspect, neither new nor particularly radical. It combines communication with critique. For practitioner researchers I want to communicate my ideas in a form they may be more ready to access, to aid the search for relatability. This is not to be patronising. Practitioner researchers are not some inferior breed of researcher who need spoon-feeding, but perhaps their motivations and positioning can be seen as different. Certainly in my case, the values of practice, for instance, are pre-eminent. As Dadds (1995) explains in recounting her tale of one

#[p213] teacher’s research, she needed "her common sense and sensitivity to handle a number of methodological dilemmas ... She chose on each occasion to err on the side of human need rather than the needs of research" (p.100).

Academic rigour, for me, is not as crucial as relevance and accessibility (although they need not necessarily conflict). For casework, as opposed to research, the "messy and unpredictable" business (Kupferberg 1996 p.235) of my unique day-to-day practice will provide its own testing ground. With regard to research, I would not wish to be cavalier with the results of serious, dedicated researchers but "real situations demand [an] ... approach which may escape the categories of applied science" (Kremer-Hayton 1994 quoted in Kupferberg 1996 p. 229). In a similar way, I may wish to draw on a range of techniques and ideas in dealing with the messy reality of my research practice.

To communicate effectively to others requires, I believe, that they feel some "resonance"; the writing "speaks to them"; they can "relate" it to their situation. Full description of the "faltering reality" of research or practice, presented as part of an "honesty trail"[13], may contribute to that sense of "resonance", although this is for me, as explained earlier, more a question of faith at the present time. I do not have the evidence to support this assertion: a catalogue of "errors" might conceivably undermine the credibility of the study in some eyes, although Lenzo (1995 p.19) urges a freeing from "the censorious hold of ‘science writing’" (citing Richardson 1994).

Cornet (1995) argues that "[d]oubt and uncertainty are the fuel that drives ... research" (p.123) and that it is "[r]efreshing to find thoughtful writings ... that vividly portray the intellectual and emotional struggles that researchers face when they are truly concerned about truth value and ethical behaviour in their work" (p.123) although "it is rare that authors share their angst publicly" (p.123).

My problem as writer is, in presenting an "honest" account (indeed, any account), how do I know it resonates or relates? Bassey has unfortunately not developed his idea (Bassey 1998b). Feedback from papers and talks helps. Publication is perhaps one index, but is not reliable. True evidence of resonance requires the test of time: did people, in the end, pick it #[p214] up?[14] However, even if I could find a supportive band, a clique of "believers", a group who felt "resonance" with the ideas (to be unflattering, a kind of "flat-earth society" Phillips 1992 p.115), would that be convincing to the un-converted? How far does it need to be? Does the world need to agree in this "obsessive age of standardisation" (Dadds 1995 p.113)? Would "[l]ocalized approaches which develop from our unique contexts" (ibid p.113) be acceptable, provided they were not "corruptingly insular" (ibid p.113)? A universe of one is insular, but, in seeking support, how large a community do I need to communicate with? How un-insular is un-insular enough? At the time of writing, the difficulty remains unresolved. I will turn now to critique.

If material is offered to a community; if it includes as much honesty as I can muster and as much detail of what I did, as is consistent with being readable and interesting; and if that community, over time, subjects this work to stringent criticism (and does not accept a simple, immediate "consensus" view) and finds it useful, then it may be possible in one sense, to count the work as "valid" / "worthwhile"/ "strong". But that process may have a lifetime greater than the average PhD. I perhaps need to be content with presenting evidence of attempts to move as far down that road as possible (I continue the exploration of communication in chapter 12).

In place of the rather bewildering array of criteria outlined earlier I would offer, in a tentative way, the questions below which practitioner researchers may wish to ask of a study of this nature, to aid their evaluation.

15. A tentative simplification of questions about research strength.

In a broad manner these questions cover the concerns of others (as evidenced in the sources covered earlier such as Reed and Biott 1995 and the action researchers), and myself, over issues of "strength". Obviously, further work would need to be done if any claim were to be made that such views represented those of a community (so far one colleague, Rene E., has found them quite acceptable).

#[p215] The questions avoid where possible terms taken up by others in specialised ways. Lincoln and Guba’s 1985 "credibility" for example, refers to whether the (case study) inquiry is "credible to the constructors of the original multiple realities" (p.296) and is judged by processes such as member checks, triangulation and so on. I have replaced this with the everyday term "believability".

* Is it a believable account of what I did? Is it full and honest? Does it "ring true" for you?

* Is the writing clear and readable? Is there a good balance between academic rigour and accessibility?

* Does the account demonstrate a sufficient level of care in developing methods of research and seeking wider critique? Is there sufficient detail for you to judge?

* Does the enterprise seem worthwhile? Does it lead to new understandings?

* Is what I did of interest to you? Does it "strike a chord"? Does it stimulate you?

* Is the study in keeping with practitioner values?

* Is the research self-critical?

* And finally, all things considered, is the method described something you might be prepared to adopt, even in part and according to your own needs? Can you relate to it?

In the final section of the chapter, I want to draw together these concerns over questions of "validity", with ideas arising from recent attempts to apply the original formulation of the "messy method" in new situations, and then to arrive at my current appreciation of what I do when I research.

16. A summary of the latest platform of understanding of my research method.

Picking out the common elements in the original description of the method described in chapter 6, and its variations in the mini-projects on identity and analysis, leads to the outline below.

#[p216] To begin, much of the method remains unchanged from the original concept: finding my own rules, relying on others, drawing on many approaches, keeping a diary and so on. The main differences are in my noting less enthusiasm in beginning some projects (such as identity); the existence of "proto-questions" rather than just curiosity; more emphasis on counselling; a model of reaching platforms of understanding as ideas evolve; the role of tensions between identities; an honesty trail; a clearer conception of "analysis" / "making sense" and a clearer conception of "validity" / "strength". In summary then, the messy method (I now feel more able to remove the quotation marks) has come to be:

The beginnings.

(i) I began with a curiosity about some aspect of work or research. Enthusiasm might be high or low depending on the area under consideration.

(ii) I decided to investigate this aspect.

(iii) I started the investigations with no question or, at best, a proto-question of the type {?} {"analysis"} {?}. It is important, however, to guard against such questions crystallising too early.

(iv) I embraced uncertainty and lived with the angst of not-knowing if an outcome was possible.

(v) I questioned my assumptions about research. The process involved learning and un-learning. A constant interaction with academic literature and other sources was needed throughout.

The main part of the process.

(i) I learnt by doing.

(ii) I drew upon many approaches in research. Some of these were outlined earlier in discussing the so-called "knowledge accrual process". There were also other activities involved such as harnessing serendipity and incubation.

(iii) Where conflict arose, the values of professional practice established priorities for research practice.

(iv) I kept reflective diaries of my work and my research. I researched both. Systematic recording in these formed a backbone to the project.

(v) The research path was partly formed through the interplay of identity tensions.

Issues arising around change.

(i) Certain areas of practice I was curious about. I did not initially set out to change these, simply to understand them.

(ii) During the inquiry I uncovered areas of practice which I was not comfortable with. As a responsible practitioner, I acted to change these, to my own professional satisfaction (which included evaluation of the effects of change over a period of time).

#[p217] Issues of support.

(i) I needed continual support to pursue this project, from partner, friends, strangers, colleagues and counsellors. Counselling was of particular importance in separating my emotions from the investigation. Publication gave encouragement, acting partly as a kind of "cheering on". Reflection on practice involved not just criticism but celebration. This provided another source of support. The research thus also served to reinforce certain facets of practice rather than alter them.

(ii) This was a far from solitary activity. I engaged in continuous dialogue, with friends, colleagues, partner, complete strangers, critical friends, critical correspondent, research interest groups, a focus group and conferences.

(iii) I relied on a very empowering form of supervision.

The developing view of the method.

(i) Understanding of this process (of researching practice while researching the research) emerged only slowly, during its course.

(ii) The research reached various platforms of understanding of its own process as I went backward to reconsider the data and forward to apply the method in different circumstances.

(iii) The method itself was a tool, to be continuously refined. This involved a process of "testing while protecting" in reaching out to a critical community.

(iv) "Making sense" was a complex reciprocal process involving data, exploration, writing and ideas.

Considerations of research strength.

(i) As the work was for practitioner-researchers, rigour was balanced by relevance and accessibility in terms of writing style. Work unlikely to be read would be seen as weaker research; communication would thus be seen as a vital element.

(ii) "Validity" was replaced by notions of a continuum of relative "strength". One element in this was an "honesty trail". A series of questions was offered to facilitate others’ judgement of strength in this and similar work.

17. Key points emerging from the chapter.

Some attempts to provide alternative approaches to the concept of validity may still have maintained a valid/ non-valid division.

Other writers who argue for the throwing out of notions of validity may, nevertheless, display that they retain certain implicit "standards" of good writing. In addition it seems reasonable to assert that as a minimum we need sufficient information to be able to make some judgement of the work.

#[p218] Wanting to continue with a position which keeps some idea of validity, although in the modified form of an assessment of research "strength" along a continuum, I later draw on a number of accounts of practitioner research in doing so.

Inclusion of accounts of "errors", in an "honesty trail", I argue, adds to research strength.

In searching for ways to develop the emerging ideas, I identified a process of "testing while protecting" and gave a number of examples of this, together with attempts to apply the messy method in two new mini-projects.

In place of "generalisation" I explored the concept of relatability. This, however, did not circumvent the problem of deciding the appropriate size of any supportive community.

Effective communication with such a community is seen as essential; it is expected, however, that the community will subject the broadcast ideas to criticism.

Research is seen as involving a constant interplay with a wide variety of sources.

The chapter concludes with a tentative set of questions with which to explore research strength and the latest understanding of the method of research.
 
 

Notes

[1] Scheurich (1996) uses postpositivism to refer to research which "avowedly opposes the unproblematic application of the scientific method to the social sciences" (p.58 n.1).

[2] Ravetz (1971) bemoans the increase in "pointless publication" (p.49) which he refers to as "shoddy science", explaining that "the majority of journals in many fields are full of papers which are never cited by an author other than their own, and which, on examination, are seen to be utterly dull or just as bad" (p.49). However, the point I am making is not about good research on "boring" topics, but about "bad" research per se.

#[p219] [3] I would prefer another more positive-sounding term than "errors", which has such negative connotations. "Probes" or "off-shoots" seem a constructive way of describing the "dead ends", and this is not just linguistic trickery. I do genuinely value such excursions.

The more general, daily, small-scale "muddles", "misunderstandings", "errors" etc. can also be seen as events giving essential feedback. I recall, as a science undergraduate, confidently carrying out a particular complex calculation several times and getting the "right" answer each time. It was not until one day I got the "wrong" answer that I discovered a flaw in my approach which had not previously shown up. I learnt more from that one "mistake" than from all the correct answers, incorrectly reached. My (only semi-serious) attempts to re-name such phenomena in less negative terms foundered on mouthfuls such as "positive environmental feedback events".
[BACK]

[4] There are other examples of the power of "errors", some relatively humorous and resonant of our childhoods. Stafford (1997), for instance, describes finding, as a schoolboy, a copy of the teacher's answer book for his Latin homework. He regularly copied out the answers, carefully inserting random errors to maintain a plausible picture.

Within the legal system, jurors often have to weigh inconsistent evidence from a genuinely truthful witness. The dissembler, by contrast, may seek to present a smooth, coherent tale, perhaps not realising the ability of the jury to read "the truth" in ordinary, everyday mess. The unsubtle false witness who is trying to deceive may attempt to promote a very coherent picture of events: "the deliberate lie ... becomes more consistent than the description of reality" (Trankell 1972 quoted in Undeutsch 1988 p.111).

Hardy (1998), in supporting the call for an appeal over a recent case, admits that "there are discrepancies among these [witness] statements. One would expect there to be, unless the witnesses had sat down together ... to collude" (p.7). In a recent case of rape which went to the appeal court, Lord Justice Mummery ruled the defendant’s evidence "too perfect, too precise and too glib" (Carter 1999).

Vrij and Akehurst (1998), in looking at witness credibility, question whether someone would "mention details that tend to be unfavourable to him or herself if the person were fabricating an account" (Steller quoted in Vrij and Akehurst p.8), which provides a useful parallel for the position I am offering on disclosing errors.
[BACK]

#[p220] [5] This, I contend, would be a rather high-risk and unlikely strategy in a thesis. In any case, as outlined in earlier arguments, research is often such a "messy and error-strewn" business that there is little likelihood of having to manufacture mistakes.[BACK]

[6] There are of course many good reasons why conventional reports do not include a record of "errors". I am not trying here to disparage such writing; that would be a foolish, unsuccessful and unnecessary exercise. What I am doing is offering an alternative vision, which I hold, and which may be relevant to some other researchers, some of the time.[BACK]

[7] I mainly draw on Clarke et al 1993 and Tickle 1995. There are many other examples:

Altrichter et al (1993) suggest: considering alternative perspectives, testing through practical action, ethical justification and practicality (p.74-81).

Dadds (1993) proposes "valuing" (p.116) in place of validity and considers the areas of valuing knowledge and understanding, text, action, development and collaboration (ch. 8-12).

Lomax (1994) offers criteria around "ethics, rigour, logic, the ‘practical’ and the aesthetic" (p.113) including for instance: "are your claims authentic to your colleagues?" (p.123) and have you submitted the work to "critical dialogue drawing upon other sources ... with sufficient information for readers to ... check?" (p.125).

Norris (1997) underlines the need to voice one’s "prejudices and assumptions" (p.174) so that they can be challenged by critical friends and colleagues.
[BACK]

[8] Chalmers presents both "naive" and "sophisticated" versions of inductive and falsificationist approaches; my aim here, however, is to present a window on my thinking, not a detailed burrowing into such distinctions.[BACK]

[9] Chalmers goes on to criticise Lakatos further, and describes his methodology as:

a memorial to happier times when it was still thought possible to run a complex and often catastrophic business like science by following a few simple and ‘rational’ rules. (Feyerabend 1974 quoted in Chalmers 1982 p.87).

Again, I do not intend to enter into such arguments.
[BACK]

[10] See for example Ravetz (1971) as mentioned earlier on citations; the low take-up of library loans (ibid p.49 n.23) and the proliferation of obscure journals (ibid p.50 and n.25). With the expansion of publication in the last 20 years, I can only imagine the situation has not improved.[BACK]

[11] Some comments on future directions in research are in chapter 12.[BACK]

#[p221] [12] The editors’ introduction (Willett and Manheim 1976) refers to Brecht’s "rhymeless irregular verse" (p. xvi); the "gestural" (he calls it "gestic") nature of the work; and its "flinty" effect (p.xvi); relying for example on street chants from demonstrations. I do not, however, pretend in any way to measure up to Brecht’s work, only to understand mine, through his.[BACK]

[13] I am borrowing very loosely here from Lincoln and Guba’s (1985) "audit trail". They offer a detailed procedure in case study analysis. My usage is more in a broad sense of providing the fullest picture of the evolving research enterprise, warts and all.[BACK]

[14] Miles and Huberman (1994 p.280) refer to "pragmatic validity", i.e. the way in which research influences the actions of others; Mishler (1990) and Carr (1995) refer to the judgement afforded by communities; Lincoln and Guba (1985) give a thorough but, for me, too specialised description of "trustworthiness" as researchers take up research reports.[BACK]

XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX

November 2001 not in original thesis

I noticed some time after submitting the thesis that the first two paragraphs of this chapter are repeated from page 111. I have left them in, so that the page layouts remain the same as in the original, in case you wish to quote something.

[*] In this section I have made some small changes to further protect the identities of Celia and Mike.[BACK]

