Thursday, October 14, 2010

What do Frenchmen and research methods have in common?

When I first met my wife I bragged about my pure French lineage: All four of my grandparents came from Quebec. She responded, "My father says there are no pure Frenchmen." Either way you read it, she's right.

Similarly, there are no pure methods in the social sciences--only families of methods that ought to be adapted to a specific project. I'll illustrate this with the example of a Delphi survey I conducted with Jeff Dueker in 2007: Teaching and Assessing the Responsible Conduct of Research: A Delphi Panel Report.

Delphi surveys (or Delphi panels) are unlike most surveys: They do not aim to generate knowledge (e.g., about attitudes); rather, they aim to generate a consensus. We wanted to achieve a consensus on what should be taught and assessed in those responsible conduct of research (RCR) courses that trainees and other researchers are often required to take. The US Office of Research Integrity had earlier identified a few "core areas" for instruction, but there was considerable debate on what should be taught (e.g., are there universally agreed-upon standards for authorship across fields?), what should be assessed (e.g., should we assess commitment to values?), and above all, what the fundamental aims of RCR education should be (e.g., to foster compliance with regulations, ethical problem-solving skills, or knowledge of abstract ethical principles?).

Delphi surveys generally involve forming a group of "panelists" and administering several rounds of a questionnaire to the panelists individually. For example, our round 1 questionnaire asked open-ended questions such as "What should be the overarching goals of RCR instruction?" The round 2 questionnaire took all panelists' responses, reduced them to clear and non-redundant goal statements, and asked panelists to rate the importance of each goal on a 5-point Likert-type scale. Round 3 presented a shorter list of just those items that were endorsed as "important" or "very important" by two-thirds of the group during round 2. We repeated the process to generate a short list of 9 key objectives for RCR instruction that were endorsed by a group of experts.
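
To make the winnowing step concrete, here is a minimal Python sketch of the round-2-to-round-3 filter. The goal statements and ratings are invented for illustration, and the two-thirds endorsement threshold is just the cutoff described above, not a universal Delphi rule:

```python
# Hypothetical round-2 data: each goal statement mapped to the panelists'
# 1-5 importance ratings (4 = important, 5 = very important).
# All names and numbers below are invented for illustration.
round2_ratings = {
    "Foster ethical problem-solving skills": [5, 4, 5, 4, 3, 5],
    "Memorize federal regulations":          [2, 3, 2, 1, 3, 2],
    "Understand authorship conventions":     [4, 4, 5, 3, 4, 5],
}

CONSENSUS = 2 / 3  # proportion of panelists who must rate an item 4 or 5

def endorsement(ratings):
    """Proportion of panelists rating an item 'important' (4) or 'very important' (5)."""
    return sum(r >= 4 for r in ratings) / len(ratings)

# Items that survive to the round-3 questionnaire.
round3_items = [item for item, ratings in round2_ratings.items()
                if endorsement(ratings) >= CONSENSUS]

for item in round3_items:
    print(f"Retained for round 3: {item}")
```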

I was familiar with Delphi surveys others had conducted. But when I did a literature review on the method itself, I could not find clear and consistent guidelines. How many panelists do you need? What should qualify someone as a panelist? How many rounds will suffice? What constitutes support for an item (e.g., "agree" or "strongly agree")? What constitutes a consensus (e.g., two-thirds agree? three-quarters strongly agree)? What do you do with the results from panelists who participate in some but not all rounds? How much liberty can you take when condensing, organizing, and presenting the results of open-ended items? Should we report mean Likert values (e.g., item 9 had a mean score of 4.2), or rather the percent of experts endorsing an item with a value of 4 (important) or 5 (very important)?
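
That last question matters more than it might appear. Here is a toy Python illustration, with invented ratings, of how the two summaries can rank the same items differently: a polarized item can beat a broadly endorsed one on mean score while losing on percent endorsement:

```python
from statistics import mean

# Invented ratings for two hypothetical items on a 1-5 scale.
item_a = [5, 5, 5, 5, 5, 5, 1, 1, 1]  # polarized: loved by most, rejected by some
item_b = [4, 4, 4, 4, 4, 4, 4, 2, 2]  # moderate but broadly endorsed

def percent_endorsed(ratings, cutoff=4):
    """Percent of panelists rating an item at or above the cutoff (4 = important)."""
    return 100 * sum(r >= cutoff for r in ratings) / len(ratings)

for name, ratings in [("item A", item_a), ("item B", item_b)]:
    print(f"{name}: mean = {mean(ratings):.2f}, "
          f"endorsed by {percent_endorsed(ratings):.0f}% of panelists")

# Item A has the higher mean (3.67 vs 3.56), yet item B has the higher
# endorsement rate (78% vs 67%) -- the choice of summary statistic can
# change which items make the final list.
```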

What we found were articles that described in general terms how Delphi surveys work and how they have been applied in marketing studies or in establishing organizational goals. Even when we found studies that used a Delphi panel for curriculum development, their approach would not meet our needs. We decided to use a large panel of experts (n = 43) clustered into four areas of expertise. We decided to use an online format to accommodate disparate locations and schedules. And we decided to start with open-ended questions even though this would mean more work: it made clear that the panelists, not the project directors, generated the content of the recommendations.

Rarely will you find a manual that provides clear guidance on how to apply a method to your study questions. All methods need to be adapted. Naturally, not all adaptations are kosher; at some point, a method may be changed so radically that it becomes invalid, or at least loses its membership in a family of methods like the Delphi survey. In modifying a method, it is essential to have a good rationale for your adaptations, to describe the method accurately in publications, to explain the rationale for its unusual features, and to acknowledge the limitations of whatever you decide and discuss them frankly.

Conclusion: You won't find a guidebook on how to design YOUR study. You will need to apply any method critically, adapting it to your research questions, budget, participant group, etc. If you cannot do this well, you should collaborate with a creative and knowledgeable methodologist.
