Saturday, February 12, 2011

Teaching Students How to Evaluate the Evidence

Teaching adult learners how to evaluate research evidence is an important skill in sexuality education. The term “evidence-based” can be applied to many aspects of sexuality education, including public health practice, medicine, public policy, and curriculum development. For example, in medicine, this concept means “the conscientious, explicit and judicious use of current best evidence in making decisions about the care of individual patients” (Sackett, Rosenberg & Gray, 1996). The application of evidence-based research is the gold standard for sexuality education.
However, in my sexuality education doctoral program at Widener, I often hear my fellow learners moan and groan during our required research courses. Many of them say that they are teachers by profession and do not plan to conduct research. So why is it important for them to learn to evaluate the evidence? Research is essential to furthering scientific knowledge, improving current education methods, and informing public policy – all highly relevant to the work of a teacher. Too often, however, people hear about new research results through newspapers, broadcast news, or magazines. They may never get the chance to look closely at the methodology and research questions to determine the validity of a study. Convincing adult learners that they should take the time to evaluate the merit of the evidence can be a challenge for educators.
What are some ways to teach this skill to sexuality students? Judging the merits of evidence takes practice. However, there are many peer-reviewed journal articles that describe a methodical review and evaluation of the evidence on a specific topic, and these can be used as teaching examples. One example is an article by Major, Appelbaum, Beckman, Dutton, Russo, and West (2009) evaluating the evidence regarding abortion and mental health, a topic that has received much press in the sexual health landscape in recent years. The strength of the article is not just its conclusion (that there was no difference in relative risk for mental health problems between women who had an abortion and women who did not) but the way the authors painstakingly describe the various methodological problems with the published studies to explain why they did not pass muster as good evidence. They carefully review common methodological problems – such as incorrect application of conceptual frameworks, inappropriate use of comparison groups, inadequate control of risk factors, sampling bias, and use of inappropriate, non-validated measurement tools – in language that is clear, jargon-free, specific, and understandable. After reading this article, I came away with a new understanding of what to look for when evaluating published research studies.
This article would be a perfect learning tool for students to read before they practice critiquing published research on their own. Teachers can use this example, or a similar one, to help learners develop a personal checklist of themes or questions to use when evaluating research. For example, APA guidelines suggest looking at the methodology, authors’ framework(s), statistics, theoretical framework, and results (Driscoll, 2010). The acronym MASTR (pronounced “master”) may be a helpful reminder. A classroom exercise where learners must come up with one or two questions for each of those areas and then apply them to multiple articles could be an excellent way to increase their confidence in this important skill – evaluating the evidence.

Driscoll, D. L. (2010, April 21). Social Work Literature Review Guidelines. Retrieved from
Major, B., Appelbaum, M., Beckman, L., Dutton, M. A., Russo, N. F., & West, C. (2009). Abortion and mental health: Evaluating the evidence. American Psychologist, 64, 863-890.
Sackett, D. L., Rosenberg, W. C., & Gray, J. A. M. (1996). Evidence based medicine: What it is and what it isn’t. BMJ, 312, 71-72.

-Shannon Criniti


  1. I appreciate this post, not only because it challenges me as an educator but also because I too have been one of those students who grumbles at the thought of research. I have a love-hate relationship with research. I love the numbers and use them frequently in my work, though I hate evaluating the methodology behind the numbers.

    I just recently took a research class at Widener and found that the required book, Health Promotion & Education Research Methods: Using the five-chapter thesis/dissertation model by Randall Cottrell and James McKenzie, was very helpful in teaching me how to evaluate research methodology. This was my first experience with it so I appreciated the way the authors approached research by first breaking it down into more manageable sections. The sections outlining internal and external validity as well as research designs and sampling methods were especially helpful for me. While I feel that I still don’t have a comfortable grasp on evaluating research, this book was a great first step to better understanding how to critically analyze research.

    One activity done in the class was to analyze research articles using a rubric. We rated each section of an article to determine the strength of the research design, methodology, instrumentation, data collection, and findings. I felt that this type of activity was useful for adult learners because it allowed for personal exploration in an interactive way and emphasized skills in the critical analysis of research.

    Cottrell, R.R. & McKenzie, J.F. (2005). Health promotion & education research methods: Using the five-chapter thesis/dissertation model. Sudbury, MA: Jones and Bartlett Publishers.

  2. Students must appreciate research because, inevitably, it's an important part of our lives.

  3. I so appreciate this reminder. Just like Rebecca said, I too often want to be a lazy consumer of information. We often skim through articles and then use the statistics or research as we see fit. Thus, it's a good idea to make sure what we are using is a credible source. For instance, outside of writing research articles in class, when we are actually making a statement for the press or countering an anti-sexuality comment, we need to make sure our sources are as fool-proof as possible. Using a short reminder like "MASTR" is one of the easiest ways to quickly see what you are working with. Vetting a number of articles doesn't have to be a three-hour process, but if we can quickly verify the validity of an article, we are more likely to make a safe and strong argument.

  4. I love this! I have been in education for 10 years and find it deplorable when I hear of educators who do not take the time to research what they are teaching, in terms of providing both research-based content and research-based best practices. For me, research fuels my pedagogy. Now, I have not yet taken research at Widener, but I have had research courses in my MSW program, my Home and School Visitor program, and my Principal Certification Program. So, I am glad to see that someone took the initiative to blog about this important topic.
    Off to the library,

  5. This topic really hits a nerve with me. I did my undergraduate thesis on being fat and healthy, so a lot of what I was doing was being critical of research - some of which didn't even conclude with what the data showed, but instead pushed the "no matter what we find, fat is still bad" message. It kind of boggled my mind.

    I actually feel that being critical of what I read, not just research, has been the most important skill I've learned since high school. I think that because we get research information through the media, the actual reports can look scary, when they could (should?) be seen as the norm. Great job!

    ~Rachel Girard

  6. I really appreciate this post for the same reasons some people have already mentioned. It is a scary world when people hear a thirty-second sound bite on the news and take what is said as fact. Too often we hear “a research study found evidence of …” and then the reporter asserts one point of the research as an indisputable and definitive fact. These one-line summaries fail to provide all of the relevant and nuanced information that is necessary to really understand that “fact.”
    As others have said, it is imperative that we as sex educators research before we speak. We want to break the cycle of misinformation for our own benefit as well as the benefit of our students, and as Meg said, to do that we need to ensure our sources are credible. If we become lazy and fail to thoroughly vet and understand the material we use, it will be all too easy for those who oppose our work to find flaws and use our words or teaching to criticize us. Part of having a rationale for everything we do includes having thoughtful and accurate research backing us up, something we cannot afford to become lax about.
    Great post – it was wonderful to have another reminder of why evaluating research is an important part of our jobs!