May 17, 2023

How do you make sense of a mountain of evidence?


KPWHRI researchers answer questions about systematic reviews and their impact on health care guidelines

Systematic reviews are comprehensive reviews of scientific evidence that organize data from multiple studies, synthesize the findings, and provide a clear summary of what the current evidence says about a given research question. Often, this is done to inform health care guidelines. Nora Henrikson, PhD, MPH, and Paula Blasi, MPH, recently collaborated on a systematic review for the U.S. Preventive Services Task Force on the benefits and harms of skin cancer screening, which was published in JAMA.

They are part of a team of researchers at Kaiser Permanente Washington Health Research Institute (KPWHRI) that produces these reviews as part of the Kaiser Permanente Evidence-based Practice Center (EPC), one of 13 centers nationwide sponsored by the federal Agency for Healthcare Research and Quality (AHRQ). The Kaiser Permanente EPC is led by Jennifer S. Lin, MD, MCR, FACP. One of their previous publications received the 4th annual Dr. Robert Lewis Kane Service Award given by the AHRQ, which recognizes a report of notable relevance, clarity, and conciseness.

Henrikson and Blasi answered questions about the recent review, why they love doing these reviews, and why they are significant.

What did the recent systematic review on skin cancer screening find?

Henrikson: We were asked to conduct the review to examine the evidence on the extent to which skin cancer screening has been shown to be beneficial, or harmful, in primary care settings. The review suggested a clear association between detecting skin cancer at an earlier stage and decreased mortality risk, but little to no evidence of a melanoma mortality benefit associated with routine skin cancer screening by clinicians. The review also found no association between routine skin examinations and detecting melanoma at an earlier stage.

Why are systematic reviews like this important? What is unique about these reviews?

Henrikson: Systematic reviews are a method for synthesizing previously published studies, and they were developed for exactly the purpose this skin cancer screening review is used for — to inform guidelines. The method was developed as part of the evidence-based medicine movement, when there was a huge volume of literature coming out and physicians needed a way to make sense of it all when making patient care decisions. That is still one of the main reasons people use it, but it can also be used by people trying to understand the state of the body of research on any specific topic.

We take lots of studies that look at the same research question and use them together to answer that question in a comprehensive way. This brings a collection of individual studies, which may have been done in different populations or have other distinguishing factors, onto the same playing field so we can compare them.

What does the study process look like for one of these reviews?

Blasi: At the outset, we decide how to scope the review to best answer the research question. What specifically are we trying to answer? For example, if a guideline is for health care in the U.S., we might limit the setting to higher-income countries so we’re looking at evidence from comparable health care systems. Do we want to look at only adults or include children? We set out our inclusion criteria for the population, the setting, the specific intervention being tested, and the specific outcomes.

Once we have those criteria, we ask a research librarian to design a literature search. Often we will get a huge volume of search results, and we put them into a platform for reviewing just the titles and abstracts. We rule studies in or out based on title/abstract if we can. For example, a study done in mice would be out.

We then look at the full text of the remaining articles and evaluate each study further, using an established quality rating system to systematically assess the potential biases and limitations in the included evidence. For some reviews, studies with a high risk of bias or low quality may not be included.
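To picture the staged filtering Blasi describes (pre-specified inclusion criteria, a title/abstract screen, then a full-text quality review), here is a minimal sketch in Python. The data structures, criteria values, and quality labels are hypothetical assumptions for illustration only, not the EPC's actual screening platform or rating system.

```python
# A minimal, hypothetical sketch of the staged screening described above.
# This is not the EPC's actual tooling; the field names, criteria values,
# and quality labels are illustrative assumptions only.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Study:
    title: str
    population: str                # e.g., "adults", "children", "mice"
    setting: str                   # e.g., "high-income", "low-income"
    quality: Optional[str] = None  # assigned at full-text review: "good", "fair", "poor"


# Inclusion criteria are pre-specified in the posted protocol before screening begins.
INCLUDED_POPULATIONS = {"adults"}
INCLUDED_SETTINGS = {"high-income"}


def passes_title_abstract_screen(study: Study) -> bool:
    """First pass: rule studies in or out on title and abstract alone when possible."""
    return (study.population in INCLUDED_POPULATIONS
            and study.setting in INCLUDED_SETTINGS)


def passes_full_text_review(study: Study) -> bool:
    """Second pass: apply an established quality rating and drop poor-quality studies."""
    return study.quality in {"good", "fair"}


search_results = [
    Study("Skin cancer screening in primary care", "adults", "high-income", "good"),
    Study("Melanoma detection in a mouse model", "mice", "high-income"),      # out at title/abstract
    Study("Community screening pilot", "adults", "high-income", "poor"),      # out at full-text review
]

after_abstract_screen = [s for s in search_results if passes_title_abstract_screen(s)]
included = [s for s in after_abstract_screen if passes_full_text_review(s)]
print(f"{len(search_results)} records -> {len(after_abstract_screen)} "
      f"after title/abstract screen -> {len(included)} included")
```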

Henrikson: Transparency in what we're doing is a huge part of this method. For the skin cancer screening guidelines, the research plan, with the inclusion criteria, was posted publicly and there was an opportunity for public comment.

For all systematic reviews, it is best practice to post your protocol before you start reviewing articles, so readers can tell how closely we stuck to our inclusion criteria. There are so many interesting articles and research questions that it can be tempting to get distracted, so posting the protocol publicly is really important for the rigor of the method.

You’ve talked about these reviews being some of your favorite work to do. Why is that?

Blasi: I really love the structure of it. You set out clearly and concretely what is in and what is out, and what your question is. There’s a clear process to follow. And I really enjoy the ability to start with this massive volume of literature, distill it down into the highest-quality set of studies, and then answer a research question with very solid evidence. I also learn so much every time I do it. These reviews are a really amazing way to learn about what high-quality research entails, as well as basic epidemiology skills like how to identify a good study design or risk of bias.

Henrikson: I love the rigor of it. These reviews are also tied to policy in a way that I find really compelling. It feels like the most impactful research I do. And it’s fun. 

These methods are also accessible in a way other research methods are not, because the data source is already-published studies. It takes adequate funding to do the kind of rigorous reviews we work on, but anyone with reasonable access to the scientific literature can potentially use systematic review methods to explore it.

We also have great teams that we work with. We started a contest for the funniest abstracts that we come across in reviews. For example, we had a paper come up about a new method of detecting whether camel beauty pageant contestants have had lip injections. So, we put those in our team chat.

What kind of expertise do you need to do these reviews?

Henrikson: I would say you need some kind of training in epidemiology and biostatistics. You need to understand study designs and how to interpret risk of bias, and you need to understand the clinical area you're reviewing. There's an extensive planning period before we start, to anticipate potential problems and discuss the clinical landscape of a specific topic.

Blasi: You need to be able to pay attention to small details and then zoom out and look at the big picture to summarize what the evidence says. Writing and communication are important in every field, but for these reviews they are indispensable.

How often are you surprised by the findings?

Henrikson: Sometimes we're surprised at how little evidence there is on a topic that seems fairly well accepted. Something may seem like common sense, but when you start looking there isn't as much evidence as you might expect. We run into that a lot. Or we might find that there are more studies than we expected, but they were all conducted in the same population, which might limit how applicable the evidence is to specific population groups.

Blasi: We often try to call out research gaps. Sometimes the research community pays attention to the section where we talk about what evidence is missing, and designs studies to address those gaps.

Healthy findings blog


Finding evidence to drive COVID-era decisions

How Paula Blasi brings research findings to light to support Kaiser Permanente’s rapid response to COVID-19.

Healthy findings blog


For better pediatric care, back guidelines with more evidence

In the 2000s, Dr. Beth McGlynn showed that rates of getting recommended care are low. In a new JAMA editorial, Dr. David Grossman has ideas to improve them.