Six Quality Improvement Questions for an IHI Improvement Advisor


Improvement Science Under the Microscope

Quality improvement (QI) in health care can be a huge undertaking even when it's not your full-time job. For IHI's Improvement Advisors (IAs), QI is what they think about and work on all day. One of our IAs, Rebecca Steinfield, has been with IHI since 1996 in numerous capacities, and she currently coaches teams and teaches courses on the science of improvement, including the Improvement Science in Action seminar this September. She offered some perspective on how to get even better at improving.

Q: What are some interesting improvement projects and results that you’ve seen recently from IHI’s partners?

One of my favorite projects involved a group of clinics in South Africa working to improve HIV testing for pregnant mothers, so that the right treatment reaches the right moms and prevents the spread of HIV to their babies. The key for them was the retest at 32 weeks, which catches recent infections. They were able to increase the retest rate in the pilot clinic from less than 40 percent to over 80 percent and then spread the improved process to other clinics.

Q: Is there one element of the science of improvement that you feel practitioners sometimes overlook?

I think we sometimes overlook the importance of having an explicit theory about which changes will result in improvement, one we can update over time as we test changes, collect data, and learn how to make the changes work in our local environment. PDSA cycles should be grounded in a theory about how the system works, but we need to be open to the idea that some of our theories will be wrong. We also want to avoid falling into the confirmation trap, where we look only for data that confirms our theories.

Answering the three fundamental questions of the Model for Improvement is key:

  1. What are we trying to accomplish?
  2. How will we know a change is an improvement?
  3. What changes can we make that will result in improvement?

It’s also important to create a driver diagram to make sure we have a solid, shared vision and theory about the work.

Q: What requires the most time in a PDSA cycle (Plan, Do, Study, Act)? Which step do people skip most often — and at what cost?

I don't think any one step necessarily takes the most time; it depends on where you are in your testing process. For example, when you are testing on a very small scale, the planning might take much longer than the doing, but as you test on a larger scale, the doing could take longer.

Some of the failures I often see in running PDSAs are:

  • Failure to plan with enough detail so that you can distinguish between a failure to execute the plan and an ineffective change idea;
  • Failure to identify the questions you want to answer by running the PDSA;
  • Failure to predict the answers to those questions so that you can adequately study the results (predictions also help you make a better plan);
  • Failure to make a plan to collect the data you need to answer the questions and compare your predictions with the actual outcomes;
  • Failure to involve the “do-ers” in the analysis of the data collected (the Study phase); and
  • Failure to act on the results of the Study phase in order to plan the next test.
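The checklist above amounts to recording questions, predictions, a plan, and results for each cycle, then comparing predictions with what actually happened. Here is a minimal sketch of such a record in Python; the class, field names, and example values are illustrative, not an IHI template.

```python
from dataclasses import dataclass, field

@dataclass
class PDSARecord:
    """One small-scale test of change (structure is illustrative)."""
    change_idea: str
    questions: list          # what we want to learn from this cycle
    predictions: dict        # question -> predicted answer, made before "Do"
    plan: str                # who, what, when, where, and data to collect
    results: dict = field(default_factory=dict)  # question -> observed answer
    next_step: str = ""      # adopt, adapt, or abandon, decided in "Act"

    def study(self):
        """Compare each prediction with the observed result."""
        return {
            q: (self.predictions.get(q),
                self.results.get(q),
                self.predictions.get(q) == self.results.get(q))
            for q in self.questions
        }

# Example: a first, very small test of a hypothetical change idea
cycle = PDSARecord(
    change_idea="Use a checklist when reviewing reports",
    questions=["Does the checklist fit the workflow?"],
    predictions={"Does the checklist fit the workflow?": "yes"},
    plan="One reviewer uses the checklist on Tuesday's reports; note fit.",
)
cycle.results["Does the checklist fit the workflow?"] = "no"
comparison = cycle.study()  # prediction vs. result, question by question
```

Writing the prediction down before the "Do" step is what makes the Study step honest: a wrong prediction (as in the example) is a signal to revise the theory, not a failure.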

Q: What’s one piece of advice you have regarding measurement?

Collect data that is meaningful, and use data for learning, not judgment. (That’s two, sorry!)

Q: Are you working on any improvement projects of your own — even at home?

Yes, I am trying to reduce the time it takes me to review, document, and respond to reports that I get in one of my projects. I have developed some standard work and have been working on a new documentation strategy. At baseline, it was taking me 60 minutes per report, and I was not reliably documenting important elements that I needed to track. I would like to reduce my time to 30 minutes with complete documentation.
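A project like this lends itself to a simple run chart: minutes per report plotted in time order, with the median and the goal for reference. A minimal sketch follows; the 60-minute baseline and 30-minute goal come from the text, but the individual data points are made up for illustration.

```python
from statistics import median

# Minutes spent per report, in time order. Values are illustrative:
# roughly the stated 60-minute baseline, improving after the change.
minutes = [62, 58, 65, 60, 55, 48, 44, 40, 38, 35]

baseline_median = median(minutes[:5])   # before the standard work change
recent_median = median(minutes[5:])     # after the change
goal = 30                               # stated aim: 30 minutes per report

# A text-mode run chart: one row per report, '*' marks ~5-minute units
for i, m in enumerate(minutes, 1):
    print(f"report {i:2d} | {'*' * (m // 5):<14} {m} min")

print(f"baseline median: {baseline_median} min, "
      f"recent median: {recent_median} min, goal: {goal} min")
```

Comparing the recent median with the baseline median, rather than eyeballing individual points, is the usual run-chart way to judge whether the change is an improvement.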

Q: Do you ever see run charts in your dreams?

I don’t want to talk about that!
