Survey of Radiation Oncology Quality Assurance Practices Finds Much Variation Across North America
November 2, 2010
During “chart rounds,” radiation oncologists in the U.S. and Canada spend limited time reviewing patient cases for potential errors, say researchers at the Kimmel Cancer Center at Jefferson. At the ASTRO annual meeting, the Jefferson researchers reveal how variable and potentially haphazard the process is.
(PHILADELPHIA) In 2009, after The New York Times reported a series of mistakes in radiation therapy delivery, a team of radiation oncologists at the Kimmel Cancer Center at Jefferson decided to investigate how thoroughly different hospitals use “chart rounds” to review the radiotherapy cancer patients receive.
They term their findings, to be presented at the American Society for Radiation Oncology (ASTRO) annual meeting in San Diego, “unsettling.” The researchers conclude that the limited time (an average of about three minutes) being spent reviewing patient cases in these meetings is inadequate to assess the full range of critical data now available for modern complex procedures.
Some institutions spent less than a minute on each patient’s case.
“The quality of peer review has not kept up with advances in technology and may be suboptimal,” says the study’s senior investigator, Jefferson radiation oncologist Yaacov Lawrence, M.R.C.P. “It seems to us that very few radiation oncology departments are spending enough time reviewing data on patient cases, even though this may be the best way to pick up mistakes. It is always in the patient’s best interest to have a roomful of physicians and physicists looking over someone’s shoulder,” he says. Dr. Lawrence will present the findings.
Physicians from many specialties use peer review to ensure a treatment plan is the best it can be. That process may be especially important in radiation oncology to guard against technical errors and inappropriate treatment during each planned treatment session.
“We have a unique opportunity in academic radiation oncology to have each individual treatment plan evaluated by multiple peers,” says Michal Whiton, M.D., the study’s first author. “This is not the case in surgical departments, where surgeons do not regularly rotate through operating rooms checking on their peers’ work and offering suggestions or corrective action. Even the general practitioner tends to operate in a vacuum within his or her own practice, simply because there is no forum for regular discussion. In radiation oncology, we have the luxury of eliciting multiple perspectives on a single problem or clinical case on a weekly basis. This can and should ultimately lead to a better solution for the overall care of the patient, if done adequately.”
The Jefferson team anonymously surveyed all hospitals in North America that train radiation oncologists, probing the extent of chart rounds, and received web-based replies from 59 centers, either from chief residents (U.S.) or residency program directors (Canada). What they found surprised them.
“We asked questions regarding how often chart rounds are held, how many patients and what specific data are reviewed by physicians, and so on, in addition to assessing the degree to which highly complex modern treatments are utilized,” says Dr. Lawrence.
The researchers found that the median number of patients on treatment at any one time at these centers was between 100 and 125, and that 58 percent of responding institutions hold chart rounds for less than two hours per week. The median amount of time spent per patient was 3.4 minutes (the range was 0.7 minutes to 12 minutes).
They also found no correlation between the complexity of techniques used and the time spent per patient for quality assurance purposes. Nonetheless, chart rounds led to both minor and major treatment changes. “Almost everyone said they made changes based on chart rounds, which makes the process worthwhile. The unanswered question is: If a more in-depth review of charts was performed, would more changes be made?” Dr. Lawrence says.
They also found a lot of variability in the type of treatments reviewed during these sessions. For example, 41 percent of hospitals never review prostate brachytherapy cases and 31 percent never review gynecologic brachytherapy cases. Both procedures involve implanting radioactive sources directly into or close to a tumor. The reasons for this selective lack of peer review are not clear. Although over 80 percent of institutions review all external beam treatments, rates were much lower for radiosurgery (60 percent).
Patient history, chart documentation and dose prescription were reviewed in 79 percent of the institutions, while many critical aspects of the treatment plans, such as normal tissue constraints used during planning, were not thoroughly reviewed.
“It seemed to us that the finer details of treatment are not always reviewed, and that chart rounds haven’t kept up with advances in the technology,” Dr. Lawrence says. “Compared to 15 years ago, there is a lot more to review, such as details involving how much ionizing radiation is going to different organs and how it is being delivered. Radiation oncology these days involves a lot of number crunching, but you cannot critically assess all the available data in such a short time.”
The whole process of chart rounds as seen through the survey “was haphazard,” he says. “This could lead to overlooking errors, so perhaps there is room to suggest these review sessions be done in a more standardized, regulated way.”
Adam Dicker, M.D., Ph.D., Chair of the Department of Radiation Oncology at Thomas Jefferson University, said that, partly as a result of the survey, changes were made to the way quality assurance is performed at his institution. “We are constantly trying to improve the patient safety net, and create opportunities to reduce the error rate.”
Dr. Whiton was a radiation oncology resident at Thomas Jefferson University when the study began. She now practices at the Skagit Valley Regional Cancer Care Center in Washington.
The study received no external funding.