Background: Comparison of outcomes requires adequate risk adjustment for differences in patient risk and in the type of intervention performed. Both unintentional and intentional misclassification (also called gaming) of risk factors can lead to incorrect benchmark results. Misclassification of risk factors should therefore be detected. We investigated the use of statistical process control techniques to monitor the frequency of risk factors in a clinical database.
Methods and results: A national population-based study was performed using simulation and statistical process control. All patients who underwent cardiac surgery between January 1, 2007, and December 31, 2009, in all 16 cardiothoracic surgery centers in the Netherlands were included. Data on 46 883 consecutive cardiac surgery interventions were extracted. The expected risk factor frequencies were based on 2007 and 2008 data. Monthly frequency rates of 18 risk factors in 2009 were monitored using a Shewhart control chart, an exponentially weighted moving average chart, and a cumulative sum chart. Upcoding (ie, gaming) of risk factors in randomly selected patients was simulated and was detected in 100% of the simulations. Subtle forms of gaming, confined to high-risk patients, were more difficult to identify (detection rate of 44%). However, the accompanying rise in the mean logistic European System for Cardiac Operative Risk Evaluation (EuroSCORE) was detected in all simulations.
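As an illustrative sketch only (the study itself reports no code), the three chart types can be applied to the monthly frequency of a single binary risk factor as follows; the baseline rate, monthly volume, smoothing constant, and CUSUM thresholds are assumed placeholders, not values from the study:

```python
import numpy as np

# Hypothetical setup: one binary risk factor, monitored monthly in 2009.
p0 = 0.12          # baseline frequency, as if estimated from 2007-2008 (assumed)
n = 1300           # approximate interventions per month (assumed)
rng = np.random.default_rng(1)
monthly_rates = rng.binomial(n, p0, size=12) / n  # simulated 2009 months

# Shewhart p-chart: flag months beyond 3-sigma limits around p0.
sigma = np.sqrt(p0 * (1 - p0) / n)
shewhart_signal = np.abs(monthly_rates - p0) > 3 * sigma

# EWMA chart: exponentially weighted moving average with smoothing lambda.
lam = 0.2
ewma, z = np.zeros(12), p0
for t, x in enumerate(monthly_rates):
    z = lam * x + (1 - lam) * z
    ewma[t] = z
ewma_sigma = sigma * np.sqrt(lam / (2 - lam)
                             * (1 - (1 - lam) ** (2 * np.arange(1, 13))))
ewma_signal = np.abs(ewma - p0) > 3 * ewma_sigma

# One-sided (upper) CUSUM: accumulates standardized excess above p0.
k, h = 0.5, 5.0    # reference value and decision interval (common defaults)
s, cusum_signal = 0.0, []
for x in monthly_rates:
    s = max(0.0, s + (x - p0) / sigma - k)
    cusum_signal.append(s > h)
```

The Shewhart chart is quick to flag a large single-month jump, while the EWMA and CUSUM charts accumulate evidence over time and are more sensitive to small, sustained shifts such as gradual upcoding.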
Conclusions: Statistical process control, in the form of Shewhart, exponentially weighted moving average, and cumulative sum charts, provides a means to monitor changes in risk factor frequencies in a clinical database. Surveillance of the overall expected risk, in addition to the separate risk factors, ensures a high sensitivity to detect gaming. The use of statistical process control for risk factor surveillance is recommended.
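The complementary overall-risk check can be sketched in the same spirit: charting the monthly mean logistic EuroSCORE makes diffuse upcoding of high-risk patients visible even when no single risk factor's chart signals. The baseline mean, standard deviation, monthly volume, and monthly means below are invented placeholders for illustration, not study data:

```python
import numpy as np

# Hypothetical Shewhart-style chart of the monthly mean logistic EuroSCORE.
mu0, sd0, n = 5.8, 7.5, 1300   # baseline mean, SD (%), monthly volume: assumed
monthly_means = np.array([5.9, 5.7, 6.0, 5.8, 6.4, 6.6,
                          6.8, 7.1, 7.0, 7.3, 7.2, 7.5])  # illustrative drift
se = sd0 / np.sqrt(n)                          # standard error of a monthly mean
signal = np.abs(monthly_means - mu0) > 3 * se  # 3-sigma rule around baseline
print("signalling months:", np.where(signal)[0] + 1)
```

Here the later months breach the upper limit even though each individual risk factor might stay within its own chart's limits, which mirrors why monitoring the aggregate expected risk raises sensitivity to subtle gaming.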