The article notes that the “recurring cost of bad quality” is a fairly new concept in clinical laboratory operations. There have been attempts to address it, but unfortunately, much of the recurring cost of bad quality is nearly invisible. The problem is exacerbated by the fact that poor quality is all too often accepted as a normal consequence of lab tests.
Two failures common to many labs are untestable specimens (rejected for a variety of reasons) and reporting errors.
While the cost of such rework is substantial, it is typically buried in the lab’s budget for labor and materials, so the true cost of the waste associated with retesting is never made visible to lab management. That says nothing of the effect on patients, who may have to return to provide new samples, potentially delaying diagnoses.
So, haphazardly cutting costs can create further financial pressure if those efforts degrade quality. This fits the broader environment in which lab consumers have raised the bar: labs must stay vigilant to ensure they are not compromising patient safety or clinical accuracy.
Fortunately, methodologies like Six Sigma, which we use at MedSpeed, can play a role. These tools help labs monitor improvements and demonstrate progress by calculating the Sigma-level performance of a work process. It is in the best interest of all of us who interact with labs to eliminate errors and their related costs, and to drive toward a scenario where patients are better served AND costs are better managed.
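To make the Sigma-level idea concrete, here is a minimal sketch of the standard calculation: count defects, express them as defects per million opportunities (DPMO), and convert that rate to a short-term Sigma level via the inverse normal distribution with the conventional 1.5-sigma shift. The function name and the example numbers (250 reporting errors across 180,000 results) are illustrative assumptions, not figures from the article.

```python
from statistics import NormalDist

def sigma_level(defects: int, opportunities: int, shift: float = 1.5) -> float:
    """Convert a defect count into a short-term Sigma level.

    Uses the conventional 1.5-sigma shift between short- and
    long-term performance. Inputs here are hypothetical.
    """
    # Defects per million opportunities
    dpmo = defects / opportunities * 1_000_000
    # Fraction of defect-free outcomes
    yield_rate = 1 - dpmo / 1_000_000
    # Inverse normal CDF of the yield, plus the 1.5-sigma shift
    return NormalDist().inv_cdf(yield_rate) + shift

# Hypothetical example: 250 reporting errors in 180,000 reported results
print(round(sigma_level(250, 180_000), 2))
```

A process with roughly 1,400 DPMO, as in this example, lands between four and five Sigma; a Six Sigma process allows only 3.4 DPMO, which is why tracking the metric over time makes otherwise invisible rework costs visible.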