Despite the potential benefits central venous lines can have for patients, there is a high risk of bloodstream infection associated with these catheters | Intensive and Critical Care Nursing
Aim: To identify and critique the best available evidence regarding interventions, other than antimicrobial catheters, to prevent central venous line associated bloodstream infections in adult intensive care unit patients.
Methods: A systematic review of studies published from January 2007 to February 2016 was undertaken. A systematic search of seven databases was carried out: MEDLINE; CINAHL Plus; EMBASE; PubMed; Cochrane Library; Scopus and Google Scholar. Studies were critically appraised by three independent reviewers prior to inclusion.
Results: Nineteen studies were included. A range of interventions were found to be used for the prevention or reduction of central venous line associated bloodstream infections. These interventions included dressings, closed infusion systems, aseptic skin preparation, central venous line bundles, quality improvement initiatives, education, extra staffing in the Intensive Care Unit and participation in the ‘On the CUSP: Stop Blood Stream Infections’ national programme.
Conclusions: Central venous line associated bloodstream infections can be reduced by a range of interventions including closed infusion systems, aseptic technique during insertion and management of the central venous line, early removal of central venous lines and appropriate site selection.
Full reference: Velasquez Reyes, D.C. et al. (2017) Prevention of central venous line associated bloodstream infections in adult intensive care units: A systematic review. Intensive and Critical Care Nursing. Published online: 26 June 2017
Umbilical venous catheters (UVC) or peripherally inserted central catheters (PICC), commonly used in high risk neonates, may have a threshold dwell time for subsequent increased risk of central line associated blood stream infection (CLABSI) | The Journal of Hospital Infection
Aim: To evaluate the CLABSI risks in neonates having either UVC, PICC or those having both sequentially.
Methods: The study included 3985 infants who had a UVC or PICC inserted between 2007 and 2009 and were cared for in 10 regional Neonatal Intensive Care Units: 1392 had a UVC only (Group 1), 1317 a PICC only (Group 2) and 1276 both a UVC and a PICC (Group 3).
Results: There were 403 CLABSI among 6000 venous catheters inserted, totalling 43,302 catheter-days. CLABSI rates were higher in Group 3 infants, who were of lowest gestation (16.9/1000 UVC days and 12.5/1000 PICC days; median 28 weeks), when compared with Group 1 (3.3/1000 UVC days; 37 weeks) and Group 2 (4.8/1000 PICC days; 30 weeks). Life table and Kaplan-Meier hazard analysis showed the UVC CLABSI rate increased stepwise to 42/1000 UVC days by day 10, with the highest rate in Group 3 (85/1000 UVC days). PICC CLABSI rates remained relatively stable at 12-20/1000 PICC days. Compared with PICC, UVC had a higher adjusted CLABSI risk after controlling for dwell time. Among Group 3 infants, electively replacing the UVC before day 4 showed a trend toward lower CLABSI risk than later replacement.
Conclusions: There was no cut-off duration beyond which PICC should be removed electively. Early UVC removal and replacement by PICC before day 4 could be considered.
Full reference: Sanderson, E. et al. (2017) Dwell Time and Risk of Central Line-Associated Bloodstream Infection in Neonates. The Journal of Hospital Infection. Published online: 24 June 2017
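The catheter-day rates quoted in the study above follow the standard surveillance convention of infections per 1,000 device-days. A minimal sketch of that calculation, using the overall figures from the Results (the helper name is ours, not the study's):

```python
def clabsi_rate(infections: int, catheter_days: int) -> float:
    """CLABSI rate per 1,000 catheter-days (standard surveillance definition)."""
    return infections / catheter_days * 1000

# Overall cohort from the Results: 403 CLABSI over 43,302 catheter-days.
overall = clabsi_rate(403, 43302)
print(round(overall, 1))  # 9.3 per 1,000 catheter-days
```

The same function reproduces the group-specific rates when given each group's infection count and device-days.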
BACKGROUND: Simulation-based training has been associated with reduced central line-associated bloodstream infection (CLABSI) rates. We measured the combined effect of simulation training, electronic medical records (EMR)-based documentation, and standardized kits on CLABSI rates in our medical (MICU) and surgical (SICU) intensive care units (ICU).
METHODS: CLABSI events and catheter-days were collected for 19 months prior to and 37 months following an intervention consisting of simulation training in central line insertion for all ICU residents, incorporation of standardized, all-inclusive catheter kits, and EMR-guided documentation. Supervising physicians in the MICU (but not the SICU) also completed training.
RESULTS: Following the intervention, EMR-based documentation increased from 48% to 100%, and documented compliance with hand hygiene, barrier precautions, and chlorhexidine use increased from 65%-85% to 100%. The CLABSI rate in the MICU dropped from 2.72 per 1,000 catheter-days over the 19 months preceding the intervention to 0.40 per 1,000 over the 37 months following intervention (P = .01) but did not change in the SICU (1.09 and 1.14 per 1,000 catheter-days, P = .86). This equated to 24 fewer than expected CLABSIs and $1,669,000 in estimated savings.
CONCLUSION: Combined simulation training, standardized all-inclusive kits, and EMR-guided documentation were associated with greater documented compliance with sterile precautions and reduced CLABSI rate in our MICU. To achieve maximal benefit, refresher training of senior physicians supervising practice at the bedside may be needed.
Reference: A multitiered strategy of simulation training, kit consolidation, and electronic documentation is associated with a reduction in central line-associated bloodstream infections. Allen GB, Miller V, et al. Am J Infect Control. 2014 Jun, vol 42, no 6, p643-8
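A back-of-envelope check on the figures in the Results above: the MICU rate fell from 2.72 to 0.40 per 1,000 catheter-days (about an 85% relative reduction), and the $1,669,000 estimated savings across 24 averted infections implies roughly $69,500 per CLABSI prevented. A sketch of that arithmetic (variable names are ours):

```python
# Relative reduction in the MICU CLABSI rate, from the reported figures.
rate_pre, rate_post = 2.72, 0.40          # per 1,000 catheter-days
relative_reduction = 1 - rate_post / rate_pre
print(round(relative_reduction * 100, 1))  # 85.3 (% reduction)

# Implied cost per averted infection, from the reported savings estimate.
averted_clabsis = 24
estimated_savings = 1_669_000              # USD
cost_per_clabsi = estimated_savings / averted_clabsis
print(round(cost_per_clabsi))              # 69542 (USD per averted CLABSI)
```

This implied per-infection cost is broadly in line with published CLABSI attributable-cost estimates, which lends face validity to the savings figure.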
BACKGROUND: Central line-associated bloodstream infections (CLABSIs) result in increased length of stay, cost, and patient morbidity and mortality. One CLABSI prevention method is disinfection of intravenous access points. The literature suggests that placing disinfectant caps over needleless connectors decreases CLABSI risk.
METHODS: A quasi-experimental intervention study was conducted in a >430-bed level I trauma center. In addition to an existing standard central line bundle, a new intervention consisting of a luer-lock disinfectant cap with 70% alcohol was implemented in all intravenous (IV) needleless connectors on patients with peripheral and central lines. Compliance with the disinfectant cap was monitored weekly. A generalized linear model using a Poisson distribution was fit to determine whether there were significant relationships between CLABSIs and disinfectant cap use. Impacts on costs were also examined.
RESULTS: The rate of CLABSI decreased following implementation of the disinfectant cap. The incidence rate ratio (0.577, P = .004) for implementing the disinfectant caps was statistically significant, indicating that the rate of patient infections decreased by >40%. Increased compliance rates were associated with lower infection rates. Disinfectant cap use was associated with an estimated savings of almost $300,000 per year in the hospital studied.
CONCLUSIONS: Use of a disinfectant cap on IV needleless connectors in addition to an existing standard central line bundle was associated with decreased CLABSI and costs.
Reference: Impact of universal disinfectant cap implementation on central line-associated bloodstream infections. Merrill KC, Sumner S, Linford L, Taylor C, Macintosh C. Am J Infect Control. 2014 Dec, vol 42, no 12, p1274-7.
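The ">40%" claim in the Results above follows directly from the incidence rate ratio: an IRR of 0.577 from a Poisson model means the post-intervention infection rate is 0.577 times the comparison rate. A one-line illustration of that interpretation:

```python
# An IRR below 1 indicates a rate reduction; the percent reduction is (1 - IRR).
irr = 0.577  # incidence rate ratio reported in the study
percent_reduction = (1 - irr) * 100
print(round(percent_reduction, 1))  # 42.3 (% reduction, consistent with ">40%")
```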