Occupational health risks associated with the use of germicides in health care

Weber, D.J. et al. American Journal of Infection Control. Volume 44, Issue 5, Supplement, 2 May 2016, Pages e85–e89

Image source: Cade Martin, Dawn Arlotta // CC0

Environmental surfaces have been clearly linked to transmission of key pathogens in health care facilities, including methicillin-resistant Staphylococcus aureus, vancomycin-resistant Enterococcus, Clostridium difficile, norovirus, and multidrug-resistant gram-negative bacilli. For this reason, routine disinfection of environmental surfaces in patient rooms is recommended, as is decontamination of shared medical devices between uses on different patients.

Environmental surfaces and noncritical shared medical devices are decontaminated by low-level disinfectants, most commonly phenolics, quaternary ammonium compounds, improved hydrogen peroxides, and hypochlorites.

Concern has been raised that the use of germicides by health care personnel may increase their risk of developing respiratory illnesses (principally asthma) and contact dermatitis. Our data demonstrate that dermatitis and respiratory symptoms (eg, asthma) as a result of chemical exposures, including low-level disinfectants, are exceedingly rare. Unprotected exposures to high-level disinfectants, however, may cause dermatitis and respiratory symptoms. Engineering controls (eg, closed containers, adequate ventilation) and personal protective equipment (eg, gloves) should be used to minimize exposure to high-level disinfectants.

The scientific evidence does not support that the use of low-level disinfectants by health care personnel is an important risk for the development of asthma or contact dermatitis.

Read the abstract here

Facial hair – what about clinical microbiology technicians?

Lindeholm, Y.N. & Arpi, M. Journal of Hospital Infection. Published online: 22 April 2016

Image source: Aaron Morton // CC BY-NC-ND 2.0

In 2014 Wakeam et al. in this journal published their results from a cross-sectional study which compared facial bacterial colonization rates of potential nosocomial significance among 408 male healthcare workers with and without facial hair.1 All participants in this study had routine direct patient contact. They found that workers with facial hair were significantly less likely to be colonized with Staphylococcus aureus, including meticillin-resistant S. aureus, and meticillin-resistant coagulase-negative staphylococci.

Read the abstract here

Association Between High-Risk Medication Usage and Healthcare Facility-Onset C. difficile Infection

Patterson, J.A. Infection Control & Hospital Epidemiology. Published online: 21 April 2016

Image shows transmission electron micrograph of Clostridium difficile

Objective: National hospital performance measures for C. difficile infection (CDI) are available; comparing antibacterial use across performance levels can aid in identifying effective antimicrobial stewardship strategies to reduce CDI rates.

Design: Hospital-level, cross-sectional analysis.

Methods: Hospital characteristics (ie, demographics, medications, patient mix) were obtained for 77 hospitals for 2013. Hospitals were assigned 1 of 3 levels of a CDI standardized infection ratio (SIR): ‘Worse than,’ ‘Better than,’ or ‘No different than’ a national benchmark. Analyses compared medication use (total and broad-spectrum antibacterials) for 3 metrics: days of therapy per 1,000 patient days; length of therapy; and proportion of patients receiving a medication across SIR levels. A multivariate, ordered-probit regression identified characteristics associated with SIR categories.
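The two quantities behind this design can be made concrete with a short sketch. The numbers and SIR cut-offs below are hypothetical illustrations, not values from the study; the actual benchmark comparison uses a statistical test against the national baseline rather than fixed thresholds.

```python
# Illustrative sketch (hypothetical numbers): the antibacterial-use metric
# and the standardized infection ratio (SIR) categorization described above.

def days_of_therapy_per_1000(total_antibacterial_days, patient_days):
    """Days of therapy (DOT), normalized per 1,000 patient days."""
    return total_antibacterial_days / patient_days * 1000

def sir_category(observed_infections, predicted_infections, lo=0.8, hi=1.2):
    """SIR = observed / predicted infections. The lo/hi cut-offs here are
    hypothetical stand-ins for the national benchmark comparison."""
    sir = observed_infections / predicted_infections
    if sir < lo:
        return "Better than"
    if sir > hi:
        return "Worse than"
    return "No different than"

print(days_of_therapy_per_1000(6930, 10000))  # → 693.0, the 'Better' hospitals' rate
print(sir_category(12, 20))                   # → 'Better than' (SIR = 0.6)
```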

Results: For total average antimicrobial use per patient, there was a significant difference in mean length of therapy: ‘No different’ hospitals had the longest (4.93 days), versus ‘Worse’ (4.78 days) and ‘Better’ (4.43 days) hospitals (P<.01). ‘Better’ hospitals used fewer total antibacterials (693 days of therapy per 1,000 patient days) than ‘No different’ (776 days) and ‘Worse’ (777 days) hospitals (P<.05). ‘Better’ hospitals also used broad-spectrum antibacterials for a shorter average length of therapy (4.03 days) than ‘No different’ (4.51 days) and ‘Worse’ (4.38 days) hospitals (P<.05), and used fewer broad-spectrum antibacterials overall (310 days of therapy per 1,000 patient days, versus 364 days for ‘No different’ and 349 days for ‘Worse’; P<.05). Multivariate analysis revealed that the proportion of elderly patients and chemotherapy days of therapy per 1,000 patient days were significantly negatively associated with the SIR.

Conclusions: These findings have potential implications regarding the need to fully account for hospital patient mix when carrying out inter-hospital comparisons of CDI rates.

Read the abstract here

Antimicrobial stewardship: a personal and professional challenge

Cooper, T. Journal of Infection Prevention. 2016, Vol. 17(3) pp. 105–106

The threat of antimicrobial resistance has been recognised globally and at a national level in the UK for a number of years. In 1998 The Path of Least Resistance was published (UK Standing Medical Advisory Committee, 1998), and this was closely followed in 2000 by the first UK antimicrobial resistance strategy (Department of Health, 2000). The UK focus on this issue was re-invigorated in 2013, with the publication of the UK 5-year Antimicrobial Resistance Strategy 2013–2018 (Department of Health & Department for Environment, Food and Rural Affairs, 2013).

This document clearly sets out seven key areas for action across the UK, including improving infection prevention and control practices in human and animal health. Each country of the UK is now in the process of implementing programmes of work to deliver this strategy by 2018. For my own practice, the ‘Antimicrobial Resistance Delivery Plan’ has been launched in Wales, and that will frame our professional actions in relation to antimicrobial prescribing across the Health Board where I work. There are also powerful and persuasive social media campaigns which add to this work, such as the #Antibioticguardian Twitter campaign, and these are easy to get involved in.

Read the full article here

Patient-related Risk Factors for Surgical Site Infection Following 8 Gastrointestinal Surgery Types

Fukuda, H. Journal of Hospital Infection. Published online: 22 April 2016

Objective: To investigate 8 types of common gastrointestinal surgery in order to identify patient-related risk factors for surgical site infection (SSI) that could be collected as part of infection surveillance efforts.

Design: The study used record-linkage from existing datasets comprising the Japan Nosocomial Infections Surveillance (JANIS) and Diagnosis Procedure Combination (DPC) programs.

Methods: Patient data from 35 hospitals were retrieved using JANIS and DPC data from 2007 to 2011. Patient-related factors and the primary outcome of SSI occurrence were recorded and analyzed. Risk factors associated with SSI were examined using multilevel mixed-effects logistic regression models.

Results: A total of 2,074 appendectomies, 2,084 bile duct, liver, or pancreatic procedures, 3,460 cholecystectomies, 7,273 colonic, 482 oesophageal, 4,748 gastric, 2,762 rectal, and 1,202 small bowel procedures were analyzed. In multivariate analyses, intraoperative blood transfusion was found to be a risk factor for SSI in all surgery types except appendectomy and small bowel surgery. In addition, diabetes was a risk factor for SSI in colon surgery (odds ratio [OR] = 1.23, P = 0.028) and gastric surgery (OR = 1.70, P < 0.001). Steroid use was statistically associated with a higher SSI incidence in cholecystectomy (OR = 2.92, P = 0.002) and colon surgery (OR = 1.33, P = 0.014).
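For readers less familiar with odds ratios, the sketch below shows how a figure such as OR = 1.23 arises from a 2x2 exposure-by-outcome table. The counts are hypothetical, and the study's multilevel mixed-effects models adjust for other covariates; this shows only the unadjusted calculation.

```python
# Illustrative sketch (hypothetical counts): unadjusted odds ratio from a
# 2x2 table of exposure (e.g., diabetes) by outcome (SSI vs no SSI).

def odds_ratio(exposed_events, exposed_no_events,
               unexposed_events, unexposed_no_events):
    """Ratio of the odds of the outcome in exposed vs unexposed patients."""
    return (exposed_events / exposed_no_events) / \
           (unexposed_events / unexposed_no_events)

# Hypothetical counts: 30 SSIs among 330 diabetic patients,
# 150 SSIs among 2,000 non-diabetic patients.
print(round(odds_ratio(30, 300, 150, 1850), 2))  # → 1.23
```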

Conclusions: Intraoperative blood transfusion, diabetes, and steroid use are risk factors for SSI in gastrointestinal surgical procedures and should be included in SSI surveillance for these procedures.

Read the abstract here

Economic burden of primary compared with recurrent Clostridium difficile infection in hospitalized patients: a prospective cohort study

Shah, D.N. et al. Journal of Hospital Infection. Published online: 20 April 2016

Background: Few studies have investigated the additional healthcare costs of recurrent C. difficile infection (CDI). The study objective was to quantify the additional length of stay and treatment costs of recurrent CDI episodes among hospitalized patients with CDI.

Methodology: This was a prospective, observational cohort study of hospitalized adult patients with primary CDI followed for three months to assess for recurrent CDI episodes. Total and CDI-attributable hospital length of stay (LOS) and hospitalization costs were compared among patients who did or did not experience at least one recurrent CDI episode.

Results: Five hundred and forty hospitalized patients aged 62±17 years (42% males) with primary CDI were enrolled, of whom 95 patients (18%) experienced 101 recurrent CDI episodes. CDI-attributable median (interquartile range) LOS and costs increased from 7 (4–13) days and $13,168 ($7,525–$24,456) in patients with primary CDI only to 15 (8–25) days and $28,218 ($15,050–$47,030) in patients with recurrent CDI (p<0.0001, each). Total hospital median LOS and costs increased from 11 (6–22) days and $20,693 ($11,287–$41,386) in patients with primary CDI only to 24 (11–48) days and $45,148 ($20,693–$82,772) in patients with recurrent CDI (p<0.0001, each). The median cost of pharmacologic treatment while hospitalized was $60 ($23–$200) in patients with primary CDI only (n=445) and $140 ($30–$260) in patients with recurrent CDI (p=0.0013).

Conclusion: This study demonstrated that patients with CDI experience a significant healthcare economic burden attributed to CDI. Economic costs and healthcare burden increased significantly in patients with recurrent CDI.

Read the abstract here

NIH Study Finds Factors That May Influence Influenza Vaccine Effectiveness

Infection Control Today. Published online: 19 April 2016.

Image shows Influenza B (Li) virus particles.

The long-held approach to predicting seasonal influenza vaccine effectiveness may need to be revisited, new research suggests. Currently, seasonal flu vaccines are designed to induce high levels of protective antibodies against hemagglutinin (HA), a protein found on the surface of the influenza virus that enables the virus to enter a human cell and initiate infection. New research conducted by scientists at the National Institute of Allergy and Infectious Diseases (NIAID), part of the National Institutes of Health, found that higher levels of antibody against a different flu surface protein, neuraminidase (NA), were the better predictor of protection against flu infection and its unpleasant symptoms. Neuraminidase, which is not currently the main target antigen in traditional flu vaccines, enables newly formed flu viruses to exit the host cell and cause further viral replication in the body.

The findings, from a clinical trial in which healthy volunteers were willingly exposed to naturally occurring 2009 H1N1 influenza type A virus, appear online today in the open-access journal mBio.

Read the full commentary here